I'm updating my app layout with JSON loaded from an API in app.js (saved by the reloadLayoutsConfig function). Accessing the data inside the app works fine, but how can I load the layout inside my layoutRedux file?
export const reloadLayoutsConfig = async (newLayoutsConfig) => {
try {
await AsyncStorage.setItem("#LayoutsConfig", JSON.stringify(newLayoutsConfig));
} catch (error) {
console.log("error save user data", error);
}
};
export const getLayoutsConfig = async () => {
try {
const LayoutsConfig = await AsyncStorage.getItem("#LayoutsConfig");
return JSON.parse(LayoutsConfig);
} catch (error) {
console.log(error);
}
};
These two functions work like a charm; whenever I need a value I just do this:
getLayoutsConfig().then((LayoutsConfig) => {
this.setState({ LayoutsConfig : LayoutsConfig });
});
This is my layoutRedux:
import { getLayoutsConfig } from '#config'
const types = {
// LOTS OF CODE
}
const initialState = {
layout: [],
isFetching: false,
}
var layouts = [];
// OLD WAY: FROM JSON FILE STORED IN THE CODE
var layouts = [...LayoutsConfig];
// NEW WAY FROM LOADED JSON FILE FROM API
getLayoutsConfig().then((LayoutsConfig) => {
layouts = LayoutsConfig;
});
initialState.layout = layouts;
export const actions = {
// LOTS OF CODE
}
export const reducer = (state = initialState, action) => {
// LOTS OF CODE
}
I have included both the old and the new way of accessing the JSON file in the same code so you can compare them. In the older version I had a layout.json file in my code, and when I needed to access it I just used it like this:
import { LayoutsConfig } from '#config'
var layouts = [...LayoutsConfig];
In config:
export const LayoutsConfig = AppConfig.LayoutsConfig;
But now I call it like this:
var layouts = [];
getLayoutsConfig().then((LayoutsConfig) => {
layouts = LayoutsConfig;
});
I get an error that the layout is not loaded. What can I do, and how should I call my function?
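Is something like this the right direction? A rough sketch only, assuming redux-thunk style middleware; SET_LAYOUT and loadLayout are placeholder names, not part of my current code:
import { getLayoutsConfig } from '#config'

const types = {
  SET_LAYOUT: 'layout/SET_LAYOUT',
}

const initialState = {
  layout: [],       // stays empty until the async read finishes
  isFetching: false,
}

export const actions = {
  // thunk: read the stored config, then put it into the store
  loadLayout: () => async (dispatch) => {
    const LayoutsConfig = await getLayoutsConfig()
    dispatch({ type: types.SET_LAYOUT, layout: LayoutsConfig || [] })
  },
}

export const reducer = (state = initialState, action) => {
  switch (action.type) {
    case types.SET_LAYOUT:
      return { ...state, layout: action.layout }
    default:
      return state
  }
}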
I'm trying to dynamically load modules from a Nitro server in a Nuxt app, but I get the following error:
Cannot find module projectpath/.nuxt/services/listing imported from projectpath/.nuxt/dev/index.mjs
This is the snippet of code I'm using for the handler where the dynamic import should take place:
export default defineEventHandler(async (event) => {
const { method, resource, paramValue } = parseRequestResource(event.node.req)
let ServiceInstance = services[resource]
if (ServiceInstance) {
return callResourceMethod(ServiceInstance, method, paramValue, event)
} else {
try {
ServiceInstance = await import(`../services/${resource}`)
} catch (error) {
const Proto = Object.assign({}, Service.prototype, { tableName: resource })
ServiceInstance = Object.create(Proto)
services[resource] = ServiceInstance
}
return callResourceMethod(ServiceInstance, method, paramValue, event)
}
})
How can I get this to work? Is there some feature in Nitro/Nuxt that lets me do this?
I was able to achieve this functionality by using a Nitro plugin. However, the files being imported need to be *.mjs.
import fs from 'fs'
import { resolve } from 'path'
export default defineNitroPlugin(async (nitroApp) => {
const __dirname = resolve()
const servicesFolderPath = `${__dirname}/server/services`
const serviceFiles = fs.readdirSync(servicesFolderPath)
const services = {}
for (const fileName of serviceFiles) {
if (fileName == '__proto__.mjs') continue
try {
const moduleName = fileName.split('.')[0]
const module = await import(`${servicesFolderPath}/${fileName}`)
services[moduleName] = module.default
} catch (error) {
console.log(error);
}
}
nitroApp.$services = services
})
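A sketch of how the original handler could then consume that map; useNitroApp() is Nitro's auto-imported accessor for the app instance, and parseRequestResource / callResourceMethod are the helpers from the handler above:
export default defineEventHandler(async (event) => {
  const { method, resource, paramValue } = parseRequestResource(event.node.req)
  const services = useNitroApp().$services || {}
  const ServiceInstance = services[resource]
  if (!ServiceInstance) {
    // resource was not registered by the plugin
    throw createError({ statusCode: 404, statusMessage: `Unknown resource: ${resource}` })
  }
  return callResourceMethod(ServiceInstance, method, paramValue, event)
})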
I'm new to Next.js and I'm creating an API route in Next.js to perform a DB update using pg-promise. However, it always hits the warning WARNING: Creating a duplicate database object for the same connection in the console when the app calls the API.
I tried browsing the docs but couldn't find a solution. I also tried the solution (update 2) mentioned in the Stack Overflow question below, but the warning still appears.
Where should I initialize pg-promise
I think the problem is in the way I set up the ColumnSet, but I can't find the proper way to do it. How should I fix this with pg-promise?
DB setup code:
import ConfigEnv from 'utils/configuration';
import * as pgLib from 'pg-promise';
const initOptions = {
capSQL: true,
};
const pgp = require('pg-promise')(initOptions);
interface IDatabaseScope {
db: pgLib.IDatabase<any>;
pgp: pgLib.IMain;
}
export function createSingleton<T>(name: string, create: () => T): T {
const s = Symbol.for(name);
let scope = (global as any)[s];
if (!scope) {
scope = {...create()};
(global as any)[s] = scope;
}
return scope;
}
export function getDB(): IDatabaseScope {
return createSingleton<IDatabaseScope>('my-app-db-space', () => {
return {
db: pgp(ConfigEnv.pgp),
pgp
};
});
}
API code:
import {getDB} from 'db/pgpdb';
const {db, pgp} = getDB();
const cs = new pgp.helpers.ColumnSet([
'?detail_id',
'age',
'name'
// 'last_modified_date',
], {
table: 'user_detail',
})
export default async (req, res) => {
try {
// generating the update query where it is needed:
const update = pgp.helpers.update(req.body.content, cs) + ` WHERE v.detail_id = t.detail_id`;
// executing the query
await db
.none(update)
.then(() => {
return res.status(200).end();
})
.catch((error) => {
console.log('error', error);
return res.status(500).send(error);
});
} catch (error) {
console.log(error);
}
};
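For reference, a rough sketch of building the ColumnSet once inside the same singleton scope, in case that is the proper place to set it up; userDetailCs is a made-up name, and the rest reuses the names from the code above:
// pgpdb (sketch): keep the ColumnSet next to the db object so it is
// created a single time rather than on every evaluation of the API module
export function getDB() {
  return createSingleton('my-app-db-space', () => ({
    db: pgp(ConfigEnv.pgp),
    pgp,
    userDetailCs: new pgp.helpers.ColumnSet(
      ['?detail_id', 'age', 'name'],
      { table: 'user_detail' }
    ),
  }));
}

// API code (fragment)
const { db, pgp, userDetailCs } = getDB();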
How can I generate a different title on every page within a sub-directory?
My code throws no errors, but unfortunately the Title component renders every title item on every page, i.e. every app.com/title/<title> renders the same view (a list of titles). I think getStaticPaths is correctly parameterised, but I don't think getStaticProps is.
export default function Title({ paper }) {
// paper is just the entire dataset, and isn't split by id or author etc.
return (
<div>
{paper.map(paper => (
<h1>{paper.data.title}</h1>
))}
</div>
)
}
export async function getStaticProps({ params }) {
// ideally, results should be split down to e.g. `/api/getPapers/title`, but this didn't work
const results = await fetch(`http://localhost:3000/api/getPapers/`).then(res => res.json());
return {
props: {
paper: results
}
}
}
export async function getStaticPaths() {
const papers = await fetch('http://localhost:3000/api/getPapers').then(res => res.json());
const paths = papers.map(paper => {
return {
params: {
authors: paper.data.title.toLowerCase().replace(/ /g, '-')
}
}
})
return {
paths,
fallback: false
}
}
This is the getPapers API function.
const faunadb = require("faunadb");
// your secret hash
const secret = process.env.FAUNADB_SECRET_KEY;
const q = faunadb.query;
const client = new faunadb.Client({ secret });
module.exports = async (req, res) => {
try {
const dbs = await client.query(
q.Map(
// iterate each item in result
q.Paginate(
// make paginatable
q.Match(
// query index
q.Index("all_research_papers") // specify source
)
),
(ref) => q.Get(ref) // lookup each result by its reference
)
);
// ok
res.status(200).json(dbs.data);
} catch (e) {
// something went wrong
res.status(500).json({ error: e.message });
}
};
My attempts to render a separate page for each document were missing a dynamic API call. I was simply hoping to render dynamic pages with a single (batched-document) API call.
Here is a typical dynamic API route, e.g. [index].js:
const faunadb = require('faunadb')
// your secret hash
const secret = process.env.FAUNADB_SECRET_KEY
const q = faunadb.query
const client = new faunadb.Client({ secret })
export default async (req, res) => {
const {
query: { index },
} = req;
try {
const papers = await client.query(
q.Get(q.Ref(q.Collection('<name of the collection>'), index))
);
res.status(200).json(papers.data);
} catch (e) {
res.status(500).json({ error: e.message });
}
};
Once your data is being retrieved dynamically, you can set up a dynamic page route, e.g. [id].js, that fetches the data using useSWR.
const { data, error } = useSWR(`/api/getPapers/${id}`, fetcher);
This is an example fetcher function:
const fetcher = (url) => fetch(url).then((r) => r.json());
In my case, I could then retrieve any given field using the call {data.<field>}.
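Put together, such a page might look roughly like this (the file name, component name, and the title field are illustrative):
// pages/title/[id].js (sketch)
import { useRouter } from 'next/router';
import useSWR from 'swr';

const fetcher = (url) => fetch(url).then((r) => r.json());

export default function Title() {
  const { query } = useRouter();
  // skip the request until the router has populated the id
  const { data, error } = useSWR(query.id ? `/api/getPapers/${query.id}` : null, fetcher);

  if (error) return <div>Failed to load</div>;
  if (!data) return <div>Loading...</div>;

  return <h1>{data.title}</h1>;
}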
You are returning authors in your params object, so you will need to make sure that your page file is named with authors included. For example:
app_directory
|- pages
|- home.js
|- title
|- [authors].js
Perhaps where you say authors in your params object, you actually mean title. In that case, rename the params key and the page filename.
const paths = papers.map(paper => {
return {
params: {
title: paper.data.title.toLowerCase().replace(/ /g, '-')
}
}
})
app_directory
|- pages
|- home.js
|- title
|- [title].js
Here are the docs for getStaticPaths(). https://nextjs.org/docs/basic-features/data-fetching#getstaticpaths-static-generation
EDIT:
Since your API function returns the Page from your query, the shape of the result will likely be
{
before: [/* before cursor */],
after: [/* after cursor */],
data: [
{ /* paper Document */ },
{ /* paper Document */ },
{ /* paper Document */ },
]
}
In which case, your code will need to map over papers.data, not over papers itself.
const paths = papers.data // select the data
.map(paper => {
return {
params: {
title: paper.data.title.toLowerCase().replace(/ /g, '-')
}
}
})
I'm a beginner and tried to move the model into another file, but it didn't work for me. Please suggest how to do it correctly. The question may seem silly, but if I knew the answer I would not ask it.
File todo.controller.js:
const fs = require("fs");
const { v4: uuidv4 } = require("uuid");
const data = fs.readFileSync("./data/data.json");
let todos = JSON.parse(data);
class todoController {
async createTodo(req, res) {
req.on("data", (data) => {
const jsondata = JSON.parse(data);
const title = jsondata.title;
const description = jsondata.description;
if (title && description) {
todos.push({
id: uuidv4(),
title,
description,
dateOfCreate: new Date(),
lastModified: new Date(),
check: new Boolean(false),
});
fs.writeFile(
"./data/data.json",
JSON.stringify(todos, null, 2),
(err) => {
if (err) throw err;
}
);
}
});
}}
File todo.router.js:
const url = require("url");
const todoController = require("../controllers/todo.controller");
const todoRouter = (req, res) => {
const urlparse = url.parse(req.url, true);
if (urlparse.pathname == "/todos" && req.method == "POST") {
todoController.createTodo(req, res);
}
};
module.exports = todoRouter;
Here is the data.json file:
data.json
You have two separate problems here: separating your code into a different file, and saving or persisting that data somewhere, in this case a file.
You have to create something like a data model and then import it into your other code.
// data.js
export const get = async () => {} // we will implement this just now
export const set = async (data) => {} // we will implement this just now
...
// controller.js
import {get, set} from './data.js' // import the methods we just created
...
const createTodo = async (req, res) => {
req.on("data", (data) => {
// here you can use get() if you want to use the data
    set(JSON.parse(data)) // parse the chunk; set() will stringify it when saving
  });
};
Then we also have to actually do something with those methods.
// data.js
import { readFile, writeFile } from 'fs/promises'

export const get = async () => {
  // may need to use JSON.parse here depending on how you'll use it
  return readFile('./data.json', 'utf8')
}

export const set = async (data) => {
  await writeFile('./data.json', JSON.stringify(data))
}
So the idea is to have a model responsible for managing the data, retrieving it and saving it, then importing and using those methods in the main controller. The code above isn't perfect, it's just to show you how to think about it.
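For example (purely hypothetical), createTodo could then combine both methods to append a todo instead of overwriting the file blindly:
// controller.js (sketch)
import { v4 as uuidv4 } from 'uuid';
import { get, set } from './data.js';

export const createTodo = async (req, res) => {
  req.on('data', async (chunk) => {
    const { title, description } = JSON.parse(chunk);
    const todos = JSON.parse(await get());   // current list via the data model
    todos.push({ id: uuidv4(), title, description, dateOfCreate: new Date() });
    await set(todos);                        // persist via the data model
    res.end(JSON.stringify(todos));
  });
};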
I started integrating websockets into an existing React/Django app following along with this example (accompanying repo here). In that repo, the websocket interface is in websockets.js, and is implemented in containers/Chat.js.
I can get that code working correctly as-is.
I then started rewriting my implementation to use Hooks, and hit a little wall. The data flows through the socket correctly, arrives in the handler of each client correctly, and within the handler I can read the correct state. Within that handler, I'm calling my useState setter to update state with the incoming data.
Originally I had a problem where my single useState setter inside addMessage() fired inconsistently (maybe 1 in 10 times). I split my one useState hook into two (one for the current message, one for all messages). Now, in addMessage(), upon receiving data from the server, my setAllMessages hook only updates the client I type the message into, and no other clients. All clients receive and can log the data correctly; they just don't run the setAllMessages function.
If I push to an array declared outside the function, it works as expected. So it seems like a problem in the function update cycle, but I haven't been able to track it down.
Here's my version of websocket.js:
class WebSocketService {
static instance = null;
static getInstance() {
if (!WebSocketService.instance) {
WebSocketService.instance = new WebSocketService();
}
return WebSocketService.instance;
}
constructor() {
this.socketRef = null;
this.callbacks = {};
}
disconnect() {
this.socketRef.close();
}
connect(chatUrl) {
const path = `${URLS.SOCKET.BASE}${URLS.SOCKET.TEST}`;
this.socketRef = new WebSocket(path);
this.socketRef.onopen = () => {
console.log('WebSocket open');
};
this.socketRef.onmessage = e => {
this.socketNewMessage(e.data);
};
this.socketRef.onerror = e => {
console.log(e.message);
};
this.socketRef.onclose = () => {
this.connect();
};
}
socketNewMessage(data) {
const parsedData = JSON.parse(data);
const { command } = parsedData;
if (Object.keys(this.callbacks).length === 0) {
return;
}
Object.keys(SOCKET_COMMANDS).forEach(clientCommand => {
if (command === SOCKET_COMMANDS[clientCommand]) {
this.callbacks[command](parsedData.presentation);
}
});
}
backend_receive_data_then_post_new(message) {
this.sendMessage({
command_for_backend: 'backend_receive_data_then_post_new',
message: message.content,
from: message.from,
});
}
sendMessage(data) {
try {
this.socketRef.send(JSON.stringify({ ...data }));
} catch (err) {
console.log(err.message);
}
}
addCallbacks(allCallbacks) {
Object.keys(SOCKET_COMMANDS).forEach(command => {
this.callbacks[SOCKET_COMMANDS[command]] = allCallbacks;
});
}
state() {
return this.socketRef.readyState;
}
}
const WebSocketInstance = WebSocketService.getInstance();
export default WebSocketInstance;
And here's my version of Chat.js
export function Chat() {
const [allMessages, setAllMessages] = useState([]);
const [currMessage, setCurrMessage] = useState('');
function waitForSocketConnection(callback) {
setTimeout(() => {
if (WebSocketInstance.state() === 1) {
callback();
} else {
waitForSocketConnection(callback);
}
}, 100);
}
waitForSocketConnection(() => {
const allCallbacks = [addMessage];
allCallbacks.forEach(callback => {
WebSocketInstance.addCallbacks(callback);
});
});
/*
* This is the problem area
* `incoming` shows the correct data, and I have access to all state
* But `setAllMessages` only updates on the client I type the message into
*/
const addMessage = (incoming) => {
setAllMessages([incoming]);
};
// update with value from input
const messageChangeHandler = e => {
setCurrMessage(e.target.value);
};
// Send data to socket interface, then to server
const sendMessageHandler = e => {
e.preventDefault();
const messageObject = {
from: 'user',
content: currMessage,
};
setCurrMessage('');
WebSocketInstance.backend_receive_data_then_post_new(messageObject);
};
return (
<div>
// rendering stuff here
</div>
);
}
There is no need to rewrite everything into functional components with hooks.
You should decompose it functionally: a main parent (class or functional component) for initialization, providing data and methods as props to two functional children responsible for rendering the list and the input (new message).
If you still need it, useEffect is key, because in functional components all of the code runs on every render, including function definitions, redefinitions, new refs, duplicate entries in the callbacks array, etc.
You can try moving all of the once-defined functions into useEffect:
useEffect(() => {
const waitForSocketConnection = (callback) => {
...
}
const addMessage = (incoming) => {
setAllMessages([incoming]);
};
  waitForSocketConnection(() => {
    ...
  });
}, []); // <<< RUN ONCE
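For reference, a fuller (untested) sketch of that effect using the WebSocketInstance API from the question; the functional setAllMessages(prev => ...) form appends incoming data without reading stale state inside the socket callback:
useEffect(() => {
  const addMessage = (incoming) => {
    // functional update: no stale closure over allMessages
    setAllMessages(prev => [...prev, incoming]);
  };

  const waitForSocketConnection = (callback) => {
    setTimeout(() => {
      if (WebSocketInstance.state() === 1) {
        callback();
      } else {
        waitForSocketConnection(callback);
      }
    }, 100);
  };

  waitForSocketConnection(() => {
    WebSocketInstance.addCallbacks(addMessage);
  });
}, []); // register the socket callback only once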