How to make a model in another file - javascript

I'm a beginner and I tried to move the model into another file, but it didn't work for me. Please suggest how to do it correctly. The question may seem silly, but if I knew the answer, I would not ask it.
File todo.controller.js:
const fs = require("fs");
const { v4: uuidv4 } = require("uuid");

const data = fs.readFileSync("./data/data.json");
let todos = JSON.parse(data);

class todoController {
  async createTodo(req, res) {
    req.on("data", (data) => {
      const jsondata = JSON.parse(data);
      const title = jsondata.title;
      const description = jsondata.description;
      if (title && description) { // note: `(title, description)` only checked description
        todos.push({
          id: uuidv4(),
          title,
          description,
          dateOfCreate: new Date(),
          lastModified: new Date(),
          check: false, // plain boolean instead of `new Boolean(false)`
        });
        fs.writeFile(
          "./data/data.json",
          JSON.stringify(todos, null, 2),
          (err) => {
            if (err) throw err; // `error` was undefined here
          }
        );
      }
    });
  }
}

module.exports = new todoController();
File todo.router.js:
const url = require("url");
const todoController = require("../controllers/todo.controller");

const todoRouter = (req, res) => {
  const urlparse = url.parse(req.url, true);
  if (urlparse.pathname == "/todos" && req.method == "POST") {
    todoController.createTodo(req, res);
  }
};

module.exports = todoRouter;
The todos are stored in ./data/data.json.

You have two separate problems here: splitting your code into a different file, and saving or persisting that data somewhere, in this case a file.
You have to create something like a data model and then import it into your other code.
// data.js
export const get = async () => {} // we will implement this just now
export const set = async (data) => {} // we will implement this just now
...
// controller.js
import { get, set } from './data.js' // import the methods we just created
...
const createTodo = async (req, res) => {
  req.on("data", (data) => {
    // here you can use get() if you want to read the existing data first
    set(JSON.parse(data)) // parse the request body and hand it to your data model
  }) // note the closing `)` for req.on
}
Then we also have to actually do something with those methods.
// data.js
import { readFile, writeFile } from 'fs/promises'

export const get = async () => {
  // may need to use JSON.parse here depending on how you'll use it
  return readFile('./data.json')
}
export const set = async (data) => {
  await writeFile('./data.json', JSON.stringify(data))
}
So the idea is to have a model responsible for managing the data, retrieving it and saving it, and then to import and use those methods in the main controller. The code above isn't perfect; it's just there to show you how to think about it.
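If it helps, here is a minimal, self-contained sketch of how the two files could fit together, written in CommonJS to match the question's code. The file locations, the function names getTodos/addTodo, and the response handling are assumptions for illustration, not the one true way to do it.

// models/todo.model.js - sketch of a model that owns the data file
const fs = require("fs/promises");
const { v4: uuidv4 } = require("uuid");

const DATA_FILE = "./data/data.json"; // assumed location

// read and parse all todos
async function getTodos() {
  const raw = await fs.readFile(DATA_FILE, "utf8");
  return JSON.parse(raw);
}

// build one todo, append it, and write the whole list back
async function addTodo({ title, description }) {
  const todos = await getTodos();
  const todo = {
    id: uuidv4(),
    title,
    description,
    dateOfCreate: new Date(),
    lastModified: new Date(),
    check: false,
  };
  todos.push(todo);
  await fs.writeFile(DATA_FILE, JSON.stringify(todos, null, 2));
  return todo;
}

module.exports = { getTodos, addTodo };

// controllers/todo.controller.js - the controller only handles the request
const { addTodo } = require("../models/todo.model");

class TodoController {
  async createTodo(req, res) {
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", async () => {
      const { title, description } = JSON.parse(body);
      if (title && description) {
        const todo = await addTodo({ title, description });
        res.writeHead(201, { "Content-Type": "application/json" });
        res.end(JSON.stringify(todo));
      } else {
        res.writeHead(400);
        res.end();
      }
    });
  }
}

module.exports = new TodoController();

The router from the question can stay exactly as it is; it just calls todoController.createTodo(req, res).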

Related

pg-promise duplicate connection warning on console when setting a new ColumnSet

I'm new to Next.js and I'm creating an API route in Next.js to perform a DB update using pg-promise. However, it always hits the warning "Creating a duplicate database object for the same connection" on the console when the app calls the API.
I tried browsing the docs but couldn't find a solution. I also tried the solution (update-2) mentioned on the Stack Overflow page below, but the warning still appears.
Where should I initialize pg-promise
I think the problem is the way I set up the ColumnSet, but I can't find the proper way to do it. How should I fix it with pg-promise?
DB setup code:
import ConfigEnv from 'utils/configuration';
import * as pgLib from 'pg-promise';

const initOptions = {
  capSQL: true,
};

const pgp = require('pg-promise')(initOptions);

interface IDatabaseScope {
  db: pgLib.IDatabase<any>;
  pgp: pgLib.IMain;
}

export function createSingleton<T>(name: string, create: () => T): T {
  const s = Symbol.for(name);
  let scope = (global as any)[s];
  if (!scope) {
    scope = {...create()};
    (global as any)[s] = scope;
  }
  return scope;
}

export function getDB(): IDatabaseScope {
  return createSingleton<IDatabaseScope>('my-app-db-space', () => {
    return {
      db: pgp(ConfigEnv.pgp),
      pgp,
    };
  });
}
API code:
import { getDB } from 'db/pgpdb';

const { db, pgp } = getDB();

const cs = new pgp.helpers.ColumnSet([
  '?detail_id',
  'age',
  'name',
  // 'last_modified_date',
], {
  table: 'user_detail',
});

export default async (req, res) => {
  try {
    // generating the update query where it is needed:
    const update = pgp.helpers.update(req.body.content, cs) + ` WHERE v.detail_id = t.detail_id`;
    // executing the query
    await db
      .none(update)
      .then(() => {
        return res.status(200).end();
      })
      .catch((error) => {
        console.log('error', error);
        return res.status(500).send(error);
      });
  } catch (error) {
    console.log(error);
  }
};

How can I generate a separate NextJS page for each FaunaDB Document?

How can I generate a different title on every page within a sub-directory?
My code throws no errors, but unfortunately the Title component renders every title item on every page it generates, e.g. every app.com/title/<title> renders the same view (a list of all titles). I think getStaticPaths is parameterised correctly, but I don't think getStaticProps is.
export default function Title({ paper }) {
  // paper is just the entire dataset, and isn't split by id or author etc.
  return (
    <div>
      {paper.map(paper => (
        <h1>{paper.data.title}</h1>
      ))}
    </div>
  )
}

export async function getStaticProps({ params }) {
  // ideally, results should be split down to e.g. `/api/getPapers/title`, but this didn't work
  const results = await fetch(`http://localhost:3000/api/getPapers/`).then(res => res.json());
  return {
    props: {
      paper: results
    }
  }
}
export async function getStaticPaths() {
  const papers = await fetch('http://localhost:3000/api/getPapers').then(res => res.json());
  const paths = papers.map(paper => {
    return {
      params: {
        authors: paper.data.title.toLowerCase().replace(/ /g, '-')
      }
    }
  })
  return {
    paths,
    fallback: false
  }
}
This is the getPapers API function.
const faunadb = require("faunadb");

// your secret hash
const secret = process.env.FAUNADB_SECRET_KEY;
const q = faunadb.query;
const client = new faunadb.Client({ secret });

module.exports = async (req, res) => {
  try {
    const dbs = await client.query(
      q.Map(
        // iterate each item in the result
        q.Paginate(
          // make paginatable
          q.Match(
            // query index
            q.Index("all_research_papers") // specify source
          )
        ),
        (ref) => q.Get(ref) // look up each result by its reference
      )
    );
    // ok
    res.status(200).json(dbs.data);
  } catch (e) {
    // something went wrong
    res.status(500).json({ error: e.message });
  }
};
My attempts to render a separate page for each document were missing a dynamic API call. I was simply hoping to render dynamic pages with a single (batched-document) API call.
Here is a typical dynamic API route called [index.js]:
const faunadb = require('faunadb')

// your secret hash
const secret = process.env.FAUNADB_SECRET_KEY
const q = faunadb.query
const client = new faunadb.Client({ secret })

export default async (req, res) => {
  const {
    query: { index },
  } = req;
  try {
    const papers = await client.query(
      q.Get(q.Ref(q.Collection('<name of the collection>'), index))
    );
    res.status(200).json(papers.data);
  } catch (e) {
    res.status(500).json({ error: e.message });
  }
};
Once your data is being retrieved dynamically, you can set up a dynamic page route, e.g. [id].js, that fetches the data using useSWR.
const { data, error } = useSWR(`/api/getPapers/${id}`, fetcher);
This is an example fetcher function:
const fetcher = (url) => fetch(url).then((r) => r.json());
In my case, I could then retrieve any given field using the call {data.<field>}.
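For illustration, here is a minimal sketch of what such a pages/title/[id].js page could look like, assuming the dynamic /api/getPapers/[index] route above and the data shape from the question (a title field on the returned document); the loading and error markup is just a placeholder.

// pages/title/[id].js - dynamic page backed by useSWR (sketch)
import { useRouter } from 'next/router';
import useSWR from 'swr';

const fetcher = (url) => fetch(url).then((r) => r.json());

export default function Title() {
  const { id } = useRouter().query;
  // pass null until the router has populated `id`, so SWR skips the request
  const { data, error } = useSWR(id ? `/api/getPapers/${id}` : null, fetcher);

  if (error) return <p>Failed to load</p>;
  if (!data) return <p>Loading...</p>;

  return <h1>{data.title}</h1>;
}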
You are returning authors in your Path object. You will need to make sure that your page file is named with authors included. For example:
app_directory
|- pages
|- home.js
|- title
|- [authors].js
Perhaps where you say authors in your params object you actually mean title. In that case, rename the params key and the page filename.
const paths = papers.map(paper => {
  return {
    params: {
      title: paper.data.title.toLowerCase().replace(/ /g, '-')
    }
  }
})
app_directory
|- pages
|- home.js
|- title
|- [title].js
Here are the docs for getStaticPaths(). https://nextjs.org/docs/basic-features/data-fetching#getstaticpaths-static-generation
EDIT:
Since your API function returns the Page from your query, the shape of the result will likely be
{
  before: [ /* before cursor */ ],
  after: [ /* after cursor */ ],
  data: [
    { /* paper Document */ },
    { /* paper Document */ },
    { /* paper Document */ },
  ]
}
In which case, your code will need to map over papers.data not on papers itself.
const paths = papers.data // select the data
  .map(paper => {
    return {
      params: {
        title: paper.data.title.toLowerCase().replace(/ /g, '-')
      }
    }
  })

Missing data when recording model into JSON file

I'm a beginner and I'm trying to save the model todoModel into data.json, but some elements (title, description) are not saved.
app.js
const http = require("http");
const todoRouter = require("./routes/todo.router");

const server = http.createServer(todoRouter);
const PORT = process.env.PORT || 3000;

server.listen(PORT, () =>
  console.log(`Server listening on http://localhost:${PORT}`)
);
todo.router.js
const url = require("url");
const todoController = require("../controllers/todo.controller");

const todoRouter = (req, res) => {
  const urlparse = url.parse(req.url, true);
  if (urlparse.pathname == "/todos" && req.method == "POST") {
    todoController.createTodo(req, res);
  }
};

module.exports = todoRouter;
todo.controller.js
const fs = require("fs");
const todoModel = require("../models/todo.model");

const data = fs.readFileSync("./data/data.json");
let todos = JSON.parse(data);

class todoController {
  async createTodo(req, res) {
    req.on("data", (data) => {
      if (data) {
        todos.push(todoModel);
        fs.writeFile(
          "./data/data.json",
          JSON.stringify(todos, null, 2),
          (err) => {
            if (err) throw err;
          }
        );
      }
    });
  }
}

module.exports = new todoController();
todo.model.js
const { v4: uuidv4 } = require("uuid");
const fs = require("fs");

const data = fs.readFileSync("./data/data.json");
const jsondata = JSON.parse(data);
const title = jsondata.title;
const description = jsondata.description;

const todoModel = {
  id: uuidv4(),
  title,
  description,
  dateOfCreate: new Date(),
  lastModified: new Date(),
  check: new Boolean(false),
};

module.exports = todoModel;
The saved todoModel in data.json looks like this:
[
  {
    "id": "cb996b22-d9d8-49ee-8e35-6f8bfc005268",
    "dateOfCreate": "2021-11-06T14:53:28.608Z",
    "lastModified": "2021-11-06T14:53:28.608Z",
    "check": false
  }
]
Well, I recommend exporting only functions and constants, not variables or objects which can change. However, I would rather use a function here:
const { v4: uuidv4 } = require("uuid");
const fs = require("fs");

function getTodoModel() {
  const data = fs.readFileSync("./data/data.json");
  const jsondata = JSON.parse(data);
  const title = jsondata.title;
  const description = jsondata.description;
  return {
    id: uuidv4(),
    title,
    description,
    dateOfCreate: new Date(),
    lastModified: new Date(),
    check: new Boolean(false),
  }
}

module.exports = getTodoModel
Later you simply import your function and call it:
let todoModel = getTodoModel()
The variables are then evaluated each time the function is called, not once at import time.
Your values title and description don't have a key assigned in your todoModel. Just assign these values to proper keys in your object.
const todoModel = {
  id: uuidv4(),
  title: title, // add key for title
  description: description, // and here add key for description
  dateOfCreate: new Date(),
  lastModified: new Date(),
  check: new Boolean(false),
};
But the problem doesn't stop there. You read the JSON file, load its content, and then write the content back without any modification. In todo.controller.js you load todoModel, which itself loads the content of the file; you then add that content to the todos array and write it straight back to the file. You never modify todoModel with the data from the request, so effectively nothing changes.
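To make that concrete, here is a minimal sketch of a controller that builds the todo from the request body instead of re-reading data.json. The makeTodo factory and the synchronous file calls are illustrative assumptions, not the asker's existing API.

// todo.controller.js - sketch: build the todo from the request, not from data.json
const fs = require("fs");
const { v4: uuidv4 } = require("uuid");

// hypothetical factory: takes the parsed request body instead of reading data.json
function makeTodo({ title, description }) {
  return {
    id: uuidv4(),
    title,
    description,
    dateOfCreate: new Date(),
    lastModified: new Date(),
    check: false,
  };
}

class TodoController {
  async createTodo(req, res) {
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      const todos = JSON.parse(fs.readFileSync("./data/data.json"));
      todos.push(makeTodo(JSON.parse(body))); // now title and description come from the request
      fs.writeFileSync("./data/data.json", JSON.stringify(todos, null, 2));
      res.end("created");
    });
  }
}

module.exports = new TodoController();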

In Node - How can I modify this code so that it pipes to a new readable stream

I have a function that should take in a file path, parse the CSV file and do some data transformation, then pipe the transformed data to a new readable stream that another function can accept as an input parameter.
Currently I have this:
const fs = require('fs')
const csvParse = require('csv-parse')
const transform = require('stream-transform')

let filePath
filePath = filePath || ''

export const stream = (filePath) => {
  const readerStream = fs.createReadStream(filePath)
  return readerStream
    .pipe(csvParse({relax_column_count: true}))
    .pipe(transformer)
}

const transformer = transform(function(record, cb) {
  record.forEach((cc, i, self) => {
    // ...do data transformation logic here
  })
  cb(null, record.join(','))
})
The function that should accept a file stream looks like:
import { stream } from '../csvStream'
import FormData from 'form-data' // was `import * as formData`, but `new FormData()` is used below
import { createReadStream } from 'fs'

public static async uploadContent(
  file: any
) {
  // ...some logic here
  .then(async (_err, _info) => {
    try {
      const form = new FormData()
      // ...some logic here
      const isCsv = fileName.slice(fileName.length - 3) === 'csv'
      if (isCsv) {
        form.append('VersionData', stream(file.path), {
          contentType: 'application/octet-stream',
        })
      } else {
        form.append('VersionData', createReadStream(file.path), {
          contentType: 'application/octet-stream',
        })
      }
    } catch (e) {
      console.log(e)
      throw e
    }
  })
}
I've only included the relevant code in the second function since it is lengthy. The else branch succeeds, while the code inside the if statement just hangs. The issue is somewhere in my first code block; I don't think it is piping to a readable stream correctly.
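One way to narrow this down, offered as a diagnostic sketch rather than a fix, is to run the pipeline on its own and attach listeners to it. The sample.csv path is a hypothetical test file; everything else assumes the csvStream module from the question.

// debug.js - run the pipeline in isolation to see whether it produces data at all
import { stream } from './csvStream'

const piped = stream('./sample.csv') // hypothetical test file

piped.on('data', (chunk) => console.log('transformed chunk:', chunk.toString()))
piped.on('end', () => console.log('pipeline finished'))
piped.on('error', (err) => console.error('pipeline error:', err))

// note: pipe() does not forward errors from earlier stages,
// so the file stream and the parser may need their own 'error' listeners

If the 'data' events fire here, the stream itself is working and the hang is more likely in how the form consumes it; if nothing fires, the problem is inside the parse or transform stages.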

How to access saved json file by AsyncStorage in layoutRedux file?

I'm updating my app layout from JSON loaded from an API in app.js (saved by the reloadLayoutsConfig function). When I want to access my data inside the app everything is OK, but how can I load the layout inside my layoutRedux file?
export const reloadLayoutsConfig = async (newLayoutsConfig) => {
  try {
    await AsyncStorage.setItem("#LayoutsConfig", JSON.stringify(newLayoutsConfig));
  } catch (error) {
    console.log("error save user data", error);
  }
};

export const getLayoutsConfig = async () => {
  try {
    const LayoutsConfig = await AsyncStorage.getItem("#LayoutsConfig");
    return JSON.parse(LayoutsConfig);
  } catch (error) {
    console.log(error);
  }
};
These two functions work like a charm; whenever I need a value I just do this:
getLayoutsConfig().then((LayoutsConfig) => {
  this.setState({ LayoutsConfig: LayoutsConfig });
});
This is my layoutRedux:
import { getLayoutsConfig } from '#config'

const types = {
  // LOTS OF CODE
}

const initialState = {
  layout: [],
  isFetching: false,
}

var layouts = [];

// OLD WAY: from the JSON file stored in the code
// var layouts = [...LayoutsConfig];

// NEW WAY: from the JSON loaded from the API
getLayoutsConfig().then((LayoutsConfig) => {
  layouts = LayoutsConfig;
});

initialState.layout = layouts;

export const actions = {
  // LOTS OF CODE
}

export const reducer = (state = initialState, action) => {
  // LOTS OF CODE
}
I have shown my old and new way of accessing the JSON file in the same code so you can compare them. In the older version I had a layout.json file in my code, and when I needed to access the file I just used it like this:
import { LayoutsConfig } from '#config'
var layouts = [...LayoutsConfig];
In config:
export const LayoutsConfig = AppConfig.LayoutsConfig;
But now I call it like this:
var layouts = [];
getLayoutsConfig().then((LayoutsConfig) => {
  layouts = LayoutsConfig;
});
I get an error that the layout is not loaded. What can I do, and how should I call my function?
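For what it's worth, the reason layouts stays empty is a timing issue: initialState.layout = layouts runs synchronously while layouts is still [], and by the time the promise from getLayoutsConfig() reassigns layouts, initialState.layout still points at the old empty array. Below is a minimal sketch of one common way around that, hydrating the layout through an action after the store exists; the SET_LAYOUT action name and the dispatch location are assumptions, not the asker's existing code.

// layoutRedux.js - sketch: hydrate the layout via an action instead of at module load
export const actions = {
  setLayout: (layout) => ({ type: "SET_LAYOUT", layout }),
};

export const reducer = (state = { layout: [], isFetching: false }, action) => {
  switch (action.type) {
    case "SET_LAYOUT":
      return { ...state, layout: action.layout };
    default:
      return state;
  }
};

// somewhere after the store is created, e.g. in the root component's componentDidMount:
// getLayoutsConfig().then((config) => store.dispatch(actions.setLayout(config)));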
