Parse a JSON object when the Node server is started - javascript

I need to parse a JSON object when the Node server.js (which is my entry point to the program) is started. The parsing of the JSON file is done in a different module in my project.
I have two questions:
1. Is it recommended to invoke the parse function with an event in the server.js file?
2. I read about the EventEmitter but I'm not sure how to invoke a function from a different module... an example would be very helpful.
I have multiple JSON files.
UPDATE to make it more clear
If I read 3 JSON file objects (50 lines each) when the server/app is loaded (the server.js file), this will be fast, I guess. My scenario is that the list of valid paths for the Express calls is in these JSON files:
app.get('/run1', function (req, res) {
  res.send('Hello World!');
});
So run1 should be defined in the JSON file (like a white list of paths). If the user requests run2, which I have not defined, I need to return an error. So I think the right approach is, when the server is up, to do this parsing once and keep an object with all the configured valid paths; when the user makes a call, just get this object (already parsed when the server loaded) and verify that the path is OK. I think this is a better approach than parsing on every call.
UPDATE 2
I'll try to explain more simply.
Let's assume that you have a white list of paths which you should listen to, like run1:
app.get('/run1', function
These paths are defined in JSON files inside your project under a specific folder. Before every call to your application via Express, you should verify that the requested path is in the path list of the JSON files. This is a given; the question is how to do it.
Currently I've developed a module which scans the JSON files in this folder and finds whether a specific path exists there.
Now I think the right solution is, when the Node application starts, to invoke this functionality and keep the list of valid paths in some object which I can access very easily during the user call, to check whether the path is there.
My question is how to provide some event to the validator module when the Node app (server.js) is up, to provide this object.

If it's part of your application initialization, then you could read and parse this JSON file synchronously, using either fs.readFileSync and JSON.parse, or require:
var config = require('path/to/my/config.json');
Just make sure that the module handling this JSON loading is required in your application root before the app.listen call.
In this case the JSON data will be loaded and parsed by the time your server starts, and there will be no need to trouble yourself with callbacks or event emitters.
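For example, a minimal sketch of this synchronous approach, including a whitelist check on each request (the whitelist.json filename, its paths property, and the middleware are assumptions for illustration):

// server.js (sketch)
var fs = require('fs');
var app = require('express')();

// Read and parse the whitelist synchronously, before the server starts
var whitelist = JSON.parse(fs.readFileSync('./config/whitelist.json', 'utf8'));

// Reject any request whose path is not on the white list
app.use(function (req, res, next) {
  if (whitelist.paths.indexOf(req.path) === -1) {
    return res.status(404).send('Unknown path');
  }
  next();
});

app.get('/run1', function (req, res) {
  res.send('Hello World!');
});

app.listen(8000);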
I can't see any benefit in loading your initial config asynchronously, for two reasons:
The bottleneck of JSON parsing is the parser itself, and since it's synchronous, you won't gain anything there. The only part you'd be able to optimize is the interaction with your file system (i.e. reading data from disk).
Your application won't be able to work properly until this data is loaded.
Update
If for some reason you can't make your initialization synchronous, you could delay starting your application until the initialization is done.
The easiest solution here is to move the app.listen part inside the initialization callback:
// initialization.js
var glob = require('glob')
var path = require('path')

module.exports = function initialization (done) {
  var data = {}
  glob('./config/*.json', function (err, files) {
    if (err) throw err
    files.forEach(function (file) {
      var filename = path.basename(file)
      // resolve against the cwd, since require() paths are otherwise
      // relative to this module rather than to the glob pattern
      data[filename] = require(path.resolve(file))
    })
    done(data)
  })
}
// server.js
var initialization = require('./initialization')
var app = require('express')()

initialization(function (data) {
  app.use(require('./my-middleware')(data))
  app.listen(8000)
})
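A sketch of what my-middleware.js could look like for the path-whitelist case (the shape of the JSON data and its paths property are assumptions):

// my-middleware.js (sketch)
module.exports = function (data) {
  // data maps filenames to parsed JSON; collect all whitelisted paths
  var paths = Object.keys(data).reduce(function (acc, filename) {
    return acc.concat(data[filename].paths || [])
  }, [])
  return function (req, res, next) {
    if (paths.indexOf(req.path) === -1) {
      return res.status(404).send('Unknown path')
    }
    next()
  }
}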
An alternative solution is to use a simple event emitter to signal that your data is ready:
// config.js
var glob = require('glob')
var path = require('path')
var events = require('events')

var obj = new events.EventEmitter()
obj.data = {}

glob('./config/*.json', function (err, files) {
  if (err) throw err
  files.forEach(function (file) {
    var filename = path.basename(file)
    obj.data[filename] = require(path.resolve(file))
  })
  obj.emit('ready')
})

module.exports = obj
// server.js
var config = require('./config')
var app = require('express')()

app.use(require('./my-middleware'))

config.on('ready', function () {
  app.listen(8000)
})

Related

What is the best way to keep a file open to read/write?

I have a local JSON file which I intend to read/write from a Node.js Electron app. I am not sure, but I believe that instead of using readFile() and writeFile(), I should get a FileHandle to avoid repeated open and close actions.
So I've tried to grab a FileHandle from fs.promises.open(), but the problem seems to be that I am unable to get a FileHandle for an existing file without truncating it and clearing it to 0 bytes.
const { resolve } = require('path');
const fsPromises = require('fs').promises;

function init() {
  // Save table name
  this.path = resolve(__dirname, '..', 'data', `test.json`);
  // Create/Open the json file
  fsPromises
    .open(this.path, 'wx+')
    .then(fileHandle => {
      // Grab the file handle if the file doesn't exist,
      // because of the flag 'wx+'
      this.fh = fileHandle;
    })
    .catch(err => {
      if (err.code === 'EEXIST') {
        // File exists
      }
    });
}
Am I doing something wrong? Are there better ways to do it?
Links:
https://nodejs.org/api/fs.html#fs_fspromises_open_path_flags_mode
https://nodejs.org/api/fs.html#fs_file_system_flags
Because JSON is a text format that has to be read or written all at once and can't easily be modified or appended to in place, you're going to have to read the whole file or write the whole file at once.
So, your simplest option will be to just use fs.promises.readFile() and fs.promises.writeFile() and let the library open the file, read/write it, and close it. Opening and closing a file in a modern OS takes advantage of disk caching, so reopening a file you opened not long ago is not a slow operation. Further, since Node.js performs these operations in worker threads via libuv, they don't block the main thread either, so this is generally not a performance issue for your server.
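A minimal sketch of that approach (the helper names and pretty-printing are assumptions):

const fsPromises = require('fs').promises;

// Read and parse the whole JSON file
async function readJson(path) {
  const text = await fsPromises.readFile(path, 'utf8');
  return JSON.parse(text);
}

// Serialize and overwrite the whole JSON file
function writeJson(path, obj) {
  return fsPromises.writeFile(path, JSON.stringify(obj, null, 2), 'utf8');
}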
If you really wanted to open the file once and hold it open, you would open it for reading and writing using the r+ flag as in:
const fileHandle = await fsPromises.open(this.path, 'r+');
Reading the whole file is simple, as the fileHandle object has a .readFile() method:
const text = await fileHandle.readFile({encoding: 'utf8'});
For writing the whole file from an open filehandle, you would have to truncate the file, then write your bytes, then flush the write buffer to ensure the last bit of the data got to the disk and isn't sitting in a buffer.
await fileHandle.truncate(0); // clear previous contents
let {bytesWritten} = await fileHandle.write(mybuffer, 0, someLength, 0); // write new data
assert(bytesWritten === someLength);
await fileHandle.sync(); // flush buffering to disk

How to download a file in the browser directly from the node.js server side without any variable?

My app is built with MEAN and I am a Docker user too. The purpose of my app is to create and download a CSV file. I have already created my file, compressed it, and placed it in a temp folder (the file will be removed after the download). This part is on the Node.js server side and works without problems.
I have already tried several things like res.download, which is supposed to download the file directly in the browser, but nothing happens. I tried to use a Blob in the AngularJS part, but it doesn't work.
The getData function creates and compresses the file (it exists; I can reach it directly when I look where the app is saved).
exports.getData = function getData(req, res, next){
  var listRequest = req.body.params.listURL;
  var stringTags = req.body.params.tagString;
  // The name of the compressed CSV file
  var nameFile = req.body.params.fileName;
  var query = url.parse(req.url, true).query;
  // The function which creates the file
  ApollineData.getData(listRequest, stringTags, nameFile)
    .then(function (response){
      var filePath = '/opt/mean.js/modules/apolline/client/CSVDownload/' + response;
      const file = fs.createReadStream(filePath);
      res.download(filePath, response);
    })
    .catch(function (response){
      console.log(response);
    });
};
My main problem is downloading this file directly in the browser without holding it in any variable, because it could be huge (like several GB). I want to download it and then delete it.
There is nothing wrong with res.download.
Probably the reason why res.download doesn't work for you is that you are using AJAX to fetch the resource. Do a regular navigation instead. Or, if it requires some POST data and another method, create a form and submit it.
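For example, a sketch of both options on the client side (the /getData URL and the fileName field are assumptions):

// Option 1: plain navigation, for a GET endpoint
window.location.href = '/getData?fileName=report.zip';

// Option 2: build and submit a form, for a POST endpoint
var form = document.createElement('form');
form.method = 'POST';
form.action = '/getData';

var input = document.createElement('input');
input.type = 'hidden';
input.name = 'fileName';
input.value = 'report.zip';
form.appendChild(input);

document.body.appendChild(form);
form.submit();

Either way the browser, not an XMLHttpRequest, receives the response, so res.download can stream the file to disk without it ever living in a JavaScript variable.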

fs.write json file to capture and save streaming json file

I want a JSON stream stored in a text file. When running the Node server, the JSON isn't appended to the json.txt file. What am I missing? I'm new to Node, so be gentle...
Here is the code chunk I expect to capture the JSON content:
var fs = require('fs');
fs.writeFile("json.txt", {encoding:"utf8"}, function(err) {
  if (err) {
    console.log(err);
  } else {
    console.log("The file was saved!");
  }
});
The issue is that you aren't using the correct parameters. When calling fs.writeFile, it expects a string for the filename, a buffer or string for the content, an object for the options, and a callback function. What you're doing is passing the options as the second parameter, where it expects a buffer or a string. Correction below:
var fs = require('fs');
fs.writeFile("json.txt", JSON.stringify({some: 'object'}), {encoding: "utf8"}, function(err) {
  if (err) {
    console.log(err);
  } else {
    console.log("The file was saved!");
  }
});
You can replace the JSON.stringify part with some plain text if you want to, but you specified JSON in your question, so I assumed you wanted to store an object in the file.
Source (NodeJS documentation)
EDIT:
The links to other questions in the comments may be more relevant if you want to append new lines to the end of the file rather than completely overwrite the old one (see the fs.appendFile sketch below). However, I made the assumption that fs.writeFile was the intended function; if that wasn't the intention, those other questions will help a lot more.
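For appending instead of overwriting, a minimal sketch using fs.appendFile (the one-record-per-line layout is an assumption):

var fs = require('fs');

// Append one JSON record per line instead of overwriting the file
fs.appendFile("json.txt", JSON.stringify({some: 'object'}) + "\n", {encoding: "utf8"}, function(err) {
  if (err) {
    console.log(err);
  } else {
    console.log("The record was appended!");
  }
});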
UPDATE:
It seems the issue was that the request body wasn't being parsed, so when the POST request came through, Node didn't have the request body. To fix this, the following code is needed during the Express configuration:
var bodyParser = require('body-parser');
app.use(bodyParser.json());
This uses the npm module body-parser. It converts the JSON body to a JavaScript object, which is then accessible via req.body.
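Putting it together, a sketch of a route that captures a posted JSON body and writes it to the file (the /capture path and port are assumptions):

var express = require('express');
var bodyParser = require('body-parser');
var fs = require('fs');

var app = express();
app.use(bodyParser.json());

// Write the parsed request body to json.txt
app.post('/capture', function (req, res) {
  fs.writeFile("json.txt", JSON.stringify(req.body), {encoding: "utf8"}, function (err) {
    if (err) {
      return res.status(500).send(err.message);
    }
    res.send("The file was saved!");
  });
});

app.listen(3000);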

Displaying images by relative path in Node.js

I am building my first Node.js MVC app (native Node, not using Express) and am having trouble displaying images from my HTML files via their relative paths.
I'll spare you my server.js and router.js code, but here is my controller code, which is basically how I'm loading my views:
var fs = require("fs");
var controller = require("controller");
var load_view = 'view_home';
var context = { foo : 'bar' };
controller.load_view(res, req, fs, load_view, context);
this gets passed to...
var url = require('url');
var Handlebars = require('handlebars');

function load_view(res, req, fs, view, context, session){
  // Files to be loaded, in order
  var files = [
    'view/elements/head.html',
    'view/elements/header.html',
    'view/' + view + '.html',
    'view/elements/footer.html'
  ];
  // Start read requests, each with a callback that has an extra
  // argument bound to it so it will be unshifted onto the callback's
  // parameter list
  for (var i = 0; i < files.length; ++i)
    fs.readFile(files[i], handler.bind(null, i));

  var count = 0;
  function handler(index, err, content) {
    // Make sure we don't send more than one response on error
    if (count < 0) return;
    if (err) {
      count = -1;
      console.log('Error for file: ' + files[index]);
      console.log(err);
      res.writeHead(500);
      return res.end();
    }
    // Reuse our `files` array by simply replacing filenames with
    // their respective content
    files[index] = content;
    // Check if we've read all the files and write them in order if
    // we are finished
    if (++count === files.length) {
      res.writeHead(200, { 'Content-Type': 'text/html' });
      for (var i = 0; i < files.length; ++i) {
        var source = files[i].toString('utf8');
        // Handlebars
        var template = Handlebars.compile(source);
        var html = template(context);
        res.write(html); /* WRITE THE VIEW (FINALLY) */
      }
      res.end();
    }
  } // handler()
} // load_view()
Finally, here is my HTML with the img tag and the relative path in the src attribute:
<div class="help">
  <img src="../public/img/test.jpg" />
</div>
My file system, as it relates to the above, is as follows: [file tree screenshot omitted]
I am certain the relative path is correct, but I have tried all combinations of relative paths and even an absolute path. Still, no image is displayed.
Coming from a LAMP background, accessing images in the file tree is trivial, but I realize the server is set up much differently here (since I was the one who set it up).
I access other files (like stylesheets) in the filesystem by creating a separate controller to load those files explicitly, and I access that controller using a URI. But this approach is impractical for an indeterminate number of images.
How do I access & display my images in the filesystem with Node.js?
I access other files (like stylesheets) in the filesystem by creating a separate controller to load those files explicitly, and I access that controller using a URI.
You need to have a controller. That's how the code translates the URL that the browser requests into a response.
But this approach is impractical for an indeterminate number of images.
Write a generic one, then. Convert every URL that doesn't match another controller, or that starts with a certain path, or that ends with .jpg, or whatever logic you like, into a file path based on the URL.
Then read the contents of the file and output a suitable HTTP header (including the right Content-Type) followed by the data.
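A minimal sketch of such a generic controller (the public/ directory and the extension map are assumptions):

var fs = require('fs');
var path = require('path');

// Map file extensions to content types
var types = {
  '.jpg': 'image/jpeg',
  '.png': 'image/png',
  '.css': 'text/css'
};

function serve_static(res, req) {
  // Translate the URL into a path under public/, blocking '..' traversal
  var safe = path.normalize(req.url).replace(/^(\.\.[\/\\])+/, '');
  var file = path.join(__dirname, 'public', safe);
  fs.readFile(file, function (err, content) {
    if (err) {
      res.writeHead(404);
      return res.end();
    }
    var type = types[path.extname(file)] || 'application/octet-stream';
    res.writeHead(200, { 'Content-Type': type });
    res.end(content);
  });
}

The router would dispatch to serve_static for any URL that doesn't match another controller.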

Get updated file in each request nodejs

Basically, I wrote a server that responds with a JS file (in object format) to users who make a request. The JS file is generated from two config files; I call them config1.js and config2.js.
Here is my code:
var express = require('express');
var app = express();
var _ = require('underscore');

app.use('/config.js', function (req, res) {
  var config1 = require('config1');
  var config2 = require('config2');
  var config = _.extend(config1, config2);
  res.set({'Content-Type': 'application/javascript'});
  res.send(JSON.stringify(config));
});
To my understanding, every time I make a request to /config.js, it should fetch the latest code in the config1 and config2 files, even after I start the server. However, if I start the server, make some modification in config1.js, and then make the request, it still returns the old code. Can anyone explain this and how to fix it? Thanks.
You should not use require to load your files, because that is not its purpose: require caches the loaded file, which is why you get the same content on every request.
Use a tool like concat-files instead, or concatenate them "by hand" if you prefer.
Concatenating files and extending objects are not the same operation, though. You can read the files via the 'fs' module, parse the objects, extend, and send:
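A sketch of that approach, re-reading the files on each request (the JSON file paths are assumptions, and this assumes the configs are plain JSON rather than JS modules):

var express = require('express');
var fs = require('fs');
var _ = require('underscore');
var app = express();

app.use('/config.js', function (req, res) {
  // Read both files on every request so edits are picked up without a restart
  var config1 = JSON.parse(fs.readFileSync('./config1.json', 'utf8'));
  var config2 = JSON.parse(fs.readFileSync('./config2.json', 'utf8'));
  var config = _.extend(config1, config2);
  res.set({'Content-Type': 'application/javascript'});
  res.send(JSON.stringify(config));
});

app.listen(3000);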
