Basically, I wrote a server that responds with a JS file (object format) to users who make the request. The JS file is generated from two config files, which I call config1.js and config2.js.
Here is my code:
var express = require('express');
var app = express();
var _ = require('underscore');
app.use('/config.js', function (req, res) {
    var config1 = require('config1');
    var config2 = require('config2');
    var config = _.extend(config1, config2);
    res.set({'Content-Type': 'application/javascript'});
    res.send(JSON.stringify(config));
});
As I understand it, every time I make a request to /config.js, it should fetch the latest code from the config1 and config2 files, even after the server has started. However, if I start the server, make some modifications in config1.js, and then make the request, it still returns the old code. Can anyone explain this and how to fix it? Thanks.
You should not use require to load your files, because that is not its purpose: it caches the loaded module (see this post for more information), which is why you get the same content on every request.
Use a tool like concat-files instead, or concat it "by hand" if you prefer.
Concatenating files and extending objects are not the same operation. You can read the files via the 'fs' module, parse the objects, extend, and send.
My app is built with MEAN and I use Docker too. The purpose of my app is to create and download a CSV file. I have already created the file, compressed it, and placed it in a temp folder (the file will be removed after the download). This part is on the Node.js server side and works without problems.
I have already tried several things like res.download, which is supposed to download the file directly in the browser, but nothing happens. I tried to use a Blob on the AngularJS side, but it doesn't work.
The getData function creates and compresses the file (the file exists; I can reach it directly when I look where the app is saved).
exports.getData = function getData(req, res, next) {
    var listRequest = req.body.params.listURL;
    var stringTags = req.body.params.tagString;
    // The name of the compressed CSV file
    var nameFile = req.body.params.fileName;
    var query = url.parse(req.url, true).query;

    // The function which creates the file
    ApollineData.getData(listRequest, stringTags, nameFile)
        .then(function (response) {
            var filePath = '/opt/mean.js/modules/apolline/client/CSVDownload/' + response;
            const file = fs.createReadStream(filePath);
            res.download(filePath, response);
        })
        .catch(function (response) {
            console.log(response);
        });
};
My main problem is downloading this file directly in the browser without buffering it in a variable, because it could be huge (several GB). I want to download it and then delete it.
There is nothing wrong with res.download
The reason res.download doesn't work for you is probably that you are using AJAX to fetch the resource. Do a regular navigation instead. Or, if it requires some POST data or another method, create a form and submit it.
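A sketch of the "regular navigation" idea; the route and parameter names here are made up for illustration, not the actual API of the app in the question:

```javascript
// Build the download URL, then navigate to it instead of issuing an AJAX
// request, so the browser honors the Content-Disposition header that
// res.download sets and streams the file straight to disk.
function buildDownloadUrl(base, fileName) {
    return base + '?fileName=' + encodeURIComponent(fileName);
}

// In the browser (guarded so the snippet also loads under Node):
if (typeof window !== 'undefined') {
    window.location.href = buildDownloadUrl('/api/export', 'report.zip');
}
```

Because the browser, not JavaScript, consumes the response, nothing is held in a variable, which also sidesteps the several-GB concern.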
I'm having problems using 'pairs' in my JavaScript file; more specifically, whenever I run the code, pairs is not defined. I have tried everything I can think of, and this is what I think should work:
This is my Node.js file:
const fs = require('fs');
let data = fs.readFileSync('inputs.txt', 'utf8').split('\n');
global.pairs = data;
And this is my JavaScript file:
console.log(pairs)
Obviously it doesn't work, and instead of an array of inputs I get an error. I cannot just do the console.log in Node.js, because I need the inputs for other things that cannot be done in Node. Thank you for your help.
So, I'm having trouble parsing a PDF file from a GET request; I'm using the pdf2text lib.
I can parse files normally from the file path:
var pdfText = require('pdf2Text');
var fs = require('fs');

var buffer = fs.readFileSync('C:/myPdf.pdf');
pdfText(buffer).then(function (result) {
    // do some stuff with the result here
})
But I'm not sure how to get the buffer from an HTTP request. I tried new Buffer(response), but it didn't work (I'm using the request-promise library, by the way). Keep in mind that I don't really want to save the file, just read it as a buffer.
EDIT: What I'm trying to do:
request('http://blabla.com/pdfs/myPdf.pdf').then(function (response) {
    var buffer = new Buffer(response);
    pdfText(buffer).then(function (result) {
        // doesn't work with this buffer
    })
});
I guess this probably isn't doable with request-promise and I should use the standard request library, but I'm still not sure what I'm supposed to do.
I'm trying to parse a 300MB XML file into JSON in a worker Node.js app: the client makes a request to the main web app, the server makes a request to the worker server with the file location, and after the worker server finishes parsing the XML, it saves it to a JSON file and returns its location back to the main server.
Everything works fine with XML files under 130MB. However, with a larger file, the worker server finishes parsing and saves the file, but as soon as the response comes back to the main server it receives:
{ [Error: socket hang up] code: 'ECONNRESET' }
I have tried using timeout in the request options, but it still happens.
request
    .post({
        url: 'http://localhost:6666/parsexml',
        formData: { filePath: filePath },
        json: true
    }, function (err, httpResponse, jsonResObj) {
        // throws error here
    })
The XML parser worker returns a 200 response back to the server, and then it crashes.
Any suggestions on how I can implement this?
Maybe you have to use a different XML parser, for example xml-stream, which can handle large (500+ MB) files.
How to install it
Install it using the following command:
npm install xml-stream
How to use it in your app
Require it in your project and pass a ReadStream object to initialize it:
var fs = require('fs');
var XmlStream = require('xml-stream');

/*
 * Pass the ReadStream object to xml-stream
 */
var stream = fs.createReadStream('file_name.xml');
var xml = new XmlStream(stream);

/*
 * Code continues here
 */
If you want to extract only the values of, say, id from your XML and print them, this is the code to do so:
var fs = require('fs');
var XmlStream = require('xml-stream');

var stream = fs.createReadStream('some-large-XML.xml');
var xml = new XmlStream(stream);

xml.preserve('id', true);
xml.collect('subitem');
xml.on('endElement: id', function (item) {
    console.log(item);
});
More detailed information can be found here.
I need to parse a JSON object when the Node server.js (which is the entry point to my program) is started. The parsing of the JSON file is done in a different module in my project.
I have two questions:
Is it recommended to invoke the parse function with an event in the server.js file?
I read about the EventEmitter but am not sure how to invoke a function from a different module; an example would be very helpful.
I have multiple JSON files.
UPDATE, to make it clearer
If I read 3 JSON file objects (50 lines each) when the server/app is loaded (in the server.js file), this will be fast, I guess. My scenario is that the list of valid paths for the Express calls is in these JSON files:
app.get('/run1', function (req, res) {
    res.send('Hello World!');
});
So run1 should be defined in the JSON file (like a whitelist of paths). If the user requests run2, which I have not defined, I need to return an error. So I think that when the server starts up it should parse and keep an object with all the valid config paths, and when the user makes a call, just fetch that already-parsed object (loaded when the server started) and verify whether the path is OK. I think that's a better approach than doing the parsing on every call.
UPDATE 2
I'll try to explain more simply.
Let's assume that you have a whitelist of paths that you should listen on, like run1:
app.get('/run1', function
Those path lists are defined in JSON files inside your project, under a specific folder. Before every call to your application via Express, you should verify that the requested path is in the JSON path list. This is a given; the question is how to do it.
Currently I've developed a module which searches the JSON files and finds whether a specific path exists there.
Now I think the right solution is to invoke this functionality when the Node application starts, and keep the list of valid paths in some object which I can access easily during the user's call to check whether the path is there.
My question is how to send some event to the validator module when the Node app (server.js) is up, so that it provides this object.
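A sketch of the check itself, assuming the JSON files boil down to an array of allowed paths once parsed at startup; written as a factory so the middleware needs nothing from Express beyond the (req, res, next) signature:

```javascript
// Build an Express-style middleware from a whitelist of paths loaded
// once at startup (e.g. from the JSON files described above).
function makePathGuard(whitelist) {
    return function (req, res, next) {
        if (whitelist.indexOf(req.path) === -1) {
            return res.status(404).send('Unknown path: ' + req.path);
        }
        next();
    };
}

// Usage sketch, with assumed file and key names:
// var whitelist = require('./config/paths.json').paths;
// app.use(makePathGuard(whitelist));
```

Loading the whitelist once and closing over it means each request only pays for an in-memory lookup, which is the "parse at startup, check per call" approach described above.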
If it's a part of your application initialization, then you could read and parse this JSON file synchronously, using either fs.readFileSync and JSON.parse, or require:
var config = require('path/to/my/config.json');
Just make sure that the module handling this JSON loading is required in your application root before the app.listen call.
In this case the JSON data will be loaded and parsed by the time your server starts, and there will be no need to trouble yourself with callbacks or event emitters.
I can't see any benefits of loading your initial config asynchronously for two reasons:
The bottleneck of JSON parsing is the parser itself, but since it's synchronous, you won't gain anything here. So, the only part you'll be able to optimize is interactions with your file system (i.e. reading data from disk).
Your application won't be able to work properly until this data is loaded.
Update
If for some reason you can't make your initialization synchronous, you could delay starting your application until initialization is done.
The easiest solution here is to move the app.listen part inside the initialization callback:
// initialization.js
var glob = require('glob')
var path = require('path')

module.exports = function initialization (done) {
  var data = {}
  glob('./config/*.json', function (err, files) {
    if (err) throw err
    files.forEach(function (file) {
      var filename = path.basename(file)
      data[filename] = require(file)
    })
    done(data)
  })
}
// server.js
var initialization = require('./initialization')
var app = require('express')()

initialization(function (data) {
  app.use(require('./my-middleware')(data))
  app.listen(8000)
})
An alternative solution is to use a simple event emitter to signal that your data is ready:
// config.js
var glob = require('glob')
var path = require('path')
var events = require('events')

var obj = new events.EventEmitter()
obj.data = {}

glob('./config/*.json', function (err, files) {
  if (err) throw err
  files.forEach(function (file) {
    var filename = path.basename(file)
    obj.data[filename] = require(file)
  })
  obj.emit('ready')
})

module.exports = obj
// server.js
var config = require('./config')
var app = require('express')()

app.use(require('./my-middleware'))

config.on('ready', function () {
  app.listen(8000)
})