I'm trying to parse a 300 MB XML file into JSON in a worker Node.js app. The client makes a request to the main web app, and the main server sends a request to the worker server with the file location. After the worker server finishes parsing the XML, it saves the result to a JSON file and returns its location back to the main server.
Everything works fine with XML files under 130 MB. However, when it encounters a large file, the worker server finishes parsing and saves the file, but as soon as the response comes back to the main server it receives:
{ [Error: socket hang up] code: 'ECONNRESET' }
I have tried setting a timeout in the request options, but the error still occurs.
var request = require('request');

request.post({
  url: 'http://localhost:6666/parsexml',
  formData: { filePath: filePath },
  json: true
}, function (err, httpResponse, jsonResObj) {
  // the ECONNRESET error is thrown here
});
The XML parser worker returns a 200 response back to the server, and then it crashes.
Any suggestions on how I can implement this?
Maybe you should use a different XML parser, for example xml-stream, which can handle large (500+ MB) files.
How to install it
Install it in your project using the following command:
npm install xml-stream
How to use it in your app
Require it in your project and pass a ReadStream object to initialize it:
var fs = require('fs');
var XmlStream = require('xml-stream');

// Pass the ReadStream object to xml-stream
var stream = fs.createReadStream('file_name.xml');
var xml = new XmlStream(stream);

// Code continues here
If you want to extract only the values of, let's say, id from your XML and print them, here is the code to do so:
var fs = require('fs');
var XmlStream = require('xml-stream');

var stream = fs.createReadStream('some-large-XML.xml');
var xml = new XmlStream(stream);

xml.preserve('id', true);
xml.collect('subitem');
xml.on('endElement: id', function (item) {
  console.log(item);
});
More detailed information can be found here.
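If you also need to write the extracted data out to a JSON file, as in the original question, a minimal sketch could look like this (the item element name and the parsed.json output path are assumptions; adapt them to your XML structure):

var fs = require('fs');
var XmlStream = require('xml-stream');

var xml = new XmlStream(fs.createReadStream('some-large-XML.xml'));
var out = fs.createWriteStream('parsed.json');
var first = true;

out.write('[');
// Stream each parsed element into the output file as one JSON array entry,
// so the whole document is never held in memory at once.
xml.on('endElement: item', function (item) {
  out.write((first ? '' : ',') + JSON.stringify(item));
  first = false;
});
xml.on('end', function () {
  out.write(']');
  out.end();
});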
I am using a Node server to send a table from a SQLite DB to the browser. This table contains the filename and path of a PDF file that I want to render in the browser. Until now I was using hard-coded paths for the PDF file and rendering it. But now I have set up a GET route and a controller in Node so that whenever '/content' is hit in the browser, the server queries the database and sends the data to the client. To send the data I am using
res.render('content/index',{data:queryData});
Now, how do I access this data using client-side JavaScript so that I can pass the path of the PDF file to the function that renders the PDF? I have done some research, and the nearest answer I found was using XMLHttpRequest. I tried this method:
var xhr = new XMLHttpRequest();
const path = "http://localhost:3000/content";

xhr.onreadystatechange = function () {
  if (xhr.readyState == 4 && xhr.status == 200) {
    var myResponseText = xhr.responseText;
    console.log(myResponseText);
  }
};
xhr.open('GET', path, true);
xhr.send();
When I do this I get the entire HTML code for the view, not the data I expected. How do I solve this issue? I have done some more reading while writing this post, and I suppose I have to set a header somewhere? But the documentation says
app.render(view, [locals], callback)
which means res.render can take local variables; shouldn't that be setting the headers?
You should return JSON instead of rendering the template:
app.get('/content', (req, res) => {
  res.json({ data: queryData });
});
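On the client side, the XMLHttpRequest from the question will then receive JSON instead of HTML; a minimal sketch of consuming it (assuming the route above):

var xhr = new XMLHttpRequest();
xhr.onreadystatechange = function () {
  if (xhr.readyState == 4 && xhr.status == 200) {
    // Parse the JSON body returned by res.json()
    var queryData = JSON.parse(xhr.responseText).data;
    console.log(queryData);
  }
};
xhr.open('GET', 'http://localhost:3000/content', true);
xhr.send();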
I am using pdf.js
PDF.js needs the PDF file, e.g.:
pdfjsLib.getDocument('helloworld.pdf')
I'm assuming your queryData goes something like this:
{ filename: 'file.pdf', path: './path/to/file.pdf' }
I'm not sure what's in your content/index or what path this is on, but you obviously need to find a way to make your PDF file ('./path/to/file.pdf') available (as a download). See Express's built-in static server or res.download() to do that.
Once you have the PDF file available as a download, plug that path into PDF.js's .getDocument('/content/file.pdf') and do the rest to render the PDF onto the canvas or whatever.
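For example, a rough sketch of both halves; the ./path/to directory and the /content/file.pdf URL are placeholders rather than your actual paths, and how you render afterwards depends on your PDF.js version:

// Server side: make the folder that contains the PDFs available as static downloads
app.use('/content', express.static('./path/to'));

// Client side: point PDF.js at the resulting URL and render as usual
pdfjsLib.getDocument('/content/file.pdf');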
Hope that helps.
My app is built with MEAN and I am a Docker user too. The purpose of my app is to create and download a CSV file. I already create the file, compress it and place it in a temp folder (the file will be removed after the download). This part is on the Node.js server side and works without problems.
I have already tried several things like res.download, which is supposed to download the file directly in the browser, but nothing happens. I tried to use a Blob in the AngularJS part but it doesn't work.
The getData function creates and compresses the file (it exists; I can reach it directly when I look where the app is saved).
var fs = require('fs');
var url = require('url');

exports.getData = function getData(req, res, next) {
  var listRequest = req.body.params.listURL;
  var stringTags = req.body.params.tagString;
  // The name of the compressed CSV file
  var nameFile = req.body.params.fileName;
  var query = url.parse(req.url, true).query;

  // The function which creates the file
  ApollineData.getData(listRequest, stringTags, nameFile)
    .then(function (response) {
      var filePath = '/opt/mean.js/modules/apolline/client/CSVDownload/' + response;
      const file = fs.createReadStream(filePath); // note: unused, res.download streams the file itself
      res.download(filePath, response);
    })
    .catch(function (response) {
      console.log(response);
    });
};
My main problem is to download this file directly in the browser without loading it into a variable, because it could be huge (several GB). I want to download it and then delete it.
There is nothing wrong with res.download.
Probably the reason res.download doesn't work for you is because you are using AJAX to fetch the resource. Do a regular navigation instead, or, if it requires some POST data and a different method, create a form and submit it, as sketched below.
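A minimal sketch of both options on the browser side; the /api/getData URL and the fileName parameter are assumptions, not your actual route:

// Option 1: regular navigation (works if the route accepts GET)
window.location.href = '/api/getData?fileName=export.zip';

// Option 2: build and submit a form when the route expects POST data
var form = document.createElement('form');
form.method = 'POST';
form.action = '/api/getData';

var input = document.createElement('input');
input.type = 'hidden';
input.name = 'fileName';
input.value = 'export.zip';
form.appendChild(input);

document.body.appendChild(form);
form.submit();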
I was just introduced to Node-RED after asking around for suggestions on an IoT setup. I have a piece of JavaScript code that sends data in hex format to a WebSocket.
I am trying to replicate this using Node-RED, and I am having some trouble figuring out which node to use for sending the data.
Vanilla Javascript:
function connectToSocket() {
  // Try to connect to the socket
  try {
    // Create our socket connection
    connection = new WebSocket('ws://' + gatewayIP + ':8000');
    connection.binaryType = "arraybuffer";
  } catch (e) {
    // Failed to create the socket connection; log error message
    logMessage('Failed to connect to socket');
    return;
  }
}

connection.send('\x02\x00\x01\x04\x26\x2D');
I have tried sending this as a string and as a JSON object in msg.payload, but it does not trigger the device as I expect, the way it does when I run the normal JS function in a browser.
What would be an appropriate format to send this hex string in?
What you want to send is a buffer, and the inject node can't generate a buffer at this point. The easiest way to do this is to insert a function node between the inject node and the WebSocket Out node.
The function node should contain something like:
msg.payload = Buffer.from("\x02\x00\x01\x04\x26\x2D");
return msg;
This will swap the payload for a buffer with the right values.
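If you prefer to write the bytes as hex numbers instead of an escaped string, an equivalent function body (same six bytes, just spelled out differently) would be:

msg.payload = Buffer.from([0x02, 0x00, 0x01, 0x04, 0x26, 0x2D]);
return msg;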
EDIT:
For Node.js 0.10.x you should use something like the following instead, as Buffer.from() was introduced in Node.js 4.x:
msg.payload = new Buffer("\x02\x00\x01\x04\x26\x2D");
return msg;
I am using an API for a Twitch.tv streaming bot called DeepBot.
Here is the link to it on GitHub: https://github.com/DeepBot-API/client-websocket
My goal is to create a text document listing all the information pulled from the bot using the command api|get_users|. The bot's response is always a JSON object. How can I take the JSON object from the bot and save it as a text file?
Edit: My code
var WebSocket = require('ws');
var ws = new WebSocket('ws://Ip and Port/');

ws.on('open', function () {
  console.log('sending API registration');
  ws.send('api|register|SECRET');
});

ws.on('close', function close() {
  console.log('disconnected');
});

ws.on('message', function (message) {
  console.log('Received: ' + message);
});

ws.on('open', function () {
  ws.send('api|get_users|');
});
Well, that depends on what your setup is. You posted this under javascript, so I guess you are either:
using a browser to make the websocket connection, in which case there is no direct way to save a file on the client, but in HTML5 you can store key/value pairs with local storage;
using Node.js (server-side JavaScript), in which case the code is as below;
or some other setup that I can't guess, in which case you might tell a little more about it.
In browser with HTML5 capabilities:
// where msg is an object returned from the API
localStorage.setItem('Some key', JSON.stringify(msg));
In Node JS
var fs = require("fs"); // fs is a built-in Node.js module, no separate install needed
// where msg is an object returned from the API
fs.writeFile("some-file.json", JSON.stringify(msg), function (err) {
if (err) throw err;
});
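In your case you could call this straight from the message handler in your existing script; a minimal sketch, assuming the bot's reply is already a JSON string so it can be written as-is:

var fs = require('fs');

ws.on('message', function (message) {
  // Write the raw JSON reply from the bot to a text file
  fs.writeFile('users.json', message, function (err) {
    if (err) throw err;
  });
});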
Edit: OK, Thanks for clearing it up.
I believe Blag's solution is the way to go.
Good luck with your project!
If it's for a client-side JS save:
Create a file in memory for user to download, not through server
and
Convert JS object to JSON string
is what you need. (I haven't tested it, but it'll look like this:)
var j = {"name":"binchen"};
var s = JSON.stringify(j);
window.location = 'data:text/plain;charset=utf-8,'+encodeURIComponent(s);
Basically, I wrote a server that responds with a JS file (object format) to users who make a request. The JS file is generated from two config files; I call them config1.js and config2.js.
Here is my code:
var express = require('express');
var app = express();
var _ = require('underscore');

app.use('/config.js', function (req, res) {
  var config1 = require('config1');
  var config2 = require('config2');
  var config = _.extend(config1, config2);
  res.set({'Content-Type': 'application/javascript'});
  res.send(JSON.stringify(config));
});
From what I understand, every time I make a request to /config.js it should fetch the latest code in the config1 and config2 files, even after I start the server. However, if I start the server, make some modification in config1.js, and then make the request, it still returns me the old code. Can anyone explain this and how to fix it? Thanks.
You should not use require to load your files, because that is not its purpose: it caches the loaded file (see this post for more information), which is why you get the same content every time you make a request.
Use a tool like concat-files instead, or concatenate them "by hand" if you prefer.
Concatenating files and extending objects aren't equivalent operations. You can read the files via the 'fs' module, parse the objects, extend, and send.
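A minimal sketch of that approach; it assumes config1.js and config2.js contain plain JSON-style objects that JSON.parse can handle (if they are real JS modules you would need to evaluate them differently):

var express = require('express');
var fs = require('fs');
var _ = require('underscore');
var app = express();

app.use('/config.js', function (req, res) {
  // Re-read both files on every request so edits are picked up without a restart
  var config1 = JSON.parse(fs.readFileSync('./config1.js', 'utf8'));
  var config2 = JSON.parse(fs.readFileSync('./config2.js', 'utf8'));
  var config = _.extend(config1, config2);
  res.set({'Content-Type': 'application/javascript'});
  res.send(JSON.stringify(config));
});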