How to send a huge datapool in a response? - javascript

I am new to the node.js ecosystem and need some help.
I have a controller which is triggered when a user calls a URL.
async function get_all_information(req, res, next) {
  try {
    let start_date = req.query.start_date;
    let end_date = req.query.end_date;
    const binds = {};
    binds.start_date = start_date;
    binds.end_date = end_date;
    let query = `SOME LONG SQL STATEMENT`;
    await pool.query(query, binds, function (error, results) {
      if (error) throw error;
      console.log(results); // ~ 10 seconds
      res.send(JSON.stringify(results)); // ~ 15 seconds
    });
  } catch (error) {
    next(error);
  }
}
I tried to use this code but ran into a problem. The monthly datapool returned by the database has 227,011 rows. It seems the stringify method creates too large a JSON payload. The Postman application just crashes when I try to test it. I analyzed it and noticed that a daily datapool creates a JSON file of about 13 MB, so a monthly datapool could create a JSON file of about 400 MB.
Then I tried to stream the query rows like this:
pool.query('HUGE SQL QUERY').stream({ highWaterMark: 5 }).pipe(res);
Unfortunately, such code raises an error:
TypeError [ERR_INVALID_ARG_TYPE]: The first argument must be one of type string or Buffer. Received type object
Can someone show me how to correctly send huge data from a MySQL database in a response?
@akram I used the fs package as you advised. I use the following code to create a JSON file:
await pool.query(sql_statement, binds, function (error, results) {
  if (error) throw error;
  fs.writeFile("information.json", results, 'utf8', function (error) {
    if (error) throw error;
  });
});
This code creates a JSON file of about 3.5 MB for the monthly datapool. In the editor I get the following message:
This document contains very long lines. Soft wraps were forcibly enabled to improve editor performance.
Also, the contents of that JSON file seem very strange to me.
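A likely explanation (an assumption on my part, not from the thread): fs.writeFile is handed the raw results object, which gets coerced to a string like "[object Object],[object Object],...", matching the oddly small file. A minimal sketch that serializes the rows before writing:

const fs = require('fs');

pool.query(sql_statement, binds, function (error, results) {
  if (error) throw error;
  // serialize the rows first, otherwise writeFile receives a plain object
  fs.writeFile("information.json", JSON.stringify(results), 'utf8', function (error) {
    if (error) throw error;
  });
});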

Try using JSONStream (https://github.com/dominictarr/JSONStream); I think it can resolve your issue. There are a lot of examples in the JSONStream docs.
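For example, a minimal sketch of how that could look with the pool and Express handler from the question (the SQL statement and binds are placeholders taken from the question; the mysql row stream is piped through JSONStream.stringify() so the rows are serialized as one JSON array without buffering the whole result set):

const JSONStream = require('JSONStream');

function get_all_information(req, res, next) {
  const binds = { start_date: req.query.start_date, end_date: req.query.end_date };
  res.setHeader('Content-Type', 'application/json');
  pool.query(`SOME LONG SQL STATEMENT`, binds)
    .stream({ highWaterMark: 5 })   // emit row objects instead of buffering the whole result
    .on('error', next)              // forward query/stream errors to Express
    .pipe(JSONStream.stringify())   // serialize the row objects into a valid JSON array, chunk by chunk
    .pipe(res);                     // write to the HTTP response as rows arrive
}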

Related

Handling JSON Parsing Error in Node.js Stream

I'm working on a small, simple Node.js project that I expect to work autonomously with a public-facing API and my mongoDB server. For the most part everything is working smoothly, and I don't expect any problem hitting the API once an hour.
However, there are times when, instead of returning the body of data, the API returns a JSON-formatted error statement (see below). In these cases JSON.parse(data) throws an exception because the chunk is already JSON (and not the buffered bytes expected). Since this is a portfolio piece and others will be looking at the code, I want to be as prepared as possible for conditions I can't foresee over time, but I haven't found a good way of approaching the parsing problem yet.
At first I had assumed that .on("error", ...) would handle this error, but I guess because it's the JSON parser and not the stream that catches the exception, it will have to be handled there. I'm happy to just break out of readStream and get the data next hour, but I haven't found a configuration of try/catch that will do this. What are some ways to handle this exception so that it doesn't crash the server?
function readStream(stream) {
  return new Promise((resolve, reject) => {
    let saved = [];
    stream
      .on('data', function (data) {
        console.log('Debug | readStream data: ', data); // See Output Below
        const json = JSON.parse(data); // Exception here Line 37
        saved.push(json.data);
        if (saved.length > 4) {
          stream.destroy();
        }
      })
      .on('close', () => {
        resolve(saved);
      })
      .on('error', (error) => reject(error))
      .on('timeout', (error) => reject(error));
  });
}
// API sometimes returns
{
  title: 'ConnectionException',
  detail: 'This stream is currently at the maximum allowed connection limit.',
  connection_issue: 'TooManyConnections',
  type: URL
}
//Exception Text from console
undefined:1
[object Object]
^
SyntaxError: Unexpected token o in JSON at position 1
at JSON.parse (<anonymous>)
at PassThrough.<anonymous> (/Users/?????/Documents/JavaScript-Projects/website-dashbaord1-nodejs/src/API.js:37:23)
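One way to do that (a sketch, assuming the same Promise wrapper as in the question) is to catch the parse error inside the 'data' handler itself, so a bad chunk rejects the promise and stops the stream instead of crashing the process:

stream.on('data', function (data) {
  let json;
  try {
    json = JSON.parse(data);
  } catch (err) {
    stream.destroy();     // stop reading; try again next hour
    return reject(err);   // or resolve(saved) to keep what was collected so far
  }
  saved.push(json.data);
  if (saved.length > 4) {
    stream.destroy();
  }
});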

How to compress json response size in Firebase cloud functions?

I have a Firebase callable function that takes data from a Cloud SQL database (MySQL).
This is the code:
// Get records
exports.getRecords = functions.region("europe-west1").https.onCall((data, context) => {
  // get data
  const table_name = data.table_name
  return mysqlPromise.createPool(db.connectionOptions)
    .then(pool => {
      return pool.query(`
        SELECT * FROM ${table_name};`
      )
    })
    .then(res => {
      return { result: res };
    })
    .catch(error => {
      throw new functions.https.HttpsError('failed-precondition', error);
    });
});
If I run the function on a table with few rows (up to 10,000) it works perfectly, but if the table is bigger (100K records) the execution fails with a "CORS error".
Analyzing the response, I noticed that the response size for a query that returns 100K rows is around 25 MB.
I know that the quota limit for Firebase functions is 10 MB per response, and I am pretty sure the error is due to that (even if CORS sounds strange).
The question is:
"Is there any way to run that query that returns a 100k-row JSON to me?"
Maybe I have to compress the response, or something else ... what is the best solution?
Thanks in advance! :)
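One common workaround (a sketch, not from the thread; page_size and offset are hypothetical parameters supplied by the caller) is to page the query so each callable response stays under the 10 MB limit:

// Get records, one page at a time
exports.getRecords = functions.region("europe-west1").https.onCall((data, context) => {
  const tableName = data.table_name;
  const pageSize = Math.min(data.page_size || 10000, 10000); // hypothetical paging parameters
  const offset = data.offset || 0;
  return mysqlPromise.createPool(db.connectionOptions)
    .then(pool => pool.query('SELECT * FROM ?? LIMIT ? OFFSET ?', [tableName, pageSize, offset]))
    .then(rows => ({ result: rows, next_offset: offset + rows.length }))
    .catch(error => {
      throw new functions.https.HttpsError('failed-precondition', error.message);
    });
});

The client would then call the function repeatedly, passing back next_offset, until it receives fewer rows than page_size.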

How to fetch 10,000 rows of data while maintaining speed in nodejs

pool.getConnection(function (err, connection) {
  connection.query("SELECT * FROM ALLURELIBRARY", function (err, rows) {
    connection.release();
    if (err) throw err;
    console.log(rows);
    res.render('index', { title: 'AllureCostCenter', data: rows });
  });
});
This request gives me 10,000 rows from my Cloud SQL database. It takes about 5 to 10 seconds to process. Would you please tell me a better way to return this large amount of data in nodejs without the time delay?
The more data you have, the longer it will take to retrieve it. That's normal behavior. If you fix it for 10,000 today, you'll hit the same problem tomorrow with 15,000.
Instead of performing one request and waiting for all the data to load, you can use cursors. Cursors allow you to retrieve some data, process it, and repeat until you have processed all the data.
Here is the cursor documentation for google-cloud.
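To illustrate the same idea with the node mysql driver used in the question (a sketch; handleRow is a hypothetical per-row handler):

pool.getConnection(function (err, connection) {
  if (err) throw err;
  connection.query("SELECT * FROM ALLURELIBRARY")
    .on('error', function (err) {
      connection.release();
      throw err;
    })
    .on('result', function (row) {
      connection.pause();            // stop the flow while this row is processed
      handleRow(row, function () {   // hypothetical async handler for one row
        connection.resume();         // ask for the next row once done
      });
    })
    .on('end', function () {
      connection.release();
    });
});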

Node (Express) - Trying to save a PDF from an API call

I've tried all sorts of things to get this to work. I'm trying to request a PDF from an API in Node, then send it back to the client that called it to begin with.
For the minute I just want to successfully save and view the PDF on the Node server.
The issue is that the PDF file is always empty when I open it (even though it has a size of 30 kB).
The basic flow is like this (I've removed a few bits, but the below code works and returns me the PDF fine):
// We pass through session IDs, request dates etc. in the body
app.post("/getPayslipURL", function(client_request, res) {
  // create request, which will simply pass on the data to the database (in order to get the NI number we need for the pay API)
  const NI_NUMBER_REQUEST = db_api.createRequestTemplate({
    body: JSON.stringify(client_request.body)
  });
  // Create a chain of HTTPS requests, starting with our call to the DB
  requestPromise(NI_NUMBER_REQUEST)
    .then((db_response) => {
      const PAY_API_OPTIONS = /* Code to generate options based on further DB info (includes dates etc.) */
      return requestPromise(PAY_API_OPTIONS); // Call pay API
    })
    .then((pay_pdf_data) => {
      console.log(typeof pay_pdf_data); // It's a string
      // At this point I can log pay_pdf_data, but if I try to save it to a file it's always empty,
      // no matter how I encode it etc.
      fs.writeFile("./test.pdf", pay_pdf_data, 'binary', function(err) {
        if (err) {
          return console.log(err);
        }
        console.log("The file was saved!");
      });
    })
    .catch(err => console.log(`Error caught: ${err}`)); // Catch any errors on our request chain
});
I've tried saving with and without the binary flag as suggested in other posts, both in the file save as well as within the requests themselves. I've also tried various types of decoding methods, and I always get an empty PDF saved.
My return data looks like this (it's much bigger; when saved as test.pdf I get a 30 kB file as mentioned before):
%PDF-1.4
%����
1 0 obj
0 obj
<
I've found a post which talks about piping the data all the way through; I have a feeling my pdf_data is corrupted when it gets converted to a string.
Any ideas how I would go about doing this with the current setup?
Edit: RequestPromise is a library; I could also use the standard request library if it's easier:
https://github.com/request/request-promise
https://github.com/request/request
Thanks!
Your code doesn't work because the underlying request library (used by request-promise) requires the option encoding to be set to null for binary data - see https://github.com/request/request#requestoptions-callback.
Here's how you download binary data using that module:
app.post("/getPayslipURL", function(client_request, res) {
const NI_NUMBER_REQUEST = db_api.createRequestTemplate({
body: JSON.stringify(client_request.body),
encoding: null
});
requestPromise(NI_NUMBER_REQUEST)
.then((db_response) => {
const PAY_API_OPTIONS = /*Code to generate options based on furhter DB info (Includes dates etc)*/
return requestPromise(PAY_API_OPTIONS); // Call pay API
})
.then((pay_pdf_data) => {
fs.writeFile("./test.pdf", pay_pdf_data, 'binary', (err) => {
if(err) {
return console.log(err);
}
console.log("The file was saved!");
});
})
.catch(err => `Error caught: ${console.log}`) // Catch any errors on our request chain
});
}
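One assumption worth making explicit: the elided PAY_API_OPTIONS would presumably also need encoding: null, since that is the request which actually returns the PDF bytes. Something like (the URI here is hypothetical):

const PAY_API_OPTIONS = {
  uri: PAY_API_URL,  // hypothetical URL built from db_response
  encoding: null     // ask request for a raw Buffer instead of a UTF-8 string
};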

writestream and express for json object?

I might be out of my depth, but I really need something to work. I think a write/read stream will solve both my issues, but I don't quite understand the syntax or what's required for it to work.
I read the stream handbook and thought I understood some of the basics, but when I try to apply it to my situation, it seems to break down.
Currently I have this as the crux of my information.
function readDataTop(x) {
  console.log("Read " + x[6] + " and Sent Cached Top Half");
  jf.readFile("loadedreports/top" + x[6], 'utf8', function (err, data) {
    resT = data
  });
};
I'm using the jsonfile plugin for node, which basically shortens fs.write and makes it easier to write instead of constantly writing try/catch blocks for fs.write and read.
Anyway, I want to implement a stream here, but I am unsure of what would happen on my express end and how the object will be received.
I assume that since it's a stream, express won't do anything with the object until it receives it? Or would I have to write a callback to also make sure that when my function is called, the stream is complete before express sends the object off to fulfill the ajax request?
app.get('/:report/top', function(req, res) {
  readDataTop(global[req.params.report]);
  res.header("Content-Type", "application/json; charset=utf-8");
  res.header("Cache-Control", "max-age=3600");
  res.json(resT);
  resT = 0;
});
I am hoping that if I change the read part to a stream it will alleviate two problems. The first is sometimes receiving partial JSON files when the browser makes the ajax call, due to the read speed of larger JSON objects. (This might be the callback issue I need to solve, but a stream should make it more consistent.)
Secondly, when I load this node app, it needs to run 30+ file writes while it gets the data from my DB. The goal was to disconnect the browser from the DB side so node acts as the DB by reading and writing. This is because an old SQL server is already being bombarded by a lot of requests (stale data isn't an issue).
Any help on the syntax here?
Is there a tutorial I can see in code of someone piping a response into a write stream? (The mssql node module I use puts the SQL response into an object and I need it in JSON format.)
function getDataTop(x) {
  var connection = new sql.Connection(config, function(err) {
    var request = new sql.Request(connection);
    request.query(x[0], function(err, topres) {
      jf.writeFile("loadedreports/top" + x[6], topres, function(err) {
        if (err) {
          console.log(err);
        } else {
          console.log(x[6] + " top half was saved!");
        }
      });
    });
  });
};
Your problem is that you're not waiting for the file to load before sending the response. Use a callback:
function readDataTop(x, cb) {
  console.log('Read ' + x[6] + ' and Sent Cached Top Half');
  jf.readFile('loadedreports/top' + x[6], 'utf8', cb);
};

// ...

app.get('/:report/top', function(req, res) {
  // you should really avoid using globals like this ...
  readDataTop(global[req.params.report], function(err, obj) {
    // setting the content-type is automatically done by `res.json()`
    // cache the data here in-memory if you need to and check for its existence
    // before `readDataTop`
    res.header('Cache-Control', 'max-age=3600');
    res.json(obj);
  });
});
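For the second part of the question (getting the mssql response into a write stream as JSON), a rough sketch along the same lines as getDataTop; done is a hypothetical completion callback:

const fs = require('fs');

function getDataTop(x, done) {
  var connection = new sql.Connection(config, function (err) {
    if (err) return done(err);
    var request = new sql.Request(connection);
    request.query(x[0], function (err, topres) {
      if (err) return done(err);
      // serialize the result object, then stream it to disk
      var out = fs.createWriteStream("loadedreports/top" + x[6]);
      out.on('finish', function () { done(null); });
      out.on('error', done);
      out.end(JSON.stringify(topres));
    });
  });
}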