How to compress json response size in Firebase cloud functions? - javascript

I have a Firebase callable function that reads data from a Cloud SQL database (MySQL).
This is the code:
// Get records
exports.getRecords = functions.region("europe-west1").https.onCall((data, context) => {
  // get data
  const table_name = data.table_name
  return mysqlPromise.createPool(db.connectionOptions)
    .then(pool => {
      return pool.query(`
        SELECT * FROM ${table_name};`
      )
    })
    .then(res => {
      return { result: res };
    })
    .catch(error => {
      throw new functions.https.HttpsError('failed-precondition', error);
    });
});
If I run the function on a table with few rows (up to 10,000) it works perfectly, but if the table is bigger (100K records) the execution fails with a "CORS error".
Analyzing the response, I noticed that a query returning 100K rows produces a response of around 25 MB.
I know that the quota for Firebase functions is 10 MB per response, and I am pretty sure the error is due to that (even if "CORS" sounds strange).
The question is:
"Is there any way to run that query and get the 100K-row JSON back to me?"
Maybe I have to compress the response, or something else... what is the best solution?
Thanks in advance! :)
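One way to stay under the 10 MB callable-response limit is to page the results instead of returning the whole table at once; a minimal sketch, assuming the same mysqlPromise pool and hypothetical page / page_size parameters sent by the client:
// Sketch: return one page of rows per call; page and page_size are hypothetical client-supplied parameters
exports.getRecordsPage = functions.region("europe-west1").https.onCall((data, context) => {
  const table_name = data.table_name;
  const page_size = Math.min(data.page_size || 1000, 5000); // cap the page size
  const offset = (data.page || 0) * page_size;
  return mysqlPromise.createPool(db.connectionOptions)
    .then(pool => pool.query(
      'SELECT * FROM ?? LIMIT ? OFFSET ?;',   // ?? escapes the table name, ? the values
      [table_name, page_size, offset]
    ))
    .then(rows => ({ result: rows, page: data.page || 0 }))
    .catch(error => {
      throw new functions.https.HttpsError('failed-precondition', error.message);
    });
});
The client then calls the function repeatedly, incrementing page, until it receives fewer than page_size rows.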

Related

How to send real time response from server to client using Node.js express

On a POST request the server generates data after a few seconds, roughly 1,000 to 10,000 entries. Currently I'm saving the data into a CSV file and that works fine. How can I pass the data to the client as a JSON array?
The Name and Age variables receive new data every few seconds.
app.post('/', (req, res) => {
  // currently writing to results.csv; it works fine, but I want to send this real-time (Name and Age) data to the client
  const stream = createWriteStream('./results.csv', { flags: 'a', encoding: 'utf8' })
  // Append evaluation from response to file
  stream.write(`${Name}, ${Age}\n`)
  // example data: Patrick, 32
  // End stream to avoid accumulation
  stream.end()
})
res.send() only sends the first row, but the Name and Age variables get updated every 10 seconds.
app.listen(port, () => {
  console.log("App is running on port " + port);
});
To parse the data into a JSON array, you would have to read the stream before sending the data. Example:
let rows = []
fs.createReadStream('./results.csv')
  .pipe(parse())
  .on("error", (error) => console.error(error))
  .on("data", (row) => rows.push(row))
  .on("end", () => res.send(rows))
But if you want to send the csv file data, do something like this:
return res.sendFile("./results.csv", options, (err) => {
  if (err) console.error(err)
  console.log("File has been sent")
})
NOTE: The parse() method used in the first example comes from the fast-csv package.
Also, using a package like fast-csv makes it easy to write data to the CSV file with headers: true. When reading that data back, the parse method has to be told about the headers that were written, so it can use those values as the JSON keys, as shown below.
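For example, if the CSV was written with a header row (Name,Age), a minimal sketch of reading it back with fast-csv and headers: true, so each row becomes an object keyed by those column names (route name and port are just placeholders):
const fs = require('fs');
const express = require('express');
const { parse } = require('fast-csv');

const app = express();

app.get('/results', (req, res) => {
  const rows = [];
  fs.createReadStream('./results.csv')
    .pipe(parse({ headers: true }))                          // use the first row (Name,Age) as object keys
    .on('error', (error) => res.status(500).send(error.message))
    .on('data', (row) => rows.push(row))                     // e.g. { Name: 'Patrick', Age: '32' }
    .on('end', () => res.json(rows));                        // send the whole file as a JSON array
});

app.listen(3000);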

Problems with API Call and mongodb

I have a problem inserting the result of an API call (made with request) into MongoDB with Mongoose. Need your help!!! I call the API on an interval, delete all the data with deleteMany, and then fill the data again with insertMany. That works well, but the problem is that my _id is deleted every time and reassigned on insert, so queries that use _id break. Is there a more elegant way to update the data in MongoDB every 10 minutes? How can I solve this problem, any ideas? My goal is to save the API response from the URL and then refresh it every 10 minutes. I am very grateful for any help.
const updateBaseCoin = async () => {
  request(
    {
      url:
        'https://api.coingecko.com/api/v3/coins/markets?vs_currency=eur&order=market_cap_desc&per_page=250&page=1&sparkline=false&price_change_percentage=2h%2C1h%2C24h%2C7d%2C14d%2C30d%2C200d%2C1y%2C5y',
      json: true,
    },
    async (error, response, body) => {
      try {
        if (error) {
          console.log(error);
        } else {
          await BaseCoin.deleteMany({});
          await BaseCoin.insertMany(body.map((item) => item));
        }
      } catch (e) {
        console.log(e);
      }
    }
  );
};
// Note: 10 * 10 * 1000 ms is 100 seconds, not 10 minutes.
setInterval(updateBaseCoin, 10 * 10 * 1000);
First, a naive way to solve the _id problem is to change the map like this (or something similar, using another field as the id):
await BaseCoin.insertMany(body.map((item) => ({_id: item.id, ...item})));
Second, this is not the best way to handle data fetched from another source.
You can create a mechanism that performs update/upsert operations based on the data already in the DB, instead of deleting and re-inserting everything.
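A minimal sketch of that idea, using Mongoose's bulkWrite with upserts keyed on the CoinGecko id field, so existing documents keep their _id and are simply updated in place (COINGECKO_URL stands for the long markets URL from the question):
const updateBaseCoin = () => {
  request(
    { url: COINGECKO_URL, json: true },   // same options as in the question, URL shortened here
    async (error, response, body) => {
      if (error) return console.log(error);
      try {
        // One upsert per coin: match on the API's `id`, update the fields in place,
        // insert only if the coin is not in the collection yet.
        await BaseCoin.bulkWrite(
          body.map((item) => ({
            updateOne: {
              filter: { id: item.id },
              update: { $set: item },
              upsert: true,
            },
          }))
        );
      } catch (e) {
        console.log(e);
      }
    }
  );
};

setInterval(updateBaseCoin, 10 * 60 * 1000); // every 10 minutes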

Error INTERNAL from Firebase Cloud Function

There is myFunction in Firebase Cloud Functions:
const myFunctionHandler = (params, context) => {
  return new Promise((resolve, reject) => {
    return admin
      .database()
      .ref("checks")
      .push(params)
      .then((r) => resolve(r))
      .catch((e) => reject(e));
  });
};
exports.myFunction = functions.https.onCall(myFunctionHandler);
this is how I call it from the client:
functions()
  .httpsCallable('myFunction')({
    myParam: true,
  })
  .then(resp => console.log(resp))
  .catch(e => console.log(e));
In the Firebase Cloud Functions these are the logs:
Function execution started
Unhandled error RangeError: Maximum call stack size exceeded
at Function.mapValues (/workspace/node_modules/lodash/lodash.js:13426:7)
at encode (/workspace/node_modules/firebase-functions/lib/providers/https.js:183:18)
at encode (/workspace/node_modules/firebase-functions/lib/providers/https.js:157:16)
at /workspace/node_modules/lodash/lodash.js:13427:38
at /workspace/node_modules/lodash/lodash.js:4925:15
at baseForOwn (/workspace/node_modules/lodash/lodash.js:2990:24)
at baseForOwn (/workspace/node_modules/lodash/lodash.js:2990:24)
at Function.mapValues (/workspace/node_modules/lodash/lodash.js:13426:7)
Function execution took 2438 ms, finished with status code: 500
After 2438ms the data is entered correctly in Firebase Realtime Database but the response gives [Error: INTERNAL]. Why?
[EDIT]
I've tried to copy the same database push function in the client like this:
database()
  .ref()
  .child(`checks`)
  .push(params)
  .then(r => resolve(r))
  .catch(e => reject(e));
and the response I get is: https://myApp.firebaseio.com/checks/-MHABiZl5lsDBLSP22-3, which is positive feedback telling me the info is stored correctly.
I expect the same positive response from the Cloud Function, BUT what I get is [Error: INTERNAL].
Receiving an Error as the response from the Cloud Function gives me the impression that the info is not stored correctly.
Callable functions send a JSON payload to the client. That's all they can send. When you return a promise from a callable function, it will attempt to serialize the object resolved from the promise. The object you're trying to send (the result of push() is a ThenableReference) is too complex for serialization, so it fails. It contains self-referential links, which causes the error you see.
Your question still doesn't indicate exactly what the JSON payload is supposed to be. You're going to have to figure out what that is, and send only that object to the client. It obviously can't be a ThenableReference. If you need to convert it to something, figure out how to convert it, and send that object instead.
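For example, if all the client needs is the key of the newly pushed record, the handler can resolve with a plain object built from the reference instead of the reference itself (a sketch based on the code in the question):
exports.myFunction = functions.https.onCall((params, context) => {
  return admin
    .database()
    .ref("checks")
    .push(params)
    .then((ref) => ({ key: ref.key }));   // plain, JSON-serializable payload instead of the ThenableReference
});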
The same function gives a different response depending on whether it runs on the client or the server.
In the Cloud Function I simply avoid returning the result of the push.
On the client it returns the https link, as described in the Google documentation.
Also make sure the localhost:5001 Node server is not running; in my case it interfered.

How to send a huge data pool in a response?

I am new to the Node.js ecosystem and need some help.
I have a controller which is triggered when the user calls a URL.
async function get_all_information(req, res, next) {
  try {
    let start_date = req.query.start_date;
    let end_date = req.query.end_date;
    const binds = {};
    binds.start_date = start_date;
    binds.end_date = end_date;
    let query = `SOME LONG SQL STATEMENT`;
    await pool.query(query, binds, function (error, results) {
      if (error) throw error;
      console.log(results); // ~ 10 seconds
      res.send(JSON.stringify(results)); // ~ 15 seconds
    });
  } catch (error) {
    next(error);
  }
}
I tried to use this code but ran into a problem. The monthly data pool returned by the database has 227,011 rows. It seems the stringify method creates too big a JSON payload: Postman just crashes when I try to test it. I analyzed it and noticed that a daily data pool produces a JSON file of about 13 MB, so a monthly data pool could produce roughly 400 MB of JSON.
Then I tried streaming the query rows like this:
pool.query('HUGE SQL QUERY').stream({ highWaterMark: 5 }).pipe(res);
Unfortunately that code raises an error:
TypeError [ERR_INVALID_ARG_TYPE]: The first argument must be one of type string or Buffer. Received type object
Can someone show me how to correctly send huge data from a MySQL database in a response?
@akram I used the fs package as you advised. I use the following code to create a JSON file:
await pool.query(sql_statement, binds, function (error, results) {
  if (error) throw error;
  fs.writeFile("information.json", results, 'utf8', function (error) {
    if (error) throw error;
  });
});
This code creates a JSON file of about 3.5 MB for the monthly data pool. In the editor I get this message:
This document contains very long lines. Soft wraps were forcibly enabled to improve editor performance.
Also, that JSON file contains:
It seems too strange to me.
Try using JSONStream (https://github.com/dominictarr/JSONStream); I think it can resolve your issue. There are a lot of examples in the JSONStream docs.
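A sketch of that idea combined with the streaming query you already tried: pipe the row objects through JSONStream.stringify() so they are serialized incrementally into a JSON array instead of building one huge string in memory (assumes the same pool as in your controller):
const JSONStream = require('JSONStream');

function get_all_information(req, res, next) {
  try {
    res.setHeader('Content-Type', 'application/json');
    pool.query('HUGE SQL QUERY')
      .stream({ highWaterMark: 5 })   // emit row objects a few at a time
      .pipe(JSONStream.stringify())   // serialize the object stream into JSON array text
      .pipe(res);                     // write to the response as rows arrive
  } catch (error) {
    next(error);
  }
}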

Node (Express) - Trying to save a PDF from an API call

I've tried all sorts of things to get this to work. I'm trying to request a PDF from an API in Node, then send it back to the client that called it to begin with.
For the moment I just want to successfully save and view the PDF on the Node server.
The issue is that the PDF file is always empty when I open it (even though it has a size of 30 KB).
The basic flow is like this (I've removed a few bits, but the code below works and returns the PDF fine):
// We pass session IDs, request dates etc through in the body
app.post("/getPayslipURL", function(client_request, res) {
  // create request, which will simply pass on the data to the database (in order to get the NI number we need for the pay API)
  const NI_NUMBER_REQUEST = db_api.createRequestTemplate({
    body: JSON.stringify(client_request.body)
  });
  // Create a chain of HTTPS requests, starting with our call to the DB
  requestPromise(NI_NUMBER_REQUEST)
    .then((db_response) => {
      const PAY_API_OPTIONS = /* Code to generate options based on further DB info (includes dates etc) */
      return requestPromise(PAY_API_OPTIONS); // Call pay API
    })
    .then((pay_pdf_data) => {
      console.log(typeof pay_pdf_data); // It's a string
      // At this point I can log pay_pdf_data, but if I try to save it to file it's always empty,
      // no matter how I encode it etc
      fs.writeFile("./test.pdf", pay_pdf_data, 'binary', function(err) {
        if (err) {
          return console.log(err);
        }
        console.log("The file was saved!");
      });
    })
    .catch(err => `Error caught: ${console.log}`) // Catch any errors on our request chain
});
I've tried saving with and without the binary flag, as suggested in other posts, both in the file save as well as in the requests themselves. I've also tried various decoding methods, but I always get an empty PDF.
My return data looks like this (it's much bigger; when saved as test.pdf I get a 30 KB file, as mentioned before):
%PDF-1.4
%����
1 0 obj
0 obj
<
I've found a post which talks about piping the data all the way through; I have a feeling my pdf_data is corrupted when it gets converted to a string.
Any ideas how I would go about doing this with the current setup?
Edit: request-promise is a library; I could also use the standard request library if that's easier:
https://github.com/request/request-promise -
https://github.com/request/request
Thanks!
Your code doesn't work because the underlying request library (used by request-promise) requires the option encoding set to null for binary data - see https://github.com/request/request#requestoptions-callback.
Here's how you download binary data using that module -
app.post("/getPayslipURL", function(client_request, res) {
const NI_NUMBER_REQUEST = db_api.createRequestTemplate({
body: JSON.stringify(client_request.body),
encoding: null
});
requestPromise(NI_NUMBER_REQUEST)
.then((db_response) => {
const PAY_API_OPTIONS = /*Code to generate options based on furhter DB info (Includes dates etc)*/
return requestPromise(PAY_API_OPTIONS); // Call pay API
})
.then((pay_pdf_data) => {
fs.writeFile("./test.pdf", pay_pdf_data, 'binary', (err) => {
if(err) {
return console.log(err);
}
console.log("The file was saved!");
});
})
.catch(err => `Error caught: ${console.log}`) // Catch any errors on our request chain
});
}
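And once the download works, forwarding the PDF to the original caller instead of writing it to disk is just a matter of setting the content type on the response; a sketch of replacing the last two steps of the chain above (note that encoding: null needs to be set on the options for the request that actually downloads the PDF, so pay_pdf_data arrives as a Buffer):
    .then((pay_pdf_data) => {
      // pay_pdf_data is a Buffer thanks to encoding: null, so it can be sent on as-is
      res.set('Content-Type', 'application/pdf');
      res.send(pay_pdf_data);
    })
    .catch(err => {
      console.log(`Error caught: ${err}`);
      res.sendStatus(500);
    });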
