Problems with API Call and mongodb - javascript

I have a problem inserting the result of an API call (made with request) into MongoDB with Mongoose. I call the API on an interval, delete all documents with deleteMany, and then refill the collection with insertMany. That works, but the problem is that every _id is deleted and reassigned on each insert, so queries that rely on _id break. Is there a more elegant way to update the data in MongoDB every 10 minutes? My goal is to save the API response from the url and then refresh it every 10 minutes. Any ideas on how to solve this? I am very grateful for all help.
const updateBaseCoin = async () => {
  request(
    {
      url:
        'https://api.coingecko.com/api/v3/coins/markets?vs_currency=eur&order=market_cap_desc&per_page=250&page=1&sparkline=false&price_change_percentage=2h%2C1h%2C24h%2C7d%2C14d%2C30d%2C200d%2C1y%2C5y',
      json: true,
    },
    async (error, response, body) => {
      try {
        if (error) {
          console.log(error);
        } else {
          await BaseCoin.deleteMany({});
          await BaseCoin.insertMany(body.map((item) => item));
        }
      } catch (e) {
        console.log(e);
      }
    }
  );
};
setInterval(updateBaseCoin, 10 * 60 * 1000); // 10 minutes (the original 10 * 10 * 1000 is only 100 seconds)

First, a naive way to solve the _id problem is to change the map like this (or do something similar with another field as the id) -
await BaseCoin.insertMany(body.map((item) => ({ _id: item.id, ...item })));
Second, this is not the best way to handle data fetching from another source.
You can create a mechanism that knows how to handle CRUD operations on the data based on what already exists in the DB.
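For instance, instead of deleteMany + insertMany, each document can be upserted by its stable API id, so existing _id values survive between refreshes. A minimal sketch of that idea, assuming each item returned by the API has an id field (CoinGecko's markets endpoint does); buildUpsertOps is a hypothetical helper name, not from the question:

```javascript
// Build a bulkWrite operation list that upserts each coin by its API id,
// so existing documents are updated in place and their _id values survive.
function buildUpsertOps(items) {
  return items.map((item) => ({
    updateOne: {
      filter: { id: item.id },  // match on the stable API id, not _id
      update: { $set: item },   // overwrite the document's fields
      upsert: true,             // insert only if the coin is new
    },
  }));
}

// Hypothetical usage with the BaseCoin model from the question:
// await BaseCoin.bulkWrite(buildUpsertOps(body));
```

Mongoose's Model.bulkWrite sends these operations in one round trip, so documents that disappear from the API would additionally need a cleanup pass if you want them removed.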

Related

Catching Error With Custom Rest API For FireStore Javascript

I have been watching a tutorial on making a REST API for Firestore which appears to work, but I cannot figure out how to catch an error.
The code below basically uses an endpoint to retrieve a document by id from the Firestore database.
The client uses javascript fetch to call the API.
I am trying to work out how to return something back to the client from the API if the document id is not there. I thought I might get a 404 status returned, but I always get status 200.
This is the API code I used
app.get("/api/read/:id", (req, res) => {
  (async () => {
    try {
      const document = db.collection("users").doc(req.params.id);
      let product = await document.get();
      let response = product.data();
      return res.status(200).send(response);
    } catch (error) {
      console.log(error);
      return res.status(500).send(error);
    }
  })();
})
I'm fairly certain a 404 would normally mean the endpoint itself was not found (though I do need to brush up on my error codes); in your case the request itself succeeds, so you get a 200 even when the document doesn't exist.
However, if you're looking to check whether a document exists, there's a check specifically for that, demonstrated in the examples in the Firebase docs
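For illustration, the branching could look like the sketch below. buildReadResult is a hypothetical helper (not from the tutorial) so the 404 logic can be seen on its own; it assumes the snapshot shape returned by document.get(), where the exists property is false for an unknown id:

```javascript
// Decide the HTTP response from a Firestore document snapshot:
// a snapshot for a missing id still resolves, but with exists === false.
function buildReadResult(snapshot) {
  if (!snapshot.exists) {
    return { status: 404, body: { error: "Document not found" } };
  }
  return { status: 200, body: snapshot.data() };
}

// Inside the route handler it would be used roughly like:
// const product = await db.collection("users").doc(req.params.id).get();
// const { status, body } = buildReadResult(product);
// return res.status(status).send(body);
```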

How to compress json response size in Firebase cloud functions?

I have a Firebase callable function that takes data from a Cloud SQL database (MySQL).
This is the code:
// Get records
exports.getRecords = functions.region("europe-west1").https.onCall((data, context) => {
  // get data
  const table_name = data.table_name
  return mysqlPromise.createPool(db.connectionOptions)
    .then(pool => {
      return pool.query(`SELECT * FROM ${table_name};`)
    })
    .then(res => {
      return { result: res };
    })
    .catch(error => {
      throw new functions.https.HttpsError('failed-precondition', error);
    });
});
If I run the function on a table with few rows (up to 10,000) it works perfectly, but if the table is bigger (100K records) the execution fails with a "CORS error".
Analyzing the response, I noticed that the response size for a query that returns 100K rows is around 25MB.
I know that the quota limit for Firebase functions is 10MB per response, and I am pretty sure the error is due to that (even if "CORS" sounds strange).
The question is:
"Is there any way to run that query that returns 100K rows of JSON to me?"
Maybe I have to compress the response, or something else... what is the best solution?
Thanks in advance! :)

Problem with React making Get request to Node(express)

As the title says, I have a part of my React app that tries to get some data from my database, running a select based on the value I pass to it. I'm going to first show the code where I think the problem lies:
First, this is the function from one of my forms that sends the request to the server. I know the code is probably ugly, but I can tell from the console.logs that the parameter I'm sending is what I intend to send (a string called "licenciaInput"):
async handleClickLicencia (event) {
  event.preventDefault();
  console.log(this.state);
  console.log("licenciaInput: " + this.state.licenciaInput);
  const datoBuscar = this.state.licenciaInput;
  axios.get('http://localhost:3001/atletas/:licencia', this.state)
    .then(response => {
      console.log(response)
    })
    .catch(error => {
      console.log(error)
    })
}
And then I have this function, which is called on that localhost route. It attempts to get "licencia" and run a select in my PostgreSQL DB where licencia = "whatever"; you can see the statement in the code:
const getAtletasByLicencia = (request, response) => {
  const licencia = request.body.licenciaInput;
  console.log("Request: " + request);
  console.log("what the server gets: " + licencia);
  // const licencia = request.licenciaInput;
  const sentencia = "SELECT * FROM atleta WHERE licencia ='" + licencia + "'";
  pool.query(sentencia, (error, results) => {
    if (error) {
      throw error
    }
    response.status(200).json(results.rows)
  })
}
As you can see, I have console.logs everywhere, and I still cannot access whatever element I send; I always get an "undefined" value on the server console.
TL;DR: How can I access the "licenciaInput" I passed from my client form to my server? I have tried request.body.licenciaInput, request.params.licenciaInput, and request.licenciaInput, but none of those seem to work.
I also know I have to process the data I receive from the server after that, but I need to solve this before looking two steps ahead. I'm also really new to React and Node/Express, so feel free to burn me with good practices I'm not meeting. Thanks in advance.
EDIT: I'm also adding this code, which shows the route for my method on the server:
app.get('/atletas/:licencia', db.getAtletasByLicencia)
#Gillespie59 suggested that I should send a POST request, but I don't think I should if I'm both trying to send a parameter to the server to make a select and then send the results back to the client.
Change your request to:
axios.get(`http://localhost:3001/atletas/${this.state.licenciaInput}`)
...
and your route (if you are using express) should look like this:
app.get('/atletas/:licencia', function (req, res) {
  var licencia = req.params.licencia
  ...
})
As you are using request.body you should send a POST request with axios and add a body.
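Putting the two pieces together, the server handler would read the route parameter from req.params and hand it to the database as a bound parameter rather than by string concatenation (the original WHERE clause built with + is open to SQL injection). A sketch under those assumptions; the pool is injected here so the handler logic can be shown on its own, and the $1 placeholder is the node-postgres parameter style:

```javascript
// Route handler sketch: reads :licencia from req.params and uses a
// parameterized query ($1) instead of concatenating it into the SQL string.
const getAtletasByLicencia = (pool) => (request, response) => {
  const licencia = request.params.licencia;
  pool.query(
    "SELECT * FROM atleta WHERE licencia = $1",
    [licencia],
    (error, results) => {
      if (error) {
        return response.status(500).json({ error: error.message });
      }
      response.status(200).json(results.rows);
    }
  );
};

// Hypothetical wiring, matching the route from the question:
// app.get('/atletas/:licencia', getAtletasByLicencia(pool));
```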

How many requests can we make at a time using the "request" middleware in a NodeJS application

I am running a cron job every 5 minutes to get data from a 3rd-party API. There can be N requests at a time from the NodeJS application. Below are the details and code samples:
1> Running cron Job every 5 mins:
const cron = require('node-cron');
const request = require('request');
const otherServices = require('./services/otherServices');
cron.schedule("0 */5 * * * *", function () {
  initiateScheduler();
});
2> Get the list of elements for which I want to initiate requests. It can receive N elements. I call the request function (getSingleElementUpdate()) in a forEach loop:
var initiateScheduler = function () {
  // Database call to get elements list
  otherServices.moduleName()
    .then((arrayList) => {
      arrayList.forEach(function (singleElement, index) {
        getSingleElementUpdate(singleElement, 1);
      }, this);
    })
    .catch((err) => {
      console.log(err);
    })
}
3> Start the request for singleElement. Please note I don't need any callback if I receive a successful (200) response from the request; I just have to update my database entries on success.
var getSingleElementUpdate = function (singleElement, count) {
  var bodyReq = {
    "id": singleElement.elem_id
  }
  var options = {
    method: 'POST',
    url: 'http://example.url.com',
    body: bodyReq,
    dataType: 'json',
    json: true,
    crossDomain: true
  };
  request(options, function (error, response, body) {
    if (error) {
      if (count < 3) {
        count = count + 1;
        // retry this element (not initiateScheduler, which takes no arguments)
        getSingleElementUpdate(singleElement, count)
      }
    } else {
      // Request success
      // No callback required
      // Just need to update database entries on successful response
    }
  });
}
I have already checked this:
request-promise: But I don't need any callback after a successful request, so I didn't find any advantage to using it in my code. Let me know if you see a reason to add it.
I need your help with the following:
I have checked the performance when I receive 10 elements in the arrayList of step 2. The problem is that I have no clear picture of what will happen when I start receiving 100 or 1000 elements in step 2. So I need your help to determine whether I need to update my code for that scenario, and whether there is anything I missed that would degrade performance. Also, how many requests can I make at a time at most? Any help from you is appreciated.
Thanks!
AFAIK there is no hard limit on the number of requests. However, there are (at least) two things to consider: your hardware limits (memory/CPU) and remote server latency (is it able to respond to all requests within 5 minutes, before the next batch). Without knowing the context it's also impossible to predict what scaling mechanism you might need.
The question is actually more about app architecture and not about some specific piece of code, so you might want to try softwareengineering instead of SO.
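If the batch does grow into the hundreds or thousands, one common mitigation is to cap how many requests run at once instead of firing them all in a forEach. A minimal limiter sketch with no library; runWithLimit and the usage line are my names, not from the question:

```javascript
// Run at most `limit` tasks concurrently. Each task is a function that
// returns a promise; `limit` workers drain a shared index counter, so a
// new task starts only when a previous one finishes.
async function runWithLimit(tasks, limit) {
  const results = [];
  let next = 0;
  async function worker() {
    while (next < tasks.length) {
      const i = next++;          // claim the next task index
      results[i] = await tasks[i]();
    }
  }
  const workers = Array.from({ length: Math.min(limit, tasks.length) }, worker);
  await Promise.all(workers);
  return results;
}

// Hypothetical usage inside initiateScheduler, instead of a plain forEach:
// await runWithLimit(arrayList.map((el) => () => getSingleElementUpdate(el, 1)), 10);
```

This keeps memory and socket usage bounded regardless of how large arrayList becomes; the right limit depends on what the 3rd-party API tolerates.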

Nodejs map serial port write to receive data

I am currently using the node-serialport module for serial port communication. I send a command ATEC and it responds with ECHO.
However, this send/receive process is async (after I send the data, I don't know when the data will arrive in the data event). Example code below:
// Register the data event from the serial port
port.on('data', (data) => {
  console.log(data);
});
// Send data using serialport
port.write('ATEC');
Is there any way I could write it like this?
// When I send the command, I receive the data in response
port.write('ATEC').then((data) => {
  console.log(data);
});
Is this possible to achieve?
In HTTP communication using the request client, we can do something like
request.get('http://google.com')
  .on('response', (res) => {
    console.log(res);
  });
I want to replicate the same behaviour using serialport
I wrapped the serial data receive in a promise:
function sendSync(port, src) {
  return new Promise((resolve, reject) => {
    port.write(src);
    port.once('data', (data) => {
      resolve(data.toString());
    });
    port.once('error', (err) => {
      reject(err);
    });
  });
}
Please note that the event uses once instead of on, to prevent handlers from stacking (please check the comments below for more information - thanks #DKebler for spotting it).
Then I can write the code in a synchronous style, as below
sendSync(port, 'AThello\n').then((data) => {
  // receive data
});
sendSync(port, 'ATecho\n').then((data) => {
  // receive data
});
or I could use a generator, via the co package
co(function* () {
  const echo = yield sendSync(port, 'echo\n');
  const hello = yield sendSync(port, 'hello 123\n');
  return [echo, hello]
}).then((result) => {
  console.log(result)
}).catch((err) => {
  console.error(err);
})
We have a similar problem in a project I'm working on. Needed a synchronous send/receive loop for serial, and the serialport package makes that kinda weird.
Our solution is to make some sort of queue of functions/promises/generators/etc (depends on your architecture) that the serial port "data" event services. Every time you write something, put a function/promise/etc into the queue.
Let's assume you're just throwing functions into the queue. When the "data" event is fired, it sends the currently aggregated receive buffer as a parameter into the first element of the queue, which can see if it contains all of the data it needs, and if so, does something with it, and removes itself from the queue somehow.
This allows you to handle multiple different kinds of architecture (callback/promise/coroutine/etc) with the same basic mechanism.
As an added bonus: If you have full control of both sides of the protocol, you can add a "\n" to the end of those strings and then use serialport's "readline" parser, so you'll only get data events on whole strings. Might make things a bit easier than constantly checking input validity if it comes in pieces.
Update:
And now that the code has been finished and tested (see the ET312 module in http://github.com/metafetish/buttshock-js), here's how I do it:
function writeAndExpect(data, length) {
  return new Promise((resolve, reject) => {
    const buffer = Buffer.alloc(length); // Buffer.alloc instead of the deprecated new Buffer
    this._port.write(data, (error) => {
      if (error) {
        reject(error);
        return;
      }
    });
    let offset = 0;
    let handler = (d) => {
      try {
        // write each byte of the chunk at its own position, not all at `offset`
        Uint8Array.from(d).forEach((byte, i) => buffer.writeUInt8(byte, offset + i));
        offset += d.length;
      } catch (err) {
        reject(err);
        return;
      }
      if (offset === length) {
        resolve(buffer);
        this._port.removeListener("data", handler);
      }
    };
    this._port.on("data", handler);
  });
}
The above function takes a list of uint8s and an expected amount of data to get back, and returns a promise. We write the data, then set ourselves up as the "data" event handler. We use that to read until we get the amount of data we expect, then resolve the promise and remove ourselves as a "data" listener (this is important, otherwise you'll stack handlers!), and finish.
This code is very specific to my needs, and won't handle cases other than very strict send/receive pairs with known parameters, but it might give you an idea to start with.
