Node.js: map serial port write to received data - javascript

I am currently using the node-serialport module for serial port communication. I send a command ATEC and the device responds with ECHO.
However, this process of sending and receiving data is async (after I send the data, I don't know when it will arrive in the data event). Example code is below:
// Register the data event from the serial port
port.on('data', (data) => {
  console.log(data);
});

// Send data using serialport
port.write('ATEC');
Is there any way I could write it like this?
// When I send the command, I receive the data
port.write('ATEC').then((data) => {
  console.log(data);
});
Is this possible to achieve?
In HTTP communication using the request client, we can do something like
request.get('http://google.com')
  .on('response', (res) => {
    console.log(res);
  });
I want to replicate the same behaviour using serialport.

I wrapped the serial receive in a promise:
function sendSync(port, src) {
  return new Promise((resolve, reject) => {
    port.write(src);
    port.once('data', (data) => {
      resolve(data.toString());
    });
    port.once('error', (err) => {
      reject(err);
    });
  });
}
Please note that the listeners are registered with once instead of on, to prevent handlers from stacking (please check the comments below for more information - thanks @DKebler for spotting it).
Then I could write the code in a synchronous style, as below:
sendSync(port, 'AThello\n').then((data) => {
  // receive data
});
sendSync(port, 'ATecho\n').then((data) => {
  // receive data
});
Or I could use a generator, via the co package:
co(function* () {
  const echo = yield sendSync(port, 'echo\n');
  const hello = yield sendSync(port, 'hello 123\n');
  return [echo, hello];
}).then((result) => {
  console.log(result);
}).catch((err) => {
  console.error(err);
});
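With modern Node (7.6+), native async/await gives the same flow without the co package. A minimal sketch reusing the sendSync helper above:

// Same flow as the co example, but with native async/await
async function run(port) {
  const echo = await sendSync(port, 'echo\n');
  const hello = await sendSync(port, 'hello 123\n');
  return [echo, hello];
}

run(port).then(console.log).catch(console.error);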

We have a similar problem in a project I'm working on: we needed a synchronous send/receive loop for serial, and the serialport package makes that kinda weird.
Our solution is to make some sort of queue of functions/promises/generators/etc (depends on your architecture) that the serial port "data" event services. Every time you write something, put a function/promise/etc into the queue.
Let's assume you're just throwing functions into the queue. When the "data" event fires, it passes the currently aggregated receive buffer to the first element of the queue, which checks whether it contains all of the data it needs; if so, it does something with it and removes itself from the queue.
This allows you to handle multiple different kinds of architecture (callback/promise/coroutine/etc) with the same basic mechanism.
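A minimal sketch of the queue idea (the names here are illustrative, and it assumes each writer passes in a check function that knows when the aggregated buffer is complete):

const pending = [];             // queue of { check, resolve } entries
let received = Buffer.alloc(0); // aggregated receive buffer

port.on('data', (chunk) => {
  received = Buffer.concat([received, chunk]);
  const current = pending[0];
  // let the head of the queue decide whether it has all the data it needs
  if (current && current.check(received)) {
    pending.shift();
    const complete = received;
    received = Buffer.alloc(0);
    current.resolve(complete);
  }
});

function send(data, check) {
  return new Promise((resolve) => {
    pending.push({ check, resolve });
    port.write(data);
  });
}

// e.g. resolve once a full newline-terminated reply has arrived:
// send('ATEC\n', (buf) => buf.includes(0x0a)).then(...)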
As an added bonus: If you have full control of both sides of the protocol, you can add a "\n" to the end of those strings and then use serialport's "readline" parser, so you'll only get data events on whole strings. Might make things a bit easier than constantly checking input validity if it comes in pieces.
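A sketch of that readline idea (hedged: the parser API has moved between serialport versions; this form matches the standalone @serialport/parser-readline package):

const { ReadlineParser } = require('@serialport/parser-readline');
const parser = port.pipe(new ReadlineParser({ delimiter: '\n' }));
parser.on('data', (line) => console.log(line)); // one event per whole line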
Update:
And now that code has been finished and tested (see the ET312 module in http://github.com/metafetish/buttshock-js), here's how I do it:
function writeAndExpect(data, length) {
  return new Promise((resolve, reject) => {
    const buffer = Buffer.alloc(length);
    this._port.write(data, (error) => {
      if (error) {
        reject(error);
        return;
      }
    });
    let offset = 0;
    let handler = (d) => {
      try {
        // append each incoming byte to the expectation buffer
        Uint8Array.from(d).forEach(byte => buffer.writeUInt8(byte, offset++));
      } catch (err) {
        reject(err);
        return;
      }
      if (offset === length) {
        resolve(buffer);
        this._port.removeListener("data", handler);
      }
    };
    this._port.on("data", handler);
  });
}
The above function takes a list of uint8s and an expected amount of data to get back, and returns a promise. We write the data, then set ourselves up as the "data" event handler. We use that to read until we get the amount of data we expect, then resolve the promise, remove ourselves as a "data" listener (this is important, otherwise you'll stack handlers!), and finish.
This code is very specific to my needs, and won't handle cases other than very strict send/receive pairs with known parameters, but it might give you an idea to start with.
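For reference, writeAndExpect is defined as a class method in that module, so a call looks something like this (the byte values here are hypothetical):

// Hypothetical usage: send two command bytes, expect a six-byte reply
this.writeAndExpect([0x0f, 0x03], 6).then((reply) => {
  console.log(reply); // <Buffer ...>
});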

Related

API calls not returning response with MySQL data streams

I have the following code:
await axiosAPICall(dummyData); // works

const sqlQuery = `SELECT column1, column2 FROM table`;
const queryStream = mySqlConnectionInstance.query(sqlQuery, []);

queryStream
  .on('error', function (err) {
    // Handle error, an 'end' event will be emitted after this as well
  })
  .on('result', async (actualData) => {
    // axios api call, the api callback goes to the call stack with any of the following errors:
    // 1. read ECONNRESET
    // 2. Client network socket disconnected before secure TLS connection was established
    await axiosAPICall(actualData); // breaks
  })
  .on('end', function () {
    // all rows have been received
  });
As you can see I'm getting all the rows from a table in a MySQL database stream. When the data comes from the database stream, I'm passing that data to the axios API call.
The API call works perfectly fine when called outside of the stream logic but when I call the API inside the streaming logic, it breaks all the time.
I am hitting the API as fast as each on('result') callback fires (the async/await does NOT slow down the request rate), i.e. I end up with multiple requests in parallel.
Does anyone know why the API calls are not working inside the streaming logic?
If the question needs any clarifications please comment.
Based on a comment suggesting the error is due to making "too many requests" at once, here is a simple and naive way to wait for the previous request before making the next:
const sqlQuery = `SELECT column1, column2 FROM table`;
const queryStream = mySqlConnectionInstance.query(sqlQuery, []);

const wait = (ms) => new Promise(resolve => setTimeout(resolve, ms));
let previous = Promise.resolve();

queryStream
  .on('error', function (err) {
    // Handle error, an 'end' event will be emitted after this as well
  })
  .on('result', async (actualData) => {
    // wait until previous request has completed
    await previous;
    // optional: add a delay, 100ms for this example -
    // useful if, say, there is a limit of 10 requests per second;
    // adjust (or remove) as required
    await wait(100);
    // set "previous" for next request
    previous = axiosAPICall(actualData);
  })
  .on('end', function () {
    // all rows have been received;
    // if you `await previous` here, you can truly wait until all rows are processed
  });
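If backpressure is the real concern, the mysql package's streaming API also lets you pause the connection while each row is processed, so rows stop arriving instead of queueing behind previous. A rough sketch:

queryStream
  .on('result', (actualData) => {
    // stop further 'result' events until this row is done
    mySqlConnectionInstance.pause();
    axiosAPICall(actualData)
      .catch((err) => console.error(err))
      .then(() => mySqlConnectionInstance.resume());
  });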

Problems with API call and MongoDB

I have a problem inserting the result of an API call (made with request) into MongoDB with Mongoose. I need your help! I call the API on an interval, delete all the data with deleteMany, and then fill the collection again with insertMany. That works well, but the problem is that my _id is deleted every time and reassigned on insert, so queries that use _id break. Is there a more elegant way to update the data in MongoDB every 10 minutes? How can I solve this problem, any ideas? My goal is to save the API response from the URL and then refresh it every 10 minutes. I am very grateful for any help.
const updateBaseCoin = async () => {
  request(
    {
      url:
        'https://api.coingecko.com/api/v3/coins/markets?vs_currency=eur&order=market_cap_desc&per_page=250&page=1&sparkline=false&price_change_percentage=2h%2C1h%2C24h%2C7d%2C14d%2C30d%2C200d%2C1y%2C5y',
      json: true,
    },
    async (error, response, body) => {
      try {
        if (error) {
          console.log(error);
        } else {
          await BaseCoin.deleteMany({});
          await BaseCoin.insertMany(body.map((item) => item));
        }
      } catch (e) {
        console.log(e);
      }
    }
  );
};

setInterval(updateBaseCoin, 10 * 60 * 1000); // every 10 minutes
First, a naive way to solve the _id problem is to change the map like this (or something similar, using another field as the id):
await BaseCoin.insertMany(body.map((item) => ({_id: item.id, ...item})));
Second, this is not the best way to handle fetching data from another source.
You can create a mechanism that knows how to perform CRUD operations based on the data already in the DB.
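For example, one such mechanism (a sketch, assuming the CoinGecko id field is used as _id, as in the naive fix above) is to upsert each coin with Mongoose's bulkWrite instead of wiping the collection:

// Upsert each coin by its API id so existing _ids (and references to them) survive
const upsertBaseCoins = (coins) =>
  BaseCoin.bulkWrite(
    coins.map((item) => ({
      updateOne: {
        filter: { _id: item.id },
        update: { $set: item },
        upsert: true,
      },
    }))
  );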

Like the readFileSync function in Node.js, I need http.getSync - a wrapper for http.get() that makes it sync

I want to implement this https.getSync wrapper method so that it calls the API synchronously, just like the readFileSync method we use for reading files synchronously.
const https = require('https');
How should I implement this method?
https.getSync = (url) => {
  let data = '';
  https.get(url, resp => {
    resp.on('data', (chunk) => {
      data += chunk;
    });
    resp.on('end', () => {
      console.log(JSON.parse(data));
    });
  }).on("error", (err) => {
    console.log("Error: " + err.message);
  });
  return data;
}
I want the two calls below to be made synchronously, without changing the calling code. For these calls I don't want to use promises or callbacks.
let api1 = https.getSync('https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY');
let api2 = https.getSync('https://api.nasa.gov/planetary/apod?api_key=NNKOjkoul8n1CH18TWA9gwngW1s1SmjESPjNoUFo');
You can use the npm package sync-request.
It's quite simple:
var request = require('sync-request');
var res = request('GET', 'http://example.com');
console.log(res.getBody());
Here is the link: sync-request
Also read this announcement from the package, in case you think using it is a good idea: "You should not be using this in a production application. In a node.js application you will find that you are completely unable to scale your server. In a client application you will find that sync-request causes the app to hang/freeze. Synchronous web requests are the number one cause of browser crashes."
In my opinion you should also avoid making sync HTTP requests. Instead, get comfortable with callbacks, promises, and async/await.
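For instance, a minimal sketch of the promise-based alternative: wrap https.get once, then await it wherever a synchronous-looking call is wanted.

const https = require('https');

const getAsync = (url) =>
  new Promise((resolve, reject) => {
    https.get(url, (resp) => {
      let data = '';
      resp.on('data', (chunk) => { data += chunk; });
      resp.on('end', () => resolve(data));
    }).on('error', reject);
  });

(async () => {
  const api1 = await getAsync('https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY');
  console.log(JSON.parse(api1));
})();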

Node server not handling async post requests from client properly

We have a node server which doesn't handle post requests properly when they are made asynchronously. When the requests are made synchronously, it handles them fine.
There is a node API server and, to mimic a client, a node script which makes post requests to the server.
While making a single post request or post requests in a loop synchronously, everything works as expected.
While making asynchronous post requests in a loop to the server, the code doesn't execute properly.
Here is the code on the server side. This method is called from the router.post() handler.
async insert(params) {
  let account = new Account();
  try {
    let totalLicenses = await this.getLicenses(params);
    if (totalLicenses === 0) throw new AccountError('NO_AVAILABLE_LICENSE');
    let accountResponse = await account.insert(params);
    let useLicense = await license.use(accountResponse, params);
    /*
      Do other account setup stuff here
    */
    return accountResponse;
  } catch (err) {
    throw err;
  }
}
getLicenses(params) is an async function that prepares the SQL query and awaits the response from a queryDb method, which wraps the callback in a promise. Here is the code:
async getLicenses(params) {
  let vals = [...arguments],
      sql = 'my sql query';
  try {
    return await queryDb(sql, vals);
  } catch (err) {
    throw new Error();
  }
}
We are using the mysql package to talk to the db. This is what the queryDb method looks like:
queryDb(query, vals) {
  return new Promise((resolve, reject) => {
    connection.query(query, vals, (err, results, fields) => {
      if (err) {
        reject(err);
        return;
      }
      resolve(results);
    });
  });
}
When asynchronous post requests are made to the server in a loop, the insert method above executes this.getLicenses(params) for all of the requests, then account.insert(params) for all of them, and then license.use(accountResponse, params) for all of them.
This becomes a problem when, say, a client has 3 licenses available and sends 5 asynchronous post requests. It should throw an error for the last 2 requests, but that is not the case. What ends up happening is that it inserts all 5 accounts, since it calls this.getLicenses(params) for all 5 requests before proceeding to insert any account.
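This is a classic check-then-act race: every request reads the license count before any of them consumes a license. A rough sketch of one way to close it (assuming a single Node process; serialize is an illustrative helper, not part of the codebase above):

// Funnel each insert through a shared promise chain so the license check
// and the account insert run as one atomic unit per request
let chain = Promise.resolve();

function serialize(task) {
  const run = chain.then(() => task());
  chain = run.catch(() => {}); // keep the chain alive if a task rejects
  return run;
}

// in the route handler: serialize(() => accountService.insert(params)),
// where accountService is whatever object owns the insert() method above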
Any help is appreciated!

Pausing Node.js Readable Stream

I'm building a bar code scanning app using node-serialport. Where I'm stuck is making an AJAX call to trigger a scan and then having the Express server respond with the data from the readable stream.
Initialize Device:
// Open device port
var SerialPort = require('serialport');
var port = '/dev/cu.usbmodem1411';
var portObj = new SerialPort(port, (err) => {
  if (err) {
    console.log('Connection error ' + err);
  }
});

// Construct device object
var Scanner = {
  // Trigger scan
  scan: () => {
    portObj.write('<scan cmd>', (err) => { // <scan cmd>: device-specific trigger command
      if (err) {
        console.log('Error on scan' + err);
      }
    });
  }
};
I've tried two approaches and neither produces the 'scan-read-respond' behavior I'm looking for.
First, I tried putting an event listener immediately after a scan, then using a callback in the listener to respond to the AJAX request. With this approach I get a 'Can't set headers after they are sent' error. From what I understand, Node throws this error because res.send is being called multiple times.
First Approach -- Response as callback in listener:
app.get('/dashboard', (req, res) => {
  Scanner.scan(); // fire scanner
  portObj.on('data', (data) => {
    res.send(data); // 'Can't set headers after they are sent' error
  });
});
In the second approach, I store the scan data in a local variable (scanned_data) and move the response outside the listener block. The problem with this approach is that res.send executes before the scanned data is captured in the local variable, so it comes up as undefined. Also intriguing: the scanned_data captured in the listener block seems to multiply with each scan.
Second Approach -- Response outside listener:
app.get('/dashboard', (req, res) => {
  var scanned_data; // declare variable outside listener block
  Scanner.scan();   // trigger scan
  portObj.on('data', (data) => {
    scanned_data = data;
    // logs the scanned data, but it multiplies with each scan
    // (e.g. 3 triggers log 'barcode_data barcode_data barcode_data')
    console.log(scanned_data);
  });
  console.log(scanned_data); // undefined
  res.send(scanned_data);
});
I'm a front end developer but have gotten to learn a lot about Node trying to figure this out. Alas, I think I've come to a dead end at this point. I tinkered with the .pipe() command and have a hunch that's where the solution lies, but wasn't able to zero in on a solution that works.
Any thoughts or suggestions?
You should not make assumptions about what chunk of data you get in a 'data' event. Expect one byte or many bytes. You need to know the underlying protocol being used to know when you have received a full "message", so you can stop listening for data. At that point you should send the response to the HTTP request.
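A minimal sketch of that idea, assuming the scanner terminates each barcode with '\n' (adjust the delimiter to your device's protocol):

app.get('/dashboard', (req, res) => {
  let received = '';
  const onData = (chunk) => {
    received += chunk.toString();
    if (received.includes('\n')) {            // full message received
      portObj.removeListener('data', onData); // stop listening before responding
      res.send(received.trim());
    }
  };
  portObj.on('data', onData);
  Scanner.scan();
});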
