Pausing Node.js Readable Stream - javascript

I'm building a bar code scanning app using node-serialport. Where I'm stuck is making an AJAX call to trigger a scan and then having the Express server respond with the data from the readable stream.
Initialize Device:
// Open device port
var SerialPort = require('serialport');
var port = '/dev/cu.usbmodem1411';
var portObj = new SerialPort(port, (err) => {
  if (err) {
    console.log('Connection error ' + err);
  }
});

// Construct device object
var Scanner = {
  // Trigger scan
  scan: () => {
    portObj.write(<scan cmd>, (err) => {
      if (err) {
        console.log('Error on scan ' + err);
      }
    });
  }
};
I've tried two approaches and neither produces the 'scan-read-respond' behavior I'm looking for.
First, I tried putting an event listener immediately after a scan, then using a callback in the listener to respond to the AJAX request. With this approach I get a 'Can't set headers after they are sent' error. From what I understand, Node is throwing this error because res.send is being called multiple times.
First Approach -- Response as callback in listener:
app.get('/dashboard', (req, res) => {
  Scanner.scan(); // fire scanner
  portObj.on('data', (data) => {
    res.send(data); // 'Can't set headers after they are sent' error
  });
});
In the second approach, I store the scan data in a local variable ('scanned_data') and move the response outside the listener block. The problem with this approach is that res.send executes before the scanned data is captured in the local variable, so it comes up as 'undefined'. Also intriguing is that the scanned_data captured in the listener block seems to multiply with each scan.
Second Approach -- Response outside listener:
app.get('/dashboard', (req, res) => {
  var scanned_data; // declare variable outside listener block
  Scanner.scan(); // trigger scan
  portObj.on('data', (data) => {
    scanned_data = data;
    console.log(scanned_data); // displays scanned data, but the data multiplies with each scan
                               // (e.g. 3 triggers log 'barcode_data barcode_data barcode_data')
  });
  console.log(scanned_data); // undefined
  res.send(scanned_data);
});
I'm a front end developer but have gotten to learn a lot about Node trying to figure this out. Alas, I think I've hit a dead end at this point. I tinkered with the .pipe() command and have a hunch that's where the solution lies, but I wasn't able to zero in on a solution that works.
Any thoughts or suggestions?

You should not make assumptions about what chunk of data you get in a 'data' event. Expect one byte or many bytes. You need to know the underlying protocol being used to know when you have received a full "message" so you can stop listening for data. At that point you should then send a response to the HTTP request.
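A minimal sketch of that idea, assuming the scanner terminates each barcode with a newline (adjust the delimiter to whatever your device's protocol actually uses):
app.get('/dashboard', (req, res) => {
  var buffer = '';

  var onData = (chunk) => {
    buffer += chunk.toString();
    if (buffer.indexOf('\n') !== -1) {          // assumed message terminator
      portObj.removeListener('data', onData);   // stop listening once a full message arrived
      res.send(buffer.trim());
    }
  };

  portObj.on('data', onData);
  Scanner.scan();
});
Removing the listener once a full message has arrived is what prevents both the 'Can't set headers after they are sent' error and the data multiplying across requests.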


Why is request.on data firing with a delay on NodeJS?

There is a simple web server that accepts data. Sample code below.
The idea is to track in real time how much data has entered the server and immediately inform the client about it. If you send a small amount of data, everything works well, but if you send more than a certain amount, the 'data' event on the server fires with a huge delay. I can see that data has been transferring for 5 seconds already, but the 'data' event is not triggered.
The 'data' event seems to be triggered only when the data has been uploaded completely to the server, which is why it works fine with small data (~2..20Mb) but doesn't work well with big data (50..200Mb).
Or maybe it is due to some kind of buffering?
Do you have any suggestions why the 'data' event is triggered with a delay and how to fix it?
const express = require('express');
const app = express();
const port = 3000;

// PUBLIC API
// upload file
app.post('/upload', function (request, response) {
  request.on('data', chunk => {
    // message appears with delay
    console.log('upload on data', chunk.length);
    // send message to the client about chunk.length
  });
  response.send({
    message: `Got a POST request ${request.headers['content-length']}`
  });
});

app.listen(port, () => {
  console.log(`Example app listening at http://localhost:${port}`);
});
TLDR:
The delay you are experiencing is probably the queueing that comes from the browser's resource scheduling.
The Test
I did some tests with Express and found that it uses http to handle requests/responses, so I used a raw http server listener to test this scenario, and it shows the same behavior.
Backend code
This code, based on the Node.js "Anatomy of an HTTP Transaction" sample, creates an http server and logs the time in 3 situations:
When a request was received
When the first data event fires
When the end event fires
const http = require('http');

var firstByte = null;
var server = http.createServer((request, response) => {
  const { headers, method, url } = request;
  let body = [];
  request.on('error', (err) => {
  }).on('data', (chunk) => {
    if (!firstByte) {
      firstByte = Date.now();
      console.log('received first byte at: ' + Date.now());
    }
  }).on('end', () => {
    console.log('end receive data at: ' + Date.now());
    // body = Buffer.concat(body).toString();
    // At this point, we have the headers, method, url and body, and can now
    // do whatever we need to in order to respond to this request.
    if (url === '/') {
      response.statusCode = 200;
      response.setHeader('Content-Type', 'text/html');
      response.write('<h1>Hello World</h1>');
    }
    firstByte = null;
    response.end();
  });
  console.log('received a request at: ' + Date.now());
});
server.listen(8083);
Frontend code (snippet from devtools)
This code fires an upload to /upload with some array data. I had filled the array with random bytes, but then I removed that and saw it had no effect on the timing log, so the upload content for now is just an array of 0's.
console.log('building data');
var view = new Uint32Array(new Array(5 * 1024 * 1024));
console.log('start sending at: ' + Date.now());
fetch("/upload", {
  body: view,
  method: "post"
}).then(async response => {
  const text = await response.text();
  console.log('got response: ' + text);
});
Now, running the backend code and then the frontend code, I get some logs.
Log capture (screenshots)
The Backend log and frontend log:
The time differences between backend and frontend:
Results
Looking at the screenshots, I see two differences between the logs:
The first, and most important, is the difference between the frontend fetch start and the backend request received: I got 1613ms, which is "close" (1430ms) to the Resource Scheduling entry in the network timing tab. I think more things happen between the frontend fetch call and the Node backend event, so I can't compare the times directly:
log.backendReceivedRequest - log.frontEndStart
1613
The second is the difference between receiving the first data on the backend and receiving all of it, which I got as
578ms, close to Request sent (585ms) in the network timing tab:
log.backendReceivedAllData - log.backendReceivedFirstData
578
I also changed the frontend code to send different sizes of data, and the network timing tab still matches the log.
The thing that remains unknown to me is: why is Google Chrome queueing my fetch when I'm not running any other requests and not using the bandwidth of the server/host? I read the conditions for queueing but did not find the reason; maybe it is allocating the resources on disk, but I'm not sure: https://developer.chrome.com/docs/devtools/network/reference/#timing-explanation
References:
https://nodejs.org/es/docs/guides/anatomy-of-an-http-transaction/
https://developer.chrome.com/docs/devtools/network/reference/#timing-explanation
I found the problem. It was in the nginx config. Nginx was set up as a reverse proxy. By default, proxy request buffering is enabled, so nginx grabs the whole request body first and only then forwards it to Node.js, which is why I saw the delay.
https://nginx.org/en/docs/http/ngx_http_proxy_module.html#proxy_request_buffering
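For reference, a minimal sketch of the change in the proxy location block (the upstream address here is an assumption; adjust it to your setup):
location / {
    proxy_pass http://localhost:3000;   # assumed Node.js upstream
    proxy_request_buffering off;        # forward body chunks as they arrive instead of buffering the whole request
}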

Problem with React making Get request to Node(express)

As the title says, I have a part of my React app that tries to get some data from my database, making a select based on the value I pass to it. So I'm going to go ahead and first show the code where I think the problem lies:
So first, this is the function from one of my forms that sends the request to the server. I know the code is probably ugly, but I can tell from the console.logs that the parameters I'm sending are what I intend to send (a string called "licenciaInput").
async handleClickLicencia (event) {
  event.preventDefault();
  console.log(this.state);
  console.log("licenciaInput: " + this.state.licenciaInput);
  const datoBuscar = this.state.licenciaInput;
  axios.get('http://localhost:3001/atletas/:licencia', this.state)
    .then(response => {
      console.log(response)
    })
    .catch(error => {
      console.log(error)
    })
}
And then I have this function, which is called for that localhost route. It attempts to get "licencia" and run a select in my PostgreSQL DB where licencia="whatever"; you can see the statement in the code:
const getAtletasByLicencia = (request, response) => {
  const licencia = request.body.licenciaInput;
  console.log("Request: " + request);
  console.log("what the server gets: " + licencia);
  // const licencia = request.licenciaInput;
  const sentencia = "SELECT * FROM atleta WHERE licencia ='" + licencia + "'";
  pool.query(sentencia, (error, results) => {
    if (error) {
      throw error
    }
    response.status(200).json(results.rows)
  })
}
As you can see, I have console.logs everywhere, and I still cannot access whatever element I send, because I always get an "undefined" value on the server console.
TLDR: How can I access the "licenciaInput" I passed from my client form to my server? I have tried request.body.licenciaInput, request.params.licenciaInput, and request.licenciaInput, but none of those seem to work.
I also know I have to handle the data I receive from the server after that, but I need to solve this before looking two steps ahead. I'm also really new to React and Node/Express, so feel free to burn me with good practices I'm not meeting. Thanks in advance.
EDIT: I'm also adding this code, which shows the route for my method on the server:
app.get('/atletas/:licencia', db.getAtletasByLicencia)
#Gillespie59 suggested that I should send a POST request, but I don't think I should, since I'm both trying to send a parameter to the server to make a select and then send the results back to the client.
Change your request to:
axios.get(`http://localhost:3001/atletas/${this.state.licenciaInput}`)
...
and your route (if you are using express) should look like this:
app.get('/atletas/:licencia', function (req, res) {
  var licencia = req.params.licencia
  ...
})
As you are using request.body you should send a POST request with axios and add a body.
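If you go with the GET approach above, putting the two pieces together looks roughly like this (a sketch only: the pg pool setup is assumed to already exist, and the query is parameterized rather than concatenated to avoid SQL injection):
// Client (React):
axios.get(`http://localhost:3001/atletas/${this.state.licenciaInput}`)
  .then(response => console.log(response.data))
  .catch(error => console.log(error));

// Server (Express):
const getAtletasByLicencia = (request, response) => {
  const licencia = request.params.licencia; // comes from the :licencia route segment
  pool.query('SELECT * FROM atleta WHERE licencia = $1', [licencia], (error, results) => {
    if (error) {
      return response.status(500).json({ error: 'query failed' });
    }
    response.status(200).json(results.rows);
  });
};

app.get('/atletas/:licencia', getAtletasByLicencia);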

Passing function on express js Route not working

I'm just really new to Node and Express. I'm trying to pass a function instead of text on my route, but it seems it's not working. I just looked at the documentation; they only mention passing text to the req.send() method. I'm trying to pass functions, but it's not working, and alert() doesn't work either: req.send(alert('Hello world')) says alert isn't defined or something similar.
**Update:** I'm trying to use this library with Express and Node: https://github.com/przemyslawpluta/node-youtube-dl
I'm trying to pass functions like this:
function blaBla() {
  var youtubedl = require('youtube-dl');
  var url = 'http://www.youtube.com/watch?v=WKsjaOqDXgg';
  // Optional arguments passed to youtube-dl.
  var options = ['--username=user', '--password=hunter2'];
  youtubedl.getInfo(url, options, function(err, info) {
    if (err) throw err;
    console.log('id:', info.id);
    console.log('title:', info.title);
    console.log('url:', info.url);
    console.log('thumbnail:', info.thumbnail);
    console.log('description:', info.description);
    console.log('filename:', info._filename);
    console.log('format id:', info.format_id);
  });
}
app.get('/', (req, res) => {
  res.send(blaBla());
})
**Instead of:**
app.get('/', function (req, res) {
  res.send('Hello World!')
})
I hope you guys understood my question.
res.send() expects a string argument. So, you have to pass a string.
If you want the browser to execute some Javascript, then what you send depends upon what kind of request is coming in from the browser.
If it's a browser page load request, then the browser expects an HTML response and you need to send an HTML page string back. If you want to execute Javascript as part of that HTML page, then you can embed a <script> tag inside the page and then include Javascript text inside that <script> tag and the browser will execute that Javascript when the page is parsed and scripts are run.
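For that page-load case, a minimal sketch of what this can look like (the markup is just an illustration):
app.get('/', (req, res) => {
  // Send an HTML page; the embedded <script> runs when the browser parses it.
  res.send(`<!DOCTYPE html>
<html>
  <body>
    <h1>Hello</h1>
    <script>alert('Hello world');</script>
  </body>
</html>`);
});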
If the route is in response to a script tag request, then you can return Javascript text as a string and you need to make sure the MIME type appropriately indicates that it is a script.
If the route is in response to an Ajax call, then it all depends upon what the caller of the Ajax call expects. If they expect a script and are going to execute the text as Javascript, then you can also just send Javascript text as a string. If they expect HTML and are going to process it as HTML, then you probably need to embed the <script> tag inside that HTML in order to get the Javascript executed.
In your example of:
response.send(blaBla());
That will work just fine if blaBla() synchronously returns a string that is formatted properly per the above comments about what the caller is expecting. If you want further help with that, then you need to show or describe for us how the request is initiated in the browser and show us the code for the blaBla() function because the issue is probably in the blaBla() function.
There are lots of issues with things you have in your question:
You show req.send(alert('Hello world')) in the text of your question. The .send() method belongs to the res object, not the req object (the second argument, not the first). So, that would be res.send(), not req.send().
In that same piece of code, there is no alert() function in node.js, but you are trying to execute it immediately and send the result with .send(). That won't work for a bunch of reasons.
Your first code block using blaBla() will work just fine as long as blaBla() returns a string of the right format that matches what the caller expects. If that doesn't work, then there's a problem with what blaBla() is doing so we need to see that code.
Your second code block works because you are sending a string, which is something the caller is equipped to handle.
Update now that you've shown the code for blaBla().
Your code for blaBla() does not return anything and it's asynchronous so it can't return the result. Thus, you cannot use the structure response.send(blaBla());. There is no way to make that work.
Instead, you will need to do something different like:
blaBla(response);
And, then modify blaBla() to call response.send(someTextValue) when the response string is known.
function blaBla(res) {
  var youtubedl = require('youtube-dl');
  var url = 'http://www.youtube.com/watch?v=WKsjaOqDXgg';
  // Optional arguments passed to youtube-dl.
  var options = ['--username=user', '--password=hunter2'];
  youtubedl.getInfo(url, options, function(err, info) {
    if (err) {
      res.status(500).send("Internal Error");
    } else {
      console.log('id:', info.id);
      console.log('title:', info.title);
      console.log('url:', info.url);
      console.log('thumbnail:', info.thumbnail);
      console.log('description:', info.description);
      console.log('filename:', info._filename);
      console.log('format id:', info.format_id);
      // construct your response here as a string
      res.json(info);
    }
  });
}
Note also that the error handling does not use throw because that is really not useful inside an async callback.
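For completeness, a route using the rewritten function would then look roughly like this:
app.get('/', (req, res) => {
  blaBla(res); // blaBla() sends the response itself once getInfo() finishes
});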
No one could really help me with this, and after digging into it on my own I figured out how to do it. In Express there is something called middleware; we have to use that to get this kind of thing done. Those who are real experts or have working experience with Express know this.
To use functions with Express, you need to use middleware,
as I'm showing below:
const express = require('express')
const youtubedl = require('youtube-dl');
const url = 'https://www.youtube.com/watch?v=quQQDGvEP10';
const app = express()
const port = 3000

function blaBla(req, res, next) {
  youtubedl.getInfo(url, function(err, info) {
    console.log('id:', info.id);
    console.log('title:', info.title);
    console.log('url:', info.url);
    // console.log('thumbnail:', info.thumbnail);
    // console.log('description:', info.description);
    console.log('filename:', info._filename);
    console.log('format id:', info.format_id);
  });
  next();
}

app.use(blaBla);

app.get('/', (request, response) => {
  response.send('Hey Bebs, what is going on here?');
})

app.listen(port, (err) => {
  if (err) {
    return console.log('something bad happened', err)
  }
  console.log(`server is listening on ${port}`)
})
And remember that you must call app.use(blaBla); before registering your route, otherwise this might not work.

Nodejs map serial port write to receive data

I am currently using the node-serialport module for serial port communication. I will send a command ATEC and the device will respond with ECHO.
However, this process of sending and receiving data is async (after I send the data, I will not know when the data will arrive in the 'data' event). The example code is below:
// Register the data event from the serial port
port.on('data', (data) => {
  console.log(data);
});

// Send data using serialport
port.write('ATEC');
Is there any way I could write it like this?
// When I send the command, I receive the data
port.write('ATEC').then((data) => {
  console.log(data);
});
Is this possible to achieve?
In HTTP communication using the request client, we can do something like:
request.get('http://google.com')
  .on('response', (res) => {
    console.log(res);
  });
I want to replicate the same behaviour using serialport.
I wrapped the serial data receive in a promise:
function sendSync(port, src) {
  return new Promise((resolve, reject) => {
    port.write(src);
    port.once('data', (data) => {
      resolve(data.toString());
    });
    port.once('error', (err) => {
      reject(err);
    });
  });
}
Please note that the event handlers use once instead of on, to prevent listeners from stacking up (please check the comments below for more information; thanks #DKebler for spotting it).
Then I can write the code in a sync-looking style, as below:
sendSync(port, 'AThello\n').then((data) => {
  // receive data
});

sendSync(port, 'ATecho\n').then((data) => {
  // receive data
});
or I can use a generator, with the co package:
co(function* () {
  const echo = yield sendSync(port, 'echo\n');
  const hello = yield sendSync(port, 'hello 123\n');
  return [echo, hello]
}).then((result) => {
  console.log(result)
}).catch((err) => {
  console.error(err);
})
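On newer Node versions you can get the same flow with async/await instead of co; a quick sketch reusing the same sendSync helper:
async function run() {
  // Each await waits for the next reply before sending the following command.
  const echo = await sendSync(port, 'echo\n');
  const hello = await sendSync(port, 'hello 123\n');
  return [echo, hello];
}

run()
  .then((result) => console.log(result))
  .catch((err) => console.error(err));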
We had a similar problem in a project I'm working on. We needed a synchronous send/receive loop for serial, and the serialport package makes that kind of awkward.
Our solution is to make some sort of queue of functions/promises/generators/etc (depends on your architecture) that the serial port "data" event services. Every time you write something, put a function/promise/etc into the queue.
Let's assume you're just throwing functions into the queue. When the "data" event is fired, it sends the currently aggregated receive buffer as a parameter into the first element of the queue, which can see if it contains all of the data it needs, and if so, does something with it, and removes itself from the queue somehow.
This allows you to handle multiple different kinds of architecture (callback/promise/coroutine/etc) with the same basic mechanism.
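A minimal sketch of that queue idea with plain callbacks (the names and the newline terminator are assumptions; adapt them to your protocol):
var pending = [];   // queue of handlers, one per outstanding write
var rxBuffer = '';

port.on('data', (chunk) => {
  rxBuffer += chunk.toString();
  // Hand the aggregated buffer to the handler at the front of the queue.
  if (pending.length && pending[0](rxBuffer)) {
    pending.shift();   // handler reported a full message; retire it
    rxBuffer = '';
  }
});

function send(cmd, onMessage) {
  // The queued handler returns true once the buffer holds a complete message.
  pending.push((buffer) => {
    if (buffer.indexOf('\n') === -1) return false;
    onMessage(buffer.trim());
    return true;
  });
  port.write(cmd);
}
The same shape works with promises or coroutines: push a resolver into the queue instead of a callback.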
As an added bonus: If you have full control of both sides of the protocol, you can add a "\n" to the end of those strings and then use serialport's "readline" parser, so you'll only get data events on whole strings. Might make things a bit easier than constantly checking input validity if it comes in pieces.
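If you do control both sides and can terminate messages with a newline, a sketch of that readline approach (assuming a serialport version that exposes the Readline parser; older versions configure the parser through the port options instead):
const SerialPort = require('serialport');
const Readline = SerialPort.parsers.Readline;

const port = new SerialPort('/dev/ttyUSB0');    // assumed device path
const parser = port.pipe(new Readline({ delimiter: '\n' }));

parser.on('data', (line) => {
  // 'line' is a whole message, so there's no need to check for partial input
  console.log('received:', line);
});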
Update:
And now that the code has been finished and tested (see the ET312 module in http://github.com/metafetish/buttshock-js), here's how I do it:
function writeAndExpect(data, length) {
  return new Promise((resolve, reject) => {
    const buffer = new Buffer(length);
    this._port.write(data, (error) => {
      if (error) {
        reject(error);
        return;
      }
    });
    let offset = 0;
    let handler = (d) => {
      try {
        // Append each incoming byte to the buffer, advancing the offset as we go.
        Uint8Array.from(d).forEach(byte => {
          buffer.writeUInt8(byte, offset);
          offset += 1;
        });
      } catch (err) {
        reject(err);
        return;
      }
      if (offset === length) {
        resolve(buffer);
        this._port.removeListener("data", handler);
      }
    };
    this._port.on("data", handler);
  });
}
The above function takes a list of uint8s and an expected amount of data to get back, and returns a promise. We write the data and then set ourselves up as the "data" event handler. We use that to read until we get the amount of data we expect, then resolve the promise, remove ourselves as a "data" listener (this is important, otherwise you'll stack handlers!), and finish.
This code is very specific to my needs, and won't handle cases other than very strict send/receive pairs with known parameters, but it might give you an idea to start with.
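A rough usage sketch, assuming writeAndExpect is a method on a device object that keeps the open port in this._port (the command bytes here are made up):
// Hypothetical command: write two bytes, expect a three-byte reply.
device.writeAndExpect(Buffer.from([0x0d, 0x0a]), 3)
  .then((reply) => console.log('got reply:', reply))
  .catch((err) => console.error('serial error:', err));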

writestream and express for json object?

I might be out of my depth, but I really need something to work. I think a write/read stream will solve both my issues, but I don't quite understand the syntax or what's required for it to work.
I read the stream handbook and thought I understood some of the basics, but when I try to apply it to my situation, it seems to break down.
Currently I have this as the crux of my information.
function readDataTop (x) {
  console.log("Read " + x[6] + " and Sent Cached Top Half");
  jf.readFile("loadedreports/top" + x[6], 'utf8', function (err, data) {
    resT = data
  });
};
I'm using the jsonfile plugin for Node, which basically shortens fs.writeFile/readFile and makes them easier to use, instead of constantly writing try/catch blocks around fs.write and read.
Anyway, I want to implement a stream here, but I am unsure what would happen on my Express end and how the object will be received.
I assume since it's a stream, Express won't do anything with the object until it receives it? Or would I have to write a callback to make sure that when my function is called, the stream is complete before Express sends the object off to fulfill the AJAX request?
app.get('/:report/top', function(req, res) {
  readDataTop(global[req.params.report]);
  res.header("Content-Type", "application/json; charset=utf-8");
  res.header("Cache-Control", "max-age=3600");
  res.json(resT);
  resT = 0;
});
I am hoping that if I change the read part to a stream it will alleviate two problems. The first is the issue of sometimes receiving partial JSON when the browser makes the AJAX call, due to the read speed of larger JSON objects. (This might be the callback issue I need to solve, but a stream should make it more consistent.)
Secondly, when I load this Node app, it needs to run 30+ file writes while it gets the data from my DB. The goal is to disconnect the browser from the DB side so Node acts as the DB by reading and writing. This is because an old SQL server is already being bombarded by a lot of requests (stale data isn't an issue).
Any help on the syntax here?
Is there a tutorial I can see in code of someone piping a response into a write stream? (The mssql node module I use puts the SQL response into an object, and I need it in JSON format.)
function getDataTop (x) {
  var connection = new sql.Connection(config, function(err) {
    var request = new sql.Request(connection);
    request.query(x[0], function(err, topres) {
      jf.writeFile("loadedreports/top" + x[6], topres, function(err) {
        if (err) {
          console.log(err);
        } else {
          console.log(x[6] + " top half was saved!");
        }
      });
    });
  });
};
Your problem is that you're not waiting for the file to load before sending the response. Use a callback:
function readDataTop(x, cb) {
console.log('Read ' + x[6] + ' and Sent Cached Top Half');
jf.readFile('loadedreports/top' + x[6], 'utf8', cb);
};
// ...
app.get('/:report/top', function(req, res) {
  // you should really avoid using globals like this ...
  readDataTop(global[req.params.report], function(err, obj) {
    // setting the content-type is automatically done by `res.json()`
    // cache the data here in-memory if you need to and check for its existence
    // before `readDataTop`
    res.header('Cache-Control', 'max-age=3600');
    res.json(obj);
  });
});
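If you'd rather stream the cached file straight to the response instead of buffering it in memory (closer to the .pipe() idea you asked about), here's a minimal sketch with fs, reusing the same path convention your readDataTop uses:
const fs = require('fs');

app.get('/:report/top', function(req, res) {
  const x = global[req.params.report];
  res.header('Content-Type', 'application/json; charset=utf-8');
  res.header('Cache-Control', 'max-age=3600');
  // Pipe the cached file to the response; Node handles backpressure for you.
  fs.createReadStream('loadedreports/top' + x[6])
    .on('error', (err) => res.status(500).end())
    .pipe(res);
});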
