Access grpc stream variable for long-running process in Node - javascript

I am using Node.js to connect to a server using gRPC that performs a long running task.
The server sends a unidirectional stream to the client (the Node.js app) while the job is in progress. I need to implement a Stop button and am told that closing the gRPC stream will stop the job in progress.
This is currently my code:
let express = require('express'),
    router = express.Router(),
    grpc = require('grpc'),
    srv = grpc.load(__dirname + '/job_handler.proto').ns;

let startJob = (jobID, parameters) => srv.createJob(jobID, parameters);

router.post('/jobs', (req, res) => {
  let lengthyOperation = startJob(jobID, parameters);
  lengthyOperation.on('data', (data) => {
    console.log(`Data from lengthy operation: ${data}`);
  });
  lengthyOperation.on('end', () => {
    console.log('Lengthy operation completed');
  });
  res.setHeader('Location', `/jobs/${jobID}`);
  res.status(202).send();
});
As you can see, I send an HTTP 202 response to the client upon creating the job and it continues asynchronously in the background.
Questions:
How do I close the stream?
How do I access the lengthyOperation variable to do so?

The lengthyOperation object has a cancel method that cancels the call. So, when you want to stop the stream, just call lengthyOperation.cancel().
Note that when you do this, it will cause the call to end with an error. I would recommend adding a lengthyOperation.on('error', ...) handler to handle that error.
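For the second question, one way to make lengthyOperation reachable from another route is to keep each call object in a Map keyed by job ID. A minimal sketch, assuming jobID and parameters come from the request as in the question; the Map and the DELETE route are illustrative assumptions, not part of the original code:

const activeJobs = new Map();

router.post('/jobs', (req, res) => {
  let lengthyOperation = startJob(jobID, parameters);
  activeJobs.set(jobID, lengthyOperation);
  lengthyOperation.on('error', (err) => {
    // cancel() makes the call end with an error, so handle it here to
    // avoid an unhandled 'error' event crashing the process
    console.log(`Lengthy operation ended: ${err.message}`);
    activeJobs.delete(jobID);
  });
  res.setHeader('Location', `/jobs/${jobID}`);
  res.status(202).send();
});

// The Stop button hits this route; cancelling the call closes the stream,
// which (per the question) stops the job on the server.
router.delete('/jobs/:jobID', (req, res) => {
  const job = activeJobs.get(req.params.jobID);
  if (job) {
    job.cancel();
    activeJobs.delete(req.params.jobID);
  }
  res.status(204).send();
});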

Related

Cannot execute HTTP GET call twice on nodejs/express backend

Using the packages express and `ftp`, I try to simply get files from an FTP server and return them to the requesting client via HTTP GET.
The first request goes through fine, but when I try calling it again I run into Exception:
Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client
I've tried the solutions from Error: Can't set headers after they are sent to the client, like returning when sending and not setting headers after the response has been sent, but unfortunately none of them worked for me.
This is ALL the code:
const express = require('express');
const ftp = require('ftp');

const app = express();
const port = 3000;

const c = new ftp();

app.get('/files', (req, res) => {
  c.on('ready', () => {
    c.list((err, list) => {
      c.end();
      return res.setHeader("Content-Type", "application/json").status(200).send({data: list});
    });
  });
  c.connect({
    host: 'xxx',
    user: 'xxx',
    password: 'xxx',
  });
});

app.listen(port, () => {
  console.log(`Example app listening on port ${port}`);
});
I think it might be something with the c.list() callback, however I cannot for the love of god find what is wrong with it, as res.send() never gets called twice.
The problem is that you have just one ftp object, every request subscribes to (and never unsubscribes from) the ready event, and the ready event fires every time you call connect(), which you do for every request. So when the second request calls connect(), the event fires for both the first and the second request. This leads to setHeader() being called a second time for the first request, hence the error.
Using once() instead of on(), so that the event handler is only called once, should resolve the issue, though there are probably better ways to write this code (use a promise API or promisify this one, and initialize the connection to the FTP server once instead of for every request).
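For illustration, a minimal sketch of the once() fix applied to the handler above (the error check is an addition, not part of the original code):

app.get('/files', (req, res) => {
  // once() removes the listener after its first call, so a later request
  // cannot re-trigger this request's handler
  c.once('ready', () => {
    c.list((err, list) => {
      c.end();
      if (err) return res.status(500).send({ error: err.message });
      res.status(200).send({ data: list });
    });
  });
  c.connect({
    host: 'xxx',
    user: 'xxx',
    password: 'xxx',
  });
});

Express sets the Content-Type to application/json on its own when send() is given an object, so the explicit setHeader() call can be dropped.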

Why is request.on data firing with a delay on NodeJS?

There is a simple web server that accepts data. Sample code below.
The idea is to track in real time how much data has reached the server and immediately inform the client about it. If you send a small amount of data, everything works well, but if you send more than X bytes, the 'data' event on the server fires with a huge delay. I can see that data has been transferring for 5 seconds already, but the 'data' event has not fired.
The 'data' event seems to fire only once the data has been uploaded completely to the server, which is why it works fine with small payloads (~2..20 MB) but not with big ones (50..200 MB).
Or maybe it is due to some kind of buffering?
Do you have any suggestions why 'data' fires with a delay and how to fix it?
const express = require('express');

const app = express();
const port = 3000;

// PUBLIC API
// upload file
app.post('/upload', function (request, response) {
  request.on('data', chunk => {
    // message appears with delay
    console.log('upload on data', chunk.length);
    // send message to the client about chunk.length
  });
  response.send({
    message: `Got a POST request ${request.headers['content-length']}`
  });
});

app.listen(port, () => {
  console.log(`Example app listening at http://localhost:${port}`);
});
TLDR:
The delay you are experiencing is most likely the queueing phase of the browser's resource scheduling.
The Test
I did some tests with express and found that it uses http under the hood to handle requests/responses, so I used a raw http server listener to test this scenario, which shows the same behavior.
Backend code
This code, based on the sample from Node's anatomy-of-an-HTTP-transaction guide, creates an http server and logs the time at three points:
When a request was received
When the first data event fires
When the end event fires
const http = require('http');

var firstByte = null;

var server = http.createServer((request, response) => {
  const { headers, method, url } = request;
  let body = [];
  request.on('error', (err) => {
  }).on('data', (chunk) => {
    if (!firstByte) {
      firstByte = Date.now();
      console.log('received first byte at: ' + Date.now());
    }
  }).on('end', () => {
    console.log('end receive data at: ' + Date.now());
    // body = Buffer.concat(body).toString();
    // At this point, we have the headers, method, url and body, and can now
    // do whatever we need to in order to respond to this request.
    if (url === '/') {
      response.statusCode = 200;
      response.setHeader('Content-Type', 'text/html');
      response.write('<h1>Hello World</h1>');
    }
    firstByte = null;
    response.end();
  });
  console.log('received a request at: ' + Date.now());
});

server.listen(8083);
Frontend code (snippet from devtools)
This code fires an upload to /upload with some array data. I originally filled the array with random bytes, but then removed that and saw that it had no effect on my timing log, so the upload content for now is just an array of 0's.
console.log('building data');
var view = new Uint32Array(new Array(5 * 1024 * 1024));

console.log('start sending at: ' + Date.now());
fetch("/upload", {
  body: view,
  method: "post"
}).then(async response => {
  const text = await response.text();
  console.log('got response: ' + text);
});
Now, running the backend code and then the frontend code, I get the following logs.
Log capture
(Screenshots of the backend log, the frontend log, and the time differences between them; the relevant numbers are quoted below.)
Results
Looking at the screenshots, I see two differences between the logs:
The first, and most important, is the difference between the frontend fetch start and the backend receiving the request: I got 1613 ms, which is "close" to the 1430 ms of Resource Scheduling in the network timing tab. I think there are more things happening between the frontend fetch call and the node backend event, so I can't directly compare the times:
log.backendReceivedRequest - log.frontEndStart
1613
The second is the difference between the backend receiving the first chunk and receiving all the data: I got 578 ms, close to the 585 ms of Request sent in the network timing tab:
log.backendReceivedAllData - log.backendReceivedFirstData
578
I also changed the frontend code to send different sizes of data, and the network timing tab still matched the log.
The thing that remains unknown to me is: why is Google Chrome queueing my fetch when I'm not running any other requests and not using the bandwidth of the server/host? I read the conditions for queueing but did not find the reason; maybe it is allocating the resources on disk, but I'm not sure: https://developer.chrome.com/docs/devtools/network/reference/#timing-explanation
References:
https://nodejs.org/es/docs/guides/anatomy-of-an-http-transaction/
https://developer.chrome.com/docs/devtools/network/reference/#timing-explanation
I found the problem. It was in the nginx config. Nginx was set up as a reverse proxy, and proxy request buffering is enabled by default, so nginx first reads the whole request body and only then forwards it to Node.js; that's why I saw the delay.
https://nginx.org/en/docs/http/ngx_http_proxy_module.html#proxy_request_buffering
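For illustration, the relevant directive in a hypothetical reverse-proxy config (the location and upstream address are placeholders):

location /upload {
    # pass request body chunks to the upstream as they arrive instead of
    # buffering the whole body first
    proxy_request_buffering off;
    proxy_pass http://localhost:3000;
}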

How to close an unbounded and piped stream request in node?

My node/express application has an endpoint that proxies a stream of data from an internal service, which uses server-sent events. This means the internal service will continue to stream data indefinitely until the connection closes.
It works well, but when the browser closes the connection to my node app, the piped connection to the internal service stays open, causing the internal service to have a lot of open/unused connections.
So I'm trying to force close the piped connection when the node connection closes, but can't seem to figure out how to do it.
Code looks something like this. Piping using the request/request library.
import request from 'request';

app.get('/stream', (req, res) => {
  const stream = request.get({
    url: 'https://internalservice.acme.com/stream'
  });

  stream.on('error', console.log);
  stream.pipe(res);

  // When browser closes...
  req.on('close', () => {
    // ...close connection to internal service
    stream.destroy() // <-- doesn't work
  });
});
When you're making a request with the request library, the returned request object has an abort() method. It will close your request stream.
req.on('close', () => {
  // ...close connection to internal service
  stream.abort();
});

nodeJS video streaming, connection not closing when user disconnects or moved to different page

I am building a video streaming server in nodeJS using express, and I use the "request-progress" module to get the progress status.
The video streaming itself works fine.
But the problem is that even after I close the browser or move to the next page, the server keeps streaming data. I can see that in the console.
Here is the code for the same:
app.route('/media/*').get(function (req, res) {
  var originalUrl = "http://myactualserver.com";
  var resourceUrl = originalUrl.split('media');
  var requestURL = BASE_URL + resourceUrl[1];

  req.on('close', function () {
    console.log('Client closed the connection');
  });

  var options = {};
  progress(request(requestURL), {
    throttle: 1000,
    delay: 0,
    lengthHeader: 'content-length',
  })
    .on('progress', function (state) {
      console.log('progress', state);
    })
    .on('error', function (err) {
      console.log('err');
    })
    .on('end', function () {
      console.log('end');
    })
    .pipe(res);
});
I tried the following posts and finally added
req.on("close", function(){});
Video Streaming with nodejs and Express
node.js http server, detect when clients disconnect
Node.js server keeps streaming data even after client disconnected
My questions are:
1. How will I call either the "error" or "end" function of "request-progress"?
2. How will I close the streaming?
3. Why is console.log('Client closed the connection'); called twice?
Answers to your questions:
1. You can probably use emit from events.EventEmitter. Refer to this example: https://github.com/IndigoUnited/node-request-progress/blob/master/test/test.js. Alternatively, you can use the last progress status in the close handler.
2. Not sure what the question is here; the progress will continue until the socket is closed. One way to close it from the close handler is sketched below.
3. If you are making the request from the browser (default player), it makes two requests.
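For question 2, one option is to keep a handle on the upstream request and abort it when the client disconnects, as in the previous answer about piped streams. A minimal sketch reusing the question's requestURL; the upstream variable name and the trimmed options are illustrative:

app.route('/media/*').get(function (req, res) {
  // keep a handle on the upstream request so it can be aborted later
  var upstream = request(requestURL);

  progress(upstream, { throttle: 1000 })
    .on('progress', function (state) { console.log('progress', state); })
    .on('error', function (err) { console.log('err', err); })
    .on('end', function () { console.log('end'); })
    .pipe(res);

  req.on('close', function () {
    console.log('Client closed the connection');
    upstream.abort(); // stops streaming from the upstream server
  });
});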

How to know when node.js express server is up and ready to use

I have an application where I want to start a node express server and then automatically start a browser on the same machine as soon as the server is up. How can I query whether the server is up and ready to go? I really wanted there to be some sort of callback on the .listen call, but there doesn't seem to be one. I could just wait longer than I expect startup to take, but this is going on equipment that will be in the field, so I either have to wait a ridiculous amount of time to make sure I'm up and running before kicking off the browser, or have the user be smart enough to hit refresh if the page doesn't load right. Neither of those is a good option for me.
I read the API online but don't see anything like this. Surely there's a trick I don't know that can accomplish this.
If the node HTTP API (which has a callback and tells me about the 'listening' event) is the base for the express object, maybe there is an undocumented callback option for Express's listen call. Or perhaps I'm supposed to just know that it's there.
Any help would be greatly appreciated.
The Express app.listen function does support a callback. It passes the arguments you give it straight through to the underlying HTTP server's listen call:
app.listen = function(){
  var server = http.createServer(this);
  return server.listen.apply(server, arguments);
};
So you can just call: app.listen(port, callback);
Or you could use http.listen directly.
var app = require('express')(),
    server = require('http').createServer(app);

server.listen(80, function() {
  console.log('ready to go!');
});
You can fire a custom event after the server is started:
// server.js
const express = require('express');
const app = express();

module.exports = app;

app.listen(3000, () => {
  app.emit('listened', null);
});
In a separate module, the app can listen for your custom event:
// custom.js
const server = require('./server.js');

server.on('listened', function() {
  console.log('The server is running!');
});
You can use the http.listen method which has a callback function that triggers once the server is ready:
http.createServer(app).listen(app.get('port'), function () {
  console.log('Printed when ready!!!');
});
See the official reference at Node.js:
http://nodejs.org/api/all.html#all_server_listen_port_hostname_backlog_callback
http://nodejs.org/api/all.html#all_server_listen_path_callback_1
http://nodejs.org/api/all.html#all_server_listen_handle_callback_1
As many have mentioned, the listen function (on the express app or an http server; both support it) accepts a callback that lets your node process know when it is listening.
So if you plan to launch the browser from within your express app, do it there and you are good. However, if you are launching the express app from an external script and then want that external script to open the browser, the node callback doesn't really buy you anything.
Waiting for some magic string on stdout isn't really an improvement over just waiting for a good HTTP response. You may as well use a try/backoff/timeout loop with curl until you get a successful response, along the lines of the sketch below.
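For illustration, a minimal sketch of such a loop in Node rather than curl; the URL, retry count, and delay are arbitrary assumptions:

const http = require('http');

function waitForServer(url, retries = 20, delayMs = 250) {
  return new Promise((resolve, reject) => {
    const attempt = (left) => {
      http.get(url, (res) => {
        res.resume(); // drain the response body
        resolve(res.statusCode);
      }).on('error', () => {
        if (left <= 0) return reject(new Error('server never came up'));
        setTimeout(() => attempt(left - 1), delayMs);
      });
    };
    attempt(retries);
  });
}

waitForServer('http://localhost:3000/').then(() => {
  // safe to launch the browser now
});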
server.on('listening', function() {
  resolve(); // I added my own promise to help me await
});
The 'listening' event worked for me. Note that I added my own promise; I imagine you could obtain similar results without one by adding an entry point to this listener.
Note that I tried the more intuitive server.on('listen') and it didn't work (running Node 6.9.1).
With async/await syntax, this can be done by wrapping the server startup in a promise, so you can wait for it to be started before running anything else:
import express from 'express';
import http from 'http';

const app = express();
let server: http.Server;

const startServer = async (): Promise<void> => {
  return new Promise((resolve, _reject) => {
    server = app.listen(3000, () => {
      console.log('Express server started');
      resolve();
    });
  });
};

await startServer();
// here the server is started and ready to accept requests
