Wrapping node.js request into promise and piping - javascript

Here is my scenario:
I want to get some external resource (binary file) using the request library and pipe it to the client of my own application. If the response code is != 200 or there are problems reaching the remote server, I want to intercept and provide a custom error message instead. Ideally, if the response is fine, I want the original headers to be preserved.
I was able to achieve that with the first piece of code I've pasted below. However, my whole application is based on the Promise API, so I wanted to make it consistent and wrap this in a promise too. And when I do that, it no longer works. First I tried to achieve it with request-promise, without success. Then I tried to prepare a very simple example on my own, still no luck.
Working example
var r = request.get('http://foo.bar');
r.on('response', result => {
if (result.statusCode === 200) {
r.pipe(res);
} else {
res.status(500).send('custom error message')
}
});
r.on('error', error => res.status(500).send('custom error message'));
Not working example
var r;
return new Promise((resolve, reject) => {
r = request.get('http://foo.bar');
r.on('response', result => {
if (result.statusCode === 200) {
resolve();
} else {
reject()
}
});
r.on('error', reject);
}).then(() => {
r.pipe(res);
}).catch(() => {
res.status(500).json('custom error message');
});
By 'not working' I mean: no response is delivered, and the request is pending until it times out.
I've changed the code to call .pipe() on the result passed to resolve instead of on r. It responds to the client, but the response is empty then.
At the end, I've tried replacing the request lib with simply http.get(). With that, the server returns the file to the client, but headers (like Content-Type or Content-Length) are missing.
I've googled a lot, tried several request versions... and nothing is working.

The problem is that when "response" is triggered you resolve the promise, but a then callback is always executed asynchronously. By the time it gets called, the file data has already arrived at the server and there is no data flowing through the stream anymore. Instead you could just use the body parameter of the callback:
request.get('http://foo.bar', function(error, response, body) {
if (error) {
return res.status(500).send('custom error message');
}
if (response.statusCode === 200) {
res.end(body);
} else {
res.status(500).end();
}
});
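The asynchrony described above can be seen in isolation, with no network involved:

```javascript
// A then() callback never runs synchronously, even on an already-resolved
// promise; the rest of the current tick always finishes first. That gap is
// exactly where the streamed file data was being lost.
const order = [];
Promise.resolve().then(() => order.push('then callback'));
order.push('synchronous code');
setTimeout(() => console.log(order), 0); // [ 'synchronous code', 'then callback' ]
```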
For working with streams, request seems a bit buggy; axios seems to handle it better:
axios.get('http://foo.bar', {
validateStatus: status => status === 200,
responseType: "stream"
}).then(({data: stream}) => {
stream.pipe(res);
}).catch(error => {
res.status(500).json(error);
});

Related

nodejs retrieve body from inside request scope

I'm new to nodejs and javascript in general. I believe this is an issue with the scope that I'm not understanding.
Given this example:
...
...
if (url == '/'){
var request = require('request');
var body_text = "";
request('http://www.google.com', function (error, response, body) {
console.log('error:', error);
console.log('statusCode:', response && response.statusCode);
console.log('body:', body);
body_text=body;
});
console.log('This is the body:', body_text)
//I need the value of body returned from the request here..
}
//OUTPUT
This is the body: undefined
I need to be able to get the body from response back and then do some manipulation and I do not want to do all the implementation within the request function. Of course, if I move the log line into:
request( function { //here })
It works. But I need to return the body in some way outside the request. Any help would be appreciated.
You can't do that with callbacks, because they work asynchronously.
Working with callbacks is normal in JS, but you can do better with Promises.
You can use the request-promise-native to do what you want with async/await.
async function requestFromClient(req, res) {
const request = require('request-promise-native');
const body_text = await request('http://www.google.com').catch((err) => {
// always use a catch to log errors, or you will be lost
console.error(err);
})
if (!body_text) {
// sometimes you won't have a body, for example when the request fails
}
console.log('This is the body:', body_text)
//I need the value of body returned from the request here..
}
As you can see, you always need to be inside an async function to use await on promises.
Recommendations:
JS the right way
ES6 Features
JS clean coding
More best practices...
Using promises

ionic2 http get fails but other work

I do not get why the following URL is not working. I am running a bunch of nearly identical requests successfully, but this one is not arriving at the server. Could it be that the URL length in ionic2 maxes out?
let requestURL = 'https://myServer/php/setUserPushInfo.php?user=user&PushToken=fc0112d3936f738d9d4c197c50abf80304ab13fca48b19d539ecacf65ce58b34&OS=iOS&other=value';
this.http.get(requestURL).map(res => {
console.log(JSON.stringify(res.json()));
return res.json();
},
err => {
console.log('ERROR ' + err);
}
);
I run similar requests in my code all the time, and using requestURL directly in the browser works. There are 4 other requests to other php files running concurrently and successfully.
In Angular 2/Ionic 2, if you are using the http module, the request is only sent once the returned observable is subscribed to by one or more subscribers.
It uses the Observables concept to make this work: as soon as the observable has observers, the request is sent, and the response is delivered asynchronously through that observable.
Without anything waiting for a response, there is no need to send the request.
Try it like this:
let requestURL = 'https://myServer/php/setUserPushInfo.php?user=user&PushToken=fc0112d3936f738d9d4c197c50abf80304ab13fca48b19d539ecacf65ce58b34&OS=iOS&other=value';
this.http.get(requestURL)
.map(res => res.json())
.subscribe(data => {
console.log("Data is :", data);
},
err => {
console.log('ERROR ', err);
});
Should work.
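A stripped-down sketch of the laziness described above (real Angular/Ionic uses RxJS; this tiny stand-in only illustrates why nothing happens until subscribe()):

```javascript
// A "cold" observable is just a producer function that does not run until
// someone subscribes to it.
function coldObservable(producer) {
  return { subscribe: (next) => producer(next) };
}

let requestSent = false;
const response$ = coldObservable((next) => {
  requestSent = true; // the HTTP request would fire here
  next('fake response');
});

const sentBeforeSubscribe = requestSent; // still false: nothing has run yet

let received = null;
response$.subscribe((data) => { received = data; });
console.log(sentBeforeSubscribe, requestSent, received); // false true 'fake response'
```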

Why does fetch return 302 status code but XHR doesn't?

I'm using fetch to query my own WordPress REST API endpoint. However, every request that I make with fetch ends up in an infinite loop of 302 status codes until fetch finally errors with TOO_MANY_REDIRECTS.
This is on Chrome 56, using the native fetch. But when I visit the endpoint using my browser, use XHR or Postman, I always get 200 as the status.
XHR on the other hand...
Code relevant to this problem:
static checkStatus(email = null) {
const cookie = Cookies.get('cookiename');
const data = email || cookie;
if (!cookie && !email) {
return Promise.reject(new Error('No data given.'));
}
// This works.
return new Promise((resolve, reject) => {
request({
url: `/wp-json/woo/donate/status/?email=${data}`,
method: 'GET',
success: (response) => {
resolve(response);
},
error: (error) => {
reject(error);
}
})
});
// This doesn't.
return fetch(`/wp-json/woo/donate/status/?email=${data}`, {
method: 'GET',
});
}
Please note that I do not actually have two consecutive return statements in my code, that's just for demo. I'm using request() as a wrapper to XHR, you can find the source for that here.
So how do I get around this? I know fetch is still "experimental", but I've used it in several already-shipped projects by bundling a polyfill for browsers that don't support it just yet. This is a new thing.
This problem was resolved by "rebuilding" WordPress by destroying my local Vagrant machine, and starting it again.
For once, turning it on and off again fixed the problem.
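One difference between fetch and XHR worth ruling out in cases like this: older fetch implementations defaulted to credentials: 'omit', so cookies were not sent with the request, while XHR sends same-origin cookies automatically. Against a cookie-protected WordPress endpoint, that alone can cause an authentication redirect loop. The options fragment below is standard Fetch API usage, reusing the question's endpoint and its data variable:

```javascript
// Send same-origin cookies with the request, matching XHR's default behavior.
fetch(`/wp-json/woo/donate/status/?email=${data}`, {
  method: 'GET',
  credentials: 'same-origin',
});
```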

Nodejs map serial port write to receive data

I am currently using node-serialport module for serial port communication. I will send a command ATEC and it will respond with ECHO.
However, this process of sending and receiving data is async (after I send the data, I will not know when the data will arrive in the data event). The example code is below:
//Register the data event from the serial port
port.on('data', (data) => {
console.log(data);
});
//Send data using serialport
port.write('ATEC');
Is there anyway I could write it in this way?
//When i send the command, I could receive the data
port.write('ATEC').then((data)=> {
console.log(data);
});
Is this possible to achieve?
In http communication using request client, we could do something like
request.get('http://google.com')
.on('response', (res) => {
console.log(res);
});
I want to replicate the same behaviour using serialport
I wrap a promise in the serial data receive
function sendSync(port, src) {
return new Promise((resolve, reject) => {
port.write(src);
port.once('data', (data) => {
resolve(data.toString());
});
port.once('error', (err) => {
reject(err);
});
});
}
Please take note, the event handler uses once instead of on to prevent events from stacking (please check the comments below for more information - thanks @DKebler for spotting it)
Then, I could write the code in sync as below
sendSync(port, 'AThello\n').then((data) => {
//receive data
});
sendSync(port, 'ATecho\n').then((data) => {
//receive data
});
or I could use a generator, using co package
co(function* () {
const echo = yield sendSync(port, 'echo\n');
const hello = yield sendSync(port, 'hello 123\n');
return [echo, hello]
}).then((result) => {
console.log(result)
}).catch((err) => {
console.error(err);
})
We have a similar problem in a project I'm working on. Needed a synchronous send/receive loop for serial, and the serialport package makes that kinda weird.
Our solution is to make some sort of queue of functions/promises/generators/etc (depends on your architecture) that the serial port "data" event services. Every time you write something, put a function/promise/etc into the queue.
Let's assume you're just throwing functions into the queue. When the "data" event is fired, it sends the currently aggregated receive buffer as a parameter into the first element of the queue, which can see if it contains all of the data it needs, and if so, does something with it, and removes itself from the queue somehow.
This allows you to handle multiple different kinds of architecture (callback/promise/coroutine/etc) with the same basic mechanism.
As an added bonus: If you have full control of both sides of the protocol, you can add a "\n" to the end of those strings and then use serialport's "readline" parser, so you'll only get data events on whole strings. Might make things a bit easier than constantly checking input validity if it comes in pieces.
Update:
And now that code has been finished and tested (see the ET312 module in http://github.com/metafetish/buttshock-js), here's how I do it:
function writeAndExpect(data, length) {
return new Promise((resolve, reject) => {
const buffer = Buffer.alloc(length); // Buffer.alloc replaces the deprecated new Buffer(length)
this._port.write(data, (error) => {
if (error) {
reject(error);
return;
}
});
let offset = 0;
let handler = (d) => {
try {
// advance the offset per byte so chunks don't overwrite each other
Uint8Array.from(d).forEach(byte => buffer.writeUInt8(byte, offset++));
} catch (err) {
reject(err);
return;
}
if (offset === length) {
resolve(buffer);
this._port.removeListener("data", handler);
}
};
this._port.on("data", handler);
});
}
The above function takes a list of uint8s, and an expected amount of data to get back, returns a promise. We write the data, and then set ourselves up as the "data" event handler. We use that to read until we get the amount of data we expect, then resolve the promise, remove ourselves as a "data" listener (this is important, otherwise you'll stack handlers!), and finish.
This code is very specific to my needs, and won't handle cases other than very strict send/receive pairs with known parameters, but it might give you an idea to start with.

How to use setInterval (with clearInterval) to send requests using Bluebird Promises?

I'm using node-request for sending the requests to server to get some report. The thing is server needs some time to generate the report, so it responses with the report state. I'm checking the report state with setInterval() function, and use clearInterval() when server sends ready response. But with this approach, even after I use clearInterval, responses of earlier requests keep coming, and the response handler runs again and again. This does not cause a lot of harm, but still I believe it can be done better.
Here is my code:
checkForReportReady = setInterval =>
@request URL, options, (err, res, body) =>
console.log err if err
body = JSON.parse body
if body['status'] is 'ready'
clearInterval checkForReportReady
@processReport body
, 1000
What I need: make a request, wait for response, check the status, if status is not ready - make another request after some timeout, repeat until the status code in response is ready. If the status is ready - exit the loop (or clear the interval) and run #processReport.
I tried to make promisified request, and put it into setInterval, but the result was the same.
P.S. I do not control the server, so I can't change the way it responds or deals with the report.
I would recommend not putting requests in an interval callback. This can get ugly when they a) fail or b) take longer than the interval.
Instead, put a setTimeout in the success handler and try again after (and only if) receiving a response.
This is rather easy with promises:
request = Promise.promisify require('request'), multiArgs: true # multiArgs so .spread gets [res, body]
getReport = () =>
request URL, options
.spread (res, body) =>
body = JSON.parse body
if body.status is 'ready'
body
else
Promise.delay 1000
.then getReport # try again
getReport().then(@processReport, (err) -> console.log(err))
It seems like you can just use a setTimeout() in your response handler:
function checkForReportReady() {
request(URL, options, function(err, res, body) {
if (err) {
console.log(err);
} else {
if (body.status === "ready") {
processReport(body);
// do any other processing here on the result
} else {
// try again in 20 seconds
setTimeout(checkForReportReady, 20*1000);
}
}
});
}
This will run one request, wait for the response, check the response, then if it's ready it will process it and if it's not ready, it will wait a period of time and then start another request. It will never have more than one request in-flight at the same time.
If you want to use Bluebird promises, you can do that also, though in this case it doesn't seem to change the complexity particularly much:
var request = Promise.promisify(require('request'), { multiArgs: true }); // multiArgs so .spread() gets [res, body]
function checkForReportReady() {
return request(URL, options).spread(function(res, body) {
if (body.status === "ready") {
return body;
} else {
// try again in 20 seconds
return Promise.delay(20 * 1000).then(checkForReportReady);
}
});
}
checkForReportReady().then(function(body) {
processReport(body);
}, function(err) {
// error here
});
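The same polling logic can also be written with plain async/await and no Bluebird: one request at a time, retrying only after a response arrives. In this sketch, fetchReportStatus is a hypothetical stand-in for the promisified request call, wired to become ready on the third poll:

```javascript
// Promise-based sleep, same role as Bluebird's Promise.delay.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

let calls = 0;
async function fetchReportStatus() {
  calls += 1; // simulate: report becomes ready on the third poll
  return calls < 3 ? { status: 'pending' } : { status: 'ready', report: 'report body' };
}

async function checkForReportReady() {
  for (;;) {
    const body = await fetchReportStatus();
    if (body.status === 'ready') return body;
    await delay(10); // 20 * 1000 in the real code, shortened for the demo
  }
}

checkForReportReady().then((body) => console.log(body.report)); // report body
```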
