One SaaS provider we use has a webhook field, but it only allows a single URL to be entered. In fact, we need this webhook to be sent to two analytics services, so I need to figure out a way to write a custom endpoint that forwards the entire request to as many other endpoints as we need (currently two).
What is the easiest way to do this with Node and Express? If I'm not mistaken, a simple redirect would not work for multiple POSTs, right?
I am not sure what the headers or even the request content will look like, but they need to be preserved as much as possible, in case auth is carried in headers etc.
This is what I have so far, but it's nowhere near complete:
app.post('/', (req, res) => {
console.log('Request received: ', req.originalUrl)
const forwardRequests = config.forwardTo.map(url => {
return new Promise((resolve, reject) => {
superagent
.post(url)
.send(req)
.end((endpointError, endpointResponse) => {
if (endpointError) {
console.error(`Received error from forwardTo endpoint (${url}): `, endpointError)
reject(endpointError)
} else {
resolve(endpointResponse)
}
})
})
})
Promise.all(forwardRequests)
.then(() => res.sendStatus(200))
.catch(() => res.sendStatus(500))
})
I get an error, because superagent's .send() is just for the body content... how can I duplicate a request entirely and send it off?
To duplicate a request entirely and send it off to multiple endpoints, you can use the request module with req.pipe(request(<url>)) and Promise.all.
According to the request module's documentation:
You can also pipe() from http.ServerRequest instances, as well as to http.ServerResponse instances. The HTTP method, headers, and entity-body data will be sent.
Here is an example:
const { Writable } = require('stream');
const request = require('request');
const forwardToURLs = ['http://...','http://...'];
app.post('/test', function(req, res) {
let forwardPromiseArray = [];
for (let url of forwardToURLs) {
let data = '';
let stream = new Writable({
write: function(chunk, encoding, next) {
data += chunk;
next();
}
});
let promise = new Promise(function(resolve, reject) {
stream.on('finish', function() {
resolve(data);
});
stream.on('error', function(e) {
reject(e);
});
});
forwardPromiseArray.push(promise);
req.pipe(request(url)).pipe(stream);
}
Promise.all(forwardPromiseArray).then(function(result) {
// results from the various endpoints; you can process them and return a user-friendly result.
res.json(result);
}).catch(function() {
res.sendStatus(500);
});
});
Please note that the above code should be placed before body-parser (if you are using it); otherwise, the request body won't be piped to the forwardTo endpoints.
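For instance, a sketch of the mounting order, assuming the app also uses body-parser for its other routes (the /api route below is hypothetical):
const express = require('express');
const bodyParser = require('body-parser');
const app = express();

// Register the forwarding route FIRST, so its request stream is untouched
// and can still be piped to the forwardTo endpoints.
app.post('/test', function(req, res) {
  /* piping code from above */
});

// body-parser is mounted afterwards, so it only consumes the body
// for routes registered below this line.
app.use(bodyParser.json());

app.post('/api', function(req, res) {
  // req.body is available here as usual
  res.sendStatus(200);
});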
Hello Stack Overflow users,
Many people like me have searched for how to pass extra arguments to a callback function. The questions have similar titles, but they actually cover different challenges with many ways to solve them. Plus, it is always a pleasure to share practices and become more experienced.
Recently, I faced a pretty simple challenge in my Node.js project. One of the APIs I communicate with has an asynchronous, callback-based SDK, and I used to pass callback functions every time (which is annoying when you have requests depending on each other and some data needs to travel across the app's layers).
Imagine a plan payment flow that goes like this: a client sends a request to the server including the selected plan and his ID. When the server's API layer receives the request data, it passes it to a third-party service function ( .create(...) ). The third-party service function receives a callback with two parameters, function(err, plan_document). That callback is then supposed to apply the selected plan's logic to the client identified by the ID in the request.
** We need to pass both the client's and the plan's data to the callback function to apply the logic. The third-party service provides the plan_document parameter to the callback, and we still need to somehow pass the client ID from the API layer to the service.
The code will look like this.
const create_plan_agreement = (req, res) => {
// some code
var client_id = req.auth.client_id;
third_party.plan_agreement.create({}, update_plan_agreement);
};
const update_plan_agreement = (err, plan_document, client_id) => {
/*
The third-party `third_party.plan_agreement.create` function passes the first
two parameters and somehow we need to add the client_id
*/
console.log('client plan activated');
active_client_plan(plan_document, client_id);
};
------------------ EDIT ------------------
I wonder what happens if the flow were longer and I needed the client ID further along than the update function, like this:
const create_plan_agreement = (req, res) => {
// some code
var client_id = req.auth.client_id;
third_party.plan_agreement.create({}, update_plan_agreement);
};
const update_plan_agreement = (err, plan_document) => {
console.log('plan activated, send notification to the client');
third_party.plan_agreement.update(plan_document, send_agreement_notification);
};
const send_agreement_notification = (err, plan_document) => {
console.log('client plan activated');
active_client_plan(plan_document, this.client_id);
};
What should I do in this case? Should I keep repeating the .bind({'client_id': client_id}) call until the last step in the flow?
If you need to support older environments, you can achieve the same effect as .bind by wrapping the call in a containing callback, like this:
const create_plan_agreement = (req, res) => {
// some code
var client_id = req.auth.client_id;
third_party.plan_agreement.create({}, function(err, plan_document) {
update_plan_agreement(err, plan_document, client_id)
});
};
const update_plan_agreement = (err, plan_document, client_id) => {
/*
The third-party `third_party.plan_agreement.create` function passes the first
two parameters and somehow we need to add the client_id
*/
console.log('client plan activated');
active_client_plan(plan_document, client_id);
};
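Since the question mentions .bind, the same thing can be done by pre-binding the extra argument. Note that Function.prototype.bind prepends the bound arguments, so the callback signature changes accordingly (a sketch, reusing the third_party API from the question):
const create_plan_agreement = (req, res) => {
  var client_id = req.auth.client_id;
  // client_id is bound as the FIRST argument; err and plan_document follow it.
  third_party.plan_agreement.create({}, update_plan_agreement.bind(null, client_id));
};

const update_plan_agreement = (client_id, err, plan_document) => {
  console.log('client plan activated');
  active_client_plan(plan_document, client_id);
};
The .bind({'client_id': client_id}) form from the question binds this instead, which only works if the callback is a regular function reading this.client_id, not an arrow function.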
The traditional way is to use a closure. Define the functions inside the parent's scope so that they can access the client_id as an enclosed variable (kind of like global variables):
const create_plan_agreement = (req, res) => {
// some code
var client_id = req.auth.client_id;
const update_plan_agreement = (err, plan_document) => {
console.log('plan activated, send notification to the client');
third_party.plan_agreement.update(plan_document, send_agreement_notification);
};
const send_agreement_notification = (err, plan_document) => {
console.log('client plan activated');
// Note: this function can access client_id
// because it is in scope
active_client_plan(plan_document, client_id);
};
third_party.plan_agreement.create({}, update_plan_agreement);
};
Closures are to scopes what objects are to classes: a closure is an instance of a scope. So unlike a regular global variable, each call to create_plan_agreement() creates its own closure with its own copy of client_id.
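A tiny standalone sketch of that point — each call produces its own closure with its own copy of the enclosed variable:
function make_counter(label) {
  let count = 0;              // each call to make_counter gets its own copy
  return function () {
    count += 1;
    console.log(label, count);
  };
}

const a = make_counter('a');
const b = make_counter('b');
a(); // a 1
a(); // a 2
b(); // b 1  -- b's count is independent of a's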
With modern JavaScript it is often easier to handle this with Promises: convert the legacy functions to return a Promise, and then you can use async/await:
const create_plan_agreement = async (req, res) => {
// some code
var client_id = req.auth.client_id;
try {
var plan_document = await plan_agreement_create({});
var updated_plan_document = await update_plan_agreement(plan_document);
send_agreement_notification(updated_plan_document, client_id);
}
catch (err) {
// handle errors here.
}
};
const plan_agreement_create = (arg) => {
return new Promise ((ok, fail) => {
third_party.plan_agreement.create(arg, (err, result) => {
if (err) {
return fail(err);
}
ok(result);
});
})
}
const update_plan_agreement = (plan_document) => {
return new Promise ((ok, fail) => {
third_party.plan_agreement.update(plan_document, (err, result) => {
if (err) return fail(err);
ok(result);
});
});
};
const send_agreement_notification = (plan_document, client_id) => {
active_client_plan(plan_document, client_id);
};
Or even without async/await, Promises still make callbacks easier to use:
const create_plan_agreement = (req, res) => {
// some code
var client_id = req.auth.client_id;
plan_agreement_create({})
.then(doc => update_plan_agreement(doc))
.then(doc => {
send_agreement_notification(doc, client_id)
})
.catch(err => {
// handle errors here.
});
};
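As a side note, if the third-party SDK follows the standard Node (err, result) callback convention (an assumption here), util.promisify can build these wrappers instead of writing new Promise by hand:
const { promisify } = require('util');

// Assumes the SDK callbacks are (err, result); .bind keeps the SDK's own `this`.
const plan_agreement_create = promisify(third_party.plan_agreement.create.bind(third_party.plan_agreement));
const plan_agreement_update = promisify(third_party.plan_agreement.update.bind(third_party.plan_agreement));

const create_plan_agreement = async (req, res) => {
  var client_id = req.auth.client_id;
  try {
    const plan_document = await plan_agreement_create({});
    const updated = await plan_agreement_update(plan_document);
    send_agreement_notification(updated, client_id);
  } catch (err) {
    // handle errors here.
  }
};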
INTRODUCTION
I am implementing a function for making any kind of HTTPS request to any endpoint (using the native https module). When I make a request to a specific API, I get an error response in JSON format, like this:
{
"error": {
"code": 404,
"message": "ID not found"
}
}
How can I handle this kind of error? At first, I assumed that it would be handled in
request.on("error", (err) => {
reject(err);
});
HTTPS request function code
I have added '<---------' comments in the relevant parts of the code:
const https = require("https");
exports.httpsRequest = function (options, body = null) {
/*
This function is useful for making requests over the HTTPs protocol
*/
return new Promise((resolve, reject) => {
const request = https.request(options, (response) => {
// Get the response content type
const contentType =
response.headers["content-type"] &&
response.headers["content-type"].split(";")[0];
// Accumulate data
let chunks = [];
response.on("data", (chunk) => {
chunks.push(chunk);
});
response.on("end", () => {
// Concat all received chunks
let response = Buffer.concat(chunks);
// Some responses might be in JSON format...
if (contentType === "application/json") {
// Jsonify the response
response = JSON.parse(response);
}
// (For the future) TODO - Check and parse more content types if needed.
// Resolve the promise with the HTTPs response
resolve(response); // <--------- The JSON format error responses are resolved too!!
});
});
// Reject on request error
request.on("error", (err) => {
// <------------- At a first moment, I supposed that all error responses were handled in this part of the code
reject(err);
});
// Write the body
if (body) {
request.write(body);
}
// Close HTTPs connection.
request.end();
});
};
Question
Why is the error response not handled in request.on("error", ...)?
Thank you. I would appreciate any help or suggestions.
You need to create a separate code path for when the content type isn't what you were expecting, in which you call reject(), and you also need a try/catch around the JSON parsing so you can properly catch parse errors and reject on them too. You can solve both issues with this code:
exports.httpsRequest = function (options, body = null) {
/*
This function is useful for making requests over the HTTPs protocol
*/
return new Promise((resolve, reject) => {
const request = https.request(options, (response) => {
// Get the response content type
const contentType =
response.headers["content-type"] &&
response.headers["content-type"].split(";")[0];
// Accumulate data
let chunks = [];
response.on("data", (chunk) => {
chunks.push(chunk);
});
response.on("end", () => {
// Concat all received chunks
let response = Buffer.concat(chunks);
// Some responses might be in JSON format...
if (contentType === "application/json") {
try {
// Jsonify the response
response = JSON.parse(response);
resolve(response);
return;
} catch(e) {
reject(e);
return;
}
}
reject(new Error("Not JSON content-type"))
});
});
// Reject on request error
request.on("error", (err) => {
reject(err);
});
// Write the body
if (body) {
request.write(body);
}
// Close HTTPs connection.
request.end();
});
};
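If you also want the promise to reject on HTTP error statuses (like the 404 in the question) rather than resolving with the error body, the end handler could additionally check response.statusCode — a minimal sketch, reusing the same in-scope variables (response, contentType, chunks, resolve, reject):
response.on("end", () => {
  // Reject on HTTP error statuses (e.g. the 404 above) before parsing the body.
  if (response.statusCode < 200 || response.statusCode >= 300) {
    return reject(new Error(`HTTP ${response.statusCode}: ${Buffer.concat(chunks)}`));
  }
  if (contentType === "application/json") {
    try {
      return resolve(JSON.parse(Buffer.concat(chunks)));
    } catch (e) {
      return reject(e);
    }
  }
  reject(new Error("Not JSON content-type"));
});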
FYI, libraries such as got() and the others listed here do all of this work for you automatically and have a lot of other useful features. You don't really need to build this yourself.
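For comparison, a rough sketch of the same request with got (assuming got v11; exact options vary by version) — it parses JSON for you and rejects on non-2xx statuses:
const got = require('got');

// A rough equivalent of httpsRequest for JSON APIs.
async function fetchJson(url) {
  // Non-2xx responses make this throw (got.HTTPError); .json() parses the body.
  return got(url).json();
}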
Say there is an HTTP GET callback defined as:
router.get('/latestpost', function(req, res, next) {
var data = new FbData();
get_latest_post (data);
get_post_image (data);
res.json(data);
});
Both get_ functions use the fb package to generate an HTTP request and execute a callback when finished. How can the above GET callback be modified to wait for the responses from Facebook and only then send a response to the client?
For the time being, I have solved the problem by executing the get_ functions in series and passing them the res (response) argument, with the last function sending the response:
router.get('/latestpost', function(req, res, next) {
var data = new FbData();
get_latest_post (res, data);
});
function get_latest_post (res, data) {
FB.api(_url, function (res_fb) {
if(!res_fb || res_fb.error) {
console.log(!res_fb ? 'error occurred' : res_fb.error);
return;
}
// Do stuff with data
get_post_image (res, data);
});
}
function get_post_image (res, data) {
FB.api(_url, function (res_fb) {
if(!res_fb || res_fb.error) {
console.log(!res_fb ? 'error occurred' : res_fb.error);
return;
}
// Do stuff with data
/* At the end send the post data to the client */
res.json(data);
});
}
I have found a similar question, but I can't wrap my head around it, since I can't find a proper way to apply the solution to my problem. I have tried using the patterns described in this manual, but I can't get it to work using promises or async/await. Can someone please point me in the right direction?
Your API can easily be modified to return a promise:
function get_post_image (res, data) {
return new Promise((resolve, reject) => {
FB.api(_url, function (res_fb) {
if(!res_fb || res_fb.error) {
reject(res_fb && res_fb.error);
} else resolve(res_fb/*?*/);
});
});
}
Now that you have a promise, you can await it:
router.get('/latestpost', async function(req, res, next) {
const data = new FbData();
const image = await get_post_image (data);
res.json(data);
});
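Extending that idea to the full route from the question (a sketch — assuming get_latest_post is promisified the same way and both helpers just take the data object), plus the error handling an async route needs:
router.get('/latestpost', async function(req, res, next) {
  try {
    const data = new FbData();
    await get_latest_post(data);  // promisified like get_post_image above
    await get_post_image(data);
    res.json(data);
  } catch (err) {
    next(err); // hand errors to the Express error handler
  }
});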
I am trying to issue an HTTP request to another web service from a Google Cloud Function (GCF) that I have created. I need the HTTP request to complete and return its result inside my GCF so that I can do something else with it.
My question is: what is the best way to use a Promise inside a Google Cloud Function? Is what I am trying to do possible?
My code currently looks like this:
export const MyGCF = functions.https.onRequest((request, response) => {
let dayOfTheWeek: any;
const request1 = require('request');
const url = 'http://worldclockapi.com/api/json/pst/now';
function getDay() {
return new Promise((resolve, reject) => {
request1(url, { json: true }, (err: any, res: any, body: any) => {
if (err) {
return reject(err);
}
resolve(body.dayOfTheWeek);
});
});
}
getDay().then((data) => {
dayOfTheWeek = data;
console.log(dayOfTheWeek);
});
});
In general your approach will work, and you can define additional functions inside your MyGCF handler in the same way that you have defined getDay(). One problem with your current code, however, is that you're forgetting to write a response for the request being processed by MyGCF.
You can write a response for the request by calling send() on the second res argument of your MyGCF request handler. A simple example would be:
/* Sends a response of "hello" for the request */
res.send("hello");
With respect to your code, you can use res.send() in your .then() callback to send a response back to the client after getDay() has completed (see the code below). Also include a .catch() clause for the error case (with an error status) so the client receives an appropriate error response if the call to getDay() fails:
export const MyGCF = functions.https.onRequest((req, res) => {
const request = require('request');
const url = 'http://worldclockapi.com/api/json/pst/now';
function getDay() {
return new Promise((resolve, reject) => {
request(url, {
json: true
}, (err: any, r: any, body: any) => {
if (err) {
reject(err);
} else {
resolve(body.dayOfTheWeek);
}
});
});
}
getDay().then((dayOfTheWeek) => {
/* Send a response once the getDay() request complete */
res.send(dayOfTheWeek);
})
.catch(err => {
/* Don't forget the error case */
res.status(500).send(err);
});
});
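The same handler could also be written with async/await (a sketch, under the same assumptions as above):
export const MyGCF = functions.https.onRequest(async (req, res) => {
  try {
    // getDay() defined as above (it can live inside or outside the handler)
    const dayOfTheWeek = await getDay();
    res.send(dayOfTheWeek);
  } catch (err) {
    res.status(500).send(err);
  }
});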
This is pseudocode of what I am trying to achieve. First I need to get a list of URLs from the request body, then pass those URLs to a request function (using the request module), which will get the data from each URL and then save that data to MongoDB. Only after all the requests are finished, including saving their data to the server, should it send a response.
app.post('/', (req, resp) => {
const { urls } = req.body;
urls.forEach((url, i) => {
request(url, function (err, resp, body) {
if (err) {
console.log('Error: ', err)
} else {
// function to save data to MongoDB server
saveUrlData(body);
console.log(`Data saved for URL number - ${i+1}`)
}
})
});
// Should be called after all data saved from for loop
resp.send('All data saved')
})
I have tried this code and, of course, the resp.send() call runs without waiting for the requests to complete. Using this code I get output on the console like this:
Data saved for URL number - 3
Data saved for URL number - 1
Data saved for URL number - 5
Data saved for URL number - 2
Data saved for URL number - 4
I could write them in nested form, but the urls variable can contain any number of URLs, which is why it needs to be in a loop, at least from my understanding. I want the requests to run sequentially, i.e. it should resolve the 1st URL, then the second, and so on, and only when all URLs are done should it respond. Please help!
app.post('/', async (req, resp) => {
const {
urls
} = req.body;
for (const url of urls) {
try {
const result = await doRequest(url)
console.log(result)
} catch (error) {
// do error processing here
console.log('Error: ', error)
}
}
// respond only after every URL has been processed
resp.send('All data saved')
})
function doRequest(url) {
return new Promise((resolve, reject) => {
request(url, function(err, resp, body) {
err ? reject(err) : resolve(body)
})
})
}
The above uses async/await.
You should look at JavaScript Promises
Otherwise, you can do a recursive request like so:
app.post('/', (req, resp) => {
const { urls } = req.body;
sendRequest(urls, 0, resp);
})
function sendRequest(urlArr, i, resp){
request(urlArr[i], function (err, resp, body) {
if (err) {
console.log('Error: ', err)
}
else {
saveUrlData(body);
console.log(`Data saved for URL number - ${i+1}`)
}
i++;
if(i == urlArr.length) resp.send('All data saved') //finish
else sendRequest(urlArr, i); //send another request
})
}
All I had to do was create a separate function I can call over and over again, passing the URL array, a base index of 0, and the response object as arguments. Each success callback increments the index variable, which I pass into the same function again. Rinse and repeat until the index hits the length of the URL array, at which point the recursive loop stops.
You want to wait until you have received all the API responses and stored them in the DB, so you should use async/await and promisify all the requests.
You can use the request-promise module instead of request, so you get a promise for every API call instead of a callback.
Then push all the request-promise calls into an array and use Promise.all on it.
With async/await, your code execution will wait until every API call has responded and been stored in the DB.
const rp = require('request-promise');
app.post('/', async (req, res) => {
try{
const { urls } = req.body;
// completedAll will hold all the API responses.
const completedAll = await sendRequest(urls);
// now we have all the API responses that need to be saved
// completedAll is an array
const saved = await saveAllData(completedAll);
// Should be called only after all data is saved
res.status(200).send('All data saved')
}
catch(err) {
res.status(500).send({ msg: 'Internal server error' })
}
})
function sendRequest(urlArr){
const apiCalls = [];
for(let i = 0; i < urlArr.length; i++){
apiCalls.push(rp(urlArr[i]));
}
// Promise.all resolves with all the API responses, in the same order the calls were pushed
return Promise.all(apiCalls);
}
You can refer to these links:
https://www.npmjs.com/package/request-promise
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/all
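saveAllData is referenced above but never shown; a minimal sketch of what it might look like, assuming the official mongodb driver with an already-connected Db instance named db and a hypothetical urlData collection:
// Hypothetical helper: bulk-insert all fetched bodies into MongoDB.
function saveAllData(bodies) {
  const docs = bodies.map(body => ({ body, savedAt: new Date() }));
  // insertMany returns a promise, so the `await saveAllData(...)` above works as-is.
  return db.collection('urlData').insertMany(docs);
}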
Looking at the intention (a crawler), you can use Promise.all because the URLs are not dependent on each other.
app.post('/', (req, resp) => {
const { urls } = req.body;
const promises = urls.map((url, i) => {
return new Promise((resolve, rej)=>{
request(url, function (err, resp, body) {
if (err) {
rej(err);
} else {
resolve(body);
}
})
})
.then((body)=>{
//this should definitely be a promise as you are saving data to mongo
return saveUrlData(body);
})
});
// Should be called after all data saved from for loop
Promise.all(promises).then(()=>resp.send('All data saved'));
})
Note: error handling still needs to be added.
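For example, a sketch of that missing error handling — either a .catch on the Promise.all, or Promise.allSettled (Node 12.9+) if one failing URL should not abort the whole batch:
// Option 1: any single failure fails the whole request.
Promise.all(promises)
  .then(() => resp.send('All data saved'))
  .catch(err => resp.status(500).send('Failed: ' + err.message));

// Option 2: report a per-URL outcome instead of failing everything.
Promise.allSettled(promises)
  .then(results => resp.json(results.map(r => r.status)));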
There are multiple ways to solve this:
- async/await
- Promises
- the async library
For example, using the async library:
const async = require('async');

app.post('/', (req, res, next) => {
const { urls } = req.body;
async.each(urls, get_n_save, err => {
if (err) return next(err);
res.send('All data saved');
});
function get_n_save (url, callback) {
request(url, (err, resp, body) => {
if (err) {
return callback(err);
}
saveUrlData(body);
callback();
});
}
});
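Note that async.each fires all the requests in parallel; since the question asks for the URLs to be processed one after another, async.eachSeries (same signature) is a drop-in replacement:
// Processes the URLs strictly one at a time, in order.
async.eachSeries(urls, get_n_save, err => {
  if (err) return next(err);
  res.send('All data saved');
});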