I'm using the request library to make HTTP requests to my API. A single API call looks like this:
var options = {
    method: "post",
    url: 'http://example.com',
    json: true,
    headers: headers,
    body: {key: value}
};
request(options, callback);
However, I have an array of these options objects. The requests need to be made one after another, and I need to break the whole chain if one of them fails.
When the last request in the chain finishes, I need to output the result to the console.
I know that chaining callbacks can be done via promises, but all the examples I have found use a predefined number of chained requests.
Is it possible?
A recursive function which calls itself in the request callback should work.
var options = [{...}, {...}];

function doRequests(options) {
    request(options.shift(), function (error, response, body) {
        if (error) {
            return "handle error";
        }
        if (options.length > 0) {
            doRequests(options);
        }
    });
}
The first thing I would do is use a request library that returns a promise. Assuming you have such a thing, you just chain the promises together.
First create a resolved promise:
var promise = Promise.resolve();
Then for each new options object you want to request:
promise = promise.then(() => requestToPromise(options));
This chains another request onto the existing promise and fires off the new request only when the previous one has completed.
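If you don't have such a library handy, a minimal sketch of the helper used above is easy to write yourself (assuming the usual request(options, callback) signature; the names requestToPromise and optionsArray are just illustrative):

function requestToPromise(options) {
    return new Promise(function (resolve, reject) {
        request(options, function (err, response, body) {
            if (err) {
                reject(err);      // a failure here rejects the whole chain
            } else {
                resolve(body);    // pass the body along to the next .then()
            }
        });
    });
}

var promise = Promise.resolve();
optionsArray.forEach(function (options) {
    promise = promise.then(function () {
        return requestToPromise(options);
    });
});

promise
    .then(function (lastBody) {
        console.log(lastBody);    // last request finished; output the result
    })
    .catch(function (err) {
        // any rejection skips the remaining requests and lands here
    });

Because each .then() callback returns the promise for the next request, a rejection anywhere breaks the rest of the chain, which is exactly the behaviour asked for.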
If you have an array, you can keep an index into that array and have the callback kick off the next request when the previous one finishes. Roughly:
var index = 0;
var options = [/*...array of options objects...*/];
function doRequest() {
request(options[index], function(err, result) {
// ...handle the result/error here. If there's no error, then:
if (++index < options.length) {
// Kick off the next request
doRequest();
}
});
}
While the above can be Promise-ified, since your request method appears not to return promises, doing so would just complicate things.
You can instead use request-promise and do the following:
var request = require('request-promise');

var options = [/*...array of options objects...*/];
var requests = [];

options.forEach(function (option) {
    requests.push(request(option));
});
Promise.all(requests).then(function (responses) {
    // all requests are done
});
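Note that Promise.all fires all the requests in parallel (it does reject as soon as any one of them fails, but the others keep running). If they must run strictly one after another, as the question asks, a reduce-based chain over the same array is a common sketch:

var request = require('request-promise');

var options = [/*...array of options objects...*/];

options.reduce(function (chain, option) {
    // each request starts only after the previous promise resolves
    return chain.then(function () {
        return request(option);
    });
}, Promise.resolve())
.then(function (lastResponse) {
    console.log('all requests finished', lastResponse);
})
.catch(function (err) {
    // the first failure skips everything after it
    console.error(err);
});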
Related: nodejs multiple http requests in loop
As per the above question, it was answered how to perform a loop of HTTP requests over an array of URLs, which works fine. But what I am trying to achieve is to perform another loop of HTTP requests that runs only after the completion of the first loop, i.e. it should wait for the first loop of HTTP requests to complete.
// Import http
var http = require('http');
// First URLs array
var urls_first = ["http://www.google.com", "http://www.example.com"];
// Second URLs array
var urls_second = ["http://www.yahoo.com", "http://www.fb.com"];
var responses = [];
var completed_requests = 0;
function performHTTP(array) {
    for (var i in array) {
        http.get(array[i], function(res) {
            responses.push(res);
            completed_requests++;
            if (completed_requests == array.length) {
                // All downloads done, process responses array
                console.log(responses);
            }
        });
    }
}
In the above snippet, I have added another array of URLs and wrapped the for loop inside a function so the array can be changed on each call. Since I have to wait for the first loop to complete, I tried async/await like below.
async function callMethod() {
    await new Promise(resolve => performHTTP(urls_first));  // Executing function with first array
    await new Promise(resolve => performHTTP(urls_second)); // Executing function with second array
}
But in this case both function calls get executed simultaneously, i.e. it does not wait for the first array's execution to complete. Both executions happen at the same time, and I need the second to happen only after the completion of the first.
You need to wrap your request inside a Promise:
function request(url) {
    return new Promise((resolve, reject) => {
        http.get(url, function(res) {
            // ... your code here ... //
            // when your work is done, call resolve to mark your promise as done
            resolve();
        });
    });
}
And then resolve all your requests
// Batch all your first requests in parallel and wait for all of them
await Promise.all(urls_first.map(url => request(url)));

// Do the same with the second URLs
await Promise.all(urls_second.map(url => request(url)));
Note, this code is not tested and may contain some mistakes, but the main principle is here.
More information about Promise : https://developer.mozilla.org/fr/docs/Web/JavaScript/Reference/Objets_globaux/Promise
Check out how to use .then() to call the second performHTTP right after the first one has completed.
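For instance (a sketch, assuming performHTTP is reworked to return the Promise.all of its batch rather than looping with a shared counter):

function performHTTP(urls) {
    // one promise per URL; resolves when the whole batch is done
    return Promise.all(urls.map(url => request(url)));
}

performHTTP(urls_first)
    .then(() => performHTTP(urls_second))  // second batch starts only now
    .then(() => console.log('both batches finished'));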
You can call the services one by one using series from the async-series package.
https://www.npmjs.com/package/async-series
var series = require('async-series');

series([
    function(done) {
        console.log('First API call here');
        done(); // if you pass an err to this callback it will be caught in the error handler below
    },
    function(done) {
        console.log('Second API call here');
        done();
    },
    function(done) {
        // handle success here
        done();
    }
], function(err) {
    if (err) console.log(err.message);
});
How can I hit URLs one by one from an array, via JavaScript AJAX or jQuery? If one request kicks off a big PHP process, doing everything in a single hit causes a timeout, so how can I process them one by one?
Example
var myurls = [
"http://example.com/grape.php",
"http://example.com/apple.php",
"http://example.com/orange.php",
"http://example.com/banana.php"];
Maybe: if grape.php is done, then move on to apple; if apple is done, then move on to orange.
And when all processes have finished, show a success alert.
You mean this?
var myurls = [
    "http://example.com/grape.php",
    "http://example.com/apple.php",
    "http://example.com/orange.php",
    "http://example.com/banana.php"];
var cnt = 0;
function process(data) {
console.log(data);
}
function loadUrl() {
if (cnt>=myurls.length) return;
$.get(myurls[cnt++],function(data) {
process(data);
loadUrl();
});
}
$(function() {
loadUrl();
})
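If you also want the success alert from the question once every URL has been processed, one small variation on the above is to fire it when the counter runs past the end of the array:

function loadUrl() {
    if (cnt >= myurls.length) {
        alert("success");   // every URL has been processed
        return;
    }
    $.get(myurls[cnt++], function(data) {
        process(data);
        loadUrl();
    });
}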
Based on your question and the discussion we had in your comments, I figured that you wanted to perform AJAX calls sequentially based on an array of URLs you have.
Solution 1: Use repeated, self-referencing $.ajax() requests
This is undoubtedly the easier solution. We keep track of our position in the array and stop making AJAX requests once the array has been iterated through.
$(function() {
// Array of URLs
var myurls = [
"https://jsonplaceholder.typicode.com/posts/1",
"https://jsonplaceholder.typicode.com/posts/2",
"https://jsonplaceholder.typicode.com/posts/3",
"https://jsonplaceholder.typicode.com/posts/4"
];
// Iterate through array, and keep a cursor on which item we are at
var urlCount = 0,
ajaxCall = function() {
if (urlCount < myurls.length) {
console.log('Making AJAX call to url: '+myurls[urlCount]);
$.ajax({
url: myurls[urlCount]
})
.done(function(returnedData) {
console.log(returnedData);
urlCount++;
ajaxCall();
});
}
};
ajaxCall();
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
Solution 2: Using .then() chaining set up using a for loop
I have adapted the code provided in this answer, for your use case, since the example can be quite difficult to understand for those unfamiliar with deferred objects and returned promises.
So the trick is the following:
Set up a master deferred object
Set up a general function that will make AJAX calls for you
Loop through the array, chaining them using .then()
Kick start the first AJAX call on the master deferred object
$(function() {
// Array of URLs
var myurls = [
"https://jsonplaceholder.typicode.com/posts/1",
"https://jsonplaceholder.typicode.com/posts/2",
"https://jsonplaceholder.typicode.com/posts/3",
"https://jsonplaceholder.typicode.com/posts/4"
];
// Set up chain of AJAX requests
var d = $.Deferred(),
_d = d,
ajaxRequest = function(ajaxUrl) {
// Log in browser console that AJAX call is being made
console.log('Making AJAX call to: ' + ajaxUrl);
// Return deferred object for .then() chaining
return $.ajax({
url: ajaxUrl
});
};
// We chain each AJAX call to the next one
for (var i in myurls) {
// Use IIFE so that reference to `i` is fixed
(function(j) {
// Update _d for chaining
_d = _d.then(function() {
// _request is the deferred object returned,
// so we can chain deferred methods such as .done()
var _request = ajaxRequest(myurls[j]).done(function(ajaxData) {
// Just to show that data is being returned after each call
console.log(ajaxData);
});
// Return the deferred object for chaining
return _request;
});
})(i);
}
// Kick start sequential ajax call
d.resolve();
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
I'm struggling to send multiple AJAX calls in the following task. The API takes two parameters, userId and offsetValue, and returns the last 10 messages for the specified user, starting from the specified offset. If the offset is more than the total count of messages for the user, the API returns an empty string.
I wrote a function that returns an individual promise to get 10 messages for specified userId and offsetValue.
function getMessages(userId, offsetValue) {
return new Promise(function (resolve, reject) {
$.ajax(
{
url: 'https://example.com/api.php',
type: 'POST',
data: {
action: 'get_messages',
offset: offsetValue,
user: userId
},
success: function (response) {
if (response != '') {
resolve(response);
} else {
reject(response);
}
},
error: function (response) {
reject(response);
}
});
});
}
I need to run parallel tasks using .all() for multiple userIds; however, I cannot run parallel subtasks for each userId (incrementing offsetValue by 10 each time), as I don't know in advance how many messages each user has. So the execution should stop when the first individual promise is rejected (i.e. offsetValue is more than the total message count). Something like this:
var messages = '';
getMessages('Alex', 0)
    .then(function(result) {
        messages += result;
        getMessages('Alex', 10)
            .then(function(result) {
                messages += result;
                getMessages('Alex', 20)
                ....
            });
    });
So, is there any way to run sequential promises with an incrementing parameter, one by one, and resolve the overall concatenated result upon the first reject?
First off, you want to avoid the promise anti-pattern where you unnecessarily wrap your code in a new promise when $.ajax() already returns a promise that you can just use. To fix that, you would change to this:
// retrieves block of messages starting with offsetValue
// resolved response will be empty if there are no more messages
function getMessages(userId, offsetValue) {
return $.ajax({
url: 'https://example.com/api.php',
type: 'POST',
data: {
action: 'get_messages',
offset: offsetValue,
user: userId
}
});
}
Now, for your main question. Given that you want to stop requesting new items when you get a rejection or empty response and you don't know how many requests there will be in advance, you pretty much have to request things serially and stop requesting the next one when you get an empty response or an error. The key to doing that is to chain sequential promises together by returning a new promise from a .then() handler.
You can do that like this:
function getAllMessagesForUser(userId) {
var offsetValue = 0;
var results = [];
function next() {
return getMessages(userId, offsetValue).then(function(response) {
// if response not empty, continue getting more messages
if (response !== '') {
// assumes API is returning 10 results at a time
offsetValue += 10;
results.push(response);
// chain next request promise onto prior promise
return next();
} else {
// empty response means we're done retrieving messages
// so just return results accumulated so far
return results.join("");
}
});
}
return next();
}
This creates an internal function that returns a promise and each time it gets some messages, it chains a new promise onto the original promise. So, getAllMessagesForUser() returns a single promise that resolves with all the messages it has retrieved or rejects with an error.
You would use it like this:
getAllMessagesForUser('Bob').then(function(messages) {
// got all messages here
}, function(err) {
// error here
});
You could parallelize multiple users (as long as you're sure you're not overloading the server or running into a rate limiting issue) like this:
$.when.apply($, ['Bob', 'Alice', 'Ted'].map(function(item) {
return getAllMessagesForUser(item);
})).then(function() {
// get results into a normal array
var results = Array.prototype.slice.call(arguments);
});
P.S. Promise.all() is much nicer to use than $.when() (since it takes an array and resolves to an array), but since there are already jQuery promises involved here and I didn't know your browser compatibility requirements, I stuck with jQuery promise management rather than ES6 standard promises.
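For reference, the ES6 version of that last block would look roughly like this (Promise.all should assimilate jQuery's thenables, though this sketch is untested against older jQuery versions):

Promise.all(['Bob', 'Alice', 'Ted'].map(function(user) {
    return getAllMessagesForUser(user);
})).then(function(results) {
    // results is already a plain array, in the same order as the users
});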
I'm new to AngularJS and JavaScript.
I am getting remote information for each of the elements of an array (cars) and creating a new array (interested prospects), so I need to sync the requests: the response of each request must be added to the new array in the same order as the cars.
I first did it with a for loop:
for (a in cars) {
//async request
.then(function () {
//update the new array
});
}
This made all the requests but naturally didn't update the new array.
After searching in forums, I found these great examples and explanations for returning an intermediate promise and syncing all of them.
1. http://pouchdb.com/2015/05/18/we-have-a-problem-with-promises.html
2. http://stackoverflow.com/questions/25605215/return-a-promise-from-inside-a-for-loop
3. http://www.html5rocks.com/en/tutorials/es6/promises/
(#MaurizioIndenmark, #Mark Rajcok , #Michelle Tilley, #Nolan Lawson)
I couldn't use the Promise.resolve() suggested in the second reference, so I used $q.defer() and resolve() instead. I guess I have to inject a dependency, or there is something else that I missed. As shown below:
In the Controller I have:
$scope.interestedProspects = [];
RequestDetailsOfAsync = function ($scope) {
var deferred = $q.defer();
var id = carLists.map(function (car) {
return car.id;
}).reduce(function (previousValue, currentValue) {
return previousValue.then(function () {
TheService.AsyncRequest(currentValue).then(function (rData) {
$scope.interestedProspects.push(rData);
});
});
}, deferred.resolve());
};
In the Service I have something like:
angular.module('app', []).factory('TheService', function ($http) {
    return {
        AsyncRequest: function (keyID) {
            var deferred = $q.defer();
            var promise = authorized.get("somep.provider.api/theService.json?" + keyID).done(function (data) {
                deferred.resolve(data);
            }).fail(function (err) {
                deferred.reject(err);
            });
            return deferred.promise;
        }
    };
});
The displayed error I got: Uncaught TypeError: previousValue.then is not a function
I made a jsfiddle reusing others available, so that it could be easier to solve this: http://jsfiddle.net/alisatest/pf31g36y/3/ (How to wait for AsyncRequests for each element from an array using reduce and promises).
I don't know if the mistakes are:
the place where the resolve is placed in the controller function.
the way the reduce function is used
JavaScript sometimes sees previousValue and currentValue as a Promise initially and then as a number. In the jsfiddle I have a working example of the use of reduce and an HTTP request.
Look at this pattern for what you want to do:
cars.reduce(function(promise, car) {
return promise.then(function(){
return TheService.AsyncRequest(car).then(function (rData) {
$scope.details.push(rData);
});
});
}, $q.when());
This will make all the asynchronous calls for every car exactly in the sequence they appear in the cars array. $q.all may also be sufficient if the order in which the async calls are made doesn't matter.
It seems you are calling reduce on an array of ids, but assume in the passed function that you are dealing with promises.
In general, when you want to sync a set of promises, you can use $q.all
You pass an array of promises and get another promise in return that will be resolved with an array of results.
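A sketch of that approach for this question (the requests run in parallel, but $q.all preserves the order of the results, which is what matters here):

// Fire one request per car; $q.all resolves once all of them have resolved.
var promises = carLists.map(function (car) {
    return TheService.AsyncRequest(car.id);
});

$q.all(promises).then(function (results) {
    // results[i] corresponds to carLists[i], regardless of completion order
    $scope.interestedProspects = results;
}, function (err) {
    // the first rejection lands here
});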
I use mbostock/queue for queuing a few async operations. It is mostly to rate limit (the UI generates a few events, which the backend can only process slowly), and also to make sure they are processed sequentially. I use it like:
function request(d, cb) {
    // some async operation
    add.then(function() {
        cb(null, "finished");
    });
}

var addQ = queue(1);
addQ.defer(request); // called by a few requests at higher rates generated by the UI
I already use angular.js $q for async operations. So do I have to use mbostock/queue, or can I build a queue out of $q (which is in the spirit of https://github.com/kriskowal/q)?
Thanks.
Basic $q Chain Example
Yes you can build a chained queue using Angular's $q! Here is an example that shows you how you could use recursion to create a queue of any length. Each post happens in succession (one after another). The second post will not start until the first post has finished.
This can be helpful when writing to databases. If the database does not have its own queue on the backend and you make multiple writes at the same time, you may find that not all of your data is saved!
I have added a Plunkr example to demonstrate this code in action.
$scope.setData = function (data) {
// This array will hold the n-length queue
var promiseStack = [];
// Create a new promise (don't fire it yet)
function newPromise (key, data) {
return function () {
var deferred = $q.defer();
var postData = {};
postData[key] = data;
// Post the the data ($http returns a promise)
$http.post($scope.postPath, postData)
.then(function (response) {
// When the $http promise resolves, we also
// resolve the queued promise that contains it
deferred.resolve(response);
}, function (reason) {
deferred.reject(reason);
});
return deferred.promise;
};
}
// Loop through data creating our queue of promises
for (var key in data) {
promiseStack.push(newPromise(key, data[key]));
}
// Fire the first promise in the queue
var fire = function () {
// If the queue has remaining items...
return promiseStack.length &&
// Remove the first promise from the array
// and execute it
promiseStack.shift()()
// When that promise resolves, fire the next
// promise in our queue
.then(function () {
return fire();
});
};
// Begin the queue
return fire();
};
You can use a simple function to begin your queue. For the sake of this demonstration, I am passing an object full of keys to a function that will split these keys into individual posts, then POST them to Henry's HTTP Post Dumping Server. (Thanks Henry!)
$scope.beginQueue = function () {
$scope.setData({
a: 0,
b: 1,
/* ... all the other letters of the alphabet ... */
y: 24,
z: 25
}).then(function () {
console.log('Everything was saved!');
}).catch(function (reason) {
console.warn(reason);
});
};
Here is a link to the Plunkr example if you would like to try out this code.
The short answer is no, you don't need an extra library. Promise.then() is sufficiently "atomic". The long answer is: it's worth making a queue() function to keep code DRY. Bluebird-promises seems pretty complete, but here's something based on AngularJS's $q.
If I were making .queue() I'd want it to handle errors as well.
Here's an angular service factory, and some use cases:
/**
* Simple promise factory
*/
angular.module('app').factory('P', function($q) {
var P = $q;
// Make a promise
P.then = function(obj) {
return $q.when(obj);
};
// Take a promise. Queue 'action'. On 'action' failure, run 'error' and continue.
P.queue = function(promise, action, error) {
return promise.then(action).catch(error);
};
// Oook! Monkey patch .queue() onto a $q promise.
P.startQueue = function(obj) {
var promise = $q.when(obj);
promise.queue = function(action, error) {
return promise.then(action).catch(error);
};
return promise;
};
return P;
});
How to use it:
.run(function($state, YouReallyNeedJustQorP, $q, P) {
// Use a $q promise. Queue actions with P
// Make a regular old promise
var myPromise = $q.when('plain old promise');
// use P to queue an action on myPromise
P.queue(myPromise, function() { return console.log('myPromise: do something clever'); });
// use P to queue an action
P.queue(myPromise, function() {
throw console.log('myPromise: do something dangerous');
}, function() {
return console.log('myPromise: risks must be taken!');
});
// use P to queue an action
P.queue(myPromise, function() { return console.log('myPromise: never quit'); });
// Same thing, but make a special promise with P
var myQueue = P.startQueue(myPromise);
// use P to queue an action
myQueue.queue(function() { return console.log('myQueue: do something clever'); });
// use P to queue an action
myQueue.queue(function() {
throw console.log('myQueue: do something hard');
}, function() {
return console.log('myQueue: hard is interesting!');
});
// use P to queue an action
myQueue.queue(function() { return console.log('myQueue: no easy days'); });
});
Chained Promises
Angular's $q implementation allows you to chain promises, and then handle resolves of those promises according to your own logic. The methods are a bit different than mbostock/queue, but the intent is the same. Create a function that determines how your defered will be resolved (creating a promise), then make these available to a higher level controller/service for specific resolution handling.
Angular uses $q.defer() to create promise objects, which can then be resolved in whatever order you wish inside your application logic (or even skipped, mutated, intercepted, etc.).
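A minimal sketch of that idea (the step function and its names are only illustrative):

function step(name) {
    var deferred = $q.defer();
    // ...async work would go here; resolve when it finishes...
    deferred.resolve(name + ' done');
    return deferred.promise;
}

step('weather')
    .then(function (result) {
        console.log(result);      // runs first
        return step('flights');   // queues the next step
    })
    .then(function (result) {
        console.log(result);      // runs only after 'weather' completed
    });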
I'll throw down some code, but I found this 7-minute video at egghead.io to be the best short demo: https://egghead.io/lessons/angularjs-chained-promises, and it will do a FAR better job of explaining. Thomas (the presenter) builds a very small flight dashboard app that queues up weather and flight data, and processes that queue when a user queries their itinerary: ThomasBurleson/angularjs-FlightDashboard.
I will be setting up a smaller demonstration on codepen, using the situation of 'eating at a restaurant' to demonstrate this concept: http://codepen.io/LongLiveCHIEF/pen/uLyHx
Code examples here:
https://gist.github.com/LongLiveCHIEF/4c5432d1c2fb2fdf937d