Loading data into a webpage which requires multiple API calls - javascript

For my website I am using an API from which I need to load several variables. Each variable depends on the return value of the previous call (I use the value returned by call 1 to make call 2, etc.).
Example:
Say that I need to make 5 different API calls to gather all of my data, and each call depends on the return value of the previous one. In that case I do the following: I pass a callback function to the first function that loads the data. That function makes the first API call; when it finishes, it passes the callback to the next function, which makes the second API call, and so on. When the last API call finishes, the callback is invoked and I know that all the data has been loaded. In code it looks something like this (I am using the Trello API in my application, so I will use it in the example below):
function loadData(cb){
    //Make the first API call
    Trello.get('/member/me/boards', function(boards){
        myBoards = boards;
        for(var i = 0; i < boards.length; i++){
            //Make the second API call
            Trello.get('/boards/' + boards[i].id + '/lists', function(lists){
                boards[i].lists = lists;
                //Then make the third and fourth and so on
                .....
                //When all calls are made call the callback function
                cb();
            });
        }
    });
}
As you can see, the callback function gets passed a long way down the call stack. I was wondering if there is a better way to load the data and to store it (as of now I just store everything in a large array). And what are some best practices for loading large amounts of data from an API?
P.S. In my original code each of the API calls is in a separate function, but I simplified it here to reduce the amount of code in the example.

I don't know if this is an option for you, but using TypeScript with async/await makes solving this kind of JavaScript problem much simpler:
async function loadData() {
    const boards = await Trello.get('/member/me/boards');
    // Promise.all resolves once every per-board chain of requests has finished,
    // so the caller gets plain data rather than an array of pending promises
    return Promise.all(boards.map(async (board) => {
        const lists = await Trello.get('/boards/' + board.id + '/lists');
        const something = await Trello.get('/...');
        const somethingElse = await Trello.get('/...');
        // ...more calls
        return {
            ...board,
            lists: lists,
            something: something,
            somethingElse: somethingElse
            // ... more attributes
        };
    }));
}
loadData().then((data) => console.log(data));

Without fully understanding your problem this may not be a valid solution, but a quick glance at the Trello API docs shows a batch call you could use to avoid looping at each level. Batching would mean far fewer API calls at each level and would be considered a best practice:
function loadData(cb){
    //Make the first API call
    Trello.get('/member/me/boards', function(boards){
        myBoards = boards;
        var boardAPIs = [];
        var boardResponses = [];
        for(var i = 0; i < boards.length; i++){
            boardAPIs.push('/boards/' + boards[i].id + '/lists');
            //max of 10 at a time per documentation
            if (boardAPIs.length == 10 || i >= (boards.length - 1)) {
                //Make the second level API call
                Trello.get('/batch/?urls=' + boardAPIs.join(','), function(batchResponse){
                    // collect response information on all boards, then continue with third request
                    boardResponses.push(...);
                    if (i >= (boards.length - 1)) {
                        // all board requests have been made, continue execution at third level
                        // if this were the last level of calls, you could call cb() here
                        for(var j = 0; j < boardResponses.length; j++){
                            // loop inside responses to get individual board responses, build up next set of batch requests
                        }
                    }
                });
                boardAPIs = [];
            }
        }
    });
}
One thing to note here: the docs mention that you can only batch 10 requests at a time, so I added some code in there to check for that.
This post provides more information on how to consume the batch service:
this means you get only a single response back, and it looks a little different from a normal response. The response is an array of objects – but not of the normal response objects you might expect. Instead, it's an object with a single property, with a name set to the HTTP response code of the request.
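As a rough sketch of consuming that format (assuming, per the quote above, that each successful item sits under a "200" key), unwrapping a batch response might look like this:
Trello.get('/batch/?urls=' + boardAPIs.join(','), function (batchResponse) {
    // each item in the batch response is an object whose single property
    // is named after the HTTP status code of that individual request
    var lists = batchResponse.map(function (item) {
        return item['200']; // successful responses sit under the "200" key
    });
    // lists now holds the plain response bodies, ready for the next level of calls
});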

You could use a depth-first approach, so that the first data arrives at the client quickly:
function loadData(showChunk){
    //Make the first API call
    Trello.get('/member/me/boards', function(boards){
        myBoards = boards;
        (function getboard(i){
            //Make the second API call
            Trello.get('/boards/' + boards[i].id + '/lists', function(lists){
                boards[i].lists = lists;
                //Then make the third and fourth and so on
                .....
                //When all calls are made for this board, call the callback function, then continue with the next board
                showChunk();
                if(i+1<boards.length) setTimeout(getboard, 1, i+1);
            });
        })(0);
    });
}

Related

How to make pure js cycle of callbacks?

I have an API from which I need to query N pages of data. Because I do not want to overload the API, I want to do it sequentially and without blocking the main thread.
The code would be something like this:
var res = []; // all data from api
var totalPages = 10;
var pageSize = 100;

for (let page = 0; page < totalPages; page++) {
    // load using jQuery ajax request
    $.get('api.php', { page: page, page_size: pageSize }, function(result) {
        res.push(...result); // add data to resulting array
    });
}
But this approach has a few issues:
Since it is async, it will just run all the requests in parallel, overloading the API as a result. I need them to still run async, but each should run only when the previous one is done.
Since all the calls are async, by the end of the cycle we still wouldn't have the requested data - it will still be loading in the background. We need some way to wait for all the callbacks to finish before returning res to the other code that needs it.
There is no way for a callback to pass its result to the next callback, and that is the only way to stop loading when some callback receives a "stop loading"/"no more data" response from the server.
Is there a way to fix these issues without using third-party libraries or promises? Just plain old vanilla JavaScript.
Sorry if something looks unclear, I am not very experienced in js
You can use a simple recursion here, like this:
var res = []; // all data from api
var totalPages = 10;
var pageSize = 100;

const loader = page => {
    // load using jQuery ajax request
    $.get('api.php', { page: page, page_size: pageSize }, function(result) {
        res.push(...result); // add data to resulting array
        if (page + 1 < totalPages)
            loader(page + 1); // request the next page only after this one is done
        else
            console.log('DONE!');
    });
};

loader(0);
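To also cover the third point (stopping when the server says there is no more data), one variation on the same recursion is to check the result before asking for the next page. This is just a sketch; treating an empty result as "no more data" is an assumption about what the server returns:
const loader = page => {
    $.get('api.php', { page: page, page_size: pageSize }, function(result) {
        // assumption: an empty result means the server has no more data
        if (!result || result.length === 0) {
            console.log('DONE!');
            return;
        }
        res.push(...result); // add data to resulting array
        if (page + 1 < totalPages)
            loader(page + 1);
        else
            console.log('DONE!');
    });
};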

How do I simulate Database throttling in JavaScript?

Sometime ago I had a code question in a take home test. It was as follows:
Database Throttling
You are given an array userInfo of user data and a function updateDB that takes a single user-data argument. updateDB makes an asynchronous call that parses the user data and inserts the parsed data into a database. The database throttles requests, so to make sure all user data is added to the database we need a function addAllUserData that calls updateDB on each entry in userInfo, never exceeding 7 calls per second, to prevent being throttled.
var userInfo = [{'name':'antonio', 'username':'antonio_pavicevac_ortiz'}], dataBase = [];

function updateDB(singleUserDataArgument, callback){
    dataBase.push(callback(singleUserDataArgument));
}

function addAllUserInfo(data) {
    var eachUserData;
    setInterval(function(){
        eachUserData = data.map(data)
    }, 7000);
}
As you can see by my attempt, I am having a hard time wrapping my head around this exercise. Could anyone also explain what is meant by throttling in regard to async calls?
Thanks in advance!
// contains times at which requests were made
var callTimes = [];

function doThrottle(){
    // get the current time
    var time = new Date().getTime();
    // filter callTimes to only include requests made within the last second
    callTimes = callTimes.filter(function(t){
        return t > time - 1000;
    });
    // if 7 calls have already been made this second, do not make another one
    if(callTimes.length >= 7) return true;
    else{
        // safe, record this call and do not throttle
        callTimes.push(time);
        return false;
    }
}

// use like this
function makeRequest(){
    if(doThrottle()){ /* too many requests, throttle */ }
    else{ /* it's safe, make the ajax call */ }
}

Delayed Ajax calls to same URL using same data

While waiting for the back-end devs to implement a "cancel all" feature, which cancels all tasks tracked by the back end, I am attempting to improvise it by cancelling each individual task. The cancel REST service accepts an ID in the form of a data object {transferID: someID}.
I use a FOR loop to iterate over an array of IDs that I have stored elsewhere. Anticipating that people MAY end up with dozens or hundreds of tasks, I wanted to add a small delay that theoretically stays under the number of HTTP requests the browser can handle and also reduces the burst of load on the back-end CPU. Here is some code, with comments, for the purpose of this discussion:
ta.api.cancel = function (taskArray, successCallback, errorCallback) {
    // taskArray is ["task1","task2"]
    // this is just the latest attempt. I had an attempt where I didn't bother
    // with this and the results were the same. I THOUGHT there was a "back image"
    // type issue so I tried to instantiate $.ajax into two different variables.
    // It is not a back image issue, though, but one to do with setTimeout.
    ta.xhrObjs = ta.xhrObjs || {};

    for (var i = 0; i < taskArray.length; i++) {
        console.log(taskArray); // confirm that both task1 and task2 are there.

        var theID = taskArray[i];
        var id = {transferID: theID}; // convert to the format understood by REST
        console.log(id); // I see "task1" and then "task2" consecutively... odd,
        // because I expect to see the "inside the setTimeout" logging line next

        setTimeout(function () {
            console.log('inside the setTimeout, my id is: ')
            console.log(id.transferID);
            // "inside the setTimeout, my id is: task2" twice consecutively! Y NO task1?
            ta.xhrObjs[theID] = doCancel(id);
        }, 20 * i);
    }

    function doCancel(id) {
        // a $.Ajax call for "task2" twice, instead of "task1" then "task2" 20ms
        // later. No point debugging the Ajax (though for the record, cache is
        // false!) because the problem is already seen in the 'setTimeout' and
        // fixed by not setting a timeout.
    }
}
The thing is: I know setTimeout makes the function passed to it execute asynchronously. If I take out the timeout and just call doCancel in the iterator, it calls it on task1 and then task2. But although setTimeout makes the call async, I don't understand why it just does task2 twice. I can't wrap my head around it.
I am looking for a way to get the iterator to make the Ajax calls with a 20ms delay. But I need it to call on both! Anybody see a glaring error that I can fix, or know of a technique?
You need to wrap your setTimeout call in a function and pass the id variable into it, like this:
(function(myId, i) {
    setTimeout(function () {
        console.log('inside the setTimeout, my id is: ', myId);
    }, 20 * i);
}(theID, i));
This pattern does not create a unique variable1 for each iteration of the loop, as one might expect:
function () {
    for (var i = 0; i < length; i++) {
        var variable1;
    }
}
In JavaScript, variables are "hoisted". To quote Mozilla:
"Because variable declarations (and declarations in general) are processed before any code is executed, declaring a variable anywhere in the code is equivalent to declaring it at the top."
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/var
So it is effectively rewritten as:
function () {
    var variable1;
    for (var i = 0; i < length; i++) {
    }
}
What this means is that after the loop has finished, any asynchronous callbacks that reference this variable will see the last value of the loop.
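As a side note, if ES6 is available, declaring the loop variable with let gives each iteration its own binding, so the wrapping function is no longer needed. A minimal sketch mirroring the question's loop:
for (let i = 0; i < taskArray.length; i++) {
    const id = { transferID: taskArray[i] };
    setTimeout(function () {
        // each iteration captures its own id, so task1 and task2 both log correctly
        console.log('inside the setTimeout, my id is:', id.transferID);
    }, 20 * i);
}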

Need to Populate Javascript Array BEFORE (document).ready()

How can I get some javascript to run before a document ready function?
I need the following snippet to finish first...
var applicantlist = [];

$.getJSON("apps.json", function(appsdata) {
    for (var i = 0; i < appsdata.applications.length; i++){
        var tempapp = [appsdata.applications[i].name, appsdata.applications[i].server];
        applicantlist.push(tempapp);
    }
});
I've tested this, and the data gets pushed into the array just fine. The problem is that I need this array to make some ajax calls that are found in my page ready function as follows...
$(document).ready(function(){
    window.jsonpCallbacks = {};
    alert(applicantlist.length);

    for (var i = 0; i < applicantlist.length; i++){
        (function(index){
            window.jsonpCallbacks["myCallback" + index] = function(data){
                myCallback(data,index);
            };
        })(i);

        //Jquery/Ajax call to the WoW API for the data.
        $.ajax({
            "url":"http://us.battle.net/api/wow/character/" + applicantlist[i][1] + "/" + applicantlist[i][0] + "?jsonp=jsonpCallbacks.myCallback" + i,
            "type":"GET",
            "data": { fields: "items, talents, progression, professions, audit, guild, stats"},
            "dataType":"jsonp",
            "contentType":"application/json",
            "jsonpCallback":"jsonpCallbacks.myCallback"+i,
            "success":function(data1){
            }
        });
    }
});
All of this fires off before the first snippet finishes, no matter where I seem to put it. So the array is empty (the alert message just shows "0").
As you can see by the URL of my ajax call, I need that array populated beforehand. I've tried putting the first snippet in a separate .js file and calling it before all other javascript files on the actual HTML page...
What am I missing?
Move the code that sends the first request into document.ready. You don't usually want anything happening before the document is ready. Then move the code that sends the next request(s) into the callback of the first request, after you populate the array and do whatever else needs to happen first:
$(document).ready(function () {
    $.getJSON("apps.json", function(appsdata) {
        ...
        // add items to your array
        sendNextRequest();
    });
});

function sendNextRequest() {
    //Jquery/Ajax call to the WoW API for the data.
    ...
}
This guarantees that the calls to the WoW API don't get fired until the first $.getJSON call completes and you populate your array.
FYI, this is a common challenge in JavaScript: you need one operation to run only after another finishes. When you use ajax, you have callbacks, like in my example above, that help you achieve this. Outside of ajax requests, you can use jQuery Promises to defer tasks until after something else finishes.
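For what it's worth, $.getJSON also returns a jqXHR object that behaves like a promise, so the same ordering can be expressed without nesting everything; the function names below are just for illustration:
function loadApplicants() {
    // returning the jqXHR lets callers chain .then() on it
    return $.getJSON("apps.json", function (appsdata) {
        for (var i = 0; i < appsdata.applications.length; i++) {
            applicantlist.push([appsdata.applications[i].name, appsdata.applications[i].server]);
        }
    });
}

$(document).ready(function () {
    loadApplicants().then(function () {
        sendNextRequest(); // the array is populated before the WoW API calls start
    });
});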

get all JSON entries over x amount of calls

I'm accessing a JSON file which has 50 entries per page, over x number of pages.
I have the total number of entries, say 500 - which amounts to 10 pages.
I get the JSON data for page 1, pass the data to an array, and then repeat the function, but this time for page 2.
I have created the function and it loops perfectly incrementing and fetching each page, but it doesn't wait for the json data to be parsed and passed to the array before looping again.
Basically I want to wait until the data has been processed and then continue on.
My code so far is roughly this:
function getJsonData(metroID){
    currentPageNo = 0;
    totalPages = 'x';
    count = 0;

    function jsonLoop(){
        meroAreaSearchString = 'http://jsonurl'+currentPageNo;
        $.getJSON(meroAreaSearchString,{},function( data ){
            if(totalPages == 'x'){
                var totalEntries = data.resultsPage.totalEntries;
                var perPage = data.resultsPage.perPage;
                totalPages = (totalEntries/perPage);
                log(totalEntries+', '+perPage+', '+totalPages);
                log(Math.round(totalPages));
            }
            $.each(data.resultsPage.results.event, function(i,item){
                var name = item.displayName;
                var type = item.type;
                var valueToPush = new Array();
                valueToPush[0] = name;
                valueToPush[1] = type;
                valueToPush[3] = count;
                locations.push(valueToPush);
                count++;
            });
        });
        if(currentPageNo == totalPages){
            log(locations);
            alert('finished processing all results');
        }else{
            currentPageNo++;
            jsonLoop();
        }
        currentPageNo++;
        jsonLoop();
    }
}
Have you tried making the request synchronous?
Just put this piece of code at the top of your function getJsonData:
$.ajaxSetup({async:false});
You can specify the async option as false to get a synchronous Ajax request. This will block your function until the callback has set some data.
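The same option can also be passed per request rather than globally; a minimal sketch (keeping in mind that synchronous requests block the browser and are generally discouraged):
$.ajax({
    url: meroAreaSearchString,
    dataType: 'json',
    async: false, // block until this request finishes
    success: function (data) {
        // process data exactly as in the existing callback
    }
});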
The $.getJSON() function fires off an AJAX request and calls its callback function when the AJAX call resolves successfully, if that makes any sense.
Basically, that just means that given a call $.getJSON(url, data, callback);, jQuery will fire an AJAX request to url, passing data along with it, and call callback when that call resolves. Clear-cut and straightforward.
The thing you're missing here is that an AJAX call is just that -- as its name implies, it's asynchronous. This means that for the whole lifetime of the AJAX call, the other logic in your application keeps running instead of waiting for it to finish.
So something like this:
$.getJSON(url, data, callback);
alert('foo');
... will most probably result in an alert() call happening before your AJAX call completes. I hope that made sense.
To make sure that something happens after your AJAX call completes, you put that logic inside the callback. That's really what the callback is for.
$.getJSON(url, data, function (d) {
    something_you_want_done_after_ajax_call();
});
In the context of your problem, you just have to put all that conditional recalling of jsonLoop() into your callback. It's not very obvious right now because of your indenting, but it's currently outside your callback.
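A rough sketch of that rearrangement, showing only the relevant part of jsonLoop with the page-advancing logic moved inside the $.getJSON callback:
function jsonLoop(){
    meroAreaSearchString = 'http://jsonurl' + currentPageNo;
    $.getJSON(meroAreaSearchString, {}, function (data) {
        // ...work out totalPages and push this page's results into locations, as before...

        // only decide on the next page once this page's data has been handled
        if(currentPageNo == totalPages){
            log(locations);
            alert('finished processing all results');
        }else{
            currentPageNo++;
            jsonLoop();
        }
    });
}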
