How to ignore a previous Sammy "get" route if a newer one has started - javascript

I have a simple Sammy route declared like this:
$.sammy("#main", function () {
this.get(/\#\/(.*)/, function (context) {
context.load(url, { json: true }).then(function (result) {
// some result custom processing
return result.Html;
}).swap();
}).run('#/');
});
Everything is working very well, except for one small issue. If I click my links very fast, all the requests are started, but they sometimes finish in a different order than they were started, which is normal because some take longer than others.
Because they finish out of order, the content placed on screen by swap can belong to a page other than the one currently selected.
Is there any way to cancel or ignore a previous request (which is still being processed asynchronously) once a newer one has started?
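One common way to get that effect (a sketch, not something built into the snippet above; it assumes Sammy's app.swap(content) helper) is to tag each navigation with an incrementing token and only swap when the response still belongs to the latest route:

var latest = 0; // bumped on every navigation

$.sammy("#main", function () {
    this.get(/\#\/(.*)/, function (context) {
        var token = ++latest; // remember which navigation this response belongs to

        context.load(url, { json: true }).then(function (result) {
            if (token !== latest) { return; } // a newer route has started: drop this response
            // some result custom processing
            context.app.swap(result.Html);    // only the latest response updates the screen
        });
    });
}).run('#/');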

Related

Call an ajax request only if there are ajax requests added to the list

Setup
I have a form that uses a series of checkboxes to determine a number selected on the page.
When a box is selected, the page makes a call to a controller that saves the selection and returns the new checked total based on its current data.
The code looks like this:
$('#myId').on('click', '.selectorClass input', function () {
    const inputCheckBox = this;
    inputCheckBox.disabled = true;
    $(inputCheckBox).addClass('disabled-checkbox');
    $.ajax({
        method: "POST",
        url: "#(Url.Action<MyController>(c => c.MyEndPoint(Model.ItemId, null)))",
        data: {
            "boxWasChecked": inputCheckBox.checked
        }
    }).done(function (data) {
        UpdateFields(inputCheckBox, data); // this would update the column footer and 'Total Quantity' field below
    }).fail(function (data) {
        // error check and handle here
    }).always(function (data) {
        inputCheckBox.disabled = false;
        $(inputCheckBox).removeClass('disabled-checkbox');
    });
});
This works great when selecting one at a time and even usually works when selecting multiple.
Problem
The issue comes when a user selects boxes very quickly: sometimes the last request to finish was not the last one to reach the controller and read the database, so it brings back the wrong math for the total.
See the image below, where the total should be 11 (remember that it adds the value of each box rather than the number of boxes checked) but 6 is reported.
If I were to select the final check box, it would correctly report 12 because it has all the correct data in place for the call.
Part where I am stuck
I think I need to somehow keep a list of ajax calls and only kick off a request once all the requests in front of it have completed (or immediately, if it is the only one in the list). I don't know how to do this with ajax calls.
I have toyed around with promises and the Fetch API and tried to get that working, but I couldn't get the promises to add to the stack in Promise.all(myListOfAjaxCalls).
Summary of Code I am working towards
var listOfPromiseCalls = [];
$('#myId').on('click', '.selectorClass input', function () {
    // my checkbox is clicked
    listOfPromiseCalls.push(myAjaxCall /* or myPromise */);
    // Pseudo code:
    // begin the request if it is the only one in the list
    // else
    //   kick off the other requests already in the list in front of it
    //   if those requests finish before another input is clicked, begin this request
});
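For illustration, one minimal way to realize that pseudocode is to chain each new call onto a single "tail" promise so the requests start and finish in click order (a sketch, assuming jQuery 1.8+ promise chaining; myEndpointUrl stands in for the controller URL used above):

// serialize the calls: each new request starts only after every earlier one has settled
var queueTail = $.Deferred().resolve();

$('#myId').on('click', '.selectorClass input', function () {
    var inputCheckBox = this;
    inputCheckBox.disabled = true;

    var run = function () {
        return $.ajax({
            method: "POST",
            url: myEndpointUrl,                    // placeholder for the controller URL above
            data: { "boxWasChecked": inputCheckBox.checked }
        }).done(function (data) {
            UpdateFields(inputCheckBox, data);     // runs in click order, so the total stays consistent
        }).always(function () {
            inputCheckBox.disabled = false;
        });
    };

    // run after the previous call settles, whether it succeeded or failed
    queueTail = queueTail.then(run, run);
});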
I think you are on the right path by creating a list of all the requests, but since you are using jQuery to make the async calls, you should use the $.when operator to wait until all requests are finished. Promise.all won't work because $.ajax doesn't implement the same interface as new Promise() does, so you need to handle it with the jQuery operator.
You can use a list of all the async calls as you described above, and then do something like:
$.when(...listOfPromiseCalls)
    .done((...allResponses) => {
        // do something
    })
    .catch(errors => {
        // handle errors
    });
Notice that I'm using the ES6 spread syntax both to spread all the requests into $.when and to aggregate all the responses into a list of responses.
About the spread syntax: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Spread_syntax
If you can't use ES6 syntax, I recommend studying and using the apply method of JS functions, which gives you the same result.
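For example, the ES5 version with apply might look like this (a sketch following the same variable names as above):

// ES5 equivalent of $.when(...listOfPromiseCalls)
$.when.apply($, listOfPromiseCalls)
    .done(function () {
        // $.when passes one argument per request, in the order the calls were pushed
        var allResponses = Array.prototype.slice.call(arguments);
        // do something with allResponses
    })
    .fail(function (error) {
        // handle the first failure
    });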

NodeJS and Electron - request-promise in back-end freezes CSS animation in front-end

Note: Additional information is appended to the end of the original question as Edit #1, detailing how request-promise in the back-end is causing the UI freeze. Keep in mind that a pure CSS animation is hanging temporarily; you can probably just skip to the edit (or read it all for completeness).
The setup
I'm working on a desktop webapp, using Electron.
At one point, the user is required to enter and submit some data. When they click "submit", I use JS to show this CSS loading animation (bottom-right loader) and send the data asynchronously to the back-end...
- HTML -
<button id="submitBtn" type="submit" disabled="true">Go!</button>
<div class="submit-loader">
<div class="loader _hide"></div>
</div>
- JS -
form.addEventListener('submit', function(e) {
    e.preventDefault();
    loader.classList.remove('_hide');
    setTimeout(function() {
        ipcRenderer.send('credentials:submit', credentials);
    }, 0);
});
where ._hide is simply
._hide {
    visibility: hidden;
}
and where ipcRenderer.send() is an async method, with no option to configure it otherwise.
The problem
Normally, the 0ms delay is sufficient to allow the DOM to be updated before the blocking event takes place. But not here. With or without the setTimeout(), there is still a delay.
So, add a tiny delay...
loader.classList.remove('_hide');
setTimeout(function() {
    ipcRenderer.send('credentials:submit', credentials);
}, 100);
Great! The loader displays immediately upon submitting! But... after 100ms, the animation stops dead in its tracks, for about 500ms or so, and then gets back to chooching.
This working -> not working -> working pattern happens regardless of the delay length. As soon as the ipcRenderer starts doing stuff, everything is halted.
So... Why!?
This is the first time I've seen this kind of behavior. I'm pretty well-versed in HTML/CSS/JS, but am admittedly new to NodeJS and Electron. Why is my pure CSS animation being halted by the ipcRenderer, and what can I do to remedy this?
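One way to make "wait for the next paint" explicit instead of guessing a delay is to queue the send behind requestAnimationFrame (a sketch; it only guarantees the loader is painted before the blocking work starts, it does not remove the freeze itself):

form.addEventListener('submit', function (e) {
    e.preventDefault();
    loader.classList.remove('_hide');

    // requestAnimationFrame fires just before the next paint; the nested setTimeout
    // then runs after that paint, so the loader is visible before the IPC call blocks
    requestAnimationFrame(function () {
        setTimeout(function () {
            ipcRenderer.send('credentials:submit', credentials);
        }, 0);
    });
});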
Edit #1 - Additional Info
In the back-end (NodeJS), I am using request-promise to make a call to an external API. This happens when the back-end receives the ipcRenderer message.
var rp = require('request-promise');
ipcMain.on('credentials:submit', function(e, credentials) {
    var options = {
        headers : {
            ... api-key...
        },
        json: true,
        url : url,
        method : 'GET'
    };
    return rp(options).then(function(data) {
        ... send response to callback...
    }).catch(function(err) {
        ... send error to callback...
    });
});
The buggy freezing behavior only happens on the first API call. Successive API calls (i.e. refreshing the desktop app without restarting the NodeJS back-end) do not cause the hang-up. Even if I call a different API method, there are no issues.
For now, I've implemented the following hacky workaround:
First, initialize the first BrowserWindow with show:false...
window = new BrowserWindow({
    show: false
});
When the window is ready, send a ping to the external API, and only display the window after a successful response...
window.on('ready-to-show', function() {
    apiWrapper.ping(function(response) {
        if (response.error) {
            app.quit();
        } else {
            window.show(true);
        }
    });
});
This extra step means there is about a 500ms delay before the window appears, but then all successive API calls (whether .ping() or otherwise) no longer block the UI. We're getting to the verge of callback hell, but this isn't too bad.
So... this is a request-promise issue (and request-promise is asynchronous, as far as I can tell from the docs). I'm not sure why this behavior only shows up on the first call, so please feel free to let me know if you know! Otherwise, the little hacky bit will have to do for now.
(Note: I'm the only person who will ever use this desktop app, so I'm not too worried about displaying a "ping failed" message. For a commercial release, I would alert the user to a failed API call.)
It's worth checking how request-promise sets up module loading internally. Reading the source, it seems there is a kind of lazy loading (https://github.com/request/request-promise/blob/master/lib/rp.js#L10-L12) when request is first called. A quick try-out:
const convertHrtime = require('convert-hrtime');
const a = require('request-promise');
const start = process.hrtime();
a({uri: 'https://requestb.in/17on4me1'});
const end = process.hrtime(start);
console.log(convertHrtime(end));
const start2 = process.hrtime();
a({uri: 'https://requestb.in/17on4me1'});
const end2 = process.hrtime(start2);
console.log(convertHrtime(end2));
returns values like the following:
{ seconds: 0.00421092,
milliseconds: 4.21092,
nanoseconds: 4210920 }
{ seconds: 0.000511664,
milliseconds: 0.511664,
nanoseconds: 511664 }
The first call obviously takes longer than the subsequent one. (The numbers may of course vary; I ran this on bare node.js on a relatively fast CPU.) If module loading is the major cost of the first call, it will block the main process until the module is loaded (because node.js require resolution is synchronous).
I can't say this is the concrete reason, but it's worth checking. As suggested in the comments, try another lib or a bare internal module (like Electron's net) to rule it out.
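If lazy module loading really is the cost, a low-effort mitigation (a sketch under that assumption, not a confirmed fix) is to pay the price once at main-process startup rather than on the first user-triggered request:

// main process, at startup: load request-promise (and its dependency tree)
// before any UI interaction can trigger the first real call
var rp = require('request-promise');

// optional warm-up call so any remaining lazy initialization happens now;
// the URL is a placeholder and failures are deliberately ignored
rp({ uri: 'https://example.com/ping', json: true })
    .catch(function () { /* warm-up only */ });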

How to run a function when all the data has loaded?

I process thousands of points asynchronously in the ArcGIS JS API. In the main function I call functions that process individual features, but I need to finalize the processing once all the features are processed. There should be an event for this, but I didn't find one and I'm afraid it doesn't even exist - it would be hard to state that the last item processed was the last of all. .ajaxStop() should do this, but I don't use jQuery, just Dojo. The closest thing I found in Dojo was Fetch and its onComplete, but as far as I know that is about fetching data via AJAX, not from another JS function.
The only workaround idea I have now is to count how many features are to be processed and then fire when the output points array reaches the desired length, but I need to know that count at load time. How do I get it while loading? Tracking the data back to the point where they are read from the server would mean modifying functions I'm not supposed to even know about, which is not possible.
EDIT - some of my code:
addData: function (data) {
    dojo.addOnLoad(
        this.allData = data,
        this._myFunction()
    );
},
Some comments:
data is an array of graphics
when I view data in the debugger, its count is 2000, then 3000, then 4000...
without dojo.addOnLoad, the count started near zero; now it's around 2000, but still a fraction of the real number
_myFunction() processes all the 2000...3000...4000... graphics in this._allData, and returns wrong results because it needs all of them to work correctly
I need to delay the execution of _myFunction() until all the data have loaded, perhaps by using some other event instead of dojo.addOnLoad.
Workarounds I already thought of:
a) setTimeout()
This is clearly the wrong option - any magic number of milliseconds to wait for would fail if the data contain too many items, and it would delay even the case of a single point in the array.
b) length-based delay
I could replace the event with something like this:
if (data.length == allDataCount) {
    this._myFunction();
}
setTimeout(this._thisFunction, someDelay);
or some other implementation of the same idea, through a loop or a counter incremented in the asynchronously called functions. The problem is how to make sure the allDataCount variable is definitive and not just the number of features loaded so far.
EDIT2: the pointer to deferreds and promises from #tik27 definitely helped me, but the best thing I found on converting synchronous code to a deferred was this simple example. I probably misunderstood something, because it doesn't work any better than the original synchronous code; this.allData still can't be guaranteed to hold all the data. The loading function now looks like this:
addData: function (data) {
    var deferred = new Deferred();
    this._addDataSync(data, function (error, result) {
        if (error) {
            deferred.reject(error);
        }
        else {
            deferred.resolve(result);
        }
    });
    deferred.promise.then(this._myFunction());
},

_addDataSync: function (data, callback) {
    callback(this.allData = data);
},
I know most use cases of deferreds assume requesting data from some server. But this is the first time I can work with the data without breaking functions I shouldn't change, so tracking the data back to the request is not an option.
addOnLoad is for waiting on the DOM.
If you are waiting for one function to complete before running another, deferreds/promises are what is used.
I would need more info on your program to give you more specific answers.
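A minimal sketch of that approach with Dojo's promise utilities (features, processFeature, doAsyncWork, and finalizeProcessing are assumed names, not the asker's actual functions):

require(["dojo/Deferred", "dojo/promise/all"], function (Deferred, all) {

    // wrap the per-feature processing so it reports completion as a promise
    function processFeature(feature) {
        var deferred = new Deferred();
        doAsyncWork(feature, function (error, result) { // assumed async worker with a node-style callback
            if (error) { deferred.reject(error); }
            else { deferred.resolve(result); }
        });
        return deferred.promise;
    }

    // one promise per feature; finalize exactly once, when every feature has resolved
    all(features.map(processFeature)).then(function (results) {
        finalizeProcessing(results); // the "_myFunction" step from the question
    }, function (error) {
        console.error("a feature failed to process", error);
    });
});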
I sort of solved my problem by delaying the call to my layer's constructor until the map loads completely and the "onUpdateEnd" event fires. This is probably how it should properly be done, so I'm posting this as an answer and not as an edit to my question. On the other hand, I have no control over other calls to my class and I would prefer to have another line of defense against incomplete inputs, or at least a way to tell whether I should complain about incomplete data or not, so I'm keeping the answer unaccepted and the question open for more answers.
This didn't work when I reloaded the page, but then I figured out how to properly chain event listeners together, so I can now combine "onUpdateEnd" with extent change or any other event. That's perfectly enough for my needs.
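A sketch of that kind of chaining with the 3.x-style map events ("update-end" and "extent-change" are real map events; MyCustomLayer, options, and refresh() are stand-ins for the asker's own class):

// wait for the map's first full refresh before constructing the custom layer,
// then keep reacting to later extent changes
var initialLoad = map.on("update-end", function () {
    initialLoad.remove();                      // run the constructor only once
    var layer = new MyCustomLayer(options);    // stand-in for the actual layer class
    map.addLayer(layer);

    map.on("extent-change", function (evt) {
        layer.refresh();                       // stand-in: reprocess for the new extent
    });
});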

How Async really works and How to use it properly with node.js (node-webkit)

For this problem I am using Node-Webkit (node.js) and Async, building a Windows app.
The reason for this question is to definitively answer:
What does asynchronous execution really mean in Javascript and Node.js?
My personal code problem is at the end of the question, under "The Case".
I am going to explain the problem with a schematic summary. (And I will update the info as you help me understand it.)
The Concept (theory)
Imagine a Primary Screen (JS, HTML, CSS, ... Node.js frameworks) and a Background Procedure (a JS job executed every 10 min, internal JS checks, background database optimization, ...).
Whatever you do on the Primary Screen won't affect the background execution (except in some important cases), and the Background can even change the Screen if it needs to (screen timers, info about online web status, ...).
Then the behaviour is like:
Thread 1: your actions inside the app framework. Thread 2: background app routines.
Each action, as it finishes, gives its output to the screen, regardless of the other actions running in async parallel.
The Interpretation (For me)
I think this is something that "Async" will handle without problems, as parallel execution.
async.parallel([
    function(){ ... },
    function(){ ... }
], callback); //optional callback
So Thread 1 and Thread 2 can work together correctly as long as they do not touch the same code or instruction.
The content will keep changing while either thread requests something of it or writes something to it.
The Implementation (Reality)
The code is not fully asynchronous during execution; there are sync parts with common actions that call the async code when they need it.
Sync: startup with containers -> Async: load multiple pieces of content and do general stuff -> Sync: do an action on the screen -> ...
The Case
So here is my code that is not working properly:
win.on('loaded', function() {
    $("#ContentProgram").load("view/launcherWorkSpace.html", function() {
        $("#bgLauncher").hide();
        win.show();
        async.parallel([
            function() // **Background Process: access the DB and return HTML content**
            {
                var datacontent = new data.GetActiveData();
                var exeSQL = new data.conn(datacontent);
                if (exeSQL.Res)
                {
                    var r = exeSQL.Content;
                    if (r.Found)
                    {
                        logSalon = new data.activeSData(r)
                        $('#RelativeInfo').empty();
                        $("#RelativeInfo").html("<h4 class='text-success'>Data found: <b>" + logData.getName + "</b></h4>");
                    }
                }
            },
            function() // **Foreground Process: see an effect on screen during load.**
            {
                $("#bgLauncher").fadeIn(400);
                $("#centralAccess").delay(500).animate({bottom:0},200);
            }
        ]);
    });
});
As you can see, I'm not using "Callback()" because I don't need to (and it does the same either way).
I want the Foreground Process to run even if the Background Process has not finished, but what actually happens is that both results appear on screen only once both requests have finished...
If I disconnect the DB manually, the first function takes 3 seconds until it throws an exception (which I don't handle). Until then, neither process outputs (shows on screen) anything. (The Foreground Process should be launched whatever happens to the Background Process.)
Thanks, and sorry for so much explanation for something that looks trivial.
EDITED
This is starting to get annoying... I tried it without Async, with just plain JavaScript and a callback, like this:
launchEffect(function () {
    var datacontent = new data.GetActiveData();
    var exeSQL = new data.conn(datacontent);
    if (exeSQL.Res)
    {
        var r = exeSQL.Content;
        if (r.Found)
        {
            logData = new data.activeData(r)
            $('#RelativeInfo').empty();
            $("#RelativeInfo").html("<h4 class='text-success'>Salón: <b>" + log.getName + "</b></h4>");
        }
    }
});

function launchEffect(callback)
{
    $("#bgLauncher").fadeIn(400);
    $("#centralAccess").delay(500).animate({bottom:0},200);
    callback();
}
Even with this... the jQuery effects don't run until the callback has returned...
node-webkit lets you run code written like node.js code, but it is ultimately just a shim running in WebKit's JavaScript runtime and has only one thread, which means that most 'asynchronous' code will still block the execution of any other code.
If you were running node.js itself, you'd see different behavior, because it can do genuinely asynchronous work behind the scenes. If you want more threads, you'll need to supply them in your host app.
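Concretely for the code above, since data.conn() looks synchronous, async.parallel cannot make it run beside the animation; one small mitigation (a sketch, reusing the question's own objects) is to start the visual effect first and postpone the blocking call until after the first paint:

// start the purely visual work immediately
$("#bgLauncher").fadeIn(400);
$("#centralAccess").delay(500).animate({ bottom: 0 }, 200);

// give the renderer a moment to paint before the synchronous DB call blocks the thread
setTimeout(function () {
    var datacontent = new data.GetActiveData();
    var exeSQL = new data.conn(datacontent);   // still blocks, but only after the effect has started
    if (exeSQL.Res && exeSQL.Content.Found) {
        $("#RelativeInfo").html("<h4 class='text-success'>Data found</h4>");
    }
}, 50);

The animations will still freeze for the duration of the blocking call once it starts; the only real fix is to make the DB access genuinely asynchronous (a callback-based driver, a child process, or a worker) so the UI thread is never held up.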

Sequential web service call not working

A little (!) bit of background before I can get to the question:
I am implementing a web-based search solution. Technologies used: javascript (jquery), .net, html, etc.
All my web service calls are made through javascript (cross-domain WS calls). I have a few sequential web service calls, each with a different success callback function.
I can't figure this out - when I call those web services individually in separate places they return proper results, but called sequentially they sometimes do and sometimes don't.
Sample code (this is not giving the expected results all the time):
function submitSearchRequest(_queryString, Stores) {
    if (Stores[1].length > 0) {
        //generate 'searchRequestForArtifact' request object
        getSearchResponse("successcallForArtifact", _searchRequestForArtifact);
    }
    if (Stores[2].length > 0) {
        //generate 'searchRequestForPerson' request object
        getSearchResponse("successcallForPerson", _searchRequestForPerson);
    }
}

function successcallForArtifact(response)
{
    //show the results
}

function successcallForPerson(response)
{
    //show the results
}
If you need the calls to be sequential, you will need to kick off each search only after the previous one has returned. Currently you are making async calls, meaning each one is kicked off and the code simply continues; if the second call is faster, the order will be off. You will either need to make synchronous calls or enforce the order by calling the second search from the success function of the artifact search.
If you are using jQuery, which it seems you are, you can set the async parameter to false, which will force the order you want, but it will slow the overall performance of your page. See this question.
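A sketch of the chaining approach with plain $.ajax (artifactSearchUrl and personSearchUrl are placeholder endpoints; the callbacks are the ones from the question):

function submitSearchRequest(queryString, stores) {
    // the person search is only issued once the artifact search has returned,
    // so the results always arrive in a fixed order
    function searchPerson() {
        if (stores[2].length > 0) {
            $.ajax({ url: personSearchUrl, data: { q: queryString } })   // placeholder endpoint
                .done(successcallForPerson);
        }
    }

    if (stores[1].length > 0) {
        $.ajax({ url: artifactSearchUrl, data: { q: queryString } })     // placeholder endpoint
            .done(function (response) {
                successcallForArtifact(response);
                searchPerson();
            });
    } else {
        searchPerson();
    }
}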
