Aurelia stop overload of http requests - javascript

I'm using Aurelia and have multiple promises calling my API. If I keep clicking the button that fires those promises, over and over, the back end will time out. How can I stop or cancel the pending promises and keep only the newest one, so as not to overwork the API and cause a timeout?

This is not specific to Aurelia; it happens with any asynchronous event implementation.
When you click on a button, the event handler is called.
The event handler sends an asynchronous AJAX request to your server. Because it's async, your code and your application continue to run while the request is in flight.
The user can click the button again, and a second AJAX request can be sent at the same time, assuming they click faster than the requests complete.
To make it worse, concurrent requests may even complete out of order (i.e. a later request completes before an earlier one). Your code should be prepared to handle that properly.
If you don't want this behaviour, it is up to you to prevent the user from submitting again until the AJAX request completes. For example you could:
Disable the button while the request is in flight (see the sketch below); or
Display a modal "loading" screen / spinner to prevent any interaction with the application while the request is in flight.
Note that giving the user feedback is good UX anyway. A network request might be delayed for a whole lot of reasons, and it is a good idea to give some feedback to let them know that something is happening.
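A minimal sketch of the first option, assuming an Aurelia view-model (the saving flag, the api service and its save() call are illustrative, not from the question):

export class MyForm {
    constructor(api) {
        this.api = api;      // hypothetical service that talks to the back end
        this.saving = false; // true while a request is in flight
    }

    async save() {
        if (this.saving) return; // ignore extra clicks while a request is pending
        this.saving = true;
        try {
            await this.api.save(this.model); // the in-flight API call
        } finally {
            this.saving = false; // re-enable whether it succeeded or failed
        }
    }
}

In the view, binding disabled.bind="saving" on the button (alongside click.delegate="save()") then blocks further clicks until the response has arrived.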

Related

jQuery async call not working on user event

In my webpage there are many on-load AJAX calls, and those work fine.
Each action takes as long as its processing requires; in other words, whichever action completes sends its response without waiting for the others to finish.
But if I try to do the same thing from an on-click (or any other user) event, all the AJAX calls run synchronously. I want them not to wait for the first running action to finish; they should all start and complete independently.
I am using jQuery 1.8, where the default is async = true.
Please help me resolve this.
The issue may be due to a session lock.
You can find more detail here:
http://php.net/manual/en/function.session-write-close.php
Call the session_write_close() function if there is no session write activity in your script or function.
Such issues are seen with concurrent AJAX calls when a previous call has done some session write activity; in that case the session stays locked until that call finishes executing.
If you set async to true, they will all fire at the same time.
Explicitly declare the async: true option in each AJAX call, as in the sketch below.
Useful jsFiddle link: http://jsfiddle.net/jquerybyexample/tS349/
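A minimal sketch of that suggestion, assuming jQuery (the URLs and the selector are placeholders): each call fires independently from the click handler, and none waits for the others.

$('#run').on('click', function () {
    ['/action1', '/action2', '/action3'].forEach(function (url) {
        $.ajax({
            url: url,
            async: true, // explicit, although true is already the default
            success: function (data) {
                console.log(url + ' finished'); // each call reports as it completes
            }
        });
    });
});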

window.onbeforeunload, closing browser and synchronous AJAX

I've been searching for any reasonable example of a situation when synchronous AJAX makes sense. I've found this SO question, where the author mentions window.onbeforeunload and claims that "the request would never stop" if the AJAX call was asynchronous.
Can anybody explain this? I mean, window.onbeforeunload is fired when the user wants to close the tab. What would have to be going on for the tab to still be alive, even though somebody clicked to close it? Can somebody give a more specific example?
He didn't say the request will never stop; he said it will never complete. That is, your code has no hope of ever getting the response back, because the execution environment (the window) would disappear from existence before that happened.
The tab will close when window.onbeforeunload exits with a truthy value. Thus, as long as it is running, the page is waiting for the return value and not closing. This allows a synchronous AJAX request to be sent, and the response to be received and processed. If the request is asynchronous, the code constructs the XHR object, then exits, and the page (and your code) goes away.
I have never tested this, but the answerer apparently believes (and I don't think it unreasonable) that the page might not stick around long enough for an async XHR even to be sent, let alone to receive a response. Thus, if you want to be sure the server receives the information that the user closed the page, you want to make the request synchronous.
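A minimal sketch of that approach (the /log-exit endpoint and payload are placeholders, not from the answer): the third argument to open() makes the request synchronous, so the handler blocks until the server has responded.

window.onbeforeunload = function () {
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/log-exit', false); // false = synchronous
    xhr.setRequestHeader('Content-Type', 'application/json');
    xhr.send(JSON.stringify({ leftAt: Date.now() })); // blocks until the response arrives
    // No return value, so no confirmation prompt is shown and the tab closes.
};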
What had to be going on to make the tab still alive, even though somebody clicked to close it? Can somebody give more specific example?
Sending a synchronous XMLHttpRequest on unload is the only way to guarantee delivery of session data when a user-agent unloads the page (and may never re-visit your site again). There are two specific cases for this:
Tracking - Tracking and reporting the total session time for a user's page visit.
Batching - Coalescing and deferring delivery of batched session data to reduce the number of server requests.
The Beacon spec (navigator.sendBeacon) was designed to optimize this specific case, making it possible to send asynchronous requests guaranteed to still complete even after the page unloads.
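A minimal sketch of that API (the endpoint and payload are illustrative): sendBeacon queues the data, and the browser delivers it even after the page has gone away.

window.addEventListener('unload', function () {
    var payload = JSON.stringify({ sessionMs: performance.now() });
    navigator.sendBeacon('/analytics', payload); // queued; survives the unload
});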

Will an async ajax query complete if triggered by window.onbeforeunload?

I have tested this, and it seems that even if the page unloads, the async action completes regardless of whether the tab is closed or navigates to a new URL. The server it's calling, however, is pretty snappy.
What is the process behind the scenes when it comes to async requests running while tabs are closed? At what point will the browser just call it quits on an async request that the page has started?
You cannot rely on the behavior of async calls made from within onbeforeunload and onunload; it varies between servers.
We had an app that ran on an Apache server, Windows in our dev environment and Unix in the release environment. It turns out that when the server was configured to handle requests in threads (the default on our Windows/dev boxes), the AJAX call would always complete; when it was configured to handle requests in processes (the default in our Unix/prod environment), it would always cancel!
What happens is that the AJAX request fires, and then the browser unloads the page, which closes the connection for the AJAX reply. We set up a test where the AJAX call would execute a 4-second sleep on the server to avoid any timing issues. It looked like, with a threaded back end, Apache did not notice the client closing the connection until after it returned from the sleep, whereas with the child-process back end, it aborted the AJAX call immediately.
The answer is to use synchronous requests in your on[before]unload event handler. That will keep the page open until the request completes. That does mean an additional delay when the page switches/reloads/goes back...
...and also you have no guarantee that the next page will see the results of that request; it seems that some browsers will GET the next page before firing the onunload event! But that's a topic for another question.

State of XMLHttpRequest object in jQuery AJAX

In traditional JavaScript AJAX, we know the readyState values:
0 - The request is not initialized
1 - The request has been set up
2 - The request has been sent
3 - The request is in process
4 - The request is complete.
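In plain JavaScript those states are observed through onreadystatechange; a minimal sketch (the URL is a placeholder):

var xhr = new XMLHttpRequest();
xhr.open('GET', '/long-task');
xhr.onreadystatechange = function () {
    if (xhr.readyState === 3) {
        console.log('processing, ' + xhr.responseText.length + ' bytes so far');
    } else if (xhr.readyState === 4) {
        console.log('complete, status ' + xhr.status);
    }
};
xhr.send();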
When it comes to jQuery AJAX, we have:
the complete callback, where we code what should happen after completion;
the success callback, where we code what should happen if the AJAX request succeeds; and
the error callback, where we code what should happen if the AJAX request fails.
All of the above let us run code after the AJAX request has completed. Where can I specify some code to execute during processing (when readyState is 3) in jQuery AJAX?
My AJAX script takes a long time to execute, which means I will not reach the 'complete' stage quickly, and in the meantime it looks to the user like nothing is happening. I want to initiate another AJAX script at the processing stage that fetches information from the server and shows the user what has been done so far. Is that possible at all in JavaScript? I know there is no multi-threading in JavaScript.
I think I made myself clear, but please let me know if anything does not make sense.
I handle this by initiating the first long-running request, returning to the user immediately, and allowing the process to fork server-side for the extended processing.
The initial AJAX return sets the user up to 'watch' that process via a flag on the object (I store the flag against the object in the database, but you could, for instance, watch file sizes or other things).
Subsequent AJAX calls occur in a loop, each one scheduling the next via setTimeout, and report changes to that flag, so the progress of the long-running process is visible. Completion of the long-running process prompts NOT scheduling another setTimeout() and showing the overall results.
If your process is not that intensive, a simple spinner would probably do the job, with no extra work for your server process. I usually handle that by having $.ajax flip the visibility of a 'spinner' icon that is preloaded in the same spot on all my pages.
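A minimal sketch of that pattern, assuming jQuery (the endpoints and status fields are illustrative, not from the answer): start the job, then poll a status URL on a setTimeout loop until it reports completion.

function startLongJob() {
    $.post('/start-job', function (job) {
        pollProgress(job.id); // begin watching the flag for this job
    });
}

function pollProgress(jobId) {
    $.getJSON('/job-status/' + jobId, function (status) {
        $('#progress').text(status.message); // show what has been done so far
        if (status.done) {
            $('#progress').text('Finished: ' + status.result);
        } else {
            setTimeout(function () { pollProgress(jobId); }, 1000); // schedule the next poll
        }
    });
}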
According to jQuery's Ajax documentation, they do not expose the readystate change event:
No onreadystatechange mechanism is provided, however, since success, error, complete and statusCode cover all conceivable requirements.
It would be possible to show a loading image after the initial AJAX request is kicked off (and before getting any "complete" or "success" events), and then start polling a different URL via AJAX that reports the status of the first request, assuming your server can expose the progress of the long process before it completes.

Using beforeunload to execute task before page exit

I'm trying to get an action executed before the user leaves the page; this requires a GET request to be executed.
I would like to give the GET request at least 50-100 ms so that it gets forwarded to the web server.
I'm capturing the beforeunload action using:
$(window).bind('beforeunload', function() {
    send_data();
});
Any tips?
You have the highest chance of getting your request sent if you make it synchronous (async: false in the $.ajax() options if you are using jQuery).
But you cannot be sure it always gets through...
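A minimal sketch of send_data() along those lines (the /track endpoint and payload are placeholders): async: false keeps the page alive until the request has completed.

function send_data() {
    $.ajax({
        url: '/track',
        type: 'GET',
        async: false, // block the unload until the request completes
        data: { leftAt: Date.now() }
    });
}

$(window).bind('beforeunload', function() {
    send_data();
});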
You should prevent the default action with event.preventDefault().
Note that not every browser supports beforeunload.
