Google Chrome scroll freezing while appending ajax content on a page - javascript

I have a problem with the Chrome scrollbar; on Mozilla Firefox there is no such problem.
I have a couple of synchronous AJAX requests followed by some info being appended to the page; they take about 2 seconds to load. During this time the scrollbar freezes and is unusable; when the AJAX calls finish, scrolling works fine again.

Javascript is completely single-threaded.
If you make multiple AJAX calls, you will receive each response as
soon as the server sends it; the order depends on the amount of time
that it takes the server to send each reply.
If your code is still running when the server replies, the reply will
only be processed after your code finishes.
You should try to load all of the data in a single request.
Source: SLaks
I might be hitting a wall here; in other programming languages you can start a new thread for connections and things like that, and if you run multiple threads (kind of like layers) your interface stays the way it is/was.

When you use synchronous AJAX the page stops until the AJAX call finishes, so if you want the page not to stop, it must be an asynchronous AJAX call.
You can see more here:
Documentation AJAX W3Schools
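A minimal sketch of the difference (assuming a hypothetical endpoint /data.php and a #content element on the page); with async: false the browser blocks until the response arrives, while the asynchronous form hands the result to a callback:

// Synchronous: blocks the UI (including scrolling) until the response arrives
var result;
$.ajax({
    url: '/data.php',          // hypothetical endpoint
    async: false,
    success: function (data) { result = data; }
});

// Asynchronous (the default): the page stays responsive,
// and the DOM is updated inside the callback when the data arrives
$.ajax({
    url: '/data.php',
    success: function (data) {
        $('#content').append(data);
    }
});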

The problem you are describing is not a browser-behavior problem.
When you make a synchronous request, the code blocks while waiting for the response.
Since JavaScript is a single-threaded language (let's ignore web workers for now),
UI processing/manipulation is blocked as well,
and this is why the browser or scroll bar gets "stuck".
The reason it works on Firefox is that synchronous calls are deprecated (precisely because of this "stuck" behavior), and what you are actually doing there is an asynchronous request.
You can read more about it here: https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest/Synchronous_and_Asynchronous_Requests
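For reference, the plain XMLHttpRequest version of the non-blocking approach might look like this (a sketch; /data.php and #content are placeholders). Passing true as the third argument to open() makes the request asynchronous, so the UI thread is never blocked:

var xhr = new XMLHttpRequest();
xhr.open('GET', '/data.php', true);   // true = asynchronous
xhr.onload = function () {
    if (xhr.status === 200) {
        // append the response once it arrives, without freezing the page
        document.getElementById('content').innerHTML += xhr.responseText;
    }
};
xhr.send();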

Related

Is it Possible to Perform a background PHP server task from an AJAX call that will not lockup your site?

I have a server-side function like this:
function very_long_task($data) {}
This function is called client-side using $.ajax().
The problem is that when my server-side function very_long_task() is executed, the site is locked down. Meaning that if I try to view another page of the website from a different tab or window, it will not load until the very_long_task() function has completed.
Is there any way to get around this, either server-side or client-side?
UPDATED: 2015-11-3
The AJAX call is actually called many times because it is looping through all the elements in a list and performing an action on each of them. The very_long_task() function is then being called on each element.
For example, if there were a list of 20 elements then the very_long_task() function would be called 20 times. This does help a little bit in the overall responsiveness on that page but not on other pages.
UPDATED: 2015-11-3
Also this is built with WordPress so I can leverage some of their functions, but I have had no luck with wp_schedule_single_event since I need a return value.
https://codex.wordpress.org/Function_Reference/wp_schedule_single_event
UPDATED: 2015-11-3
Here is an updated view of my function:
function very_long_task($data) {
session_write_close();
// Very long task...
return $data;
}
You'll want to call session_write_close() as soon as possible.
This is because while one page has called session_start(), the session file will be locked until the page finishes execution, or until the session is closed.
If this is not done, any page calling session_start() will wait for the lock to be lifted.
UPDATE
I think I know what's going on:
your browser limits the number of simultaneous connections to a server, typically somewhere between 2 and 10.
If you're making 20 asynchronous AJAX calls, and you open the Developer Console (F12 / control-shift-I), you'll probably find that not all of them are executing simultaneously. This would certainly leave no room for additional connections.
Note that the session_write_close() call is still necessary, otherwise the AJAX calls will execute serially.
SUGGESTION
So, it is best to only make one AJAX call.
If you want parallelism, you can fork child processes server-side.
You probably won't be able to use jQuery for this, because you'll want to send data from the server and flush() it as it becomes available (HTTP streaming).
One solution I used in a WP importer plugin is not to use AJAX at all, but perform the long running operation, pushing out HTML and a <script> tag to update the UI.
I'm not entirely sure what you mean by "locked down" but below are some things to try:
Make sure that your AJAX is asynchronous
$.ajax({
url: '/start_very_long_task.php',
async: true
});
Make sure your PHP accommodates the expected behavior
// start_very_long_task.php
function start_very_long_task()
{
ini_set('ignore_user_abort', 'on');
ini_set('max_execution_time', 0);
session_write_close();
do_very_long_task();
}
function do_very_long_task()
{
// Very long task stuff
// This can recursively call itself without
// making multiple calls to session_write_close(), etc...
}
start_very_long_task();

window.onbeforeunload, closing browser and synchronous AJAX

I've been searching for a reasonable example of a situation when synchronous AJAX makes sense. I found this SO question, where the author mentions window.onbeforeunload and claims that "the request would never stop" if the AJAX call was asynchronous.
Can anybody explain this? I mean - window.onbeforeunload is fired when the user wants to close the tab. What had to be going on to make the tab still alive, even though somebody clicked to close it? Can somebody give more specific example?
He didn't say the request will never stop; he said it will never complete. That is, your code has no hope of ever getting the response back, because the execution environment (the window) would disappear from existence before that happened.
The tab will close when window.onbeforeunload exits with a truthy value. Thus, as long as it is running, the page is waiting for the return value, and not closing. This allows a synchronous AJAX to be sent, and for the response to be received and processed. If the request is asynchronous, the code constructs XHR object, then exits, and the page (and your code) goes away.
I have never tested this, but the answerer apparently believes (and I don't think it unreasonable) that the page might not stick around long enough for an async XHR to even be sent, let alone to receive a response. Thus, if you want to be sure the server receives the information that the user closed the page, you want to have the request synchronous.
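For illustration, the pattern described above looks roughly like this (a sketch; /log is a hypothetical endpoint). The false flag makes the request synchronous, so the handler does not return, and the page does not go away, until the server has received the data:

window.onbeforeunload = function () {
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/log', false);   // false = synchronous: blocks until sent
    xhr.setRequestHeader('Content-Type', 'application/json');
    xhr.send(JSON.stringify({ event: 'page-close', time: Date.now() }));
    // returning nothing (a falsy value) means no confirmation dialog is shown
};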
What had to be going on to make the tab still alive, even though somebody clicked to close it? Can somebody give more specific example?
Sending a synchronous XMLHttpRequest on unload is the only way to guarantee delivery of session data when a user-agent unloads the page (and may never re-visit your site again). There are two specific cases for this:
Tracking - Tracking and reporting the total session time for a user's page visit.
Batching - Coalescing and deferring delivery of batched session data to reduce the number of server requests.
The Beacon spec (navigator.sendBeacon) was designed to optimize this specific case, making it possible to send asynchronous requests guaranteed to still complete even after the page unloads.
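A minimal sendBeacon sketch (assuming a hypothetical /analytics endpoint); the browser queues the data and delivers it even after the page has unloaded:

var pageLoadTime = Date.now();

window.addEventListener('unload', function () {
    var payload = JSON.stringify({ sessionTime: Date.now() - pageLoadTime });
    navigator.sendBeacon('/analytics', payload);   // returns true if the data was queued
});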

How to make auto-updating (ajax) counter correctly? Or how to disable network log?

I'm trying to make an auto-reloading counter (for example: Messages [num]).
So I just fetch JSON from test_ajax.php inside setTimeout(). I don't think this is the right way to do it.
Can the server push the info to me instead (I think not, but maybe there is something I don't know)?
Why I think it's not right: when I look at the Chrome network log (F12 -> Network tab), I see a lot of requests (to test_ajax.php), but when I visit vk.com (a great example of AJAX) or facebook.com, I don't see any requests until something actually changes.
So, what is incorrect (or just bad) about my solution?
UPD: Sorry, vk.com does send a request to q%NUM%.queue.vk.com every 25 s, but until those 25 s pass, the last request's status is "Pending". When someone sends me a message, for example, it is displayed immediately. The request also has a parameter "wait" which equals 25. So this delay is handled on the server side... but how?
An AJAX counter can be done easily; just include the files below:
index.html
counter.php (the AJAX file)
the necessary images
a JS file (for the jQuery paging call)
download link: https://docs.google.com/open?id=0B5dn0M5-kgfDcE0tOVBPMkg2bHc
What you are looking for is called COMET (also sometimes called Reverse AJAX) techniques.
Doing what you want to do, e.g. regular polls, is one way of doing it.
A lot is actually happening on the server side; to avoid recreating new connections on every poll, some servlet containers like Jetty started to implement techniques like Continuations, which basically keep a two-way connection open.
In the Java world, with Servlet 3, you have asynchronous calls as part of the specs.
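On the client side, the long-polling flavour of COMET can be as simple as the sketch below (assuming a hypothetical /poll.php endpoint that holds the request open until there is news or a ~25 s timeout expires, like the vk.com example above):

function poll() {
    $.ajax({
        url: '/poll.php',              // hypothetical endpoint; hangs until data or timeout
        data: { wait: 25 },
        timeout: 30000,                // give up slightly after the server-side wait
        success: function (messages) {
            updateCounter(messages);   // hypothetical function that updates the UI
        },
        complete: function () {
            poll();                    // immediately reconnect for the next event
        }
    });
}
poll();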

state of XMLHttpRequest Object in jquery AJAX

In traditional JavaScript AJAX, we know that readyState can be:
0 - The request is not initialized
1 - The request has been set up
2 - The request has been sent
3 - The request is in process
4 - The request is complete.
When it comes to jQuery AJAX, we have:
complete property where we code what should happen after completion
success property where we code what should happen if the ajax request succeeds and
error property where we code what should happen if ajax request fails.
All of the above properties let us run code after the AJAX request completes. Where can I specify code to execute during processing (when readyState is 3) in jQuery AJAX?
My AJAX script takes a long time to execute, which means I will not reach the 'complete' stage quickly. To the user it looks like nothing is happening. I wanted to initiate another AJAX script during the processing stage that gets information from the server in the meantime and shows the user what has been done so far. Is this possible at all in JavaScript? I know there is no multi-threading in JavaScript.
I think I made myself clear, but please let me know if anything does not make sense.
I handle this by initiating the first long running request, returning to the user immediately and allowing the process to fork server side for the extended processing.
The initial return ajax call to the user sets them up to 'watch' that process via a flag against the object ( I store them against the object in the database, but you could for instance watch file sizes or other stuff )
Subsequent AJAX calls occur in a loop, each one scheduling the next call via setTimeout and reporting changes to that flag, so the progress of the long-running process is visible. When the long-running process completes, no further setTimeout() is scheduled and the overall results are shown.
If your process is not that intensive, a simple spinner would probably do the job and no work for your server process. I usually handle that having $.ajax flip the visibility of a 'spinner' icon that's preloaded on my pages in the same spot for all.
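A rough sketch of the polling loop described above (the endpoint and helper names are made up for illustration): the long task is started once, then a lightweight status request repeats via setTimeout until the server reports completion.

// Kick off the long-running job; the server forks and returns immediately
$.post('/start_job.php', { id: 123 }, function (job) {
    checkProgress(job.id);
});

function checkProgress(jobId) {
    $.getJSON('/job_status.php', { id: jobId }, function (status) {
        $('#progress').text(status.percent + '% done');
        if (!status.finished) {
            setTimeout(function () { checkProgress(jobId); }, 2000);  // poll again in 2 s
        } else {
            showResults(status);   // hypothetical function that renders the final results
        }
    });
}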
According to jQuery's Ajax documention, they do not expose the readystate change event:
No onreadystatechange mechanism is provided, however, since success,
error, complete and statusCode cover all conceivable requirements.
It would be possible to show a loading image after the initial AJAX request is kicked off (and before getting any "complete" or "success" events), and then start polling a different URL via AJAX which gives you the status of the first request, assuming your server can report the progress of the long process before it completes.

Show animation in a jsp page while waiting for server to respond

What would be the best way to display an animation while waiting for server-side processing of a JSP page to complete? Basically, the server-side request can take more than a minute to process, and until then I would like the user to have some way of seeing how his request is getting along. I need an animated GIF and a line stating that x% has been completed.
One of the methods I came across while surfing the net was to have an intermediate page that shows the animation while loading the actual page using JavaScript (location.href). So, I figured I'd use a couple of AJAX calls from the intermediate page to a servlet to get the feedback. The problem is that it works fine in IE 6/7 and Firefox 3, but the AJAX callbacks don't seem to get executed in Chrome and Opera (the location.href part seems to mess it up and the callbacks never get executed).
If this approach is flawed, how should I go about it? And if not, how can I fix this issue?
Thanks in advance
The simple way I've done this is to go to a JSP that displays a "X % completed" page (image, whatever) that reloads periodically. And when the request is complete, it redirects to an appropriate page to indicate completion. A lot simpler than AJAX, if not as fancy, and requires nothing that is browser-specific.
Try window.location = 'URL'. document.location = 'URL' also works, but I think it is deprecated.
Also, to be opinionated, I do think a non-reloading web page is much saucier than just being forwarded.
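Combining the two answers above, the intermediate page could poll a status servlet without reloading and forward when done (a sketch; /progress and /result.jsp are hypothetical):

function refreshProgress() {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/progress', true);            // hypothetical servlet returning "0".."100"
    xhr.onload = function () {
        var percent = parseInt(xhr.responseText, 10);
        document.getElementById('status').innerHTML = percent + '% completed';
        if (percent < 100) {
            setTimeout(refreshProgress, 3000);     // ask again in 3 seconds
        } else {
            window.location = '/result.jsp';       // forward to the finished page
        }
    };
    xhr.send();
}
refreshProgress();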
