Using jQuery's AJAX function, I am doing a synchronous call to a PHP script (it runs a shell command to convert a video).
I am trying to use "beforeSend" to run a different asynchronous AJAX call that will read a text file every second (to find out the conversion progress of the video) but it seems that despite the 2nd call being async, it doesn't run asynchronously, instead only being called after the first sync call has finished.
Is there a way to have the async task carry on, as it should, when being ran from the firsts "beforeSend" option?
I understand a sync task locks up operations, but surely this way of using "beforeSend" should work?
beforeSend is for modifying the xhr object (http://api.jquery.com/jquery.ajax/), not for this kind of thing.
You could try to run a self-calling timeout function in beforeSend() and handle the conversion-progress logic there.
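A rough sketch of that idea (the progress.txt file, convert.php endpoint and #progress element are invented for illustration); note that the poll can only fire while the main thread is free, so the conversion request itself needs to stay asynchronous:

function pollProgress() {
    // hypothetical text file the PHP script writes its progress to
    $.get('progress.txt', function (text) {
        $('#progress').text(text);
        if (text.indexOf('done') === -1) {
            setTimeout(pollProgress, 1000); // poll again in one second
        }
    });
}

$.ajax({
    url: 'convert.php',        // hypothetical conversion script
    beforeSend: pollProgress,  // kick off the self-calling poll
    success: function () {
        // conversion finished
    }
});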
Related
I have an Angular event like this:
$rootScope.$broadcast("postData");
doSomething();
however, doSomething() must wait for postData to complete before execution.
I would normally do something like:
$rootScope.$broadcast("postData").then(function(){
doSomething();
});
But apparently this isn't a thing in Angular... Any ideas?
I would like to point out that the previous solutions cannot be used when we don't have a handle on the async call to attach a callback, a promise, or an event to. The async call may be a library function, for example setTimeout, and we simply can't use the previous solutions to fix the flow.
Here's my solution:
Put doSomething(); in a setTimeout with the time interval set to 0:
$rootScope.$broadcast("postData");
setTimeout(function () {
    doSomething();
}, 0);
As simple as that!
setTimeout makes doSomething() asynchronous as well, and that makes both asynchronous operations happen one after the other. How? The explanation follows, but first note that doSomething() is inside a setTimeout with an interval of 0 ms. One might expect doSomething() to be executed right away (after 0 ms; actually the default minimum interval in JavaScript is 4 ms, so 0 ms becomes 4 ms), before the postData event is broadcast and serviced.
The answer is no!
setTimeout doesn't guarantee that the callback function passed into it will be executed exactly after the specified interval. The specified interval is just the minimum time after which the callback can be executed. setTimeout is an asynchronous call; if there are other async operations already waiting in the pipeline, JavaScript runs them first.
To understand how all this happens you need to understand what the event loop in JavaScript is.
The JavaScript runtime is single threaded; it has just one call stack, meaning it runs the code sequentially as it is written. So how on earth does it implement asynchronicity?
This is what happens under the hood when the JavaScript runtime encounters an asynchronous operation (an API call, an HTTP call, setTimeout, an event broadcast, etc.). Note that these functions are not provided by the native JavaScript engine (for example Chrome's V8); they are provided by the browser (known as Web APIs). They are essentially threads you can call into, and they fork off an independent path of execution, separate from the JavaScript runtime's execution flow. That is how concurrency is actually achieved.
But the JavaScript runtime is still single threaded, so how do these Web APIs interrupt the runtime flow and deliver their results when they are complete? They can't simply interrupt the JavaScript runtime whenever they finish and hand it their results; there must be some mechanism.
So JavaScript just makes the call to these Web APIs and does not wait for the output. It simply goes on and executes the code that follows the call, and that is how doSomething() in the problem gets executed before the postData event is listened to and served.
Meanwhile the forked thread processes the HTTP call, the setTimeout, the event, or whatever else the async call was made for. When it is done, the callback is pushed onto an event queue (task queue); note that multiple callbacks can be pushed onto this queue. But they are not run right away.
The JavaScript runtime waits for the call stack to become empty first. When there is nothing left for the runtime to execute, the async calls' callback functions are popped off the task queue one by one and executed.
So in essence, if we just make doSomething() async, it will be executed after the first async operation has completed. That is what I did: the setTimeout callback gets pushed onto the event queue/task queue, the JavaScript call stack empties, the callback for the postData event broadcast gets served, and then doSomething() gets its chance to execute.
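As a standalone illustration of that ordering (not the original code), a zero-delay callback only runs once the synchronous code has finished and the call stack is empty:

console.log('start');

setTimeout(function () {
    console.log('setTimeout callback'); // runs last, once the stack is empty
}, 0);

console.log('end');

// logs: start, end, setTimeout callback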
You could $broadcast the event, listen for it in your other controller with $on, and $emit another event on completion, and listen for it in your original controller so you know when it is finished.
I would not recommend this approach. Instead use a service.
Emit and broadcast are coupling your mechanisms for communication to the view because the $scope is fundamentally a fabric for data-binding.
The services approach is far more maintainable and can communicate between services in addition to controllers.
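As a rough sketch of the service approach (the module, service and endpoint names here are invented), the work is exposed as a promise that any controller can chain on, instead of broadcasting on the scope:

var app = angular.module('myApp', []);

// hypothetical shared service wrapping the post
app.factory('dataService', function ($http) {
    return {
        postData: function (payload) {
            return $http.post('/api/data', payload); // $http already returns a promise
        }
    };
});

app.controller('MainCtrl', function ($scope, dataService) {
    dataService.postData({ foo: 'bar' }).then(function () {
        doSomething(); // runs only after the post has completed
    });
});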
I'm assuming the broadcast of 'postData' marks the end of a function.
If you use the $q angular service this can be accomplished easily by creating asynchronous functions.
function postData() {
    var deferred = $q.defer();

    // Do your asynchronous work here that postData does.
    // When the asynchronous work is done you can just resolve the deferred,
    // or you can return data with resolve, passing the data you want
    // to return as a param of resolve().
    deferred.resolve();

    // Return the promise
    return deferred.promise;
}
When you call postData now, you can use the then method to run doSomething() after postData() is done.
postData().then(function (data) {
    doSomething();
}, function (err) {
    // if your asynchronous function calls defer.reject() instead of
    // defer.resolve(), you can catch the error here
});
Here's the Angular documentation for $q.
Here's a plunk to show you a simple example.
This isn't how events work; you can't wait for events to complete.
Why not fire 'postData', let the consumers of this event do whatever they do, and then wait for another event and execute 'doSomething' once you receive it?
This way, once the consumer of 'postData' has finished processing the event, it can fire another event, which you consume and use to execute your 'doSomething'.
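A minimal sketch of that event-chaining idea (the 'postDataFinished' event name is made up):

// original controller: listen for the completion event first, then broadcast
$scope.$on('postDataFinished', function () {
    doSomething(); // runs only once the consumer signals completion
});
$rootScope.$broadcast('postData');

// consumer controller
$scope.$on('postData', function () {
    // ... do the (possibly asynchronous) work for postData ...
    $rootScope.$broadcast('postDataFinished'); // signal completion back
});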
I’ve been reading about callback functions here, and learned that JavaScript is a single-thread synchronous language.
This means that if you want to collect data from a database then you’d have to wait for the routine to finish before any more code was executed. Is this true? What would happen if the user pressed a button to call a different function in the same script file?
To make it asynchronous you can use callbacks. Asynchronous here would mean that a section of code in the callback would ‘wait’ for an event before being called but a new thread is not created.
What is it about being an Object that makes JavaScript callbacks asynchronous?
Is it the same as waiting for an event?
It is true that with JavaScript, if you are going to call into a database, generally you must wait for the database to respond (i.e., a round trip to Pluto) before your code will continue to execute. This is called a 'blocking' call.
What a callback allows you to do is make a blocking call, but as you do so also say, "Execute this code when the blocking call concludes, but don't wait around for that to happen." Thus, your program continues execution. When the blocking call completes, the code you specify in the callback (which has not been run yet) will then execute. This may be almost immediately or some time later.
With Javascript, the rest of your code will complete execution, then the first callback to be triggered by a blocking call finishing will be executed, and so on until all callbacks are executed. At that point the thread will be shut down.
Note that only the callback code is 'waiting for an event'.
Thus, the execution order looks something like this:
Execute some code.
Set up callback code.
Execute blocking call.
Execute remainder of code.
Wait for blocking call to return.
Execute callback code.
Stop thread process.
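Putting those steps into a small runnable illustration (the 'database call' here is just a hypothetical async function simulated with setTimeout):

function queryDatabase(callback) {
    // stand-in for the real database round trip
    setTimeout(function () {
        callback('rows from the database');
    }, 100);
}

console.log('1. execute some code');

queryDatabase(function (result) {             // 2. set up callback code, 3. start the call
    console.log('5. execute callback code:', result);
});

console.log('4. execute remainder of code');  // then wait for the call to return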
I know that the synchronous ajax calls on the main thread are deprecated, but I still wonder why.
How do you achieve this with asynchronous module loading: get('moduleDependency').foo(); ?
I would like to use this kind of synchronous call at least in development to speed up the overall development cycle. In production the modules are already concatenated into one file and never touch the synchronous loading function at all.
My synchronous module loader (~80 LOC) resolves dependencies and more. I rewrote it for asynchronous loading, and it's working fine... but I'd have to give up using code like: get('moduleDependency').foo();
And that's really a mess!
How do you get this kind of call working with asynchronous loading? Or do I simply have to use asynchronous loading together with a while(true) loop on the main thread in the future, until they ban while loops on the main thread as well?
As long as the synchronous call hasn't finished or a timeout hasn't been reached, there is no way for the user to interact with the page. So it can hang, and in the worst case the user has to restart the browser. Asynchronous programming and scripting is based on callbacks: you just have to bind a method to the success handler of the AJAX request. You can use
success: function (result) {
    // do something
}
or
success: myfunction
[...]
function myfunction(result) {
    // do something
}
Once the asynchronous code has finished, this method will be called. So put everything that works with the data from the AJAX request into this method.
I was just reading another question about jQuery's synchronous ajax call, and I got to wondering:
What circumstances make a synchronous version of an ajax call beneficial/necessary?
Ideally I'd like an example, and why synchronous is better than standard ajax.
The only reasonable example I can think of (that can't be worked around another way) is making a call in window.onbeforeunload, where you need it to be synchronous, or the page will move on and the request will never complete.
In this specific case using standard/asynchronous behavior, you're all but assured the request will die too early to have any impact, or ever contact the server.
I'm not saying I'm in favor of doing this, quite the opposite (as it negatively impacts the user's browsing speed). But...there's not much option here.
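For illustration only, a minimal sketch of that onbeforeunload case using jQuery's async: false (the /log-exit endpoint is invented):

window.onbeforeunload = function () {
    // synchronous so the request completes before the page is torn down
    $.ajax({
        url: '/log-exit',
        type: 'POST',
        async: false,
        data: { leftAt: new Date().getTime() }
    });
};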
In sum, as #Brandon says, please do not use synchronous requests: they are a cheap/easy/quick way to avoid writing a callback. In addition, modern browsers show warnings when synchronous requests are made, and we do not like that. Make your world asynchronous.
Synchronous AJAX is often used to retrieve a value from the server that is required before client-side processing can continue. In such a case, the AJAX call will block until it returns with the desired value. Example:
A JavaScript function needs to compute the salary for an employee:
Step 1: get the employee ID from the form
Step 2: make a sync server call passing the employee ID to get their salary per hour
Step 3: multiply the salary rate by the number of working hours
As you can see, the total salary cannot be computed until the server call has finished, so this would be a sync call. With jQuery you could instead handle onSuccess and compute the salary asynchronously, but processing would continue in the meantime: if you have a message box that displays the salary straight after the call, it would appear empty...
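A hedged sketch of both variants (the /salary-rate endpoint and #empId form field are invented): the synchronous version blocks until the rate comes back, while the asynchronous version moves the computation into the success handler:

// synchronous: step 2 blocks, so step 3 can safely use the rate
function computeSalarySync(hours) {
    var rate;
    $.ajax({
        url: '/salary-rate',
        data: { empId: $('#empId').val() }, // step 1: employee id from the form
        async: false,                       // step 2: blocking server call
        success: function (r) { rate = r; }
    });
    return rate * hours;                    // step 3: multiply rate by hours
}

// asynchronous: the multiplication only happens inside the callback
function computeSalaryAsync(hours, done) {
    $.get('/salary-rate', { empId: $('#empId').val() }, function (rate) {
        done(rate * hours);
    });
}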
I would venture a guess that it would be useful in a scenario where you want to perform some AJAX calls but one call relies on the results of another. If you perform them synchronously you can wait for the independent call to finish before the dependent call fires.
Well, first I want to say I'm a bit new in the world of Internet dev.
Anyway, I'm trying to find out whether it's possible to run two pieces of code in parallel using JavaScript.
What I really need is to call two methods that live on a remote server. For both, I pass a callback function that will be executed as soon as the data I want is ready. As the server running these functions takes a while to answer, I'm trying to find a way to call both methods at the same time without having to wait for the first to finish before calling the second.
Do methods like setTimeout run concurrently? For example:
setTimeout(func1, 0);
setTimeout(func2, 0);
...
function func1() {
    webMethod1(function () { alert("function 1 returned"); });
}

function func2() {
    webMethod2(function () { alert("function 2 returned"); });
}
Edited
I've just found this article that may be very cool for the release of upcoming browsers: JavaScript web workers
There is a single thread of execution in JavaScript in normal web browsers: your timer handlers will be called serially. Your approach using timers will work in the case you present.
There is a nice piece of documentation on timers by John Resig (author of the very popular jQuery javascript framework - if you are new to Web development, I would suggest you look it up).
Now, if you are referring to HTML5 based browsers, at some point, they should have threading support.
Yes, that's exactly how web requests through AJAX work. There's no need for setTimeout with 0; you can just call them one by one, make an AJAX request for each, and they'll be executed asynchronously, allowing you to pass a callback function to be invoked when each request completes.
The means of creating an AJAX request differs some depending on what browser you're running. If you're going to build something that depends considerably upon AJAX, and you want it to work across multiple browsers, you're best off with a library. Here's how it's done in jQuery, for instance:
$.ajax({
    url: '/webrequesturl',
    success: function (result) {
        // this will be called upon a successful request
    }
});

$.ajax({
    url: '/webrequest2url',
    success: function (result) {
        // this will be called upon a successful request
        // this may or may not be called before the above one, depending on
        // how long it takes for the requests to finish
    }
});
Well, JavaScript is single-threaded; the two timers will run sequentially, one after the other, even if you don't notice it.
I recommend you take a look at the following article; it really explains how timers and asynchronous events work, and it will also help you understand the single-threaded nature of JavaScript:
How JavaScript Timers Work
And as an alternative you could take a look at Web Workers, a way to run scripts in separate background threads, but they are only supported by modern browsers.
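For reference, a minimal Web Worker sketch (the worker.js file name is made up):

// main.js: spawn a background thread and exchange messages with it
var worker = new Worker('worker.js');
worker.onmessage = function (e) {
    console.log('result from worker:', e.data);
};
worker.postMessage({ n: 42 });

// worker.js: runs off the main thread
onmessage = function (e) {
    postMessage(e.data.n * 2); // send the result back
};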
What you are looking for is asynchronous client-server communication (keyword: async). Asynchronous functions return straight away, but the provided callback will be executed after the specified condition is satisfied.
So, if the function that sends a request to the server is asynchronous, this would let you send both requests to the server without waiting for one to respond.
Using setTimeout may work, as this will schedule both request-sending functions to be called. However, some browsers only run one thread of Javascript at a time, so the result would be that one of the scheduled functions would run and block (waiting for a reply) and the other scheduled function would wait until the first was done to start running.
It is advisable to use async support from your server communication library. For instance jQuery uses async by default.
It depends on the JavaScript engine.