Well, first I want to say I'm a bit new to the world of web development.
Anyway, I'm trying to find out whether it's possible to run two pieces of code in parallel using JavaScript.
What I really need is to call two methods on a remote server. For both, I pass a callback function that will be executed as soon as the data I want is ready. Since the server takes some time to answer, I'm trying to find a way to call both methods at the same time without having to wait for the first to finish before calling the second.
Do methods like setTimeout run concurrently? For example:
setTimeout(func1, 0);
setTimeout(func2, 0);
...
function func1()
{
    webMethod1(function() { alert("function 1 returned"); });
}
function func2()
{
    webMethod2(function() { alert("function 2 returned"); });
}
Edit:
I've just found this article that may become very useful in upcoming browser releases: JavaScript web workers
There is a single thread of execution in JavaScript in normal web browsers: your timer handlers will be called serially. Your approach using timers will work in the case you present.
There is a nice piece of documentation on timers by John Resig (author of the very popular jQuery JavaScript framework; if you are new to web development, I would suggest you look it up).
Now, if you are referring to HTML5-based browsers, at some point they should have threading support in the form of web workers.
Yes, that's exactly how web requests through AJAX work. There's no need for setTimeout with a delay of 0; you can simply make the AJAX requests one after the other, and each will be executed asynchronously, letting you pass a callback function to be invoked when its request completes.
The means of creating an AJAX request differs somewhat depending on which browser you're running. If you're going to build something that depends considerably upon AJAX and you want it to work across multiple browsers, you're best off with a library. Here's how it's done in jQuery, for instance:
$.ajax({
    url: '/webrequesturl',
    success: function (result) {
        // this will be called upon a successful request
    }
});

$.ajax({
    url: '/webrequest2url',
    success: function (result) {
        // this will be called upon a successful request
        // this may or may not be called before the one above,
        // depending on how long each request takes to finish
    }
});
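If you also want to run some code only after both requests have finished, jQuery (1.5+) lets you combine the two calls with $.when. A minimal sketch, reusing the same placeholder URLs as above (when multiple requests are passed, each result argument is an array of [data, statusText, jqXHR]):

$.when(
    $.ajax({ url: '/webrequesturl' }),
    $.ajax({ url: '/webrequest2url' })
).done(function (result1, result2) {
    // both requests have completed; result1[0] and result2[0]
    // hold the two response bodies
});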
Well, JavaScript is single-threaded; the two timers will run sequentially, one after the other, even if you don't notice it.
I would recommend you take a look at the following article; it really explains how timers and asynchronous events work, and it will also help you understand the single-threaded nature of JavaScript:
How JavaScript Timers Work
As an alternative, you could take a look at Web Workers, which are a way to run scripts in separate background threads, but they are supported only by modern browsers.
What you are looking for is asynchronous client-server communication (keyword: async). Asynchronous functions return straight away, but the provided callback will be executed after the specified condition is satisfied.
So, if the function that sends a request to the server is asynchronous, this would let you send both requests to the server without waiting for one to respond.
Using setTimeout may work, as it will schedule both request-sending functions to be called. However, browsers run only one thread of JavaScript at a time, so if the request-sending functions blocked while waiting for a reply, one scheduled function would run and block, and the other would wait until the first was done before starting.
It is advisable to use the async support of your server-communication library. jQuery, for instance, makes requests asynchronously by default.
It depends on the JavaScript engine.
So I have a web application which makes around 14-15 AJAX calls to some APIs. The problem is that the total time for all the AJAX calls is nearly 3x the time each individual API takes to respond when I type its URL into the browser.
I am making all the AJAX calls immediately inside the DOM-ready event.
The question is how I can speed up the process of making these 15 AJAX calls together, getting the responses as fast as possible, and manipulating the DOM accordingly.
A few points I have in mind:
All the AJAX calls should be async in nature. (Already doing it.)
Don't make all the AJAX calls at the same time. Introduce some sort of timeout, as firing all the AJAX calls at once may saturate the bandwidth and slow down the overall turnaround time.
Reduce the number of API calls by any means. (Already doing it.)
Manipulate the DOM as little as possible. (Already doing it.)
Set cache: true in the AJAX setup. I don't think that will really help, but I am still doing it wherever I am sure the content updates really slowly.
Any suggestions will be valuable! Thanks.
The way I am making the AJAX calls:
$(document).ready(function(){
    loadSentimentModule();
    loadCountModule();
    loadEntitiesModule();
    // Some more function calls.
});

function loadSentimentModule(){
    $.ajax({
        url: someurl,
        cache: true,
        dataType: "json",
        success: function(data){
            // Based on data, manipulate the DOM.
        }
    });
}
// Same kind of function definitions for all the functions.
You might not want to issue the AJAX calls directly, but instead queue them and let a manager control the queue; see here: Queue ajax requests using jQuery.queue()
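To sketch that idea without depending on jQuery.queue() specifically, here is a minimal hand-rolled queue that issues one request at a time; the URLs and function names are illustrative only:

function runQueue(tasks) {
    // 'tasks' is an array of functions that each accept a 'done' callback
    var i = 0;
    function next() {
        if (i >= tasks.length) return;   // queue drained
        var task = tasks[i++];
        task(next);                      // run the task; it calls next() when finished
    }
    next();
}

// Usage with jQuery ('complete' fires on success and failure alike):
runQueue([
    function (done) { $.ajax({ url: '/api/one', complete: done }); },
    function (done) { $.ajax({ url: '/api/two', complete: done }); }
]);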
I recommend you use the async.js module on the client. Maybe this is what you are looking for.
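For illustration, here is a small sketch of async.parallel (assuming async.js is loaded on the page and the URLs are placeholders): it starts all requests immediately and calls the final callback once every one has finished.

async.parallel([
    function (done) {
        $.getJSON('/api/sentiment', function (data) { done(null, data); });
    },
    function (done) {
        $.getJSON('/api/count', function (data) { done(null, data); });
    }
], function (err, results) {
    // results[0] and results[1] hold the two responses;
    // ideally manipulate the DOM once, here, rather than per request
});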
I have a script that pings a series of URLs with a GET method. I only want to ping each of them once and do not expect a response. My script works in Chrome and Safari, but Firefox won't complete the later requests.
Is there a way to trigger Firefox to make a series of calls (five, to be precise) once each, and not care if they fail? It seems that Firefox won't complete the series of requests when the first ones fail.
I'm working in JavaScript and jQuery, with a little jQuery.ajax() thrown in. I've searched to no avail and have reached the limit of my beginner's skill set. Any insight would be appreciated.
(If you're interested in the full scope, there's code at jquery-based standalone port knocker)
Thank you.
Update:
After further research, I believe the issue is that Firefox isn't handling the calls truly asynchronously. I have versions of code making the pings with img calls, iframe url calls, and ajax calls to work in Chrome and Safari, but in Firefox they're not behaving as I need them to.
Our server monitoring for the knock sequence should see requests come sequentially to ports 1, 2, 3, 4, 5 (as it does when using Chrome or Safari) but in Firefox, no matter which method I've tried, I see the first attempt ping port 1 twice, then port 2, and on subsequent attempts I only see it ping port 1. My status updates appear as expected, but the server isn't receiving the calls in the order it needs them. It seems that Firefox is retrying failed calls rather than executing each one once, in sequence, which is what I need it to do.
Here is a sample of my script using a simple jQuery.ajax call method. It works in Safari and Chrome, but doesn't achieve the desired result in Firefox. While all my code runs and I can see the status updates (generated with the jQuery append function), the requests aren't sent once each, sequentially, to my server.
<script src="http://code.jquery.com/jquery-latest.js"></script>
<script type="text/javascript">
$(document).ready(function(){
    $('button').click(function(){
        $('#knocks').append('<p>Knocking...</p>');
        setTimeout(function(){
            $.ajax({url: 'https://example.sample.com:1111'});
            $('#knocks').append("<p>Knock 1 of 5 complete...</p>");
        }, 500);
        setTimeout(function(){
            $.ajax({url: 'https://example.sample.com:2222'});
            $('#knocks').append("<p>Knock 2 of 5 complete...</p>");
        }, 3500);
        setTimeout(function(){
            $.ajax({url: 'https://example.sample.com:3333'});
            $('#knocks').append("<p>Knock 3 of 5 complete...</p>");
        }, 6500);
        setTimeout(function(){
            $.ajax({url: 'https://example.sample.com:4444'});
            $('#knocks').append("<p>Knock 4 of 5 complete...</p>");
        }, 9500);
        setTimeout(function(){
            $.ajax({url: 'https://example.sample.com:5555'});
            $('#knocks').append("<p>Knock 5 of 5 complete...</p>");
        }, 12000);
        setTimeout(function(){
            $('#knocks').append("<p>Knocking is complete... <br>Proceed to site: <a href='http://example-url.sample-url.com'>http://example-url.sample-url.com</a></p>");
        }, 13000);
    });
});
</script>
Seeing there's no real answer to your question and you'd probably want to move on, I thought I'd give you some suggestions as a starting point.
For your function calls to truly execute in sequential order (synchronous, in-order, blocking), you will have to make sure you issue each subsequent function call (AJAX request, in your case) only once the preceding request has completed (either succeeded or failed; in the latter case you might not want to proceed with the next in-order call at all, and instead issue a completely different response).
The way you're doing it now isn't synchronous; it is asynchronous and delayed (run "in the background" after a timeout). This can cause all kinds of problems when you expect your AJAX calls to execute synchronously (blocking) at the server end: browsers may re-issue failed or timed-out requests (for various reasons, depending on their feature set and how they handle failed requests and caching), or pre-emptively issue requests and cache the results when a pre-fetcher is enabled (or whatever Firefox calls it), then re-issue them again if the pre-fetch failed. I believe this is similar to what you observed in Firefox and might be the main culprit behind this unexpected behavior. Since you can't control which features end users enable or disable in their browsers, or what new features browsers implement in future versions, you can't expect your server calls to arrive in order merely by delaying them with setTimeout, even if they appear to do so in other browsers (probably because your server responds fast enough for them to appear that way).
In your code, the second call would only appear to wait for the first one for up to half a second, the third request for up to three and a half seconds, and so on. And even if setTimeout blocked execution (which it doesn't), which external request would it be waiting on: the first one, or the second? I think you see what I'm trying to say, and why your code doesn't work as expected.
Instead, you should either issue each subsequent AJAX call from the response handler of the previous one (which is the point of AJAX callbacks in the first place), or preferably create an external listener function that issues these calls according to the status and/or return values of the previous calls. If you need to handle failed requests as well and continue execution regardless, then the external listener (with a preset execution timeout) is the way to go, as you obviously can't depend on the response of a failed request.
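As a rough sketch of that idea (placeholder hosts, same as in your code), each knock is issued only from the complete handler of the previous one; complete fires on success and failure alike, so a failed ping still advances the sequence:

var knocks = [
    'https://example.sample.com:1111',
    'https://example.sample.com:2222',
    'https://example.sample.com:3333',
    'https://example.sample.com:4444',
    'https://example.sample.com:5555'
];

function knock(i) {
    if (i >= knocks.length) {
        $('#knocks').append('<p>Knocking is complete...</p>');
        return;
    }
    $.ajax({
        url: knocks[i],
        timeout: 2000,            // give up quickly; no reply is expected
        complete: function () {   // fires on success OR failure
            $('#knocks').append('<p>Knock ' + (i + 1) + ' of 5 complete...</p>');
            knock(i + 1);         // only now issue the next knock
        }
    });
}

knock(0);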
You see, browsers have no problem issuing multiple concurrent requests, and delaying them with setTimeout doesn't stop pre-fetchers from trying to cache their responses for later use, either. Nor does it issue the requests in a blocking manner, each one waiting for the previous one to finish, as you expected. Most browsers will happily use up to a certain number of concurrent connections (around 10 on client machines, many more on servers) in an effort to speed up downloading and page rendering, and some have even more advanced caching mechanisms in place for the same reason, Firefox being merely one of them.
I hope this clears things up a bit and you'll be able to rewrite your code to work as expected. As we have no knowledge of how your server-side code is supposed to work, you'll have to write it yourself. There are, however, plenty of threads on SE discussing similar techniques that you might decide to use, and you can always ask another question if you get stuck; we'll be glad to help.
Cheers!
Is it possible to do things asynchronously in JavaScript (AJAX aside)? For example, to iterate over multiple arrays at the same time. How is it done? A brief example would be nice. Searching for this was hard, due to all the AJAX pollution, which is not what I am looking for.
Thanks in advance.
Use Web Workers. But remember that they are a very new feature and not all browsers fully support them.
You could use setTimeout.
setTimeout(function () { iterateArray(array1); reportDone(1); }, 0);
setTimeout(function () { iterateArray(array2); reportDone(2); }, 0);
I'm not sure how concurrent it will be, but it is an asynchronous programming model.
As stated by Grumdrig you can write code like this:
setTimeout(function () { iterateArray(array1); reportDone(1); }, 0);
setTimeout(function () { iterateArray(array2); reportDone(2); }, 0);
But it will still not run concurrently. Here's a general idea of what happens after such timeouts are called:
Any code after the setTimeout calls will be run immediately, including returns to calling functions.
If there are other timers in queue that are at or past their delay or interval time, they will be executed one at a time.
While any timer is running, another might hit its interval/delay time, but it will not be run until the last one is finished.
Some browsers give priority to events fired from user interaction such as onclick and onmousemove, in which case the functions attached to those events will execute at the expense of timer accuracy.
This will continue until there is an opening (no previously called timers or event handlers requesting execution). Only then will the functions in the example code be run, again one at a time, with the first one likely but not certainly executing first. Also, I'm venturing a guess that some browsers might impose a minimum delay time, which would make any timers set with a delay of 0 milliseconds run even later than expected.
Obviously there is no performance advantage to running code like this. In every case it will make things take longer to complete. However in cases where a single task is taking so long it freezes the browser (and possibly trips "Script is taking too long" browser warnings), it can be helpful to break it up into smaller faster executing pieces that run sequentially after some delay time, thus giving the browser some time to breathe.
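For example, a common way to break a long array iteration into smaller pieces looks like the following sketch (processItem and chunkSize are whatever suits your task):

function processInChunks(array, processItem, chunkSize, onDone) {
    var index = 0;
    function doChunk() {
        var end = Math.min(index + chunkSize, array.length);
        for (; index < end; index++) {
            processItem(array[index]);
        }
        if (index < array.length) {
            setTimeout(doChunk, 0);   // yield to the browser, then continue
        } else if (onDone) {
            onDone();
        }
    }
    doChunk();
}

// e.g. processInChunks(bigArray, expensiveFn, 500, function () { alert('done'); });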
Web Workers have been mentioned, and if you are not concerned about IE compatibility then you can use them for true concurrency. However, there are some severe limitations imposed on their use for security reasons. For one, they cannot interact with the DOM in any way, meaning any changes to the page must still be made synchronously on the main thread. Also, all data passed to and from workers is serialized in transit, meaning true JavaScript objects cannot be used. That being said, for intensive data processing, Web Workers are probably a better solution than breaking a function up into multiple timer-delayed tasks.
One new development in this field is HTML5 Web Workers.
JavaScript is normally single-threaded; you cannot do several things at once. If your JavaScript code is too slow, you will need to offload the work. The new way is to use web workers, as others have noted. The old way is often to use AJAX and do the work on the server instead. (Either with web workers or with AJAX, the arrays would have to be serialized and the result deserialized.)
I have to agree with MooGoo; I also wonder why you would run through such a big array in one go.
There's an extension to JavaScript called StratifiedJS; it allows you to do multiple things at once as long as they're asynchronous. Also, web workers are an awkward "solution" that just makes things more complicated, and they don't work in IE.
In StratifiedJS you could just write:
waitfor {
// do something long lasting here...
}
and {
// do something else at the same time...
}
// and when you get here, both are done
Can anyone tell me how to activate two (or more) JavaScript AJAX functions in parallel?
This is not possible. JavaScript can only work in a single thread, and there is no way to actually have two functions running in parallel. You need to make one call and then the other. Their callbacks will be invoked (not necessarily in the same order as the invocations) when data has been returned or an error/timeout occurs. Only when one callback completes will the second one be allowed to run.
Also keep in mind that browsers restrict the number of active AJAX calls. So, if you try to make too many AJAX calls, one might have to wait for other calls to complete before it is even sent.
Search for Web Workers. They are a relatively new feature in modern browsers and may not be available in older ones.
Is this what you're looking for?
setTimeout(function () { JsFunction1(val); }, 0);
setTimeout(function () { JsFunction2(val); }, 0);
Use Web Workers to run tasks in parallel.
You can find a tutorial here: https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Using_web_workers
Also, this library, which takes advantage of web workers, came up pretty quickly on Google: https://parallel.js.org/
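As a minimal sketch of a dedicated worker (two files; the worker's code is shown in comments because it lives in its own file, and the file name is an assumption):

// main.js
var worker = new Worker('worker.js');
worker.onmessage = function (e) {
    console.log('Result from worker:', e.data);   // runs back on the main thread
};
worker.postMessage([1, 2, 3, 4]);   // data is copied (serialized), not shared

// worker.js -- runs on a separate thread; no DOM access here
// onmessage = function (e) {
//     var sum = 0;
//     for (var i = 0; i < e.data.length; i++) sum += e.data[i];
//     postMessage(sum);
// };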
Using several setInterval calls may make parallel-like execution possible, though everything may still run on a single core. The following code is an example of parallelizing a function func over an array of data datas. To run it, call Parallel(func, datas), where func is the name of your global function and datas is an array of data, each element being an input for func.
var i_array = new Array();
function Parallel(func, datas) {
    $(datas).each(function (i, v) {
        i_array[i] = setInterval(function () {
            clearInterval(i_array[i]);   // run only once, then stop this interval
            window[func](datas[i]);      // look up the function by name and call it
        }, 10);
    });
}
Here is a jsfiddle test. The integer timestamps show that the two AJAX calls were running in parallel.
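For instance, a hypothetical usage with two AJAX-issuing functions might look like this (pingUrl must be a global function, since it is looked up via window[func], and the URLs are placeholders):

function pingUrl(url) {
    $.get(url);   // fire-and-forget request
}

Parallel('pingUrl', ['/api/a', '/api/b']);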
Use window.open() to call a new page, and have that page call the first JS function. After window.open() returns, call the second function. You are not technically waiting for the first function to complete; you are only waiting for window.open() to execute, after which the second function runs.
JavaScript runs as a single thread. If the requests you want to make don't involve I/O, then it's just not possible. If there is an I/O operation involved, you can very well execute the two functions one after the other: by its very nature, JavaScript will start executing the next function while the first waits on I/O.
In languages that support threading, the same thing is usually achieved automatically during a thread's CPU idle periods.
A very simple question: suppose I have the following JS/jQuery code:
doSomething();
$.get(url, callback);
doSomethingElse();
I understand that right after the GET request is sent, doSomethingElse() starts executing. Now suppose that the server's reply arrives while doSomethingElse() is executing. What happens?
Does the callback run in a separate thread in parallel to doSomethingElse()?
Does the execution of doSomethingElse() pause until the callback runs and returns?
Does the callback only get called once doSomethingElse() has returned?
Thank you for any insight!
lara
No, JavaScript in web browsers is single-threaded, by design. That means that although the ajax call may start immediately (and be processed by another thread in the browser), your callback won't happen until the JavaScript interpreter is next idle. Things to do get queued up waiting for the interpreter to become idle and process them.
Edit: Answering your specific questions:
Does the callback run in a separate thread in parallel to doSomethingElse()?
No, the callback will run in the same logical thread as doSomethingElse. (It would be implementation-dependent whether that's the same actual underlying OS thread, but you have no way of knowing and you don't care; logically, it's the same thread.)
Does the execution of doSomethingElse() pause until the callback runs and returns?
By default, the get will be asynchronous, so no. doSomethingElse initiates the request, but then continues. (It's possible to do a synchronous get via the underlying XMLHttpRequest mechanism, but it's a very bad idea -- it tends to lock up the UI of the browser completely while the request is running, which is ugly -- and I don't know how you do it with jQuery.)
Does the callback only get called once doSomethingElse() has returned?
With an asynchronous get (the usual kind), you can be certain that doSomethingElse will finish before the callback gets called, yes. This is because the JavaScript interpreter will only do one thing at a time, and it doesn't switch to doing a new thing until it's done with the current one. So although doSomethingElse triggers the get (and the get may be processed by other, non-JavaScript threads in parallel to JavaScript), your callback won't happen until after the interpreter is done with doSomethingElse and anything that called it.
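A tiny demonstration of that ordering (hypothetical URL): no matter how fast the server responds, the output is always 'one', 'two', 'three', because the callback cannot interrupt code that is already running.

console.log('one');
$.get('/some/url', function () {
    console.log('three');   // queued; runs only after the current code finishes
});
console.log('two');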
I wouldn't be surprised if at some point we start getting multiple threads in browser-based JavaScript, but if and when we do, it'll have to be explicit, since we all happily assume one thread for the moment.
To all intents and purposes there are no threads in JS, therefore execution does not happen on a separate thread.
What the web APIs do provide is asynchronous callbacks, and that's what is happening here: get() returns immediately, and your callback function will be called once the load is complete and no other JS code is running.
No, there is only one thread of control for Javascript. The currently executing function will continue running until it completes, then the callback will be invoked once it is ready.
So to specifically answer your question, the callback only gets called once doSomethingElse() has returned. Assuming, of course, that the GET request is successful; if an error occurs then the callback will never be executed.
Here is a pretty good article that illustrates how this works.
http://www.javascriptkata.com/2007/06/12/ajax-javascript-and-threads-the-final-truth/
I honestly don't know the answer, but I wouldn't place any code in doSomethingElse() that depends on something that callback() does. I know that doSomethingElse() will always run first, but as far as timing/thread issues go, I'm not sure.