I'm trying to override the standard confirm() method in JavaScript (to give it a nicer UI and so on). I've read a hundred posts saying it "can't be done", but I don't want to give up until I've given it a fair shot. :)
So, the real problem is of course that the confirm() method must block all JavaScript execution until the user selects an option. So, what are the methods in JavaScript that have blocking behavior? I've been able to come up with five:
alert() - does not suit me, because it displays an unwanted UI of its own;
confirm() - the same problem as alert();
infinite loop - even modern browsers will eat CPU like crazy and display a "stop this script?" prompt after a few seconds;
XmlHttpRequest in synchronous mode - sort of, but it involves the server...
showModalDialog() - nice, but I need a specific design, plus there are some browser compatibility requirements...
The best approach I have so far is to create an <iframe> containing the prompt (which then supposedly gets its own JavaScript execution thread) and block with a synchronous XmlHttpRequest until the user has selected an option in the <iframe>. Unfortunately this involves passing the result back and forth through the server, and I'd like to make this 100% client-side. It also ties up a server thread while the dialog is open, and there may be browser-specific Ajax timeouts that apply.
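Roughly, here's a sketch of what I mean (the /confirm-ui page and the /wait-for-choice long-polling endpoint are hypothetical names for the server-side pieces):

// Sketch of the iframe + synchronous XHR idea. "/confirm-ui" and
// "/wait-for-choice" are hypothetical endpoints: the server holds the
// second request open until the iframe page posts the user's selection.
function blockingConfirm(message) {
    var frame = document.createElement("iframe");
    frame.src = "/confirm-ui?message=" + encodeURIComponent(message);
    document.body.appendChild(frame);

    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/wait-for-choice", false); // false = synchronous, blocks here
    xhr.send(null);

    document.body.removeChild(frame);
    return xhr.responseText === "ok";
}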
Can anyone think of any other Javascript methods that block execution which might be (ab)used to achieve the desired effect?
No, it can't be done, for a good reason. Arbitrary, custom-styled user interaction in the page is always asynchronous (event-based), and therefore does not work with any type of blocking behaviour (the event that would stop the infinite loop could only be processed after the infinite loop has finished).
All the blocking methods you mentioned do their user interaction in a different environment from the page - the alert/confirm/prompt popups are controlled by the browser, and showModalDialog loads a separate page - and that environment needs to be able to gain focus while the first one is frozen.
Creating a setup like this reliably is difficult enough. However, you could try almost any JavaScript functionality (that does not involve async callbacks), as all JS operations are synchronous by default. If you want to experiment further, I would suggest looking at the methods that deal with different DOM environments (window.open, document.write, cross-frame access via iframe.contentWindow) to see whether you can get any of them to reliably spawn a second environment that runs in parallel.
This is a duplicate question. It has been asked many times before, with dozens of answers, some of them rated very highly. Unfortunately, as far as I have been able to tell, every single one of those answers is a variant of "You don't, it's bad programming practice. Use setTimeout instead".
This is Not. An. Answer!
There are some use cases - rare, but they exist - where you might want the entire page's execution to halt for a second or two, and I find it very frustrating that nobody seems interested in answering the actual question (have a look at the comments here for some examples).
I am sure it's possible to halt JavaScript execution; for instance, if I use Firebug to insert a breakpoint, execution stops when it hits that point. So Firebug can do it. Is there some way a program can halt execution of the current thread until some timeout occurs?
Just some thoughts: How does Firebug do it? Is there some browser-specific method? Is it possible to trigger a stop without specifying a timeout to continue? Could I programmatically insert a breakpoint, or remove one? Could I get a closure representing the current thread to pass to setTimeout?
I don't have a specific use case in mind; I am just looking for advice from someone who knows the browser/JavaScript design better than me, as to how this can most effectively be done.
So far, I have come up with only one solution:
endtime = Date.now() + 1000;
while (Date.now() < endtime)
    $.ajax(window.location.origin, { async: false });
This appears to work. The problem is that it makes hundreds of excess requests. I could replace location.origin with something like mysite/sleep?delay=X and write a server-side script to provide the delay, which would cut it down to one request, but the whole thing still seems really hacky. There must be a better way to do this! How does the jQuery ajax function manage it? Or is there a busy-wait buried in it somewhere?
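(As far as I can tell from the jQuery source, async: false is just passed through to the native xhr.open() call, so the blocking happens inside the browser rather than in a script busy-wait. A stripped-down equivalent, using the hypothetical /sleep endpoint mentioned above:)

// Synchronous XMLHttpRequest without jQuery. The browser itself blocks
// on the network call; there is no busy-wait in script.
// "/sleep?delay=1000" is the hypothetical server-side delay script.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/sleep?delay=1000", false); // third argument false = synchronous
xhr.send(null);
// execution resumes here only once the server has responded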
The following do not answer the question and will be downvoted, just because I am sick of seeing pages of answers that completely ignore the question in their rush to rant on the evils of sleep:
Sleep is evil, and you should do anything it takes to avoid needing it.
Refactor your code so that you can use setTimeout to delay execution.
Busy-wait (because it doesn't stop execution for the duration of the sleep).
Refactor your code to use deferred/promise semantics.
You should never do this, it's a bad idea...
... because the browser has been, traditionally, single-threaded. Sleeping freezes the UI as well as the script.
However, now that we have web workers and the like, that's not the case. You probably don't need a sleep, but having a worker busy-wait won't freeze the UI. Depending on just how much you want to freeze a particular thread, I've seen people use:
endtime = Date.now() + 1000;
while (Date.now() < endtime) { /* spin */ }
or, curiously (this was in an older but corporate-sponsored analytics library):
endtime = new Date().getTime() + 1000;
while (new Date().getTime() < endtime) { /* spin */ }
which is probably slower. If you're running a busy wait, that doesn't necessarily matter, and allocating objects probably just burns memory and GC time.
Code using promises or timeouts tends to be more modular, but harder to read (especially when you first learn async techniques). That's not an excuse for not using it, as there are definite advantages, but maybe you need everything to stay synchronous for some reason.
If you have a debugger running and want some chunk of code to pause itself (very useful when you have a bunch of nested callbacks), you can use:
function foo() {
    doSomeStuff();
    debugger;
    doOtherStuff();
}
The browser should pause execution at the debugger statement. The debugger can almost always pause execution, because it is in control of the VM running the code; it can simply tell the VM to stop running, and that ought to happen. You can't get quite to that level from a script, but if you take source in as text (perhaps from a require.js plugin), you can modify it on the fly to include debugger statements, thus "programmatically inserting breakpoints". Bear in mind that they only take effect when the debugger is already open, though.
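A hypothetical sketch of that source-rewriting trick (runWithBreakpoint is a made-up name, and again this only pauses when the developer tools are already open):

// Prepend a debugger statement to source text before evaluating it.
function runWithBreakpoint(sourceText) {
    return eval("debugger;\n" + sourceText);
}

// e.g. runWithBreakpoint("console.log('paused before this ran');");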
To capture the state of a "thread" and persist it for later use, you may want to look into some of the more complicated functional programming concepts, particularly monads. These allow you to wrap a starting value in a chain of functions, which modify it as they go, but always in the same way. You could either keep simple state (in some object), or record and reproduce everything the "thread" may have done by wrapping functions in functions. There will be performance implications, but you can pick up the last function later and call it, and you should be able to reproduce everything the thread would have done.
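A minimal sketch of that idea (unit and bind are the conventional names; this is not any particular library):

// Each step wraps the previous one, so the whole "thread" can be
// replayed later by calling the final function.
function unit(value) {
    return function () { return value; };
}

function bind(step, fn) {
    return function () { return fn(step()); };
}

var thread = bind(bind(unit(1), function (x) { return x + 1; }),
                  function (x) { return x * 10; });

console.log(thread()); // 20 - the whole chain is reproduced on demand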
Those are all fairly complicated and specific-use solutions to avoid just deferring things idiomatically, but if you hypothetically need them, they could be useful.
No, it is not possible to implement a sleep in JavaScript in the traditional sense, as it has a single-threaded, event-based model. The act of sleeping the thread will lock up the browser it is running in, and the user will be presented with a message either telling them the browser has stopped responding (IE) or allowing them to abort the currently running script (Firefox).
I'm using the PerformanceTiming interface to measure page load time.
Several of my pages have a long "Browser Time" (i.e. loadEventEnd - responseEnd), and I think this could be because of the Ajax requests from the pages.
My question is: where do Ajax requests fit in the PerformanceTiming process model? Is it in the "Processing" block?
If so, what's the best way to measure the Ajax execution time?
We're currently fighting this issue.
For a couple of years, we've had Ajax requests that are fired from jQuery's "document ready" handler. Technically speaking, this shouldn't have extended loadEventEnd. However, with absolute certainty, they did push out the firing of loadEventEnd.
Now, we're digging to see what happened with our last release that moved those values outside of loadEventEnd.
To really measure those Ajax requests, you'll want to use IE 10 or - preferably - a Chrome version greater than 28. Both include the Resource Timing interface.
You can access metrics similar to the navigationTiming interface above for each resource loaded on a page.
From the JavaScript console in Chrome (Ctrl+Shift+J on Windows), enter:
window.performance.getEntries()
This will return all of the resource entries associated with your page. To get the Ajax requests, you'll want entries whose initiatorType is "xmlhttprequest". You can find all of those with the following:
var entries = window.performance.getEntries();
for (var i = 0; i < entries.length; i++) {
    if (entries[i].initiatorType === "xmlhttprequest") {
        console.log(entries[i].name, entries[i].duration);
    }
}
The Ajax requests fall after the end of the model.
See https://stackoverflow.com/a/16289733/1168884 for an example where an Ajax request runs and does not affect the properties on the performance object (which only reflects the loading of the page)
I guess the underlying issue is that whereas the page-loading events are well defined (for example, "the DOM is now complete and available") and are reflected in the model, the Ajax events are not (for example, there isn't really an event at which you can say "all Ajax on the page has now finished running").
I haven't tried it, but there is a project, Boomerang, which promises to allow measuring of dynamically loaded content - http://lognormal.github.io/boomerang/doc/use-cases.html
Is it possible to do things asynchronously in JavaScript (AJAX aside)? For example, to iterate over multiple arrays at the same time. How is it done? A brief example would be nice. Searching for this was hard, due to all the AJAX pollution, which is not what I am looking for.
Thanks in advance.
Use Web Workers. But remember that it is a very new feature and not all browsers support it yet.
You could use setTimeout.
setTimeout(function () { iterateArray(array1); reportDone(1); }, 0);
setTimeout(function () { iterateArray(array2); reportDone(2); }, 0);
I'm not sure how concurrent it will be, but it is an asynchronous programming model.
As stated by Grumdrig you can write code like this:
setTimeout(function () { iterateArray(array1); reportDone(1); }, 0);
setTimeout(function () { iterateArray(array2); reportDone(2); }, 0);
But it will still not run concurrently. Here's a general idea of what happens after such timeouts are called:
Any code after the setTimeout calls will be run immediately, including returns to calling functions.
If there are other timers in queue that are at or past their delay or interval time, they will be executed one at a time.
While any timer is running, another might hit its interval/delay time, but it will not be run until the last one is finished.
Some browsers give priority to events fired from user interaction such as onclick and onmousemove, in which case the functions attached to those events will execute at the expense of timer accuracy.
This will continue until there is an opening (no previously called timers or event handlers requesting execution). Only then will the functions in the example code be run, again one at a time, with the first one likely but not certainly executing first. Also, browsers impose a minimum delay time (modern ones clamp nested timeouts to 4 ms), which makes timers set with a delay of 0 milliseconds run even later than expected.
Obviously there is no performance advantage to running code like this; in every case it will make things take longer to complete. However, in cases where a single task takes so long that it freezes the browser (and possibly trips "script is taking too long" warnings), it can be helpful to break it up into smaller, faster-executing pieces that run sequentially after some delay, giving the browser some time to breathe.
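For example, a minimal sketch of that pattern (the helper name processInChunks is made up):

// Process a large array in slices, yielding to the browser between
// slices so the UI stays responsive.
function processInChunks(array, chunkSize, handleItem, done) {
    var index = 0;
    (function nextChunk() {
        var end = Math.min(index + chunkSize, array.length);
        for (; index < end; index++) {
            handleItem(array[index]);
        }
        if (index < array.length) {
            setTimeout(nextChunk, 0); // give the browser time to breathe
        } else if (done) {
            done();
        }
    })();
}

You would call it like processInChunks(bigArray, 500, doWork, onFinished), tuning the chunk size so each slice stays well under the browser's slow-script threshold.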
Web Workers have been mentioned, and if you are not concerned about IE compatibility then you can use them for true concurrency. However, there are some severe limitations imposed on their use for security reasons. For one, they cannot interact with the DOM in any way, meaning any changes to the page must still be made synchronously. Also, all data passed to and from workers is serialized in transit, meaning live JavaScript objects cannot be shared. That being said, for intensive data processing, Web Workers are probably a better solution than breaking a function up into multiple timer-delayed tasks.
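For illustration, a minimal worker sketch (the file name iterate-worker.js is hypothetical; note the data is copied, not shared):

// main page: offload the iteration to a worker thread.
var worker = new Worker("iterate-worker.js");
worker.onmessage = function (e) {
    console.log("worker result:", e.data);
};
worker.postMessage([1, 2, 3, 4]); // the array is serialized in transit

// iterate-worker.js: runs concurrently, with no DOM access.
self.onmessage = function (e) {
    var sum = 0;
    for (var i = 0; i < e.data.length; i++) {
        sum += e.data[i];
    }
    self.postMessage(sum);
};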
One new development in this field is HTML5 Web Workers.
JavaScript is normally single-threaded; you cannot do several things at once. If your JavaScript code is too slow, you will need to offload the work. The new way is to use Web Workers, as others have noted. The old way is often to use AJAX and do the work on the server instead. (Either with Web Workers or with AJAX, the arrays would have to be serialized and the result deserialized.)
I have to agree with MooGoo; I also wonder why you would run through such a big array in one go.
There's an extension to JavaScript called StratifiedJS that allows you to do multiple things at once as long as they're asynchronous. By contrast, Web Workers are an awkward "solution" that just makes things more complicated, and they don't work in IE.
In StratifiedJS you could just write:
waitfor {
// do something long lasting here...
}
and {
// do something else at the same time...
}
// and when you get here, both are done
I'm developing a SCORM-compliant LMS, and having some problems with Captivate-generated content.
Basically, the behavior is: if you go through a SCO (Captivate-generated content) with, for example, 15 slides and one question on each slide quickly, my LMS does not track all 15 questions, only the first 3 or 4. If you wait a long time at the end, or if you take the content slowly, it works fine.
After a lot of Google searching, debugging, and tracing, I finally found two main issues:
1) Captivate - SCORM API communication is asynchronous (the same as Flash - JavaScript communication in general). So, when the user goes through the content quickly, the function calls get more and more delayed, and at the end the user may be answering question 15 while the content is still sending question 4's information. I cannot change the Flash or JS-Flash interface, because it is provided by Captivate.
Is there a way to make this synchronous? I mean, to force the Flash to wait somehow?
2) The functions take longer each time they are called; for example, SetValue takes 7 milliseconds the first time and 200 the last time it is called.
To understand this problem, here is a little background:
Captivate contents (all contents really, but Captivate especially) call one specific function many times: the SetValue function, one of the SCORM API functions. This function takes two parameters (fieldName, value): the first is the name of the field to be set, and the second the new value. In my implementation, this function first validates the value using a regular expression, and then sets the value on an object.
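Roughly like this (a simplified sketch, not my actual code; the field names and regexes here are just examples):

// Simplified sketch of the SetValue implementation described above.
var scormData = {};
var validators = {
    "cmi.core.lesson_status": /^(passed|completed|failed|incomplete|browsed|not attempted)$/,
    "cmi.core.score.raw": /^\d{1,3}$/
};

function SetValue(fieldName, value) {
    var rule = validators[fieldName];
    if (rule && !rule.test(String(value))) {
        return "false"; // SCORM API calls return the strings "true"/"false"
    }
    scormData[fieldName] = value;
    return "true";
}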
OK, I could add a lot more info, but I don't know what is really important. I'm not hoping you'll fix my code without seeing it, but I'm out of ideas and need new opinions, ideas, directions... maybe somebody will ask the right question... help :)
Thanks
When publishing for SCORM, Captivate does not use synchronous communication methods.* Depending on the browser, Captivate uses either FSCommand or the old-school getURL method to communicate with the HTML file; the HTML file then uses JavaScript to relay the data to the LMS via the SCORM API.
The response (if any) is relayed from JavaScript to either FSCommand or a proxy SWF (for getURL), which is then monitored internally in Captivate via a callback function. This callback function uses timers, and that's probably where your problem lies.
If you're setting g_intAPIType to 0, you're forcing the browser to use FSCommand, which isn't supported in all browsers and operating systems. Setting g_intAPIType to 1 means you're forcing the browser to use getURL, which is cross-browser but has a few drawbacks (including lots of clicking sounds).
In both cases, the data is sent via an internal queue script, which uses the waitForResponse callback function.
The performance problems you're encountering are likely due to the queuing, and the asynchronous communication compounds the problem because of timers attached to waitForResponse. Changing g_intAPIType will probably only have a minor effect on your performance issues, though using getURL (g_intAPIType=1) may help improve consistency from browser to browser.
Regardless of the g_intAPIType settings, you cannot prevent the internal tracking mechanism from using the asynchronous waitForResponse function, so there is no way to stop Captivate from using timers when getting/setting data; over a period of time you will probably start to notice longer and longer delays like the ones you described, esp. if you're making a lot of calls to the LMS.
(* Small exception: I've been informed Captivate 4 and 5 use ExternalInterface if the project is built in AS3 and is published for SCORM 2004, but it appears the queue and waitForResponse timers are still used, basically treating ExternalInterface like the asynchronous methods listed above.)
Some Options:
You could change how you are doing the questions: instead of one per frame, put all the questions on one frame.
Otherwise, you will need to do some JavaScript magic in your SCORM player JavaScript. I would start by minifying the JS code with a tool like JSMin.
Then try to cache the JS files so they are only loaded once. I suspect the files are being requested over and over with each frame.
"There is a way to make this sync?? I mean, to force the flash wait some way?"
Apparently, the problem is this:
"Captivate is the only SCO that calls SCORM JavaScript functions asynchronously. Firefox is the only browser that does not force synchronous communications between the SCO and the supporting JavaScript. When a Captivate SCO, running on Firefox, submits a status update to one of the JS functions, Captivate does not wait for a success or fail response before submitting the next status update. Since Captivate is quite verbose in its communications and JavaScript is not multithreaded, quiz status submissions can stack up and overwrite each other. This can cause a loss of data - especially for longer quizzes. [...]
If you'd like to see the asynchronous problem with any other LMS, take a long Captivate quiz using Firefox and answer the questions very quickly. Some of the questions near the end will get dropped.. " (interzoic.com forum)
And maybe a solution:
"The slow issue is resolved when I force the g_intAPIType to 0 (in the .htm file), so it forces Captivate to communicate as if it were running in IE."
In Captivate, while publishing a SCORM package, you will see the option "Send tracking data at the end".
Use this option; it should resolve your problem.
I have seen this link: Implementing Mutual Exclusion in JavaScript.
On the other hand, I have read that there are no threads in JavaScript, but what exactly does that mean?
When events occur, where in the code can they interrupt?
And if there are no threads in JS, do I need to use mutexes in JS or not?
Specifically, I am wondering about the effects of using functions called by setTimeout() and XmlHttpRequest's onreadystatechange on globally accessible variables.
JavaScript is defined as a reentrant language, which means there is no threading exposed to the user; there may be threads in the implementation. Functions like setTimeout() and asynchronous callbacks have to wait for the script engine to go idle before they can run.
That means that everything that happens in an event must be finished before the next event will be processed.
That being said, you may need a mutex if your code does something where it expects a value not to change between when the asynchronous event was fired and when the callback was called.
For example, suppose you have a data structure where clicking one button sends an XmlHttpRequest whose callback changes that data structure destructively, and another button changes the same data structure directly. Between when the request was fired and when the callback executes, the user could have clicked the second button and updated the data structure, and the callback could then overwrite and lose that change.
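A hypothetical sketch of that scenario:

// Shared state touched by both buttons.
var list = ["a", "b"];

// Button 1: destructive update via an XHR callback.
function onButtonOne() {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/fresh-list", true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) {
            // Clobbers anything that happened since the request was fired.
            list = JSON.parse(xhr.responseText);
        }
    };
    xhr.send(null);
}

// Button 2: direct update - lost if it runs between the request
// firing and the callback executing.
function onButtonTwo() {
    list.push("c");
}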
While you could create a race condition like that, it's very easy to prevent in your code, since each function runs atomically. In fact, it would take a lot of work and some odd coding patterns to create such a race condition.
The answers to this question are a bit outdated, though they were correct at the time they were given. They are still correct if you are looking at a client-side JavaScript application that does NOT use Web Workers.
Articles on web-workers:
multithreading in javascript using webworkers
Mozilla on webworkers
This clearly shows that JavaScript, via Web Workers, has multithreading capabilities. As to the question "are mutexes needed in JavaScript?", I am unsure. But this Stack Overflow post seems relevant:
Mutual Exclusion for N Asynchronous Threads
Yes, mutexes can be required in Javascript when accessing resources that are shared between tabs/windows, like localStorage.
For example, if a user has two tabs open, simple code like the following is unsafe:
function appendToList(item) {
var list = localStorage["myKey"];
if (list) {
list += "," + item;
}
else {
list = item;
}
localStorage["myKey"] = list;
}
Between the time that the localStorage item is 'got' and 'set', another tab could have modified the value. It's generally unlikely, but possible - you'd need to judge for yourself the likelihood and risk associated with any contention in your particular circumstances.
See the following articles for more detail:
Wait, Don't Touch That: Mutual Exclusion Locks & JavaScript - Medium Engineering
JavaScript concurrency and locking the HTML5 localStorage - Benjamin Dumke-von der Eh, Stackoverflow
As #william points out,
you may need a mutex if your code does something where it expects a
value not to change between when the asynchronous event was fired and
when the callback was called.
This can be generalised further - if your code does something where it expects exclusive control of a resource until an asynchronous request resolves, you may need a mutex.
A simple example is where you have a button that fires an Ajax call to create a record in the back end. You might need a bit of code to protect against trigger-happy users clicking away and thereby creating multiple records. There are a number of approaches to this problem (e.g. disable the button, re-enable on Ajax success). You could also use a simple lock:
var save_lock = false;
$('#save_button').click(function () {
    if (!save_lock) {
        // lock
        save_lock = true;
        $.ajax({
            success: function () {
                // unlock
                save_lock = false;
            }
        });
    }
});
I'm not sure if that's the best approach, and I would be interested to see how others handle mutual exclusion in JavaScript, but as far as I'm aware that's a simple mutex, and it is handy.
JavaScript is single-threaded... though Chrome may be a new beast (I think it is also single-threaded, but each tab may have its own JavaScript thread... I haven't looked into it in detail, so don't quote me on that).
However, one thing you DO need to worry about is how your JavaScript will handle multiple Ajax requests coming back in a different order than you sent them. All you really need to do is make sure your Ajax callbacks are written so that they won't step on each other's feet if the results arrive out of order.
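One common guard (a sketch; renderResults is a placeholder for your own display code) is to tag each request with a sequence number and drop stale responses:

// Ignore responses that arrive after a newer request was sent.
var latestRequest = 0;
function search(query) {
    var requestId = ++latestRequest;
    $.get("/search", { q: query }, function (data) {
        if (requestId !== latestRequest) return; // stale response, drop it
        renderResults(data); // placeholder for your display code
    });
}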
This goes for timeouts too...
When JavaScript grows multithreading, then maybe worry about mutexes and the like....
JavaScript, the language, can be as multithreaded as you want, but browser embeddings of the JavaScript engine only run one callback (onload, onfocus, <script>, etc.) at a time (per tab, presumably). William's suggestion of using a mutex for changes between registering and receiving a callback should not be taken too literally because of this, as you wouldn't want to block in the intervening callback, since the callback that would unlock it would itself be blocked behind the current callback! (Wow, English sucks for talking about threading.) In this case, you probably want to do something along the lines of re-dispatching the current event if a flag is set, either literally or with the likes of setTimeout().
If you are using a different embedding of JS, one that executes multiple threads at once, things can get a bit more dicey, but thanks to how easily JS can use callbacks and how it locks objects on property access, explicit locking is not nearly as necessary. However, I would be surprised if an embedding designed for general code (e.g. game scripting) that used multithreading didn't also provide some explicit locking primitives.
Sorry for the wall of text!
Events are signaled, but JavaScript execution is still single-threaded.
My understanding is that when an event is signaled, the engine runs the event handler as soon as the code currently executing has finished. After the handler finishes, script execution resumes. If the event handler changed some shared variables, the resumed code will see these changes appearing "out of the blue".
If you want to "protect" shared data, simple boolean flag should be sufficient.