I have a very basic promise-based self-calling function that:
takes a collection of divs with a certain class
checks whether they have just been moved left or right
based on the result, decides how to move them (transform: translate)
with classList.add() / classList.remove()
and on transitionend calls itself
Here is the function:
function transitionTest() {
    console.log('called --- transitionTest()');
    var dummies = document.getElementsByClassName('dummy'),
        count = dummies.length;
    if (window.cache === 'right') {
        var transitionCollection = 0;
        for (var i = 0; i < dummies.length; i++) {
            dummies[i].classList.remove('right');
            dummies[i].addEventListener('transitionend', function () {
                transitionCollection++;
                if (transitionCollection === dummies.length) {
                    transitionTest();
                }
            });
        }
        window.cache = '';
    } else {
        var transitionCollection = 0;
        for (var i = 0; i < dummies.length; i++) {
            dummies[i].classList.add('right');
            dummies[i].addEventListener('transitionend', function () {
                transitionCollection++;
                if (transitionCollection === dummies.length) {
                    transitionTest();
                }
            });
        }
        window.cache = 'right';
    }
}
And here is a working fiddle.
So, what is wrong?
Nothing, if you are accessing it in a modern browser other than the latest versions of Chrome on Windows.
Nothing, if you are accessing it in the latest versions of Chrome on Windows but refrain from causing any mouse events such as mouseenter/mouseleave, click, or even a window focus event (i.e. if you sit still).
If you do, the infinite left-right movement of the dummy div will occasionally break, under unclear circumstances.
What goes wrong:
The dummy div, which is moving left-right infinitely, on mouseenter, mouseleave or click will sometimes (the exact conditions are unclear):
go to the end CSS value without a transition and resume normal operation after a while
stop entirely and resume normal operation after a while
slow down (!? yeah, I wish I was kidding) and then stop / jump to the end CSS value
These errors occur in Chrome 45 (Win 7) and, less intensively, Chrome 42 (Win XP), which are the platforms I have been able to test so far. Just to note, the code above does not need to be cross-browser; I'm fully aware of the implications.
Related
What I am trying to achieve:
user clicks on an element
the screen shows the "calculation in progress" screen
the system performs time-consuming math calculations
the screen shows the result ("done")
Here's the stripped-down code:
<div id ="di" onclick="calc()">initial</div>
<script>
function calc()
{
var n,a=0;
document.getElementById('di').textContent="calculation in progress";
for(n=0;n<1000000000;n++) // Here's the time consuming calculation
{
a=a+n; // for clarity's sake, I removed a complicated math formula here
}
document.getElementById('di').textContent="done "+a;
}
</script>
When I run it and click on the div, it takes a while and then changes the text to "done", so the user never sees the "calculation in progress" message at all - this is my problem.
To force a screen repaint to display the message before the calculations start, other threads suggest modifying CSS, hiding and immediately unhiding the element, or using setTimeout, but nothing has worked.
This will be a program that draws complicated math objects (fractals), and I will use a canvas instead of a div, but I simplified the example above. Because of the future graphical interface, using alert() is not an option - the "calculation in progress" screen should turn to "done" immediately upon completion of the calculations.
Because modern browsers may delay redrawing for a better frame rate, versions with setTimeout may not work with too low a timeout.
If possible, use requestAnimationFrame. If that's not possible, then @Bálint's answer should work, but with a much bigger timeout (in my tests in Firefox it began to work with a timeout near 20-30ms). The actual timeout value is browser dependent (and probably system dependent too).
function very_long_func() {
    var el = document.getElementById('di');
    requestAnimationFrame(function () {
        // edit the DOM for the new frame
        el.textContent = 'calculation in progress';
        // The browser will wait for this function to end before starting the redraw,
        // so the actual calculation must run outside of it.
        setTimeout(function_with_actual_calculation, 1);
    });
}

function function_with_actual_calculation() {
    // ...your math here + updating textContent to "done" at the end.
}
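For completeness, here is a minimal sketch of function_with_actual_calculation, simply reusing the dummy loop from the question (the real fractal math would go in its place):

function function_with_actual_calculation() {
    var n, a = 0;
    for (n = 0; n < 1000000000; n++) {   // placeholder for the real calculation
        a = a + n;
    }
    document.getElementById('di').textContent = 'done ' + a;
}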
IMO an easy way to handle this is to have your computation performed in "small" chunks by a timer function, for example:
function calcFractal(x0, y0, x1, y1) {
    // ... compute the fractal for the tile (x0, y0)-(x1, y1) ...
}

var x = 0, y = 0;
function nextTile() {
    calcFractal(x, y, x + tw, y + th);
    x += tw;
    if (x >= width) {
        x = 0;
        y += th;
    }
    if (y < height) setTimeout(nextTile, 0);
}
nextTile();
This allows you to show progress (including for example a low resolution of the fractal, the percentage of the computation) and to allow interruption (with onclick events on a stop button for example).
If the tiles are not tiny, the overhead will be acceptable, while still keeping the page reasonably responsive to both repaints and user interaction.
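As a rough illustration of the progress idea mentioned above (the #progress element and the tw/th/width/height variables are assumed to be defined elsewhere), nextTile could update a percentage counter before scheduling the next chunk:

var tilesDone = 0;
var tilesTotal = Math.ceil(width / tw) * Math.ceil(height / th);

function nextTile() {
    calcFractal(x, y, x + tw, y + th);
    tilesDone++;
    document.getElementById('progress').textContent =
        Math.round(100 * tilesDone / tilesTotal) + '% done';
    x += tw;
    if (x >= width) {
        x = 0;
        y += th;
    }
    if (y < height) setTimeout(nextTile, 0);
}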
You need to either wait a millisecond or do the calculations in a Worker.
The first option is probably the easiest: instead of calling calc directly, create a new function
function caller() {
    // insert "calculation in progress" in the body
    setTimeout(calc, 1);
}
Then call caller.
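The Worker route takes a little more setup but keeps the heavy math off the UI thread entirely, so the "calculation in progress" text repaints immediately. A minimal sketch, assuming the calculation is moved into a separate file called calc-worker.js (the filename and message shape are just illustrative):

// main page
var worker = new Worker('calc-worker.js');
document.getElementById('di').textContent = 'calculation in progress';
worker.onmessage = function (e) {
    document.getElementById('di').textContent = 'done ' + e.data;
};
worker.postMessage(null); // kick off the calculation

// calc-worker.js
onmessage = function () {
    var n, a = 0;
    for (n = 0; n < 1000000000; n++) {
        a = a + n;
    }
    postMessage(a);
};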
Since JavaScript is sequential (not counting async abilities), why does it not seem to behave sequentially in this simplified example:
HTML:
<input type="button" value="Run" onclick="run()"/>
JS:
var btn = document.querySelector('input');
var run = function() {
    console.clear();
    console.log('Running...');
    var then = Date.now();
    btn.setAttribute('disabled', 'disabled');
    // Button doesn't actually get disabled here!!????
    var result = 0.0;
    for (var i = 0; i < 1000000; i++) {
        result = i * Math.random();
    }
    /*
     * This intentionally long-running worthless for-loop
     * runs for 600ms on my computer (just to exaggerate this issue),
     * meanwhile the button is still not disabled
     * (it actually has the active state on it still
     * from when I originally clicked it,
     * technically allowing the user to add other instances
     * of this function call to the single-threaded JavaScript stack).
     */
    btn.removeAttribute('disabled');
    /*
     * The button is enabled now,
     * but it wasn't disabled for 600ms (99.99%+) of the time!
     */
    console.log((Date.now() - then) + ' Milliseconds');
};
Finally, what would cause the disabled attribute not to take effect until after the for-loop has finished executing? It's visually verifiable by simply commenting out the removeAttribute line.
I should note that there is no need for a delayed callback, promise, or anything asynchronous; however, the only workaround I found was to wrap the for-loop and the remaining lines in a zero-delay setTimeout callback, which puts them in a new stack... but really? setTimeout for something that should work essentially line by line?
What's really going on here and why isn't the setAttribute happening before the for loop runs?
For efficiency reasons, the browser does not lay out and repaint every single change you make to the DOM right when the change is made. In many cases, DOM updates are collected into a batch and then applied all at once at some later time (like when the current thread of JS finishes).
This is done because if a piece of Javascript is making multiple changes to the DOM, it is very inefficient to relayout the document and then repaint each change as it occurs and much more efficient to wait until the Javascript finishes executing and then repaint all the changes at once.
This is a browser-specific optimization scheme so every browser makes their own implementation decisions on exactly when to repaint a given change and there are some events that can cause/force a repaint. As far as I know, this is not an ECMAScript-specified behavior, just a performance optimization that each browser implements.
There are some DOM properties that require a finished layout before the property is accurate. Accessing these properties via Javascript (even just reading them) will force the browser to do a layout of any pending DOM changes and will usually also cause a repaint. One such property is .offsetHeight and there are others (though all in this category have the same effect).
For example, you can probably cause a repaint by changing this:
btn.setAttribute('disabled', 'disabled');
to this:
btn.setAttribute('disabled', 'disabled');
// read the offsetHeight to force a relayout and hopefully a repaint
var x = btn.offsetHeight;
This Google search for "force browser repaint" contains quite a few articles on this topic if you want to read about it further.
In cases where the browser still won't repaint, the other workarounds are to hide, then show some element (this marks the layout as dirty) or to use setTimeout(fn, 1) and continue the rest of your code in the setTimeout callback - thus giving the browser a chance to "breathe" and do a repaint because it thinks your current thread of Javascript execution is done.
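The hide/show variant would look roughly like this (a sketch only; whether it actually forces a paint before the loop runs is still browser dependent):

btn.setAttribute('disabled', 'disabled');
btn.style.display = 'none';
btn.offsetHeight;        // reading a layout property forces a relayout while hidden
btn.style.display = '';  // show it again, hopefully triggering a repaint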
For example, you could implement the setTimeout workaround like this:
var btn = document.querySelector('input');
var run = function() {
    console.clear();
    console.log('Running...');
    var then = Date.now();
    btn.setAttribute('disabled', 'disabled');
    // allow a repaint here before the long-running task
    setTimeout(function() {
        var result = 0.0;
        for (var i = 0; i < 1000000; i++) {
            result = i * Math.random();
        }
        /*
         * The intentionally long-running worthless for-loop now executes
         * after the browser has had a chance to repaint, so the button
         * actually appears disabled while it runs.
         */
        btn.removeAttribute('disabled');
        // The button is re-enabled only once the work has finished.
        console.log((Date.now() - then) + ' Milliseconds');
    }, 0);
};
The browser doesn't render changes to the DOM until the function returns. - @Barmar
Per @Barmar's comments and a lot of additional reading on the subject, I'll include a summary referring to my example:
JavaScript is single threaded, so only one process at a time can occur
Rendering (repaint and reflow) is a separate, visual process that the browser performs, so it happens after the function finishes, to avoid the potentially heavy CPU/GPU work that would cause performance and visual problems if everything were rendered on the fly
Summarized another way is this quote from http://javascript.info/tutorial/events-and-timing-depth#javascript-execution-and-rendering
In most browsers, rendering and JavaScript use single event queue. It means that while JavaScript is running, no rendering occurs.
To explain it another way, I'll use the setTimeout "hack" I mentioned in my question:
Clicking the "run" button puts my function in the stack/queue of things for the browser to accomplish
Seeing the "disabled" attribute, the browser then adds a rendering process to the stack/queue of tasks.
If we instead add a setTimeout to the heavy part of the function, the setTimeout (by design) pulls it out of the current flow and adds it to the end of the stack/queue. This means the initial lines of code will run, then the rendering of the disabled attribute, then the long-running for-loop code; all in the order of the stack as it was queued up.
Additional resources and explanations concerning the above:
Event Loop
Painting
Reflow
I noticed that the mousewheel event fires multiple times in Mac OS X. This can be attributed to the inertia feature.
Is there a way to fix this behaviour?
(self-signed SSL, no worries please!)
https://sandbox.idev.ge/roomshotel/html5_v2/
I'm using scrollSections.js https://github.com/guins/jQuery.scrollSections
And it uses mousewheel jquery plugin: https://github.com/brandonaaron/jquery-mousewheel
I'm seeing a lot of people having the same issue: https://github.com/brandonaaron/jquery-mousewheel/issues/36
There are some solutions, but none of them works with the scrollSections plugin.
Any ideas how to disable this inertia feature from JS?
My attempted fix:
// Fix for OSX inertia problem, jumping sections issue.
if (isMac) {
    var fireEvent;
    var newDelta = deltaY;
    if (oldDelta != null) {
        //check to see if they differ directions
        if (oldDelta < 0 && newDelta > 0) {
            fireEvent = true;
        }
        //check to see if they differ directions
        if (oldDelta > 0 && newDelta < 0) {
            fireEvent = true;
        }
        //check to see if they are the same direction
        if (oldDelta > 0 && newDelta > 0) {
            //check to see if the new is higher
            if (oldDelta < newDelta) {
                fireEvent = true;
            } else {
                fireEvent = false;
            }
        }
        //check to see if they are the same direction
        if (oldDelta < 0 && newDelta < 0) {
            //check to see if the new is lower
            if (oldDelta > newDelta) {
                fireEvent = true;
            } else {
                fireEvent = false;
            }
        }
    } else {
        fireEvent = true;
    }
    oldDelta = newDelta;
} else {
    fireEvent = true;
}
You can see the fix implemented here: https://sandbox.idev.ge/roomshotel/html5_v2/ But it is hit-or-miss.
The latest solution with timeouts had one major drawback: the kinetic scrolling effect can last rather long (even 1s or so)... and disabling scrolling for 1-2 seconds wouldn't be the best decision.
Soooo, as promised, here's another approach.
Our goal is to provide one response for one user action, which in this case is scrolling.
What's 'one scrolling'? For the sake of solving this problem, let's say that 'one scrolling' is an event that lasts from the moment the page starts to move till the moment the movement ends.
The kinetic scrolling effect is achieved by moving the page many times (say, every 20ms) by a small distance. It means that our kinetic scroll consists of many, many little linear 'scrollings'.
Empirical testing has shown that these little 'scrollings' happen every 17-18ms in the middle of a kinetic scroll, and about every 80-90ms at the beginning and the end. Here's a simple test we can set up to see that:
var oldD;
var f = function () {
    var d = new Date().getTime();
    if (typeof oldD !== 'undefined')
        console.log(d - oldD);
    oldD = d;
};
window.onscroll = f;
Important! Every time this mini-scroll happens, a scroll event is triggered. So:
window.onscroll = function(){ console.log("i'm scrolling!"); };
will be fired 15 to 20+ times during one kinetic scroll. BTW, onscroll has really good browser support (see the compatibility table), so we can rely on it (except on touch devices; I'll cover this issue a bit later).
Some may say that redefining window.onscroll is not the best way to set event listeners. Yes, you're encouraged to use
$(window).on('scroll',function(){...});
or whatever you like, it's not the point of the problem (I personally use my self-written library).
So, with the help of the onscroll event we can reliably say whether a particular mini-movement of the page belongs to one long-lasting kinetic scroll, or whether it is a new one:
var prevTime = new Date().getTime();
var f = function () {
    var curTime = new Date().getTime();
    if (typeof prevTime !== 'undefined') {
        var timeDiff = curTime - prevTime;
        if (timeDiff > 200)
            console.log('New kinetic scroll has started!');
    }
    prevTime = curTime;
};
window.onscroll = f;
Instead of "console.log" you can call your desired callback function (or event handler) and you're done!
The function will be fired only once on every kinetic or simple scroll, which was our goal.
You may have noticed that I've used 200ms as a criteria of whether it's a new scroll or a part of the previous scroll. It's up to you to set it to greater values to be 999% sure you prevent any extra calls. However, please keep in mind that it's NOT what we have used in my previous answer. It's just a period of time between any two page movements (whether it's a new scroll or a little part of a kinetic scroll). To my mind, there's a very little chance that there will be a lag more than 200ms between steps in kinetic scroll (otherwise it will be not smooth at all).
As I've mentioned above, the onscroll event works differently on touch devices. It won't fire during every little step of the kinetic scroll, but it will fire when the movement of the page has finally ended. Moreover, there's the ontouchmove event... So, it's not a big deal. If necessary, I can provide a solution for touch devices too.
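Purely as an illustration (not from the original answer, and not tested on real devices), the same time-gap idea could be carried over to touch devices by watching the user's touches directly, for example with touchstart:

var lastTouchTime = 0;
window.addEventListener('touchstart', function () {
    var now = new Date().getTime();
    if (now - lastTouchTime > 200) {
        console.log('New touch-initiated scroll has started!');
    }
    lastTouchTime = now;
});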
P.S. I understand that I've written a bit too much, so I'd be happy to answer all your questions and provide further code if you need it.
The provided solution is supported in all browsers, is very lightweight and, what's more important, is suitable not only for Macs but for every device that might implement kinetic scrolling, so I think it's really the way to go.
You know, I think it's a better idea to use timeouts in this case. Why not write something like this:
// Let's say it's a global context or whatever...:
var fireEvent = true;
var newDelta, oldDelta, eventTimeout;
newDelta = oldDelta = eventTimeout = null;
// ... and the function below fires onmousewheel or anything similar:
function someFunc(){
    if (!fireEvent) return; // if fireEvent is not allowed => stop execution here ('return' stops execution of the function), else execute the code below:
    newDelta = deltaY;
    if (oldDelta != null && oldDelta * newDelta > 0) { // (1.1) if it's not the first event and the directions are the same => prevent possible duplicates for the next 500ms:
        fireEvent = false;
        clearTimeout(eventTimeout); // clear previous timeouts. Important!
        eventTimeout = setTimeout(function(){ fireEvent = true; }, 500);
    }
    oldDelta = newDelta;
    someEventCallback(); // (1.2) fire further functions...
}
So, any mousewheel event fired within half a second of any previous mousewheel event will be ignored if it is made in the same direction as the previous one (see the condition at 1.1). That will solve the problem, and there's no way a user would notice it. The delay amount may be changed to better meet your needs.
The solution is pure JS. You're welcome to ask any questions about integrating it into your environment, but then I'll need you to provide more of your page's code.
P.S. I have not seen anything similar to the someEventCallback() call in your code (see 1.2 of my solution); there was only the fireEvent flag. Were you doing something like:
if(fireEvent)
someEventCallback();
later on or something?
P.P.S. Note that fireEvent should be in the global scope in order to work here with setTimeout. If it's not, it's also quite easy to make it work, but the code needs to be altered a bit. If that's your case, tell me and I'll fix it for you.
UPDATE
After a brief search I found out that a similar mechanism is used in Underscore's _.debounce() function. See the Underscore documentation here.
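To illustrate the parallel (this snippet is mine, not part of the original answer), the "respond once per burst" behaviour can be approximated by wrapping the handler with _.debounce and its immediate flag; note it ignores the direction check that the code above performs:

// fire at most once per 500ms burst of wheel events,
// triggering on the leading edge (the third argument)
var debouncedWheelHandler = _.debounce(function () {
    someEventCallback(); // same callback as in the snippet above
}, 500, true);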
Have you thought about using fullpage.js instead?
It has a delay between arriving at a section and the moment you are able to scroll to the next one, which solves part of the problem Mac users experience with trackpads or Apple Magic Mouse.
It would also provide you with some other benefits, such as many more options and methods, and compatibility with touch devices and old browsers with no CSS3 support.
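For reference, a minimal fullpage.js setup looks roughly like this (the markup structure is the documented one; the option value is just an example, so check the plugin docs for your version):

<div id="fullpage">
    <div class="section">Section one</div>
    <div class="section">Section two</div>
</div>

<script>
    $(document).ready(function () {
        $('#fullpage').fullpage({
            scrollingSpeed: 700 // ms per section transition
        });
    });
</script>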
To have something to start with, let's make your solution shorter (therefore easier to understand & debug):
var fireEvent;
var newDelta = deltaY;
var oldDelta = null;

fireEvent = EventCheck();
oldDelta = newDelta;

function EventCheck() {
    if (oldDelta == null) return true;                        // (1.1)
    if (oldDelta * newDelta < 0) return true;                 // (1.2) if directions differ => fire event
    if (Math.abs(newDelta) < Math.abs(oldDelta)) return true; // (1.3) if oldDelta exceeds newDelta in absolute values => fire event
    return false;                                             // (1.4) else => don't fire
}
As you can see, it does exactly what your code does.
However, I can't understand this part of your code (which corresponds to (1.3) in my snippet):
//check to see if the new is lower
if (oldDelta > newDelta) {
    fireEvent = true;
} else {
    fireEvent = false;
}
From the code provided it's unclear how deltaY is calculated. One could assume that delta equals endPosition - initialPosition. If so, oldDelta > newDelta does not mean that the new position is lower, but that the new gap between these two values is bigger. If that's what it means and you still use it, I suppose you are trying to track inertia with it. In that case you should flip the comparison operator (use 'less than' instead of 'greater than' and vice versa). In other words, I'd write:
if(Math.abs(newDelta)>Math.abs(oldDelta)) return true; // (1.3)
You see, now I've used the 'greater than' operator, which means: newDelta exceeds oldDelta in absolute value => it's not inertia and you can still fire the event.
Is that what you're trying to achieve, or have I misinterpreted your code? If so, please explain how deltaY is calculated and what your goal was in comparing the old and new deltas.
P.S. I'd suggest not using if (isMac) at this step, as a problem could also potentially be hiding there.
We are modifying an older pre-existing web app and, as part of that, have begun viewing it in IE10. This app has a third-party menu control (menu9_com.js?) and one of the numerous issues we are noticing is the positioning of this menu in IE7+ in Standards mode. In FF, Chrome, or any version of IE in Quirks mode it is positioned correctly. In Standards mode, however, it is shoved far off to the right.
I've identified the function below as a possible source of the issue. Running in any mode, the value of StartLeft begins at about the same value. In the working modes it finishes at a value which - by definition - works. In the broken modes it is much, much higher.
Though it's not fully clear, I believe the function walks up the DOM from a given target location and adds values to calculate a "total" offset for the menu element it is adding. And I think the issue comes down to the different ways that offsetLeft (and maybe offsetParent?) is handled. So I'm trying to find the best way to get consistent behavior from this function, but I'm just not familiar enough with the intention of the function, nor with the behavior of offsetLeft etc. in the various modes.
Here's the function:
function ClcTrgt() {
    var TLoc = Nav4 ? FLoc.document.layers[TargetLoc]
                    : DomYes ? FLoc.document.getElementById(TargetLoc)
                             : FLoc.document.all[TargetLoc];
    if (DomYes) {
        while (TLoc) {
            StartTop  += TLoc.offsetTop;
            StartLeft += TLoc.offsetLeft;
            TLoc = TLoc.offsetParent;
        }
    }
    else {
        StartTop  += Nav4 ? TLoc.pageY : TLoc.offsetTop;
        StartLeft += Nav4 ? TLoc.pageX : TLoc.offsetLeft;
    }
}
Any suggestions? For example, I'd convert this function to use jQuery, if I knew how.
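(For illustration only, not from the original post: a jQuery take on the same offset walk would typically lean on .offset(), which already accounts for the offsetParent chain and behaves consistently across rendering modes. A sketch, assuming TargetLoc is an element id and StartTop/StartLeft are the globals the script already uses:)

function ClcTrgt() {
    // .offset() returns the element's position relative to its document
    var pos = $('#' + TargetLoc, FLoc.document).offset();
    StartTop += pos.top;
    StartLeft += pos.left;
}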
UPDATE:
I've posted the script on pastebin.
My current direction, in the absence of an actual fix to the script (which may not be worth the work), is adding this function to run after the script itself. I added some markup to facilitate it, and it just takes the menu and puts it where it should be (which is right-aligned with another element I've identified). This is far from optimal, but it works.
function fixMenu9() {
    var pTD = $('#pgMenuDivTD');
    var pMN = $('#pgMenuDiv');
    var additionalOffset = ExpYes ? 3 : 0;
    var leftVal = (parseInt(pTD.offset().left) + parseInt(pTD.css('width'))) - (parseInt(pMN.css('width')) + additionalOffset);
    $('#pgMenuDiv').css('left', leftVal);
}
I'm calling a javascript function that sets the opacity of an iframe an unknown number of times in rapid succession. Basically this tweens the alpha from 0 to 100.
Here is the code:
function setAlpha(value)
{
    iframe.style.opacity = value * .01;
    iframe.style.filter = 'alpha(opacity=' + value + ')';
}
My problem is that for the first time it is working in IE (7) and not in Firefox (3.02). In Firefox I get a delay and then the contentDocument appears with an opacity of 100. If I stick an alert in, it works, so I'm guessing it is a race condition (although I thought javascript was single-threaded) and that the setAlpha function is being called before the last one has finished executing.
Any help would be greatly appreciated. I've read the 'avoiding a javascript race condition' post, but I think this qualifies as something different (plus I can't figure out how to apply that example to this one).
The issue is that most browsers don't repaint until there is a pause in the javascript execution.
This can be solved by using setTimeout, as others have suggested. However, I recommend using something like jQuery, or any of the javascript libraries, to do animations. Running setTimeout 100 times is a bad idea because the length of the animation will vary based on the browser and the speed of the user's computer. The correct way to do animations is to specify how long they should last and check the system time to determine how far the animation should have progressed.
function fadeIn(elem, animation_length) {
    var start = (new Date()).getTime();
    var step = function() {
        window.setTimeout(function() {
            var pct = ((new Date()).getTime() - start) / animation_length;
            elem.style.opacity = Math.min(pct, 1);
            if (pct < 1)
                step();
        }, 20);
    };
    step();
}
[edit:] The code above is only meant to illustrate how to do animations based on the system clock instead of simple intervals. Please use a library to do animations. The code above will not work in IE, because IE uses "filter: alpha(opacity=xx)" instead of "opacity". Libraries will take care of this for you and also provide nice features such as completion events and the ability to cancel the animation.
Javascript doesn't run across multiple threads, so you're safe from race conditions (ignoring upcoming Worker thread support in Safari and Firefox :D).
The simple question is: how are you calling setAlpha multiple times? Firefox, Safari and Opera all coalesce style sheet updates - e.g. they won't repaint or even recalculate style info while JS is running unless they have to. So they will only paint once the JS has completed.
So if you're doing
while(...) setAlpha(...)
they won't update; you'll probably need to use setTimeout to trigger multiple distinct calls to update the style.
An alternative would be to use a library such as jQuery, MooTools, etc., which I vaguely recall provide a simplified mechanism for these types of animations and transitions. As an added bonus, I believe at least a few libraries will also use WebKit transition and animation CSS rules when available (e.g. Safari, and I think the latest Firefox builds).
[edit: caveat: I haven't actually used any of these libraries, I've only read about what they're supposed to do. My sites render the same in Lynx as in any other browser, because I couldn't design my way out of a paper bag :D]
Are you using setTimeout or a tight loop? If you're using just a loop to call the function, switch to using setTimeout.
example:
function setAlpha(value)
{
    iframe.style.opacity = value * .01;
    iframe.style.filter = 'alpha(opacity=' + value + ')';
    if (value < 100) {
        setTimeout(function () { setAlpha(value + 1); }, 20);
    }
}
setAlpha(0);
Because, you see, it's not just javascript that's single-threaded: it's the whole damn browser. If your javascript goes into a tight loop, you hang the whole browser, so the browser pauses waiting for the javascript to finish and doesn't even get a chance to update the screen while your code is rapidly changing some DOM values.
Some browsers are smart enough to delay changes to the DOM until the call stack is empty.
This is generally a smart thing to do. For example, if you call a function that changes an element to yellow, and immediately call a function that changes the same element back to its original state, the browser shouldn't waste time making the change, since it would happen so quickly as to be imperceptible to the user.
The setTimeout(func, 0) trick is commonly used to force Javascript to delay execution of func until the call stack is empty.
In code:
function setAlpha(opacity){
    some_element.style.opacity = opacity;
}

/**
 * This WON'T work, because the browsers won't bother reflecting the
 * changes to the element's opacity until the call stack is empty,
 * which can't happen until fadeOut() returns (at the earliest).
 **/
function fadeOut(){
    for (var i = 0; i < 10; i++){
        setAlpha(0.1 * i);
    }
}

/**
 * This works, because the call stack will be empty between calls
 * to setAlpha().
 **/
function fadeOut2(){
    var opacity = 1;
    setTimeout(function setAlphaStep(){
        setAlpha(opacity);
        if (opacity > 0){
            setTimeout(setAlphaStep, 10);
        }
        opacity -= 0.1;
    }, 0);
}
All this boils down to being a wonderful excuse to use one of the many javascript libraries that handle this tricky stuff for you.
Edit: and here's a good article on the tricky javascript call stack.