Recently, I got this kind of warning, and this is my first time getting it:
[Violation] Long running JavaScript task took 234ms
[Violation] Forced reflow while executing JavaScript took 45ms
I'm working on a group project and I have no idea where this is coming from. It never happened before; it suddenly appeared when someone else got involved in the project. How do I find which file/function causes this warning? I've searched for an answer, but everything I find is about how to fix the problem rather than how to locate its source, and I can't fix it if I can't even find it.
In this case, the warning appears only in Chrome. I tried Edge and didn't get any similar warnings, and I haven't tested Firefox yet.
I even get the error from jquery.min.js:
[Violation] Handler took 231ms of runtime (50ms allowed) jquery.min.js:2
Update: Chrome 58+ hid these and other debug messages by default. To display them click the arrow next to 'Info' and select 'Verbose'.
Chrome 57 turned on 'hide violations' by default. To turn them back on you need to enable filters and uncheck the 'hide violations' box.
"It suddenly appeared when someone else got involved in the project."
I think it's more likely that you updated to Chrome 56. This warning is a wonderful new feature, in my opinion; only turn it off if you're desperate and your assessor will take marks away from you. The underlying problems exist in the other browsers too; those browsers just aren't telling you about them. The Chromium ticket is here, but there isn't really any interesting discussion on it.
These messages are warnings rather than errors because the issue isn't going to cause major problems: it may cause frames to be dropped or otherwise lead to a less smooth experience.
However, they're worth investigating and fixing to improve the quality of your application. The way to do this is to pay attention to the circumstances under which the messages appear, and to do performance testing to narrow down where the issue originates. The simplest way to start performance testing is to insert some code like this:
function someMethodIThinkMightBeSlow() {
const startTime = performance.now();
// Do the normal stuff for this function
const duration = performance.now() - startTime;
console.log(`someMethodIThinkMightBeSlow took ${duration}ms`);
}
If you want to get more advanced, you could also use Chrome's profiler, or make use of a benchmarking library like this one.
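If you'd rather not roll your own timing, the built-in User Timing API does the same job, and its measures also show up in Chrome's Performance recordings. A minimal sketch, reusing the hypothetical function name from above:
performance.mark('suspect-start');
someMethodIThinkMightBeSlow();
performance.mark('suspect-end');
performance.measure('suspect', 'suspect-start', 'suspect-end');
console.log(performance.getEntriesByName('suspect')[0].duration);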
Once you've found some code that's taking a long time (50 ms is Chrome's threshold), you have a few options:
1. Cut out some or all of that task if it's unnecessary.
2. Figure out how to do the same task faster.
3. Divide the code into multiple asynchronous steps.
(1) and (2) may be difficult or impossible, but they're sometimes really easy and should be your first attempts. If needed, it should always be possible to do (3), using something like:
setTimeout(functionToRunVerySoonButNotNow);
or
// This one is not available natively in IE, but there are polyfills available.
Promise.resolve().then(functionToRunVerySoonButNotNow);
You can read more about the asynchronous nature of JavaScript here.
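To make option (3) concrete, here is a minimal sketch; processInChunks, items, and processItem are illustrative names, not a library API:
// Process a large array in slices, yielding to the browser between
// slices so no single task exceeds Chrome's 50 ms budget.
function processInChunks(items, processItem, chunkSize = 100) {
    let index = 0;
    function doChunk() {
        const end = Math.min(index + chunkSize, items.length);
        for (; index < end; index++) {
            processItem(items[index]);
        }
        if (index < items.length) {
            setTimeout(doChunk); // let the browser render before continuing
        }
    }
    doChunk();
}
The right chunk size depends on how expensive each item is; the goal is simply to keep each slice comfortably under the 50 ms threshold.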
These are just warnings, as everyone mentioned. However, if you're keen on resolving them (which you should be), you first need to identify what is causing the warning. There is no single reason for a forced-reflow warning.
Someone has created a list of possible causes; you can follow the discussion for more information.
Here's the gist of the possible reasons:
What forces layout / reflow
All of the below properties or methods, when requested or called in JavaScript, will trigger the browser to synchronously calculate the style and layout. This is also called reflow or layout thrashing, and is a common performance bottleneck.
Element
Box metrics
elem.offsetLeft, elem.offsetTop, elem.offsetWidth, elem.offsetHeight, elem.offsetParent
elem.clientLeft, elem.clientTop, elem.clientWidth, elem.clientHeight
elem.getClientRects(), elem.getBoundingClientRect()
Scroll stuff
elem.scrollBy(), elem.scrollTo()
elem.scrollIntoView(), elem.scrollIntoViewIfNeeded()
elem.scrollWidth, elem.scrollHeight
elem.scrollLeft, elem.scrollTop (reading them, and also setting them)
Focus
elem.focus() can trigger a double forced layout (source)
Also…
elem.computedRole, elem.computedName
elem.innerText (source)
getComputedStyle
window.getComputedStyle() will typically force a style recalc (source)
window.getComputedStyle() will also force layout if any of the following is true:
The element is in a shadow tree
There are media queries (viewport-related ones). Specifically, one of the following (source):
min-width, min-height, max-width, max-height, width, height
aspect-ratio, min-aspect-ratio, max-aspect-ratio
device-pixel-ratio, resolution, orientation
The property requested is one of the following (source):
height, width
top, right, bottom, left
margin [-top, -right, -bottom, -left, or shorthand] only if the margin is fixed
padding [-top, -right, -bottom, -left, or shorthand] only if the padding is fixed
transform, transform-origin, perspective-origin
translate, rotate, scale
webkit-filter, backdrop-filter
motion-path, motion-offset, motion-rotation
x, y, rx, ry
window
window.scrollX, window.scrollY
window.innerHeight, window.innerWidth
window.getMatchedCSSRules() only forces style
Forms
inputElem.focus()
inputElem.select(), textareaElem.select() (source)
Mouse events
mouseEvt.layerX, mouseEvt.layerY, mouseEvt.offsetX, mouseEvt.offsetY
(source)
document
doc.scrollingElement only forces style
Range
range.getClientRects(), range.getBoundingClientRect()
SVG
Quite a lot; no exhaustive list has been made, but Tony Gentilcore's 2011 Layout Triggering List pointed to a few.
contenteditable
Lots & lots of stuff, …including copying an image to clipboard (source)
Check more here.
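To see how the list above plays out in code, here is a hedged sketch (the .box elements are hypothetical) of interleaved reads and writes causing thrashing, and of the read-then-write fix:
// Thrashing: each offsetWidth read follows a style write, so the
// browser is forced to reflow on every iteration.
const boxes = document.querySelectorAll('.box');
boxes.forEach(box => {
    box.style.width = (box.offsetWidth / 2) + 'px';
});

// Batched: all reads first, then all writes, so at most one reflow.
const widths = Array.from(boxes, box => box.offsetWidth);
boxes.forEach((box, i) => {
    box.style.width = (widths[i] / 2) + 'px';
});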
Also, here's Chromium source code from the original issue and a discussion about a performance API for the warnings.
Edit: There's also an article on how to minimize layout reflow on PageSpeed Insights by Google. It explains what browser reflow is:
Reflow is the name of the web browser process for re-calculating the positions and geometries of elements in the document, for the purpose of re-rendering part or all of the document. Because reflow is a user-blocking operation in the browser, it is useful for developers to understand how to improve reflow time and also to understand the effects of various document properties (DOM depth, CSS rule efficiency, different types of style changes) on reflow time. Sometimes reflowing a single element in the document may require reflowing its parent elements and also any elements which follow it.
In addition, it explains how to minimize it:
Reduce unnecessary DOM depth. Changes at one level in the DOM tree can cause changes at every level of the tree, all the way up to the root and all the way down into the children of the modified node. This leads to more time being spent performing reflow.
Minimize CSS rules, and remove unused CSS rules.
If you make complex rendering changes such as animations, do so out of the flow. Use position: absolute or position: fixed to accomplish this (see the sketch after this list).
Avoid unnecessarily complex CSS selectors, descendant selectors in particular, which require more CPU power to do selector matching.
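A rough sketch of the out-of-flow advice, assuming a hypothetical #popup element: once the element is absolutely positioned, moving it does not reflow the surrounding content, and using transform avoids layout work entirely:
const popup = document.querySelector('#popup');
popup.style.position = 'absolute';           // take it out of the normal flow
popup.style.transform = 'translateX(200px)'; // move it without reflowing siblings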
A couple of ideas:
Remove half of your code (perhaps by commenting it out).
Is the problem still there? Great, you've narrowed down the possibilities! Repeat.
Is the problem not there? Ok, look at the half you commented out!
Are you using any version control system (e.g., Git)? If so, git checkout some of your more recent commits. When was the problem introduced? Look at the commit to see exactly what code changed when the problem first arrived.
I found the root of this message in my code, which searched and hid or showed nodes (offline). This was my code:
search.addEventListener('keyup', function() {
    // Reads innerText (a layout-dependent property) and toggles classes in
    // the same loop, so reads and writes interleave and force a reflow on
    // every iteration.
    for (const node of nodes)
        if (node.innerText.toLowerCase().includes(this.value.toLowerCase()))
            node.classList.remove('hidden');
        else
            node.classList.add('hidden');
});
The performance tab (profiler) showed the event taking about 60 ms.
Now:
search.addEventListener('keyup', function() {
    // First pass: only reads (innerText), collecting nodes into two lists.
    const nodesToHide = [];
    const nodesToShow = [];
    for (const node of nodes)
        if (node.innerText.toLowerCase().includes(this.value.toLowerCase()))
            nodesToShow.push(node);
        else
            nodesToHide.push(node);
    // Second pass: only writes (class changes), so the browser reflows once.
    nodesToHide.forEach(node => node.classList.add('hidden'));
    nodesToShow.forEach(node => node.classList.remove('hidden'));
});
The performance tab (profiler) now shows the event taking about 1 ms.
And I feel that the search works faster now (229 nodes).
In order to identify the source of the problem, run your application, and record it in Chrome's Performance tab.
There you can check the various functions that took a long time to run. In my case, the one that correlated with the warnings in the console came from a file loaded by the AdBlock extension, but it could be something else in your case.
Check these files and try to identify if this is some extension's code or yours. (If it is yours, then you have found the source of your problem.)
Look in Chrome DevTools under the Network tab and find the scripts which take the longest to load.
In my case there was a set of Angular add-on scripts that I had included but not yet used in the app:
<script src="//cdnjs.cloudflare.com/ajax/libs/angular-ui-router/0.2.8/angular-ui-router.min.js"></script>
<script src="//cdnjs.cloudflare.com/ajax/libs/angular-ui-utils/0.1.1/angular-ui-utils.min.js"></script>
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.3.9/angular-animate.min.js"></script>
<script src="//ajax.googleapis.com/ajax/libs/angularjs/1.3.9/angular-aria.min.js"></script>
These were the only JavaScript files that took longer to load than the time specified in the "Long Running Task" warning.
All of these files run on my other websites with no errors generated, but I was getting this "Long Running Task" error on a new web app that barely had any functionality. The error stopped immediately upon removing them.
My best guess is that these Angular add-ons were looking recursively into increasingly deep sections of the DOM for their start tags; finding none, they had to traverse the entire DOM before exiting, which took longer than Chrome expects, hence the warning.
I found a solution in the Apache Cordova source code. They implement it like this:
var resolvedPromise = typeof Promise == 'undefined' ? null : Promise.resolve();
var nextTick = resolvedPromise ? function(fn) { resolvedPromise.then(fn); } : function(fn) { setTimeout(fn); };
A simple implementation, but a smart approach:
On Android 4.4 and above, it uses Promise.
For older browsers, it falls back to setTimeout().
Usage:
nextTick(function() {
// your code
});
After applying this trick, all the warning messages were gone.
Adding my insights here, as this thread was the go-to Stack Overflow question on the topic.
My problem was in a Material-UI app (early stages): the placement of a custom theme provider was the cause. It showed up when I did some calculations that forced a re-render of the page (one component, "display results", depends on what is set in other "input section" components). Everything was fine until I updated the state that forces the "results component" to re-render. The main issue was that I had a Material-UI theme (https://material-ui.com/customization/theming/#a-note-on-performance) in the same render tree (App.js / return...) as the "results component", SummaryAppBarPure.
The solution was to lift the ThemeProvider one level up (index.js) and wrap the App component there, thus not forcing the ThemeProvider to recalculate and draw / layout / reflow.
before
in App.js:
return (
<>
<MyThemeProvider>
<Container className={classes.appMaxWidth}>
<SummaryAppBarPure
//...
in index.js
ReactDOM.render(
<React.StrictMode>
<App />
//...
after
in App.js:
return (
<>
{/* move theme to index. made reflow problem go away */}
{/* <MyThemeProvider> */}
<Container className={classes.appMaxWidth}>
<SummaryAppBarPure
//...
in index.js
ReactDOM.render(
<React.StrictMode>
<MyThemeProvider>
<App />
//...
This was added in the Chrome 56 beta, even though it isn't in this changelog from the Chromium Blog: Chrome 56 Beta: “Not Secure” warning, Web Bluetooth, and CSS position: sticky
You can hide this in the filter bar of the console with the Hide violations checkbox.
This is a violation warning that Google Chrome shows when the Verbose logging level is enabled.
Explanation:
Reflow is the name of the web browser process for re-calculating the positions and geometries of elements in the document, for the purpose of re-rendering part or all of the document. Because reflow is a user-blocking operation in the browser, it is useful for developers to understand how to improve reflow time and also to understand the effects of various document properties (DOM depth, CSS rule efficiency, different types of style changes) on reflow time. Sometimes reflowing a single element in the document may require reflowing its parent elements and also any elements which follow it.
Original article: Minimizing browser reflow by Lindsey Simon, UX Developer, posted on developers.google.com.
And this is the link Google Chrome gives you in the Performance profiler, on the layout profiles (the mauve regions), for more info on the warning.
If you're using Chrome Canary (or Beta), just check the 'Hide Violations' option.
For what it’s worth, here are my 2¢ when I encountered the
[Violation] Forced reflow while executing JavaScript took <N>ms
warning. The page in question is generated from user content, so I don’t really have much influence over the size of the DOM. In my case, the problem is a table of two columns with potentially hundreds, even thousands of rows. (No on-demand row loading implemented yet, sorry!)
Using jQuery, on keydown the page selects a set of rows and toggles their visibility. I noticed that using toggle() on that set triggers the warning more readily than using hide() & show() explicitly.
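In rough code, the difference was between something like the following; $rows and the matches predicate stand in for my actual selection logic:
// Triggers the warning more readily: toggle() queries the current
// visibility of each row (a layout read) before flipping it.
$rows.toggle();

// Gentler: decide visibility up front, then hide/show in two batches.
$rows.filter(matches).show();
$rows.not(matches).hide();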
For more details on this particular performance scenario, see also this article.
The answer is that this is a feature in newer Chrome browsers: it alerts you when the web page causes excessive browser reflows while executing JavaScript.
Forced reflow often happens when you have a function that is called multiple times before execution has finished. For example, the problem may show up on a smartphone but not in a desktop browser.
I suggest using setTimeout to solve the problem.
This isn't very important, but to repeat: the problem arises when you call a function several times, not when a single function takes more than 50 ms. I think the other answers are mistaken on this point.
Disable the calls one by one and reload the page to see whether it still produces the error. If a second script causes the error, use a setTimeout based on the duration of the violation.
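For example, a minimal debounce sketch along those lines; expensiveHandler and the 100 ms delay are placeholders to adapt:
let pendingCall = null;
function debouncedHandler() {
    clearTimeout(pendingCall);
    // Of a rapid burst of calls, only the last one actually runs.
    pendingCall = setTimeout(expensiveHandler, 100);
}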
This is not an error, just a message. To get rid of the message, change
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> (for example)
to
<!DOCTYPE html> (the Firefox source expects this)
The message was shown in Google Chrome 74 and Opera 60. After changing the doctype, the console was clear: 0 verbose messages.
A solution approach
I created a web page for viewing images. The page includes some other code that I did not write. It loads 40 small images upon load; then the user scrolls down, and additional pages of 40 images are loaded via ajax. Once I get to 15-20 pages, I notice the page begins to slow significantly: CPU can go up to 100% and memory can go over 3 GB. Then I inevitably get the modal saying that jQuery is taking too long to execute, asking me if I want to stop the script. Now, I realize that a page with up to 800 images is a big load, but the jQuery issue suggests to me that some code may also be iterating over this larger and larger group of DOM objects. It appears to get almost exponentially slower as I pass 15 pages or so; once I get to 20 pages it becomes almost unusable.
First of all, is it even possible to run a page efficiently, even with minimal JS, when you have this many images? Secondly, is there a recommended way to "trace" JS and see what kinds of functions are getting executed, to help determine the most likely culprit? This is most important to me: is there a good way to do this in Firebug?
Thanks :)
EDIT - I found my answer. I had some older code which was being used to replace images that failed to load with a generic image. This code used jQuery's .each() method and thus iterated over the entire page, including every new ajax addition, each time more images were loaded. I am going to set a class in CSS for the images that need to be checked, so that the ajax-loaded images are unaffected.
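In code, that fix looks roughly like this; $newContent and the placeholder image path are hypothetical:
// Old approach: re-scans every image on the page after each ajax load.
// $('img').each(function() { /* replace broken images */ });

// Scoped approach: only the freshly inserted images get the fallback.
$newContent.find('img').on('error', function() {
    this.src = '/images/placeholder.png'; // hypothetical generic image
});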
Firebug and the other debugging tools let you profile your functions; you can see how long they take to run and how many times they have been called.
http://getfirebug.com/javascript
See: Profile JavaScript performance
Another useful tool to look into is the console.profile() function:
console.log('Starting Profile');
console.profile();
SuspectFunction();
console.profileEnd();
Through the console window in the debugger you can see the profile results.
The best tool I have used is https://developers.google.com/web-toolkit/speedtracer/ for Chrome
To answer your first question, 15 pages of images should not be a problem for a computer to handle. Google loads up to 46 pages of images without lagging at all, although it does stop you from loading more after that.
To answer your second question, there are many ways to trace JS code. Since you are doing performance-related debugging, I'd go with a timestamped console log:
console.log(" message " + new Date());
I'd put one at the beginning and end of each function you're interested in measuring, then read through the log to see how long each one takes to execute. Comparing the timestamps shows what excess code is executing and how long it takes.
Finally, in Firebug, go to the Console tab and click Profile before you start scrolling down the page. Then scroll to your 15th page and click Profile again. It breaks down each function called and the amount of time it took.
I prefer to use the timer function in Firebug or Chrome; it can be called like this:
function someFunction() { /* ... */ }

console.time('someFunction timer');
someFunction(); // time the call itself, not the declaration
console.timeEnd('someFunction timer');
This isn't as robust as the profiler functions, but it should give you an idea of how long functions are taking.
Also, if you are running at 100% CPU and 3 GB of memory, you almost certainly have a memory leak. You may want to consider removing some of the initial images when more pages are loaded in. For example, after 5 pages are shown, you remove the first page when the user views the 6th page.
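A rough sketch of that suggestion, assuming each ajax batch of images is wrapped in a container with a hypothetical image-page class:
var MAX_PAGES = 5; // tune to taste
function addPage($page) {
    $('body').append($page);
    var $pages = $('.image-page');
    if ($pages.length > MAX_PAGES) {
        $pages.first().remove(); // drop the oldest page so the DOM stays bounded
    }
}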
I was able to fix the problem by going over my code again. I was loading new images via ajax, but I had an older line of code that was checking all images, i.e. $('img'), to replace any that failed to load with a generic image. This means that as I continually loaded new images, this selector had to iterate over the entire growing DOM again and again. I altered that code and now the page is flying! Thanks everyone for the help.
I wrote an infinite loop in my javascript code. Using WebKit's Web Inspector, how do I terminate execution? I have to quit Safari and reopen it again (after changing my code of course).
EDIT: To be more specific, I'm looking for a way to enter the looping executing process/thread to see why the loop isn't terminating. An analogy would be in GDB where I could do a ^C and break into the process. I'm not looking for a way to kill my web browser. I'm pretty good at that already.
I'm not familiar with WebKit, but this sounds like a common problem that I usually debug as follows: declare an integer outside the scope of the loop, increment it on each iteration, and throw an exception when the iteration count exceeds the maximum number of iterations that could possibly be valid. In pseudo-code, something like the following could be used to debug this problem:
var iterations = 0;
var greatestPossibleNumberOfValidIterations = 500;
while (shouldLoopBooleanTest) {
    iterations++;
    if (iterations > greatestPossibleNumberOfValidIterations) {
        // Do debugging/error handling here; throwing also breaks the loop:
        throw new Error('Loop exceeded ' + greatestPossibleNumberOfValidIterations + ' iterations');
    }
}
I don't expect that this is specific enough to warrant an accepted answer, but I hope it helps you solve your problem.
Have you tried top to look at the process tree? Find the PID of the program and then type kill -9 [PID].
Have you tried using F11? This is the key for step into: https://trac.webkit.org/wiki/WebInspector
Using Firefox (I know the question doesn't specify Firefox, but I'm throwing this in, in case it helps somebody), Safari, and Chrome, I found that there isn't a consistent way to break into an infinite loop, but there are ways to set up execution so that you can break into a loop when you need to. See my test code below.
Firefox
Utter trash. It just stood there spinning its rainbow. I had to kill it.
Safari
The nice thing about Safari is that it will eventually throw up a dialog asking if you want to stop. You should say yes. Then hit Command-Option-I to bring up the Web Inspector. The source should pop up saying that the JavaScript exceeded the timeout. Hit the pause button on the right-hand side towards the top, then refresh. The code will reload; now step through: I like using Command-; because it steps through every call. If you have complex code you might never get to the end. Don't hit continue, though (Command-/), or you'll be back at square one.
Chrome
The most useful of the three. If you naively load the code, it will keep going forever. But, before loading the page, open the Web Inspector and select 'Load resources all the time'. Then reload the page. While the page is trying to load, you can click over to the Scripts panel and pause the JavaScript while it is running. This is what I was looking for.
Code
<html>
<head></head>
<body onload="TEST.forever.loop()">
Nothing
</body>
<script type="text/javascript">
TEST = {};
TEST.forever = {
loop : function() {
var i = 0;
while (i >= 0) {
i++;
}
}
};
</script>
</html>
I get an error message from Firefox: "Unresponsive script". This error is due to some JavaScript I added to my page.
I was wondering whether the unresponsiveness is caused exclusively by looping code (functions calling each other cyclically, or endless for loops), or whether there might be other causes?
Could you help me debug this kind of error?
thanks
One way to avoid this is to wrap your poorly performing piece of code in a timeout, like this:
setTimeout(function() {
// <YOUR TIME CONSUMING OPERATION GOES HERE>
}, 0);
This is not a bulletproof solution, but it can resolve the issue in some cases.
According to the Mozilla Knowledge Base:
When JavaScript code runs for longer than a predefined amount of time, Firefox will display a dialog that says Warning: Unresponsive Script. This time is given by the settings dom.max_script_run_time and dom.max_chrome_script_run_time. Increasing the values of those settings will cause the warning to appear less often, but will defeat the purpose: to inform you of a problem with an extension or web site so you can stop the runaway script.
Furthermore:
Finding the source of the problem
To determine what script is running too long, click the Stop Script button on the warning dialog, then go to Tools | Error Console. The most recent errors in the Error Console should identify the script causing the problem.
Checking the error console should make it pretty obvious what part of your javascript is causing the issue. From there, either remove the offending piece of code or change it in such a way that it won't be as resource intensive.
EDIT: As mentioned in the comments to the author of the topic, Firebug is highly recommended for debugging problems with javascript. Jonathan Snook has a useful video on using breakpoints to debug complex pieces of javascript.
We need to follow these steps to stop the unresponsive-script warning in Firefox.
Step 1
Launch Mozilla Firefox.
Step 2
Click anywhere in the address bar at the top of the Firefox window, to highlight the entire field.
Step 3
Type "about:config" (omit the quotes here and throughout) and press "Enter." A "This might void your warranty!" warning message may come up; if it does, click the "I'll be careful, I promise!" button to open the page that contains all Firefox settings.
Step 4
Click once in the "Search" box at the top of the page and type "dom.max_script_run_time". The setting is located and displayed; it has a default value of 10.
Step 5
Double-click the setting to open the Enter Integer Value window.
Step 6
Type "0" and click "OK" to set the value to zero. Firefox scripts now run indefinitely, and will not throw any script errors.
Step 7
Restart Mozilla Firefox.
There's an excellent solution in this question: How can I give control back (briefly) to the browser during intensive JavaScript processing?, using the Async jQuery Plugin. I had a similar problem and solved it by changing my $.each to $.eachAsync.
There could be an infinite loop somewhere in the code; start by commenting out sections to identify which one is causing it.
Too many loops: there's a chance that your counter variable names clash, causing a variable to keep resetting and producing an infinite loop.
Try as much as possible to create hashes for your objects, so that read time is O(1); this effectively caches the data (see the sketch after this list).
Avoid JS library helpers where they add overhead, e.g. .html() vs .innerHTML.
setTimeout(): yes and no; it depends on how you chunk your code.
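A quick sketch of the hashing tip; records and someId are hypothetical:
// Build the index once...
const byId = new Map(records.map(r => [r.id, r]));
// ...then each lookup is O(1) instead of scanning with records.find().
const item = byId.get(someId);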
Does having (approx.) 100 HTML validation errors affect my page loading speed? Currently the errors on my pages don't break the page in ANY browser, but I'd spend the time to clear them anyway if it would improve my page loading speed.
If not on desktops, how about mobile devices like iPhone or Android? (For example, the N1 and Droid load pages much slower than the iPhone, although they both use the WebKit engine.)
Edit: My focus here is speed optimization, not cross-browser compatibility (which is already achieved). Google and other big sites seem to use invalid HTML; is that for speed, for compatibility, or both?
Edit #2: I'm not in quirks mode, i.e. I use the XHTML Strict doctype and my source looks great and is mostly valid, but 100% valid HTML usually requires design (or some other kind of) sacrifices.
Thanks
It doesn't affect loading speed. Bad data is transferred over the wire just as fast as good data.
It does affect rendering speed, though, and in some cases even positively: MSIE tends to be abysmally slow in standards mode. In most cases, however, render speed will be somewhat slower due to quirks mode, which is less efficient and more paranoid; instead of just executing your markup like a well-written program, the browser tries its best to fish some meaningful content out of what is essentially tag soup.
Some validation errors, like a missing ALT attribute or a missing / at the end of self-closing tags, won't affect rendering at all, but others, like a missing closing tag or obsolete attributes, may impact performance seriously.
It might affect loading speed, or it might not. It depends on the kind of errors you're getting.
I'd say that in most cases it's likely to be slower, because the browser has to handle these errors. For instance, if you forget to close a div tag, some browsers will close it for you; this takes processing time and increases the loading time.
I think the time delta between no errors and 100 errors would be minimal. But if you have that many errors, you should consider fixing your code :)
Probably yes, and here's why.
If your code is valid for the W3C doctype you are using, the browser doesn't have to put in extra effort to try to fix your code (that error-recovery rendering is what quirks mode is about). It stands to reason that if your code validates, the browser won't have to try to piece the website back together.
Remember that it's always beneficial to make your code validate, if only to ensure a consistent design across the popular browsers. Finally, you'll probably find that after fixing the first few errors, your list of 100 errors decreases drastically.
In theory, yes, it will decrease page load times, because the browser has to do less work handling errors and so on.
However, it depends on the nature of the validation errors. If you're improperly nesting tags (which may actually be valid in HTML4), the browser has to do a little more work figuring out where elements start and end, and this is the kind of thing that can cause cross-browser problems.
If you're simply using unofficial attributes (say, the target attribute on links) then support for that is either built into the browser or not. If the browser understands it, it will do something with it, otherwise it will ignore the attribute.
One thing that will ramp up your validation errors is using <br> under XHTML or <br /> under HTML. Neither should increase loading times (although <br /> takes a fraction longer to download).