stop a long webgl process from freezing chrome - javascript

In WebGL, long-running GLSL code can freeze the whole computer.
When browsing Shadertoy, some examples, especially in fullscreen mode, have frozen my Mac, like this one:
Path Tracer MIS (progressive)
Is there any way to detect whether a shader is taking too long and then auto-kill it at the JavaScript level? Or is this beyond a developer's reach and something that needs to be handled by browser developers?
Another question: is there a way (a plugin or external application on Mac or Linux) to prevent long GPU access by Chrome from freezing the computer?

It's not just WebGL; any GPU code can freeze the computer. The problem is that, unlike CPUs, current GPUs are non-preemptable. That means if you give them 30 minutes of work to do, they will execute that work for 30 minutes, and there's no way to preempt them to do something else like you can with a CPU (no multi-tasking).
The solution on a good OS is that the OS runs a timeout, and if the GPU takes too long the OS resets the GPU (effectively rebooting it). Windows has done this since Windows Vista. macOS has only sort of added it recently, and the issue for Apple is that they didn't design the OS to recover from those kinds of situations the way Microsoft did, so it's taking them longer to get it all working.
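From JavaScript you can at least observe the aftermath of such a reset: when the OS reboots the GPU, the browser loses the WebGL context and fires an event on the canvas. A minimal sketch of handling it, assuming a page with one canvas (webglcontextlost/webglcontextrestored are standard WebGL events; the empty draw loop is a placeholder):

// React to the GPU being reset by the OS (e.g. Windows TDR).
const canvas = document.querySelector('canvas');
const gl = canvas.getContext('webgl');
let rafId = 0;

function frame() {
  // ... draw calls go here ...
  rafId = requestAnimationFrame(frame);
}

canvas.addEventListener('webglcontextlost', (e) => {
  e.preventDefault();          // signal that we intend to handle restoration
  cancelAnimationFrame(rafId); // stop issuing GL calls on a dead context
});

canvas.addEventListener('webglcontextrestored', () => {
  // all shaders, buffers and textures are gone and must be recreated here
  rafId = requestAnimationFrame(frame);
});

rafId = requestAnimationFrame(frame);

This doesn't prevent the freeze, but it lets the page recover gracefully once the OS has stepped in.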
The good thing is there's not much incentive to do it on purpose. If you visit some website that locks your computer you're probably going to stop visiting that site so the problem is usually self correcting since people want you to visit their site.
On the other hand it's a problem for sites like shadertoy where people write extremely long and complicated shaders that are really only designed to run on top end GPUs and not common notebook GPUs.
As far as what you can do about it:
Ask Apple to fix their drivers and OS so they can't be crashed.
Ask Shadertoy not to automatically run heavy shaders. Maybe they could add some timing info or another heuristic to decide whether to warn the user that a shader might be too heavy for their machine.
Ask GPU companies to make their GPUs preemptable.
Otherwise there's not much even the browser can do. The browser has no idea if the shader is going to be fast or slow before it runs it.
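That said, a page that hosts user shaders could implement the timing heuristic suggested above: draw one trial frame at reduced size, force a GPU sync, and measure the elapsed wall-clock time before committing to fullscreen. A rough sketch, assuming you already have a drawScene function; the 50 ms budget is an arbitrary example:

// Heuristic sketch: time one warm-up frame before running a shader full-size.
// Reading back a pixel forces the browser to wait for queued GPU work,
// so the elapsed time approximates the GPU cost of the frame.
function isShaderTooHeavy(gl, drawScene, budgetMs = 50) {
  const start = performance.now();
  drawScene(gl); // issue the draw calls for one frame
  gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, new Uint8Array(4)); // GPU sync
  return (performance.now() - start) > budgetMs;
}

It's crude (the first frame includes shader warm-up costs), but it can catch the worst offenders before they run at full resolution.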

Related

How to target WebGL compatibility solely in terms of hardware performance

Recently I've been working with Three.js in order to render 3D scenes in WebGL. As a fallback, I've also maintained versions using Three's CanvasRenderer that, while they do not have as many polygons, lighting techniques, or effects, can still run in browsers that do not have WebGL capability (and can also run in Safari if the user refuses to enable WebGL themselves).
However, today I realized that compatibility is not merely a question of whether or not the user's browser supports WebGL, but also whether the user's hardware is powerful enough to render WebGL smoothly in the first place (i.e. run at 60 fps). While there is a lot of data out there illustrating what percentage of the population uses which web browser, I had trouble finding distributions of users whose computers are more than capable of running WebGL.
What exactly is the percentage of the population that -- assuming they are using a WebGL capable web browser -- can run your average WebGL page at 60 fps? And if it is a large amount of the population who struggle with rendering in WebGL, what would be the best way to detect such shortcomings in hardware? A javascript solution would be ideal since we are already working with Three in that language.
It is very possible that I may be misunderstanding the situation as this does not seem to be a widely discussed issue in the world of WebGL development. If such is the case please let me know so I may better understand how to work with Three in the future.
I had the same issue professionally working with Android - we had a 3D-intensive app prior to the release of a hardware-accelerated Android OS, and it ran smoothly on some phones and horribly on others. This issue crops up on any open platform like Android or Windows. iPhones/iPads have a very fixed hardware set, but on the PC you will be getting a range of video cards.
If you use Cg/HLSL shaders you compile to a specific hardware profile which describes the lower bound of the vertex, geometry and pixel shaders. However, I'm guessing you're using GLSL shaders and compiling at runtime? GLSL does not target hardware profiles and will simply fail to compile the shader if the hardware is incapable of running it.
Of course, this only gives you an idea of whether the program can execute, not how well it can execute. A shader can compile just fine and run at 5fps.
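For what it's worth, that compile-time failure is easy to detect from JavaScript with the standard WebGL shader API. A minimal sketch (the GLSL source string is supplied by the caller):

// Compile a shader and report whether this hardware/driver accepts it.
// Note this only tells you the shader compiles, not how fast it runs.
function compiles(gl, type, source) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  const ok = gl.getShaderParameter(shader, gl.COMPILE_STATUS);
  if (!ok) console.warn(gl.getShaderInfoLog(shader));
  gl.deleteShader(shader);
  return ok;
}

// usage: compiles(gl, gl.FRAGMENT_SHADER, fragmentSource)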
I was worried about users seeing a bad graphics experience so for our app on the first run I rendered a model to an offscreen target for a brief time and took an average FPS, then disabled the 3D section if the score was too low. You know what? THE USERS HATED IT. Plenty of them had 3D-capable phones and felt cheated that they couldn't see the 3D sections that were in the screenshots. I pushed an update to remove the check, so now if you have bad hardware you'll see the 3D scene running slowly. And we actually had fewer complaints with that experience, even though I hated how inelegant it seemed.
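In the browser, the same benchmark idea can be sketched with requestAnimationFrame: render the scene for a second or two, compute the average frame rate, and decide from there. Everything below is illustrative (renderFrame and showFallback are hypothetical app functions, and the 30 fps cutoff is arbitrary); as the anecdote shows, think twice before actually hiding content based on the score:

// Measure average FPS over a short warm-up period, then decide.
function benchmarkFps(renderFrame, durationMs, onDone) {
  const start = performance.now();
  let frames = 0;
  function tick(now) {
    renderFrame(); // hypothetical: draws one frame of the scene
    frames++;
    if (now - start < durationMs) {
      requestAnimationFrame(tick);
    } else {
      onDone(frames / ((now - start) / 1000)); // frames per second
    }
  }
  requestAnimationFrame(tick);
}

// usage: benchmarkFps(renderFrame, 2000, fps => { if (fps < 30) showFallback(); });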
On the positive side, both OS X and Windows (since Vista) support hardware accelerated compositing in the OS (GL/D3D renders all your windows to the screen now). This has pushed most every PC vendor to include 3D acceleration in their computers. So I think it's a safe bet to assume people have a 3D card - whether they have WebGL will be more of a limiting factor. The variety of hardware and drivers also gives you wacky rendering bugs on one machine that appear nowhere else - PC game developers test on a range of cards prior to release in an attempt to mitigate that issue. But these wacky bugs are going to be a tiny fraction of your users at most and are often trivial, so it isn't worth worrying about unless users are complaining.
If you're still worried about performance, the standard solution on the PC is to allow the user to adjust the graphics settings to reduce the complexity of the pixel shader / number of triangles on the screen / etc. Make sure to look at real-world data about what hardware most of your customers own and use that range for targeting and testing.
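With three.js, the cheapest such setting is the render resolution. A sketch, assuming an existing THREE.WebGLRenderer (setPixelRatio and setSize are real three.js methods; the 'low' tier and 0.5 factor are arbitrary examples):

// Let the user trade sharpness for speed by lowering the render resolution.
function applyQuality(renderer, quality) {
  const ratio = quality === 'low' ? 0.5 : window.devicePixelRatio;
  renderer.setPixelRatio(ratio); // render fewer pixels; the canvas is scaled up
  renderer.setSize(window.innerWidth, window.innerHeight);
}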
Anyway, there's a real-world anecdote, HTH.
If you work with three.js you can query for extensions:
const gl = renderer.getContext();         // the underlying WebGLRenderingContext
const exts = gl.getSupportedExtensions(); // array of supported extension names
so if you don't get something like float textures, you get some idea of what you're working with. WebGL is built around the lowest common denominator.
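For example, probing for float textures specifically (OES_texture_float is the WebGL 1 extension name):

// getExtension returns null if the extension is unsupported.
const floatTextures = gl.getExtension('OES_texture_float');
if (!floatTextures) {
  // fall back to 8-bit render targets, or skip the effect entirely
}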

Why does my JS web app cause Firefox CPU usage when doing nothing?

I've created a complex JavaScript web app. It works very well and is fast. But I've noticed that Firefox is continuously using the CPU while the app is open in the browser window but not doing anything. I haven't programmed in any setIntervals, setTimeouts, automatic regular requests to the server - nothing. The only thing it has is the GUI, which is already instantiated and sitting there doing nothing (since the last interaction with the user), and the event handlers which are waiting, attached to various GUI controls. Just to make sure, I used the profiler in Firebug and it confirmed that there was no activity to profile.
I did have 3 FF add-ons running: Avast Online Security, Firebug, and Web Developer. I have disabled these, and it seems to have reduced the CPU usage from say 4-5% to 1-2%. I know this is low, but considering I have only one tab open with my app in it, I would like to know where that 1-2% is being used. The memory consumption is not increasing (it fluctuates, but doesn't increase over time). I have done some quick checks with IE (9), but the CPU usage tends to stay at 0%.
I can't provide any code or solid starting points, but I just thought someone might have some ideas.

Is it possible to check how much memory the browser is using and clear the memory currently used by JavaScript?

I am stuck with a big problem: I am working on a big project that hangs the browser automatically as the JavaScript executes.
"How can I detect how much memory JavaScript is using and clear the memory at regular intervals? Is that possible?"
You don't have any way to play with memory. JavaScript runs in a sandboxed environment, so you have no access to memory management of any kind. The garbage collector takes care of this; you can influence it to some degree, but its timing is unpredictable. Don't count on it.
Rather, for your problem, you can use Chrome Inspector's Profiler.
What does it do? Well... it profiles the webpage you're in. You can see how long each function takes and, most importantly, where your bottleneck is.
Try in Chrome, specifically.
Chrome's V8 has a brilliant generational garbage collector, where three kinds of polling happen: three threads constantly poll the three generation types, and I think they run at 10, 50 and 200 millisecond intervals (I may have the figures wrong, but the principle is the same, with the interval increasing for older generations).
This is aggressive, and ensures that memory usage remains low.
In spite of this, if your code is hogging memory in Chrome, then you can be sure that the issue is with the code. It could be that:
(a) Your code is really unoptimized, or
(b) It is really working on very large data that is probably not best suited for the client (e.g. an excessively heavy page that has tons of widgets, dom nodes etc.)
Care to post some snippets?
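One caveat to "you don't have any way": Chrome does expose a non-standard, read-only snapshot of the JS heap, which is handy for spotting growth over time even though it gives you no control over collection. A sketch (Chrome-only; performance.memory does not exist in other browsers):

// Non-standard, Chrome-only: rough numbers for the current JS heap.
// Useful for logging growth over time, not for freeing anything.
if (performance.memory) {
  const mb = performance.memory.usedJSHeapSize / (1024 * 1024);
  console.log('JS heap in use: ' + mb.toFixed(1) + ' MB');
}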

Minimum system requirements to run webGL?

I want to know the minimum system requirements to run a WebGL application smoothly.
I know this is a broad question, so what would be the minimum requirements to run the following experiments?
1.) http://www.ro.me/
2.) http://alteredqualia.com/three/examples/webgl_loader_ctm_materials.html
I would have done this myself,but i don't have a machine with low specs,and yes i did try google.
There are no minimum requirements in terms of system specs. I guess if you wanted to be really pedantic you could claim that a system needs a GPU capable of processing shaders (roughly DX9 level) but that basically describes every device with a GPU these days. (Except the Wii. Go figure...)
On a more practical level, you must have a device whose drivers have not been blacklisted due to instability issues. You can typically find the blacklist for a given browser by following the "visit support site" link on http://get.webgl.org/
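The practical check, then, is simply whether a context can be created at all; a blacklisted driver makes getContext return null. A minimal sketch:

// Returns true if this browser/driver combination will hand out a WebGL context.
function webglAvailable() {
  const canvas = document.createElement('canvas');
  // 'experimental-webgl' covers older browsers that used the prefixed name
  const gl = canvas.getContext('webgl') || canvas.getContext('experimental-webgl');
  return !!gl;
}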
Beyond that, however, it's all a matter of what the WebGL app is trying to render and what the device's bottlenecks are. For example: I've tried several WebGL demos on a Galaxy Tab with Opera and Firefox, and the framerate is limited to 10-12 FPS no matter what I'm rendering. A full, complex scene renders at the same speed as a single cube. This suggests that the device is probably getting stuck on the compositing phase (where it places the WebGL canvas in the page itself) and as such while the device may be capable of some very complex rendering the limitations of the browsers are holding it back. In comparison, I've seen highly detailed scenes running on iOS devices (with a hacked browser) of similar power at a solid 60FPS. So the exact hardware you have is only one part of the overall equation.
But really the best answer you can possibly get at this point is to pick up a device and try it!

Some kind of task manager for JavaScript in Firefox 3?

Recently I have been having issues with Firefox 3 on Ubuntu Hardy Heron.
I will click on a link and it will hang for a while. I don't know if it's a bug in Firefox 3 or a page running too much client-side JavaScript, but I would like to try and debug it a bit.
So, my question is "is there a way to have some kind of process explorer, or task manager sort of thing for Firefox 3?"
I would like to be able to see what tabs are using what percent of my processor via the JavaScript on that page (or anything in the page that is causing CPU/memory usage).
Does anybody know of a plugin that does this, or something similar? Has anyone else done this kind of inspection another way?
I know about FireBug, but I can't imagine how I would use it to finger which tab is using a lot of resources.
Any suggestions or insights?
It's probably the awesome firefox3 fsync "bug", which is a giant pile of fail.
In summary
Firefox3 saves its bookmarks and history in an SQLite database
Every time you load a page it writes to this database several times
SQLite cares deeply that you don't lose your bookmarks, so each time it writes, it instructs the kernel to flush its database file to disk and ensure that it's fully written
Many variants of linux, when told to flush like that, flush EVERY FILE. This may take up to a minute or more if you have background tasks doing any kind of disk intensive stuff.
The kernel makes firefox wait while this flush happens, which locks up the UI.
So, my question is, is there a way to have some kind of process explorer, or task manager sort of thing for Firefox 3?
Because of the way Firefox is built this is not possible at the moment. But the new Internet Explorer 8 Beta 2 and the just announced Google Chrome browser are heading in that direction, so I suppose Firefox will be heading there too.
Here is a post (Google Chrome Process Manager) by John Resig, of Mozilla and jQuery fame, on the subject.
There's a thorough discussion of this that explains all of the fsync related problems that affected pre-3.0 versions of FF. In general, I have not seen the behaviour since then either, and really it shouldn't be a problem at all if your system isn't also doing IO intensive tasks. Firebug/Venkman make for nice debuggers, but they would be painful for figuring out these kinds of problems for someone else's code, IMO.
I also wish that there was an easy way to look at CPU utilization in Firefox by tab, though, as I often find myself with FF eating 100% CPU, but no clue which part is causing the problem.
XUL Profiler is an awesome extension that can point out extensions and client side JS gone bananas CPU-wise. It does not work on a per-tab basis, but per-script (or so). You can normally relate those .js scripts to your tabs or extensions by hand.
It is also worth mentioning that Google Chrome has built-in a really good task manager that gives memory and CPU usage per tab, extension and plugin.
[XUL Profiler] is a JavaScript profiler. It shows elapsed time in each method as a graph, as well as browser canvas zone redraws, to help track down CPU-consuming chunks of code.
It traces all JS calls and paint events in XUL and page contexts, and builds an animation showing dynamically the canvas zones being redrawn.
As of FF 3.6.10 it is not up to date in that it is not marked as compatible anymore. But it still works and you can override the incompatibility with the equally awesome MR Tech Toolkit extension.
There's no "process explorer" kind of tool for Firefox; but there's https://developer.mozilla.org/en-US/docs/Archive/Mozilla/Venkman with profiling mode, which you could use to see the time spent by chrome (meaning non-content, that is not web-page) scripts.
From what I've read about it, DTrace might also be useful for this sort of thing, but it requires creating a custom build and possibly adding additional probes to the source. I haven't played with it myself yet.
