I'm using box2dweb version 2.1.a.3 (JavaScript, ported from Flash) to create a game. Some examples that I found via Google use:
setInterval(
    function() {
        world.Step(1/60, 10, 10);
        world.ClearForces();
    },
    1000/60
);
I tried removing the line world.ClearForces(), but things behaved the same. I wonder what the ClearForces() function does. What trouble could I get into if I remove it like that? Thanks!
I can't say for sure about the Flash and JavaScript versions, but the ClearForces function was originally necessary in early versions of Box2D. Back then, if you called ApplyForce to move an object, that force would remain in effect indefinitely; now you need to call ApplyForce every time step if you want a continuous force. So effectively, the engine is calling ClearForces for you every step. If you can take it out without changing anything, you might as well.
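For illustration, here is a minimal sketch of what that implies in box2dweb (the playerBody, the force values, and the applyThrust helper are assumptions, not from the original code): a continuous push has to be re-applied inside every step.

var b2Vec2 = Box2D.Common.Math.b2Vec2;

// re-apply the force on every step; it no longer persists between steps
function applyThrust(body) {
    body.ApplyForce(new b2Vec2(0, -50), body.GetWorldCenter());
}

setInterval(function () {
    applyThrust(playerBody); // playerBody: some b2Body created earlier via world.CreateBody
    world.Step(1/60, 10, 10);
    world.ClearForces();
}, 1000/60);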
I am using Cytoscape.js 2.7.15 for my graduation project, and I need to make some simple visualizations like changing the labels of nodes.
subjectNode.style('label',myDesiredLabelToshow);
works for me, but I am calling it in a for loop, and when I slow it down or step through in debug mode to see how my algorithm labels them, the node labels do not change immediately; they all change together after my function ends (i.e. when execution leaves the function's scope).
I tried using cy.batch() and cy.startBatch(), and even tried setTimeout, but nothing worked.
After stepping through cytoscape.js in debug mode, I saw a function o.requestAnimationFrame = function...; only after the debugger passes that point are the changes applied to my graph. How can I trigger it manually from my custom functions?
Pretty much any renderer is going to use a render loop using requestAnimationFrame(). That means it's async, and you won't see results while stepping.
The answer is not to use breakpoints or stepping, because those methods assume everything works synchronously.
Use a button in your UI to "step" through points in your algorithm, or use animations in a promise chain to visualise progression. Or forgo showing debugging cues in the UI itself and just use console.log().
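For example, a rough sketch of the promise-chain approach (the node selectors, labels, and delay value are assumptions): each step applies one label change and then waits, so every change gets its own frame to render.

function delay(ms) {
    return new Promise(function (resolve) { setTimeout(resolve, ms); });
}

var steps = [
    { selector: '#a', label: 'visited 1st' },
    { selector: '#b', label: 'visited 2nd' },
    { selector: '#c', label: 'visited 3rd' }
];

steps.reduce(function (chain, step) {
    return chain.then(function () {
        cy.$(step.selector).style('label', step.label); // same setter as in the question
        return delay(500); // give the renderer time to paint before the next step
    });
}, Promise.resolve());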
I'm building a real-time layout.
If the server sends packets every 200 ms, then Function Call + Recalculate Style + Layout + Paint must take less than 200 ms.
Using performance.mark with performance.measure, or just console.time('1') with console.timeEnd('1'), I can only measure Function Call, which is not enough.
Is there any known way to place some sort of anchors to get and log a number that includes Paint?
That will be used for automated performance testing.
Thanks in advance!
It isn't possible to log Paint times using the Console API. It looks as though there was an attempt to integrate this into WebKit several years ago, but this never got implemented. Right now, you can only do CPU profiling using console.profile, but this isn't relevant for you.
You need to explicitly run the Timeline tool to gather Paint profiling data. You could look into using macros to run this. You can export your data into a JSON file, so that you can re-import and compare each one. It's not optimal for automated testing, but I'm not sure of other ways.
This isn't going to include the paint time itself, but requestAnimationFrame will be called just before the paint.
I also tried setTimeout(fn, 0), but it turned out to be called fairly long after the paint.
If all you want to know is that there is some time left at the end of a frame you might be able to use requestIdleCallback.
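As a rough sketch of that idea (the measureFrame name and the logging format are assumptions, and this is an approximation rather than a true Paint measurement): mark the start when the data arrives, read the clock again inside requestAnimationFrame just before the paint, and once more in a nested requestAnimationFrame for an upper bound spanning the whole frame.

function measureFrame(label, applyUpdate) {
    var start = performance.now();
    applyUpdate(); // your Function Call: apply the DOM/style changes for this packet

    requestAnimationFrame(function () {
        var beforePaint = performance.now(); // fires just before the paint
        requestAnimationFrame(function () {
            var nextFrame = performance.now(); // fires after that frame was handed off
            console.log(label,
                'to pre-paint:', (beforePaint - start).toFixed(1), 'ms,',
                'to next frame:', (nextFrame - start).toFixed(1), 'ms');
        });
    });
}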
How can I control the rendering loop frame rate in KineticJS? The docs for Kinetic.Animation show a frame rate being passed to the render callback, and Kinetic.Tween seems to have no frame rate logic, but I don't see any way to force, say, 30fps when 60fps is possible.
Loads of context for the curious follows, but the question is that simple. If anyone reads on, other advice is welcome. If you already know the answer, don't waste your time reading on!
I'm developing a music app that combines some DOM-based GUI controls (current iteration using jQuery Mobile) and Canvas-based GUI controls (using KineticJS). The latter involve some animation. Because the animated elements are triggered by music playback, I'm using Kinetic.Tween to avoid the complexity of remembering how long a given note has been playing (which Kinetic.Animation would require doing).
This approach works great at 60fps in Chrome (on a fast machine) but is just slow enough on iOS 6.1 Safari (iPad 2) that manipulating controls while animations are happening gets a little janky. I'm not using WebGL (unless KineticJS or Chrome does this by default for canvas?), and that's not an option when I package for native UIWebView.
As I'm getting beyond prototype into wanting to make more committed tech decisions, I see the following options, in order of perceived goodness:
Figure out how to cap the frame rate. Because my animations heavily use alpha fades but do not involve motion, I believe I could get away with 20-30fps and look fine. Could also scale this up on faster devices.
Don't respond immediately to touch inputs, but add them to a queue which I poll at a constant interval and only use the freshest for things like touchmove. This has no impact on my non-interactive animated elements, but tackles the problem from the other direction, trying to reduce the load of user interaction. This would require making Kinetic controls static and manually tracking touch coordinates (not terrible effort if it actually helped).
Rewrite DOM-based GUI to canvas-based (KineticJS); rewrite WebAudio-based engine to HTML5 audio; leverage CocoonJS or Ejecta for GPU-acceleration. This means having to hand-code stuff like file choosers and nav menus and such (bad). Losing WebAudio is pretty serious as it eliminates features like DSP effects and very fine-grained, low-latency timing (which is working just fine on an iPad 2).
Rewrite the app to separate DOM based GUI and WebAudio from Canvas-based elements, leverage CocoonJS. I'm not sure if/how well this works out, but the fact that CocoonJS passes JavaScript code as strings between the 2 components makes me very skittish about how solid this idea is. It's probably doable, but best case I'm very tied to CocoonJS moving forwards. I don't like architecting this way, but maybe it's not as bad as it sounds?
Make animations less juicy. This is least good not because of its design impact but because, as it is, I'm only animating ~20 simple shapes at any time in my central view component, however they include transparency and span an area ~1000x300. Other components like sliders are similarly bare-bones. In other words, it's not very juicy right now.
Overcome severe allergy to Objective-C; forget about the browser, Android, and that other mobile OS. Have a fast app that performs natively and has shiny Apple-approved widgets. My biggest problem with this approach is not wanting to be stuck in Objective-C reality for years, skillset-wise. I just don't like it.
Buy an iPad 3 or later. Since I already am pretending Android doesn't exist (I don't have any devices to test), why not pretend no one still has iPad 2? I think this is passing the buck -- if I can get acceptable performance on iPad 2, I will feel confident about the app's performance as I add more features.
I may be overlooking options or otherwise naive about how to tackle this. Some would say what I'm trying to build is just silly. But it's working pretty well just not ready for prime time on the iPad 2.
Yes, you can control the Kinetic.Animation framerate
The Kinetic.Animation sends in a frame object which has a frame.time property.
That .time is a running timer that you can use to throttle your animation speed.
Here's an example that throttles the Kinetic.Animation: http://jsfiddle.net/m1erickson/Hn3cC/
var lastTime;
var frameDelay = 1000;

var loop = new Kinetic.Animation(function(frame) {
    var time = frame.time;
    if (!lastTime) { lastTime = time; }
    var elapsed = time - lastTime;
    if (elapsed >= frameDelay) {
        // frameDelay has expired, so animate stuff now
        // set lastTime for the next loop
        lastTime = time;
    }
}, layer);

loop.start();
Working from #markE's suggestions, I tried a few things and found a solution. It's ultimately not rocket science, but sharing what I figured out:
First, tried the hack of doubling Tween durations and targets, using a timer to stop them at 50%. This kinda sorta worked but was hard to get to look good and was pretty error prone in coding bogus targets like negative opacity or height or whatnot.
Second, having read the source to Tween and looked at docs again for Animation, decided I could locally use Animation instances instead of Tween instances, and allow the closure scope to hang onto the relevant note properties. Eventually got this working smoothly and finally realized a big Duh! which is that throttling the frame rate of several independently animating things does not in any way throttle the overall frame rate.
Lastly, decided to give my component a render() method that calls itself in a loop with requestAnimationFrame, exits immediately if called before my clamp time, and inside render() I update all objects in the Kinetic canvas and call layer.drawScene(). Because there is now only one animation, this drops frame rate to whatever I need and the app is fast on iPad 2 (looks exactly the same to my eyes too).
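A rough sketch of that render loop (clampMs, updateNoteStates, and the 30fps target are placeholders, not the original code):

var clampMs = 1000 / 30; // target ~30fps; raise or lower per device
var lastDraw = 0;

function render(now) {
    requestAnimationFrame(render);
    if (now - lastDraw < clampMs) {
        return; // called again before the clamp time: skip this frame entirely
    }
    lastDraw = now;

    updateNoteStates(now); // placeholder: compute each shape's current opacity etc.
    layer.drawScene();     // one draw call for the whole Kinetic layer
}

requestAnimationFrame(render);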
So Kinetic is still helping for its higher level canvas API, and so far my other control widgets are still easy code using Kinetic to handle user input and dragging, now performing much better as the big beast component is not eating up the CPU.
The short answer to my original question is that no, you can't lock the overall frame rate for very complex animations, but as Mark said, you can for anything that fits in a single Animation instance.
Note that I could have still used Animation without giving it a layer or explicitly calling any draw() methods, but since I'd still have to write all the logic to determine individual element's current visual state, there was no gain to doing this. What would be very useful would be if Tween could accept a parameter to not automatically render. This would simplify code like mine, as I could shorthand the animation on individual objects but still choose when to actually do the heavy lifting of rendering everything. Seeing how much this whole exercise gained in performance on the iPad 2, might be worth adding this option to the framework.
I have a Paper.path() in Raphael that is filled with a simple texture:
var fill = screen.path(Iso.topFacePath(top)).attr({
    fill: 'url(http://www.example.com/mytexture.jpg)'
});
The path can be altered by the user via drag and drop. For this I use Element.drag() to bind the handlers.
The problem that I encounter now is that while the onmove handler is being called, the element in question gets recalculated and has to be drawn again. Apparently this is "too much" for Raphael, and the fill pattern will disappear randomly (flicker) and appear again some time later (at the latest onend).
The actual code I use is a little too much to post here but I built a fiddle where you can see what's going on (you can drag the upper sides of the quadrangle).
Is there a simple fix to this?
I am used to canvas much more than Raphael (actually this is the first time I've really used Raphael), so maybe my approach of redrawing everything every time something changes is plain wrong?
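For reference, a rough sketch of that pattern (moveCorner and the drag wiring are placeholders standing in for the fiddle's real code): the path is recomputed in the onmove handler and re-applied, which is where the bitmap fill gets reloaded.

var face = screen.path(Iso.topFacePath(top)).attr({
    fill: 'url(http://www.example.com/mytexture.jpg)'
});

face.drag(
    function (dx, dy) {                              // onmove: fires on every mouse move
        var moved = moveCorner(top, dx, dy);         // placeholder for the real drag math
        face.attr({ path: Iso.topFacePath(moved) }); // redraw; the fill reloads here
    },
    function () { /* onstart */ },
    function () { /* onend: the fill reliably reappears by this point */ }
);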
EDIT: I just found out that this seems to be somewhat browser-related as well. Chrome and Firefox produce the flicker, whereas Safari seems to handle everything just fine.
This seems to be a caching issue (raphael.js does not cache the bitmap fill and will reload it on every change) and is fixed (for me) by this pull request on GitHub, which is (as of 08/14/2012) still pending.
Raphael is pretty hard / impossible to build yourself, as the makefile points to local and/or nonexistent files, but you can either concatenate everything by hand, modify the build script, or use the modified build that is used in the example to get hold of the fix.
Let's hope it will find its way into a future release of Raphael.
Is there a standard (accepted/easy/performant) way to determine how fast a client machine renders javascript?
When I'm running web apps (videos, etc.) in my other tabs, my JS animations slow to a crawl.
If I could detect slowness from my JS, I would use simpler animations to provide a better user experience.
Update:
Removing animations for everyone is not the answer. I am talking about the simplest of animations which will stutter depending on browser / computer. If I could detect the level of slowness, I would simply disable them.
This is the same as video games with dynamic graphics quality: you want to please people with old computers without penalizing those who have the extra processing power.
One tip is to disable those hidden animations. If they are on another tab that is not in focus, what's the use of keeping them animated?
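For instance, a small sketch using the Page Visibility API (pauseAnimations and resumeAnimations are placeholders for whatever drives your animations):

document.addEventListener('visibilitychange', function () {
    if (document.hidden) {
        pauseAnimations();  // placeholder: stop your animation loop / tweens
    } else {
        resumeAnimations(); // placeholder: start them again
    }
});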
Another is to keep animations to a minimum. I assume you are working with the DOM, and DOM operations are expensive, so keep those to a minimum as well.
One tip I got somewhere is that if you are doing image animation or manipulation, consider using canvas instead so that you are not operating on the DOM.
Also, consider progressive enhancement. Keep your features simple and work your way up to complicated things. Use the simple features as a baseline every time you add something new. That way, you can easily determine what causes the problem and fix it accordingly.
The main problem you should first address is why it is slow, not when it is slow.
I know this question is old, but I've just stumbled across it. The simplest way is to execute a long loop and measure the start and end time. This should give you some idea of the machine's Javascript performance.
Please bear in mind, this may delay page loading, so you may want to store the result in a cookie, so it's not measured on every visit to the page.
Something like:
var starttime = new Date();
for (var i = 0; i < 1000000; i++) ; // empty busy loop as a rough CPU benchmark
var dt = new Date() - starttime;    // elapsed time in milliseconds
Hope this helps.
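If it helps, here is one way to combine the two ideas above (the jsSpeed cookie name and the 50 ms threshold are arbitrary assumptions): run the loop once, cache the result in a cookie, and fall back to simpler animations on slow machines.

function getJsSpeed() {
    var cached = document.cookie.match(/(?:^|; )jsSpeed=(\d+)/);
    if (cached) {
        return parseInt(cached[1], 10); // already measured on a previous visit
    }

    var starttime = new Date();
    for (var i = 0; i < 1000000; i++) ;
    var dt = new Date() - starttime;

    document.cookie = 'jsSpeed=' + dt + '; max-age=604800'; // remember for a week
    return dt;
}

var useSimpleAnimations = getJsSpeed() > 50;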