This might be a shot in the dark, but I have no idea what's causing this.
I've developed a game engine with webgl. My main testing browser has been firefox and everything works perfectly. No lag or random stutters, even if I'm doing more intense things like blending with multiple framebuffers.
However, on Chrome it's a whole other story. Chrome struggles to keep a stable fps even when running the most simple tasks. I created an experiment to see whether the problem was in my code or in the requestAnimationFrame loop. This is the code I ran:
<!DOCTYPE html>
<html>
<body>
<div id="fpsCounter"></div>
Lowest fps
<div id="minFps"></div>
<br>
Highest fps
<div id="maxFps"></div>
<script>
var minFps = 999;
var maxFps = 0;
var fps = 0;
var last = performance.now();
var now;
var fpsUpdateTime = 20;
var fpsUpdate = 0;
var fpsCounter = document.getElementById("fpsCounter");
var minFpsEle = document.getElementById("minFps");
var maxFpsEle = document.getElementById("maxFps");
function timestamp(){
    return window.performance && window.performance.now ? window.performance.now() : new Date().getTime();
}
var getMaxFps = false;
setTimeout(function(){
    getMaxFps = true;
}, 2000);
function gameLoop(){
    now = performance.now();
    if(fpsUpdate == 0){
        fps = 1000 / (now - last);
        fpsUpdate = fpsUpdateTime;
    }
    fpsUpdate--;
    fpsCounter.innerHTML = fps;
    if(parseInt(fps, 10) < parseInt(minFps, 10)){
        minFps = parseInt(fps, 10);
        minFpsEle.innerHTML = minFps;
    }
    if(parseInt(fps, 10) > parseInt(maxFps, 10) && getMaxFps){
        maxFps = parseInt(fps, 10);
        maxFpsEle.innerHTML = maxFps;
    }
    last = now;
    requestAnimationFrame(gameLoop);
}
gameLoop();
</script>
</body>
</html>
All the code does is loop with requestAnimationFrame and put the fps into a div. On Firefox this works just as well as the whole game engine did: it keeps an average of about 58 fps and never dips below 52. Chrome struggles to stay above 40 fps and frequently dips below 28. Oddly enough, Chrome has frequent bursts of speed; the highest fps Chrome reached was 99, but that's somewhat beside the point, since a stable 60 fps is what matters.
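Incidentally, sampling the instantaneous delta only every 20th frame makes the reported fps very noisy. A rolling average over recent frame deltas gives a steadier reading; here is a minimal sketch (the factory name and the window of 60 samples are my own choices, not from the snippet above):

```javascript
// Returns a sampler: call it with performance.now() each frame and it
// reports fps averaged over the last `sampleSize` frame deltas.
function createFpsMeter(sampleSize) {
    var deltas = [];
    var last = null;
    return function sample(now) {
        if (last !== null) {
            deltas.push(now - last);
            if (deltas.length > sampleSize) deltas.shift();
        }
        last = now;
        if (deltas.length === 0) return 0;
        var sum = deltas.reduce(function (a, b) { return a + b; }, 0);
        return 1000 / (sum / deltas.length);
    };
}
```

In a loop like the one above you would create `var meter = createFpsMeter(60);` once and call `fps = meter(performance.now());` every frame.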
Details:
Firefox version: 55.0.2 (64-bit)
Chrome version: 60.0.3112.78 (official build, 64-bit)
OS: Ubuntu 16.04 LTS
RAM: 8 GB
GPU: GTX 960M
CPU: Intel Core i7 (HQ series)
This is how performance looks in Chrome:
I made this minimalistic HTML page for a test:
<!DOCTYPE html>
<html>
<head>
<title>requestAnimationFrame</title>
</head>
<body>
<canvas id="canvas" width="300" height="300"></canvas>
<script>
"use strict"
// (tested in Ubuntu 18.04 and Chrome 79.0)
//
// requestAnimationFrame is not precise
// often SKIPs a frame
//
var ctx = document.getElementById("canvas").getContext("2d") // look up once, not every frame
function loop() {
    requestAnimationFrame(loop)
    ctx.fillStyle = "red"
    ctx.fillRect(100, 100, 200, 100)
}
loop()
</script>
</body>
</html>
summary - dev tools image
There is no memory leak problem.
The scripting execution time is negligible.
fps - dev tools image
The FPS has inconsistent behaviour (running Chrome on Ubuntu).
In this test the problem was hardware acceleration: the FPS was fine when hardware acceleration was disabled.
EDITED
I have done more tests with a page containing just a single canvas.
My conclusion is that browsers are too complex (or buggy) to run smoothly 100% of the time.
my architecture for games
var previousTimeStamp = 0

function mainLoop(timeStamp) {
    if (! shallSkipLoop(timeStamp)) { gameLoop() }
    requestAnimationFrame(mainLoop)
}

function gameLoop() {
    // some code here
}

function shallSkipLoop(timeStamp) {
    var deltaTime = timeStamp - previousTimeStamp
    previousTimeStamp = timeStamp
    //
    // skip bad frames that arrive less than 1000 / 60 ms after the previous one;
    // this happens when the browser executes a frame too late
    // and tries to be on time for the next screen refresh,
    // but it may then start a long sequence of unsynced frames:
    // one very short (like 5 ms), the next very long (like 120 ms)
    // maybe it is a bug in the browser
    //
    return deltaTime < 16
}

requestAnimationFrame(mainLoop)
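An alternative to skipping short frames is a fixed-timestep accumulator: run the game logic zero or more times per rAF callback, so the simulation advances at a constant rate no matter how unevenly the frames arrive. A sketch (the names here are mine, not from the architecture above):

```javascript
// Accumulates elapsed time and runs `update` once per fixed step
// (e.g. 1000 / 60 ms); returns how many steps ran this frame.
function createStepper(step) {
    var accumulator = 0;
    var previous = null;
    return function advance(timeStamp, update) {
        if (previous === null) previous = timeStamp;
        accumulator += timeStamp - previous;
        previous = timeStamp;
        var steps = 0;
        while (accumulator >= step) {
            update();
            accumulator -= step;
            steps++;
        }
        return steps;
    };
}
```

In the main loop you would call `advance(timeStamp, gameLoop)` once per rAF callback and let rendering happen every frame; a very late frame then simply runs the game logic twice rather than producing one oversized delta.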
Related
I am having issues using the Web Audio API with JavaScript.
The problem is that I hear glitches in the sounds played in my browser, even though I have used a gainNode to gradually increase/decrease the gain when the sound starts/stops.
The audio file is simply 60 seconds of a 400 Hz tone, to demonstrate the issue. In the demo I play a snippet from time point 2.0 seconds for a 1 second duration; within this duration I ramp up for 100 ms, and at 800 ms I begin to ramp down over 199 ms. This is an attempt to avoid a non-zero-crossing glitch. I use gainNode.gain.setTargetAtTime() but also tried exponentialRampToValueAtTime(). In this example I repeat at time point 52 seconds.
At the beginning of the code I call audioContext.resume() to trigger the audio facility of the browser.
<!DOCTYPE html>
<html>
<head>
<title>My experiment</title>
<audio id="audio" src="pure_400Hz_tone.ogg" preload="auto"></audio>
</head>
<body>
<div id="jspsych_target"></div>
<button onclick="dummyPress()">Press to Activate Audio</button>
<button onclick="playTheTones()">sound the tone</button>
</body>
<script>
console.log("setting up audiocontext at ver 28 ");
const audioContext = new AudioContext();
const element = document.querySelector("audio");
const source = audioContext.createMediaElementSource(element);
const gainNode = audioContext.createGain();
gainNode.gain.setValueAtTime(0, audioContext.currentTime);
source.connect(gainNode);
gainNode.connect(audioContext.destination);
function dummyPress(){
    audioContext.resume();
    playTheTones();
}

function playTheTones(){
    // ******* The First Tone ***********
    // **********************************
    source.currentTime = 2;
    gainNode.gain.setTargetAtTime(1.0, audioContext.currentTime, 0.1);
    var g = setTimeout(function(){
        gainNode.gain.setTargetAtTime(0.0001, audioContext.currentTime, 0.199);
        console.log("start Down # " + source.currentTime);
    }, 800);
    source.mediaElement.play();
    console.log("PLAYING 2 now # " + source.currentTime);
    var k = setTimeout(function(){
        source.mediaElement.pause();
        console.log("STOPPED # " + source.currentTime);
    }, 1100);
    // ******* The Second Tone ***********
    // **********************************
    setTimeout(function(){
        source.currentTime = 52;
        gainNode.gain.setTargetAtTime(1.0, audioContext.currentTime, 0.1);
        var h = setTimeout(function(){
            gainNode.gain.setTargetAtTime(0.0001, audioContext.currentTime, 0.199);
            console.log("start Down # " + source.currentTime);
        }, 800);
        source.mediaElement.play();
        console.log("PLAYING 52 now # " + source.currentTime);
        var j = setTimeout(function(){
            source.mediaElement.pause();
            console.log("STOPPED # " + source.currentTime);
        }, 1100);
    }, 1500);
}
</script>
</html>
Unfortunately I think I have confused myself while trying to resolve the glitch issues; I may not be following best practice with the API, and this might be causing my problem.
Would someone look at the code, point out whether I am using the API correctly, and confirm that I should be able to present tones this way without glitching?
Thanks
I found the problem
gainNode.gain.setTargetAtTime(0.0001, audioContext.currentTime, 0.199);
The third parameter is a 'time constant', not a 'time duration', so 0.199 was far too large and the gain did not diminish rapidly enough, causing the glitch. Setting it to 0.01 cures the issue!
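For context, setTargetAtTime() approaches the target exponentially: value(t) = target + (v0 − target) · e^(−t/timeConstant), so after one time constant only about 63% of the change has happened, and it takes roughly five time constants to get close to the target. A sketch of that formula (the helper name is mine):

```javascript
// Gain t seconds after setTargetAtTime(target, startTime, timeConstant)
// takes effect, starting from value v0 (the Web Audio spec formula).
function gainAfter(v0, target, timeConstant, t) {
    return target + (v0 - target) * Math.exp(-t / timeConstant);
}
```

With timeConstant 0.199, the gain is still roughly 0.22 when the element is paused 300 ms after the ramp-down starts, which is audible as a click; with 0.01 it has decayed to about e^−30, effectively zero, well before the pause.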
I am trying to create a test that explores the boundaries of our subconscious. I want to briefly display a number and see if the user can use their intuition to guess the value: is their subconscious able to read the number faster than their conscious self? So I am trying to flash a number onto the screen for a few milliseconds. Chrome does not seem to behave as well as Edge with this code. How can I make it work more consistently across browsers?
I have tried various ways of hiding and revealing the number. Finally ending up with this version.
<script>
    var numberOfPoints;

    function onLoad() {
        numberOfPoints = Math.floor(Math.random() * (99 - 9 + 1)) + 9;
        document.f.points.value = numberOfPoints;
        setTimeout(hideRun, 3000);
    }

    function hideRun() {
        hide();
        document.getElementById("hiddenNumber").innerHTML = numberOfPoints;
        document.getElementById("hiddenNumber").style.display = 'block';
        setTimeout(hide, 5);
    }

    function hide() {
        document.getElementById("hiddenNumber").style.display = 'none';
    }
</script>
<body onload="onLoad()">
<div id=hiddenNumber style="display: block;">GET READY</div>
</body>
In this case I am hoping to display the GET READY text for 3 seconds, then show a random number for 5 milliseconds. Although I have no way to actually measure it, the 5 milliseconds on Chrome lasts a lot longer than on Edge.
You can try it yourself here: Test Timer
Thinking in terms of time is not reliable here, because you don't know when the browser will paint to the screen, nor when the screen will do its v-sync.
So you'd better think of it in terms of frames.
Luckily, we can hook callbacks to the before-paint event using the requestAnimationFrame method.
let id = 0;
btn.onclick = e => {
    cancelAnimationFrame(id); // stop a potential previous long run
    let i = 0,
        max = inp.value;
    id = requestAnimationFrame(loop);

    function loop(t) {
        // only at the first frame
        if (!i) out.textContent = 'some dummy text';
        // until we reach the required number of frames
        if (++i <= max) {
            id = requestAnimationFrame(loop);
        } else {
            out.textContent = '';
        }
    }
};
Number of frames: <input type="number" min="1" max="30" id="inp" value="1"><button id="btn">flash</button>
<div id="out"></div>
Can you try a 2D canvas and see if that helps?
<html>
<head>
<script>
var numberOfPoints;
var canvas;
var context;

function onLoad() {
    canvas = document.getElementById("myCanvas");
    context = canvas.getContext("2d");
    context.font = "30px Arial";
    // context.fillText("...", 10, 50);
    numberOfPoints = Math.floor(Math.random() * (99 - 9 + 1)) + 9;
    setTimeout(hideRun, 3000);
}

function hideRun() {
    context.fillText(numberOfPoints, 10, 50);
    setTimeout(hide, 5);
}

function hide() {
    context.clearRect(0, 0, canvas.width, canvas.height);
}
</script>
</head>
<body onload="onLoad()">
<canvas id="myCanvas"></canvas>
</body>
</html>
In my tests, it seems to show the number more consistently than the CSS property, but to be absolutely sure I would recommend recording the screen at 60 fps and validating the cross-browser accuracy from the recording.
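If a recording setup isn't available, you can at least log how long the flash lasted according to the frame timestamps the browser actually delivered. A sketch with the show/hide callbacks and the rAF function injected (the names are mine; in a page you would pass `requestAnimationFrame` as `raf`):

```javascript
// Shows content on the next frame, hides it on the first frame at least
// `durationMs` later, and reports the measured visible time.
function flashMeasured(show, hide, durationMs, raf, report) {
    raf(function (startT) {
        show();
        (function wait(t) {
            if (t - startT >= durationMs) {
                hide();
                report(t - startT); // actual visible time, not the requested one
            } else {
                raf(wait);
            }
        })(startT);
    });
}
```

On a 60 Hz display a requested 5 ms flash will typically report about 16.7 ms, since nothing shorter than one frame can be displayed.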
I have the following two pieces of code (awful, but I have no idea what I'm doing):
var stage = new createjs.Stage("canvas");
createjs.Ticker.on("tick", tick);

// Simple loading for demo purposes.
var image = document.createElement("img");
image.src = "http://dossierindustries.co/wp-content/uploads/2017/07/DossierIndustries_Cactus-e1499205396119.png";
var _obstacle = new createjs.Bitmap(image);

setInterval(clone, 1000);

function clone() {
    var bmp = _obstacle.clone();
    bmp.x = Math.floor((Math.random() * 1920) + 1);
    bmp.y = Math.floor((Math.random() * 1080) + 1);
    stage.addChild(bmp);
}

function tick(event) {
    stage.update(event);
}
<script>
    $j = jQuery.noConflict();
    jQuery(document).ready(function($){
        var interval = 1;
        setInterval(function(){
            if(interval == 3){
                $('canvas').show();
                interval = 1;
            }
            interval = interval + 1;
            console.log(interval);
        }, 1000);
        $(document).bind('mousemove keypress', function() {
            $('canvas').hide();
            interval = 1;
        });
    });
</script>
<script src="https://code.createjs.com/easeljs-0.8.2.min.js"></script>
<canvas id="canvas" width="1920" height="1080"></canvas>
Basically, what I'm hoping to achieve is that when a user is inactive for x amount of time, the full page (no matter its size) slowly fills with the repeated image. When anything happens, the images all clear, and it begins again after the set amount of inactivity.
The code above relies on an external resource, which I'd like to avoid, and it needs to work on WordPress.
Site is viewable at dossierindustries.co
Rather than interpret your code, I made a quick demo showing how I might approach this.
The big difference is that drawing new images over time is going to add up (they have to get rendered every frame), so this approach uses a cached container with one child, and each tick it just adds more to the cache (similar to the "updateCache" demo on GitHub).
Here is the fiddle.
http://jsfiddle.net/dcs5zebm/
Key pieces:
// Move the contents each tick, and update the cache
shape.x = Math.random() * stage.canvas.width;
shape.y = Math.random() * stage.canvas.height;
container.updateCache("source-over");

// Only do it when idle
function tick(event) {
    if (idle) { addImage(); }
    stage.update(event);
}

// Use a timeout to determine when idle. Clear it when the mouse moves.
var idle = false;
document.body.addEventListener("mousemove", resetIdle);

function resetIdle() {
    clearTimeout(this.timeout);
    container.visible = false;
    idle = false;
    this.timeout = setTimeout(goIdle, TIMEOUT);
}
resetIdle();

function goIdle() {
    idle = true;
    container.cache(0, 0, stage.canvas.width, stage.canvas.height);
    container.visible = true;
}
Caching the container means this runs at the same speed forever (no growing per-frame cost), but you still have control over the rest of the stage (instead of just turning off auto-clear). If you have more complicated requirements you can get fancier, but this basically does what you want, I think.
I'm playing with deviceorientation in JavaScript and I noticed some differences between my iPad (iOS 6.1) and my Nexus 7 (Android 4.2.2).
This code does not print the same data on the iPad and the Nexus 7.
<html>
<head/>
<body>
<button id="calibrate">Calibrate</button>
<button id="stop">Stop</button>
<button id="play">Play</button>
<div id="log"><p></p></div>
<script>
    var log = document.getElementById('log');
    var calibrate = false;
    var calibrateG = 0, calibrateB = 0, calibrateA = 0;

    var deviceorientation = function(e) {
        if (calibrate) {
            calibrateG = e.gamma;
            calibrateB = e.beta;
            calibrateA = e.alpha;
            calibrate = false;
        }
        var gamma = parseInt(e.gamma - calibrateG);
        var beta = parseInt(e.beta - calibrateB);
        var alpha = parseInt(e.alpha - calibrateA);
        var p = document.createElement('p');
        p.innerHTML = gamma + ' ' + beta + ' ' + alpha;
        log.insertBefore(p, log.firstChild);
    }

    document.getElementById('stop').onclick = function() {
        window.removeEventListener('deviceorientation', deviceorientation);
    };
    document.getElementById('play').onclick = function() {
        window.addEventListener('deviceorientation', deviceorientation);
    };
    document.getElementById('calibrate').onclick = function() {
        calibrate = true;
    };

    window.addEventListener('deviceorientation', deviceorientation);
</script>
</body>
</html>
At startup, Android prints 0 0 270 and iOS prints 0 0 0.
Then, when I move both in the same way, they don't print the same values.
Can someone explain why, and whether there is a way to normalize the data?
UPDATE #1
I have already tried some calibrations, and I take landscape/portrait into account.
To reproduce: take the code above, and put the iPad and Nexus 7 in portrait in front of you.
Calibrate both (first button).
Then take the right corner of the tablet and rotate it until the tablet reaches 90 degrees.
The tablet should end up on its left side.
On Android the gamma goes from 0 to -80 and then jumps to 270.
On iOS the gamma goes from 0 to -180 without any jump.
Full Tilt JS normalizes the data values between the Android and iOS deviceorientation implementations. It also ensures deviceorientation data remains consistent whenever the user rotates their screen.
This article provides a summary of some of the techniques used in the Full Tilt JS library.
Disclaimer: I am the author of both the article and the library above. Please give it a try and report any issues directly on the GitHub project.
If you need all three values for an application or game, you could prompt the user to "hold their device up straight" and record the initial values, then compute offsets (deltas) from those values. You could even save that initial calibration to localStorage so it doesn't need to be repeated.
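A sketch of that calibration idea, with the storage object injected so the logic isn't tied to the browser (the key and function names are my own):

```javascript
// Save a one-time baseline from a deviceorientation event, then report
// later events relative to it. `storage` can be window.localStorage.
function saveCalibration(e, storage) {
    storage.setItem('orientationBaseline',
        JSON.stringify({ alpha: e.alpha, beta: e.beta, gamma: e.gamma }));
}

function relativeOrientation(e, storage) {
    var raw = storage.getItem('orientationBaseline');
    var base = raw ? JSON.parse(raw) : { alpha: 0, beta: 0, gamma: 0 };
    return {
        alpha: e.alpha - base.alpha,
        beta: e.beta - base.beta,
        gamma: e.gamma - base.gamma
    };
}
```

Note this only removes a constant offset (like Android's 270-degree starting alpha); it does not fix the differing wrap-around ranges between platforms, which is the harder part that a normalization library handles.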
If all you need is landscape or portrait, just compare window.innerWidth with window.innerHeight, or something equally trivial.
I have a problem with JavaScript animation in IE9. Other browsers (Firefox, Opera, Chrome, Safari) work well, but the animation in IE is not smooth. For example, see the line which can be dragged from left to right (link at the end of the post).
javascript code:
var w = 1250;
var h = 650;
var drawing = Raphael("obrazek",w,h);
var Ax = 50
var Ay = 50
var Ey = 500
var w = 1250;
var h = 650;
var drawing = Raphael("obrazek",w,h);
var Ax = 50

function onDragMove(dx,dz) {
    this.onDragUpdate(dx - (this.deltax || 0), dz - (this.deltaz || 0));
    this.deltax = dx;
    this.deltaz = dz;
}

function onDragStart() { this.deltax = this.deltaz = 0; }
function onDragStop() { this.onDragStop(); }

// line 1
var Ax
var line = drawing.path([["M",Ax,Ay],["L",Ax,Ey]]).attr({"stroke-width":3})
line.drag(onDragMove,onDragStart)
line.attr({"cursor":"move"})
line.onDragUpdate = function(dx,dz) {
    Ax += dx
    line.attr({"path":[["M",Ax,Ay],["L",Ax,Ey]]})
}
and the corresponding HTML:
<html>
<head>
<script src="raphael.js"></script>
</head>
<body>
<div id="obrazek">
<script src="ietest.js"></script>
</div>
</body>
</html>
or see the problem in IE9 here and compare it with Chrome:
http://mech.fsv.cvut.cz/~stransky/ietest/ietest.html
Thanks in advance for any help.
Your page is missing a doctype, so it is rendered in quirks mode. IE9 uses VML instead of SVG in quirks mode, which probably results in slower rendering. Just add this as the first line of your HTML:
<!DOCTYPE html>
However, your code has some other problems:
Missing semicolons. There is a good explanation of how they can be dangerous.
Variable re-declarations and re-definitions.
When handling rapidly repeating events like mousemove or scroll, it is reasonable to use throttling to avoid redrawing/repainting glitches and performance problems. You can read more about it here. Include the plugin from that site and replace your drag binding with the following:
line.drag($.throttle(30, onDragMove), onDragStart);
In fact, even doing this without specifying the doctype can greatly improve the rendering performance, but there's no reason not to specify it as well.
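If you'd rather not pull in the plugin, a basic throttle is only a few lines. A sketch (the injectable clock parameter is my own addition so the behaviour can be tested deterministically; with no third argument it uses Date.now):

```javascript
// Returns a wrapper that invokes `fn` at most once per `wait` ms;
// calls arriving inside the window are simply dropped.
function throttle(wait, fn, now) {
    now = now || Date.now;
    var last = -Infinity; // so the first call always runs
    return function () {
        var t = now();
        if (t - last >= wait) {
            last = t;
            fn.apply(this, arguments);
        }
    };
}
```

Usage would mirror the plugin version: `line.drag(throttle(30, onDragMove), onDragStart);`. Unlike the plugin, this sketch has no trailing call, so the very last mousemove inside a window is lost; for a drag handler that is usually acceptable.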