I launched this site yesterday (a site for live-editing three.js examples) and found that when updating the code or navigating through multiple example files, the frame rate skyrockets to around 1000 fps.
The first mention of this is here. I'm not sure why the frame rate would increase after an update. The WebGL canvas is inside an iframe, and I'm updating the iframe content with this code (the iframe has an id of 'preview'):
var previewFrame = document.getElementById('preview');
var preview = previewFrame.contentDocument || previewFrame.contentWindow.document;
preview.open();
preview.write(this.props.code);
preview.close();
Does anyone have an idea how to resolve this? The editing is done with CodeMirror and the site is built with React. All source code is in the repo here.
My guess is you're starting multiple requestAnimationFrame loops.
For example
let numLoops = 0;
const countElem = document.querySelector("#count");
const stats = new Stats();
document.body.appendChild(stats.domElement);

function loop() {
  stats.update();
  requestAnimationFrame(loop);
}

function startLoop() {
  ++numLoops;
  countElem.textContent = numLoops;
  requestAnimationFrame(loop);
}

startLoop();
document.querySelector("button").addEventListener('click', startLoop);
<script src="https://cdnjs.cloudflare.com/ajax/libs/stats.js/r16/Stats.min.js"></script>
<button>Click to add another requestAnimationFrame loop</button>
<div>Num Loops Running: <span id="count"></span></div>
The way I made my examples editable and then runnable on http://webglfundamentals.org is to run the examples in an iframe using a blob. Any time the user picks "update" I generate a new blob with their edited source and then set the iframe to that new blob's URL. This means the example gets completely reloaded, so any old code/loops/events/WebGL contexts, etc. are discarded by the browser.
You can see the code here; it is effectively:
function runLastVersionOfUsersCode() {
  var url = getSourceBlob(usersEditedHtml);
  someIFrameElement.src = url;
}

var blobUrl;
function getSourceBlob(htmlForIFrame) {
  // if we already did this, discard the old one, otherwise
  // it will stick around wasting memory
  if (blobUrl) {
    URL.revokeObjectURL(blobUrl);
  }
  var blob = new Blob([htmlForIFrame], {type: 'text/html'});
  blobUrl = URL.createObjectURL(blob);
  return blobUrl;
}
If you look at the actual code for getSourceBlob you'll see it does a little more work, but that's basically it above.
To build on gman's helpful answer, I used cancelAnimationFrame in the React render loop (not the three.js render loop). The commit is here: https://github.com/ekatzenstein/three.js-live/commit/2cad65afa5fe066618a7aac179e096ee9e29ed76
// in the iframe
window.parent.three_live = requestAnimationFrame(animate);

// in the parent, on the render loop
_resetAnimationFrame() {
  // disables abnormally high frame rates
  if (window.three_live) {
    var previewWindow = document.getElementById('preview').contentWindow;
    previewWindow.cancelAnimationFrame(window.three_live);
  }
}
Doing window.parent wasn't strictly necessary, but I wanted the reference available globally in the project.
Related
I am making a browser game that uses sound effects (for example, shots and explosions). Every generated class instance also creates a new Audio object, which eats so much memory that the app slows to a crawl and crashes after two or three minutes. Is there a better way to do this? Maybe create the Audio object just once somewhere else and call it when needed, rather than every time an enemy, bullet, etc. is generated.
For example:
class Bullet extends Common {
  constructor() {
    super(); // required before touching `this` in a derived class
    this.element = document.createElement("div");
    this.audio = new Audio("./audio/LaserShot.wav");
  }
}
And in the upper class Spaceship I call it every time I shoot by pressing space:
executeShot() {
  const bullet = new Bullet(this.getCurrentPosition(), this.element.offsetTop, this.area);
  bullet.init();
  this.bullets.push(bullet);
}
Not sure if this works great in every scenario, but you can try the following code and see if it works.
<button class="btn">Click</button>
class AudioService {
  constructor(initialSetup = 1) {
    this._audios = [];
    for (let i = 0; i < initialSetup; i++) {
      this._audios.push(new Audio());
    }
  }

  /**
   * use to get an available audio element
   */
  _getAudioElemToPlay() {
    const audios = this._audios.filter(audio => {
      // free if the audio is empty (no valid url yet) or has finished playing
      // TODO: not sure if these attributes are more than enough
      return !audio.duration || audio.ended;
    });
    if (audios.length === 0) {
      const audio = new Audio();
      this._audios.push(audio);
      return audio;
    }
    return audios[0];
  }

  playAudio(url) {
    const player = this._getAudioElemToPlay();
    player.src = url;
    player.load();
    player.play();
  }
}

const audioService = new AudioService();
let index = 0;
document.querySelector('.btn').addEventListener('click', function() {
  index++;
  const audioList = new Array(12).fill(0).map((value, i) => {
    return `https://www.soundhelix.com/examples/mp3/SoundHelix-Song-${i}.mp3`;
  });
  audioService.playAudio(audioList[index % audioList.length]);
});
Here is the link to run the above code, https://codepen.io/hphchan/pen/xxqbezb.
You may also swap in other audio files as you like.
My main idea for solving the issue is to reuse the audio elements: store them in an array and reuse each one once it has finished playing.
Of course, for the demo I am playing the audio with a click button, but you can definitely plug it into your game.
Hope the solution helps. If there are cases it doesn't cover (I don't have much exposure to this area), it would be nice if you could post your modified solution here so we can all learn together.
Have you looked at the Web Audio API? If it works for you, a single AudioBuffer can hold the audio data in memory for a given cue, and you can play it multiple times by spawning AudioBufferSourceNode objects. If you have many different sounds playing this might not be much help, but if you are reusing sounds continuously (many laser shots), it could be a big help. Another benefit is that this way of playing sounds has pretty low latency.
I just used this for the first time, getting it to work yesterday, but I'm loading it with raw PCM data (floats ranging from -1 to 1). There is surely a way to load a wav into this or an equivalent in-memory structure, but I'm too new to the API to know how yet.
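If it helps, here is a hedged sketch of that buffer-based approach (an assumption on my part: a browser with fetch and the unprefixed, promise-based Web Audio API; the file path is a placeholder):

```javascript
// Decode a file's bytes into an AudioBuffer once, at load time.
function loadClip(ctx, url) {
  return fetch(url)
    .then(function (res) { return res.arrayBuffer(); })
    .then(function (bytes) { return ctx.decodeAudioData(bytes); });
}

// Each playback spawns a cheap one-shot AudioBufferSourceNode;
// the decoded data itself lives in the single shared AudioBuffer.
function playClip(ctx, buffer) {
  var source = ctx.createBufferSource();
  source.buffer = buffer;
  source.connect(ctx.destination);
  source.start();
}

// usage sketch:
// var ctx = new AudioContext();
// loadClip(ctx, 'audio/LaserShot.wav').then(function (buffer) {
//   document.addEventListener('keydown', function (e) {
//     if (e.code === 'Space') playClip(ctx, buffer);
//   });
// });
```

Decoding once and replaying the buffer avoids allocating a new Audio element per shot, which is what was eating memory in the question.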
I've scouted many forums, blogs, questions, and sites but cannot seem to find a solution that works for me. I am trying to load images using pure JavaScript, without halting the rest of the page load and without relying on third-party libraries.
On the site I work on there may be anywhere from 0 to 30 images of different resolutions, and as you may imagine, on slower connections they might slow performance to a halt (which is what I am trying to prevent now: I want the user to see the info on the page and worry less about images holding up its performance).
On my latest attempt:
(function () {
  // jquery is unavailable here. using javascript counterpart.
  var carouselDivs = document.querySelectorAll('#caruselImagesDivs div[data-url]');
  var carouselIndicators = document.querySelector('.carousel-indicators');
  var carouselInner = document.querySelector('.carousel-inner');

  for (var i = 0; i < carouselDivs.length; i++) {
    var liIndicator = document.createElement('LI');
    liIndicator.dataset.target = "#property_image_gallery";
    liIndicator.dataset.slideTo = i + 1;

    var divItem = document.createElement('DIV');
    divItem.className = "item";

    var image = document.createElement('IMG');
    image.dataset.src = carouselDivs[i].dataset.url;
    image.className = 'img-responsive center-block';
    // for some reason I thought this might work, but it hasn't.
    image.onload = function () {
      image.src = image.dataset.src;
      image.onload = function () { };
    };
    image.src = '/Images/blankbeacon.jpg';

    divItem.appendChild(image);
    carouselIndicators.appendChild(liIndicator);
    carouselInner.appendChild(divItem);
  }
})();
I also tried deferring the loading of the images (the top code section didn't have the onload event then):
function initImg() {
  var imgs = document.querySelectorAll('#property_image_gallery .carousel-inner .item img');
  for (var i = 0; i < imgs.length; i++) {
    var imgSource = imgs[i].dataset.src;
    imgs[i].src = imgSource;
  }
}
window.onload = initImg;
Two hours in, no results. I am stumped. What am I missing? How can I force the browser to just move on with life and load those images later on?
At first, you may load the images one after another, using a recursive function:
function addimg(img) {
  img.onload = function() {
    addimg(nextimg);
    img.onload = null; // kill closure -> free the js memory
  };
}
Start that once the html has loaded completely:
window.onload=addimg;
(pseudocode)
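Fleshed out into something runnable, that recursive idea might look like the sketch below. The makeImage factory is injectable only so the sketch is easy to test; in a real page you would pass function () { return new Image(); }.

```javascript
// Load images strictly one after another: each download starts only
// after the previous one has finished (or errored).
function loadSequentially(urls, makeImage, onAllDone) {
  var i = 0;
  function next() {
    if (i >= urls.length) {
      if (onAllDone) onAllDone();
      return;
    }
    var img = makeImage();
    img.onload = img.onerror = function () {
      img.onload = img.onerror = null; // kill closure -> free the js memory
      next();
    };
    img.src = urls[i++]; // wire the handlers before assigning src
  }
  next();
}

// usage sketch in a page:
// window.onload = function () {
//   loadSequentially(arrayOfUrls, function () { return new Image(); });
// };
```

Handling onerror as well as onload matters here: otherwise one broken URL would stall the whole chain.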
You can also use an image compressor tool such as http://optimizilla.com/ to make the images load faster.
This is a great article that might also help you: https://varvy.com/pagespeed/defer-images.html
A few suggestions:
- If the images are not in the current viewport and are taking up too much initial bandwidth, I suggest lazy-loading them when the user is in (or close to) the same viewport as the images.
- You can also try deferring the images like you are doing, but make sure the script runs right before the closing body tag.
- I also suggest making sure the images are correctly compressed and resized (you have an image there that is 225 kB, which isn't ideal).
I have a web page which is chock full of JavaScript and a few references to resources like images for the JavaScript to work with. I use a websocket to communicate with the server; the JavaScript parses the socket's data and adjusts the page presentation accordingly. It all works fine, except when it doesn't.
The problem appears to be that the page contains images which I want to display parts of, under JavaScript control. No matter how I play with defer, there are apparently situations in which the images aren't fully downloaded before the JavaScript tries to use them. The result is that images are missing when the page is rendered, some small percentage of the time.
I'm not used to languages and protocols where you don't have strict control over what happens when, so the server and browser shipping and executing things in an uncontrolled, asynchronous order annoys me. I'd like to stop depending on apparently unreliable tricks like defer. What I'd like to do is download the whole page, then open my websocket and send my images and other resources down through it. When that process is complete, I'll know it's safe to accept other commands from the websocket and get on with what the page does. In other words, I want to subvert the browser's asynchronous handling of resources and handle it all serially under JavaScript control.
Pouring an image file from the server down a socket is easy, and I have no trouble coming up with protocols to do it. Capturing the data as byte arrays is also easy.
But how do I get them interpreted as images?
I know there are downsides to this approach. I won't get browser caching of my images, and the initial page won't load as quickly. I'm OK with that. I'm just tired of 95%-working solutions and having to wonder whether what I did works in every browser imaginable. (Working on everything from IE 8 to next year's Chrome is a requirement for me.)
Is this approach viable? Are there better ways to get strict, portable control of resource loading?
You still haven't been very specific about what resources you are waiting for other than images, but if they are all images, then you can use this loadMonitor object to monitor when N images are done loading:
function loadMonitor(/* img1, img2, img3 */) {
  var cntr = 0, doneFn, self = this;

  function checkDone() {
    if (cntr === 0 && doneFn) {
      // clear out doneFn so nothing that is done in the doneFn callback
      // accidentally causes the callback to get called again
      var f = doneFn;
      doneFn = null;
      f.call(self);
    }
  }

  function handleEvents(obj, eventList) {
    var events = eventList.split(" "), i;
    function handler() {
      --cntr;
      for (i = 0; i < events.length; i++) {
        obj.removeEventListener(events[i], handler);
      }
      checkDone();
    }
    for (i = 0; i < events.length; i++) {
      obj.addEventListener(events[i], handler);
    }
  }

  this.add = function(/* img1, img2, img3 */) {
    if (doneFn) {
      throw new Error("Can't call loadMonitor.add() after calling loadMonitor.start(fn)");
    }
    var img;
    for (var i = 0; i < arguments.length; i++) {
      img = arguments[i];
      if (!img.src || !img.complete) {
        ++cntr;
        handleEvents(img, "load error abort");
      }
    }
  };

  this.start = function(fn) {
    if (!fn) {
      throw new Error("must pass completion function as loadMonitor.start(fn)");
    }
    doneFn = fn;
    checkDone();
  };

  // process constructor arguments
  this.add.apply(this, arguments);
}
// example usage code
var cardsImage = new Image();
cardsImage.src = ...
var playerImage = new Image();
playerImage.src = ...
var tableImage = new Image();

var watcher = new loadMonitor(cardsImage, playerImage, tableImage);

// .start() tells the monitor that all images are now in the monitor
// and passes it our callback so it can now tell us when things are done
watcher.start(function() {
  // put code here that wants to run when all the images are loaded
});

// the .src value can be set before or after the image has been
// added to the loadMonitor
tableImage.src = ...
Note: you must make sure that every image you put in the loadMonitor gets a .src assigned, or the loadMonitor will never call its callback, because that image will never finish loading.
Working demo: http://jsfiddle.net/jfriend00/g9x74d2j/
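As for the byte-arrays-to-images part of the question, one common pattern is to wrap the received bytes in a Blob and point an Image at an object URL. A sketch (note this is an addition beyond the monitor above, and Blob/createObjectURL will not reach IE 8, which would need base64 data URIs instead):

```javascript
// Turn raw image bytes (e.g. from a binary WebSocket frame) into an
// HTMLImageElement. Assumes the bytes are a complete PNG/JPEG file.
function bytesToImage(arrayBuffer, mimeType, onReady) {
  var blob = new Blob([arrayBuffer], { type: mimeType });
  var url = URL.createObjectURL(blob);
  var img = new Image();
  img.onload = function () {
    URL.revokeObjectURL(url); // decoded now; the temporary URL can go
    onReady(img);
  };
  img.src = url;
}

// usage sketch:
// socket.binaryType = 'arraybuffer';
// socket.onmessage = function (e) {
//   bytesToImage(e.data, 'image/png', function (img) {
//     document.body.appendChild(img);
//   });
// };
```

Images created this way can be fed straight into the loadMonitor like any other Image.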
I'm working on a website which uses ExpressionEngine to create a list of images with img1, img2, img3, etc. as the IDs, and to build an array of their sources: imgAddresses[1], imgAddresses[2], imgAddresses[3], etc.
I'm attempting to write a function which loads the first image, then (once the first has completely loaded) loads the second, third, etc. The following is what I have so far:
function loadImage(counter) {
  var i = document.getElementById("img"+counter);
  if (counter == imgAddresses.length) { return; }
  i.onload = function() {
    loadImage(counter+1);
  };
  i.src = imgAddresses[counter];
}
document.onload = loadImage(0);
It works when refreshing the page, but not when accessing the page via the URL. As far as I can tell from research, this is because the onload event is not fired when a cached image is loaded, and refreshing the page clears the cache, whereas accessing the page via the URL does not.
Research suggests that assigning the src of the image after declaring the onload event would get around this, but it does not seem to have solved it in this case. I was thinking that this may be because the onload event is recursive in this case.
Does anyone have any ideas on how to make sure the browser loads a fresh copy of the image rather than a cached version, or whether there is a better way to write this function? Thanks for any help!
EDIT: One solution that I have found is to change the img source assignment to:
i.src = imgAddresses[counter] + '?' + new Date().getTime();
This forces the user to load a fresh copy each time, which I guess is not so much a solution as a workaround.
The only thing I can say is that you are not attaching the document.onload handler correctly. I cannot tell whether fixing it will solve your issue, because image.onload is not reliable in all browsers; however, onload should be set to a function reference, and that's not what you are doing.
Instead, you should have something like:
function loadImage(counter) {
  // initialize the counter to 0 if no counter was passed
  counter = counter || 0;
  var i = document.getElementById("img"+counter);
  if (counter == imgAddresses.length) { return; }
  i.onload = function() {
    loadImage(counter+1);
  };
  i.src = imgAddresses[counter];
}
document.onload = loadImage; // instead of loadImage(0);
You can control how the browser manages the cached resources.
Take a look at the HTML5 cache approach:
HTML5: The cache manifest file
This way you can avoid the browser cache for the specified resources.
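For illustration, a minimal (hypothetical) manifest file, referenced from the page with <html manifest="site.appcache">. Resources listed under NETWORK are never stored in the application cache, while changing the versioning comment forces clients to refetch everything under CACHE. (Note that AppCache has since been deprecated in favor of service workers.)

```
CACHE MANIFEST
# v1 - change this comment to invalidate the cached copies

CACHE:
/css/site.css

NETWORK:
/img/
```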
The audio sounds distorted (as if it were playing twice at the same time, or something like that) when I change its source dynamically, if the element was used in createMediaElementSource of an AudioContext.
Here is a minimal example of the error: http://jsfiddle.net/BaliBalo/wkFpv/ (it works well at first but goes crazy when you click a link to change the source).
var audio = document.getElementById('music');
var actx = new webkitAudioContext();
var node, processor = actx.createScriptProcessor(1024, 1, 1);
processor.onaudioprocess = function(e) { /* STUFF */ };
processor.connect(actx.destination);

audio.addEventListener('canplay', canPlayFired);
function canPlayFired(event) {
  node = actx.createMediaElementSource(audio);
  node.connect(processor);
  audio.removeEventListener('canplay', canPlayFired);
}

$('a.changeMusic').click(function(e) {
  e.preventDefault();
  audio.src = $(this).attr('href');
});
If I put node.disconnect(0); before audio.src = ... it works, but the data is no longer processed. I tried a lot of things, like creating a new context, but nothing seems to erase the previously created script processor node.
Do you know how I could fix it?
Thanks in advance.
I would suggest taking a look at this: jsbin.com/acolet/1
It seems to be doing the same thing you are looking for.
I found this from this post.
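For reference, the general shape of that fix is to build the audio graph once (context, media element source, processor) outside of any per-track event handler, and afterwards only swap audio.src. A sketch along those lines, assuming the question's markup rather than the exact code behind the link:

```javascript
// Build the graph exactly once. createMediaElementSource permanently
// reroutes the element through the context, so the same node is reused
// across track changes instead of being recreated per track.
function initAudioGraph() {
  var audio = document.getElementById('music');
  var actx = new (window.AudioContext || window.webkitAudioContext)();
  var node = actx.createMediaElementSource(audio);
  var processor = actx.createScriptProcessor(1024, 1, 1);
  processor.onaudioprocess = function (e) { /* STUFF */ };
  node.connect(processor);
  processor.connect(actx.destination);

  // Changing tracks now only touches the element's src; the graph stays.
  $('a.changeMusic').click(function (e) {
    e.preventDefault();
    audio.src = $(this).attr('href');
    audio.play();
  });
}

// call once, e.g.:
// document.addEventListener('DOMContentLoaded', initAudioGraph);
```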