Chrome: memory leak when displaying base64 images - javascript

I have a websocket that sends binary images. My script gets those images, converts them to base64, and displays them in an <img> tag.
Something like this:
websocket.onmessage = function(evt) {
    var msg = evt.data;
    var image = $('.my-image');
    image.attr('src', "data:image/jpeg;base64," + toBase64(msg));
};
This seems to cause a memory leak in Chrome. After a few minutes, it is easily using more than 1 GB of RAM. After a few hours I get the "Aw, Snap" error.
Looking at the Resources tab, I see that every image received is listed there. They don't appear to be released at any point, even once they are no longer displayed.
Is there a workaround to this issue? Maybe a way to force the old images to be removed from memory.

I had been plagued by the same issue, with rising memory usage in the browser. Turns out this is not a memory leak, but instead a side-effect of the browser caching images.
metal03326 provides a solution at https://stackoverflow.com/a/38788279/1510289. The idea is to:
1. write the image bytes to a JavaScript Blob,
2. create a unique object URL from the blob,
3. use the unique object URL in your src attribute, and finally
4. revoke the object URL when done to release memory.
Here's the code:
function getBlob(byteString, mimeString) {
    var ab = new ArrayBuffer(byteString.length);
    var ia = new Uint8Array(ab);
    for (var i = 0; i < byteString.length; i++) {
        ia[i] = byteString.charCodeAt(i);
    }
    var blob = new Blob([ab], {type: mimeString});
    return blob;
}
let prevObjectURL = null;
websocket.onmessage = function(event) {
    var blob = getBlob(atob(event.image), 'image/jpeg');
    var objectURL = URL.createObjectURL(blob);
    $('.my-image').attr('src', objectURL);
    // Release the previous frame's object URL so the browser can free its memory
    if (prevObjectURL) {
        URL.revokeObjectURL(prevObjectURL);
    }
    prevObjectURL = objectURL;
};

Save the base64 image in a temporary variable and set it to null once you no longer need it, so the string can be garbage-collected.
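A minimal sketch of that idea, using the asker's toBase64 helper (the variable name is mine, not from the answer):

var tempImage = null;

websocket.onmessage = function(evt) {
    // Keep the base64 string only as long as it is needed
    tempImage = 'data:image/jpeg;base64,' + toBase64(evt.data);
    $('.my-image').attr('src', tempImage);

    // Drop the reference so the large string can be garbage-collected
    tempImage = null;
};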

Related

Is there a way to stream data into a blob (or generate a giant blob)

Checking MDN I see there used to be BlobBuilder, and that I could call blobBuilder.append to keep adding data to a blob, but according to MDN BlobBuilder is deprecated in favor of the Blob constructor. Unfortunately the Blob constructor requires all the data to be in memory at construction time. My data is too large to be in memory at construction time. Looking at the File API, I see nothing there either.
Is there a way to generate large data client-side and put it in a blob? For example, say I wanted to render a 16k by 16k image. Uncompressed that's a 1 GB image.
I have an algorithm that can generate it one or a few scan lines at a time, but I need a way to write those scan lines into a file/blob, and then when finished I can use the standard way to let the user download that blob. But I can't seem to find an API that lets me stream data into a blob.
The only thing I can think of is that apparently I can make a Blob from Blobs, so I suppose I could write each part of the image to a separate blob and then combine all the blobs into another blob to get a big blob.
Is that the only solution? Seems kind of um .... strange. Though if it works then ¯\_(ツ)_/¯
Someone voted to close as they don't understand the question. Here's another explanation.
Write 4 gig to a blob
const arrays = [];
for (let i = 0; i < 4096; ++i) {
    arrays.push(new Uint8Array(1024 * 1024)); // 1 meg
}
// arrays now holds 4 gig of data
const blob = new Blob(arrays);
The code above will crash because the browser will kill the page for using too much memory. Using BlobBuilder I could have done something like
const builder = new BlobBuilder();
for (let i = 0; i < 4096; ++i) {
    const data = new Uint8Array(1024 * 1024); // 1 meg
    builder.append(data);
}
const blob = builder.getBlob(...);
That would not have run out of memory because there is never more than 1 meg of data around at once. The browser can flush the data being appended to the BlobBuilder out to disk.
What's the new way to achieve writing 4 gig to a blob? Is it only writing lots of small blobs and then using those to generate a larger one, or is there some more traditional way, where traditional means streaming into some object/file/blob/storage?
As you know, the data that the blob will contain must be ready to pass to the constructor. Let us take the example from MDN:
var aFileParts = ['<a id="a"><b id="b">hey!</b></a>'];
var oMyBlob = new Blob(aFileParts, {type : 'text/html'});
Now, we have two options:
We can append data to the array, and then convert it to a blob:
var aFileParts = ['<a id="a"><b id="b">hey!</b></a>'];
aFileParts.push('<p>How are you?</p>');
var oMyBlob = new Blob(aFileParts, {type : 'text/html'});
Alternatively, we can use blobs to create the blob:
var oMyOtherBlob = new Blob([], {type: 'text/html'});
oMyOtherBlob = new Blob([oMyOtherBlob, '<a id="a"><b id="b">hey!</b></a>'], {type : 'text/html'});
oMyOtherBlob= new Blob([oMyOtherBlob, '<p>How are you?</p>'], {type : 'text/html'});
You may build your own BlobBuilder encapsulating that... given that appending to an array seems to lead you to run out of memory, let us encapsulate the second option:
var MyBlobBuilder = function() {
    var blob = new Blob([], {type: 'text/html'});
    this.append = function(src) {
        blob = new Blob([blob, src], {type: 'text/html'});
    };
    this.getBlob = function() {
        return blob;
    };
};
Note: tested with your code (replaced BlobBuilder with MyBlobBuilder); it did not run out of memory on my machine. Windows 10, Chrome 67, 8 GB RAM, Intel Core i3, 64-bit.
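For reference, plugging it into the asker's loop would look something like this (a sketch; the hard-coded 'text/html' type would presumably be swapped for the actual mime type of the data):

var builder = new MyBlobBuilder();
for (let i = 0; i < 4096; ++i) {
    builder.append(new Uint8Array(1024 * 1024)); // 1 meg per chunk
}
// Only one chunk (plus the intermediate blobs) needs to live in JS memory at a time
const blob = builder.getBlob();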

converting video src to a single-session blob url

Let's say I have a video element on my website:
<video src="/video.mp4" controls="" id="video"></video>
How do I go about protecting the original source file (/video.mp4) by converting it to a single-session Blob URL?
I have seen a few posts stating that it needs to be done with JavaScript, though none of them actually expand on the necessary details explaining how to do it (or where you can find out how).
So, what is the best approach for something like this?
Here is a quick and dirty example. Hope it helps.
Make sure to go over the docs of all of the methods being used and check their browser support. This will not protect your video from being downloadable though.
// Request the video using a new XMLHttpRequest() with the
// responseType set to blob.
var xhr = new XMLHttpRequest();
xhr.responseType = 'blob';
xhr.onload = function() {
    var reader = new FileReader();
    reader.onloadend = function() {
        // Pass this string to atob to decode the base-64 encoded string to a string
        // representing each byte of binary data.
        var byteCharacters = atob(reader.result.slice(reader.result.indexOf(',') + 1));
        // Now you can create an array of byte values using charCodeAt and looping
        // over the byte string.
        var byteNumbers = new Array(byteCharacters.length);
        for (var i = 0; i < byteCharacters.length; i++) {
            byteNumbers[i] = byteCharacters.charCodeAt(i);
        }
        // Pass the resulting array to Uint8Array to create a typed array of
        // 8-bit, unsigned integers.
        var byteArray = new Uint8Array(byteNumbers);
        // This can then be passed to the Blob constructor.
        var blob = new Blob([byteArray], {type: 'video/ogg'});
        // Now that you have a blob, you can pass it to the createObjectURL method.
        var url = URL.createObjectURL(blob);
        // The resulting URL can be attached to the src attribute of your video.
        document.getElementById('video').src = url;
    };
    // Pass the response to the FileReader using readAsDataURL.
    // This will give you a base-64 encoded string representation of the file.
    reader.readAsDataURL(xhr.response);
};
xhr.open('GET', 'https://upload.wikimedia.org/wikipedia/commons/c/c0/Roaring_Burps.ogg');
xhr.send();
<video controls="" id="video"></video>
To make the Blob URL, I found this answer. It will load large files much faster than DavidDomain's answer (which took unacceptably long for my case of a >100MB video file). That said, I believe this will download the whole video into the browser's memory and embed the data into the DOM, so larger files might still cause other performance issues.
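The linked answer isn't reproduced here, but the usual shortcut is to skip the FileReader/atob round trip and use the XHR response blob directly; a sketch of that idea (not the linked answer verbatim):

var xhr = new XMLHttpRequest();
xhr.responseType = 'blob';
xhr.onload = function() {
    // xhr.response is already a Blob, so it can become an object URL directly
    document.getElementById('video').src = URL.createObjectURL(xhr.response);
};
xhr.open('GET', '/video.mp4');
xhr.send();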
Why do you want to "[protect] the original source files location" of the video? If something finds the video's location and requests the video file, then that file should be served: that's a server's job.
AFAIK it's practically impossible to load a video file without exposing the URL required to obtain that video file. (It should be technically possible to embed it into the DOM server-side, but that would force the entire video to be loaded before the page shows anything, and would be unusable)

Base64 image upload works on the web but not on a device (Cordova)

I'm developing an app and trying to upload a base64 image to my server. Testing the code on the web it works perfectly, and when I convert the base64 image to a File, it returns the following object:
But when I try to do the same thing in my mobile app running on an Android device, the same code creates the following File object:
What is happening? I call the function that converts my base64 image to a File like this:
var blob = new Blob([photoBase64], {type: 'image/png'});
var filePhoto = new File([blob], "employeePhoto.png");
In the first image (which worked) I was running in a web browser.
In the second image (which didn't work) I was running in the Android app.
It is the same code...
It appears that the File constructor behaves differently in the web browser and in the Android app. I don't understand this. Passing the same parameters to the File constructor in the same order creates different File objects (as shown in the images).
If you are trying to convert base64 to a File, you can utilize part of the canvas.toBlob() polyfill:
var base64 = "data:image/png;base64,"; // data URI
var binStr = atob(base64.split(",")[1]),
    len = binStr.length,
    arr = new Uint8Array(len);

for (var i = 0; i < len; i++) {
    arr[i] = binStr.charCodeAt(i);
}

var file = new File([arr], "image");

var img = new Image();
img.onload = function() {
    document.body.appendChild(this);
    URL.revokeObjectURL(url);
};

var url = URL.createObjectURL(file);
img.src = url;

Reading raw binary image data for Internet Explorer 10 and 11: HTML7007, Data freed

I'm currently working on a little image viewer which reads images that are in a local cache and not placed on the file system. So I'm requesting the image data using an XMLHttpRequest. No problem so far. After that I'm generating an ArrayBuffer using this function:
function stringToArrayBuffer(string) {
    if (!string.length) return;
    var buffer = new ArrayBuffer(string.length);
    var bufferElem = new Uint8Array(buffer);
    for (var i = 0; i < string.length; i++) {
        bufferElem[i] = string.charCodeAt(i) & 0xff;
    }
    return buffer;
}
Works fine. Even in Internet Explorer 10 and 11.
Here, after the stringToArrayBuffer() call, the Internet Explorer nightmare begins:
function convertImage(tmpstring) {
    var arrayBuffer = stringToArrayBuffer(tmpstring);
    var filereader = new FileReader();
    var arrayBufferView = new Uint8Array(arrayBuffer);
    blob = new Blob([arrayBufferView], {type: 'image/jpeg'});
    filereader.readAsDataURL(blob);
    return filereader;
}
Works fine in Firefox, Chrome/Chromium and Opera, but Internet Explorer generates strange colorful trash images and the debugger says:
HTML7007: One or more blob URLs were revoked by closing the blob for which
they were created. These URLs will no longer resolve as the data backing the
URL has been freed.
Any ideas on that?

Rapidly updating image with Data URI causes caching, memory leak

I have a webpage that rapidly streams JSON from the server and displays bits of it, about 10 times/second. One part is a base64-encoded PNG image. I've found a few different ways to display the image, but all of them cause unbounded memory usage. It rises from 50mb to 2gb within minutes. Happens with Chrome, Safari, and Firefox. Haven't tried IE.
I discovered the memory usage first by looking at Activity Monitor.app -- the Google Chrome Renderer process continuously eats memory. Then, I looked at Chrome's Resource inspector (View > Developer > Developer Tools, Resources), and I saw that it was caching the images. Every time I changed the img src, or created a new Image() and set its src, Chrome cached it. I can only imagine the other browsers are doing the same.
Is there any way to control this caching? Can I turn it off, or do something sneaky so it never happens?
Edit: I'd like to be able to use the technique in Safari/Mobile Safari. Also, I'm open to other methods of rapidly refreshing an image if anyone has any ideas.
Here are the methods I've tried. Each one resides in a function that gets called on AJAX completion.
Method 1 - Directly set the src attribute on an img tag
Fast. Displays nicely. Leaks like crazy.
$('#placeholder_img').attr('src', 'data:image/png;base64,' + imgString);
Method 2 - Replace img with a canvas, and use drawImage
Displays fine, but still leaks.
var canvas = document.getElementById("placeholder_canvas");
var ctx = canvas.getContext("2d");

var img = new Image();
img.onload = function() {
    ctx.drawImage(img, 0, 0);
};
img.src = "data:image/png;base64," + imgString;
Method 3 - Convert to binary and replace canvas contents
I'm doing something wrong here -- the images display small and look like random noise. This method uses a controlled amount of memory (grows to 100mb and stops), but it is slow, especially in Safari (~50% CPU usage there, 17% in Chrome). The idea came from this similar SO question: Data URI leak in Safari (was: Memory Leak with HTML5 canvas)
var img = atob(imgString);
var binimg = [];
for (var i = 0; i < img.length; i++) {
    binimg.push(img.charCodeAt(i));
}
var bytearray = new Uint8Array(binimg);

// Grab the existing image from canvas
var ctx = document.getElementById("placeholder_canvas").getContext("2d");
var width = ctx.canvas.width,
    height = ctx.canvas.height;
var imgdata = ctx.getImageData(0, 0, width, height);

// Overwrite it with new data
for (var i = 8, len = imgdata.data.length; i < len; i++) {
    imgdata.data[i - 8] = bytearray[i];
}

// Write it back
ctx.putImageData(imgdata, 0, 0);
I know it's been years since this issue was posted, but the problem still exists in recent versions of the Safari browser. So I have a definitive solution that works in all browsers, and I think this could save jobs or lives!
Copy the following code somewhere in your HTML page:
// Methods to address the memory leak problems in Safari
var BASE64_MARKER = ';base64,';
var temporaryImage;
var objectURL = window.URL || window.webkitURL;

function convertDataURIToBlob(dataURI) {
    // Validate input data
    if (!dataURI) return;

    // Convert image (in base64) to binary data
    var base64Index = dataURI.indexOf(BASE64_MARKER) + BASE64_MARKER.length;
    var base64 = dataURI.substring(base64Index);
    var raw = window.atob(base64);
    var rawLength = raw.length;
    var array = new Uint8Array(new ArrayBuffer(rawLength));
    for (var i = 0; i < rawLength; i++) {
        array[i] = raw.charCodeAt(i);
    }

    // Create and return a new blob object using binary data
    return new Blob([array], {type: "image/jpeg"});
}
Then, when you receive a new frame/image base64Image in base64 format (e.g. data:image/jpeg;base64,LzlqLzRBQ...) and you want to update an HTML <img /> element imageElement, use this code:
// Destroy old image
if(temporaryImage) objectURL.revokeObjectURL(temporaryImage);
// Create a new image from binary data
var imageDataBlob = convertDataURIToBlob(base64Image);
// Create a new object URL object
temporaryImage = objectURL.createObjectURL(imageDataBlob);
// Set the new image
imageElement.src = temporaryImage;
Repeat this last block as often as needed and no memory leak will appear. This solution doesn't require the canvas element, but you can adapt the code to use one.
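For example, an adaptation for a canvas might look like this (a sketch; canvasElement and ctx are assumed to be your canvas and its 2d context):

// Destroy the old object URL, as above
if (temporaryImage) objectURL.revokeObjectURL(temporaryImage);

// Create a new object URL from the binary data
var imageDataBlob = convertDataURIToBlob(base64Image);
temporaryImage = objectURL.createObjectURL(imageDataBlob);

// Draw the frame onto the canvas instead of assigning it to an <img>
var frame = new Image();
frame.onload = function() {
    ctx.drawImage(frame, 0, 0, canvasElement.width, canvasElement.height);
};
frame.src = temporaryImage;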
Try setting image.src = "" after drawing.
var canvas = document.getElementById("placeholder_canvas");
var ctx = canvas.getContext("2d");

var img = new Image();
img.onload = function() {
    ctx.drawImage(img, 0, 0);
    // after drawing, set src empty
    img.src = "";
};
img.src = "data:image/png;base64," + imgString;
This might help.
I don't think there are any guarantees given about the memory usage of data URLs. If you can figure out a way to get them to behave in one browser, it guarantees little if not nothing about other browsers or versions.
If you put your image data into a blob and then create a blob URL, you can then deallocate that data.
Here's an example which turns a data URI into a blob URL; you may need to change / drop the webkit- & WebKit- prefixes on browsers other than Chrome and possibly future versions of Chrome.
var parts = dataURL.match(/data:([^;]*)(;base64)?,([0-9A-Za-z+/]+)/);
//assume base64 encoding
var binStr = atob(parts[3]);

//might be able to replace the following lines with just
// var view = new Uint8Array(binStr);
//haven't tested.

//convert to binary in ArrayBuffer
var buf = new ArrayBuffer(binStr.length);
var view = new Uint8Array(buf);
for (var i = 0; i < view.length; i++)
    view[i] = binStr.charCodeAt(i);
//end of the possibly unnecessary lines

var builder = new WebKitBlobBuilder();
builder.append(buf);

//create blob with mime type, create URL for it
var URL = webkitURL.createObjectURL(builder.getBlob(parts[1]));
return URL;
Deallocating is as easy as:
webkitURL.revokeObjectURL(URL);
And you can use your blob URL as your img's src.
Unfortunately, blob URLs do not appear to be supported in IE prior to v10.
API reference:
http://www.w3.org/TR/FileAPI/#dfn-createObjectURL
http://www.w3.org/TR/FileAPI/#dfn-revokeObjectURL
Compatibility reference:
http://caniuse.com/#search=blob%20url
I had a very similar issue.
Setting img.src to dataUrl Leaks Memory
Long story short, I simply worked around the Image element. I use a JavaScript decoder to decode and display the image data onto a canvas. Unless the user tries to download the image, they'll never know the difference either. The other downside is that you're going to be limited to modern browsers. The upside is that this method doesn't leak like a sieve :)
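A rough sketch of what that looks like, assuming a hypothetical decodePNG() helper standing in for whatever JavaScript decoder you pick (the answer doesn't name one) that returns an ImageData object:

// Decode the base64 payload into raw bytes
var bytes = Uint8Array.from(atob(imgString), function(c) { return c.charCodeAt(0); });

// decodePNG is hypothetical: any JS PNG decoder producing ImageData would do here
var imageData = decodePNG(bytes);

var ctx = document.getElementById("placeholder_canvas").getContext("2d");
ctx.putImageData(imageData, 0, 0);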
Patching up ellisbben's answer, since BlobBuilder is now obsolete and https://developer.mozilla.org/en-US/Add-ons/Code_snippets/StringView provides what appears to be a nice, quick conversion from base64 to Uint8Array:
in html:
<script src='js/stringview.js'></script>
in js:
window.URL = window.URL ||
    window.webkitURL;

function blobify_dataurl(dataURL) {
    var parts = dataURL.match(/data:([^;]*)(;base64)?,([0-9A-Za-z+/]+)/);
    //assume base64 encoding; StringView does the decoding,
    //so the old atob() step is no longer needed
    //convert to binary in StringView
    var view = StringView.base64ToBytes(parts[3]);
    var blob = new Blob([view], {type: parts[1]}); // pass a useful mime type here
    //create blob with mime type, create URL for it
    var outURL = URL.createObjectURL(blob);
    return outURL;
}
I still don't see it actually updating the image in mobile Safari, but Chrome can receive data URLs rapid-fire over a websocket and keep up with them far better than when manually iterating over the string. And if you know you'll always have the same type of data URL, you could even swap the regex out for a substring (likely quicker...?); there's a sketch of that below.
Running some quick memory profiles, it looks like Chrome is even able to keep up with deallocations (if you remember to do them...):
URL.revokeObjectURL(outURL);
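As for swapping the regex out for a substring, a minimal sketch of that variant (assuming the data URL always starts with something like data:image/png;base64,):

var marker = ';base64,';
var mime = dataURL.substring(5, dataURL.indexOf(';'));                    // e.g. "image/png"
var payload = dataURL.substring(dataURL.indexOf(marker) + marker.length); // the base64 body
var view = StringView.base64ToBytes(payload);
var blob = new Blob([view], {type: mime});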
I have tried different methods to solve this problem, and none of them work. It seems that memory leaks when img.src = base64string, and that memory can never be released. Here is my solution.
fs.writeFile('img0.jpg', img_data, function (err) {
    // console.log("save img!");
});
document.getElementById("my-img").src = 'img0.jpg?' + img_step;
img_step += 1;
Note that you should convert the base64 string to a JPEG buffer first.
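In Electron's Node context that conversion is a one-liner; a sketch, assuming base64string holds the data URL received from the page:

// Strip the data URI prefix (if any), then decode base64 into a binary Buffer for fs.writeFile
var img_data = Buffer.from(base64string.replace(/^data:image\/jpeg;base64,/, ''), 'base64');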
My Electron app updates the img every 50 ms, and memory doesn't leak. Forget about disk usage; Chrome's memory management pisses me off.
Unless Safari or Mobile Safari don't leak data urls, server-side might be the only way to do this on all browsers.
Probably the most straightforward approach would be to make a URL for your image stream; GETting it gives a 302 or 303 response redirecting to a single-use URL that serves the desired image. You will probably have to destroy and re-create the image tags to force a reload of the URL.
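A rough server-side sketch of that redirect scheme, assuming Node and an in-memory latestFrame buffer (none of these names come from the answer):

// Node sketch: GET /stream redirects to a single-use URL that serves the current frame once.
var http = require('http');
var frames = {};   // singleUseId -> image buffer
var nextId = 0;

http.createServer(function(req, res) {
    if (req.url === '/stream') {
        var id = 'frame-' + (nextId++);
        frames[id] = latestFrame; // latestFrame: the current JPEG buffer (assumed to exist)
        res.writeHead(303, {'Location': '/once/' + id, 'Cache-Control': 'no-store'});
        res.end();
    } else if (req.url.indexOf('/once/') === 0) {
        var key = req.url.slice('/once/'.length);
        var body = frames[key];
        delete frames[key]; // single use: forget the frame once it has been served
        if (!body) {
            res.writeHead(404);
            return res.end();
        }
        res.writeHead(200, {'Content-Type': 'image/jpeg', 'Cache-Control': 'no-store'});
        res.end(body);
    } else {
        res.writeHead(404);
        res.end();
    }
}).listen(8080);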
You will also be at the mercy of the browser regarding its img caching behavior. And the mercy of my understanding (or lack of understanding) of the HTTP spec. Still, unless server-side operation doesn't fit your requirements, try this first. It adds to the complexity of the server, but this approach uses the browser much more naturally.
But what about using the browser un-naturally? Depending on how browsers implement iframes and handle their associated content, you might be able to get data urls working without leaking the memory. This is kinda Frankenstein shit and is exactly the sort of nonsense that no one should have to do. Upside: it could work. Downside: there are a bazillion ways to try it and uneven, undocumented behavior is exactly what I'd expect.
One idea: embed an iframe containing a page; this page and the page that embeds it use cross-document messaging (note the GREEN in the compatibility matrix!). The embedding page gets the PNG string and passes it along to the embedded page, which then makes an appropriate img tag. When the embedding page needs to display a new image, it destroys the embedded iframe (hopefully releasing the memory of the data URL), then creates a new one and passes it the new PNG string.
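A bare-bones sketch of that message-passing idea (my own illustration; viewer.html, the element IDs, and the '*' target origin are placeholders):

// In the embedding (outer) page: recreate the iframe, then hand it the new PNG string.
function showFrame(pngBase64) {
    var old = document.getElementById('img-frame');
    if (old) old.remove(); // hope the old data URL's memory goes with it

    var iframe = document.createElement('iframe');
    iframe.id = 'img-frame';
    iframe.src = 'viewer.html'; // placeholder page that listens for messages
    iframe.onload = function() {
        iframe.contentWindow.postMessage(pngBase64, '*');
    };
    document.body.appendChild(iframe);
}

// In viewer.html (the embedded page):
window.addEventListener('message', function(e) {
    document.getElementById('view').src = 'data:image/png;base64,' + e.data;
});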
If you want to be marginally more clever, you could actually embed the source for the embedded frame in the embeddee page as a data url; however, this might leak that data url, which I guess would be poetic justice for trying such a reacharound.
"Something that works in Safari would be better." Browser technology keeps on moving forward, unevenly. When they don't hand the functionality to you on a plate, you gotta get devious.
var inc = 1;
var Bulk = 540;
var tot = 540;
var audtot = 35.90;
var canvas = document.getElementById("myCanvas");
//var imggg = document.getElementById("myimg");
canvas.width = 550;
canvas.height = 400;
var context = canvas.getContext("2d");
var variation = 0.2;
var interval = 65;

function JLoop() {
    if (inc < tot) {
        if (vid.currentTime < ((audtot * inc) / tot) - variation || (vid.currentTime > ((audtot * inc) / tot) + variation)) {
            contflag = 1;
            vid.currentTime = ((audtot * inc) / tot);
        }
        // Draw the animation
        try {
            context.clearRect(0, 0, canvas.width, canvas.height);
            if (arr[inc - 1] != undefined) {
                context.drawImage(arr[inc - 1], 0, 0, canvas.width, canvas.height);
                arr[inc - 1].src = "";
                //document.getElementById("myimg" + inc).style.display = "block";
                //document.getElementById("myimg" + (inc-1)).style.display = "none";
                //imggg.src = arr[inc - 1].src;
            }
            $("#audiofile").val(inc);
            // clearInterval(ref);
        } catch (e) {
        }
        inc++;
        // interval = 60;
        //setTimeout(JLoop, interval);
    } else {
    }
}
var ref = setInterval(JLoop, interval);
Worked for me on memory leak thanks dude.
