I'm working with a big company, and part of their flow is scanning a QR code to register some data. The problem is, in order to test this, I need to generate a QR code from the data, photograph it on my phone, and scan it in through my laptop's camera.
There are NPM modules for creating QR codes from data, so that part is fine, but I was wondering if it's somehow possible to override getUserMedia to return a stream of bytes that is just a QR code. I was thinking of encapsulating all this into one nice Chrome extension, but from looking around online, I'm not sure how I'd 'override' the camera input and replace it with a stream of QR code bytes instead.
Thanks
The HTMLCanvasElement has a captureStream() method that produces a MediaStream with a video track similar to what getUserMedia({video: true}) produces.
This is a convenient way to test various things with a video stream, without needing a human in the loop:
const width = 1280;
const height = 720;
const canvas = Object.assign(document.createElement("canvas"), { width, height });
const ctx = canvas.getContext("2d");
// do whatever drawing you wish; here I prepare some noise
const img = new ImageData(width, height);
const data = new Uint32Array(img.data.buffer);
const anim = () => {
  for (let i = 0; i < data.length; i++) {
    data[i] = 0xFF000000 + Math.random() * 0xFFFFFF;
  }
  ctx.putImageData(img, 0, 0);
  requestAnimationFrame(anim);
};
requestAnimationFrame(anim);
// extract the MediaStream from the canvas
const stream = canvas.captureStream();
// use it in your test (here I'll just display it in the <video>)
document.querySelector("video").srcObject = stream;
video { height: 100vh }
<video controls autoplay></video>
But in your case, you need to separate the concerns.
The QR code detection tests should be done on their own, and these can certainly use still images instead of a MediaStream.
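That said, if you also want an end-to-end test where the page's own getUserMedia call receives the fake "camera", you can monkey-patch navigator.mediaDevices before the app requests it (a Chrome extension content script could inject this). This is only a sketch; `patchGetUserMedia` and `makeFakeStream` are names I made up, and in practice the stream factory would be something like `() => qrCanvas.captureStream(30)`:

```javascript
// Sketch: swap getUserMedia for a stub that hands back a fake stream.
// `makeFakeStream` is a hypothetical callback, e.g. () => qrCanvas.captureStream(30).
function patchGetUserMedia(mediaDevices, makeFakeStream) {
  const original = mediaDevices.getUserMedia.bind(mediaDevices);
  mediaDevices.getUserMedia = function (constraints) {
    if (constraints && constraints.video) {
      // hand back the canvas stream instead of a real camera
      return Promise.resolve(makeFakeStream(constraints));
    }
    return original(constraints); // non-video requests pass through
  };
  // return an undo function so tests can restore the real camera
  return function unpatch() {
    mediaDevices.getUserMedia = original;
  };
}
```

The returned unpatch function restores the real camera, which keeps the hack contained to individual test runs.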
I'm a web developer from Japan.
This is my first question on Stack Overflow.
I'm creating a simple music web application.
I'm a complete beginner at writing a music program, so I'm struggling to implement it.
After various investigations I concluded that using the Web Audio API was the best choice, so I decided to use it.
▼ What I want to achieve
Load multiple WAV files with the Web Audio API, combine them into one WAV file, and make it downloadable from the browser.
For example, load multiple WAV files such as guitar, drums, and piano, edit them in the browser, and finally output them as a single WAV file.
Then we can download that edited WAV file from the browser and play it in iTunes.
▼ Question
Is it possible to achieve these requirements using just the Web Audio API, or do we need another library?
I checked Record.js on GitHub, but development stopped about 2-3 years ago, it has many open issues, and I can't get support, so I decided not to use it.
I also checked a similar question, Web audio API: scheduling sounds and exporting the mix, but since that information is old, I don't know whether it still applies.
Thanks.
Hi and welcome to Stack Overflow!
Is it possible to achieve this just using the Web Audio API?
In terms of merging/mixing the files together, this is perfectly achievable! This article goes through many (if not all) of the steps you will need to carry out the task you described.
Each file you want to upload can be loaded into an AudioBufferSourceNode (examples are explained in the article linked above). Here's an example of setting up a buffer source once the audio data has been loaded:
play: function (data, callback) {
  // create audio node and play buffer
  var me = this,
      source = this.context.createBufferSource(),
      gainNode = this.context.createGain();
  if (!source.start) { source.start = source.noteOn; }
  if (!source.stop) { source.stop = source.noteOff; }
  source.connect(gainNode);
  gainNode.connect(this.context.destination);
  source.buffer = data;
  source.loop = true;
  source.startTime = this.context.currentTime; // important for later!
  source.start(0);
  return source;
}
There are then also specific nodes already designed for your mixing purposes, like the ChannelMergerNode (which combines multiple mono channels into a new channel buffer). This is useful if you don't want to deal with the signal processing yourself in JavaScript, and it will be faster to use the Web Audio nodes, since they are native compiled code already within the browser.
Following the complete guide linked above, there are also options to export the file (as a .wav in the demo's case) using the following code:
var rate = 22050;

function exportWAV(type, before, after) {
  if (!before) { before = 0; }
  if (!after) { after = 0; }
  var channel = 0,
      buffers = [];
  for (channel = 0; channel < numChannels; channel++) {
    buffers.push(mergeBuffers(recBuffers[channel], recLength));
  }
  var i = 0,
      offset = 0,
      newbuffers = [];
  for (channel = 0; channel < numChannels; channel += 1) {
    offset = 0;
    newbuffers[channel] = new Float32Array(before + recLength + after);
    if (before > 0) {
      for (i = 0; i < before; i += 1) {
        newbuffers[channel].set([0], offset);
        offset += 1;
      }
    }
    newbuffers[channel].set(buffers[channel], offset);
    offset += buffers[channel].length;
    if (after > 0) {
      for (i = 0; i < after; i += 1) {
        newbuffers[channel].set([0], offset);
        offset += 1;
      }
    }
  }
  if (numChannels === 2) {
    var interleaved = interleave(newbuffers[0], newbuffers[1]);
  } else {
    var interleaved = newbuffers[0];
  }
  var downsampledBuffer = downsampleBuffer(interleaved, rate);
  var dataview = encodeWAV(downsampledBuffer, rate);
  var audioBlob = new Blob([dataview], { type: type });
  this.postMessage(audioBlob);
}
So I think the Web Audio API has everything you could want for this purpose! It could be challenging depending on your web development experience, but it's a skill definitely worth learning!
Do we need to use another library?
If you can, I think it's definitely worth trying with the Web Audio API, as you'll almost certainly get the best processing speeds, but there are other libraries, such as Pizzicato.js, just to name one. I'm sure you will find plenty of others.
Is it possible to create an AudioBufferSourceNode from another node?
The reason I ask is that I'm trying to create a waveform visualization, not just from a buffer source, but of an entire finished song that I "program" with the web audio API. I can time and program the song perfectly and it sounds just the way I want in the end. I can also create individual visualizations from the source WAV files that I use to create the song.
I used the code from this post to create the visualizations and show them in multiple canvases:
Create a waveform of the full track with Web Audio API
So here's the code snippet that creates the visualizations:
var canvas1 = document.querySelector('.visualizer1');
var canvasCtx1 = canvas1.getContext("2d");
createvis(mybuffer, canvasCtx1);
function createvis(buff, Ctx) {
  var leftChannel = buff.getChannelData(0); // Float32Array describing left channel
  var lineOpacity = 640 / leftChannel.length;
  Ctx.save();
  Ctx.fillStyle = '#222';
  Ctx.fillRect(0, 0, 640, 100);
  Ctx.strokeStyle = '#121';
  Ctx.globalCompositeOperation = 'lighter';
  Ctx.translate(0, 100 / 2);
  Ctx.globalAlpha = 0.06; // lineOpacity;
  for (var i = 0; i < leftChannel.length; i++) {
    // on which line do we get?
    var x = Math.floor(640 * i / leftChannel.length);
    var y = leftChannel[i] * 100 / 2;
    Ctx.beginPath();
    Ctx.moveTo(x, 0);
    Ctx.lineTo(x + 1, y);
    Ctx.stroke();
  }
  Ctx.restore();
}
The results ended up being pretty cool. I end up with an image for each source buffer that looks something like this:
http://www.bklorraine.com/waveform.png
However, as I load my various source buffers into createBufferSource nodes, I chain them and add effects, and it would be nice for the waveforms I generate here to reflect those effects. It would also be nice if the final context.destination that contains the output of everything (i.e. the finished song) could have a visualization as well.
I would also like to eventually change my "program" so that it doesn't just assemble a song from source WAV files, but also combines them with nodes that generate a signal from scratch; obviously, those particular sounds wouldn't have any sort of source buffer to create a visualization from.
Does anyone know if this is possible using the JavaScript Web Audio API?
So I have a downloading .mp4 file. I would like to stream the download file into a video element using the MediaSource API. How would I do this?
const NUM_CHUNKS = 5;
var video = document.querySelector('video');
video.src = video.webkitMediaSourceURL;

video.addEventListener('webkitsourceopen', function(e) {
  var chunkSize = Math.ceil(file.size / NUM_CHUNKS);
  // Slice the video into NUM_CHUNKS and append each to the media element.
  for (var i = 0; i < NUM_CHUNKS; ++i) {
    var startByte = chunkSize * i;
    // file is a video file.
    var chunk = file.slice(startByte, startByte + chunkSize);
    var reader = new FileReader();
    reader.onload = (function(idx) {
      return function(e) {
        video.webkitSourceAppend(new Uint8Array(e.target.result));
        logger.log('appending chunk: ' + idx);
        if (idx == NUM_CHUNKS - 1) {
          video.webkitSourceEndOfStream(HTMLMediaElement.EOS_NO_ERROR);
        }
      };
    })(i);
    reader.readAsArrayBuffer(chunk);
  }
}, false);
How would I dynamically change NUM_CHUNKS, and slice the video accordingly?
The code you're using from Eric Bidelman chops up a video that the browser already fully downloaded to demonstrate how the api works. In reality, you'd slice the video on the server, and the client would download each chunk in order, probably with an AJAX request.
I'd first suggest you try your .mp4 in the demo code you have, because MediaSource seems pretty picky about the format of the video files it accepts. See Steven Robertson's answer about how to create an mp4 that'll work.
Then it's up to you whether you want to slice the video manually beforehand or do it dynamically on the server (which will vary depending on your server). The JavaScript client shouldn't care how many chunks there are or how large each chunk is, as long as they're fed in order (and I think the spec even allows some amount of out-of-order appending).
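To make NUM_CHUNKS dynamic, the only real requirement is that the byte ranges you slice (or request) tile the file exactly. A small helper along these lines computes them (just a sketch; the names are mine):

```javascript
// Compute [startByte, endByte] ranges (inclusive) that split a file
// of `fileSize` bytes into at most `numChunks` pieces.
function chunkRanges(fileSize, numChunks) {
  const chunkSize = Math.ceil(fileSize / numChunks);
  const ranges = [];
  for (let start = 0; start < fileSize; start += chunkSize) {
    // the last chunk may be shorter than chunkSize
    ranges.push([start, Math.min(start + chunkSize, fileSize) - 1]);
  }
  return ranges;
}
```

Each pair maps directly to a `file.slice(start, end + 1)` call client-side, or to an HTTP `Range: bytes=start-end` header if the server does the slicing.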
webkitMediaSourceURL is now outdated in Chrome; createObjectURL() needs to be used instead.
The patch here, HTMLMediaElement to the new OO MediaSource API, gave me some pointers as to what I needed to update in my code.
I have a webpage that rapidly streams JSON from the server and displays bits of it, about 10 times/second. One part is a base64-encoded PNG image. I've found a few different ways to display the image, but all of them cause unbounded memory usage. It rises from 50mb to 2gb within minutes. Happens with Chrome, Safari, and Firefox. Haven't tried IE.
I discovered the memory usage first by looking at Activity Monitor.app: the Google Chrome Renderer process continuously eats memory. Then I looked at Chrome's Resource inspector (View > Developer > Developer Tools, Resources), and I saw that it was caching the images. Every time I changed the img src, or created a new Image() and set its src, Chrome cached it. I can only imagine the other browsers are doing the same.
Is there any way to control this caching? Can I turn it off, or do something sneaky so it never happens?
Edit: I'd like to be able to use the technique in Safari/Mobile Safari. Also, I'm open to other methods of rapidly refreshing an image if anyone has any ideas.
Here are the methods I've tried. Each one resides in a function that gets called on AJAX completion.
Method 1 - Directly set the src attribute on an img tag
Fast. Displays nicely. Leaks like crazy.
$('#placeholder_img').attr('src', 'data:image/png;base64,' + imgString);
Method 2 - Replace img with a canvas, and use drawImage
Displays fine, but still leaks.
var canvas = document.getElementById("placeholder_canvas");
var ctx = canvas.getContext("2d");
var img = new Image();
img.onload = function() {
  ctx.drawImage(img, 0, 0);
};
img.src = "data:image/png;base64," + imgString;
Method 3 - Convert to binary and replace canvas contents
I'm doing something wrong here: the images display small and look like random noise. This method uses a controlled amount of memory (grows to 100mb and stops), but it is slow, especially in Safari (~50% CPU usage there, 17% in Chrome). The idea came from this similar SO question: Data URI leak in Safari (was: Memory Leak with HTML5 canvas)
var img = atob(imgString);
var binimg = [];
for (var i = 0; i < img.length; i++) {
  binimg.push(img.charCodeAt(i));
}
var bytearray = new Uint8Array(binimg);

// Grab the existing image from canvas
var ctx = document.getElementById("placeholder_canvas").getContext("2d");
var width = ctx.canvas.width,
    height = ctx.canvas.height;
var imgdata = ctx.getImageData(0, 0, width, height);

// Overwrite it with new data
for (var i = 8, len = imgdata.data.length; i < len; i++) {
  imgdata.data[i - 8] = bytearray[i];
}

// Write it back
ctx.putImageData(imgdata, 0, 0);
I know it's been years since this issue was posted, but the problem still exists in recent versions of Safari. So here is a definitive solution that works in all browsers; I think this could save jobs or lives!
Copy the following code somewhere in your html page:
// Methods to address the memory leak problems in Safari
var BASE64_MARKER = ';base64,';
var temporaryImage;
var objectURL = window.URL || window.webkitURL;

function convertDataURIToBlob(dataURI) {
  // Validate input data
  if (!dataURI) return;
  // Convert image (in base64) to binary data
  var base64Index = dataURI.indexOf(BASE64_MARKER) + BASE64_MARKER.length;
  var base64 = dataURI.substring(base64Index);
  var raw = window.atob(base64);
  var rawLength = raw.length;
  var array = new Uint8Array(new ArrayBuffer(rawLength));
  for (var i = 0; i < rawLength; i++) {
    array[i] = raw.charCodeAt(i);
  }
  // Create and return a new blob object using binary data
  return new Blob([array], { type: "image/jpeg" });
}
Then, when you receive a new frame/image base64Image in base64 format (e.g. data:image/jpeg;base64,LzlqLzRBQ...) and you want to update an HTML <img /> element imageElement, use this code:
// Destroy old image
if(temporaryImage) objectURL.revokeObjectURL(temporaryImage);
// Create a new image from binary data
var imageDataBlob = convertDataURIToBlob(base64Image);
// Create a new object URL object
temporaryImage = objectURL.createObjectURL(imageDataBlob);
// Set the new image
imageElement.src = temporaryImage;
Repeat this last code as much as needed and no memory leaks will appear. This solution doesn't require the use of the canvas element, but you can adapt the code to make it work.
Try setting image.src = "" after drawing.
var canvas = document.getElementById("placeholder_canvas");
var ctx = canvas.getContext("2d");
var img = new Image();
img.onload = function() {
  ctx.drawImage(img, 0, 0);
  // after drawing, set src empty
  img.src = "";
};
img.src = "data:image/png;base64," + imgString;
This might help.
I don't think there are any guarantees given about the memory usage of data URLs. If you can figure out a way to get them to behave in one browser, that guarantees little, if anything, about other browsers or versions.
If you put your image data into a blob and then create a blob URL, you can then deallocate that data.
Here's an example which turns a data URI into a blob URL; you may need to change / drop the webkit- & WebKit- prefixes on browsers other than Chrome and possibly future versions of Chrome.
var parts = dataURL.match(/data:([^;]*)(;base64)?,([0-9A-Za-z+/]+)/);

// assume base64 encoding
var binStr = atob(parts[3]);

// might be able to replace the following lines with just
//   var view = new Uint8Array(binStr);
// haven't tested.

// convert to binary in ArrayBuffer
var buf = new ArrayBuffer(binStr.length);
var view = new Uint8Array(buf);
for (var i = 0; i < view.length; i++)
  view[i] = binStr.charCodeAt(i);
// end of the possibly unnecessary lines

var builder = new WebKitBlobBuilder();
builder.append(buf);

// create blob with mime type, create URL for it
var URL = webkitURL.createObjectURL(builder.getBlob(parts[1]));
return URL;
Deallocating is as easy as:
webkitURL.revokeObjectURL(URL);
And you can use your blob URL as your img's src.
Unfortunately, blob URLs do not appear to be supported in IE prior to v10.
API reference:
http://www.w3.org/TR/FileAPI/#dfn-createObjectURL
http://www.w3.org/TR/FileAPI/#dfn-revokeObjectURL
Compatibility reference:
http://caniuse.com/#search=blob%20url
I had a very similar issue.
Setting img.src to dataUrl Leaks Memory
Long story short, I simply worked around the Image element. I use a JavaScript decoder to decode and display the image data onto a canvas. Unless the user tries to download the image, they'll never know the difference. The other downside is that you're limited to modern browsers. The upside is that this method doesn't leak like a sieve :)
Patching up ellisbben's answer, since BlobBuilder is obsolete and https://developer.mozilla.org/en-US/Add-ons/Code_snippets/StringView provides what appears to be a nice quick conversion from base64 to Uint8Array:
in html:
<script src='js/stringview.js'></script>
in js:
window.URL = window.URL || window.webkitURL;

function blobify_dataurl(dataURL) {
  var parts = dataURL.match(/data:([^;]*)(;base64)?,([0-9A-Za-z+/]+)/);
  // assume base64 encoding; convert straight to bytes in a StringView
  var view = StringView.base64ToBytes(parts[3]);
  // create blob with a useful mime type, then create a URL for it
  var blob = new Blob([view], { type: parts[1] });
  var outURL = URL.createObjectURL(blob);
  return outURL;
}
I still don't see it actually updating the image in mobile Safari, but Chrome can receive data URLs rapid-fire over a WebSocket and keep up with them far better than when manually iterating over the string. And if you know you'll always have the same type of data URL, you could even swap the regex out for a substring (likely quicker...?)
Running some quick memory profiles, it looks like Chrome is even able to keep up with the deallocations (if you remember to do them...):
URL.revokeObjectURL(outURL);
I have used different methods to solve this problem, and none of them works. It seems that memory leaks when img.src = base64string, and that memory can never be released. Here is my solution.
fs.writeFile('img0.jpg', img_data, function (err) {
  // console.log("saved img!");
});
document.getElementById("my-img").src = 'img0.jpg?' + img_step;
img_step += 1;
Note that you should convert the base64 string to a JPEG buffer first.
My Electron app updates the img every 50 ms, and memory doesn't leak.
Forget about the disk usage; Chrome's memory management pisses me off.
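The base64-to-buffer conversion mentioned above is essentially a one-liner in Node/Electron. A sketch (the function name is mine; it strips a data-URI prefix first, if present):

```javascript
// Convert a base64 string (optionally a full data URI) to a binary Buffer
// suitable for fs.writeFile. Node/Electron only.
function base64ToBuffer(data) {
  // drop a leading "data:image/...;base64," prefix if the string has one
  const base64 = data.replace(/^data:image\/\w+;base64,/, "");
  return Buffer.from(base64, "base64");
}
```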
Unless Safari or Mobile Safari don't leak data urls, server-side might be the only way to do this on all browsers.
Probably the most straightforward approach would be to make a URL for your image stream, where GETting it gives a 302 or 303 response redirecting to a single-use URL that serves the desired image. You will probably have to destroy and re-create the image tags to force a reload of the URL.
You will also be at the mercy of the browser regarding its img caching behavior, and at the mercy of my understanding (or lack of understanding) of the HTTP spec. Still, unless server-side operation doesn't fit your requirements, try this first. It adds to the complexity of the server, but this approach uses the browser much more naturally.
But what about using the browser un-naturally? Depending on how browsers implement iframes and handle their associated content, you might be able to get data urls working without leaking the memory. This is kinda Frankenstein shit and is exactly the sort of nonsense that no one should have to do. Upside: it could work. Downside: there are a bazillion ways to try it and uneven, undocumented behavior is exactly what I'd expect.
One idea: embed an iframe containing a page; this page and the page that it is embedded in use cross document messaging (note the GREEN in the compatibility matrix!); embeddee gets the PNG string and passes it along to the embedded page, which then makes an appropriate img tag. When the embeddee needs to display a new message, it destroys the embedded iframe (hopefully releasing the memory of the data url) then creates a new one and passes it the new PNG string.
If you want to be marginally more clever, you could actually embed the source for the embedded frame in the embeddee page as a data url; however, this might leak that data url, which I guess would be poetic justice for trying such a reacharound.
"Something that works in Safari would be better." Browser technology keeps on moving forward, unevenly. When they don't hand the functionality to you on a plate, you gotta get devious.
var inc = 1;
var Bulk = 540;
var tot = 540;
var audtot = 35.90;
var canvas = document.getElementById("myCanvas");
//var imggg = document.getElementById("myimg");
canvas.width = 550;
canvas.height = 400;
var context = canvas.getContext("2d");
var variation = 0.2;
var interval = 65;

function JLoop() {
  if (inc < tot) {
    if (vid.currentTime < ((audtot * inc) / tot) - variation || (vid.currentTime > ((audtot * inc) / tot) + variation)) {
      contflag = 1;
      vid.currentTime = ((audtot * inc) / tot);
    }
    // Draw the animation
    try {
      context.clearRect(0, 0, canvas.width, canvas.height);
      if (arr[inc - 1] != undefined) {
        context.drawImage(arr[inc - 1], 0, 0, canvas.width, canvas.height);
        arr[inc - 1].src = "";
        //document.getElementById("myimg" + inc).style.display = "block";
        //document.getElementById("myimg" + (inc - 1)).style.display = "none";
        //imggg.src = arr[inc - 1].src;
      }
      $("#audiofile").val(inc);
      // clearInterval(ref);
    } catch (e) {
    }
    inc++;
    // interval = 60;
    //setTimeout(JLoop, interval);
  }
}

var ref = setInterval(JLoop, interval);
Worked for me on the memory leak, thanks dude.