How can I optimally send a drawing via WebSocket? - javascript

I'm currently working on an online Pictionary (one person does a drawing and the others have to guess what it is).
Currently I'm sending the entire image from the drawer's canvas as base64 on the "mouseup" DOM event to the server, and I send it back to the guessers in the same format. My problem is that this seems pretty heavy and I have no idea how to get a lighter payload.
$('#canvas').on('mouseup touchend', function() {
    mouse.down = false;
    // My problem is sending canvas.toDataURL() every time
    update_canvas(canvas.toDataURL());
});

// I use this function inside ActionCable, so this.perform will send the data to my channel
var update_canvas = function(data_url) {
    this.perform('update_canvas', {data_url: data_url});
},
Do you have a better way to send my image data?

You could either send a list of mouse positions or incremental changes of the canvas, i.e. send only the new drawings, using a second canvas.
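For example, here is a minimal sketch of the mouse-position approach; the 'draw_segment' action name and the send_segment/drawSegment helpers are assumptions, mirroring the update_canvas pattern from the question:

var last = null;

$('#canvas').on('mousemove touchmove', function(e) {
    if (!mouse.down) return;
    var point = { x: e.offsetX, y: e.offsetY };
    // send only the new segment instead of the whole image
    if (last) send_segment(last, point);
    last = point;
});

$('#canvas').on('mouseup touchend', function() {
    mouse.down = false;
    last = null;
});

// lives next to update_canvas inside the ActionCable subscription
var send_segment = function(from, to) {
    this.perform('draw_segment', {from: from, to: to});
},

// on the guesser side, replay each segment as it arrives
function drawSegment(ctx, segment) {
    ctx.beginPath();
    ctx.moveTo(segment.from.x, segment.from.y);
    ctx.lineTo(segment.to.x, segment.to.y);
    ctx.stroke();
}

Each payload is then a few dozen bytes instead of a full base64 image.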

Well, I think the best approach would be to send the coordinates you are drawing. I have implemented a similar kind of thing; you can have a look:
http://connectboard.herokuapp.com/
Open it in two or more different browsers or computers and draw.

Related

Pass canvas from client (browser) to server (nodejs)

I have a canvas in my browser that displays a feed from my webcam. What I want to do is send the canvas data to my nodejs server, manipulate it, and send it back.
I can do it sending the canvas data via socket.io like so:
socket.emit('canvas_data', canvas.toDataURL());
And then rebuilding it on the nodejs server:
// (Image and createCanvas here presumably come from the node-canvas package)
const { createCanvas, Image } = require('canvas');

let img = new Image();
img.src = data; // this is the canvas_data from the first step
const canvas = createCanvas(640, 480);
const ctx = canvas.getContext('2d');
ctx.drawImage(img, 0, 0, 640, 480);
However this seems really wasteful as I'm taking an already rendered canvas, converting it to base64, sending it, and then rebuilding it on the other side.
The whole point of this is to use tfjs on the server side:
let converted = tfjs.browser.fromPixels(canvas);
If I just send the canvas from the first step:
socket.emit('canvas_data', canvas);
And then run tfjs:
let converted = tfjs.browser.fromPixels(data);
I get the following error:
Error: pixels passed to tf.browser.fromPixels() must be either an HTMLVideoElement, HTMLImageElement, HTMLCanvasElement, ImageData in browser, or OffscreenCanvas, ImageData in webworker or {data: Uint32Array, width: number, height: number}, but was object
Is there a more efficient way to accomplish this?
Using toDataURL is always going to be slow, as the browser needs to encode all the data before sending it.
Your second example is better; on the Node side you just need to create a tensor from the Buffer you receive on the socket (that would be the fastest way), no need to use higher-level functions such as fromPixels.
Take a look at https://github.com/vladmandic/anime/blob/main/sockets/anime.ts for client-side code and https://github.com/vladmandic/anime/blob/main/sockets/server.ts for server-side code.
Note that you may also need to account for channel depth (does your model work with RGBA or RGB) and/or any model-specific pre-processing normalization; that is handled in https://github.com/vladmandic/anime/blob/main/sockets/inference.ts.
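For example, a minimal sketch of that buffer approach, assuming a 640x480 RGBA canvas and tfjs-node on the server (the event name and shapes here are placeholders):

// client: send the raw pixel buffer instead of a data URL
const pixels = ctx.getImageData(0, 0, 640, 480);
socket.emit('canvas_data', pixels.data.buffer);

// server (tfjs-node): build the tensor straight from the received bytes
const tf = require('@tensorflow/tfjs-node');
socket.on('canvas_data', (data) => {
    const rgba = tf.tensor3d(new Uint8Array(data), [480, 640, 4], 'int32');
    const rgb = rgba.slice([0, 0, 0], [480, 640, 3]); // drop alpha if the model expects RGB
    // ... run the model on rgb ...
    rgba.dispose();
    rgb.dispose();
});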

visualize mediastream which is coming from a remote peer connection

For a few days now I've been trying to visualize an audio stream coming over WebRTC.
We have already written some visualizations which work fine for the normal local stream (Web Audio microphone usage).
Then I found some really interesting things on https://github.com/muaz-khan/WebRTC-Experiment/tree/master/ for streaming the microphone input between different browsers.
We need this to have the same audio data from one backend for all clients in the frontend.
Everything works fine and some tests showed that we can hear each other, so I thought it would also be no problem to visualize the incoming stream.
But: all the frequency data is empty (zero), even though we can hear each other.
Does anybody have a solution or hint for this? Thanks in advance!
This is my test for analysing the remote frequency data:
include these files first:
webrtc-experiment.com/firebase.js
webrtc-experiment.com/one-to-many-audio-broadcasting/meeting.js
var meeting = new Meeting('test');
var audioContext = new window.webkitAudioContext();
var analyser = audioContext.createAnalyser();

// on getting local or remote streams
meeting.onaddstream = function(e) {
    console.log(e.type);
    console.log(e.audio);
    console.log(e.stream);

    if (e.type === 'local') {
        // try it with the local stream, it works!
    } else {
        var source = audioContext.createMediaStreamSource(e.stream);
        source.connect(analyser);
        analyser.connect(audioContext.destination);

        console.log(analyser.fftSize);
        console.log(analyser.frequencyBinCount);
        analyser.fftSize = 64;
        console.log(analyser.frequencyBinCount);

        var frequencyData = new Uint8Array(analyser.frequencyBinCount);
        analyser.getByteFrequencyData(frequencyData);

        function update() {
            requestAnimationFrame(update);
            analyser.getByteFrequencyData(frequencyData);
            console.log(frequencyData);
        };
        update();
    }
};

meeting.check();
meeting.setup('test');
Note that analysing remote streams should work in Firefox, but it is known not to work in Chrome; see http://code.google.com/p/chromium/issues/detail?id=241543
A possible workaround could be to take the remote audio level value using the WebRTC statistics API.
If you go to chrome://webrtc-internals/ and select your page playing the remote stream, one of the ssrc_XXXX_recv entries will contain a dynamically changing audioOutputLevel value which you can use.
You can access the value using Chrome's PeerConnection statistics API, specifically the getStats() method.
A possible downside may be that this is the value of the actual sound the user hears from the video/audio element, so if the user mutes or changes the volume of the media element, it will affect the audioOutputLevel value.
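A rough sketch of polling such a value with the promise-based getStats() (assuming pc is the receiving RTCPeerConnection; stat types and field names have shifted across browser versions, and newer Chrome exposes audioLevel on the 'inbound-rtp' audio stats instead of the old ssrc_XXXX_recv reports):

async function pollRemoteAudioLevel(pc) {
    const report = await pc.getStats();
    report.forEach(function (stat) {
        if (stat.type === 'inbound-rtp' && stat.kind === 'audio') {
            console.log('remote audio level:', stat.audioLevel);
        }
    });
}

setInterval(function () { pollRemoteAudioLevel(pc); }, 500);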
Good luck! :-)
I found a simple "solution", at least for the time being:
I plugged a male/male audio cable from the microphone to the headphone jack on all my clients. Then I read the local microphone stream and, what a miracle, I can visualize what I'm hearing.
Not a good solution, but it does the job.
One question: is it possible to re-grab the destination as a stream in JavaScript? Then I would not need the audio cables.
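For what it's worth, the Web Audio API can hand a node's output back to you as a MediaStream via createMediaStreamDestination(); a sketch (untested against the meeting.js setup above):

// route whatever you are already playing into a destination node
// whose .stream property is a regular MediaStream you can reuse
var streamDestination = audioContext.createMediaStreamDestination();
source.connect(streamDestination);
var regrabbedStream = streamDestination.stream; // feed this to another analyser, element or peer connection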

restoring a saved canvas always fails and errors

I have a canvas paint app I'm working on that uses mouse clicks to draw. Simple enough. There's a listener on mouseup that saves the current drawing via getImageData and sets a session cookie recording that the user did in fact draw. Snippet:
var canvasData;

function save () {
    // get the data
    canvasData = context.getImageData(0, 0, canvas.width, canvas.height);
};
...
this.mouseup = function (ev) {
    if (tool.started) {
        tool.mousemove(ev);
        tool.started = false;
        save();
        document.cookie = 'redraw=true; path=/'
    }
};
The functionality I'm looking for is for the user to be able to leave the page, and come back to it, non-cached, and have the site see their cookie, read the drawing and map it using putImageData. Snippet:
function restore () {
    // restore the old canvas
    context.putImageData(canvasData, 0, 0);
};

var checking = readCookie('redraw')
if (checking) {
    restore();
};
But when I try to do that, I get error consoles saying "Image corrupt or truncated" and "TypeError: Value not an object" on the putImageData line.
When I tried just saving the canvas to memory (save to data, draw image):
var savedData = new Image();

function save () {
    savedData.src = canvas.toDataURL("image/png");
};

function restore () {
    context.drawImage(savedData, 0, 0);
};
I got "NS_ERROR_NOT_AVAILABLE: Component is not available" and "permission denied to access property 'toString'". Anyone know what I'm doing wrong? I'd put it on jsfiddle, but in this case that won't work so much, so here's the full. Thanks.
Cookies are too limited in size, so when you store data in them it will be truncated if it exceeds the limit of about 4 kB - which isn't much when it comes to base64-encoded images (data URLs).
Modern clients (browsers) support more recent storage methods. You can use the following mechanisms in the major browsers (see the link under each section for which browsers support what):
localStorage
Stores data as a key/value pair. You can store the image as a Data Url or save arrays holding the data if that is more convenient.
This interface is synchronous.
(There is sessionStorage as well, which is for temporary storage.)
Client-support:
http://caniuse.com/#search=localstorage
For the usage you are describing this interface is probably the simplest (if you store often you will probably prefer an async interface), example:
localStorage[myKey] = 'myData';
var myData = localStorage[myKey];
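Applied to the save/restore functions from the question, a minimal sketch could look like this (assuming the same canvas and context variables):

function save() {
    // a PNG data URL of the drawing usually fits well within localStorage's ~5 MB quota
    localStorage.setItem('drawing', canvas.toDataURL('image/png'));
}

function restore() {
    var url = localStorage.getItem('drawing');
    if (!url) return; // nothing saved yet
    var img = new Image();
    img.onload = function () {
        context.drawImage(img, 0, 0);
    };
    img.src = url;
}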
indexedDB
A little more complicated, but you can store data as Blob objects.
This interface is asynchronous.
Client-support:
http://caniuse.com/#search=indexeddb
Example:
http://www.html5rocks.com/en/tutorials/indexeddb/todo/
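For instance, a rough sketch of putting a canvas into IndexedDB as a Blob (database, store and key names are arbitrary; error handling omitted):

var request = indexedDB.open('paintApp', 1);

request.onupgradeneeded = function (e) {
    // create the object store on first run / version bump
    e.target.result.createObjectStore('drawings');
};

request.onsuccess = function (e) {
    var db = e.target.result;
    canvas.toBlob(function (blob) {
        db.transaction('drawings', 'readwrite')
          .objectStore('drawings')
          .put(blob, 'current');
    });
};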
file API
Currently only supported in Chrome; works as a file system with directories and so forth. Here you store everything as blobs.
This interface is asynchronous.
Client-support:
http://caniuse.com/#search=filesystem
Example:
http://www.html5rocks.com/en/tutorials/file/dndfiles/
Web SQL
Officially deprecated, but still in use (and will be for a while) with browsers such as Safari and IIRC Opera. Chrome has support as well, but not Firefox and IE.
This interface is asynchronous.
Client-support:
http://caniuse.com/#search=websql
Example:
http://html5doctor.com/introducing-web-sql-databases/
userData
userData is supported in older IE versions. This API is somewhat limited though in more than one sense.
Cookies
Cookies are sent to the server with each request, so the bigger the cookie(s), the slower the communication. They are limited to 4 kB.
Be aware that the client may or may not ask for the user's permission to store data on their computer for some of these interfaces - usually if you request more than 5 MB (Web SQL, IndexedDB). This is up to the client, though. It can seem intrusive to some users, so giving the user a heads-up that the app will ask for this first can be a good idea.

How long does a Blob persist?

I'm trying to write a fail-safe program that uses the canvas to draw very large images (60 MB is probably the upper range, while 10 MB is the lower range). I discovered long ago that calling the canvas's synchronous toDataURL function usually causes the page to crash in the browser, so I have adapted the program to use the toBlob method with a polyfill for cross-browser compatibility. My question is this: how long do Blob URLs created with the URL.createObjectURL(blob) method last?
I would like to know if there's a way to cache the Blob URL that will allow it to last beyond the browser session in case somebody wants to render part of the image at one point, close the browser, and come back and finish it later by reading the Blob URL into the canvas again and resuming from the point at which it left off. I noticed that this optional autoRevoke argument may be what I'm looking for, but I'd like a confirmation that what I'm trying to do is actually possible. No code example is needed in your answer unless it involves a different solution, all I need is a yes or no on if it's possible to make a Blob URL last beyond sessions using this method or otherwise. (This would also be handy if for some reason the page crashes and it acts like a "restore session" option too.)
I was thinking of something like this:
function saveCache() {
    var canvas = $("#canvas")[0];
    canvas.toBlob(function (blob) {
        /* if I understand correctly, this prevents it from unloading
           automatically after one asynchronous callback */
        var blobURL = URL.createObjectURL(blob, {autoRevoke: false});
        localStorage.setItem("cache", blobURL);
    });
}

// assume that this might be a new browser session
function loadCache() {
    var url = localStorage.getItem("cache");
    if (typeof url == "string") {
        var img = new Image();
        img.onload = function () {
            $("#canvas")[0].getContext("2d").drawImage(img, 0, 0);
            // clear the cache since it takes up a LOT of unused memory
            URL.revokeObjectURL(url);
            // remove the reference to the deleted cache
            localStorage.removeItem("cache");
            init(true); // cache successfully loaded, resume where it left off
        };
        img.onprogress = function (e) {
            // update progress bar
        };
        img.onerror = loadFailed; // notify user of failure
        img.src = url;
    } else {
        init(false); // nothing was cached, so start normally
    }
}
Note that I am not certain this will work the way I intend, so any confirmation would be awesome.
EDIT: just realized that sessionStorage is not the same thing as localStorage :P
Blob URL can last across sessions? Not the way you want it to.
The URL is a reference represented as a string, which you can save in localStorage just like any string. The location that URL points to is what you really want, and that won't persist across sessions.
When using URL.createObjectURL() in conjunction with the autoRevoke argument, the URL will persist until you call revokeObjectURL() or "till the unloading document cleanup steps are executed." (steps outlined here: http://www.w3.org/TR/html51/browsers.html#unloading-document-cleanup-steps)
My guess is that those steps are being executed when the browser session expires, which is why the target of your blobURL can't be accessed in subsequent sessions.
Some other discourse on this: How to save the window.URL.createObjectURL() result for future use?
The above leads to a recommendation to use the FileSystem API to save the blob representation of your canvas element. When requesting the file system the first time, you'll need to request PERSISTENT storage, and the user will have to agree to let you store data on their machine permanently.
http://www.html5rocks.com/en/tutorials/file/filesystem/ has a good primer on everything you'll need.
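A rough sketch of that flow, using the prefixed, Chrome-only API covered in that primer (quota size and file name here are arbitrary):

// ask for persistent quota, then write the canvas blob into the sandboxed file system;
// the user must grant the quota request
navigator.webkitPersistentStorage.requestQuota(64 * 1024 * 1024, function (grantedBytes) {
    window.webkitRequestFileSystem(window.PERSISTENT, grantedBytes, function (fs) {
        fs.root.getFile('canvas-cache.png', {create: true}, function (fileEntry) {
            fileEntry.createWriter(function (writer) {
                canvas.toBlob(function (blob) {
                    writer.write(blob);
                });
            });
        });
    });
});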

Large file upload with WebSocket

I'm trying to upload large files (at least 500MB, preferably up to a few GB) using the WebSocket API. The problem is that I can't figure out how to write "send this slice of the file, release the resources used then repeat". I was hoping I could avoid using something like Flash/Silverlight for this.
Currently, I'm working with something along the lines of:
function FileSlicer(file) {
    // randomly picked 1MB slices,
    // I don't think this size is important for this experiment
    this.sliceSize = 1024 * 1024;
    this.slices = Math.ceil(file.size / this.sliceSize);
    this.currentSlice = 0;

    this.getNextSlice = function() {
        var start = this.currentSlice * this.sliceSize;
        var end = Math.min((this.currentSlice + 1) * this.sliceSize, file.size);
        ++this.currentSlice;
        return file.slice(start, end);
    }
}
Then, I would upload using:
function Uploader(url, file) {
    var fs = new FileSlicer(file);
    var socket = new WebSocket(url);

    socket.onopen = function() {
        for (var i = 0; i < fs.slices; ++i) {
            socket.send(fs.getNextSlice()); // see below
        }
    }
}
Basically, this returns immediately, bufferedAmount is unchanged (0), and it keeps iterating and adding all the slices to the queue before attempting to send them; there's no socket.afterSend to allow me to queue it properly, which is where I'm stuck.
Use web workers for large file processing instead of doing it in the main thread, and upload chunks of file data using file.slice().
This article helps you handle large files in workers; change the XHR send to a WebSocket in the main thread.
//Messages from worker
function onmessage(blobOrFile) {
ws.send(blobOrFile);
}
//construct file on server side based on blob or chunk information.
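For example, a rough sketch of the worker side (the slice size and message protocol here are assumptions):

// worker.js - read the File in fixed-size slices and post each one
// back to the main thread, which forwards it over the WebSocket
self.onmessage = function (e) {
    var file = e.data;
    var sliceSize = 1024 * 1024;
    for (var start = 0; start < file.size; start += sliceSize) {
        self.postMessage(file.slice(start, start + sliceSize));
    }
};

// main thread
var worker = new Worker('worker.js');
worker.onmessage = function (e) {
    ws.send(e.data); // e.data is one Blob slice
};
worker.postMessage(file);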
I believe the send() method is asynchronous, which is why it returns immediately. To make it queue, you'd need the server to send a message back to the client after each slice is uploaded; the client can then decide whether it needs to send the next slice or an "upload complete" message back to the server.
This sort of thing would probably be easier using XMLHttpRequest(2); it has callback support built in and is also more widely supported than the WebSocket API.
In order to serialize this operation you need the server to send you a signal every time a slice is received and written (or an error occurs); this way you could send the next slice in response to the onmessage event, pretty much like this:
function Uploader(url, file) {
    var fs = new FileSlicer(file);
    var socket = new WebSocket(url);

    socket.onopen = function() {
        socket.send(fs.getNextSlice());
    }

    socket.onmessage = function(ms) {
        if (ms.data == "ok") {
            fs.slices--;
            if (fs.slices > 0) socket.send(fs.getNextSlice());
        } else {
            // handle the error code here.
        }
    }
}
You could use https://github.com/binaryjs/binaryjs or https://github.com/liamks/Delivery.js if you can run node.js on the server.
EDIT: The web world - browsers, firewalls, proxies - has changed a lot since this answer was written. Right now, sending files using WebSockets can be done efficiently, especially on local area networks.
Websockets are very efficient for bidirectional communication, especially when you're interested in pushing information (preferably small) from the server. They act as bidirectional sockets (hence their name).
WebSockets don't look like the right technology to use in this situation, especially given that using them adds incompatibilities with some proxies, browsers (IE) or even firewalls.
On the other hand, uploading a file is simply sending a POST request to a server with the file in the body. Browsers are very good at that and the overhead for a big file is really close to nothing. Don't use WebSockets for that task.
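For reference, a minimal sketch of that plain-POST approach (the /upload endpoint is a placeholder):

// the browser streams the File for you; no manual slicing needed
var xhr = new XMLHttpRequest();
xhr.open('POST', '/upload');
xhr.upload.onprogress = function (e) {
    if (e.lengthComputable) {
        console.log(Math.round(100 * e.loaded / e.total) + '%');
    }
};
xhr.onload = function () {
    console.log('upload finished');
};
xhr.send(file);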
I think this socket.io project has a lot of potential:
https://github.com/sffc/socketio-file-upload
It supports chunked upload, progress tracking and seems fairly easy to use.
