Is there a way to capture images from a webcam with JavaScript?
Nope - think of the security implications!
It is possible with Flash, but the user will see a popup requesting access to their webcam.
If the webcam had a web interface, then in theory it would be possible to just slap an image tag into a page somewhere and point it directly at the cam's snapshot interface:
<img src="http://address.of.webcam.example.com/webcam/capture" />
But otherwise, no. Standard JavaScript has no API for accessing a webcam. There's no
var wc = new WebCam();
img = wc.capture();
type calls you can do.
The situation has changed from when this question was originally posted. The getUserMedia API was introduced to allow things like capturing webcam images. You can find tutorials and plugins demonstrating it.
But MDN now says the Navigator.getUserMedia API is deprecated in favour of the experimental API MediaDevices.getUserMedia. The getUserMedia tutorials and plugins don't work on iOS devices - they just don't support it.
The answer at the moment seems to be that there is an HTML API for it, but browser support is patchy and the API is possibly on its way out. You can only use it in Firefox and Microsoft Edge, and in Chrome only from an https domain.
I would like to revive this question and ask if anyone knows of any web API that will successfully capture webcam images in all major browsers and devices.
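For later readers: where the MediaDevices API is available, a single-frame capture can be sketched as below. The helper names are mine; `captureFrame` is browser-only, while the detection helper deliberately takes a navigator-like object as a parameter so it can be exercised outside a browser.

```javascript
// Feature-detect which webcam API (if any) a navigator-like object exposes.
// Taking `nav` as a parameter instead of using the global keeps it testable.
function detectWebcamApi(nav) {
  if (nav.mediaDevices && nav.mediaDevices.getUserMedia) return 'mediaDevices';
  if (nav.getUserMedia || nav.webkitGetUserMedia || nav.mozGetUserMedia) return 'legacy';
  return 'none';
}

// Grab one frame from a <video> element that is already playing a
// getUserMedia stream, as a PNG data URL (browser-only).
function captureFrame(video) {
  const canvas = document.createElement('canvas');
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext('2d').drawImage(video, 0, 0);
  return canvas.toDataURL('image/png');
}
```

In a supporting browser you would first attach the stream with `navigator.mediaDevices.getUserMedia({ video: true }).then(stream => { video.srcObject = stream; })` and then call `captureFrame(video)` once the video is playing.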
Related
Some browsers (mobile Mi Browser, for instance) don't support WebRTC - they have no RTCPeerConnection API. So users of your WebRTC web app have to open it in a different browser.
Is there a way to make your WebRTC app work without an explicit browser-change action from the user, especially on a mobile device?
I tried to investigate the following:
Deep link. It looks like we can't redirect the user to another browser using a deep link (I haven't found a Chrome deep link for mobile).
Send WebRTC sources to the browser / use a third-party WebRTC lib. This won't work either; you need WebRTC support in the browser's source code.
WebRTC is a framework based on a set of standards. It includes not only the capability to get information about the user's input/output devices, but also a set of network protocols based on UDP (from getting the client's IP to transferring arbitrary data through a data channel using the SCTP protocol). So, as you may already have guessed, it's impossible to support it in a browser that doesn't ship it, which is why point (2) will not work.
As for point (1 - open Chrome): on iOS there is a custom URL scheme for opening a URL in Chrome ("googlechromes://stackoverflow.com"), but it's better to tell the user explicitly that the current browser doesn't support the required functionality, and to provide download links for a list of popular browsers (Chrome, Firefox, etc.); those sites will then redirect the user to the proper store for the native app.
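To make point (1) concrete, here is a sketch of the detection plus the iOS deep link. `hasWebRtc` and `toChromeUrl` are hypothetical helper names; `googlechromes://` is the scheme Chrome for iOS registers for https URLs.

```javascript
// Returns true if a window-like object exposes an RTCPeerConnection
// implementation (standard or webkit-prefixed).
function hasWebRtc(win) {
  return !!(win.RTCPeerConnection || win.webkitRTCPeerConnection);
}

// Rewrite an https URL into a Chrome-for-iOS deep link.
// (http URLs would use googlechrome:// instead; this sketch only
// handles the https case.)
function toChromeUrl(httpsUrl) {
  if (!/^https:\/\//.test(httpsUrl)) {
    throw new Error('expected an https URL');
  }
  return httpsUrl.replace(/^https:\/\//, 'googlechromes://');
}
```

In practice you would check `hasWebRtc(window)` on load and, when it fails, show the explanatory message with the download links rather than silently redirecting.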
I would like to capture either part or the whole screen and record it as a video from a web page in javascript.
Currently, I can record video from a web cam with the built in MediaRecorder but I would like to know if it is possible to get screen output and use that as a stream for the MediaRecorder?
I'd like to know if there is a standard way to do this without using any 3rd party libraries? (I can record audio/web cam video in almost all the browsers as of 2018)
You can record the screen or individual windows only through browser extensions (in Chrome, for instance), which, as you can see, request specific permissions from the browser.
Alternatively, you can record content inside your web app using MediaRecorder, but unfortunately it has pretty limited support.
As you pointed out, streaming from the webcam is widely supported today.
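Browsers have since begun shipping navigator.mediaDevices.getDisplayMedia, which supplies exactly such a screen stream to MediaRecorder without an extension. A sketch, assuming a browser that supports both APIs (`recordScreen` and `pickMimeType` are my names):

```javascript
// Prompt for a screen/window, record it for durationMs, and resolve
// with a webm Blob (browser-only).
async function recordScreen(durationMs) {
  const stream = await navigator.mediaDevices.getDisplayMedia({ video: true });
  const recorder = new MediaRecorder(stream);
  const chunks = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);
  return new Promise((resolve) => {
    recorder.onstop = () => resolve(new Blob(chunks, { type: 'video/webm' }));
    recorder.start();
    setTimeout(() => recorder.stop(), durationMs);
  });
}

// Pure helper: pick the first recordable container/codec. In the
// browser, pass MediaRecorder.isTypeSupported as the predicate.
function pickMimeType(candidates, isSupported) {
  for (const type of candidates) {
    if (isSupported(type)) return type;
  }
  return ''; // empty string lets MediaRecorder pick its default
}
```

The predicate parameter on `pickMimeType` exists purely so the codec choice can be tested without a browser.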
While visiting Hulu's website, I encountered this message from the browser:
I have never seen this before. Is this part of some new HTML5 API that everyone can use, or is it part of a vendor-specific API, i.e. chrome.*? I don't think this was doable before, but the HTML5 spec might have changed.
This is related to the playback of DRM media which is encrypted. There is an article here on how Netflix rolled out their protected playback for browsers.
Google has their own open source library called Shaka Player, which allows you to implement your own DRM media solution. You might be interested in the underlying web APIs that make this possible in the browser by looking under Media here.
And of course, you can control which sites are allowed to ID your device and play protected content by visiting this page in your Chrome config and looking under Protected Content.
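If you go the Shaka Player route, the DRM setup mostly boils down to pointing the player at a license server. A minimal sketch - the license URL is a placeholder, while `com.widevine.alpha` is Widevine's real key-system id:

```javascript
// Minimal Shaka Player DRM configuration: map a key system to the
// license server that will answer its license requests.
const drmConfig = {
  drm: {
    servers: {
      // Placeholder URL - substitute your own Widevine license endpoint.
      'com.widevine.alpha': 'https://license.example.com/widevine',
    },
  },
};

// In the browser (with the Shaka library loaded) you would then do:
//   const player = new shaka.Player(videoElement);
//   player.configure(drmConfig);
//   await player.load(manifestUri);
```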
I would like to make a web application where people can add recorded sounds and samples to a timeline.
I want it to output one sound file (approximately 3 min long) which will be sent to the server.
Now I would like to do this with the HTML5 Audio API, and found out that I could do this with AudioContext. But AudioContext is only supported in Chrome.
Now, I dislike Flash, and I wanted to ask if there is any way to do this with HTML5 with decent browser support (so the newest Chrome, IE and Firefox).
Some thoughts:
I could record audio using the HTML5 audio API and the user can add it to the timeline. When finished, the client will upload all the audio files to the server, including their positions on the timeline. The server can then combine the audio files into one. This is a solution, but I would prefer to do this kind of work on the client side.
At the moment, in my opinion, it is not possible (if it has to be HTML5 and supported by IE and Firefox); see the list of browsers that support the Audio API. But this information could be outdated already (these browsers update so frequently).
You could wait, serve only Chrome at first, and hope the other browsers catch up (IE might be a problem). Or you could use Java (if you don't like Flash). The other technology out there is Silverlight, but it is "dead", so I wouldn't recommend it.
I hope my input helps a bit.
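For the record, the client-side mixdown the question asks for can now be sketched with OfflineAudioContext, which renders a timeline faster than real time. The function names and the `clips` shape are my assumptions:

```javascript
// Mix decoded clips into one buffer entirely on the client.
// `clips` is assumed to be an array of { buffer: AudioBuffer, startSec: number }.
// Browser-only: OfflineAudioContext is part of the Web Audio API.
function mixTimeline(clips, totalSec, sampleRate = 44100) {
  const ctx = new OfflineAudioContext(2, Math.ceil(totalSec * sampleRate), sampleRate);
  for (const clip of clips) {
    const src = ctx.createBufferSource();
    src.buffer = clip.buffer;
    src.connect(ctx.destination);
    src.start(clip.startSec); // schedule at the clip's timeline offset
  }
  return ctx.startRendering(); // resolves with the mixed AudioBuffer
}

// Pure helper: timeline position in seconds -> sample frame index.
function secondsToFrames(sec, sampleRate) {
  return Math.round(sec * sampleRate);
}
```

The rendered AudioBuffer would still need encoding (e.g. to WAV) before upload, but the mixing itself no longer has to happen on the server.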
So I am having a look into Firefox OS right now. One thing I would like to try is to manipulate the device camera's live feed using canvas et al.
From what I can see in the blog posts (like this one) and the code in the boilerplate app this is always done using a MozActivity, meaning that the user is leaving the application, takes a picture and passes this picture back to the application, where I could post-process it.
But for live manipulation I would need to have a live camera feed inside my App, just like you would do using getUserMedia when accessing a computer's webcam. getUserMedia doesn't seem to be supported in the Firefox OS simulator though.
Is there any workaround to this or some API that I am missing?
I'll answer this with the response I got from the Mailing List (the answer is "not yet"):
WebRTC will (hopefully) land in Firefox OS 1.2 (as reference, the
initial launch is on 1.0.1, closely followed but dependent on carriers
is 1.1). WebRTC has a lot of low-level hardware dependencies, so it
might take longer.
You can follow along on this meta bug and its dependencies:
https://bugzilla.mozilla.org/show_bug.cgi?id=750011
Thanks for the answer, Mozilla, can't wait for 1.2!
Refer to the thread "getUserMedia for Firefox OS":
WebRTC for audio is supported in Firefox OS 1.2.
But WebRTC for video (e.g. a live camera feed) has just landed in Firefox OS 1.4, which is still in the development phase. So be patient; it will be available this year.