Sorry if the question is very stupid.
I am trying to show the stream from one person to another user using JS.
I have tried putting it in a cookie, but it doesn't work even then, even though the object assigned to the video is the same on the other page.
File 1
var video = document.querySelector("#videoElement");
var x=document.cookie
console.log(x)
video.srcObject = x;
File 2
var video = document.querySelector("#videoElement");
if (navigator.mediaDevices.getUserMedia) {
  navigator.mediaDevices.getUserMedia({ video: true })
    .then(function (stream) {
      video.srcObject = stream;
      document.cookie = video.srcObject;
      console.log(stream, video.srcObject);
    })
    .catch(function (err0r) {
      console.log("Something went wrong!");
    });
  console.log(stream, video.srcObject);
}
For now I would just like to show it on two pages, but for the future, what language or technology should I use to store the video? If you know one, please share it.
A cookie is not a centralized universal camera access repository on the web. Thank goodness.
A MediaStream is a local resource object representing active camera use, not a shareable URL.
This object lives solely in your local JS page, and isn't addressable on the web.
Since it doesn't live on any server, transporting the graphical bits from your camera to a friend's system requires quite a bit of heavy lifting. This includes establishing an RTCPeerConnection, which is the domain of WebRTC:
navigator.mediaDevices.getUserMedia({ video: true })
  .then(function (stream) {
    const iceServers = [{urls: "stun:stun.l.google.com:19302"}];
    const pc = new RTCPeerConnection({iceServers});
    for (const track of stream.getTracks())
      pc.addTrack(track, stream);
    /* lots of other WebRTC negotiation code */
  });
You'll also typically need a server of some kind, both to solve discovery (i.e. to give the peers a point of contact) and to act as a web socket server for exchanging the critical offer/answer session descriptions that are necessary for connection establishment between the two peers.
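To make that concrete, here is a minimal sketch of such an exchange over a WebSocket; the relay at wss://example.com/signal is a hypothetical server that simply forwards these JSON messages to the other peer:

const pc = new RTCPeerConnection({iceServers: [{urls: "stun:stun.l.google.com:19302"}]});
const ws = new WebSocket("wss://example.com/signal"); // hypothetical relay
const send = (msg) => ws.send(JSON.stringify(msg));

// The caller creates and sends an offer once tracks have been added.
pc.onnegotiationneeded = async () => {
  await pc.setLocalDescription(await pc.createOffer());
  send({ sdp: pc.localDescription });
};

// Trickle ICE candidates to the other side as they are gathered.
pc.onicecandidate = ({ candidate }) => candidate && send({ candidate });

// React to whatever the other peer sends back.
ws.onmessage = async ({ data }) => {
  const msg = JSON.parse(data);
  if (msg.sdp) {
    await pc.setRemoteDescription(msg.sdp);
    if (msg.sdp.type === "offer") {
      await pc.setLocalDescription(await pc.createAnswer());
      send({ sdp: pc.localDescription });
    }
  } else if (msg.candidate) {
    await pc.addIceCandidate(msg.candidate);
  }
};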
Perhaps the simplest proof of concept is this cut'n'paste demo, which lets you and a friend exchange the required WebRTC offer/answer session descriptions manually, letting you establish a connection without any server, to see and talk to each other.
That has about a 70% chance of working. If you're both behind symmetric NATs (most mobile networks), then it gets harder still (you'll need a TURN server, which costs money).
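If you do end up needing one, a TURN server is just one more entry in the iceServers list; the hostname and credentials below are placeholders:

const pc = new RTCPeerConnection({
  iceServers: [
    { urls: "stun:stun.l.google.com:19302" },
    {
      urls: "turn:turn.example.com:3478", // placeholder TURN server
      username: "user",                   // placeholder credentials
      credential: "pass"
    }
  ]
});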
Related
I have created a WebRTC connection where I can send messages and video to the other party, but if one of the two disconnects, there can be no further communication, even if I accept again the answer that the other party sent me at the beginning; it still doesn't work.
Is it possible to resume that communication without having to redo the connection process and send the SDP to the other party again?
Part of the code I'm using:
const connection = new RTCPeerConnection({
  iceServers: [{ urls: 'stun:stun.l.google.com:19302' }]
});

async function start() {
  const localStream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  localStream.getTracks().forEach(track => {
    connection.addTrack(track, localStream);
  });
}
If the browser window is closed, the connection is dead for good. You cannot recover from that without going through the signalling process again.
The original signalling process exchanges messages that include randomized IDs and TCP/UDP ports. These lose their meaning the moment the object is destroyed.
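That said, while the page (and the RTCPeerConnection) is still alive, you can at least watch for failure and trigger an ICE restart through your signalling channel; a rough sketch, where sendViaSignaling is an assumed helper:

connection.onconnectionstatechange = async () => {
  if (connection.connectionState === "failed") {
    // An ICE restart reuses the same RTCPeerConnection but gathers fresh
    // candidates; the new offer still has to reach the peer through signalling.
    const offer = await connection.createOffer({ iceRestart: true });
    await connection.setLocalDescription(offer);
    sendViaSignaling({ sdp: connection.localDescription }); // assumed helper
  }
};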
I'm working on implementing a distributed hash table using WebRTC and IndexedDB, and I ran into an issue I can't quite find the correct solution for. I have a simple application I've been using to explore WebRTC, and I'm successfully sending data over a data channel.
It would be impossible to keep a connection alive for every saved node, but it would also be unwieldy to re-signal every time. Normally for this you'd just keep an IP, port, and ID saved, but WebRTC needs more than that; below is the code I'm using to try to retrieve the connection, with information from the previous session stored in the "session" object:
function retrieve() {
  console.log("Retrieving peer connection");
  pc = new RTCPeerConnection(pcConfig);
  pc.localStream = session.dataChannel;
  pc.onicecandidate = () => {};
  pc.onremovestream = (e) => console.log("Remote stream removed.");
  if (initiator) {
    pc.createOffer().then((sessionDescription) => {
      pc.setLocalDescription(sessionDescription);
    });
  } else {
    pc.createAnswer().then((sessionDescription) => {
      pc.setLocalDescription(sessionDescription);
    });
  }
  pc.setRemoteDescription(session.remoteDescription);
  cands = session.candidates;
  cands.map((candidate) => {
    pc.addIceCandidate(new RTCIceCandidate({
      sdpMLineIndex: candidate.label,
      candidate: candidate.candidate,
    }));
  });
}
This actually works, but it causes a few errors which I worry may indicate problems. The first of these occurs on both ends:
InvalidStateError: Cannot set remote answer in state stable
The second only occurs on the offering side, in my tests:
InvalidStateError: Cannot add ICE candidate when there is no remote SDP
For some reason, the data stream even works fine if I don't make an offer or answer at all, and just reinstate the candidates; the error in that case is the latter error on both sides.
What's going on here? Do I need to worry about it, or can I forge on? And as a side question, is this a silly way of solving this problem, with an obvious missed alternative?
Edit: I've now also tried a bunch of ways of reordering these steps, according to these errors, and haven't gotten much in the way of results; however, this sort of approach does allow real connections to reconnect and doesn't cause problems with the signaling even when I change networks on my phone, so I'll carry on testing the limits of this approach. I'd still like to know exactly what causes these errors, though, since they don't necessarily seem to indicate what they appear to indicate.
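For context, those errors line up with the signaling state machine: a brand-new RTCPeerConnection starts in the "stable" state, in which a remote answer is not expected, and candidates are only accepted once a remote description is in place. A sketch of the order it expects on the answering side, assuming a fresh (not stored) offer and candidates and an enclosing async function:

const pc = new RTCPeerConnection(pcConfig);

// 1. Apply the remote offer first; this moves the signaling state
//    from "stable" to "have-remote-offer".
await pc.setRemoteDescription(offer);

// 2. Only now can createAnswer() produce a valid answer.
await pc.setLocalDescription(await pc.createAnswer());

// 3. Candidates can only be added once a remote description is set.
for (const candidate of candidates) {
  await pc.addIceCandidate(candidate);
}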
I stumbled on a weird issue in a WebRTC webapp. Here's the setup:
Client A and client B each send audio via a send-only WebRTC connection to an SFU.
Client C receives the audio streams from clients A and B via two receive-only connections to that same SFU and adds them to two different "audio" elements. The routing between these send and receive connections works properly.
Here's the problem:
On refreshing the page, client C sometimes hears audio from both client A and B, but most of the time client C only hears audio from either A or B, at random.
It happens in both Firefox and Chrome.
Both connections are receiving data (see the "bitsReceivedPerSecond" graph), but only one connection is outputting audio. Here's an example where C could hear A but not B:
Connection Client A -> C: [stats graph omitted]
Connection Client B -> C: [stats graph omitted]
My understanding of these graphs is that the raw WebRTC connection is working fine (data is sent and received) but somehow a connection does not output audio randomly.
Does anyone have a clue how this can happen?
Here is the "ontrack" callback for adding the streams to the audio elements. The Logs do appear correctly for each connection.
gotRemoteStream(e) {
  Logger.log("Remote Streams: #" + e.streams.length);
  if (this.audioElement.srcObject !== e.streams[0]) {
    Logger.log("Received remote Stream with tracks to audio: " + this.audioElement.id);
    this.audioElement.srcObject = e.streams[0];
  }
}
A single audio element can only play one audio track at a time.
You said there were two incoming audio tracks, so if this.audioElement is the same element, then each call to gotRemoteStream will race setting srcObject, one overwriting the other.
This is most likely why you only hear one or the other.
The simplest solution might be to key off the stream associations sent, since they'll tend to differ:
const elements = {};

gotRemoteStream({streams: [stream]}) {
  if (!elements[stream.id]) {
    elements[stream.id] = this.audioElementA.srcObject ? this.audioElementB
                                                       : this.audioElementA;
  }
  elements[stream.id].srcObject = stream;
}
This should work for two incoming connections. More than that is left as an exercise.
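For what it's worth, one way to generalize beyond two is to create an audio element per incoming stream id; a sketch, assuming the elements can simply be appended to the page:

const elements = {};

function gotRemoteStream({streams: [stream]}) {
  if (!elements[stream.id]) {
    const audio = document.createElement("audio");
    audio.autoplay = true;
    document.body.appendChild(audio); // or some dedicated container
    elements[stream.id] = audio;
  }
  elements[stream.id].srcObject = stream;
}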
I'm programming using PeerJS and, because PeerJS is already very outdated, I'm testing everything on Firefox version 38. I know that's not the best thing to do, but I don't have time for more. So, I'm trying to do the following:
Peer1 transmits audio and video to Peer2.
Peer2 wants to transmit to Peer3 the video it receives from Peer1, but not the audio. Peer2 wants to send its own audio instead.
Basically, Peer3 will receive the video from Peer1 (with Peer2 relaying it) and the audio from Peer2, and it should all arrive together, as if it were a normal WebRTC call.
I do this like this:
var mixStream = remoteStream;
var audioMixStream = mixStream.getAudioTracks();
mixStream = mixStream.removeStream(audioMixStream);
var mixAudioStream = localStream;
var audioMixAudioStream = mixAudioStream.getAudioTracks();
mixStream.addTrack(audioMixAudioStream);
//Answer the call automatically instead of prompting user.
call.answer(window.audioMixAudioStream);
But I'm getting an error on removeStream. Maybe I will get more errors after that one, but right now I'm stuck on this one.
Can someone please tell me what I should use instead of removeStream?
P.S.: I already tried removeTrack too and also got an error.
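In case it helps, with the standard API (rather than PeerJS's wrappers) the usual way to get "Peer1's video plus Peer2's audio" is to build a new MediaStream from the individual tracks rather than removing anything from the received stream. A sketch, assuming remoteStream and localStream already exist and a browser recent enough to have the MediaStream constructor (Firefox 38 may not):

// Combine the video track(s) of the relayed remote stream with the
// audio track(s) of the local stream.
const mixedStream = new MediaStream([
  ...remoteStream.getVideoTracks(),
  ...localStream.getAudioTracks(),
]);

call.answer(mixedStream);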
I've asked a question about this before, but without any luck.
I'm having problems following this tutorial: https://www.pubnub.com/blog/2014-10-21-building-a-webrtc-video-and-voice-chat-application/
I've written the code and it works flawlessly on the local network, but when I try to connect with a remote client (i.e. not on the same network) the code doesn't work anymore. It just shows a black screen where the video from the client should be.
phone.receive(function(session) {
  session.connected(function(session) {
    $("#vid-box").append(session.video); // outputs a black screen
  });
  session.ended(function(session) { alert("Call ended: " + session.number); });
});
I've even contacted PubNub, but they were unable to help.
Does anyone have any ideas?
WebRTC Double NAT Oh no!
⚠️ TURN Server NOT PROVIDED ⚠️
Make sure you are not behind NAT network forwarding. Otherwise you'll need TURN servers (not provided). TURN servers broker network traffic and make video conversations possible on constrained networks. Most mobile providers use basic open routing (non-NAT). Most corporate firewalls have at least one NAT.
TURN: streams the binary video. Needed for NATed networks, but not otherwise required.
STUN: resolves the IP address. Peer-to-peer discovery.
PubNub: sends the IP address to the other client.
STUN provides the IP address, but there is nothing in WebRTC itself that provides a means to exchange that IP address between the connecting clients. This is where PubNub comes in.
So, I've finally managed to make it work.
I simply added TURN/STUN servers to the PubNub call function, following the tutorial mentioned here: https://xirsys.com/pubnub-part-2/
Thanks a lot @PubNub for your suggestion.
//Fetch ICE (STUN/TURN) servers from the Xirsys service.
//Note: this uses a synchronous request (async: false), which blocks the page
//until the response arrives.
function get_xirsys_servers() {
  var servers;
  $.ajax({
    type: 'POST',
    url: 'https://service.xirsys.com/getIceServers',
    data: {
      room: 'default',
      application: 'default',
      domain: 'www.thedomainyoucreated.com',
      ident: 'yourxirsysident',
      secret: 'secret-token-from-xirsys-dash',
    },
    success: function(res) {
      res = JSON.parse(res);
      if (!res.e) servers = res.d.iceServers;
    },
    async: false
  });
  return servers;
}
//Request to connect to a remote user.
function makeCall(remoteId) {
  if (!window.phone) alert("Login First!");
  else if (!remoteId) alert("The call id is missing or invalid!");
  else phone.dial(remoteId, get_xirsys_servers());
}