I'm currently working on a library that sends a MediaStreamTrack from one Electron window to another via an RTCPeerConnection.
The problem is that everything works fine except remote tracks. A MediaStreamTrack from my camera, created on localhost, can be sent without any problem, but when I try to send a track that I'm receiving from an external server, it only shows black frames.
It's the same problem as here I guess.
What I've checked
The track is being received (I checked chrome://webrtc-internals) and the video element is in a playing state. I don't see any errors, neither in the console nor in webrtc-internals.
Additionally, the stream/track is enabled and not muted. Using captureStream() on the source window gives the same result.
Capturing and streaming a video element with an external source is working without any problems.
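For illustration, a minimal sketch of the kind of setup I'm describing (element IDs and the signaling side are placeholders, not my actual code):

// Capture the playing video element and hand its video track to the peer connection.
const video = document.getElementById('remote-video');   // plays the external stream
const stream = video.captureStream();
const [track] = stream.getVideoTracks();

const pc = new RTCPeerConnection();
pc.addTrack(track, stream);   // only black frames arrive on the other side for this track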
Possible reason
I think the problem might be that the initial stream comes from a domain other than localhost, which makes it impossible to restream it. The W3C standard describes it as follows:
A track could have contents that are inaccessible to the application. This can be due to anything that would make a track CORS cross-origin. These tracks can be supplied to the addTrack() method, and have an RTCRtpSender created for them, but content MUST NOT be transmitted. Silence (audio), black frames (video) or equivalently absent content is sent in place of track content.
Source (10.)
Idea
I'm not sure whether the "possible reason" I mentioned above is the actual problem, but one idea I had was to rewrite the request and response headers to match localhost. It didn't work; I guess it doesn't affect WebRTC connections.
Electron hooks I used:
session.defaultSession.webRequest.onBeforeSendHeaders();
session.defaultSession.webRequest.onHeadersReceived();
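Roughly, this is how I tried to use them (a sketch; the header values are just what I experimented with, not something that is known to work):

const { session } = require('electron');

// Rewrite request headers (e.g. the Origin) before they are sent.
session.defaultSession.webRequest.onBeforeSendHeaders((details, callback) => {
  const requestHeaders = { ...details.requestHeaders, Origin: 'http://localhost' };
  callback({ requestHeaders });
});

// Rewrite response headers so the content looks CORS-accessible.
session.defaultSession.webRequest.onHeadersReceived((details, callback) => {
  const responseHeaders = {
    ...details.responseHeaders,
    'Access-Control-Allow-Origin': ['*'],
  };
  callback({ responseHeaders });
});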
Related
I am currently experimenting with the google-castable-video component from the Google Web Components collection for the Polymer library. So far there are no major issues, but when I try to stream a video that has a blob URL as its source, the stream on the Chromecast starts buffering and then immediately stops without showing a frame.
Now I am asking myself whether it is even possible to use a URL like blob:http%3A//127.0.0.1%3A8889/fd3e3425-f5ea-48f1-a380-5febf0f071ad with the Chromecast SDK. If not, are there any alternative ways to load a local video and stream it with this web component? (Excluding existing tools like Videostream, etc.)
Any help appreciated.
The URL you have provided points to the local loopback address, so when the Chromecast receives it, it tries to load the content from its own local device, which is clearly not what you want. If you want to serve local content from your sender side, you need to embed a local web server in your sender and serve the content through that web server; you can search Stack Overflow for prior posts on serving local content to Chromecast.
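As a rough sketch of what such an embedded server could look like (Node-style; the path, port and content type are placeholders):

const http = require('http');
const fs = require('fs');

// Serve a single local video file over plain HTTP so the Chromecast can fetch it
// from the sender's LAN IP (e.g. http://192.168.1.20:8890/) instead of a blob/loopback URL.
http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'video/mp4' });
  fs.createReadStream('/path/to/local/video.mp4').pipe(res);
}).listen(8890);

A real implementation would also have to answer HTTP Range requests so the Chromecast can seek, and the URL handed to the cast SDK must use the sender's LAN address rather than 127.0.0.1.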
I use the HTML5 video tag to play a video, and then I use a canvas element to draw video frames. The video is from a remote source.
There is no problem drawing the frames, but there is a problem getting the image data from the canvas. I want to get the image data to create an img element, or to send the data to a server to create an image, but it is not possible because the browser considers the canvas operations insecure (the canvas is tainted).
When I use a video from the same domain, there is no problem.
The only way I have found is to make a script on the server that fetches the remote video and outputs it, and to use this script as the source for the video element. But that is not a great idea, because it puts additional load on the server.
I am not sure I have properly understood the articles about "cross-origin". I think the server where the remote video is located has to send headers like "Access-Control-Allow-Origin: *", but if I don't have access to that server, for example if I want to use a YouTube video, is there no way to do it?
The simple answer is, no, unfortunately.
It can't be done if you don't have access to the remote server to allow cross-origin use and can't ask the administrator of that site to enable it (which is very unlikely to happen with a site like YouTube).
Generally, you can try to request cross-origin use by supplying the crossOrigin attribute:
<video ... crossOrigin="anonymous">...</video>
If it's allowed, you will get actual pixel data when you read the image data.
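For example, a sketch of the pattern (assuming the remote server really does send the CORS headers; the URL is a placeholder):

const video = document.createElement('video');
video.crossOrigin = 'anonymous';                 // request CORS access to the video
video.src = 'https://example.com/clip.mp4';

video.addEventListener('loadeddata', () => {
  const canvas = document.createElement('canvas');
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;

  const ctx = canvas.getContext('2d');
  ctx.drawImage(video, 0, 0);

  // Both calls below only succeed when the video was delivered with CORS headers;
  // otherwise the canvas is tainted and they throw a SecurityError.
  const pixels = ctx.getImageData(0, 0, canvas.width, canvas.height);
  const dataUrl = canvas.toDataURL('image/png');
});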
Solutions
One is to copy the video to your own server and stream it from there.
Two, use a proxy server or script as you already tried with your server.
And yes, both cases will impact traffic on your server (and there is a possible legal aspect to this regarding copyright etc.)
Sorry, no way around it.
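If you go the proxy route, the idea is simply a small pass-through on your own origin that relays the remote video and adds a permissive CORS header; a rough sketch (Node, with no URL whitelisting or error handling):

const http = require('http');
const https = require('https');

// Relay a remote video through your own server so a canvas drawing it is not tainted.
http.createServer((req, res) => {
  const remoteUrl = decodeURIComponent(req.url.replace('/proxy?url=', ''));
  https.get(remoteUrl, (remote) => {
    res.writeHead(remote.statusCode, {
      'Content-Type': remote.headers['content-type'],
      'Access-Control-Allow-Origin': '*',
    });
    remote.pipe(res);
  });
}).listen(8080);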
The problem
I made a receiver application that just shows a video in a loop on the Chromecast. The problem is that the Chromecast doesn't seem to be caching the video at all, so the video gets downloaded again every time the loop restarts, which uses a lot of bandwidth. The video will be hosted on an external server, so the Chromecast has to download it from the internet every time (I cannot change that requirement).
Just so you know, when debugging the receiver application in desktop Chrome, the video is cached by the browser, so the problem doesn't seem to come from the caching headers in the HTTP responses.
A solution I explored
I tried to download the video file via Ajax and play it. The problem is that the Chromecast seems to crash when my JavaScript reads the responseText field of the XHR and the result is larger than 28 MB (I tried a 50 MB file, which crashed, and a 28 MB file, which didn't; the actual limit could be 32 MB).
EDIT:
I also tried this example and it also makes the Chromecast crash...
The question
Is it possible to cache a 50-100 MB video on the Chromecast and prevent it from being downloaded every time, or is there a memory trick I could use to keep that video in the Chromecast's memory? Loading the video once per application session would be my target result, to reduce bandwidth usage.
I'm a bit unsure about this answer because I find it a bit too obvious. But I'll give it a try:
You said you had no trouble with a setup where you download 28 MB via Ajax. Why don't you cut it down even further? You could, for example, go with 4 MB. I'm suggesting this because it may alleviate problems arising from "bursts" of computation, as you mentioned with reading the responseText field of the xhr object.
After you have decided on an appropriate chunk size, you could use HTTP range requests (https://datatracker.ietf.org/doc/html/draft-ietf-httpbis-p5-range-22#section-3) to download your video in parts and then concatenate it in JavaScript according to your needs; a sketch follows below. See also Download file in chunks in Chrome Javascript API?
If you have access to the server you could also split the file on the server side such that you can send requests from the client like so:
example.com/movies/my_movie.mp4?chunk=1
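As a sketch of the range-request variant (the URL, sizes and element ID are placeholders; using responseType 'arraybuffer' also avoids reading the huge responseText string that crashed the device):

// Download the video in fixed-size pieces using HTTP Range requests,
// then stitch the pieces together into one Blob and loop it locally.
function fetchInChunks(url, totalSize, chunkSize, onDone) {
  const parts = [];
  let offset = 0;

  function next() {
    const xhr = new XMLHttpRequest();
    xhr.open('GET', url);
    xhr.responseType = 'arraybuffer';
    xhr.setRequestHeader('Range', 'bytes=' + offset + '-' + (offset + chunkSize - 1));
    xhr.onload = function () {
      parts.push(xhr.response);
      offset += chunkSize;
      if (offset < totalSize) {
        next();
      } else {
        onDone(new Blob(parts, { type: 'video/mp4' }));
      }
    };
    xhr.send();
  }
  next();
}

// Usage: fetch a ~50 MB video in 4 MB chunks, then play the stitched Blob in a loop.
fetchInChunks('https://example.com/movie.mp4', 50 * 1024 * 1024, 4 * 1024 * 1024, function (blob) {
  const video = document.getElementById('player');
  video.src = URL.createObjectURL(blob);
  video.loop = true;
  video.play();
});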
Try using an Application Cache manifest file to ensure that the file is only downloaded once:
<html manifest="some.manifest">
where some.manifest has the contents:
CACHE MANIFEST
# version 1.0
the_video_to_cache.webm
This will ensure that future HTTP requests for the resource will not cause a download. The video will only be re-downloaded when the manifest file changes (so you can change the #-prefixed comment string to force a re-download). Note that the new version will be shown on the first page load after the download completes. After an update, the user will see the out-of-date video one more time (while the new version downloads) and then see the new version on the next visit.
Note that this may not work if your video is larger than the permitted size of the app cache.
I don't have a Chromecast, so I'm not sure, but is it possible to use experimental features like the Quota Management API? This API could give you some extra storage quota for your data; maybe you should try it.
I have an MVC web page with video.js on it streaming an MP4 file. When I run the page from a desktop and debug the site, I can see that the video gets requested twice for some reason; both calls seem to be range requests for the entire file size. It seems strange that it's requested twice, but even stranger, if I load the same page from iOS (iPad), I see six calls for the stream. The first two usually request the first two bytes, which makes sense since, as far as I know, iOS does this to determine whether the stream is seekable. Then it makes four more calls to pull the stream, each with a range request for the entire file size, from what I can recall. Does anyone know whether this is normal for iOS and video.js?
Yes, this is normal. The player is probing the file to determine its characteristics; it is trying to find the location of the mdat and moov atoms. Even though it asks for the entire file, it will disconnect the TCP session as soon as it has the data it needs to seek within the file.
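You can reproduce the seekability probe yourself if you're curious; a sketch (the URL is a placeholder):

// Ask for only the first two bytes, the same way iOS probes whether the server
// honours Range requests; a 206 Partial Content response means the stream is seekable.
fetch('https://example.com/video.mp4', { headers: { Range: 'bytes=0-1' } })
  .then((res) => console.log(res.status, res.headers.get('Content-Range')));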
Getting the track data with GET yields the normal results, but the stream URL (with the client ID appended) doesn't work.
Here are the important ones:
<sharing>public</sharing>
<streamable type="boolean">true</streamable>
<stream-url>http://api.soundcloud.com/tracks/112288415/stream</stream-url>
I'm at a loss now, because everything should be good to go and playable with SC.stream().
I've found the issue in my particular case. Our client had some arrangement with Soundcloud that fell outside of the norm, and they had disabled streams but the API still reported the sound as available. After some help from our client we were able to get proper streams... Sorry for a non-technical solution!
For those coming here looking for an answer to this problem: it is a known bug where a song reports streamable: true yet trying to stream it results in a blank white page in the browser. The bug is that the streamable boolean should really be false.
Email response from SoundCloud on this issue:
The developers have let me know that the problems you are having is due to issues with RTMP.
Currently certain content on SoundCloud is using a secure streaming method called RTMP.
To explain RTMP, even if a track is set to public and streamable by the artist, if the artist is under a major label, this label can further control those streaming permissions. So, it looks like it should stream correctly, however it doesn't.
This particular bug that you have highlighted is more complicated than originally thought, and as only a handful of tracks are affected we unfortunately don't have the resources to dedicate a team entirely to this project as of right now.
So unfortunately you'll just have to deal with/work around this issue.
You need to add ?client_id=YourClientID to the URL when you're requesting the stream.
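For example (the client ID is a placeholder, the track ID is the one from the API response above), the stream URL can then be used directly as an audio source:

// Append your client_id to the stream-url reported by the API and play it.
const streamUrl = 'http://api.soundcloud.com/tracks/112288415/stream?client_id=YourClientID';
const audio = new Audio(streamUrl);
audio.play();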