I am trying to build a service worker that retrieves a video from the cache if it is available and falls back to fetching it from the network if it is not. Here is my code for that:
self.addEventListener("fetch", function (event) {
  if (event.request.headers.get("range")) {
    caches.match(event.request.url).then(function (res) {
      if (!res) {
        log.debug(
          `Range request NOT found in cache for ${event.request.url}, activating fetch...`
        );
        return fetch(event.request);
      }
      returnRangeRequest(event);
    });
  } else {
    event.respondWith(
      caches.match(event.request).then((response) => {
        return response || fetch(event.request);
      })
    );
  }
});
function returnRangeRequest(event) {
  var rangeHeader = event.request.headers.get("range");
  var rangeMatch = rangeHeader.match(/^bytes\=(\d+)\-(\d+)?/);
  var pos = Number(rangeMatch[1]);
  var pos2 = rangeMatch[2];
  if (pos2) {
    pos2 = Number(pos2);
  }
  event.respondWith(
    caches
      .match(event.request.url)
      .then(function (res) {
        return res.arrayBuffer();
      })
      .then(function (ab) {
        let responseHeaders = {
          status: 206,
          statusText: "Partial Content",
          headers: [
            ["Content-Type", "video/mp4"],
            [
              "Content-Range",
              "bytes " + pos + "-" + (pos2 || ab.byteLength - 1) + "/" + ab.byteLength,
            ],
          ],
        };
        var abSliced = {};
        if (pos2 > 0) {
          abSliced = ab.slice(pos, pos2 + 1);
        } else {
          abSliced = ab.slice(pos);
        }
        log.debug(`Returning range request response`);
        return new Response(abSliced, responseHeaders);
      })
      .catch(function (err) {
        log.error(err);
      })
  );
}
When I am online and I try to play the video, it works fine and prints the debug line Range request NOT found in cache for https://example.com/vid.mp4, activating fetch...
When I have cached the video URL using cache.add("https://example.com/vid.mp4"); and I try to play it, the video plays fine.
The problem arises when I turn off the Wi-Fi on the iPad. When I try to play the video after turning off Wi-Fi, it stays at 0:00 with a total length of 0:00.
Some of my findings:
When I have Wi-Fi on and the video cached, two requests are made, with ranges bytes=0-1 and then bytes=0-4444000.
When Wi-Fi is off, the request for bytes=0-1 is made, but it stops there.
Where am I going wrong?
Safari appears to take a very strict approach to range requests, and issues similar to yours are sometimes seen with regular online web servers.
In particular, Safari expects a '206' response when it sends a request with a byte range. If the server responds with a '200' response, Safari apparently cannot handle it, while some other browsers, such as Chrome, seem to be fine with it.
Apple provides some info on how to check for this:
If you are not sure whether your media server supports byte-range requests, you can open the Terminal application in OS X and use the curl command-line tool to download a short segment from a file on the server:
curl --range 0-99 http://example.com/test.mov -o /dev/null
If the tool reports that it downloaded 100 bytes, the media server correctly handled the byte-range request. If it downloads the entire file, you may need to update the media server. For more information on curl, see OS X Man Pages.
See this answer for more detail and background: https://stackoverflow.com/a/32998689/334402
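A minimal sketch of the idea, assuming the full video is already in the cache: derive the slice boundaries and the Content-Range header from the request's Range header, and always answer with a 206, even for the initial bytes=0-1 probe. The helper names below are hypothetical; only the pure range logic is shown so it can be adapted into returnRangeRequest:

```javascript
// Parse a Range header like "bytes=0-1" or "bytes=100-" into start/end offsets
// for a resource of `totalLength` bytes. Returns null for malformed headers.
function parseRange(rangeHeader, totalLength) {
  const m = /^bytes=(\d+)-(\d+)?$/.exec(rangeHeader || "");
  if (!m) return null;
  const start = Number(m[1]);
  // An open-ended range ("bytes=100-") runs to the last byte.
  const end = m[2] !== undefined ? Number(m[2]) : totalLength - 1;
  if (start > end || end >= totalLength) return null;
  return { start, end };
}

// Build the headers Safari expects on a 206 Partial Content response.
function rangeResponseHeaders(start, end, totalLength, contentType) {
  return {
    "Content-Type": contentType,
    "Content-Range": `bytes ${start}-${end}/${totalLength}`,
    "Content-Length": String(end - start + 1),
  };
}
```

Inside the service worker, the cached body would then be sliced with ab.slice(start, end + 1) and returned as new Response(slice, { status: 206, statusText: "Partial Content", headers }), so the bytes=0-1 probe gets exactly two bytes and a matching Content-Range.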
I'm using the new v2 Twilio Javascript SDK to make calls from the browser to other people.
This works fine but I've been asked to add volume controls for the incoming audio stream.
After some research it seems that I need to take the remote stream from the call and feed it through a gain node to reduce the volume.
Unfortunately the result from call.getRemoteStream is always null even when I can hear audio from the call.
I've tested this on latest Chrome and Edge and they have the same behavior.
Is there something else I need to do to access the remote stream?
Code:
async function makeCall(phoneNumber, token) {
  console.log("isSecureContext: " + window.isSecureContext); // check we can get the stream
  var options = {
    edge: 'ashburn', // use a US edge
    closeProtection: true // warn the user if they try to close the browser window during an active call
  };
  var device = new Device(token, options);
  const connectionParams = {
    "phoneNumber": phoneNumber
  };
  var activeCall = await device.connect({ params: connectionParams });

  // Set up gain (volume) control for the incoming audio.
  // Note: getRemoteStream always returns null.
  var remoteStream = activeCall.getRemoteStream();
  if (remoteStream) {
    var audioCtx = new AudioContext();
    var source = audioCtx.createMediaStreamSource(remoteStream);
    var gainNode = audioCtx.createGain();
    source.connect(gainNode);
    gainNode.connect(audioCtx.destination);
  } else {
    console.log("No remote stream on call");
  }
}
The log output is:
isSecureContext: true
then
No remote stream on call
Twilio support gave me the answer: you need to wait until you start receiving volume events before requesting the stream, i.e.:
activeCall.on('volume', (inputVolume, outputVolume) => {
  if (inputVolume > 0) {
    var remoteStream = activeCall.getRemoteStream();
    ....
  }
});
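A sketch of that pattern wrapped in a promise (hypothetical helper name; it assumes the Call object exposes emitter-style on/off plus getRemoteStream(), as in the v2 SDK):

```javascript
// Resolve with the call's remote MediaStream once the call has actually
// started producing audio (first nonzero inputVolume event).
function getRemoteStreamWhenReady(call) {
  return new Promise((resolve) => {
    const onVolume = (inputVolume) => {
      if (inputVolume > 0) {
        call.off("volume", onVolume); // stop listening once we have the stream
        resolve(call.getRemoteStream());
      }
    };
    call.on("volume", onVolume);
  });
}
```

The gain-node wiring from the question can then run on the resolved stream: createMediaStreamSource(stream) → gainNode → audioCtx.destination.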
I finally found a useful library to parse metadata from an audio stream: https://github.com/ghaiklor/icecast-parser. But I still can't get the expected response when I send the headers as in the example below.
The first function makes the request to the radio station / channel and gets the stream:
_makeRequest() {
  const request = http.request(this.getConfig('url'));
  console.log("Making request to: " + this.getConfig('url'));
  request.setHeader('Range', 'bytes=0-');
  request.setHeader('User-Agent', 'VLC/2.2.4 LibVLC/2.2.4');
  request.setHeader('Icy-MetaData', '1');
  request.setHeader('Connection', 'close');
  request.once('response', this._onRequestResponse.bind(this));
  request.once('error', this._onRequestError.bind(this));
  request.end();
  return this;
}
When a request to a radio station succeeds, this function is called:
_onRequestResponse(response) {
  console.log("Received response!");
  console.log("Headers:");
  console.log(response.headers['content-type']);
  const icyMetaInt = response.headers['icy-metaint'];
  console.log("icyMetaInt= " + icyMetaInt);
  if (icyMetaInt) {
    const reader = new StreamReader(icyMetaInt);
    reader.on('metadata', metadata => {
      console.log(metadata);
      this._destroyResponse(response);
      this._queueNextRequest(this.getConfig('metadataInterval'));
      this.emit('metadata', metadata);
    });
    response.pipe(reader);
    this.emit('stream', reader);
  } else {
    this._destroyResponse(response);
    this._queueNextRequest(this.getConfig('emptyInterval'));
    this.emit('empty');
  }
  return this;
}
When I use these functions on this URL (url: 'http://streaming.radionomy.com/70-s-80-sMetal'), the reply in the console is:
audio/mpeg
icyMetaInt= undefined
I understood the most crucial header here is:
setHeader('Icy-MetaData', '1')
Still, I am not receiving the icy-metaint header, even though the URL does seem to contain metadata when checked with other tools.
Any ideas what is going wrong here? Thank you!
Requests are part of the Fetch API, and when making cross-origin requests you only have access to a limited range of headers.
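For context on what the icy-metaint value drives, here is a minimal sketch of the ICY metadata framing, assuming the server honors the Icy-MetaData: 1 request header: every metaInt bytes of audio are followed by one length byte (metadata length = byte value × 16) and then the null-padded metadata text:

```javascript
// Extract the metadata string that follows the first `metaInt` audio bytes
// in an ICY stream chunk. Returns the text with null padding stripped.
function extractIcyMetadata(chunk, metaInt) {
  const lengthByte = chunk[metaInt];   // metadata length in 16-byte units
  const metaLength = lengthByte * 16;
  return chunk
    .slice(metaInt + 1, metaInt + 1 + metaLength)
    .toString("utf8")
    .replace(/\0+$/, "");              // drop null padding
}
```

If the server never sends icy-metaint at all, as in the console output above, there is simply no such framing to parse; some stations only enable it for clients they recognize, which is presumably why the example spoofs a VLC User-Agent.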
I'm playing around with service workers. The following code should proxy JS files and patch their imports so that they conform to platform standards (i.e., "./", "../", "/", or "http://...").
It works great in Chromium (67.0.3396.79 on Arch Linux) and seems to work just as well in Firefox (60.0.2, 64-bit, on Arch): at least from the network tab I can see all of the patched sources loading, but for some reason the JS modules aren't running. I can't console.log etc. or anything, and I'm not sure how to get Firefox to bootstrap the application.
I noticed that the fetch headers are all lower-cased, but I read up on that here, and Mozilla also points out that header names are case-insensitive here.
I also thought that because the content-length had possibly changed, the file might not be received completely, but I didn't see any parse errors, and the network tab showed the correct content-length changes, so I ruled that out.
const maybeAppendJS = (x) =>
  x.endsWith(".js")
    ? x
    : `${x}.js`;

const maybePatchURL = (x) =>
  x.match(/(^'#.*'(.)?$)|(^"#.*"(.)?$)/)
    ? `"/node_modules/${maybeAppendJS(eval(x))}";`
    : x;

const maybePatchImport = (x) =>
  x.startsWith("import ")
    ? x.split(/\s/).map(maybePatchURL).join(" ")
    : x;

async function maybeRewriteImportStatements(event) {
  let candidate = event.request.url;
  const url = maybeAppendJS(candidate);
  const resp = await fetch(url);
  if (!resp.headers.get("content-type").startsWith("text")) {
    const text = await resp.text();
    const newText = text.split(/\n/g)
      .map(maybePatchImport)
      .join("\n");
    return new Response(newText, {headers: resp.headers});
  }
  if (resp.headers.get("content-type").startsWith("text/")) {
    const location = `${url.substring(0, url.length - 3)}/index.js`;
    return new Response(null, {status: 302, headers: {location: location}});
  }
  console.log("Service worker should never get here");
}

this.addEventListener('fetch', (event) => {
  if (event.request.destination === "script" || event.request.referrer.endsWith(".js") || event.request.url.endsWith(".js")) {
    event.respondWith(maybeRewriteImportStatements(event));
  }
});
This was fixed by upgrading to the Firefox nightly (62.0a1.20180611-1).
I am new to this. Please don't hang me for the poor grammar. I am trying to create a proof-of-concept application which I will later extend. It does the following: we have an HTML page which asks for permission to use the microphone. We capture the microphone input and send it via WebSocket to a Node.js app.
JS (Client):
var bufferSize = 4096;
var socket = new WebSocket(URL);
var myPCMProcessingNode = context.createScriptProcessor(bufferSize, 1, 1);
myPCMProcessingNode.onaudioprocess = function (e) {
  var input = e.inputBuffer.getChannelData(0);
  socket.send(convertFloat32ToInt16(input));
};

function convertFloat32ToInt16(buffer) {
  var l = buffer.length;
  var buf = new Int16Array(l);
  while (l--) {
    // Clamp to [-1, 1] before scaling so out-of-range samples don't wrap.
    buf[l] = Math.max(-1, Math.min(1, buffer[l])) * 0x7FFF;
  }
  return buf.buffer;
}

navigator.mediaDevices.getUserMedia({ audio: true, video: false })
  .then(function (stream) {
    var microphone = context.createMediaStreamSource(stream);
    microphone.connect(myPCMProcessingNode);
    myPCMProcessingNode.connect(context.destination);
  })
  .catch(function (e) {});
On the server we take each incoming buffer, run it through ffmpeg, and send whatever comes out of stdout to another device using a Node.js 'http' POST. The device has a speaker. We are basically trying to create a one-way audio link from the browser to the device.
Node JS (Server):
var WebSocketServer = require('websocket').server;
var http = require('http');
var children = require('child_process');

wsServer.on('request', function (request) {
  var connection = request.accept(null, request.origin);
  connection.on('message', function (message) {
    if (message.type === 'utf8') { /* NOP */ }
    else if (message.type === 'binary') {
      ffm.stdin.write(message.binaryData);
    }
  });
  connection.on('close', function (reasonCode, description) {});
  connection.on('error', function (error) {});
});

var ffm = children.spawn(
  './ffmpeg.exe',
  '-stdin -f s16le -ar 48k -ac 2 -i pipe:0 -acodec pcm_u8 -ar 48000 -f aiff pipe:1'.split(' ')
);
ffm.on('exit', function (code, signal) {});
ffm.stdout.on('data', (data) => {
  req.write(data);
});

var options = {
  host: 'xxx.xxx.xxx.xxx',
  port: xxxx,
  path: '/path/to/service/on/device',
  method: 'POST',
  headers: {
    'Content-Type': 'application/octet-stream',
    'Content-Length': 0,
    'Authorization': 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx',
    'Transfer-Encoding': 'chunked',
    'Connection': 'keep-alive'
  }
};
var req = http.request(options, function (res) {});
The device supports only continuous POST and only a couple of formats (ulaw, aiff, wav).
This solution doesn't seem to work: through the device speaker we only hear something like white noise.
Also, I think I may have a problem with the buffer I am sending to ffmpeg's stdin: I tried dumping whatever comes out of the WebSocket to a .wav file and playing it with VLC, and it plays everything in the recording very fast; 10 seconds of recording play back in about 1 second.
I am new to audio processing and have searched for about three days now for ways to improve this, and found nothing.
I would ask the community for two things:
Is something wrong with my approach? What more can I do to make this work? I will post more details if required.
If what I am doing is reinventing the wheel then I would like to know what other software / 3rd party service (like amazon or whatever) can accomplish the same thing.
Thank you.
The standard specs say you can set the enum value to "relay": http://dev.w3.org/2011/webrtc/editor/webrtc.html#idl-def-RTCIceServer
But how do you set the enum to "relay" using the JavaScript below?
if (tmp.indexOf("typ relay ") >= 0) { never occurs in my tests. How can I force the "relay" enum?
Or is it a bug? https://code.google.com/p/webrtc/issues/detail?id=1179
Any ideas?
function createPeerConnection() {
  try {
    // Create an RTCPeerConnection via the polyfill (adapter.js).
    pc = new RTCPeerConnection(pcConfig, pcConstraints);
    pc.onicecandidate = onIceCandidate;
    console.log('Created RTCPeerConnnection with:\n' +
      '  config: \'' + JSON.stringify(pcConfig) + '\';\n' +
      '  constraints: \'' + JSON.stringify(pcConstraints) + '\'.');
  } catch (e) {
    console.log('Failed to create PeerConnection, exception: ' + e.message);
    alert('Cannot create RTCPeerConnection object; ' +
      'WebRTC is not supported by this browser.');
    return;
  }
  pc.onaddstream = onRemoteStreamAdded;
  pc.onremovestream = onRemoteStreamRemoved;
  pc.onsignalingstatechange = onSignalingStateChanged;
  pc.oniceconnectionstatechange = onIceConnectionStateChanged;
}

function onIceCandidate(event) {
  if (event.candidate) {
    var tmp = event.candidate.candidate;
    if (tmp.indexOf("typ relay ") >= 0) {
      // NEVER happens
      sendMessage({
        type: 'candidate',
        label: event.candidate.sdpMLineIndex,
        id: event.candidate.sdpMid,
        candidate: tmp
      });
      noteIceCandidate("Local", iceCandidateType(tmp));
    }
  } else {
    console.log('End of candidates.');
  }
}
There is a Chrome extension, WebRTC Network Limiter, that configures how WebRTC's network traffic is routed by changing Chrome's privacy settings. It lets you force traffic to go through TURN without having to do any SDP mangling.
EDIT 1
The point of making this available in extensions is for users who are worried about their security. You can check this excellent post about WebRTC security. The same thing the extension does can also be done in Firefox by changing the flag media.peerconnection.ice.relay_only, which can be found in about:config. I don't know about the other two browsers, but I'd wager they have something similar.
On the other hand, if you are distributing an app and want all your clients to go through TURN, you won't have access to their browsers and can't expect them to change those flags or install any extension. In that case, you'll have to mangle your SDP. You can use this code:
function onIceCandidate(event) {
  if (event.candidate) {
    var type = event.candidate.candidate.split(" ")[7];
    if (type != "relay") {
      trace("ICE - Created " + type + " candidate ignored: TURN only mode.");
      return;
    } else {
      sendCandidateMessage(candidate);
      trace("ICE - Sending " + type + " candidate.");
    }
  } else {
    trace("ICE - End of candidate generation.");
  }
}
That code is taken from the discuss-webrtc list.
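Indexing split(" ")[7] is brittle if the candidate line ever gains or loses fields; a sketch of a slightly more defensive version (hypothetical helper name) that locates the typ token the candidate grammar defines:

```javascript
// Read the candidate type ("host", "srflx", "prflx", or "relay") from a
// candidate attribute line by finding the "typ" token rather than by position.
function candidateType(candidateStr) {
  const m = /\btyp\s+(\S+)/.exec(candidateStr);
  return m ? m[1] : null;
}
```

The TURN-only filter above would then read: if (candidateType(event.candidate.candidate) !== "relay") return;.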
If you never get those candidates, it might very well be that your TURN server is not correctly configured. You can check your TURN server on this page. Don't forget to remove the existing STUN server configured in that demo.
EDIT 2
Even simpler: set iceTransportPolicy: "relay" in your WebRtcPeer config to force TURN:
var options = {
  localVideo: videoInput, // if you want to see what you are sharing
  onicecandidate: onIceCandidate,
  mediaConstraints: constraints,
  sendSource: 'screen',
  iceTransportPolicy: 'relay',
  iceServers: [{ urls: 'turn:XX.XX.XX.XX', username: 'user', credential: 'pass' }]
};

webRtcPeerScreencast = kurentoUtils.WebRtcPeer.WebRtcPeerSendrecv(options, function (error) {
  if (error) return onError(error); // use whatever you use for handling errors
  this.generateOffer(onOffer);
});
This is working for me:
iceServers = [
  {
    "url": "turn:111.111.111.111:1234?transport=tcp",
    "username": "xxxx",
    "credential": "xxxxx"
  }
];

optionsForWebRtc = {
  remoteVideo: localVideoElement,
  mediaConstraints: videoParams,
  onicecandidate: onLocalIceCandidate,
  configuration: {
    iceServers: iceServers,
    iceTransportPolicy: 'relay'
  }
};
To force the usage of a TURN server, you need to intercept the candidates found by the browser, just like you are doing.
But if that branch never runs in your testing, you may have a problem with your TURN server.
If you have a working TURN server, you'll get candidates with typ relay.
Remember that you need to configure the TURN server with authentication. It is mandatory, and the browser will only use "relay" candidates if the request is authenticated. Your iceServers config for the PeerConnection must have credentials defined for the TURN servers.
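Putting those requirements together, a relay-only configuration for a standard RTCPeerConnection would look roughly like this (the server address and credentials are placeholders):

```javascript
// Relay-only ICE configuration: with iceTransportPolicy set to "relay",
// the browser gathers TURN candidates only, so the authenticated TURN
// server below becomes mandatory for the connection to succeed.
const rtcConfig = {
  iceServers: [
    {
      urls: "turn:turn.example.com:3478", // placeholder TURN server
      username: "user",                   // placeholder credentials
      credential: "pass",
    },
  ],
  iceTransportPolicy: "relay",
};
// In the browser: const pc = new RTCPeerConnection(rtcConfig);
```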