I'm trying to get sound working on my iPhone game using the Web Audio API. The problem is that this app is entirely client side. I want to store my mp3s in a local folder (and without being user input driven) so I can't use XMLHttpRequest to read the data. I was looking into using FileSystem but Safari doesn't support it.
Is there any alternative?
Edit: Thanks for the responses below. Unfortunately the Audio API is horribly slow for games. I had this working and the latency just makes the user experience unacceptable. To clarify, what I need is something like -
var request = new XMLHttpRequest();
request.open('GET', 'file:///./../sounds/beep-1.mp3', true);
request.responseType = 'arraybuffer';
request.onload = function() {
  context.decodeAudioData(request.response, function(buffer) {
    dogBarkingBuffer = buffer;
  }, onError);
};
request.send();
But this gives me the errors -
XMLHttpRequest cannot load file:///sounds/beep-1.mp3. Cross origin requests are only supported for HTTP.
Uncaught Error: NETWORK_ERR: XMLHttpRequest Exception 101
I understand the security risks with reading local files but surely within your own domain should be ok?
I had the same problem and I found this very simple solution.
audio_file.onchange = function() {
  var files = this.files;
  var file = URL.createObjectURL(files[0]);
  audio_player.src = file;
  audio_player.play();
};
<input id="audio_file" type="file" accept="audio/*" />
<audio id="audio_player" />
You can test here:
http://jsfiddle.net/Tv8Cm/
OK, it's taken me two days of prototyping different solutions and I've finally figured out how I can do this without storing my resources on a server. There are a few blogs that detail this but I couldn't find the full solution in one place, so I'm adding it here. This may be considered a bit hacky by seasoned programmers, but it's the only way I can see this working, so if anyone has a more elegant solution I'd love to hear it.
The solution was to store my sound files as a Base64 encoded string. The sound files are relatively small (less than 30kb) so I'm hoping performance won't be too much of an issue. Note that I put 'xxx' in front of some of the hyperlinks as my n00b status means I can't post more than two links.
Step 1: create Base 64 sound font
First I need to convert my mp3 to a Base64 encoded string and store it as JSON. I found a website that does this conversion for me here - xxxhttp://www.mobilefish.com/services/base64/base64.php
You may need to remove return characters using a text editor but for anyone that needs an example I found some piano tones here - xxxhttps://raw.github.com/mudcube/MIDI.js/master/soundfont/acoustic_grand_piano-mp3.js
Note that in order to work with my example you'll need to remove the header part data:audio/mpeg;base64,
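If you'd rather script the conversion than use the website, a minimal Node sketch can do the same thing; the file names here are placeholders, not part of my actual setup:
var fs = require('fs');

// Read the mp3 and write it out as a Base64 string inside a JS object.
var base64 = fs.readFileSync('sounds/beep-1.mp3').toString('base64');
fs.writeFileSync('soundfont.js', 'var mySounds = { beep: "' + base64 + '" };\n');
// Note: no "data:audio/mpeg;base64," prefix is written, per the note above.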
Step 2: decode sound font to ArrayBuffer
You could implement this yourself but I found an API that does this perfectly (why re-invent the wheel, right?) - https://github.com/danguer/blog-examples/blob/master/js/base64-binary.js
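If you'd rather not pull in a library, the decode is only a few lines; here's a sketch of the idea using atob and a typed array:
// Sketch: decode a Base64 string (without the data: prefix) into an ArrayBuffer.
function base64ToArrayBuffer(base64) {
  var binary = atob(base64);                 // Base64 -> binary string
  var bytes = new Uint8Array(binary.length);
  for (var i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);         // copy each byte
  }
  return bytes.buffer;                       // the underlying ArrayBuffer
}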
Step 3: Adding the rest of the code
Fairly straightforward
var cNote = acoustic_grand_piano.C2;                   // Base64 string for the C2 piano sample
var byteArray = Base64Binary.decodeArrayBuffer(cNote);
var context = new webkitAudioContext();

context.decodeAudioData(byteArray, function(buffer) {
  var source = context.createBufferSource();  // creates a sound source
  source.buffer = buffer;
  source.connect(context.destination);        // connect the source to the context's destination (the speakers)
  source.noteOn(0);                           // legacy name for start(0) in newer browsers
}, function(err) {
  console.log("err(decodeAudioData): " + err);
});
And that's it! I have this working through my desktop version of Chrome and also running on mobile Safari (iOS 6 only of course, as Web Audio is not supported in older versions). It takes a couple of seconds to load on mobile Safari (vs. less than 1 second on desktop Chrome), but this might be due to the fact that it spends time downloading the sound fonts. It might also be the fact that iOS prevents any sound playing until a user interaction event has occurred. I need to do more work looking at how it performs.
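If it does turn out to be the user-interaction restriction, the usual workaround is to play a short silent buffer from inside a touch handler to unlock audio; a minimal sketch, assuming the same context variable as above:
// Sketch: unlock audio on iOS by playing a silent buffer inside a touch event.
function unlockAudio() {
  var buffer = context.createBuffer(1, 1, 22050); // one sample of silence
  var source = context.createBufferSource();
  source.buffer = buffer;
  source.connect(context.destination);
  source.noteOn(0);                               // start(0) in newer implementations
  document.removeEventListener('touchend', unlockAudio, false);
}
document.addEventListener('touchend', unlockAudio, false);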
Hope this saves someone else the grief I went through.
Because iOS apps are sandboxed, the web view (basically Safari wrapped in PhoneGap) allows you to store your mp3 files locally, i.e. there is no "cross domain" security issue.
This is as of iOS 6, as previous iOS versions didn't support the Web Audio API.
Use the HTML5 Audio tag for playing an audio file in the browser.
Ajax requests work over the http protocol, so when you try to get an audio file using file://, the browser marks the request as cross-domain. Set the following header on the server response (PHP shown) -
header('Access-Control-Allow-Origin: *');
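For the first suggestion, a minimal sketch of playing a file with the HTML5 Audio element; the path is a placeholder:
var player = new Audio('sounds/beep-1.mp3'); // placeholder path
player.play(); // on mobile browsers this may only work after a user gesture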
I'm trying to load an image called "border.bmp" so that I can use it as a texture for WebGL. Here's how I reference the image.
var img;
function preload() {
  img = loadImage("assets/border.bmp");
}
I then get this error in the console.
Access to Image at 'file:///C:/P5.js/empty-example/assets/border.bmp' from
origin 'null' has been blocked by CORS policy: Invalid response. Origin
'null' is therefore not allowed access.
What is this error message? What does it mean? How do I load the image?
The comment by gman and the answer by Dale are both correct. I highly recommend you take the time to understand exactly what they're saying before you downvote or dismiss them.
CORS stands for cross-origin resource sharing, and basically it's what allows or prevents JavaScript on one site from accessing stuff on another site. A quick google search of "JavaScript CORS" or just "cors" will give you a ton of information that you should read through.
So if you're getting a CORS error, that means that the site that holds your image is not letting the site that holds your code access the image. In your case, it looks like you're loading stuff from a file: URL, which is not being loaded from a server. That's what the Origin null part of the error means.
So, step one is to listen to gman's comment and run your sketch from a local webserver instead of using a file: URL. His comment already contains links explaining how to do that, or you could use the P5.js web editor or CodePen or any other basic web host.
The most common setup is to include the image files in the same server as the code, so that should be pretty much all you need to do. But if you're storing the images at a different URL than the code, then step 2 is to follow Dale's answer and set up your image server to allow requests from your code server.
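If you don't already have a local webserver handy, a bare-bones Node sketch is enough for development; this is an illustrative minimal server (no MIME types, no security), not production code:
const http = require('http');
const fs = require('fs');
const path = require('path');

http.createServer((req, res) => {
  const file = path.join(__dirname, req.url === '/' ? 'index.html' : req.url);
  fs.readFile(file, (err, data) => {
    if (err) {
      res.statusCode = 404;
      res.end('Not found');
      return;
    }
    res.end(data); // a real server should also set a Content-Type header
  });
}).listen(8080, () => console.log('Serving on http://localhost:8080'));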
While the following doesn't directly answer the OP's question, it can help others who, like me, ended up here searching for P5.js CORS.
To bypass CORS restrictions, simply prefix the blocked URL with https://cors-anywhere.herokuapp.com, i.e.:
var url = 'https://cors-anywhere.herokuapp.com/http://blocked.url';
P5.JS Example:
Let's say you want to load a remote mp4 video (it can be any file type) using P5.js from a server that doesn't have CORS enabled; for these situations, you can use:
var vid;

function setup() {
  vid = createVideo(['https://cors-anywhere.herokuapp.com/http://techslides.com/demos/sample-videos/small.mp4'], vidLoad);
}

// This function is called when the video loads
function vidLoad() {
  vid.play();
}
UPDATE:
The demo server of CORS Anywhere (cors-anywhere.herokuapp.com) is meant to be a demo of this project. But abuse has become so common that the platform where the demo is hosted (Heroku) has asked me to shut down the server, despite efforts to counter the abuse (rate limits in #45 and #164, and blocking other forms of requests). Downtime becomes increasingly frequent (e.g. recently #300, #299, #295, #294, #287) due to abuse and its popularity.
To counter this, I will make the following changes:
The rate limit will decrease from 200 (#164) per hour to 50 per hour.
By January 31st, 2021, cors-anywhere.herokuapp.com will stop serving as an open proxy.
From February 1st. 2021, cors-anywhere.herokuapp.com will only serve requests after the visitor has completed a challenge: The user (developer) must visit a page at cors-anywhere.herokuapp.com to temporarily unlock the demo for their browser. This allows developers to try out the functionality, to help with deciding on self-hosting or looking for alternatives.
CORS stands for Cross-Origin Resource Sharing. By default WebGL will block any resource (images, textures, etc.) from an external origin, and WebGL thinks the local resource is from an external origin.
You can bypass this by using img.crossOrigin = "";
I'm not an expert in WebGL; all this information was found here
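To show where that one-liner goes, here's a small sketch; the URL is a placeholder, and the image server still has to send CORS headers for this to work:
var img = new Image();
img.crossOrigin = "";          // same as "anonymous"
img.onload = function () {
  // safe to upload to WebGL now, e.g. gl.texImage2D(...)
};
img.src = "https://example.com/texture.png"; // server must send Access-Control-Allow-Origin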
If you want to load local images in P5, you can use the canvas drawImage() function instead of the image() function of P5.js:
let img;

function setup() {
  createCanvas(windowWidth, windowHeight);
  img = new Image();
  img.src = "assets/smiley.png";
}

function draw() {
  drawingContext.drawImage(img, mouseX, mouseY);
}
I'm looking for a solution to fully preload an html5 video so that I can play it through and seek to different times without any risk of buffering. I've seen solutions that involve using xhr to download the video file as a 'blob' type and subsequently construct a url to that blob using the createObjectURL method. This is the code example in the solution I mentioned above:
var r = new XMLHttpRequest();
r.onload = function() {
  myVid.src = URL.createObjectURL(r.response);
  myVid.play();
};

if (myVid.canPlayType('video/mp4;codecs="avc1.42E01E, mp4a.40.2"')) {
  r.open("GET", "slide.mp4");
} else {
  r.open("GET", "slide.webm");
}

r.responseType = "blob";
r.send();
This works for me in Chrome and Firefox, but not in Safari when using a video hosted on a CDN. This solution does work in Safari if I use a video hosted on the same server. I found this Safari bug, although I'm not sure if the bug is still valid. There's no mention of the Safari bug on the page with the above solution. I've seen another method which essentially pauses the video and waits for it to buffer to 100%, but Chrome doesn't seem to ever fully buffer the video.
I looked into PreloadJS, which apparently supports video preloading, but I couldn't find any working examples. I also looked into html5Preloader, but again I couldn't figure out what to do once the finish event was fired.
I'm not sure if it makes any difference, but I'm using Videogular to play my video, which needs to be fed a video url. I suppose if I use some preloader library such as PreloadJS or html5Preloader, which I'm guessing would in turn use xhr for video, I would need access to a new blob url in my finished handler.
Has anyone come up with a video preloading solution that works in Safari? Thanks in advance.
It turns out the problem was being caused by the content type response header on the videos coming from Amazon S3. They were set to octet-stream, which Chrome and Firefox were able to handle, but Safari threw a media error 4. Changing the content type in the Amazon S3 admin site to 'video/mp4' solved the problem for me.
More info about Safari and octet-stream here in the 'Known issues' tab: http://caniuse.com/#feat=bloburls
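If you prefer to script the fix rather than use the S3 console, the Content-Type can be rewritten with a copy-in-place; a sketch using the AWS SDK for JavaScript, where the bucket and key names are placeholders:
// Sketch: rewrite an object's Content-Type by copying it onto itself (AWS SDK v2).
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

s3.copyObject({
  Bucket: 'my-bucket',                       // placeholder
  CopySource: 'my-bucket/videos/slide.mp4',  // placeholder
  Key: 'videos/slide.mp4',
  ContentType: 'video/mp4',
  MetadataDirective: 'REPLACE'               // required so the new Content-Type is applied
}, function (err, data) {
  if (err) console.error(err);
});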
I need to loop an audio file (WAV, OGG or raw PCM) in the browser that contains segments which are unheard (ultrasonic) by the human ear (yet contain data which is valuable to me).
Using Chrome on Mac, I've noticed that if the segments of unheard sound are relatively short, I get all the data back (heard + unheard). In contrast, if the segments of unheard sound are longer than a certain threshold, it will fade out the whole channel quickly and effectively cancel the rest of the file completely, until the next loop cycle begins.
The way I'm loading and playing the sound is like so:
var b = msg.data; // binary msg received from websocket
b.type = "audio/wav";
var URLObject = window.webkitURL || window.URL;
var url = URLObject.createObjectURL(b);
var snd = document.createElement("audio");
snd.setAttribute("src", url);
snd.addEventListener("loadeddata", function() {
  snd.loop = true;
  snd.muted = false;
  snd.play();
});
I'm looking for a way to cancel this automatic filtering of unheard sounds. Eventually, I would like a way to do this cross-browser. If not possible using JavaScript, a Flash solution will also be accepted.
Sample ultrasonic WAV files (~1MB each):
https://drive.google.com/file/d/0B5sMkxczD6sNbm04MmxMTmIwdlk/edit?usp=sharing
https://drive.google.com/file/d/0B5sMkxczD6sNal91WUhRNWo2d3c/edit?usp=sharing
There isn't a single approach that will work on all browsers, unfortunately.
For most desktop browsers, and for iOS too, you can use the Web Audio API as shown here:
http://www.html5rocks.com/en/tutorials/webaudio/intro/
For IE/Android you need to use Flash to play a WAV/PCM, or play OGG with HTML5 Audio tag, but the latter may lose the ultrasonic frequencies.
So in general, you need to write code that will check what the current browser supports and use that, starting with Web Audio API, then trying HTML5 Audio, then Flash.
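A rough sketch of that fallback order, where the playWith* functions stand in for your own playback implementations:
// Sketch of the fallback chain described above; playWithWebAudio, playWithHtml5Audio
// and playWithFlash are placeholders for your own code.
var AudioContextClass = window.AudioContext || window.webkitAudioContext;

if (AudioContextClass) {
  playWithWebAudio(new AudioContextClass());   // best chance of preserving ultrasonic content
} else if (typeof Audio !== 'undefined') {
  playWithHtml5Audio(new Audio());             // OGG via <audio>; may filter high frequencies
} else {
  playWithFlash();                             // last resort for older IE / Android
}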
I have seen a number of questions that don't answer this: is it possible to check someone's bandwidth using JavaScript and load specific content based on it?
The BBC seems to give me low-quality images when I'm using my mobile in the middle of nowhere.
By the looks of it, this cool service does this, and it's a CDN so it could be server side.
http://www.resrc.it/docs/
Does anyone know how they do it, or how I could do it using ASP.NET or JavaScript, or a community open-source plug-in?
I think it may be possible with https://github.com/yahoo/boomerang/ but I'm not sure this is its true purpose.
Basically you do it like this:
Start a timer
Load a fixed-size file, e.g. an image, through an Ajax call
Stop the timer
Take some samples and compute the average bandwidth
Something like this could work:
//http://upload.wikimedia.org/wikipedia/commons/5/51/Google.png
//Size = 238 KB
function measureBW(cnt, cb) {
  var start = new Date().getTime();
  var bandwidth;
  var i = 0;
  (function rec() {
    var xmlHttp = new XMLHttpRequest();
    xmlHttp.open('GET', 'http://upload.wikimedia.org/wikipedia/commons/5/51/Google.png', true);
    xmlHttp.onreadystatechange = function () {
      if (xmlHttp.readyState == 4) {
        var x = new Date().getTime() - start;      // elapsed ms for this download
        var bw = Number(238 / (x / 1000));         // KB/s for this sample
        bandwidth = ((bandwidth || bw) + bw) / 2;  // running average
        i++;
        if (i < cnt) {
          start = new Date().getTime();
          rec();
        } else {
          cb(bandwidth.toFixed(0));
        }
      }
    };
    xmlHttp.send(null);
  })();
}

measureBW(10, function (e) {
  console.log(e);
});
Note that var xmlHttp = new XMLHttpRequest(); won't work in all (older) browsers; you should check the user agent and use the right object.
And of course it's just an estimate.
Here's a JSBin example
Start a timer.
Send an AJAX request to your server, requesting a file of known size.
When the AJAX request's done loading, stop the timer, and calculate the bandwidth from the passed time and file size.
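A minimal sketch of those three steps using fetch and performance.now(); the probe URL and its size are placeholders for a known-size file on your own server:
// Sketch: time a download of a known-size file and derive KB/s.
var PROBE_URL = '/probe.bin?ts=' + Date.now(); // cache-buster; placeholder path
var PROBE_KB = 256;                            // known size of the probe file in KB

var start = performance.now();
fetch(PROBE_URL, { cache: 'no-store' })
  .then(function (res) { return res.arrayBuffer(); })
  .then(function () {
    var seconds = (performance.now() - start) / 1000;
    console.log('~' + (PROBE_KB / seconds).toFixed(0) + ' KB/s');
  });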
The problem with JavaScript is that users can disable it. (Which is more common on phones, which happen to be better off with smaller images.)
I've knocked this up based on timing image downloads (ref: http://www.ehow.com/how_5804819_detect-connection-speed-javascript.html)
Word of warning though:
It says my speed is 1.81 Mbps, but according to SpeedTest.net my speeds are this:
[SpeedTest.net results screenshot]
The logic of timing the download seems right, but I'm not sure if it's accurate.
Well, like I said in my comments, you can choose between 2 approaches:
1) You are in the context of a mobile app, so you can query the technology used by the device directly and notify the server what type (and size) of content you are able to render. I think PhoneGap can help you with accessing some of the native mobile APIs from JavaScript.
2) The server-side timer. You can "serve" some files yourself; let's say you have a magic file in your landing page and, as soon as the client requests the file, you grab this HTTP request with a custom handler. You "manually" serve the file by writing to the output stream, and you measure the bytes sent and the time it took to reach EOF; from that you can estimate the bandwidth. Combine this with the session cookie and you have this information per connected browser.
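A rough Node sketch of that second approach; the route name and payload size are illustrative, and flushing to the socket only approximates when the client actually finished receiving:
const http = require('http');

const PROBE_SIZE = 256 * 1024;                 // 256 KB of dummy data
const payload = Buffer.alloc(PROBE_SIZE, 'x');

http.createServer((req, res) => {
  if (req.url === '/probe') {
    const start = Date.now();
    res.writeHead(200, { 'Content-Type': 'application/octet-stream' });
    res.end(payload, () => {
      // Called once the payload has been flushed to the socket.
      const seconds = (Date.now() - start) / 1000;
      const kbps = (PROBE_SIZE / 1024) / seconds;
      console.log('~' + kbps.toFixed(0) + ' KB/s for ' + req.socket.remoteAddress);
    });
  } else {
    res.statusCode = 404;
    res.end();
  }
}).listen(8080);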
While this isn't an answer, it may be important to note that measuring bandwidth isn't always reliable.
http://www.smashingmagazine.com/2013/01/09/bandwidth-media-queries-we-dont-need-em/
To paraphrase the above:
...the number of bits downloaded divided by the time it took to download them...is true when you download a large file over a single warmed-up TCP connection. That is rarely the case.
Typical page load scenario:
Initial HTML page is downloaded using slow-start mechanism, so measurement will significantly underestimate the available bandwidth
CSS and JavaScript external resources are loaded -- a collection of new TCP connections, all in their slow-start phase, and they are not all necessarily to the same destination server
Images are loaded -- multiple connections, each one downloading a resource. The problem is that these connections are not always in the same phase of their life cycle. Some might be in the slow-start phase; some may have suffered a packet loss and, thus, reduced their window and the bandwidth they are trying to fill; and some might be warmed-up TCP connections, ready to fill the bandwidth. These TCP connections are not necessarily all to the same destination server, and the bandwidth towards the various destination servers might be different between one another.
So, estimating bandwidth is possible, but it is far from simple, and it is possible only for certain phases of the page-loading process. And because having several TCP connections to various destination servers is common (for example, a CDN could host the image resources of a Web page), we cannot really tell what is the bandwidth we want to measure.
Since this is an older question, the alternative suggestion at the end of the article is to consider the more recent srcset attribute for responsive imagery, which lets the browser decide which asset to load based on whatever it knows (which should be more than us). It sounds like it's weighted more towards just determining resolution, but maybe it'll get smarter as support goes up.
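For example, letting the browser choose via srcset only takes a couple of attributes; a small sketch with placeholder file names:
// Sketch: let the browser pick between assets via srcset; file names are placeholders.
var img = document.createElement('img');
img.src = 'photo-small.jpg';                                // fallback for old browsers
img.srcset = 'photo-small.jpg 480w, photo-large.jpg 1200w';
img.sizes = '100vw';
document.body.appendChild(img);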
I have released BwCh, an open-source JavaScript API to detect bandwidth for web-based environments.
It is built with ES2015. It uses some of the latest JavaScript innovations (window.navigator.connection, currently supported in Chrome 48+ for Android as of April 2016) in order to provide a flexible method to detect bandwidth for both mobile and desktop devices. It falls back to (or complements) image pre-loading to detect bandwidth where those newer APIs are not available.
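For anyone just after the underlying API, a minimal sketch of reading window.navigator.connection; which properties exist varies by browser and version, so treat these reads as best-effort:
var connection = navigator.connection || navigator.mozConnection || navigator.webkitConnection;

if (connection) {
  console.log('Effective type:', connection.effectiveType); // e.g. '4g'
  console.log('Downlink estimate (Mbps):', connection.downlink);
} else {
  // Fall back to timing an image download, as described in the other answers.
}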
I would like to display a live video stream in a web browser. (Compatibility with IE, Firefox, and Chrome would be awesome, if possible.) Someone else will be taking care of streaming the video, but I have to be able to receive and display it. I will be receiving the video over UDP, but for now I am just using VLC to stream it to myself for testing purposes. Is there an open source library that might help me accomplish this using HTML and/or JavaScript? Or a good website that would help me figure out how to do this on my own?
I've read a bit about RTSP, which seems like the traditional option for something like this. That might be what I have to fall back on if I can't accomplish this using UDP, but if that is the case I still am unsure of how to go about this using RTSP/RTMP/RTP, or what the differences between all those acronyms are, if any.
I thought HTTP adaptive streaming might be the best option for a while, but it seemed like all the solutions using that were proprietary (Microsoft IIS Smooth Streaming, Apple HTTP Live Streaming, or Adobe HTTP Dynamic Streaming), and I wasn't having much luck figuring out how to accomplish it on my own. MPEG-DASH sounded like an awesome solution as well, but it doesn't seem to be in use yet since it is still so new. But now I am told that I should expect to receive the video over UDP anyways, so those solutions probably don't matter for me anymore.
I've been Googling this stuff for several days without much luck on finding anything to help me implement it. All I can find are articles explaining what the technologies are (e.g. RTSP, HTTP Adaptive Streaming, etc.) or tools that you can buy to stream your own videos over the web. Your guidance would be greatly appreciated!
It is incorrect that most video sites use FLV; MP4 is the most widely supported format, and it is played via Flash players as well.
The easiest way to accomplish what you want is to open an Amazon S3/CloudFront account and work with JW Player. Then you have access to RTMP software to stream video and audio. This service is very cheap. If you want to know more about this, check out these tutorials:
http://www.miracletutorials.com/category/s3-amazon-cloudfront/ Start at the bottom and work your way up to the tutorials higher up.
I hope this will help you get yourself on your way.
If you don't need sound, you can send JPEGs with header like this:
Content-Type: multipart/x-mixed-replace
This is a simple demo with Node.js; it uses the opencv4nodejs library to generate images. You can use any other HTTP server that allows you to append data to the socket while keeping the connection open. Tested in Chrome and Firefox on Ubuntu Linux.
To run the sample you will need to install the library with npm install opencv4nodejs (it might take a while), then start the server with node app.js. Here is app.js itself:
var http = require('http');
const cv = require('opencv4nodejs');

var m = new cv.Mat(300, 300, cv.CV_8UC3);   // blank frame to draw on
var cnt = 0;
const blue = new cv.Vec3(255, 220, 120);
const yellow = new cv.Vec3(255, 220, 0);
var lastTs = Date.now();

http.createServer((req, res) => {
  if (req.url == '/') {
    res.end("<!DOCTYPE html><style>iframe {transform: scale(.67)}</style><html>This is a streaming video:<br>" +
      "<img src='/frame'></img></html>");
  } else if (req.url == '/frame') {
    res.writeHead(200, { 'Content-Type': 'multipart/x-mixed-replace;boundary=myboundary' });
    var x = 0;
    var fps = 0, fcnt = 0;
    var next = function () {
      var ts = Date.now();
      var m1 = m.copy();
      fcnt++;
      if (ts - lastTs > 1000) {            // recompute FPS once per second
        lastTs = ts;
        fps = fcnt;
        fcnt = 0;
      }
      m1.putText(`frame ${cnt} FPS=${fps}`, new cv.Point2(20, 30), 1, 1, blue);
      m1.drawCircle(new cv.Point2(x, 50), 10, yellow, -1);
      x += 1;
      if (x > m.cols) x = 0;
      cnt++;
      var buf = cv.imencode(".jpg", m1);   // encode the frame as JPEG
      res.write("--myboundary\r\nContent-type:image/jpeg\r\nDaemonId:0x00258009\r\n\r\n");
      res.write(buf, function () {
        next();                            // send the next frame once this one is flushed
      });
    };
    next();
  }
}).listen(80);
A bit later I found this example, with some more details, in Python: https://blog.miguelgrinberg.com/post/video-streaming-with-flask
UPDATE: it also works if you stream this into an HTML img tag.
True cross-browser streaming is only possible through "rich media" clients like Flash, which is why almost all video websites default to serving video using Adobe's proprietary .flv format.
For non-live video the advent of video embeds in HTML5 shows promise, and using Canvas and JavaScript streaming should be technically possible, but handling streams and preloading binary video objects would have to be done in JavaScript and would not be very straightforward.