I'm trying to get my Processing.js application working in web browsers. When I run it in Processing 3.5.4 (on Windows) everything's fine, but in the browser it draws everything while the sounds don't play (fonts don't load either, so I can't see any text). Do I need a separate JavaScript audio and font library?
This is how I import the sound library:
import processing.sound.*;
SoundFile file;
And this is how I play it (it does play in the desktop app, but not in browsers):
file = new SoundFile(this, "song.wav");
file.play();
What would be the easiest approach to solve this?
Try p5.js with the p5.sound library, and see the loadSound() example:
let mySound;

function preload() {
  soundFormats('mp3', 'ogg');
  mySound = loadSound('assets/doorbell');
}

function setup() {
  let cnv = createCanvas(100, 100);
  cnv.mousePressed(canvasPressed);
  background(220);
  text('tap here to play', 10, 20);
}

function canvasPressed() {
  // playing a sound file on a user gesture
  // is equivalent to `userStartAudio()`
  mySound.play();
}
Processing.js is no longer updated AFAIK.
I have the following code in preload:
this.load.audio('theme', './audio/theme.mp3');
and in create():
gameState.music = this.sound.add('theme');
gameState.music.play();
gameState.music.loop = true;
I also added the following to the config, based on some info I found out there:
audio: {
  disableWebAudio: true
}
It works on desktop and iPhone, but not on Android. The funny thing is that it worked before I made some changes to the game that were unrelated to audio.
There are a few points to unpack, so I opted for an answer instead of a comment.
(Sorry that it is so long; I tried to keep it short.)
You changed the audio file type from mp3 (which is well supported on many devices, especially older ones) to ogg, which is not as widely supported (tip: check a browser feature support reference page).
Specific Android version support says Android 5.0+ should be able to play ogg files (Android version table); I tested it on Android 5.5 and it works. So if the Android version is not below 5.0, you tap the screen once, and no sound plays, there is a different issue.
iPhone should not work (ogg is not supported, if I can believe the Apple forum posts), but I tested it on an iPhone SE 2015 and it works. Strange.
btw: fullscreen also works, but only in the Safari browser, not Chrome, where tapping just scrolled.
Mobile devices need user interaction to play sounds/music, so if you let the page load and don't tap the screen, the music shouldn't (and won't) play. After a tap the music starts. This is the part where we enter uncharted territory:
You are starting the music in the scene 'FirstScene', but this scene is stopped after the first tap. Theoretically everything in the scene should be unloaded, but the music plays on. (This behavior seems to be known, but it is still strange to me.)
Personally I would:
call the play() function only after a user interaction; this prevents possible problems due to browser autoplay blocking,
use mp3, since it is more widely supported,
[and optionally] if possible, keep the sounds/music in the same scene, and if the scene is closed/stopped, stop any audio that was initiated in that scene. Or even handle all the sound in a separate, always-running scene, as mentioned in this forum post. (But I usually always have an "always-running" scene which manages my scene switches and other background actions.)
But it seems okay, since it currently works on my local server for:
Win10 Chrome 104+
iPhone SE 2015 Safari (fullscreen won't work on iPhone, if we can trust this site)
Android 5.5 Chrome
Android 11.0 Chrome
Update: Sound Manager option
An "easy" way is to add two extra scenes.
BootScene:
class BootScene extends Phaser.Scene {
  constructor() {
    super({ key: 'BootScene' });
  }

  create() {
    // `launch` starts a scene without stopping others
    this.scene.launch('FirstScene');
    this.scene.launch('SoundScene');
  }
}
SoundScene:
A scene that runs in parallel with the game:
class SoundScene extends Phaser.Scene {
  constructor() {
    super({ key: 'SoundScene' });
  }

  preload() {
    this.load.audio('theme', './audio/theme.ogg');
  }

  create() {
    this.input.once('pointerdown', _ => {
      gameState.music = this.sound.add('theme');
      gameState.music.play();
      gameState.music.loop = true;
    });
  }
}
And make some minor tweaks:
index.html
(add new Scenes)
...
<script src="BootScene.js"></script>
<script src="SoundScene.js"></script>
...
game.js
(add new Scenes to config)
...
config = {
  ...
  scene: [BootScene, SoundScene, FirstScene, ...],
  ...
}
...
FirstScene.js:
Remove the theme stuff from the preload and create functions.
Add extra checks to prevent errors (check whether gameState.music is already set):
update() {
  if (gameState.music && !document.hasFocus()) {
    gameState.music.pause();
  }
  if (gameState.music && document.hasFocus()) {
    gameState.music.resume();
  }
}
After some digging and some testing, here is some more info on how to get the audio to work consistently on iPhone and Android:
It seems that for mobile (iOS or Android) you might need to unlock the audio/sound.
In the SoundScene, just adapt the create function to mirror these changes:
create() {
  this.input.once('pointerdown', _ => {
    // UNLOCK audio; doesn't work immediately on iPhone
    this.sound.unlock();
    gameState.music = this.sound.add('theme', { loop: true });
    // if the unlock worked, just play the sound
    if (!this.sound.locked) {
      gameState.music.play();
    } else {
      // if not, wait for the unlock event
      this.sound.once(Phaser.Sound.Events.UNLOCKED, () => {
        gameState.music.play();
      });
    }
  });
}
With the current config/code, this should work on Android, but not on iPhone.
If you remove the disableWebAudio: true from the config in the game.js file, then it works on iPhone, but not on Android. Strange.
Currently these are the only two configurations in which I could get the audio to work on Android and iPhone; I couldn't find a single configuration that works for both.
Update: I retested everything, and it seems to work with the SoundScene changes on Win 10, Android and iPhone, with disableWebAudio: true and an mp3 file.
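For completeness, a rough sketch of that final, working combination (the scene names and audio key come from the snippets above; any other config fields are placeholders):

// game.js - the combination that worked on Win 10, Android and iPhone
const config = {
  type: Phaser.AUTO,
  audio: {
    disableWebAudio: true // fall back to HTML5 Audio
  },
  scene: [BootScene, SoundScene, FirstScene]
};
const game = new Phaser.Game(config);

and in SoundScene.js, preload the mp3 instead of the ogg:

preload() {
  this.load.audio('theme', './audio/theme.mp3');
}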
I'm new to React. I'm trying to create a <canvas> that will render a video in React, so that I can restrict video downloading, as mentioned here (the paint-on-canvas part). (I know I can't prevent users from downloading it; it's just a preventive measure on my side against newbie users.)
I was using this for plain HTML/JS:
var canvas = document.getElementById("canV");
var ctx = canvas.getContext("2d");
var video = document.createElement("video");
video.src = "http://techslides.com/demos/sample-videos/small.mp4";
video.addEventListener('loadeddata', function() {
  video.play(); // start playing
  update();     // start rendering
});

function update() {
  ctx.drawImage(video, 0, 0, 256, 256);
  requestAnimationFrame(update); // wait for the browser to be ready to present another animation frame
}
and
<canvas id="canV" width='256' height='256'></canvas>
Now, how can I do this in React? Also, are there any other simple preventive measures to stop the videos from getting downloaded?
Also, there is this answer. Can I do something like this in React/Gatsby? If yes, please guide me.
Thanks.
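For what it's worth, a minimal, untested sketch of how that same loop might look in a React function component (useRef/useEffect are standard React hooks; the URL and sizes come from the snippet above):

import React, { useEffect, useRef } from 'react';

function VideoCanvas() {
  const canvasRef = useRef(null);

  useEffect(() => {
    const canvas = canvasRef.current;
    const ctx = canvas.getContext('2d');
    const video = document.createElement('video');
    let frameId;

    function update() {
      ctx.drawImage(video, 0, 0, 256, 256);
      frameId = requestAnimationFrame(update); // keep rendering
    }

    video.src = 'http://techslides.com/demos/sample-videos/small.mp4';
    video.muted = true; // autoplay policies usually require muted playback
    video.addEventListener('loadeddata', () => {
      video.play(); // start playing
      update();     // start rendering
    });

    // stop the render loop when the component unmounts
    return () => cancelAnimationFrame(frameId);
  }, []);

  return <canvas ref={canvasRef} width="256" height="256" />;
}

export default VideoCanvas;

Since Gatsby pages are React components, the same component should work there as well.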
Suppose you use the Web Audio API to play a pure tone:
ctx = new AudioContext();
src = ctx.createOscillator();
src.frequency.value = 261.63; // play middle C (frequency is an AudioParam)
src.connect(ctx.destination);
src.start();
But, later on you decide you want to stop the sound:
src.stop();
From this point on, src is now completely useless; if you try to start it again, you get:
src.start()
VM564:1 Uncaught DOMException: Failed to execute 'start' on 'AudioScheduledSourceNode': cannot call start more than once.
at <anonymous>:1:5
If you were making, say, a little online keyboard, you'd be constantly turning notes on and off. It seems really clunky to remove the old object from the audio node graph, create a brand-new object, and connect() it into the graph (and then discard the object later), when it would be simpler to just turn it on and off as needed.
Is there some important reason the Web Audio API does things like this? Or is there some cleaner way of restarting an audio source?
Use connect() and disconnect(). You can then change the values of any AudioNode to change the sound.
(The button is there because AudioContext requires a user action to run in a snippet.)
play = () => {
  d.addEventListener('mouseover', () => src.connect(ctx.destination));
  d.addEventListener('mouseout', () => src.disconnect(ctx.destination));
  ctx = new AudioContext();
  src = ctx.createOscillator();
  src.frequency.value = 261.63; // play middle C
  src.start();
}
div {
  height: 32px;
  width: 32px;
  background-color: red;
}

div:hover {
  background-color: green;
}
<button onclick='play();this.disabled=true;'>play</button>
<div id='d'></div>
This is exactly how the Web Audio API works. Sound generator nodes like oscillator nodes and audio buffer source nodes are intended to be used once. Every time you want to play your oscillator, you have to create it and set it up, just like you said. I know it seems like a hassle, but you can abstract it into a play() method that handles those details for you, so you don't have to think about it every time you play an oscillator. Also, don't worry about the performance implications of creating so many nodes; the Web Audio API is intended to be used this way.
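For example, a tiny helper along these lines (just a sketch, not from any particular library) hides the create-and-discard pattern:

const ctx = new AudioContext();

// Each call builds a fresh oscillator and returns a function that stops it.
function playNote(frequency) {
  const osc = ctx.createOscillator();
  osc.frequency.value = frequency;
  osc.connect(ctx.destination);
  osc.start();
  return () => {
    osc.stop();       // one-shot: this node is now spent...
    osc.disconnect(); // ...so detach it and let it be garbage collected
  };
}

// Usage: every key press gets its own node.
const stopC = playNote(261.63); // middle C
setTimeout(stopC, 500);         // release the note after half a second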
If you just want to make music on the internet, and you're not as interested in learning the ins and outs of the web audio api, you might be interested in using a library I wrote that makes things like this easier: https://github.com/rserota/wad
I am working on a 12-voice polyphonic synthesizer with 2 oscillators per voice.
I now never stop the oscillators; I disconnect them instead. You can do that with setTimeout. For the delay, take the longest release phase (of the two) from the amp envelope for this set of oscillators, subtract AudioContext.currentTime, and multiply by 1000 (setTimeout works with milliseconds, Web Audio works with seconds).
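A sketch of that timing calculation (the names are made up; releaseEndTime is when the longer of the two amp-envelope releases finishes, in AudioContext time):

// Disconnect a voice's oscillators once its amp envelope has fully released.
function scheduleDisconnect(ctx, oscillators, releaseEndTime) {
  // Web Audio time is in seconds; setTimeout wants milliseconds from now.
  const delayMs = (releaseEndTime - ctx.currentTime) * 1000;
  setTimeout(() => {
    oscillators.forEach(osc => osc.disconnect());
  }, Math.max(0, delayMs));
}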
This might be an obvious question, but I haven't seen it addressed anywhere. I'm trying to figure out why the web export of my Processing sketch is not retrieving saved images. I'm thinking I should package the image with the other files, but just dropping the image in the same folder doesn't seem to work. I also know that I am exporting it correctly, because when I export sketches without saved images (just shapes or text created within the program), it works just fine.
Does anyone have any experience with this? If this helps at all, the code is below (it's really very simple). Thank you!
draw.pde
void setup() {
  size(1280, 800);
  background(255, 255, 255);
}

void draw() {
  PImage img;
  img = loadImage("drake.png");
  image(img, mouseX, mouseY);
}
Processing's JavaScript mode loads images asynchronously. That means the image is loaded in the background, and it might not be ready yet by the time you try to draw it.
A quick fix (which you should use anyway, even in Java mode) is to not load your images in the draw() function: you'll be reloading the same file 60 times per second! Instead, load it once at the start of the sketch.
Also, if you're using Processing.js, then you need the preload directive at the top:
/* @pjs preload="drake.png"; */

PImage img;

void setup() {
  size(1280, 800);
  img = loadImage("drake.png");
  background(255, 255, 255);
}

void draw() {
  image(img, mouseX, mouseY);
}
More info in the Processing.js reference here.
I've built this HTML5 video player that I load into a canvas to manipulate and then back onto a canvas to display. The video starts out quite slow, and the frame rate only gets worse each time it is played. All I am currently manipulating is the color value when the video is paused, but I will eventually be doing real-time manipulation throughout videos that will be posted in the future.
I used the tutorial below to learn this trick: https://www.youtube.com/watch?v=zjQzP3mOXdc
Here is the relevant code, but there may be interference coming from elsewhere, so feel free to check the source code at the link at the bottom.
var v = document.getElementById('video');
var color = "#DA7AC1";

var processes = {
  timerCallback: function() {
    if (this.v2.paused || this.v2.ended) {
      return;
    }
    this.ctxIn.drawImage(this.v2, 0, 0, this.width, this.height);
    this.pixelScan();
    var self = this;
    setTimeout(function() {
      self.timerCallback();
    }, 0);
  },

  doLoad: function() {
    this.v2 = document.getElementById("video");
    this.cIn = document.getElementById("cIn");
    this.ctxIn = this.cIn.getContext("2d");
    this.cOut = document.getElementById("cOut");
    this.ctxOut = this.cOut.getContext("2d");
    var self = this;
    this.v2.addEventListener("playing", function() {
      self.width = self.v2.videoWidth;
      self.height = self.v2.videoHeight;
      cIn.width = self.v2.videoWidth;
      cIn.height = self.v2.videoHeight;
      cOut.width = self.v2.videoWidth;
      cOut.height = self.v2.videoHeight;
      self.timerCallback();
    }, false);
  },

  pixelScan: function() {
    var frame = this.ctxIn.getImageData(0, 0, this.width, this.height);
    for (var i = 0; i < frame.data.length; i += 4) {
      var grayscale = frame.data[i] * .3 + frame.data[i + 1] * .59 + frame.data[i + 2] * .11;
      frame.data[i] = grayscale;
      frame.data[i + 1] = grayscale;
      frame.data[i + 2] = grayscale;
    }
    this.ctxOut.putImageData(frame, 0, 0);
    return;
  }
};
http://coreytegeler.com/ethan/
Any and all help would be greatly appreciated! Thanks!
Reason 1
Try to adjust your timer, avoiding 0 as the timeout value:
setTimeout(function() {
  self.timerCallback();
}, 34);
34 ms is plenty, as video frame rates are typically never more than 30 FPS (NTSC) or 25 FPS (PAL), i.e. 1000 / 30. If you use 0, you risk stacking up your calls, which means the browser will be busy trying to empty the event queue.
If you use anything lower than 33-34 ms, you end up processing the same frame twice or more, which of course is unnecessary (your video is actually 29.97 FPS/NTSC, so you might want to keep 34 ms).
Reason 2
The video resolution is also full HD (1920x1080), which is a bit too much for canvas and JS to process in real time (on a typical consumer computer). Try to reduce the video size so that a normally spec'ed computer is able to process the data.
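If re-encoding the video isn't an option, a hedged alternative is to process a downscaled copy of each frame instead; using the names from the question's doLoad() (the 0.5 factor is just an example):

var scale = 0.5; // half resolution = a quarter of the pixels to process

this.cIn.width = this.v2.videoWidth * scale;
this.cIn.height = this.v2.videoHeight * scale;
this.cOut.width = this.cIn.width;
this.cOut.height = this.cIn.height;

// drawImage scales the frame down; getImageData then reads the
// smaller canvas instead of the full 1920x1080 frame
this.ctxIn.drawImage(this.v2, 0, 0, this.cIn.width, this.cIn.height);
var frame = this.ctxIn.getImageData(0, 0, this.cIn.width, this.cIn.height);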
Reason 3 (in part)
You don't need two on-screen canvases, or even an on-screen video. Try to create these tags dynamically without inserting them into the DOM. Use a single canvas on-screen and draw the result to that (you can putImageData from one canvas to another).
Reason 4 (in part)
Ideally, replace setTimeout with a requestAnimationFrame approach, as this improves synchronization and efficiency considerably. You can implement a throttle to reduce the processing to, for example, 30 FPS, so you don't process the same frame twice (ref. the ~30 FPS video frame rate), as sketched below.
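Applied to the timerCallback from the question, that could look roughly like this (a sketch; this.last is a new field used for the throttle):

timerCallback: function(now) {
  if (this.v2.paused || this.v2.ended) {
    return;
  }
  // only process a new frame roughly every 33 ms (~30 FPS)
  if (!this.last || now - this.last >= 1000 / 30) {
    this.last = now;
    this.ctxIn.drawImage(this.v2, 0, 0, this.width, this.height);
    this.pixelScan();
  }
  var self = this;
  requestAnimationFrame(function(t) {
    self.timerCallback(t);
  });
},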
Update
To create these elements dynamically (ref reason 3) you can do something like this:
var canvas = document.createElement('canvas'),
    video = document.createElement('video'),
    ctx = canvas.getContext('2d');

video.preload = 'auto';
video.addEventListener('canplay', start, false);

if (video.canPlayType('video/mp4')) {
  video.src = 'videoUrl.mp4';
} // else test other container formats (e.g. webm, ogg) the same way
Then, when the video has loaded enough data (on loadedmetadata or canplay), you set the off-screen (and on-screen) canvas elements to the size of the video:
canvas.width = video.videoWidth;
canvas.height = video.videoHeight;
Then, while the video is playing, process its buffer and copy the result to the on-screen canvas you defined before.
You don't have to have an off-screen canvas; I merely mention this because your original code used an in and an out canvas, IIRC. You can simply use a single on-screen canvas with the off-screen video: draw the video frame to the canvas, process it, and put the processed data back. That should work fine in this case too.
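A sketch of that single-canvas variant, continuing the dynamically created video and canvas from the snippet above (the grayscale math is reused from the question):

function start() {
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  document.body.appendChild(canvas); // the only element added to the DOM
  video.play();
  requestAnimationFrame(render);
}

function render() {
  // draw the current frame, process it, and put it straight back
  ctx.drawImage(video, 0, 0);
  var frame = ctx.getImageData(0, 0, canvas.width, canvas.height);
  for (var i = 0; i < frame.data.length; i += 4) {
    var gray = frame.data[i] * .3 + frame.data[i + 1] * .59 + frame.data[i + 2] * .11;
    frame.data[i] = frame.data[i + 1] = frame.data[i + 2] = gray;
  }
  ctx.putImageData(frame, 0, 0);
  requestAnimationFrame(render);
}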
I ran a profile in Chrome and it points to line 46 as taking up the most CPU.
setTimeout(function() {
  self.timerCallback();
}, 0);
Perhaps increasing the timeout will stop it from lagging.
I had the same issue and tried a number of fixes. I was using Premiere Elements, which didn't export to mp4, and HandBrake to convert the format. I also tried FFmpeg to do the conversion, but neither worked.
What I did was switch to Kdenlive as my video editor; it exported directly to MP4, and that video worked perfectly.
So, if you have this slow render issue, it is probably an issue with the video encoding. The easiest fix is to get a high-quality video editor like Premiere Pro, Final Cut, or Kdenlive. Kdenlive is free, but it has a huge learning curve and poor public documentation.