I can't manage to get the camera control to work on a phone using Babylon's defaultVRExperience.
I can't understand what's missing. I've tried everything I can think of, and I can't find any examples that work outside of the Babylon.js Playground.
Example of it working perfectly in the Babylon Playground with just a few lines of code: https://www.babylonjs-playground.com/#VIGXA3#38
Example of the same code not working outside of the Playground: http://jsfiddle.net/dr3k5oqb/
Here's an example using some suggestions I found in an article about building VR for phones with Babylon; it isn't working either: https://jsfiddle.net/2cdLw0tk/2/
Phone: OnePlus 5 with OxygenOS 9.0.9
Browser: Chrome Version 79.0.3945.93
Literally any help would be greatly appreciated...
I assume that you are using iPhone Safari.
The story is that Apple is preparing to introduce a new security/privacy setting to prevent sites from being able to access a device’s accelerometer and gyroscope, which means some of those VR/AR items you come across online probably won’t work quite as well until you give express permission to do so. (full article)
In order to use VR we should ask users to allow access to motion and orientation by using this code:
function onClick() {
  if (typeof DeviceMotionEvent.requestPermission === 'function') {
    DeviceMotionEvent.requestPermission()
      .then(permissionState => {
        if (permissionState === 'granted') {
          // DeviceMotionEvent.requestPermission() has been granted
        }
      })
      .catch(console.error);
  }
}
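On iOS the permission prompt only appears when the call is made from a user gesture, so the handler has to be wired to a tap or click. A minimal sketch (the button id and wiring here are my own illustration, not part of the original answer):

<button id="enter-vr">Enter VR</button>
<script>
  document.getElementById('enter-vr').addEventListener('click', function () {
    // Assumption: only newer iOS Safari exposes requestPermission; elsewhere no prompt is needed.
    if (typeof DeviceMotionEvent !== 'undefined' &&
        typeof DeviceMotionEvent.requestPermission === 'function') {
      DeviceMotionEvent.requestPermission()
        .then(function (permissionState) {
          if (permissionState === 'granted') {
            // safe to start the Babylon VR experience here
          }
        })
        .catch(console.error);
    }
  });
</script>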
Here is a jsfiddle of the Babylon iPhone VR example working outside the playground.
Open this demo link on your phone.
Chrome v76 and later has removed usage of DeviceMotionEvent over HTTP, meaning that VR accelerometer control in Chrome only works when the page is served over HTTPS.
Source: https://www.chromestatus.com/feature/5688035094036480
This can be confirmed by switching my example links to https; they then start working in Chrome on my phone.
Mudin's answer could be good to look at if you want to support Safari.
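As a quick guard (my own sketch, not from the original answers), a page can check that it is running in a secure context before enabling device-orientation camera control:

// Sketch: on Chrome 76+ DeviceMotionEvent / DeviceOrientationEvent only fire
// over HTTPS, so warn early if the page is not in a secure context.
if (!window.isSecureContext) {
  console.warn('Page is not served over HTTPS; device orientation events ' +
               'will not fire, so VR camera control will not work.');
}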
Summary
We cannot access the camera from an iOS 11 (public release) home screen web app using either WebRTC or the file input; details below. How can our users continue to access the camera, please?
We are serving the web app page over https.
Update, April
The public release of iOS 11.3 seems to have fixed the issue and file input camera access is working again!
Update, March
As people here have said, the Apple docs advise that web app camera function is returning in 11.3, along with service workers. This is good, but we are not sure yet if we want everyone to reinstall again until we can thoroughly test on the 11.3 GM.
Solution, November
We lost hope that Apple would fix this and moved forward. We modified our web app to remove the iOS "Add to home screen" function and asked affected users to remove any previous home screen icon.
Update, 6 December
iOS 11.2 and iOS 11.1.2 don't fix the issue.
Workarounds, 21 September
It seems we could ask existing customers of the web app to:
- not upgrade to iOS 11 - good luck with that :)
- take photos in the iOS camera app and then select them back into the web app
- wait for the next iOS beta
- reinstall as a Safari in-browser page (after we remove the ATHS logic)
- switch to Android
File Input
Our current production code uses a file input, which has worked fine for years with iOS 10 and older. On iOS 11 it works in a Safari tab but not from the home screen app. In the latter case the camera is opened and only a black screen is shown, hence it is unusable.
<meta name="apple-mobile-web-app-capable" content="yes">
...
<input type="file" accept="image/*">
WebRTC
Safari 11 on iOS 11 offers WebRTC media capture, which is great.
We can capture a camera image to canvas on a normal web page on desktop and mobile using navigator.mediaDevices.getUserMedia, per the sample code linked here.
When we add the page to the iPad or iPhone home screen, navigator.mediaDevices becomes undefined and unusable.
<meta name="apple-mobile-web-app-capable" content="yes">
...
// for some reason safari on mac can debug ios safari page but not ios home screen web apps
var d = 'typeof navigator : ' + typeof navigator; //object
d += 'typeof navigator.mediaDevices : ' + typeof navigator.mediaDevices; // undefined
// try alternates
d += 'typeof navigator.getUserMedia : ' + typeof navigator.getUserMedia; // undefined
d += 'typeof navigator.webkitGetUserMedia : ' + typeof navigator.webkitGetUserMedia; // undefined
status1.innerHTML = d;
We have quite a similar problem. So far the only workaround we have found is to remove the "apple-mobile-web-app-capable" meta tag and let users open the page in Safari, where everything seems to work normally.
Update: While some earlier published changelogs and postings led me to believe that Web Apps using a manifest.json instead of apple-mobile-web-app-capable would finally have access to a proper WebRTC implementation, unfortunately this is not true, as others here have pointed out and testing has confirmed. Sad face.
Sorry for the inconveniences caused by this and let's hope that one lucky day in a galaxy far, far away Apple will finally give us camera access in views powered by (non-Safari) WebKit...
Yes, as others have mentioned, getUserMedia is only available directly in Safari, not in a UIWebView or WKWebView, so unfortunately your only choices are:
- removing <meta name="apple-mobile-web-app-capable" content="yes"> so your 'app' runs in a normal Safari tab, where getUserMedia is accessible
- using a framework like Apache Cordova that grants you access to a device's camera in other ways.
Here's to hoping Apple removes this WebRTC restriction rather sooner than later...
Source:
For developers that use WebKit in their apps, RTCPeerConnection and RTCDataChannel are available in any web view, but access to the camera and microphone is currently limited to Safari.
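For completeness, here is a small sketch (my own addition, not from the answer) that feature-detects getUserMedia and falls back to a plain file input when it is unavailable, for example inside an iOS home screen web app:

// Sketch: prefer WebRTC capture when present, otherwise fall back to a file input.
function openCamera(videoEl, fileInputEl) {
  if (navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
    navigator.mediaDevices.getUserMedia({ video: true })
      .then(function (stream) { videoEl.srcObject = stream; })
      .catch(console.error);
  } else {
    // No WebRTC capture available: let the user shoot/pick a photo instead.
    fileInputEl.click();
  }
}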
Good news! The camera finally seems to be accessible from a home screen web app in the first iOS 11.3 beta.
I have made a repo with a few files, which demonstrate that it works:
https://github.com/joachimboggild/uploadtest
Steps to test:
Serve these files from a website accessible from your phone
Open the index.html in iOS Safari
Add to home screen
Open app from home screen. Now the web page is open in full screen, without navigation ui.
Press the file button to select an image from the camera.
Now the camera should work normally and not show a black screen. This demonstrates that the functionality works again.
I must add that I use a plain file input field, not getUserMedia or some such. I do not know whether that works.
Apparently it is solved in iOS 13 beta 1:
https://twitter.com/ChromiumDev/status/1136541745158791168?s=09
Update 20/03/2020: https://twitter.com/firt/status/1241163092207243273?s=19
This seems to be working again in iOS 11.4 if you are using a file input field.
Recently I faced the same problem; the only solution I came up with was to open the app in the browser instead of the normal mode, but only on iOS!
The trick was to create 2 manifest.json files with different configurations.
The normal one for Android, and one for everything Apple, manifest-ios.json; the only difference will be in the display property.
Step 1: Add id to the manifest link tag:
<link id="manifest" rel="manifest" href="manifest.json">
Step 2: Add this script to the bottom of the body:
<script>
let isIOS = /(ipad|iphone|ipod|mac)/g.test(navigator.userAgent.toLowerCase());
let manifest = document.getElementById("manifest");
if (isIOS)
manifest.href = 'manifest-ios.json'
</script>
Step 3: In manifest-ios.json, set the display to browser:
{
"name": "APP",
"short_name": "app",
"theme_color": "#0F0",
"display": "browser", // <---- use this instead of standard
...
}
Another problem appears: the app sometimes opens multiple times in multiple tabs.
But hope it helps you guys!
Currently I am developing an application using Phaser JS.
Is this a browser compatibility issue, or is it a Phaser JS issue where the fullscreen API is not functioning?
Here is the code snippet that I use:
if (phaser.scale.isFullScreen) {
phaser.scale.stopFullScreen();
} else {
phaser.scale.startFullScreen(false);
}
Similar Problem
So I tried what they suggest in the link:
phaser.scale.compatibility.supportsFullScreen = true;
phaser.scale.startFullScreen(false, false);
Even the example on the Phaser page is not functioning:
Phaser Full Screen
Failed to execute 'requestFullscreen' on 'Element': API can only be initiated by a user gesture.
startFullScreen # phaser.2.6.2.min.js:22
phaser.2.6.2.min.js:22 Phaser.ScaleManager: requestFullscreen failed or device does not support the Fullscreen API
fullScreenError # phaser.2.6.2.min.js:22
I tested it on:
Chrome 56.0.2924.87
Android 4.4.2
You can’t set the value of supportsFullScreen, that’s up to the user’s browser to decide.
Instead, query it and only start full screen if it is supported.
if (game.scale.compatibility.supportsFullScreen) {
  game.scale.startFullScreen();
}
Tobe's answer is correct, but the error message also points out you need to trigger it from an event:
"API can only be initiated by a user gesture."
This may help others that stumble upon this error. Make sure you only call startFullScreen on a user tap.
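As a rough sketch (my own illustration, not from the answers above, and assuming game is your Phaser.Game instance), in Phaser 2 the call can be hooked to a pointer-down event so the fullscreen request happens inside a user gesture:

// Sketch: toggle fullscreen from a tap/click handler (Phaser 2.x ScaleManager).
game.input.onDown.add(function () {
  if (!game.scale.compatibility.supportsFullScreen) {
    return; // the browser/device does not support the Fullscreen API
  }
  if (game.scale.isFullScreen) {
    game.scale.stopFullScreen();
  } else {
    game.scale.startFullScreen(false);
  }
});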
I get this error in Firefox 51 when I try to execute the following code and when I select my laptop's camera:
navigator.getMedia = (navigator.getUserMedia ||
navigator.webkitGetUserMedia ||
navigator.mediaDevices.getUserMedia ||
navigator.msGetUserMedia);
navigator.getMedia({
video: true,
audio: false
},
function(stream) {
if (navigator.mozGetUserMedia) {
video.mozSrcObject = stream;
} else {
var vendorURL = window.URL || window.webkitURL;
video.src = vendorURL.createObjectURL(stream);
}
video.play();
},
function(err) {
console.log("An error occured! " + err);
}
);
Error:
NotReadableError: Failed to allocate videosource
Can someone elaborate what this means? Is my webcam broken? I used it from the script just yesterday without problems. It's not allocated to another application.
NotReadableError is the spec compliant error thrown by Firefox when webcam access is allowed but not possible.
Most commonly this happens on Windows because the webcam is already in use by another app. Firefox will throw this error on both Windows and Mac even though only on Windows processes get exclusive access to the webcam.
The error can happen for other reasons:
Although the user granted permission to use the matching devices, a hardware error occurred at the operating system, browser, or Web page level which prevented access to the device.
Chrome throws TrackStartError instead. It also throws it for other reasons. Chrome tabs can share the same device.
Source: common getUserMedia() errors .
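A small sketch (my own addition, not from the answer) of how these errors surface in a promise-based getUserMedia call, so the "device busy" case can be handled explicitly:

// Sketch: NotReadableError (Firefox) / TrackStartError (older Chrome) usually
// mean the camera exists and permission was granted, but it could not be started.
navigator.mediaDevices.getUserMedia({ video: true })
  .then(function (stream) {
    document.querySelector('video').srcObject = stream;
  })
  .catch(function (err) {
    if (err.name === 'NotReadableError' || err.name === 'TrackStartError') {
      console.log('Camera could not be started - is another application using it?');
    } else {
      console.log('getUserMedia failed: ' + err.name);
    }
  });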
Please make sure your camera is not being used by some other application (Chrome, IE, or any other browser).
I wasted half my day searching for a solution, and in the end found out my camera was being used by another application.
I've encountered the same issue on Windows 10, with no other apps using my video device. The problem was that Windows 10 has a setting under Settings -> App permissions (in the left column) for the microphone and camera ("Allow apps to access your mic/camera") which needs to be turned on. It does not matter that you don't find your browser in the app list below this setting; just enable it here and voila.
The message getUserMedia() error: NotReadableError was displayed in Chromium but not in Firefox. I also noticed that WebRTC examples using getUserMedia without microphone access worked correctly in Chromium.
In fact, I had to make sure my microphone is enabled and select the correct microphone in Chromium / Chrome settings. Then WebRTC with audio and video access worked correctly.
If it is not a microphone problem, it may also be a webcam problem so you have to make sure your webcam is enabled and selected correctly in Chromium / Chrome settings.
Note that only one app at a time can use the webcam / microphone.
If you are here in or after December 2019, I would like to tell you a few things:
- The navigator.getUserMedia() feature is deprecated.
- Its successor in browsers is window.navigator.mediaDevices.getUserMedia.
- The new feature may not be supported in many browsers, since it is still experimental. A few days ago Chrome released Chrome 79, and it is still not working in Chrome 79 for me; other than Chrome and IE, it is working in all the browsers I have tried.
Here is some quick code:
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <meta http-equiv="X-UA-Compatible" content="ie=edge">
  <title>Jello</title>
  <style>
    video {
      width: 30%;
      height: auto;
    }
  </style>
</head>
<body>
  <video autoplay controls></video>
  <button>Open Cam</button>
  <script>
    function getCam() {
      window.navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
        // let videoTrack = stream.getVideoTracks()[0];
        // console.log(videoTrack);
        document.querySelector("video").srcObject = stream;
      }).catch(err => console.log(err.name));
    }
    // getCam();
    document.querySelector("button").addEventListener("click", getCam);
  </script>
</body>
</html>
Edit: if you are using Windows 10, make sure you give Chrome access to your microphone and camera, otherwise it won't work.
There is another solution to this problem. I had it with the camera not working in Firefox and Skype, but working with the Camera app.
The solution was to give camera access to "classic apps" (I do not know exactly what the setting is called in English). It is in the same place where access can be granted or revoked for all other apps; just below them, make sure classic apps are allowed as well. It is not enough to allow just the app in question, like Firefox; all classic apps need to have that enabled.
TL;DR - check device drivers for any "funny" camera drivers.
I just spent an hour on a call with a user who kept hitting this error no matter what we tried, and that includes going down every answer to this question. We found another cause, which I will add here for the next poor soul who stumbles on this.
In his case, he had installed an app called ChromaCam (which I exited almost as a first diagnostic step), and this app installs a device driver called "Personify Virtual Camera Universal". A web search for that driver name shows a whole bunch of people having camera problems, and the solution seems to be fairly universal: uninstall the device driver. He didn't even know what ChromaCam was or why it was on his laptop, so we removed it, uninstalled the driver, and everything started working perfectly!
There was another person in a different thread who had a similar problem, and for him it was some custom HP (?? - I think that's what he said) camera driver instead of the normal generic one that Windows would have chosen.
Can someone elaborate what this means? Is my webcam broken? I used it from the script just yesterday without problems. It's not allocated to another application.
I've encountered exactly the same issue!
Shame on me! In the meantime I had added a beforeunload event handler, including the event.preventDefault() call as shown in the example.
After removing this event.preventDefault(), everything worked fine, as expected.
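For illustration (my own sketch, not the poster's actual code), the problematic pattern looks roughly like this; dropping the preventDefault() call (or the whole handler) restored camera access:

// Sketch: a beforeunload handler along these lines interfered with getUserMedia.
window.addEventListener('beforeunload', function (event) {
  event.preventDefault();   // <- removing this line fixed the camera problem
  event.returnValue = '';   // some browsers require this to show the prompt
});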
I searched everywhere for a solution and at last found this. Basically, in my case camera permission was turned on and Mozilla Firefox could access the webcam but Chrome couldn't. In fact, older versions of Chrome like 74.x could use the webcam, but the latest 84.x could not. I thought the problem was with Chrome, but at last I tried turning on microphone access in the Windows 10 settings. Now Chrome can access the webcam too.
Solution: please check that both camera and microphone access are turned on in the Windows settings.
The NotReadableError: Could not start video source is also thrown during a session (not local only!) if the camera is switched too quickly.
I don't know the solution yet, but I will edit my post accordingly once I have it.
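One mitigation that is sometimes suggested (my own sketch, not something confirmed by the poster) is to stop the tracks of the current stream before requesting the new camera, so the device is released first:

// Sketch: release the current camera before switching to a new deviceId.
async function switchCamera(videoEl, currentStream, newDeviceId) {
  if (currentStream) {
    currentStream.getTracks().forEach(track => track.stop());
  }
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { deviceId: { exact: newDeviceId } }
  });
  videoEl.srcObject = stream;
  return stream;
}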
I'm quite frustrated with a simple SoundCloud API streaming example I've been working on for a couple of days now. It basically just chooses an SC URL at random and prints out volume information. I'm using the audiostreamsource.js library by Gregg Tavares for creating the audio context and p5.js for creating a paragraph.
That's it...
It works perfectly fine in Firefox/Chrome. But for some reason it only works in Safari when I refresh the page. Sometimes it even crashes Safari. :( I think I'm just not experienced enough to get behind that problem. I really tried to solve it from every possible angle. Now I'm stuck... :(
You can see my little example here: http://christianlosert.com/test/01/
EDIT: To see the problem you have to paste the link above into a new browser tab. Then you'll see that after a couple of milliseconds the streaming aborts in Safari.
Can anybody see my mistake?
PS: With the help of Gregg Tavares I tried to create an streaming example without the audiostreamsource.js/p5.js library to make sure the bug is on my side of the code. Curiously this example works ONLY with Safari (not Firefox/Chrome): http://christianlosert.com/test/greg/
I really have absolutely no clue what's going on here
The soundcloud 3.0 library does not work with Safari. See
soundcloud's 2.0 sdk works but 3.0 does not as of Oct 29th, 2015
You can either
use the soundcloud 2.0 library (problem: it initializes flash even if it doesn't use it)
Just make the soundcloud API request on your own using some XHR code
Basically you make an XHR request to
https://api.soundcloud.com/resolve?url=<musicUrl>&client_id=<yourclientid>&format=json&_status_code[302]=200
note: you need to call encodeURIComponent on <musicUrl> and possibly on _status_code[302]
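A small sketch of that request (my own illustration; the client id and track URL are placeholders you must supply):

// Sketch: resolve a SoundCloud track URL via the /resolve endpoint with XHR.
function resolveTrack(musicUrl, clientId, callback) {
  var url = 'https://api.soundcloud.com/resolve' +
            '?url=' + encodeURIComponent(musicUrl) +
            '&client_id=' + clientId +
            '&format=json' +
            '&' + encodeURIComponent('_status_code[302]') + '=200';
  var xhr = new XMLHttpRequest();
  xhr.open('GET', url);
  xhr.onload = function () { callback(null, JSON.parse(xhr.responseText)); };
  xhr.onerror = function () { callback(new Error('request failed')); };
  xhr.send();
}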
If the result has a status of 302 then follow the location
var result = JSON.parse(resultString);
if (result.status.substr(0, 3) === "302" && result.location) {
  // follow the redirect: do another XHR on result.location
} else {
  // use result as normal
}
I have an application that is using jQuery, jQuery Mobile and Spine.js running on PhoneGap (0.9.5.1), and I have been having some issues getting it to work properly on iOS.
The application should be launching the camera when a div is tapped. In my controller I have it so that it does something similar to the following:
myController = Spine.Controller.create({
  events: {"tap .take-picture": "takePic"},
  takePic: function(){
    var self = this;
    navigator.camera.getPicture(function(data){
      self.doStuffWith(data);
    },
    null,
    {quality: 50, destinationType: Camera.DestinationType.DATA_URL, sourceType: Camera.PictureSourceType.CAMERA});
  },
  doStuffWith: function(data){
    // Doing stuff with said data
  }
});
What is really confusing me is that this code works properly on Android. Is there some kind of iOS quirk that makes it so that tap events aren't sent off properly?
I think that you are trying to use the Android phonegap.js within the iPhone app. You need to make sure that you are including the right phonegap.js for the platform you are developing for. Although they share the same name, each version of phonegap.js is tailored to its host OS.
This could be several things:
You may be testing this in the iOS Simulator. There is no camera in the Simulator, and you don't have a fail callback specified; there is also a bug (I believe) in the API where it doesn't call the fail callback if the source type is not available anyway. You should see this ("Source Type Not Available") in the Run Log (Cmd-Shift-R).
On a device, I tested your code separately and ran it in deviceReady(); it runs, so the API call seems to be correct. I also added a touch handler to a button to call the code, so it appears tap events are working. Based on these tests (on a device):
(a) the API call works
(b) tap events work
Which leads me to the conclusion that the bug is outside of those two possibilities.
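For reference, a minimal sketch of how the call can be exercised on a device (the element id and handler wiring here are my own, not from the original code):

// Sketch: run the camera call after PhoneGap's deviceready event and trigger
// it from a touch handler, so both the API call and the tap path are tested.
document.addEventListener('deviceready', function () {
  var btn = document.getElementById('take-picture');   // hypothetical element
  btn.addEventListener('touchend', function () {
    navigator.camera.getPicture(
      function (data) { console.log('got image data, length: ' + data.length); },
      function (err)  { console.log('camera failed: ' + err); },
      { quality: 50,
        destinationType: Camera.DestinationType.DATA_URL,
        sourceType: Camera.PictureSourceType.CAMERA }
    );
  });
}, false);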