I'm currently working on a progressive web app to play music (like Spotify or Google Play Music). I came upon a few things I'm not quite sure how to accomplish.
One of them is the feature where the music stops playing as soon as the aux cable is removed from the phone. Is there a JavaScript event or something similar I can attach an event handler to, so I can dispatch the 'pause' action once the aux jack is removed?
You won't get such a hardware event listener in a PWA with JavaScript alone.
If you have the option to ask the user to install a companion service (the way some apps ask you to install a barcode scanner service to scan barcodes for them), you can create a helper class like in this solution, with an intent filter for the headset plug like in this solution; when the aux-removed event is detected, it calls into your PWA via an intent, which pauses the music.
As an addition: I ran into some more issues, discussed in this post, regarding media notifications.
Implementing a hidden <audio> element allowed me to display those notifications with controls on them. Once I hooked up all of the controls (onPause, onPlay, skip, seek, ...), the device would automatically execute the onPause event once the cable was removed from the device. This might be an Android- or Chrome-specific feature, but it did the trick and was suitable for me.
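For anyone wanting to reproduce this, here is a rough sketch of that kind of setup using the Media Session API, assuming a hidden <audio> element with the id hidden-player (the element id and the metadata values are made up; the action names are the standard Media Session ones):

const audio = document.getElementById('hidden-player');

// Metadata shown on the media notification
navigator.mediaSession.metadata = new MediaMetadata({
  title: 'Track title',
  artist: 'Artist name',
  album: 'Album name'
});

// Hook up the notification controls; with these registered, Chrome on Android
// also called the pause handler when the aux cable was pulled, as described above
navigator.mediaSession.setActionHandler('play', () => audio.play());
navigator.mediaSession.setActionHandler('pause', () => audio.pause());
navigator.mediaSession.setActionHandler('previoustrack', () => { /* skip to previous track */ });
navigator.mediaSession.setActionHandler('nexttrack', () => { /* skip to next track */ });
navigator.mediaSession.setActionHandler('seekbackward', details => { /* seek backwards */ });
navigator.mediaSession.setActionHandler('seekforward', details => { /* seek forwards */ });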
I'm developing a desktop application using ElectronJS.
How can I play System Media sounds?
I know that for C# I can use
// Plays the sound associated with the Asterisk system event.
System.Media.SystemSounds.Asterisk.Play();
How can I make a similar call in ElectronJS?
As far as I am aware, there is no way to play a system sound from an Electron app directly. However, there are workarounds. You can ship system sounds with your application and play one depending on the OS your user is running, and a simple system beep can be produced by importing shell from Electron and calling shell.beep().
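The beep route is the simplest one:

const { shell } = require('electron');
shell.beep(); // plays the platform's default alert/beep sound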
Another alternative may be detecting what OS is being used and pointing your media player at the relevant system sound file. This can be done with a hidden window that includes an HTML5 media player.
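A rough sketch of that second approach from the main process; the sound file paths below are assumptions that should be verified per platform and version, and player.html stands in for a small bundled page that reads the ?sound query parameter and plays it with an HTML5 audio element:

const { BrowserWindow } = require('electron');

// Example system sound locations, one per platform (verify on your targets)
const SOUNDS = {
  darwin: '/System/Library/Sounds/Glass.aiff',
  win32: 'C:\\Windows\\Media\\Windows Ding.wav',
  linux: '/usr/share/sounds/freedesktop/stereo/complete.oga'
};

function playSystemSound() {
  const soundFile = SOUNDS[process.platform];
  if (!soundFile) return;
  // Hidden window whose only job is to host the HTML5 audio player
  const player = new BrowserWindow({ show: false });
  player.loadFile('player.html', { query: { sound: soundFile } });
}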
I'm currently developing a prototype that logs video chat information from third party services like Hangouts, Zoom, etc.
So far I'm unable to get a simple event to log to the console with navigator.mediaDevices.ondevicechange. I'm using the latest version of Chrome.
https://codepen.io/anon/pen/dqbNKR
I'm using this pen, and all I want to do is just log to the console when my video camera turns on/off. Is ondevicechange the right event?
https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/ondevicechange
A devicechange event is sent to a MediaDevices instance whenever a media device such as a camera, microphone, or speaker is connected to or removed from the system. It's a generic Event with no added properties.
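A minimal sketch of that wiring (keeping in mind that devicechange only fires when a device is physically connected or removed, and that device labels stay empty until a getUserMedia permission has been granted):

navigator.mediaDevices.addEventListener('devicechange', async () => {
  // Re-enumerate to see what changed
  const devices = await navigator.mediaDevices.enumerateDevices();
  const cameras = devices.filter(d => d.kind === 'videoinput');
  console.log('devicechange fired; cameras now:', cameras.map(d => d.label || d.deviceId));
});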
I know I can also look at streams of specific elements, but since it's 3rd party services I don't necessarily know which elements to look at.
So how can I detect when my webcam turns on/off in a third party app in the browser?
Thanks for help =)
As I was typing this I came across the following, but I still need to test it:
How to check with JavaScript that webcam is being used in Chrome
I am trying to fix an issue where a sender (Android app) sends a queue of audio items, a few of which can have an invalid URL.
When that happens, the receiver hits an error and sets the player to an IDLE state.
Then, when I tap the play-next button or do anything else in my Android app, the receiver doesn't respond.
What am I doing wrong?
screenshot of the chrome console
screenshot of the chrome console 2
You may try these suggestions discussed in Media events:
Receiver applications can tweak this logic by applying their own heuristics and changing the state in the customizedStatusCallback API. This API will provide the current state, and the application can return a modified state.
The receiver app can also choose not to go to IDLE after ended or error by overriding onEnded / onError, if that makes sense in a particular scenario.
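A rough sketch of both ideas against the legacy Receiver v2 cast.receiver.MediaManager (names as used in that SDK; adapt if you are on the newer CAF receiver):

// mediaManager is your cast.receiver.MediaManager instance
const originalOnError = mediaManager.onError.bind(mediaManager);

// Keep a bad queue item from pushing the player into IDLE
mediaManager.onError = function (error) {
  console.log('Load/playback error, handling it instead of going IDLE:', error);
  // e.g. skip to the next queue item here; call originalOnError(error) only
  // when you really do want the default IDLE behaviour
};

// Adjust the status reported back to senders before it is broadcast
mediaManager.customizedStatusCallback = function (mediaStatus) {
  // inspect or tweak mediaStatus.playerState here if needed
  return mediaStatus;
};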
I am creating a game using Construct 2, but when previewing on an Android smartphone I found an audio playback and delay issue:
This is possibly the curse of the Web Audio API, in that many browsers require the user to touch the screen first or no music will be played. Worse, if another piece of music is to be played, the user must touch the screen once again. This is "by design" in these smartphone browsers. Only Firefox seems to allow music to be played without a user-initiated touch.
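The usual raw Web Audio workaround is to "unlock" the context inside the first touch handler, along these lines (a generic sketch, not Construct 2 specific):

const AudioCtx = window.AudioContext || window.webkitAudioContext;
const ctx = new AudioCtx();

document.addEventListener('touchend', function unlock() {
  // Playing a silent one-sample buffer inside a touch handler unlocks audio playback
  const source = ctx.createBufferSource();
  source.buffer = ctx.createBuffer(1, 1, 22050);
  source.connect(ctx.destination);
  source.start(0);
  document.removeEventListener('touchend', unlock);
}, false);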
I've seen that this issue has been covered several times (e.g. Website HTML 5 Audio Autoplay and https://stackoverflow.com/a/22331782/144201), and some of the possible suggestions include switching entirely to another audio JavaScript library such as SoundJS or howler.js.
Does anyone have experience bringing in such an audio library to solve the issue above for the Cordova Android export option? Does it work on all Android devices? In fact, can anyone provide a link to an HTML5 game/page/app, exported with C2, that uses such an audio library and plays music without requiring the user's initial touch on Android, so I could check? I just want confirmation that this is truly possible.
Or is there a more elegant way for Construct 2?
Previewing in the browser has the "user must touch the screen once" issue because it is "by design". But if the C2 app is exported via Cordova and uses Crosswalk, the game can play music without requiring the user to ever touch the screen first.
See https://www.scirra.com/tutorials/809/how-to-export-to-android-with-crosswalk . Although the tutorial is outdated for the current Intel XDK, the instructions are more or less the same. However, newer C2 versions also create an .xdk file upon Cordova export. In the Intel XDK, you must "Open an Intel XDK project" instead of "Import an existing HTML5 project". See https://software.intel.com/en-us/forums/intel-xdk/topic/607195 for more info.
Okay, so my application is a flashlight/torch app found here:
https://github.com/Skelware/Fancy-Flashlight and it uses a Cordova plugin found here: https://github.com/Skelware/Cordova-Flashlight
Currently I only care about Android. To briefly explain how an app like this works on Android: the app has to request access to the camera, and this is done in the background; it takes a while before the camera is loaded, so loading (and unloading) should happen as little as possible. While an app has access to the camera, no other app can request access.
When my app starts, it loads the camera and does what it needs to do. But when the user switches to a different app or closes mine, the camera is still registered to my app, which prevents all other apps from using it.
Although I would prefer to handle this on the JavaScript side, it would also be okay to handle this natively in the plugin.
The window's unload event seems to fire when exiting (fully exiting, that is) the app, but there isn't enough time to release the camera.
Cordova version is 4.0 and Android version is 4.4, although I doubt that would matter.
What should I do?
The problem in your case is that the app goes to sleep when it hits background mode (the user presses the home button, etc.). This can be prevented with a Cordova background-mode plugin by calling
cordova.plugins.backgroundMode.enable();
You can call this when you have acquired the camera, and when you release it you can call its counterpart:
cordova.plugins.backgroundMode.disable();
This way you don't prevent sleeping when it isn't necessary, and thus you save some battery.
Then you simply need to bind the pause event:
document.addEventListener("pause", function() {
// Here call your release function and in the release function, you can call the disable for background mode
}, false);
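Putting the pieces together (a sketch; flashlight.acquire() and flashlight.release() stand in for whatever your plugin actually exposes):

function acquireCamera() {
  flashlight.acquire();                       // hypothetical plugin call
  cordova.plugins.backgroundMode.enable();    // keep running while we hold the camera
}

function releaseCamera() {
  flashlight.release();                       // hypothetical plugin call
  cordova.plugins.backgroundMode.disable();   // allow normal sleep again
}

// Release the camera as soon as the app is sent to the background
document.addEventListener('pause', releaseCamera, false);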