I've recently started using JavaFX for its WebKit-based browser instead of the Browser widget in SWT (because the SWT one doesn't support any current browser engines).
The application I'm developing loads a page containing a pre-generated YouTube playlist that plays automatically. There's nothing wrong with the YouTube player itself (it behaves correctly when viewed in a regular browser).
However, every time a song finishes and the next one loads, the volume ignores the YouTube setting and goes straight to 100%. If I click the volume slider again, it fixes itself.
Here is the code used to display the given URL:
public class Browser extends Region {
    final WebView browser = new WebView();
    final WebEngine webEngine = browser.getEngine();

    public Browser(String url) {
        getStyleClass().add("browser");
        webEngine.load(url);
        getChildren().add(browser);
    }
}
Which gets wrapped in a JFXPanel:
private JFXPanel playerPanel;
....
Browser b = new Browser("google.com/youtubeplayer/");
Scene scene = new Scene(b, playerPanel.getWidth(), playerPanel.getHeight(), Color.web("#666970"));
playerPanel.setScene(scene);
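As a side note on the embedding itself, JavaFX scene-graph construction has to happen on the JavaFX Application Thread; a typical way to schedule the wrapping above from Swing code is a sketch like this (assuming the surrounding code runs on the Swing EDT):
// Sketch: WebView/Scene creation must run on the FX Application Thread;
// Platform.runLater() schedules it there from Swing code.
Platform.runLater(() -> {
    Browser b = new Browser("google.com/youtubeplayer/");
    Scene scene = new Scene(b, playerPanel.getWidth(), playerPanel.getHeight(), Color.web("#666970"));
    playerPanel.setScene(scene);
});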
I did find a similar question on here, but there was no solution listed (other than the author's self-edit, which left no explanation).
Say we have a standard login page (in this example, the Microsoft sign-in page). We can access its HTML elements in the DOM using jQuery or plain JavaScript. In other words, the way to get the pixel location of an element in a web page is quite simply element.getBoundingClientRect():
var rect = element.getBoundingClientRect();
console.log(rect.top, rect.right, rect.bottom, rect.left);
So we can do this from the console or programmatically from a web app.
Now, say we have an Android browser (Chrome/Firefox/WebView) in the foreground at any given time. I can retrieve the URL of the web page in the browser. In this case:
https://login.microsoftonline.com/
So my question is, given the URL of a login page, how do I similarly get access to the same input field on an Android browser?
I need to be able to access the HTML elements of a web page in an Android browser and calculate their pixel locations. As input, I have the URL of a web page in any Android browser.
I am talking about doing this from an Android app, within the Android runtime, i.e. programmatically using Java/JS code.
In case someone needs the DOM structure of the page as text, it can be obtained programmatically with the following (Java) code:
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URL;

URL url;
HttpURLConnection urlConnection = null;
String DOMContent = null;
try {
    url = new URL("https://login.microsoftonline.com/");
    urlConnection = (HttpURLConnection) url.openConnection();
    int responseCode = urlConnection.getResponseCode();
    if (responseCode == HttpURLConnection.HTTP_OK) {
        // readStream() is a helper that drains the response body into a String
        DOMContent = readStream(urlConnection.getInputStream());
    }
} catch (MalformedURLException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
} finally {
    if (urlConnection != null) {
        urlConnection.disconnect(); // release the connection
    }
}
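Note that readStream is not a platform method; a possible implementation (an assumption, since the original helper isn't shown) is:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

// Hypothetical readStream helper: reads the stream line by line into a String.
// Assumes the page is UTF-8 encoded.
private static String readStream(InputStream in) throws IOException {
    BufferedReader reader = new BufferedReader(new InputStreamReader(in, "UTF-8"));
    StringBuilder sb = new StringBuilder();
    String line;
    while ((line = reader.readLine()) != null) {
        sb.append(line).append('\n');
    }
    return sb.toString();
}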
I need access to the HTML elements of a mobile web page within the Android runtime, just as we would in a web app or extension in a desktop browser. Or in other words, I need to be able to access/manipulate the DOM content of a mobile browser from an Android app.
How can this be done?
Update:
JavaScriptBridge looks promising. DocumentBuilder could help us convert the DOM into Java objects which may then be accessed/manipulated natively from Android.
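For instance, a rough sketch of the DocumentBuilder idea (with the caveat that DocumentBuilder expects well-formed XML, so real-world HTML usually needs tidying before this will parse):
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

// Hypothetical: parse the fetched markup (DOMContent from the code above)
// into a W3C DOM Document. Exception handling omitted for brevity.
DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
DocumentBuilder builder = factory.newDocumentBuilder();
Document dom = builder.parse(new InputSource(new StringReader(DOMContent)));
// The DOM can now be walked natively, e.g. dom.getElementsByTagName("input")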
References:
1. How to execute JavaScript on Android?
2. Calling JavaScript functions in WebView
3. How to run Javascript code in a background Service on Android
4. Is there any way to get access to DOM structure in Android's WebView?
5. Android webview Access the DOM
6. In Android Webview, am I able to modify a webpage's DOM?
7. Android WebViews and the JavaScript to Java bridge
8. Using Javascript bridge in android
9. Alternative way for communication between WebView and native
Use the following code after the page has been loaded (implement a custom WebViewClient and hook onPageFinished):
String query = "JSON.stringify(document.getElementById(\"WhateverElement\").getBoundingClientRect());";
webView.evaluateJavascript(query, new ValueCallback<String>() {
    @Override
    public void onReceiveValue(String s) {
        Log.d("LogName", s); // s holds the serialized bounding rect
    }
});
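For completeness, a minimal sketch of the WebViewClient wiring mentioned above (assuming webView and query are in scope):
// Run the query only once the page has finished loading.
webView.setWebViewClient(new WebViewClient() {
    @Override
    public void onPageFinished(WebView view, String url) {
        view.evaluateJavascript(query, s -> Log.d("LogName", s));
    }
});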
The webpage needs to be rendered first in order to get the pixel position of the input field. To manipulate DOM content on Android, the best way is to go with a JavaScript interface; the approach mentioned by ZUN is also helpful. Either way, you can execute arbitrary JavaScript at runtime.
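In case it helps, here is a minimal sketch of such a JavaScript-to-Java bridge (the bridge and method names are illustrative, not from the original post):
// Expose a Java object to page JS; the page can then call AndroidBridge.onRect(...).
webView.getSettings().setJavaScriptEnabled(true);
webView.addJavascriptInterface(new Object() {
    @JavascriptInterface
    public void onRect(String rectJson) {
        Log.d("JsBridge", rectJson); // receives data pushed from the page's JS
    }
}, "AndroidBridge");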
However, there is one particular library that may catch your attention: jsoup.
From the jsoup website:
jsoup is a Java library for working with real-world HTML. It provides
a very convenient API for extracting and manipulating data, using the
best of DOM, CSS, and jquery-like methods. jsoup implements the WHATWG HTML5 specification and parses HTML to the same DOM as modern browsers do.
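For instance, a minimal jsoup sketch might look like the following (the CSS selector is an assumption on my part; also note that jsoup parses HTML but does not render it, so it gives you DOM structure rather than pixel positions, complementing the WebView approaches above):
// Fetch and parse the login page, then query it with CSS selectors.
// On Android, run this off the main thread.
Document doc = Jsoup.connect("https://login.microsoftonline.com/").get();
Element emailInput = doc.selectFirst("input[type=email]"); // hypothetical selector
if (emailInput != null) {
    Log.d("jsoup", emailInput.outerHtml());
}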
Side note: if you don't want to show the webpage to the user until all of your calculations are done, you might be interested in playing with the visibility of the WebView.
I'm developing an Electron app (JavaScript) for audio visualization. There is a Playlist() instance which receives the audio file paths the user wants to play. When the first audio file finishes, it plays the next one. So far so good. The app does intense computational work, extracting audio features from each channel, re-rendering canvases, and animating plots. It does it beautifully.
The problem is: each time the app plays the next file, it gets slower, as if all the previous audio data were still held somewhere. I found the close() method of AudioContext() in the documentation:
"The close() method of the AudioContext Interface closes the audio context, releasing any system audio resources that it uses."
"An AudioContext can now be explicitly closed, thereby releasing any hardware resources associated with the AudioContext. Without this, developers had to depend on garbage collection of the AudioContext to release hardware resources."
I also have found this example of closing and restarting audio contexts:
https://github.com/mdn/webaudio-examples/blob/master/audiocontext-states/index.html
https://mdn.github.io/webaudio-examples/audiocontext-states/
The problem is that I use audioContext.createMediaElementSource(HTMLelementID), and that doesn't let me restart everything by recreating all the nodes as in the example. A simplified version of what I had before is:
class Audio {
    constructor(audioElementID, playlistObj) {
        this.audioContext = new AudioContext();
        this.audioElement = document.getElementById(audioElementID);
        this.track = this.audioContext.createMediaElementSource(this.audioElement);
        this.gainNode = this.audioContext.createGain();
        this.track.connect(this.gainNode);
        this.gainNode.connect(this.audioContext.destination);
        this.audioElement.addEventListener('ended', () => {
            // changes the src of the HTML element (audioElementID) and sets
            // this.audioElement.currentTime to 0
            playlistObj.playNextTrack();
        });
    }
    // everything is a property here for debugging reasons
}

const audio = new Audio('audioID', playlist); // playlist defined somewhere else
To implement the close() method, I had to change it (exactly like the example: a function that recreates everything):
class Audio {
    constructor(audioElementID, playlistObj) {
        this.createAudioContext = () => {
            this.audioContext = new AudioContext();
            this.audioElement = document.getElementById(audioElementID);
            this.track = this.audioContext.createMediaElementSource(this.audioElement);
            this.gainNode = this.audioContext.createGain();
            this.track.connect(this.gainNode);
            this.gainNode.connect(this.audioContext.destination);
            this.audioElement.addEventListener('ended', () => {
                // changes the src of the HTML element (audioElementID) and sets
                // this.audioElement.currentTime to 0
                playlistObj.playNextTrack();
            });
        };
        this.createAudioContext();
    }
}
In playlist.playNextTrack() I pause the audioElement, call audio.audioContext.close(), wait for it (it's a promise), call audio.createAudioContext() to recreate everything, and play. The logic throws an error at this.track = this.audioContext.createMediaElementSource(this.audioElement) with:
"Failed to execute 'createMediaElementSource' on 'BaseAudioContext': HTMLMediaElement already connected previously to a different MediaElementSourceNode, at Audio.createAudioContext"
In the example, the audio source is just a random oscillator and not an mp3 audio file.
I'm really stuck here and don't know what to do. I'm not even sure whether AudioContext() really holds on to data from all the previous audio files, causing this performance problem. And if so, how could I reconnect the HTMLMediaElement to the new node audio.createAudioContext() creates? I've already tried audio.track.disconnect(), but it doesn't work (as it shouldn't, because that only disconnects track from gainNode). And audioElement doesn't have a disconnect() method, as it's just an HTML element.
Any idea?
UPDATE:
I got past the problem of recreating the audio context by deleting and recreating the HTML element. But the problem persists: the more new audio files are played, the slower the app gets. More precisely: the more new AudioContext() objects are created, the slower it gets (even if I close the previous one).
"I'm really stuck here. Don't know what to do. I'm not even sure if AudioContext() really holds data from all the audio files before causing this performance problem."
No, it's really unlikely this is the case. The AudioContext sets up things like the sample rate, output destination, and the graph. That's all.
"The close() method of the AudioContext Interface closes the audio context, releasing any system audio resources that it uses."
You're misunderstanding what this means. Those "system audio resources" are the sound devices. While the AudioContext is running, there is an audio device requested. This is particularly meaningful in low power environments, like mobile. Another example would be Bluetooth. If the AudioContext is kept running, your Bluetooth headset may just stay on. If the AudioContext is allowed to close, then the Bluetooth headset may go to sleep.
"And if so, how could I reconnect the HTMLMediaElement to a new node audio.createAudioContext() creates?"
You don't. While it would be nice if the API supported this, it seems it doesn't. Simply create a new HTMLMediaElement.
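As a hedged sketch of that workaround (variable names follow the question's code; nextTrackSrc is a placeholder):
// Replace the exhausted <audio> element with a fresh one; the new element has
// never been bound to a MediaElementSourceNode, so createMediaElementSource works.
const oldElement = document.getElementById('audioID');
const freshElement = document.createElement('audio');
freshElement.id = 'audioID';
freshElement.src = nextTrackSrc; // assumption: the next file's path
oldElement.replaceWith(freshElement);

// Re-wire the graph on the existing context.
const track = audioContext.createMediaElementSource(freshElement);
track.connect(gainNode); // gainNode is already connected to the destination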
What you should do is properly profile your application to figure out where the slowdown is occurring. Use your developer tools. It might be faster, though, to just start commenting out sections of the code that runs. We certainly can't tell you where the problem is, specifically, from the code you've shown.
I have an Android app with a WebView. The page it loads is written in JavaScript and AngularJS code that accesses the camera. I just load the URL in the WebView and have granted the camera permission and the read/write external storage permissions, but it is still not able to access the camera. I have searched for this problem but didn't find an exact solution.
I just want the camera to open when the user clicks 'take photo' on his profile page, which is written in JavaScript. I don't know when this will happen, because I only have the base URL that is loaded in the WebView.
How do I let the user open the camera in this application? It should also be supported from KitKat onward. Can anyone help me figure this out?
Use the openFileChooser method when implementing a custom WebChromeClient:
private ValueCallback<Uri> mUploadMsg;

public void openFileChooser(ValueCallback<Uri> uploadMsg) {
    mUploadMsg = uploadMsg; // deliver the chosen Uri via onActivityResult()
    Intent i = new Intent(Intent.ACTION_GET_CONTENT);
    i.addCategory(Intent.CATEGORY_OPENABLE);
    i.setType("*/*");
    startActivityForResult(Intent.createChooser(i, "File Chooser"), 111);
}
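Note that openFileChooser is a hidden API that is no longer called on Android 5.0+; a rough equivalent using the public onShowFileChooser callback might look like this (mFilePathCallback is an assumed field, resolved in onActivityResult):
webView.setWebChromeClient(new WebChromeClient() {
    @Override
    public boolean onShowFileChooser(WebView view, ValueCallback<Uri[]> filePathCallback,
                                     FileChooserParams fileChooserParams) {
        mFilePathCallback = filePathCallback; // resolve it in onActivityResult()
        Intent chooser = Intent.createChooser(fileChooserParams.createIntent(), "File Chooser");
        // Offer the camera alongside the regular file picker.
        chooser.putExtra(Intent.EXTRA_INITIAL_INTENTS,
                new Intent[]{ new Intent(MediaStore.ACTION_IMAGE_CAPTURE) });
        startActivityForResult(chooser, 111);
        return true;
    }
});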
I am in the process of developing an HTML5 canvas interactive piece that uses CreateJS and the Web Audio API. I've managed to get audio working in Chrome/Firefox/Safari despite webkitAudioContext being deprecated in Chrome and Firefox but not in Safari. However, for some reason filters are not working in Safari, though sound still plays. Filters DO work in Chrome/Firefox.
I have my filters set up like this:
// context is created elsewhere, e.g.:
// var context = new (window.AudioContext || window.webkitAudioContext)();
var sound = new Audio();
sound.src = './sounds/sound.mp3';
sound.autoplay = false;
sound.loop = true;
var soundSource = context.createMediaElementSource(sound);
var soundFilter = context.createBiquadFilter();
soundFilter.type = "lowpass";
soundFilter.frequency.value = 500;
soundSource.connect(soundFilter);
soundFilter.connect(context.destination);
Am I unknowingly using a deprecated term or something? Live project can be found here. Cheers.
UPDATE: This has been recognised as a genuine bug by the WebKit team, and will be patched. Full details here
Apparently Safari doesn't implement createMediaElementSource correctly. So instead of redirecting the sound through your Web Audio nodes, it still just plays the sound directly to the audio device.
Is there any particular reason why you can't use an AudioBufferSourceNode? It makes you jump through the extra hoops of fetching and decoding the sound file, but it should work.
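For example, a sketch of that approach, reusing the filter chain from the question (the path and variable names follow the question's code):
// Fetch the file, decode it, and play it through the same filter chain.
var request = new XMLHttpRequest();
request.open('GET', './sounds/sound.mp3', true);
request.responseType = 'arraybuffer';
request.onload = function () {
  context.decodeAudioData(request.response, function (buffer) {
    var source = context.createBufferSource();
    source.buffer = buffer;
    source.loop = true;
    source.connect(soundFilter); // the lowpass filter from the question
    source.start(0);
  });
};
request.send();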
I'm making an audio-heavy webpage. I've read that there are some issues with audio playback on certain systems that are solved by calling the load() method before the play() method, so I'm designing everything around that premise.
I'm clueless about the audio element, and I'm worried that the load() method is raising my bandwidth consumption. This is what I'm doing:
var x = new Audio("x.mp3");
function playMe(){
x.load();
x.play();
}
It's my understanding that the audio file is downloaded when the x Audio object is created. My concern is whether the load() method downloads it again every time the play button is clicked.
Thanks for your time.
You can check for yourself whether the x.load() method re-downloads the file:
1. Open your browser's developer tools (e.g. in Chrome).
2. Check the Network tab for activity on each x.load() call.
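If you prefer a programmatic hint as well, the loadstart event fires each time the element restarts its load algorithm, so a quick diagnostic sketch like this can complement the Network tab (which remains the authoritative check for actual downloads, since a reload may be served from cache):
// Logs every time the element starts (re)loading its media resource.
x.addEventListener('loadstart', function () {
  console.log('loadstart fired: the element is (re)fetching x.mp3');
});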