I was trying to build a Jupyter notebook solution for outlier analysis of a video dataset. I wanted to use the Video widget for that purpose, but I couldn't find in the documentation how to get the current video frame and/or seek to a desired position by calling one of the widget's methods. My problem is very similar (virtually identical) to these unanswered questions one and two.
I managed to implement the idea by saving the video frames to a numpy array and employing matplotlib's imshow function, but the video playback is very jittery. I used the blitting technique to get some extra fps, but it didn't help much; in comparison, the Video widget produces a smoother experience. It looks like the Video widget is essentially a wrapper for the browser's built-in video player.
Question: How can I control the widget's playback programmatically so I can synchronize multiple widgets? Could some %%javascript magic help with an IPython.display interaction?
Below is a Python code snippet, just for illustration purposes, to give you an idea of what I want to achieve.
%matplotlib widget
from videoreader import VideoReader # a nice Pythonic wrapper for video reading with OpenCV
import numpy as np
import matplotlib.pyplot as plt
from ipywidgets import IntSlider, Play, link, HBox, VBox
# prepare buffered video frames
vr = VideoReader('Big.Buck.Bunny.mp4')
fps = vr.frame_rate
frames = []
for frame in vr[0:300:1]:
    frames.append(frame[:, :, ::-1]) # bgr2rgb
del vr
vi_buff = np.stack(frames, axis=0) # dimensions (T, H, W, C)
# simulate random signal for demonstration purposes
t = np.linspace(0.0, vi_buff.shape[0], num=vi_buff.shape[0]*10)
s = np.sin(2*np.pi*t)*np.random.randn(vi_buff.shape[0]*10)
plt.ioff()
fig = plt.figure(figsize=(11, 8))
ax1 = plt.subplot2grid((6, 6), (0, 0), rowspan=2, colspan=3)
ax2 = plt.subplot2grid((6, 6), (0, 3), colspan=3)
ax3 = plt.subplot2grid((6, 6), (1, 3), colspan=3)
plt.ion()
# initial plots
img = ax1.imshow(vi_buff[0,...])
l0 = ax2.plot(t, s)
l1 = ax3.plot(t, -s)
# initial scene
lo_y, hi_y = ax2.get_ybound()
ax2.set_xbound(lower=-12., upper=2.)
ax3.set_xbound(lower=-12., upper=2.)
def update_plot(change):
    val = change.new
    img.set_data(vi_buff[val, ...])
    ax2.axis([val - 12, val + 2, lo_y, hi_y])
    ax3.axis([val - 12, val + 2, lo_y, hi_y])
    fig.canvas.draw_idle()
player = Play(
    value=0, # initial frame index
    min=0,
    max=vi_buff.shape[0]-1,
    step=1,
    interval=int(1/round(fps)*1000) # refresh interval in ms
)
fr_slider = IntSlider(
    value=0,
    min=0,
    max=vi_buff.shape[0]-1
)
fr_slider.observe(update_plot, names='value')
link((player,"value"), (fr_slider,"value"))
VBox([HBox([player, fr_slider]), fig.canvas])
I had to learn it the hard way and write my own custom Video widget. Thanks to the authors of the ipywidgets module, it was not from scratch. I also found out that what I wanted to do could possibly be done more easily with the PyViz Panel Video pane. I am sharing my solution in case anyone is interested; it could save you the time of learning the front end and Backbone.js.
Caution: it won't run in JupyterLab; you need to switch to the classic Jupyter Notebook. Please share a solution in the comments if you know how to fix “Javascript Error: require is not defined”.
Widget Model on the Python side
from traitlets import Unicode, Float, Bool
from ipywidgets import Video, register

@register
class VideoE(Video):
    _view_name = Unicode('VideoEView').tag(sync=True)
    _view_module = Unicode('video_enhanced').tag(sync=True)
    _view_module_version = Unicode('0.1.1').tag(sync=True)
    playing = Bool(False, help="Sync point for play/pause operations").tag(sync=True)
    rate = Float(1.0, help="Sync point for changing playback rate").tag(sync=True)
    time = Float(0.0, help="Sync point for seek operation").tag(sync=True)

    @classmethod
    def from_file(cls, filename, **kwargs):
        return super(VideoE, cls).from_file(filename, **kwargs)
Widget View on the front-end
%%javascript
require.undef('video_enhanced');

define('video_enhanced', ["@jupyter-widgets/controls"], function(widgets) {

    var VideoEView = widgets.VideoView.extend({

        events: {
            // update Model when event is generated on View side
            'pause'      : 'onPause',
            'play'       : 'onPlay',
            'ratechange' : 'onRatechange',
            'seeked'     : 'onSeeked',
        },

        initialize: function() {
            // propagate changes from Model to View
            this.listenTo(this.model, 'change:playing', this.onPlayingChanged); // play/pause
            this.listenTo(this.model, 'change:rate', this.onRateChanged);       // playbackRate
            this.listenTo(this.model, 'change:time', this.onTimeChanged);       // currentTime
        },

        // View -> Model
        onPause: function() {
            this.model.set('playing', false, {silent: true});
            this.model.save_changes();
        },

        // View -> Model
        onPlay: function() {
            this.model.set('playing', true, {silent: true});
            this.model.save_changes();
        },

        // Model -> View
        onPlayingChanged: function() {
            if (this.model.get('playing')) {
                this.el.play();
            } else {
                this.el.pause();
            }
        },

        // View -> Model
        onRatechange: function() {
            this.model.set('rate', this.el.playbackRate, {silent: true});
            this.model.save_changes();
        },

        // Model -> View
        onRateChanged: function() {
            this.el.playbackRate = this.model.get('rate');
        },

        // View -> Model
        onSeeked: function() {
            this.model.set('time', this.el.currentTime, {silent: true});
            this.model.save_changes();
        },

        // Model -> View
        onTimeChanged: function() {
            this.el.currentTime = this.model.get('time');
        },
    });

    return {
        VideoEView : VideoEView,
    };
});
And the actual job I needed to get done
from ipywidgets import link, Checkbox, VBox, HBox
class SyncManager():
    '''
    Syncing the videos requires explicitly setting the "time" property or clicking
    on the progress bar, since the property is not updated continuously but only
    when a "seeked" event is generated.
    '''
    def __init__(self, video1, video2):
        self._links = []
        self._video1 = video1
        self._video2 = video2
        self.box = Checkbox(False, description='Synchronize videos')
        self.box.observe(self.handle_sync_changed, names=['value'])

    def handle_sync_changed(self, change):
        if change.new:
            for prop in ['playing', 'rate', 'time']:
                l = link((self._video1, prop), (self._video2, prop))
                self._links.append(l)
        else:
            for _link in self._links:
                _link.unlink()
            self._links.clear()
video1 = VideoE.from_file("images/Big.Buck.Bunny.mp4")
video1.autoplay = False
video1.loop = False
video1.width = 480
video2 = VideoE.from_file("images/Big.Buck.Bunny.mp4")
video2.autoplay = False
video2.loop = False
video2.width = 480
sync_m = SyncManager(video1, video2)
VBox([sync_m.box, HBox([video1, video2])])
I have a use case where we don't want the user who is taking a test, which lasts 3 hours, to hit a sleep timeout while in the test. I was looking at nosleep.js, but it has a CPU overuse problem and was not working on Windows.
I was wondering how test-taking apps do it, and also how YouTube achieves this.
Any help on this is really appreciated.
The Wake Lock API arrives in Chrome 79. However, none of the other browsers support it as of now.
https://developers.google.com/web/updates/2019/12/nic79#wake-lock
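For reference, here is a minimal, untested sketch of how the API can be used in browsers that expose navigator.wakeLock (the function name is just illustrative):
let wakeLock = null;

// Request a screen wake lock; the page must be visible when this runs.
async function requestWakeLock() {
    try {
        wakeLock = await navigator.wakeLock.request('screen');
        wakeLock.addEventListener('release', () => {
            console.log('Wake lock was released');
        });
    } catch (err) {
        // The request can fail, e.g. on low battery or in an unsupported browser.
        console.error(`Wake lock request failed: ${err.name}, ${err.message}`);
    }
}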
YouTube handles this by simply playing a video. When you play a video with sound in the browser, the browser requests a wake lock automatically so as not to interrupt the user.
It isn't clear from your question what your specific requirements are, but if playing a video is possible, try that.
After dealing with the pain of this problem for a whole afternoon, I'm posting my solution. I'm using modules, so delete the export statements if you're not. My goal was to minimize the effect on battery, so I play an empty video file (10 x 10 px, one frame long, with an empty audio track) every 10 seconds. I play it for the first time when the user first clicks, so the browser will allow me to play it. There are still side effects, though: e.g. in Safari the audio icon will blink, and on the iPhone lock screen the page will be shown as the last one playing video/audio.
let noSleep = null
let video = null
// --- Public ---
export function activate() {
  if (!noSleep) noSleep = new NoSleep()
}

export function deactivate() {
  if (noSleep) {
    noSleep.stop()
    noSleep = null
  }
}

// --- Private ---
class NoSleep {
  constructor() {
    this._noSleep = true
    this._keepAwake()
  }

  stop() {
    this._noSleep = false
  }

  async _keepAwake() {
    while (this._noSleep) {
      if (video) video.play()
      document.body.innerHTML += "<br>Played" // debug output, remove in production
      await wait(10000)
    }
  }
}

window.addEventListener("click", function enable() {
  window.removeEventListener("click", enable)

  // Initialize video element
  video = document.createElement("video")
  video.setAttribute("playsinline", "")

  // Add mp4 source
  let source = document.createElement("source")
  source.src = mp4Src
  source.type = "video/mp4"
  video.append(source)

  // Add webm source
  source = document.createElement("source")
  source.src = webmSrc
  source.type = "video/webm"
  video.append(source)

  // Play it as a result of user interaction
  video.play()
  document.body.innerHTML = "Activated" // debug output, remove in production
  activate()
})

function wait(milliseconds) {
  return new Promise(resolve => setTimeout(resolve, milliseconds))
}
const mp4Src = "data:video/mp4;base64,AAAAHGZ0eXBpc29tAAACAGlzb21pc28ybXA0MQAAAyBtb292AAAAbG12aGQAAAAAAAAAAAAAAAAAAAPoAAAAGwABAAABAAAAAAAAAAAAAAAAAQAAAAAAAAAAAAAAAAAAAAEAAAAAAAAAAAAAAAAAAEAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAACAAACSnRyYWsAAABcdGtoZAAAAAMAAAAAAAAAAAAAAAEAAAAAAAAAGwAAAAAAAAAAAAAAAQEAAAAAAQAAAAAAAAAAAAAAAAAAAAEAAAAAAAAAAAAAAAAAAEAAAAAAAAAAAAAAAAAAACRlZHRzAAAAHGVsc3QAAAAAAAAAAQAAABoAAAgAAAEAAAAAAcJtZGlhAAAAIG1kaGQAAAAAAAAAAAAAAAAAAKxEAAAEgFXEAAAAAAAxaGRscgAAAAAAAAAAc291bgAAAAAAAAAAAAAAAENvcmUgTWVkaWEgQXVkaW8AAAABaW1pbmYAAAAQc21oZAAAAAAAAAAAAAAAJGRpbmYAAAAcZHJlZgAAAAAAAAABAAAADHVybCAAAAABAAABLXN0YmwAAAB7c3RzZAAAAAAAAAABAAAAa21wNGEAAAAAAAAAAQAAAAAAAAAAAAIAEAAAAACsRAAAAAAAM2VzZHMAAAAAA4CAgCIAAQAEgICAFEAVAAAAAAJ/9wACf/cFgICAAhIQBoCAgAECAAAAFGJ0cnQAAAAAAAJ/9wACf/cAAAAgc3R0cwAAAAAAAAACAAAAAwAABAAAAAABAAAAgAAAABxzdHNjAAAAAAAAAAEAAAABAAAABAAAAAEAAAAkc3RzegAAAAAAAAAAAAAABAAAAXMAAAF0AAABcwAAAXQAAAAUc3RjbwAAAAAAAAABAAADTAAAABpzZ3BkAQAAAHJvbGwAAAACAAAAAf//AAAAHHNiZ3AAAAAAcm9sbAAAAAEAAAAEAAAAAQAAAGJ1ZHRhAAAAWm1ldGEAAAAAAAAAIWhkbHIAAAAAAAAAAG1kaXJhcHBsAAAAAAAAAAAAAAAALWlsc3QAAAAlqXRvbwAAAB1kYXRhAAAAAQAAAABMYXZmNTguNzYuMTAwAAAACGZyZWUAAAXWbWRhdCERRQAUUAFG//EKWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaXemCFLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd6aIUtLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS8IRFFABRQAUb/8QpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpd6YIUtLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLwhEUUAFFABRv/xClpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWlpaWl3pohS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLw="
const webmSrc = "data:video/webm;base64,GkXfo59ChoEBQveBAULygQRC84EIQoKEd2VibUKHgQRChYECGFOAZwEAAAAAAANXEU2bdLpNu4tTq4QVSalmU6yBoU27i1OrhBZUrmtTrIHYTbuMU6uEElTDZ1OsggE/TbuMU6uEHFO7a1OsggNB7AEAAAAAAABZAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAVSalmsirXsYMPQkBNgI1MYXZmNTguNzYuMTAwV0GNTGF2ZjU4Ljc2LjEwMESJiEBBAAAAAAAAFlSua+KuAQAAAAAAAFnXgQFzxYjHh4Jmxpm3C5yBACK1nIN1bmSGhkFfT1BVU1aqg2MuoFa7hATEtACDgQLhkZ+BArWIQOdwAAAAAABiZIEYY6KTT3B1c0hlYWQBAjgBgLsAAAAAABJUw2dB03NzAQAAAAAAAQ5jwIBnyAEAAAAAAAAVRaOLTUFKT1JfQlJBTkREh4RxdCAgZ8gBAAAAAAAAFEWjjU1JTk9SX1ZFUlNJT05Eh4EwZ8gBAAAAAAAAG0WjkUNPTVBBVElCTEVfQlJBTkRTRIeEcXQgIGfIAQAAAAAAABlFo4hUSU1FQ09ERUSHizAwOjAwOjAwOjAwZ8gBAAAAAAAAKkWjn0NPTS5BUFBMRS5RVUlDS1RJTUUuRElTUExBWU5BTUVEh4VlbXB0eWfIAQAAAAAAACRFo5lDT00uQVBQTEUuUVVJQ0tUSU1FLlRJVExFRIeFZW1wdHlnyAEAAAAAAAAaRaOHRU5DT0RFUkSHjUxhdmY1OC43Ni4xMDBzcwEAAAAAAACxY8CLY8WIx4eCZsaZtwtnyAEAAAAAAAAiRaOMSEFORExFUl9OQU1FRIeQQ29yZSBNZWRpYSBBdWRpb2fIAQAAAAAAABtFo4lWRU5ET1JfSUREh4xbMF1bMF1bMF1bMF1nyAEAAAAAAAAjRaOHRU5DT0RFUkSHlkxhdmM1OC4xMzQuMTAwIGxpYm9wdXNnyKJFo4hEVVJBVElPTkSHlDAwOjAwOjAwLjAzNDAwMDAwMAAAH0O2daTngQCjh4EAAID8//6gAQAAAAAAAA+hh4EAFQD8//51ooNwiJgcU7trkbuPs4EAt4r3gQHxggMY8IED"
The implementation is made so that multiple calls to activate()/deactivate() have no extra effect (which was beneficial in my use case). I create a NoSleep object to make sure that it will actually deactivate (the operation may take up to 10 seconds). Without it, quick successive activations and deactivations could result in multiple wake cycles running at the same time without ever quitting.
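For illustration, a short usage sketch, assuming the code above is saved as a module (the file name nosleep.js is just an example):
import { activate, deactivate } from "./nosleep.js" // hypothetical path

// Somewhere after the user has interacted with the page:
activate()   // starts the keep-awake loop
activate()   // a repeated call has no extra effect

// When the 3-hour session ends:
deactivate() // the loop exits within one 10-second cycle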
Play a looped VIDEO or AUDIO on your page
You can use this quick example to add a fake looped video to your page and prevent a mobile device from sleeping:
// Create the root video element
var video = document.createElement('video');
video.setAttribute('loop', '');
// Add some styles if needed
video.setAttribute('style', 'position: fixed;');
// A helper to add sources to video
function addSourceToVideo(element, type, dataURI) {
    var source = document.createElement('source');
    source.src = dataURI;
    source.type = 'video/' + type;
    element.appendChild(source);
}

// A helper to concat base64
var base64 = function(mimeType, base64) {
    return 'data:' + mimeType + ';base64,' + base64;
};

// Add fake sources
addSourceToVideo(video,'webm', base64('video/webm', 'GkXfo0AgQoaBAUL3gQFC8oEEQvOBCEKCQAR3ZWJtQoeBAkKFgQIYU4BnQI0VSalmQCgq17FAAw9CQE2AQAZ3aGFtbXlXQUAGd2hhbW15RIlACECPQAAAAAAAFlSua0AxrkAu14EBY8WBAZyBACK1nEADdW5khkAFVl9WUDglhohAA1ZQOIOBAeBABrCBCLqBCB9DtnVAIueBAKNAHIEAAIAwAQCdASoIAAgAAUAmJaQAA3AA/vz0AAA='));
addSourceToVideo(video, 'mp4', base64('video/mp4', 'AAAAHGZ0eXBpc29tAAACAGlzb21pc28ybXA0MQAAAAhmcmVlAAAAG21kYXQAAAGzABAHAAABthADAowdbb9/AAAC6W1vb3YAAABsbXZoZAAAAAB8JbCAfCWwgAAAA+gAAAAAAAEAAAEAAAAAAAAAAAAAAAABAAAAAAAAAAAAAAAAAAAAAQAAAAAAAAAAAAAAAAAAQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAIAAAIVdHJhawAAAFx0a2hkAAAAD3wlsIB8JbCAAAAAAQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAABAAAAAAAAAAAAAAAAAAAAAQAAAAAAAAAAAAAAAAAAQAAAAAAIAAAACAAAAAABsW1kaWEAAAAgbWRoZAAAAAB8JbCAfCWwgAAAA+gAAAAAVcQAAAAAAC1oZGxyAAAAAAAAAAB2aWRlAAAAAAAAAAAAAAAAVmlkZW9IYW5kbGVyAAAAAVxtaW5mAAAAFHZtaGQAAAABAAAAAAAAAAAAAAAkZGluZgAAABxkcmVmAAAAAAAAAAEAAAAMdXJsIAAAAAEAAAEcc3RibAAAALhzdHNkAAAAAAAAAAEAAACobXA0dgAAAAAAAAABAAAAAAAAAAAAAAAAAAAAAAAIAAgASAAAAEgAAAAAAAAAAQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAABj//wAAAFJlc2RzAAAAAANEAAEABDwgEQAAAAADDUAAAAAABS0AAAGwAQAAAbWJEwAAAQAAAAEgAMSNiB9FAEQBFGMAAAGyTGF2YzUyLjg3LjQGAQIAAAAYc3R0cwAAAAAAAAABAAAAAQAAAAAAAAAcc3RzYwAAAAAAAAABAAAAAQAAAAEAAAABAAAAFHN0c3oAAAAAAAAAEwAAAAEAAAAUc3RjbwAAAAAAAAABAAAALAAAAGB1ZHRhAAAAWG1ldGEAAAAAAAAAIWhkbHIAAAAAAAAAAG1kaXJhcHBsAAAAAAAAAAAAAAAAK2lsc3QAAAAjqXRvbwAAABtkYXRhAAAAAQAAAABMYXZmNTIuNzguMw=='));
// Append the video to where ever you need
document.body.appendChild(video);
// Start playing video after any user interaction.
// NOTE: Running video.play() handler without a user action may be blocked by browser.
var playFn = function() {
    video.play();
    document.body.removeEventListener('touchend', playFn);
};
document.body.addEventListener('touchend', playFn);
I'm making an iOS app to wrap my JavaScript game. On mobile Safari it works fine, because after I play a sound in ontouchstart I'm allowed to play any audio whenever I want, and I can set their volume too.
Problem 1: In a WKWebView I can only play the specific sounds that I played in ontouchstart, not the rest. So I'm playing every single audio clip in my game on the first tap, which sounds really bad. Otherwise, on audio.play() in the JavaScript I get
Unhandled Promise Rejection: NotAllowedError: The request is not allowed by the user agent or the platform in the current context, possibly because the user denied permission.
Problem 2: I also can't lower the volume of sounds in a WKWebView. If I set myAudio.volume = .5 it's instantly 1 again, which means the user has to actually hear them in order for them to get to readyState=4.
Any good solution or hack? Right now I'm playing every single sound on the first tap, and everything is at full volume.
<html>
<body style='background:#DDD;height:333px'>
<div id=debug style='font-size:20px' onclick="playSound(myAudio)">
Debug<br />
Tap here to play the sound.<br />
</div>
<script>
var context = new webkitAudioContext()
var myAudio = new Audio("sound/upgrade.wav")
myAudio.addEventListener('canplaythrough', function(){log('canplaythrough1')}, false)
var myAudio2 = new Audio("sound/dragon.wav")
myAudio2.addEventListener('canplaythrough', function(){log('canplaythrough2')}, false)
function playSound(sound)
{
    log("playSound() readyState=" + sound.readyState)
    try {
        sound.play()
        context.resume().then(() => {
            log("Context resumed")
            sound.play()
        })
    }
    catch(e)
    {
        log(e)
    }
}
function log(m)
{
    console.log(m)
    debug.innerHTML += m + "<br />"
}
window.ontouchstart = function()
{
    log("ontouchstart()")
    playSound(myAudio)
    setTimeout(function() {
        playSound(myAudio2)
    }, 1111)
}
</script>
</body>
</html>
And ViewController.swift
import UIKit
import WebKit
class ViewController: UIViewController {

    private var myWebView3: WKWebView!

    override func viewDidLoad() {
        super.viewDidLoad()
        self.myWebView3 = WKWebView(frame: .zero)
        self.myWebView3.configuration.preferences.setValue(true, forKey: "allowFileAccessFromFileURLs")
        self.myWebView3.configuration.mediaTypesRequiringUserActionForPlayback = []
        self.myWebView3.configuration.allowsInlineMediaPlayback = true
        let url = Bundle.main.url(forResource: "index6", withExtension: "html", subdirectory: "/")!
        self.myWebView3.loadFileURL(url, allowingReadAccessTo: url)
        let request = URLRequest(url: url)
        self.myWebView3.load(request)
        self.view.addSubview(self.myWebView3)
    }

    override func viewDidLayoutSubviews() {
        self.myWebView3.frame = self.view.bounds
    }
}
I'm using Xcode 10.1 and an iPhone SE running iOS 12.1.1.
I have also tried on an iPad with iOS 10 and get the same error.
I have also tried context.decodeAudioData() / context.createBufferSource() and get the same error.
Re: problem 1, you need to create and fill in the configuration object before you create the web view; changes made to a web view's configuration after the web view has been created are ignored. The line
self.myWebView3.configuration.mediaTypesRequiringUserActionForPlayback = []
will fail silently. Just create a new WKWebViewConfiguration object, set its properties, and then create the web view with it:
webView = WKWebView(frame: .zero, configuration: webConfiguration)
In the WKWebView's navigation delegate, after the web view has finished loading, execute JavaScript that adds event listeners for the play, pause, and ended events to the audio and video tags. Then check the playback state for your purposes.
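For illustration, a rough sketch of the JavaScript that could be injected from the delegate (for example via evaluateJavaScript). It assumes a script message handler named "playbackState" has been registered on the WKUserContentController; all names here are only examples:
document.querySelectorAll('audio, video').forEach(function (el) {
    ['play', 'pause', 'ended'].forEach(function (eventName) {
        el.addEventListener(eventName, function () {
            // Forward the playback state to the native side.
            window.webkit.messageHandlers.playbackState.postMessage({
                event: eventName,
                currentTime: el.currentTime,
                paused: el.paused
            });
        });
    });
});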
I'm trying to open a PDF file as a smart object layer in Photoshop (CS5), cropped to the trim box.
It works when using openDialog() and open().
But when the property asSmartObject is true, PDFOpenOptions.cropPage = CropToType.TRIMBOX is ignored.
There must be options for opening as a smart object, similar to PDFOpenOptions, but I can't find them. Did you?
My page should be cropped to the trim box with a minimum height/width of 400 px. Opening the PDF as a rendered image won't work, because PDFOpenOptions.height is deprecated in CS5 (and newer).
I don't want to render with a higher resolution; the PDF could be a small business card or a huge poster, and opening a file should be fast. This is why I chose "open as smart object".
var openPDF = File.openDialog (undefined, undefined, false);
var openPDFoptions = new PDFOpenOptions;
openPDFoptions.antiAlias = true;
openPDFoptions.bitsPerChannel = BitsPerChannelType.EIGHT;
openPDFoptions.cropPage = CropToType.TRIMBOX;
openPDFoptions.mode = OpenDocumentMode.RGB;
openPDFoptions.name = "unnamed";
openPDFoptions.page = 1;
openPDFoptions.resolution = 72;
openPDFoptions.suppressWarnings = true;
openPDFoptions.usePageNumber = true;
app.open (openPDF, openPDFoptions, false) // third argument: asSmartObject
I have this code:
/**
 * Save the web view as a screenshot. Currently only supports saving to
 * the photo library.
 */
- (void)saveScreenshot:(NSArray*)arguments withDict:(NSDictionary*)options
{
    CGRect screenRect = [[UIScreen mainScreen] bounds];
    CGRect imageRect = CGRectMake(0, 0, CGRectGetWidth(screenRect),
                                  CGRectGetHeight(screenRect));
    UIGraphicsBeginImageContext(imageRect.size);
    [webView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(viewImage, self, nil, nil);
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:nil
                                                    message:@"Image Saved"
                                                   delegate:self
                                          cancelButtonTitle:@"OK"
                                          otherButtonTitles:nil];
    [alert show];
    [alert release];
}
This is for saving whatever you drew in my app. How would I add a button for this in the HTML code? How do I call it from there?
So you are trying to send an event from HTML to a UIWebView?
The easiest way is to use a private URL scheme. In the HTML:
<a href="myscheme:save_picture">Take Picture</a>
In the UIWebView delegate:
- (BOOL)webView:(UIWebView *)inWeb shouldStartLoadWithRequest:(NSURLRequest *)inRequest navigationType:(UIWebViewNavigationType)inType {
    NSString *string = [[[inRequest URL] absoluteString] lowercaseString];
    if ( [string isEqualToString:@"myscheme:save_picture"] ) {
        [self savePicture];
        return NO;
    }
    return YES;
}
You can look at PhoneGap for a more robust usage of this mechanism.