Video and audio not working on Safari - NodeJS - Angular 4 - JavaScript

I have an application where I present images, videos, and audio.
The images work perfectly, but I have problems with the audio and video.
Node API:
let config = new Config(),
    client = config.Storage;

export default class ImageCtrl {
  product = (req, res, next) => {
    // Split the requested filename into name and extension.
    var params = req.params,
        point = params.filename.lastIndexOf('.'),
        nameFile = params.filename.slice(0, point),
        ext = params.filename.slice(point);
    res.writeHead(200, {
      'Cache-Control': 'no-cache'
    });
    // Non-audio files are served with a '-thumb' suffix; audio files as-is.
    let remote = `${nameFile}-thumb${ext}`;
    if (nameFile.includes('audio')) {
      remote = `${nameFile}${ext}`;
    }
    // Stream the object from cloud storage straight to the response.
    client.download({
      container: 'Multimedia',
      remote: remote
    }, function (err, result) {
      // handle the download result
    }).pipe(res);
  }
}
config.ts
export default class Config {
Storage: any = pkgcloud.storage.createClient(config);
}
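config.ts references pkgcloud and a config object that aren't shown in the post; presumably the top of the file looks something like this (the provider and credentials below are placeholders, not from the original post):

import pkgcloud from 'pkgcloud';

// Placeholder credentials; the real values depend on the storage provider.
const config = {
  provider: 'openstack', // pkgcloud also supports e.g. 'amazon', 'azure', 'rackspace'
  username: 'storage-user',
  password: 'secret',
  authUrl: 'https://identity.example.com'
};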
Angular4-HTML5
<video width="100%" *ngIf="product.mediacategory === 'video'" controls preload="none" [poster]="'/api/image/products/' + product._id + '-video.png'" controlsList="nodownload">
<source [src]="/api/image/products/5a69b1b32e4be51cb82a7659-video.webm" type="video/webm">
<source [src]="/api/image/products/5a69b1b32e4be51cb82a7659-video.mp4" type="video/mp4">
<source [src]="/api/image/products/5a69b1b32e4be51cb82a7659-video.ogv'" type="video/ogv">
</video>
Audio and images use the same API, but in Safari only the images work. In Chrome and Firefox, audio, video, and images all work fine.
In Safari the player renders but nothing plays (screenshot omitted); in Chrome everything displays correctly.
Note that if I remove the brackets from [src], it doesn't work in Chrome either.

Safari on iOS supports low-complexity AAC audio, MP3 audio, AIF audio, WAVE audio, and baseline profile MPEG-4 video. Safari on the desktop (Mac OS X and Windows) supports all media supported by the installed version of QuickTime, including any installed third-party codecs.
click here for reference
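Beyond codec support, Safari (desktop and iOS) also requires the server to honor HTTP Range requests and answer with 206 Partial Content before it will stream media; the controller above always answers 200. A minimal range-aware sketch in Express (the route, file layout, and Content-Type are hypothetical, not part of the original API):

// Hypothetical Express route serving a local media file with Range support.
const express = require('express');
const fs = require('fs');
const app = express();

app.get('/media/:file', (req, res) => {
  const path = `./media/${req.params.file}`; // hypothetical location
  const { size } = fs.statSync(path);
  const range = req.headers.range;
  if (!range) {
    // No Range header: send the whole file.
    res.writeHead(200, { 'Content-Length': size, 'Content-Type': 'video/mp4' });
    return fs.createReadStream(path).pipe(res);
  }
  // Parse "bytes=start-end" and reply with 206 Partial Content.
  const [startStr, endStr] = range.replace(/bytes=/, '').split('-');
  const start = parseInt(startStr, 10);
  const end = endStr ? parseInt(endStr, 10) : size - 1;
  res.writeHead(206, {
    'Content-Range': `bytes ${start}-${end}/${size}`,
    'Accept-Ranges': 'bytes',
    'Content-Length': end - start + 1,
    'Content-Type': 'video/mp4'
  });
  fs.createReadStream(path, { start, end }).pipe(res);
});

app.listen(3000);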

Related

Record video in lower resolution than original using MediaRecorder

I'm creating a video recorder script using JavaScript and the MediaRecorder API, with a video capture device as the source. The video output is 1920x1080, but I'm trying to shrink that resolution to 640x360 (360p).
I'll include all the code below. I tried many configurations and variants of HTML and JS, and according to this site my video source should support the size I'm trying to force.
The video source is an Elgato Cam Link 4K.
UPDATE
Instead of using exact in the video constraints, use ideal; the browser will then check whether the resolution is available on the device and fall back to the closest one (see the sketch below).
Apparently the Elgato Cam Link doesn't support 360p. I tested with an external webcam that does support 360p, and with ideal it works.
In the Windows camera settings you can see that no other resolutions are available on the Cam Link, only HD and FHD.
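For reference, the adjusted constraints with ideal instead of exact would look like this (a minimal sketch; audioDeviceId and videoDeviceId come from the device-selection step shown further down):

// Same constraints, but `ideal` lets the browser fall back to the
// closest supported resolution instead of throwing OverconstrainedError.
const constraints = {
  audio: { deviceId: audioDeviceId },
  video: {
    deviceId: videoDeviceId,
    width: { ideal: 640 },
    height: { ideal: 360 },
    frameRate: { ideal: 30 }
  }
};
const stream = await navigator.mediaDevices.getUserMedia(constraints);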
The HTML tag:
<video id="videoEl" width="640" height="360" autoplay canplay resize></video>
This is the getUserMedia() script:
const video = document.getElementById('videoEl');
const constraints = {
  audio: { deviceId: audioDeviceId },
  video: {
    deviceId: videoDeviceId,
    width: { exact: 640 },
    height: { exact: 360 },
    frameRate: 30
  }
};
this.CameraStream = await navigator.mediaDevices.getUserMedia(constraints);
video.srcObject = this.CameraStream;
Before that, I choose the video source using navigator.mediaDevices.enumerateDevices().
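That selection step could look something like this (a sketch, assuming the first reported devices are the ones wanted):

// Inside an async function: pick the first audio and video inputs.
const devices = await navigator.mediaDevices.enumerateDevices();
const audioDeviceId = devices.find(d => d.kind === 'audioinput')?.deviceId;
const videoDeviceId = devices.find(d => d.kind === 'videoinput')?.deviceId;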
Then I tried some options for the MediaRecorder constructor:
this.MediaRecorder = new MediaRecorder(this.CameraStream)
this.MediaRecorder = new MediaRecorder(this.CameraStream, { mimeType: 'video/webm' })
I found this mimeType in this forum:
this.MediaRecorder = new MediaRecorder(this.CameraStream, { mimeType: 'video/x-matroska;codecs=h264' })
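Rather than hard-coding the mimeType, supported combinations can be probed with MediaRecorder.isTypeSupported() (a sketch; the candidate list is illustrative):

// Probe which container/codec combinations this browser can record to,
// then construct the recorder with the first supported one.
const candidates = [
  'video/webm;codecs=vp9',
  'video/webm;codecs=vp8',
  'video/x-matroska;codecs=h264',
  'video/mp4'
];
const mimeType = candidates.find(t => MediaRecorder.isTypeSupported(t));
const recorder = new MediaRecorder(this.CameraStream, mimeType ? { mimeType } : {});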
And the event listener
this.MediaRecorder.addEventListener('dataavailable', event => {
this.BlobsRecorded.push(event.data);
});
MediaRecorder on stop
As I mentioned before, I tried several option variants:
const options = { type: 'video/x-matroska;codecs=h264' };
const options = { type: 'video/webm' };
const options = { type: 'video/mp4' }; // not supported
const finalVideo = URL.createObjectURL(
  new Blob(this.BlobsRecorded, options)
);
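To inspect the result, the object URL can then be offered as a download (illustrative; the file extension should match the chosen mimeType):

// Trigger a download of the finished recording.
const a = document.createElement('a');
a.href = finalVideo;
a.download = 'recording.webm';
a.click();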
Note
Everything else is working; I'm leaving the code here so you can see the constraints used, for illustrative purposes. If something is missing, let me know and I'll add it here.
Thank you for your time.

Flutter Web Plugin for HTML and JS

I couldn't find an HTML/JS plugin for Flutter Web. My goal is to open the camera and capture a picture, but it isn't working on Flutter Web. I wrote HTML and JS camera code, but it doesn't work in a Flutter Web app. I tried the flutter_html plugin, but it renders just a few meaningless lines in the Chrome web emulator. Please help me or suggest other plugins.
import "package:flutter/material.dart";
import 'package:flutter_html/flutter_html.dart';
class kamera extends StatefulWidget {
  @override
  _kameraState createState() => _kameraState();
}

class _kameraState extends State<kamera> {
  @override
  Widget build(BuildContext context) {
    return Html(data:
"""
<video id="video" width="640" height="480" autoplay></video><br>
<button id="snap">Snap Photo</button><br>
<canvas id="canvas" width="640" height="480"></canvas><br>
<script>
var video = document.getElementById('video');
// Get access to the camera!
if(navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
// Not adding `{ audio: true }` since we only want video now
navigator.mediaDevices.getUserMedia({ video: true }).then(function(stream) {
//video.src = window.URL.createObjectURL(stream);
video.srcObject = stream;
video.play();
});
}
var canvas = document.getElementById('canvas');
var context = canvas.getContext('2d');
var video = document.getElementById('video');
// Trigger photo take
document.getElementById("snap").addEventListener("click", function() {
context.drawImage(video, 0, 0, 640, 480);
});
</script>
""");
}
}

Adding Quality Selector to plyr when using HLS Stream

I am using plyr as a wrapper around the HTML5 video tag and hls.js to stream my .m3u8 video.
I went through a lot of plyr issues about enabling a quality selector and came across multiple closed PRs saying the implementation had been merged, until I found this PR which says it's still open. A custom implementation in the comments reportedly works. I tried that implementation locally to add a quality selector, but it seems I'm missing something, or the implementation doesn't work.
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>HLS Demo</title>
<link rel="stylesheet" href="https://cdn.plyr.io/3.5.10/plyr.css" />
<style>
body {
max-width: 1024px;
}
</style>
</head>
<body>
<video preload="none" id="player" autoplay controls crossorigin></video>
<script src="https://cdn.plyr.io/3.5.10/plyr.js"></script>
<script src="https://cdn.jsdelivr.net/hls.js/latest/hls.js"></script>
<script>
(function () {
  var video = document.querySelector('#player');
  var playerOptions = {
    quality: {
      default: '720',
      options: ['720']
    }
  };
  var player;
  player = new Plyr(video, playerOptions);
  if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource('https://content.jwplatform.com/manifests/vM7nH0Kl.m3u8');
    //hls.loadSource('https://test-streams.mux.dev/x36xhzz/x36xhzz.m3u8');
    hls.attachMedia(video);
    hls.on(Hls.Events.MANIFEST_PARSED, function (event, data) {
      // uncomment to see data here
      // console.log('levels', hls.levels); we get data here but not able to see in settings
      playerOptions.quality = {
        default: hls.levels[hls.levels.length - 1].height,
        options: hls.levels.map((level) => level.height),
        forced: true,
        // Manage quality changes
        onChange: (quality) => {
          console.log('changes', quality);
          hls.levels.forEach((level, levelIndex) => {
            if (level.height === quality) {
              hls.currentLevel = levelIndex;
            }
          });
        }
      };
    });
  }
  // Start HLS load on play event
  player.on('play', () => hls.startLoad());
  // Handle HLS quality changes
  player.on('qualitychange', () => {
    console.log('changed');
    if (player.currentTime !== 0) {
      hls.startLoad();
    }
  });
})();
</script>
</body>
</html>
The above snippet works (please run it), but if you uncomment the console.log line in the MANIFEST_PARSED handler you will see that we do get data in levels, and we pass it to the player options, yet it doesn't show up in the settings menu. How can we add a quality selector to plyr when using an HLS stream?
I made a lengthy comment about this on github [1].
Working example: https://codepen.io/datlife/pen/dyGoEXo
The main idea of the fix is:
Configure the Plyr options properly to allow the switching to happen.
Let HLS perform the quality switching, not Plyr. Hence we only need a single source tag in the video tag:
<video>
  <!-- a single source containing all the streams -->
  <source
    type="application/x-mpegURL"
    src="https://bitdash-a.akamaihd.net/content/sintel/hls/playlist.m3u8">
</video>
[1] https://github.com/sampotts/plyr/issues/1741#issuecomment-640293554
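Condensed from the linked comment: build the quality options from hls.levels and only construct Plyr once the manifest is parsed, so the settings menu is populated from the real levels (a sketch under those assumptions):

const video = document.querySelector('video');
const hls = new Hls();
hls.loadSource('https://bitdash-a.akamaihd.net/content/sintel/hls/playlist.m3u8');
hls.attachMedia(video);

hls.on(Hls.Events.MANIFEST_PARSED, () => {
  const heights = hls.levels.map(l => l.height);
  // Create Plyr only after the levels are known, so the quality
  // menu reflects the actual manifest.
  const player = new Plyr(video, {
    quality: {
      default: heights[heights.length - 1],
      options: heights,
      forced: true,
      onChange: (h) => {
        // Tell hls.js to switch rendition; Plyr only renders the menu.
        hls.currentLevel = hls.levels.findIndex(l => l.height === h);
      }
    }
  });
});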

Javascript MediaSource API and H264 video

I have a problem playing an H264 video using the JavaScript MediaSource Extensions API.
I'll describe the scenario in detail below.
I've already successfully played audio and video sources with the VP8, VP9, Opus, and Vorbis codecs, both from range requests (if the server has the capability, using any byte range) and from chunked files, with the chunks produced by Shaka Packager.
The problem comes when the source is an H264 video. In my case the codecs are avc1.64001e and mp4a.40.2; the full codec string is
video/mp4;codecs="avc1.64001e, mp4a.40.2", but the issue happens with any other avc1 codec as well.
What I'm trying to do is play a 10-megabyte chunk of the full video, generated by a byte-range curl request and saved locally using -o.
Below is the stream info from Shaka Packager with this file as input:
[0530/161459:INFO:demuxer.cc(88)] Demuxer::Run() on file '10mega.mp4'.
[0530/161459:INFO:demuxer.cc(160)] Initialize Demuxer for file '10mega.mp4'.
File "10mega.mp4":
Found 2 stream(s).
Stream [0] type: Video
codec_string: avc1.64001e
time_scale: 17595
duration: 57805440 (3285.3 seconds)
is_encrypted: false
codec: H264
width: 720
height: 384
pixel_aspect_ratio: 1:1
trick_play_factor: 0
nalu_length_size: 4
Stream [1] type: Audio
codec_string: mp4a.40.2
time_scale: 44100
duration: 144883809 (3285.3 seconds)
is_encrypted: false
codec: AAC
sample_bits: 16
num_channels: 2
sampling_frequency: 44100
language: und
Packaging completed successfully.
The chunk is playable in external media player applications (like VLC)
and, more importantly, it plays without problems when added to the webpage using the <source> tag.
This is the error I can see in the Chrome console
Uncaught (in promise) DOMException: Failed to load because no supported source was found.
Below are the HTML and JS code if you want to reproduce it (I did all local tests using the built-in PHP 7.2 dev server):
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8"/>
<title>VideoTest</title>
<link rel="icon" href="/favicon.ico" />
<script type="text/javascript" src="/script.js"></script>
<style>
video {
width: 98%;
height: 300px;
border: 0px solid #000;
display: flex;
}
</style>
</head>
<body>
<div id="videoContainer">
<video controls></video>
</div>
<video controls>
<source src="/media/10mega.mp4" type="video/mp4">
</video>
</body>
</html>
And below is the JS code (script.js):
class MediaTest {
  constructor() {
  }

  init(link) {
    this.link = link;
    this.media = new MediaSource();
    this.container = document.getElementsByTagName('video')[0];
    this.container.src = window.URL.createObjectURL(this.media);
    return new Promise(resolve => {
      this.media.addEventListener('sourceopen', (e) => {
        this.media = e.target;
        return resolve(this);
      });
    });
  }

  addSourceBuffer() {
    let codec = 'video/mp4;codecs="avc1.64001e, mp4a.40.2"';
    let sourceBuffer = this.media.addSourceBuffer(codec);
    // These are the same headers sent by the <source> tag;
    // with or without them the issue remains
    let headers = new Headers({
      'Range': 'bytes=0-131072',
      'Accept-Encoding': 'identity;q=1, *;q=0'
    });
    let requestData = {
      headers: headers
    };
    let request = new Request(this.link, requestData);
    return new Promise(resolve => {
      fetch(request).then((response) => {
        // A server that honors the Range header replies 206, not 200,
        // so check response.ok rather than a single status code.
        if (!response.ok) {
          throw new Error('addSourceBuffer error with status ' + response.status);
        }
        return response.arrayBuffer();
      }).then((buffer) => {
        sourceBuffer.appendBuffer(buffer);
        console.log('Buffer appended');
        return resolve(this);
      }).catch(function(e) {
        console.log('addSourceBuffer error');
        console.log(e);
      });
    });
  }

  play() {
    this.container.play();
  }
}

window.addEventListener('load', () => {
  let media = new MediaTest();
  media.init('/media/10mega.mp4').then(() => {
    console.log('init ok');
    return media.addSourceBuffer();
  }).then((obj) => {
    console.log('play');
    media.play();
  });
});
What I want to achieve is to play the file with the MediaSource API, since it plays well using the <source> tag.
I don't want to demux and re-encode it, but to use it as-is.
Here below is the error dump taken from chrome://media-internals:
render_id: 180 player_id: 11 pipeline_state: kStopped event: WEBMEDIAPLAYER_DESTROYED
To reproduce, I think any H264 video that has both an audio and a video track should work.
This question is closely related to this older question I found: H264 video works using src attribute. Same video fails using the MediaSource API (Chromium). But that one is from 4 years ago, so I decided not to answer there.
Does anybody have an idea about this issue?
Is there a way to solve it, or is H264 simply not compatible with MSE?
Thanks in advance
It's not the codec, it's the container. MSE requires fragmented MP4 files; standard MP4 is not supported. For a standard MP4 you must use <video src="my.mp4">.
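For example, the existing file can be repackaged into fragmented MP4 without re-encoding; one common way is ffmpeg (file names are illustrative):

ffmpeg -i 10mega.mp4 -c copy \
  -movflags frag_keyframe+empty_moov+default_base_moof \
  10mega-fragmented.mp4

The -c copy flag keeps both streams as-is, so only the container changes; the resulting file should then append cleanly to the SourceBuffer.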

Use/Access Android and iOS (Mobile) Camera

I'm making an HTML5-CSS-JS Android (and iOS) application and I'm trying to get access to the mobile device camera.
I tried two ways to access the camera, and both work fine from a browser; but when I build my .APK file and open my app, neither works the way it does in the browser.
First 1) I tried an input tag, but when I click to choose/browse it only allows me to select an existing file, not to capture or record a new one:
<input type="file" accept="image/*" capture="camera"/>
Then 2) I found this navigator.getUserMedia code on the web, which also works from a browser, but when I open my app it doesn't seem to work and doesn't request device hardware access at all:
navigator.getUserMedia = navigator.getUserMedia ||
  navigator.webkitGetUserMedia ||
  navigator.mozGetUserMedia;

if (navigator.getUserMedia) {
  navigator.getUserMedia({ audio: true, video: { width: 1280, height: 720 } },
    function (stream) {
      var video = document.querySelector('video');
      // Note: createObjectURL(stream) is deprecated;
      // modern browsers require video.srcObject = stream instead.
      video.src = window.URL.createObjectURL(stream);
      video.onloadedmetadata = function (e) {
        video.play();
      };
    },
    function (err) {
      console.log("The following error occured: " + err.name);
    }
  );
} else {
  console.log("getUserMedia not supported");
}
/*****************************************************************/
function init() {
  if (navigator.webkitGetUserMedia) {
    navigator.webkitGetUserMedia({ video: true }, onSuccess, onFail);
  } else {
    alert('webRTC not available');
  }
}

function onSuccess(stream) {
  document.getElementById('camFeed').src = webkitURL.createObjectURL(stream);
}

function onFail() {
  alert('could not connect stream');
}

function takePhoto() {
  var c = document.getElementById('photo');
  var v = document.getElementById('camFeed');
  c.getContext('2d').drawImage(v, 0, 0, 320, 240);
}
<div style="width:352px; height:625px; margin:0 auto; background-color:#fff;" >
<div>
<video id="camFeed" width="320" height="240" autoplay>
</video>
</div>
<div>
<canvas id="photo" width="320" height="240">
</canvas>
</div>
<div style="margin:0 auto; width:82px;">
<input type="button" value="Take Photo" onclick="takePhoto();">
</div>
</div>
Does anyone have an idea how I can access the camera on Android or iOS?
I tried it with and without camera permissions, but that was not the problem.
Thank you in advance. (If my question is not clear, please comment.)
Here is the basic code; it will allow you to record/capture the file.
HTML:
<input id="reviewVideoCapture" type="file" accept="video/*"/>
JS (Using JQuery):
var videoFile = $('#reviewVideoCapture').get(0).files[0];
Use this if you want to upload the video somewhere:
<form enctype="multipart/form-data" action="/api/photo" method="POST" >
<input id="reviewVideoCapture" type="file" accept="video/*"/>
<input id="submitFormButton" type="submit">
</form>
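If you'd rather upload without a full page post, the same input can be read and sent with fetch and FormData (a sketch; /api/photo is the endpoint from the form above):

var fileInput = document.getElementById('reviewVideoCapture');
fileInput.addEventListener('change', function () {
  var data = new FormData();
  data.append('video', fileInput.files[0]);
  // POST the captured file to the same endpoint the form targets.
  fetch('/api/photo', { method: 'POST', body: data })
    .then(function (res) { console.log('upload status', res.status); });
});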
Even though this works, I would still not recommend doing it this way. The results on mobile are buggy and hard to work with. The better way is to use Cordova/Ionic. A good example of how to do this can be found here:
https://github.com/yafraorg/ionictests
