I am developing a website with a full background video.
To optimize for low speed connections / mobile, I am using a media query to detect screen sizes smaller than 768 px, and then applying display: none to the video container and displaying a background image instead.
My question here is:
Is this the correct way to optimize for low speed connections / mobile?
Does hiding containers with CSS have any real impact on what actually gets loaded, or should I be doing it in JavaScript instead, when the page loads?
Media queries will allow you to load different images if they are set as backgrounds, so that's a start for small screens, but they won't help with a low-speed connection on a computer, and they won't control whether a video or other additional files get loaded.
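For the screen-size part of the question specifically, the same check can also be done in script with window.matchMedia, so that the video element is never even created on small screens. A rough sketch, assuming a container like <div id="videoContainer" data-video-src="bg-video.mp4"> (the id, attribute and file name are made up):
var container = document.getElementById('videoContainer');
if (window.matchMedia('(min-width: 768px)').matches) {
    // wide screen: build the <video> element so only these users download it
    var video = document.createElement('video');
    video.src = container.getAttribute('data-video-src');
    video.autoplay = true;
    video.loop = true;
    video.muted = true;
    container.appendChild(video);
}
// on small screens nothing is added and the CSS background image stays in place
That only covers screen size, though; it still tells you nothing about connection speed.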
In JS
This is what I can think of at the moment; it's probably not very reliable, because it depends on how much content you have on your website.
It would consist of loading only the most important content first (assume a low-speed connection), getting an approximate loading time for it (DOM, images, CSS, JS...), and then choosing whether or not to load the rest.
// get the current time as soon as you can (directly in the head tag)
var start = new Date().getTime();
// do the same after the page has loaded and find out the difference
window.onload = function(){
    var end = new Date().getTime();
    var timeTaken = end - start;
    alert('It took ' + timeTaken + ' ms to load');
    if(timeTaken < 2000){
        // load more stuff if it took less than 2 seconds, for example
    }
}
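For the background-video case, "load the rest" could simply mean attaching the video only when the measured time looks good, for example (the container id and file name are placeholders):
window.onload = function(){
    var timeTaken = new Date().getTime() - start; // "start" was set in the <head> as above
    if (timeTaken < 2000) {
        // the page loaded quickly, so pull in the heavy background video
        var video = document.createElement('video');
        video.src = 'bg-video.mp4';
        video.autoplay = true;
        video.loop = true;
        video.muted = true;
        document.getElementById('videoContainer').appendChild(video);
    }
    // otherwise keep the lightweight background image that is already in place
};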
Again: not very reliable. A page with lots of images is going to take longer, and finding the perfect "timeout" (2 seconds here) won't be easy. Also, this won't work if your users have JS disabled, but that's not a concern I'm worried about these days :) You should probably wait for other answers.
In PHP
Another method I can think of is doing it in PHP, if that's an option for you. You could record the time at which the client requested your PHP page. Then, for example, if you have an external JS file, you can do this:
index.php
<script src="myScript.php?time=<?=microtime(true)?>"></script>
myScript.php would be a PHP page that gets the time of this request, compares it with the first one, and then chooses which JS file to serve based on that (this is sometimes called a proxy page).
From the JS file you choose, you can load different stuff based on what you want to do.
myScript.php
<?php
header("Content-Type: text/javascript");
$start = floatval( $_GET['time'] );   // timestamp embedded by index.php
$end = microtime(true);               // timestamp of this request
$timeTaken = ($end - $start) * 1000;  // difference in milliseconds
if( $timeTaken < 2000 ){
    echo file_get_contents('JSForHighSpeed.js');
} else {
    echo file_get_contents('JSForLowSpeed.js');
}
?>
What are you using as a player for your videos?
For what you're doing, the answers will be in jQuery, not CSS. With videos, it's important to know what the user's bandwidth is so that you can supply the correct video resolution. Most phones can support 1080p resolutions (oftentimes double that, especially with Apple's Retina Display or Samsung's high-resolution screens). In other words, it shouldn't matter whether they are using a phone or a cinema display; what matters is their connection speed.
I've had good luck with JWPlayer and using Amazon S3 for storage. It's also been my experience that H.264 MP4's are the way to go.
Whatever you're using, you should be able to supply multiple versions of your video(s). For example, you might create different resolutions - 360, 720 and 1080.
Here's a jQuery utility you can use to determine the user's bandwidth. Make sure to create a file named "10.kb.file.zip" (and make sure it's exactly 10 kb).
/*
* measureBandwidth.js
* Directory: ~/lib/js/
* jQuery utility for measuring a user's bandwidth
*/
var url = 'js/10.kb.file.zip?{0}';
var start = '';
function getBandwidth(callback) {
    start = new Date();
    getFile(1, callback);
}
function getFile(i, callback) {
    $.get(url.f(Math.random()), function () {
        i++;
        if (i < 6) {
            getFile(i, callback);
        } else {
            // 5 requests x 10 kB = 50 kB downloaded in total
            var end = new Date();
            var speed1 = Math.round(((50 / ((end - start) * .001) * 8) / 1000) * 10) / 10; // Mbit/s
            var speed2 = Math.round(50 / ((end - start) * .001) * 10) / 10; // kB/s
            callback(speed1, speed2);
        }
    });
}
String.prototype.f = function () { var args = arguments; return this.replace(/\{(\d+)\}/g, function (m, n) { return args[n]; }); };
Then, you can use it like this:
getBandwidth(function (Mbits, kBs) {
    $('#speed1').html(Mbits + ' Mbit/s');
    $('#speed2').html(kBs + ' kB/s');
});
Based on those results, you can then set the appropriate video for the user.
For example, I route kBs < 128 to default to 360p video, and kBs > 128 to the 720p video.
In JWPlayer, you would add all of your videos to the "playlist" and give them labels like "360p", "720p" etc.
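Putting it together, the routing might look roughly like this. This is a sketch only: the thresholds and file names are examples, and the exact setup options should be checked against the JWPlayer documentation:
getBandwidth(function (Mbits, kBs) {
    // pick a rendition based on measured throughput (thresholds are examples)
    var file;
    if (kBs < 128) {
        file = 'video-360.mp4';
    } else if (kBs < 512) {
        file = 'video-720.mp4';
    } else {
        file = 'video-1080.mp4';
    }
    // hand the chosen file to the player, e.g. JWPlayer:
    jwplayer('player').setup({ file: file });
});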
Related
Just for fun, I'm trying to implement a "15 puzzle", but with 16 images (cut from one music photo) instead.
The thing is split into two scripts / sides: a Python CGI script that performs the Last.FM query and splits the image into Y x Z chunks. When the Python script finishes, it outputs a JSON string that contains the location (on the server), the extension, etc.
{"succes": true, "content": {"nrofpieces": 16, "size": {"width": 1096, "height": 961}, "directoryname": "Mako", "extension": "jpeg"}}
On the other side is an HTML / JS / (CSS) combo that queries the CGI script for the images.
$(document).ready(function () {
    var artiest = $("#artiest")
    var rijen = $("#rijen")
    var kolommen = $("#kolommen")
    var speelveld = $("#speelveld")
    var search = $("#search")
    $("#buttonClick").click(function () {
        var artiestZ = artiest.val()
        var rijenZ = rijen.val()
        var kolommenZ = kolommen.val()
        $.getJSON("http://localhost:8000/cgi-bin/cgiScript.py", "artiest=" + artiestZ + "&rijen=" + rijenZ + "&kolommen=" + kolommenZ, function (JsonSring) {
            console.log("HIIIIII")
            if (JsonSring.succes === true){
                console.log(JsonSring)
                var baseUrl = "http://localhost:8000/"
                var extension = JsonSring.content.extension
                var url = baseUrl + JsonSring.content.directoryname + "/"
                var amountX = rijenZ
                var amountY = kolommenZ
                for (var i = 0; i < amountX; i += 1){
                    for (var p = 0; p < amountY; p += 1){
                        console.log("HI")
                        var doc = new Image
                        doc.setAttribute("src", url + JsonSring.content.directoryname + i + "_" + p + "." + extension)
                        document.getElementById("speelveld").appendChild(doc)
                    }
                }
            } else {
                // Search failed. Deal with it.
            }
        })
    })
})
where the various ids refer to various HTML elements (text fields, buttons and divs).
Beneath is a screenshot of the full folder that contains the image files.
Now, coming to the point: all the HTML img tags and their src attributes seem correct, yet some images don't load while others do. I also noticed that the images fail to load at 2-second intervals. Is there some kind of timeout or something?
All this is being run on a local machine, so disk speed and CPU shouldn't really affect the matter. Also, from what I understand, the code that creates the img tags runs in the callback of getJSON, meaning it only runs once getJSON has finished / received a reply.
Does the great StackOverFlow community have an idea what's happening here?
To share my knowledge/experience with the great StackOverflow community:
Small backstory
After progressing a bit further into the project I started to run into various issues, from JSON parsing to missing Access-Control-Allow-Origin: * headers, which made it very hard to get the Ajax request (client ==> Python CGI) done.
In the meantime I also started developing on my main desktop (which for some reason either has massive issues with Python versioning or none at all). Because the terminal on my desktop runs Python 3.4+, there was no CGIHTTPServer module. After a small amount of digging, I found that CGIHTTPServer had been merged into http.server, yet when running plain old python -m http.server, I noticed the CGI script wouldn't run; it would just be displayed. Of course, I had forgotten the --cgi option.
Main solution
The times I was successfully using CGIHTTPServer, I had trouble: the images wouldn't load, as described above. I suspect the module simply couldn't handle that many requests, meaning that when Y x Z requests suddenly came in, it would struggle to deliver all the data ==> Connection Refused.
Since switching to python -m http.server --cgi, no problems whatsoever. Currently working on a Bootstrap grid for all those images!
Thx #Lashane and #Ruud.
I need to collect data from a website that allows me to download the results of a query, but only the results currently displayed on the page.
I have no experience whatsoever with javascript or even any real programming. A friend told me that I might be able to do it with the "Custom JavaScript for websites" Addon for Chrome.
I managed to get it to download the file I want for each page using:
document.getElementById('dContent').value = 'full';
document.getElementById('submit-download').click();
I still need to change to the next page manually. I tried to do this automatically by adding a loop:
for (i = 1; i < 20; i++) {
    location.href = 'https://www.myurl.com/search?query=keyword&page=' + i;
    document.getElementById('dContent').value = 'full';
    document.getElementById('submit-download').click();
}
But it doesn't seem to work*. My Google skills and limited knowledge only took me this far. Am I doing something wrong? I wonder whether the addon even allows this, as I reckon it might not work once a new page starts loading. Is there other software I might use for this purpose?
Thank you in advance for your help.
* The URL for the queries follows the page=1 / page=2 / ... pattern for the results.
// The extension re-runs this script on every page load, so each run downloads
// the current page's results and then navigates to the next page.
setTimeout(function(){
    document.getElementById('dContent').value = 'full';
    document.getElementById('submit-download').click();
    // read the current page number from the query string
    var page = parseInt(location.search.match(/\d\d?$/)[0]);
    if(!isNaN(page) && page < 20){
        location.href = 'https://www.myurl.com/search?query=keyword&page=' + (page + 1);
    }
}, 1000);
I've noticed that videos can be streamed automatically by Chrome / Firefox.
If you open for example
http://domain-foo.com/file.mp4
in Chrome / Firefox, you can jump from one place to another on the timeline of the built-in media player without the file having to load to the end.
Is it possible to invoke that behaviour from PHP, providing the URL to the video from a database?
I have got an .htaccess that interprets .mp4 file URLs as .php in order to prevent people from stealing video content they haven't bought. After checking whether somebody has bought the content / is logged in, I return the proper header in PHP and read the file in a loop using the fread function.
Everything works fine, but I don't know how to change it so that people can jump around the timeline while keeping the videos secure at the same time.
1) Any ideas? Is it at all possible to invoke the browser's player from a PHP script, or at least return it as HTML?
2) Is it possible to somehow return the video link from the .php file to, for example, JWPlayer, and let the native code handle it normally, instead of writing my own streaming code, which is surely less efficient, uses lots of CPU power, and on top of that gets cut off after a while (30 minutes, the limit set on the server) because a PHP script can't run for that long...
Best regards
You need to send and handle the header('Accept-Ranges: bytes'); header.
When a user clicks to skip forward in the video, the player sends a Range header, available as $_SERVER['HTTP_RANGE'], to the server; you need to read this and then seek to that part of the file.
Example:
<?php
...
...
//check if http_range is sent by browser (or download manager)
if(isset($_SERVER['HTTP_RANGE'])){
    list($size_unit, $range_orig) = explode('=', $_SERVER['HTTP_RANGE'], 2);
    if ($size_unit == 'bytes'){
        //multiple ranges could be specified at the same time, but for simplicity only serve the first range
        //http://tools.ietf.org/id/draft-ietf-http-range-retrieval-00.txt
        list($range, $extra_ranges) = explode(',', $range_orig, 2);
    }else{
        $range = '';
        header('HTTP/1.1 416 Requested Range Not Satisfiable');
        exit;
    }
}else{
    $range = '';
}
//figure out download piece from range (if set)
list($seek_start, $seek_end) = explode('-', $range, 2);
//set start and end based on range (if set), else set defaults
//also check for invalid ranges.
$seek_end = (empty($seek_end)) ? ($file_size - 1) : min(abs(intval($seek_end)), ($file_size - 1));
$seek_start = (empty($seek_start) || $seek_end < abs(intval($seek_start))) ? 0 : max(abs(intval($seek_start)), 0);
//Only send partial content header if downloading a piece of the file (IE workaround)
if ($seek_start > 0 || $seek_end < ($file_size - 1)){
    header('HTTP/1.1 206 Partial Content');
    header('Content-Range: bytes '.$seek_start.'-'.$seek_end.'/'.$file_size);
    header('Content-Length: '.($seek_end - $seek_start + 1));
}else{
    header("Content-Length: $file_size");
}
header('Accept-Ranges: bytes');
...
...
?>
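You can quickly check that the endpoint honours ranges from the browser; a correct implementation should answer a request like the one below with a 206 status and a Content-Range header (the URL is a placeholder):
var xhr = new XMLHttpRequest();
xhr.open('GET', '/videos/myvideo.mp4', true); // URL served by the PHP script above
xhr.setRequestHeader('Range', 'bytes=0-1023'); // ask for the first 1 kB only
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) {
        console.log(xhr.status); // expect 206, not 200
        console.log(xhr.getResponseHeader('Content-Range')); // e.g. "bytes 0-1023/12345678"
    }
};
xhr.send();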
I want to run some JavaScript code to ping 4 different IP addresses, then retrieve the packet loss and latency of those ping requests and display them on the page.
How do I do this?
You can't do this from JS. What you could do is this:
client --AJAX-- yourserver --ICMP ping-- targetservers
Make an AJAX request to your server, which will then ping the target servers for you, and return the result in the AJAX result.
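The client side of that can stay very small. For example, with jQuery and a hypothetical ping.php endpoint on your own server that runs the actual ICMP ping and returns the result as JSON:
// ping.php is a made-up endpoint: it pings the host server-side and returns
// something like {"host": "192.0.2.1", "loss": 0, "avg_ms": 23.4}
$.getJSON('ping.php', { host: '192.0.2.1', count: 4 }, function (data) {
    // #result is just an example element on the page
    $('#result').append(data.host + ': ' + data.loss + '% loss, ' + data.avg_ms + ' ms average<br>');
});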
Possible caveats:
this tells you whether the target servers are pingable from your server, not from the user's client
so the client won't be able to test hosts on its own LAN
but you shouldn't let clients check hosts on the server's internal network either, if any exist
some hosts may block traffic from certain hosts and not others
you need to limit the ping count per machine:
to avoid the AJAX request from timing out
some site operators can get very upset when you keep pinging their sites all the time
resources:
long-running HTTP requests could run into the maximum connection limit of your server; check how high it is
many users trying to ping at once might generate suspicious-looking traffic (all ICMP and nothing else)
concurrency - you may wish to pool/cache the up/down status for a few seconds at least, so that multiple clients wishing to ping the same target won't launch a flood of pings
The only method I can think of is loading e.g. an image file from the external server. When that load fails, you "know" the server isn't responding (you actually don't know, because the server could just be blocking you).
Take a look at this example code to see what I mean:
/* note that this is not an ICMP ping - but a simple HTTP request
   giving you an idea of what you could do. In this simple implementation it has flaws,
   as Piskvor correctly points out below */
function ping(extServer){
    var ImageObject = new Image();
    ImageObject.src = "http://" + extServer + "/URL/to-a-known-image.jpg"; //e.g. logo -- mind the caching, maybe use a dynamic querystring
    // note: this check runs immediately, before the image has had a chance to load;
    // a more reliable version would use the onload / onerror handlers instead
    if(ImageObject.height > 0){
        alert("Ping worked!");
    } else {
        alert("Ping failed :(");
    }
}
I was inspired by the latest comment, so I wrote this quick piece of code.
This is a kind of "HTTP ping" which I think can be quite useful alongside XMLHttpRequest calls, for instance to figure out which is the fastest server to use in a given case, or to collect some rough statistics about the user's internet connection speed.
This small function simply connects to an HTTP server on a non-existent URL (which is expected to return a 404), measures the time until the server answers the HTTP request, and averages the accumulated time over the number of iterations.
The requested URL is modified randomly on each call, since I've noticed that (probably) some transparent proxies or caching mechanisms were faking the results in some cases, giving extra-fast answers (faster than ICMP actually, which is somewhat weird).
Be careful to use FQDNs that point to a real HTTP server!
Results will be displayed in an element with the id "result", for instance:
<div id="result"></div>
Function code:
function http_ping(fqdn) {
    var NB_ITERATIONS = 4; // number of loop iterations
    var MAX_ITERATIONS = 5; // beware: the number of simultaneous XMLHttpRequest is limited by the browser!
    var TIME_PERIOD = 1000; // 1000 ms between each ping
    var i = 0;
    var over_flag = 0;
    var time_cumul = 0;
    var REQUEST_TIMEOUT = 9000;
    var TIMEOUT_ERROR = 0;
    document.getElementById('result').innerHTML = "HTTP ping for " + fqdn + "</br>";
    var ping_loop = setInterval(function() {
        // let's change non-existent URL each time to avoid possible side effect with web proxy-cache software on the line
        url = "http://" + fqdn + "/a30Fkezt_77" + Math.random().toString(36).substring(7);
        if (i < MAX_ITERATIONS) {
            var ping = new XMLHttpRequest();
            i++;
            ping.seq = i;
            over_flag++;
            ping.date1 = Date.now();
            ping.timeout = REQUEST_TIMEOUT; // it could happen that the request takes a very long time
            ping.onreadystatechange = function() { // the request has returned something, let's log it (starting after the first one)
                if (ping.readyState == 4 && TIMEOUT_ERROR == 0) {
                    over_flag--;
                    if (ping.seq > 1) {
                        delta_time = Date.now() - ping.date1;
                        time_cumul += delta_time;
                        document.getElementById('result').innerHTML += "</br>http_seq=" + (ping.seq - 1) + " time=" + delta_time + " ms</br>";
                    }
                }
            }
            ping.ontimeout = function() {
                TIMEOUT_ERROR = 1;
            }
            ping.open("GET", url, true);
            ping.send();
        }
        if ((i > NB_ITERATIONS) && (over_flag < 1)) { // all requests are passed and have returned
            clearInterval(ping_loop);
            var avg_time = Math.round(time_cumul / (i - 1));
            document.getElementById('result').innerHTML += "</br> Average ping latency on " + (i - 1) + " iterations: " + avg_time + "ms </br>";
        }
        if (TIMEOUT_ERROR == 1) { // timeout: data cannot be accurate
            clearInterval(ping_loop);
            document.getElementById('result').innerHTML += "<br/> THERE WAS A TIMEOUT ERROR <br/>";
            return;
        }
    }, TIME_PERIOD);
}
For instance, launch with:
http_ping("www.linux.com.au");
Note that I couldn't find a simple correlation between the figures from this script and ICMP ping results for the same servers, though HTTP response time seems to grow roughly exponentially with ICMP response time. This may be explained by the amount of data transferred in the HTTP exchange, which can vary depending on the web server flavour and configuration, obviously the speed of the server itself, and probably other reasons.
This is not very good code but I thought it could help and possibly inspire others.
The closest you're going to get to a ping in JS is using AJAX, and retrieving the readystates, status, and headers. Something like this:
url = "<whatever you want to ping>"
ping = new XMLHttpRequest();
ping.onreadystatechange = function(){
    document.body.innerHTML += "</br>" + ping.readyState;
    if(ping.readyState == 4){
        if(ping.status == 200){
            result = ping.getAllResponseHeaders();
            document.body.innerHTML += "</br>" + result + "</br>";
        }
    }
}
ping.open("GET", url, true);
ping.send();
Of course you can also add conditions for different HTTP statuses, and make the output display however you want, with descriptions etc., to make it look nicer. It's more of an HTTP URL status checker than a ping, but it's the same idea really. You can always loop it a few times to make it feel more like a ping for you too :)
I've come up with something because I was tired of searching hour after hour for something that everyone says is "impossible"; the only thing I had found relied on jQuery.
I came up with a new, simple way using vanilla JS (nothing other than plain JavaScript).
Here's my JSFiddle: https://jsfiddle.net/TheNolle/5qjpmrxg/74/
Basically, I create a variable called "start" and give it the current timestamp. Then I try to set an invisible image's source to my website (which isn't an image) [this can be changed to any website]. Because it's not an image, an error is raised, which I use to execute the second part of the code: there I create a new variable called "end" and give it the timestamp at that point (which is different from "start"). Afterwards, I simply do a subtraction (I subtract "start" from "end"), which gives me the latency it took to ping the website.
After that, you have the choice to store it in a variable, display it on your webpage, print it to the console, etc.
let pingSpan = document.getElementById('pingSpan');
// Remove all the way to ...
let run;
function start() {
    run = true;
    pingTest();
}
function stop() {
    run = false;
    setTimeout(() => {
        pingSpan.innerHTML = "Stopped !";
    }, 500);
}
// ... here
function pingTest() {
    if (run == true) { // Remove line
        let pinger = document.getElementById('pingTester');
        let start = new Date().getTime();
        pinger.setAttribute('src', 'https://www.google.com/');
        pinger.onerror = () => {
            let end = new Date().getTime();
            // Change to whatever you want it to be, I've made it so it displays on the page directly, do whatever you want but keep the "end - start + 'ms'"
            pingSpan.innerHTML = end - start + "ms";
        }
        setTimeout(() => {
            pingTest();
        }, 1000);
    } // Remove this line too
}
body {
    background: #1A1A1A;
    color: white
}
img {
    display: none
}
Ping:
<el id="pingSpan">Waiting</el>
<img id="pingTester">
<br> <br>
<button onclick="start()">
Start Ping Test
</button>
<button onclick="stop()">
Stop
</button>
function ping(url){
    new Image().src = url
}
The above "pings" the given URL by requesting it as an image.
This is generally used for counters / analytics.
It won't report failed responses back to the client (JavaScript).
I suggest using "HEAD" to request the headers only.
xhr.open('HEAD', 'assets/imgPlain/pixel.txt' + cacheBuster(), true);
and then check for readyState 2 - HEADERS_RECEIVED: send() has been called, and the headers and status are available.
xhr.onreadystatechange = function() {
if (xhr.readyState === 2) { ...
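A slightly fuller sketch of that idea, timing a HEAD request up to readyState 2 (here the cache buster is just a random query string, and pixel.txt is whatever tiny file you host):
var xhr = new XMLHttpRequest();
var t0 = new Date().getTime();
xhr.open('HEAD', 'assets/imgPlain/pixel.txt?' + Math.random(), true);
xhr.onreadystatechange = function () {
    if (xhr.readyState === 2) { // HEADERS_RECEIVED: status and headers are in
        console.log('round trip: ' + (new Date().getTime() - t0) + ' ms, status ' + xhr.status);
    }
};
xhr.send();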
Is it possible to ping a server from Javascript?
You should check out the above solution. Pretty slick.
Not mine, obviously, but I wanted to make that clear.
You can't PING with JavaScript. I created a Java servlet that returns a 10x10 pixel green image if the host is alive and a red image if it is dead. https://github.com/pla1/Misc/blob/master/README.md
I'm working on a JavaScript/jQuery-powered image preloader, and have hit a bit of a snag. While it currently reports progress based on loaded_images / total_images, this is not very accurate, given that a page could have a thousand 1 kB images and a single 1 MB image.
I'm looking for a way to incorporate filesize into the progress calculations. Now, I've looked into some (cross-browser compatible) tricks for capturing the filesize of a given image, and it seems that Ajax HEAD requests for Content-Length are the most reliable (in terms of accuracy), like so:
var imageSizeTotal = 0;
var ajaxRequest = $.ajax({
    type: 'HEAD',
    url: 'path/to/image',
    success: function(message){
        imageSizeTotal += parseInt(ajaxRequest.getResponseHeader('Content-Length'));
    }
});
Now, I find this method to be quite useful, as I can provide a status message of Initializing while the necessary requests are taking place. However my issue now is two-fold:
1) Is there any way to capture the bytes loaded so far for a given image object, perhaps using setInterval() to check periodically? Otherwise, I'm back at the issue of the progress indicator hanging on large files.
2) How can I force the actual progress-calculating portion of the script to wait until the necessary Ajax requests have completed (displaying Initializing or whatever), so it can then go ahead with the loading?
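For the second point, one way I can think of to hold off the progress phase is to collect the HEAD requests' promises and only start once all of them have resolved; a rough sketch with jQuery's $.when (the URLs are placeholders):
var imageUrls = ['img/a.jpg', 'img/b.jpg', 'img/c.jpg']; // placeholder list
var imageSizeTotal = 0;
var headRequests = $.map(imageUrls, function (url) {
    return $.ajax({ type: 'HEAD', url: url }).done(function (data, status, jqXHR) {
        imageSizeTotal += parseInt(jqXHR.getResponseHeader('Content-Length'), 10);
    });
});
// "Initializing" phase: wait until every HEAD request has answered
$.when.apply($, headRequests).always(function () {
    console.log('total bytes to load: ' + imageSizeTotal);
    // start the real preloading / progress calculation here
});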
Also, here's the script I currently use, which again, calculates progress based on the number of images, regardless of filesize or bytes received.
var preloaderTotal = 0;
var preloaderLoaded = 0;
var preloaderCurrent = null;
$('#preloaderCurtain')
    .bind('preloaderStart', function(){
        $(this)
            .show();
        $('*')
            .filter(function(e){
                if($(this).css('background-image') != 'none'){
                    preloaderTotal++;
                    return true;
                }
            })
            .each(function(index){
                preloaderCurrent = new Image();
                preloaderCurrent.src = $(this).css('background-image').slice(5, -2);
                preloaderCurrent.onload = function(e){
                    preloaderLoaded++;
                    if(preloaderLoaded == preloaderTotal - 1){
                        $('#preloaderCurtain')
                            .trigger('preloaderComplete')
                    }
                    $('#preloaderCurtain')
                        .trigger('preloaderProgress')
                };
            });
    })
    .bind('preloaderComplete', function(){
        $(this)
            .fadeOut(500);
        startAnimation();
    })
    .bind('preloaderProgress', function(e){
        $('#preloaderProgress')
            .css('opacity', 0.25 + (preloaderLoaded / preloaderTotal))
            .text(Math.floor((preloaderLoaded / preloaderTotal) * 100) + '%');
    })
    .trigger('preloaderStart');
Hopefully I'll be able to turn this into a plugin, once I work the bugs out of it.
It looks like a similar question was asked and answered here:
XmlHttpRequest.responseText while loading (readyState==3) in Chrome
and here:
Comet Jetty/Tomcat, having some browser issues with Firefox and Chrome
Basically - .responseText.length for Firefox and iPhone, .responseBody.length for IE, WebSockets for Chrome.
The second thread suggests that Bayeux/Dojo encapsulate all of this for you in a higher-level API so you don't have to write it yourself.
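In practice that boils down to watching the response grow while the request is still in flight; a rough sketch of the responseText-based variant (the URL is a placeholder):
var xhr = new XMLHttpRequest();
var lastLength = 0;
xhr.open('GET', 'path/to/large-image.jpg', true);
xhr.onreadystatechange = function () {
    if (xhr.readyState === 3) { // LOADING: partial data has arrived
        var loaded = xhr.responseText.length; // characters received so far (Firefox/iPhone, per the threads above)
        console.log('received ' + (loaded - lastLength) + ' new, ' + loaded + ' total');
        lastLength = loaded;
    }
};
xhr.send();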