Multiple Base64 images on HTML page and smooth loading - javascript

Does anyone know a technique for asynchronous image parsing when multiple images are printed (base64) on a web page?
It causes a small freeze in Firefox on loading/parsing, even on a gaming machine (for more than 15 images, about 1.5 MB), so I'm a bit worried about that.
Still, I think giving a URL and using JavaScript async (lazy) image loading is better. If someone has more information or tips, I'll be glad to hear them. Thanks.

The answer is: there is no way to control how fast the browser parses base64 images printed inline (sent in the HTML HTTP response). If you print a lot of images, the browser will use more CPU to parse the page.
If you have binary images, the solution is to serve them separately: give each image its own URL that returns its data individually (not as a static file).
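A minimal sketch of such an endpoint in plain Node.js; loadImageBytes() is a hypothetical helper standing in for wherever the binary data actually lives (a database blob, generated data, and so on):

```javascript
// Sketch: serve each image's binary data from its own URL, e.g. /images/42.
const http = require('http');
const fs = require('fs');

async function loadImageBytes(id) {
  // Hypothetical stand-in: reads from disk instead of a database lookup.
  return fs.promises.readFile(`./images/${id}.png`);
}

http.createServer(async (req, res) => {
  const match = req.url.match(/^\/images\/(\w+)$/);
  if (!match) {
    res.writeHead(404);
    return res.end();
  }
  const buf = await loadImageBytes(match[1]);
  res.writeHead(200, { 'Content-Type': 'image/png' });
  res.end(buf);
}).listen(8080);
```

Each img tag in the page then points at /images/&lt;id&gt; instead of embedding the data inline.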
The problem with this is getting the browser cache to work: if it doesn't, every page load forces the server to render every image again, overloading the web server.
Another option is to cache the images on the server side, but the client would still have to download every image each time, consuming the web server's bandwidth.
Browser caching can be enabled with the HTTP Cache-Control header: https://css-tricks.com/snippets/php/intelligent-php-cache-control/
The best approach is to use Cache-Control, but to cache on the server side as well.
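Building on the sketch above, the handler could send caching headers like this (the max-age value and the ETag scheme are assumptions of this sketch; `req`, `res`, `match`, and `buf` come from the handler above):

```javascript
// Sketch: let the browser keep the image and revalidate it cheaply.
const etag = `"img-${match[1]}-v1"`; // made-up version tag for the image
if (req.headers['if-none-match'] === etag) {
  res.writeHead(304); // browser copy still valid: no body is re-sent
  return res.end();
}
res.writeHead(200, {
  'Content-Type': 'image/png',
  'Cache-Control': 'public, max-age=86400', // allow caching for one day
  'ETag': etag,
});
res.end(buf);
```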
Of course this only applies to binary data; if you have plain image files, just let your web server serve them directly.
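As for the async (lazy) loading the question mentions: once each image has its own URL, a small client-side sketch like this defers the downloads until the images scroll into view (the data-src attribute is an assumption of this sketch):

```javascript
// Lazy-load images: only set the real src once the image scrolls into view.
const observer = new IntersectionObserver((entries) => {
  for (const entry of entries) {
    if (entry.isIntersecting) {
      entry.target.src = entry.target.dataset.src; // start the download
      observer.unobserve(entry.target);
    }
  }
});
document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
```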

Related

Programmatically save or get byte array data (to save by other means) of images on a page in any browser without having to redownload them

I'm looking for a way to save the images on a page I'm browsing (scrape-as-you-browse essentially), without making repeated download requests. There are multiple cases where the images simply won't download properly if you try that, and since the browser already has downloaded them once, I feel like there has to be a way to avoid wasting internet traffic and time spent waiting for images, especially large ones, to download again. Maybe there is a way to read them from cache which I haven't found yet.
I've already tried using canvas to redraw the images and get the base64-encoded string, and getting a binary blob array of bytes, but these methods don't work if there are any CORS restrictions. They also don't recover the original image bytes 1-to-1.
Is it possible in any modern browser to get the images which have already been received (as you can see, for example, in Firefox > Tools > Page info > Media tab) without making a second download request?
I would then send the byte array (or base64 encoded string of it) of the source image file to a localhost address to save it as a file using a listening app (this part I've already implemented).
I'm not looking for browser addons/extensions or which menu buttons to press in browser GUI, but for javascript methods I can call in a userscript via GreaseMonkey (or any other userscript extension that supports a working solution) to get the source image file data.
I've already looked at older questions and they either don't have any working answers or the answers have problems like CORS or lossy canvas redrawing or repeated download requests which may not work, as described above. So if you link to a duplicate, please make sure it has a solution that works without those caveats.
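For what it's worth, one partial approach (not an answer for the CORS-restricted case) is to ask fetch() to prefer the HTTP cache, which avoids a second network download when the image response is still cached. A hedged sketch for a userscript, assuming the localhost listening app the question describes:

```javascript
// Sketch: re-read an already-downloaded image from the HTTP cache if possible,
// then forward its bytes to a local listener. This works for same-origin images
// or images served with CORS headers; opaque cross-origin responses still
// cannot be read, as the question notes. The localhost URL is a placeholder
// for the questioner's own listening app.
async function saveImage(imgUrl) {
  const resp = await fetch(imgUrl, { cache: 'force-cache' }); // prefer the HTTP cache
  const bytes = await resp.arrayBuffer();
  await fetch('http://localhost:8000/save', { method: 'POST', body: bytes });
}
```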

Is there a way to extend DATA URI size limit on chrome?

Good day,
I have made a system that renders a large amount of data into a PDF file. The file can be 30 MB or more, so if I use open(dataUri) the PDF is not rendered in the opened window; for now I am forcing the user to download the file before they open it. The problem with this is that files get duplicated over and over again every time they click the download button. The solution I have in mind is to extend the data URI limit in the browser (specifically Chrome), since I am deploying the system locally (for now).
Any suggestions are very much appreciated. Thanks!
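A common workaround is to skip data URIs entirely and open the PDF through a blob: URL, which is not subject to data URI length limits. A sketch, assuming the rendered PDF bytes are already available as a Uint8Array named pdfBytes:

```javascript
// Sketch: open a large generated PDF via a blob: URL instead of a data: URI.
const blob = new Blob([pdfBytes], { type: 'application/pdf' });
const url = URL.createObjectURL(blob);
window.open(url); // renders in a new tab, no forced download
// Call URL.revokeObjectURL(url) later, once the tab no longer needs it.
```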

How to download video in background HTML5 / JS

Is there a way to make a video download in the background (possibly on a different thread?) while I fetch my images and make other web requests?
Situation:
I show a video, and the video plays fine.
But once I start fetching two images at the same time from that server, the page won't respond: it waits for the video to finish loading, and only then loads the images within a few seconds (about 100 kB per image).
If I try to open any other page from the same server in the same browser, it won't load until the stalled page is done loading; any other site (for example Google) loads fine.
So is there a way to make the browser not download the full video up front, or maybe give priority to the images?
This sounds like the server throttling your requests, as everything apart from scripts loads asynchronously in a browser.
It might be that you are only allowed so much bandwidth per second from the server, or so many concurrent connections (browsers also cap connections per host, typically around six), and that could be why the server won't respond until your first request has finished.
Check your server configuration and have a play with it to rule this out.
Alternatively, you could try a worker, which provides a way to run background scripts (you will have to keep your video and images in separate files). The Use Cases section of the article below lists "Prefetching and/or caching data for later use" as a possible scenario. Hope it helps.
More info here:
http://www.html5rocks.com/en/tutorials/workers/basics/#toc-examples
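A rough sketch of that worker idea, with placeholder file names and video URL. Note that the browser's per-host connection limit still applies, so this mainly moves the work off the main thread and lets you prefetch on your own schedule:

```javascript
// main.js — ask a worker to prefetch the video, then play it from a blob: URL.
const worker = new Worker('prefetch-worker.js'); // placeholder file name
worker.onmessage = (e) => {
  // e.data is the downloaded video as a Blob.
  document.querySelector('video').src = URL.createObjectURL(e.data);
};
worker.postMessage('/videos/clip.mp4'); // placeholder video URL

// prefetch-worker.js — download in the background and hand the Blob back.
onmessage = async (e) => {
  const resp = await fetch(e.data);
  postMessage(await resp.blob()); // Blobs can be cloned across threads
};
```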

Using blur.js (blurjs.js) to blur image hosted remotely

Is it possible to blur a remote image using http://www.blurjs.com?
We have our images hosted on a remote CDN, and we want to use blur.js to blur an image for a background effect. When we try to use blur.js directly on the remote image, JavaScript cannot read the file and throws an "unable to read image data" error.
The way I'm currently doing it is regenerating the image in PHP and then using blur.js, but it is very slow and consumes a lot of resources.
We've tried the CSS solution with filters too, but the browser runs too slowly when we do.
Does anybody have a solution?
Your problem is that pixel access in canvas is not allowed for images loaded from a different domain than the one the page is hosted on. What you need is a proxy script running on your server that allows your JavaScript to load images from other domains via your server. The downside is that all traffic will also run through your server, and the time to retrieve the image will increase (since the image first has to be loaded to your server and then sent to the client); unfortunately there is no way around that.
The good news is that this is a problem Flash developers had to face many years ago, so it has been solved many times.
Here's a PHP script, for example: http://www.abdulqabiz.com/blog/archives/2007/05/31/php-proxy-script-for-cross-domain-requests/
Here's a more recent implementation in Node.js http://codelikebozo.com/creating-an-image-proxy-server-in-nodejs
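In the same spirit as those links, a minimal Node.js proxy sketch. The whitelisted host is a placeholder, and a real deployment should validate the target URL carefully, or you have built an open proxy:

```javascript
// proxy.js — minimal image proxy sketch: images fetched through it appear
// same-origin to the page, so canvas (and blur.js) can read their pixels.
const http = require('http');
const https = require('https');

const ALLOWED_HOST = 'cdn.example.com'; // placeholder: your CDN's hostname

http.createServer((req, res) => {
  const target = new URL(req.url, 'http://localhost').searchParams.get('url');
  let hostname;
  try { hostname = new URL(target).hostname; } catch { hostname = null; }
  if (hostname !== ALLOWED_HOST) {
    res.writeHead(403);
    return res.end('forbidden');
  }
  https.get(target, (upstream) => {
    res.writeHead(upstream.statusCode, {
      'Content-Type': upstream.headers['content-type'] || 'application/octet-stream',
    });
    upstream.pipe(res); // stream the image bytes straight through
  });
}).listen(8080);
```

The page would then load the image as /proxy?url=https://cdn.example.com/photo.jpg, which the canvas can read.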

Image caching when using javascript

It's commonplace to use JavaScript like in this example:
Example: http://www.htmlite.com/JS019.php
My question is: does the second image download each time, or is it cached when the page first loads? If so, how does the server know to cache the image?
Are you asking whether the image is cached in your browser or on the server?
In both cases, the answer is yes if caching is on.
Strictly speaking, it is the browser that decides whether to keep the image, based on the caching headers (such as Cache-Control) the server sends with the response to the image request. The behaviour is therefore the same whether the image is referenced in your HTML or swapped in with JavaScript. This only applies when caching is enabled on the server, obviously.
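If you want to make sure the rollover image is already in the cache before the first mouseover, you can preload it. A minimal sketch (the filename is a placeholder for the second image in the rollover example):

```javascript
// Preload the rollover image once, so the first mouseover swap is instant
// and later uses hit the browser cache instead of the network.
const rollover = new Image();
rollover.src = 'button_over.png'; // placeholder filename
```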
