It's commonplace to use JavaScript like in this example:
Example: http://www.htmlite.com/JS019.php
My question is: does the second image download each time, or is it cached when the page first loads? If so, how does the server know to cache the image?
Are you asking whether the image is cached in your browser or on the server?
In both cases, the answer would be yes if caching is enabled.
Basically, the server knows the image should be cached when it receives a request from a client asking to download that image, so the behavior is the same with or without JavaScript on your HTML page. This only applies when caching is enabled on the server, obviously.
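If the image is served by a PHP script rather than as a plain static file, a minimal sketch like the following (the path and cache lifetime are placeholders, not anything from the original question) shows how the response headers tell the browser it may cache the image; for static files the web server normally adds such headers on its own:
<?php
// Hypothetical image-serving script: send caching headers so the
// browser can reuse the image instead of downloading it again.
$file = 'images/second-image.jpg'; // placeholder path
header('Content-Type: image/jpeg');
header('Cache-Control: public, max-age=86400'); // let the browser keep it for one day
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', filemtime($file)) . ' GMT');
readfile($file);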
I am attempting to populate a WebGL Earth with meshes that are compiled from images. These images are cross-domain, and hosted on a server where setting the appropriate headers isn't an option. Can I XMLHttpRequest the image urls, and then serve them back to myself via PHP to bypass CORS errors?
Or, more specifically, can I use my own webserver as a proxy to serve img urls back to myself (to get around CORS) in a WebGL context?
EDIT: The real question here is whether I can use my own webserver as a proxy to pass the URLs through, or whether I'll have to actually download each image to the server to then use it.
I had a similar issue once when using an API. First I tried to do everything in JS, probably getting the same error message as you do.
My solution was to switch to PHP and do it server-side, since modern browsers block what you are trying to do.
So yes, it is possible.
Get the pictures on the backend and then provide them to the frontend.
Simply retrieve the pictures first and then send them as output to the browser. You can do that synchronously by doing something like:
$ch = curl_init($imageUrl);                     // remote image URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the data instead of printing it
$pic = curl_exec($ch); // get the picture
curl_close($ch);
header('Content-Type: image/jpeg'); // adjust to the actual image type
echo $pic; // and then echo it
I've done this once but don't remember it exactly. Or you can do it asynchronously, which is what usually happens with img tags. I'm not sure how it works with WebGL, but it should be similar:
Download the pic to your filesystem
then provide the URL to the browser.
Whether you want to go in this direction then depends on how big the images are, how long you need them, and the API.
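A rough sketch of that download-and-serve-locally idea, assuming a hypothetical proxy.php script and a writable cache/ directory (URL validation and error handling left out for brevity):
<?php
// proxy.php?url=... -- hypothetical sketch; whitelist/validate the URL in real code
$remote = $_GET['url'];
$local  = 'cache/' . md5($remote) . '.jpg'; // local copy on your own domain
if (!file_exists($local)) {
    // download once; needs allow_url_fopen, otherwise use cURL as above
    file_put_contents($local, file_get_contents($remote));
}
echo json_encode(array('url' => $local)); // hand the same-origin URL back to the front end
Because the cached copy is then served from your own domain, the cross-origin restriction no longer applies when you load it as a texture.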
Answer to first comment:
Tricky. I don't have experience with WebGL Earth, so I can't say whether it's possible to load data into it asynchronously via Ajax (look here) or with AngularJS (look here). You would need to try that out. I'd especially look into the loading times.
Suppose there is an API call like http://example.com/api/get_image/65446 which downloads the picture, resizes it and then sends it to the browser.
What you would do in this case is:
Send the 'normal' page to the user
Then watch for the events for which you want to show pictures
When an event happens, use the API call I just mentioned and add the picture to your page in the success handler. Again, how that works with WebGL Earth is another question I can't answer.
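As a hedged illustration only (the script name, source URL pattern and target width are assumptions), such a get_image endpoint could look roughly like this in PHP with the GD extension:
<?php
// get_image.php?id=65446 -- hypothetical endpoint: fetch, resize, send to the browser
$id     = (int) $_GET['id'];
$source = imagecreatefromstring(file_get_contents("https://example.com/originals/$id.jpg"));
$small  = imagescale($source, 320); // resize to 320px wide, keeping the aspect ratio
header('Content-Type: image/jpeg');
imagejpeg($small, null, 85); // output the resized picture
imagedestroy($source);
imagedestroy($small);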
And if you want to use this on mobile devices, you also need to think about picture size. Since those screens are relatively small, you should shrink the pictures first. But then, how long does it take to fetch a picture? I guess this is the biggest challenge: someone scrolling the globe wants to see the pictures immediately, not after 5 seconds (by which time they have probably scrolled further).
Think about whether you can download and resize the pictures ahead of time. If you only want to show a fixed set of pictures, say 10,000 in total, I would do that. Then you don't need to worry as much about loading times or about when to delete which pictures. You should open another question for that topic, and first try out whether Ajax is possible.
Does someone know a technique for asynchronous image parsing when multiple images are embedded (base64) in a web page?
It causes a small freeze in Firefox during loading/parsing, even on a gaming machine (for more than 15 images, 1.5 MB), so I'm a bit worried about that.
Still, I think providing a URL and using JavaScript async (lazy) image loading is better. If someone has more information or tips, I'll be glad to hear them. Thanks.
The answer is: there is no way to control the browser's parsing speed for inlined base64 images (sent in the HTML HTTP response). If you inline a lot of images, the browser will use more CPU to parse the page.
If you have binary images, the solution is to request them separately: you will need a URL that serves the image data individually (not as static files).
The problem with this is getting the browser cache to work: if it doesn't, every time the page is loaded you will have to render every image on the page, overloading the web server.
Another option is to cache the images on the server side, but the client would still have to download each image every time, consuming the web server's bandwidth.
The browser cache can be activated with HTTP cache-control: https://css-tricks.com/snippets/php/intelligent-php-cache-control/
The best approach is to use cache-control, but also cache on the server side.
Of course this only applies to binary data; if you have image files, just let your web server serve them normally.
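A minimal sketch of that combined approach, assuming a hypothetical image.php endpoint, a writable cache/ directory and a render_image() helper standing in for whatever generates the image data:
<?php
// image.php?id=42 -- hypothetical: cache the rendered image on disk and let the browser cache it too
$id    = (int) $_GET['id'];
$cache = "cache/$id.png";
if (!file_exists($cache)) {
    file_put_contents($cache, render_image($id)); // render_image() is a placeholder for your own code
}
$etag = md5_file($cache);
header('Cache-Control: public, max-age=86400');
header('ETag: "' . $etag . '"');
if (isset($_SERVER['HTTP_IF_NONE_MATCH']) && trim($_SERVER['HTTP_IF_NONE_MATCH'], '"') === $etag) {
    http_response_code(304); // the browser's copy is still valid, send no body
    exit;
}
header('Content-Type: image/png');
readfile($cache);
The ETag/304 handling lets returning clients reuse their cached copy without re-downloading, while the disk cache saves the server from regenerating the image.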
I have a PhoneGap application in which I need to download certain images for offline use and show them inside an iframe. Is this possible, and do I need something like CorHTTPD (https://github.com/floatinghotpot/cordova-httpd) to serve the assets locally?
I have been trying to store the files on the file system, but when I try to show them (even outside an iframe), they don't show up. They do seem to be loaded (I can see them in the network console while remote debugging), though (of course) without any headers.
After spending more and more time on this and setting up GapDebug correctly to remotely debug my application, I was finally able to solve my problem by passing
{responseType: "arraybuffer"}
to AngularJS's $http.get method as a config parameter, as described here. Now I am able to get the images into an ArrayBuffer correctly and from there base64-encode them to be embedded in the HTML stored offline. A suitable solution for my case, at least.
I have a web page with an SWF file and an HTML button: when I click the button I want to save (download to my disk) the current image my SWF is showing (it is a sort of image gallery).
It works perfectly when the button is inside my SWF, but it fails when, through ExternalInterface, I call the image-saving method from JavaScript.
I verified the JS-AS communication (it's ok) and I know that FileReference.save() only works when triggered by a user event. Probably, the click on an HTML button is not considered a user event.
Short of changing the architecture (e.g., moving some code to the server side, sending the image to the server, then downloading it...), is there any way to simulate a user event? Any other solution or tip is appreciated.
NB: I would use a Flash button, but the HTML one is required.
Solution (or not as the case may be)
Flash based
Currently I would say your best bet is to stick with your button operating from within Flash. If you need to separate the button from your main Flash runtime, you could try what you are doing with two Flash embeds and communicate between them using LocalConnection. I wouldn't recommend this, however, as LocalConnection is a pain to get working well, and there is no guarantee that you won't come up against security sandbox problems working across two instances.
Server-side based
You could implement a save system that sends the image data back to a server and forms an actual URL that your front end could request. This would allow you to specify whatever you wanted for the download. The downsides are that it requires a server (so it won't work for offline apps), and it involves the hassle of sending the image data one way only to pull it back down again later...
I've gone into more detail about this here:
Canvas Image to an Image file
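For reference, a minimal sketch of that server-side route in PHP (the script name, form field and filename are made up): the front end posts the Base64 image data and the script sends it back with headers that force a download under a proper filename:
<?php
// save_image.php -- hypothetical: receive Base64 PNG data and return it as a named download
$data = base64_decode($_POST['image']); // expects raw Base64 (any data-URI prefix already stripped)
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="gallery-image.png"');
header('Content-Length: ' . strlen($data));
echo $data;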
HTML5 based
Currently I wouldn't recommend the data URL download I suggested in my comment, because it is not a complete solution yet. On the plus side, I'd keep an eye on what the top browsers are implementing, because that answer could change shortly.
Workings
Basically I just tried to implement an image download via a data URI (thinking this would be the best solution for your problem), which all works fine, and you could quite happily derive the Base64 data you need from your BitmapData object. However, there is no way to specify a filename along with the download, so you end up with rather ugly filenames that don't even have the correct extension.
Click to Download File
After researching a bit, it seems there is no workable workaround for this, but there are specifications ready to be implemented that would help:
<a download="filename.png" href="data:image/octet-stream;...">Download File</a>
The download attribute is designed for precisely the problem I mention above, and would allow naming of the download. Unfortunately I can't find a browser that implements it yet...
References
about the download attribute of an a tag
more about the download attribute of an a tag
stackoverflow : suggest a file name when using data uri
stackoverflow : force download an image using javascript
I have a web page that displays .gif images. I want to display only the first frame of each .gif (without animation), and I don't have still versions of them.
Is this possible?
UPDATE:
I want to do this on the client side. I don't have access to the server, i.e. the server gives me .gif images and I want to display only the first frame on my web page. Maybe there is a solution using JavaScript or CSS.
You may be able to use a JavaScript canvas, as pointed out by Boldewyn in the comments.
If that doesn't work out, I think you will have to do it server-side. One tool that can do this is ImageMagick. However, it needs to be present on your server and PHP needs to be able to access it.
Command line usage:
convert 'image.gif[0]' singleframe.gif
I'm sure PHP's ImageMagick extension can do this as well.
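Something along these lines should work with the Imagick extension (an untested sketch; the file names are placeholders):
<?php
// Extract the first frame of an animated GIF with Imagick
$gif = new Imagick('image.gif');
$gif->setIteratorIndex(0); // jump to frame 0
$frame = $gif->getImage(); // a new Imagick object holding just that frame
$frame->writeImage('singleframe.gif');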