I have an MVC.net website which serves images based on this old article:
http://blogs.msdn.com/b/miah/archive/2008/11/13/extending-mvc-returning-an-image-from-a-controller-action.aspx
My C# controller code looks like this:
public ActionResult GetThumbnail(string code)
{
    byte[] image = _dataProvider.GetThumbnailImage(code);
    return this.Image(image, "image/jpeg");
}
On the client side I have an AngularJS controller which loads up a search result set from the server. The result set includes a number of image URLs like this:
<tr ng-repeat="item in data.items | filter:filter" class="fixed-height-80">
<td>
<img ng-src="{{item.thumbnailUrl}}"/>
</td>
</tr>
The thumbnailUrl points to the GetThumbnail action on my MVC controller; it is constructed server-side in a model factory and returned to the Angular app ready for use.
The problem is that the images load very slowly even though they are only about 3 KB each. After the async search returns in the JavaScript, the images appear one at a time, about one per second, until they are all loaded.
I put a Stopwatch in the C# controller, and loading the image data from the dataProvider on the server side takes about 0.9 ms. But even with just ten images to serve, it takes about six seconds before all the images are loaded in the page. The JS renders the links almost immediately; it's just the images that are slow.
How can I speed up image loading in this context?
Update
If I move the images into ~/images/image.jpg and route the URL to point directly at the folder using Url.Content, it doesn't seem to have the same issue. The problem therefore seems to be with the way the controller is serving the images - sometimes it can take < 10 ms and other times over 2000 ms for the same image, but it's unclear why.
Several options are available:
Caching the images on the client side by adding Cache-Control HTTP headers to the response.
Caching the images in memory on the server instead of carrying out an I/O operation against the file system (IIS will cache frequently used images and serve them from memory).
Compressing the HTTP response's resources, using GZIP for example (you should do this on your web server).
Reducing image size/quality on the server (perhaps while uploading them).
Using the CSS image sprite technique: an image sprite is a collection of images combined into a single image, which reduces the number of images the browser needs to load.
Using a dedicated CDN to serve your images faster and, in turn, reduce the load on your server.
You should decide what's the best choice in your case.
UPDATE:
Another 2 options:
With ng-repeat, each iteration of your results effectively hits the server instead of the browser cache - and there could be hundreds of records. It would be better to send your images with Cache-Control HTTP response headers (A Beginner's Guide to HTTP Cache Headers) to prevent fetching the images from the server again and again and to reduce repeated round-trips.
Pre-load your images into the browser cache: http://perishablepress.com/3-ways-preload-images-css-javascript-ajax/
Consider resizing images once, when they are uploaded, and storing the thumbnails on the server or a CDN alongside the original version. This reduces server load and makes serving an image as fast as returning a static file, with no processing every time an image is requested.
Use a proper CDN, or just serve the images directly from virtual directories. IIS is good and fast as a web server, and the .NET Framework is fine as a server-side technology.
A properly configured IIS will help you save bandwidth and file-system I/O; if you need resizing capabilities, check out Image Resizer.
It turns out that it's the MVC session state that causes this issue - although it's not entirely clear why at this stage.
Adding a new controller just for images and decorating it as below will disable the default session state behaviour and prevent the images getting stuck 'waiting' for ages.
[SessionState(System.Web.SessionState.SessionStateBehavior.Disabled)]
public class ImageController : Controller
{
    public ActionResult GetThumbnail(string code)
    {
        byte[] image = _dataProvider.GetThumbnailImage(code);
        return this.Image(image, "image/jpeg");
    }
}
Related
In my webapp the user has the option to download a file containing some data, which they do by clicking on a button. For small amounts of data the file starts downloading pretty much immediately and that shows in the browser's download area. Which is good.
For large amounts of data it can take the server a substantial amount of time to calculate the data, even before we start downloading. This is not good. I want to indicate that the calculation is in progress. However I don't want to put a "busy" indicator on my UI, because the action does not block the UI - the user should be able to do other things while the file is being prepared.
A good solution from my point of view would be to start the download process before I have finished the calculation. We always know (or can quickly calculate) the first few hundred bytes of the file. Is there a mechanism where I can have the server respond to a download request with those few bytes, thus starting the download and making the file show up in the download area, and provide the rest of the file when I have finished calculating it? I'm aware that it will look like the download is stalled, and that's not a problem.
I can make a pretty good estimate of the file size very quickly. I would prefer not to have to use a third-party package to achieve this, unless it's a very simple one. We are using Angular but happy to code raw JS if needed.
To indicate that the link points to a download on the client, the easiest way is the download attribute on the link. The presence of the attribute tells the browser not to unload the current tab or create a new one; the value of the attribute is the suggested filename.
For the back-end part, after setting the correct response headers, just write the data to the output stream as it becomes available.
You asked for a general solution
1) First, in your HTML/JS you can prevent the UI from being blocked by setting your download target to another browsing context; the preferred way of doing this is to set the target to an IFRAME:
<!-- your link must target the iframe "downloader-iframe" -->
<a href="../your-file-generator-api/some-args?a=more-args" target="downloader-iframe">Download</a>
<!-- you don't need the file to be shown -->
<iframe id="downloader-iframe" style="display: none"></iframe>
2) Second, at your back-end you'll have to use both the Content-Disposition and (optionally) Content-Length headers. Be careful with Content-Length: if you miscalculate the file size, the download will fail. If you don't use Content-Length, you won't see the download progress.
3) Third, at your back-end you have to make sure you write your bytes directly to the response! That way your browser and your web server will know that the download is "in progress".
Example for Java:
Using ServletOutputStream to write very large files in a Java servlet without memory issues
Example for C#:
Writing MemoryStream to Response Object
How these three steps are built is up to you and the frameworks and libraries you are using; for example, Dojo and jQuery have great IFRAME manipulation utilities, although you can do the coding yourself. This is a jQuery sample:
Using jQuery and iFrame to Download a File
Also:
Adding a "busy" animation is ok! you just have to make sure that it's not blocking you'r UI, something like this:
I have a web app with offline functionality.
I managed to get the manifest working properly with all the required assets; css, js and images.
However I use angular to get from the server a list of user generated images, in a similar manner:
$scope.images = angular.fromJson(localStorage.getItem('images')) || []
$http
    .get('/my-rest-endpoint')
    .then(function(res){
        $scope.images = res.data
        localStorage.setItem('images', angular.toJson(res.data))
    }).catch()
I use the localStorage to keep the list even if the user is offline,
however I would like to have these pictures included in my offline cache...
Any idea how to achieve this?
BONUS:
In the future there may also be video files. What to do in that case?
This is one of the many offline challenges. Your options are:
1) Add images paths to manifest.appcache:
Dynamically generate your manifest.appcache, inserting references to the images you want to offline. You could use the appcache-nanny to check for updates.
2) Store images in localStorage:
This might be okay for a few images, but keep in mind localStorage capacity ranges from 2.5mb to 10mb depending on the device/browser, so that rules out video.
3) Use localForage with offline-fetch:
localForage is a wrapper around IndexedDB/WebSQL that could give you up to 50mb client-side storage. offline-fetch is a wrapper around fetch (the XMLHttpRequest replacement).
If you request your images using offline-fetch configured to use localForage, it will take care of serving them from disk when it detects the app is offline, but this means loading your images via JS rather than an img src attribute.
I have a javascript file which internally calls a function to load an xml file.
$(document).ready(function()
{
    urlVal = "web/help.xml";
});
The JavaScript is versioned so that the browser always loads the latest version instead of caching it:
"./js/help_min.js?ver=${verNumber}"
I am facing an issue where the browser downloads the latest JS file but uses a cached copy of the help.xml that the JS loads.
Is there a way to make the browser always load the latest "help.xml" rather than caching it?
The proper approach would be to fix the backend to send headers telling the browser not to cache the data (see e.g. How to control web page caching, across all browsers?). But if you cannot do that, make the request unique each time, e.g.
"./js/help_min.js?ver=${verNumber}&random=${something_random}"
where ${something_random} can be, for example, the current timestamp (with milliseconds). That way your request will never match a cache entry, forcing a fetch on each request.
PS: you also seem to have a design flaw, as by that logic the same ${verNumber} should return the same data, in which case caching would be more than welcome to reduce traffic and speed up loading time.
The problem
My website fails to load random images at random times.
Intermittent failure to load image with the following error in console:
"GET example.com/image.jpg net::ERR_CONTENT_LENGTH_MISMATCH"
Image either doesn't load at all and gives the broken image icon with alt tag, or it loads halfway and the rest is corrupted (e.g. colors all screwed up or half the image will be greyed out).
Setup
Litespeed server, PHP/mySQL website, with HTML, CSS, Javascript, and JQuery.
Important Notes
Problem occurs on all major web browsers - intermittently and with various images.
I am forcing UTF-8 encoding and HTTPS on all pages via htaccess.
Hosting provider states that all permissions are set correctly.
In my access log, when an image fails to load, it gives a '200 OK' response for the image and lists the bytes transferred as '0' (zero).
It is almost always images that fail to load but maybe 5% of the time it will be a CSS file or Javascript file.
Problem occurred immediately after moving servers from Apache to Litespeed and has been persistent over several weeks.
Gzip and caching enabled.
This error indicates a definite mismatch between the data advertised in the HTTP headers and the data transferred over the wire.
It could come from the following:
Server: a server module may have a bug that changes the content but doesn't update the Content-Length header, or simply doesn't work properly.
Proxy: any proxy between you and your server could be modifying the response without updating the Content-Length header.
It can also happen if the wrong content-type is set.
As far as I know, I haven't seen this problem in IIS/Apache/Tomcat, but mostly in custom-written code (writing the image to the response stream yourself).
It could even be caused by your ad blocker.
Try disabling it, or add an exception for the domain the images come from.
Suggest accessing the image as a discrete URL using cURL, e.g.
php testCurlimg >image.log 2>&1 to see exactly what is being returned by the server. Then you can move up one level to test the webpage
php testCurlpg >page.log 2>&1 to see the context for mixed data
I just ran into this same ERR_CONTENT_LENGTH_MISMATCH error. I optimized the image and that fixed it. I did the image optimization using ImageOptim but I'm guessing that any image optimization tool would work.
Had this problem today retrieving images from Apache 2.4 when using a proxy I wrote in PHP to provide a JWT auth gateway for accessing a CouchDB backend. The proxy uses PHP fsockopen, and the fread() buffer was set relatively low (30 bytes) because I had seen this value used in other people's work and never thought to change it. In all my failing JPG (JFIF) images, the discrepancy between the original and the served image was a series of CRLFs matching the size of the fread buffer. Increasing the byte length of the buffer made the problem go away.
In short, if the fread buffer streaming the image is completely full of carriage returns and line feeds, the data gets truncated. This possibly also relates to Collin Krawll's post about why image optimization resolved that problem.
I'm currently trying to preload images for a webpage I'm creating as those images are quite big.
Currently I know (thanks to another post here) how to handle the images themselves via preloading them (via javascript pre loading and then displaying them in a canvas).
BUT whenever I switch the page the preloaded images need to be preloaded again, thus they are not cached.
So my question is: Is there any possibility to cache these images?
(or is it even best to put them into a session variable?)
The images themselves are quite big and can take up 1.5MB each (in total there are 20 images alone in the part that is currently already in existence, which takes about 4 seconds to preload).
Some info, if necessary:
I'm using an apache server and php as primary language with javascript as support.
Edit:
As I forgot to mention: the web server I will finally host the site on is an external one (hosting provider), so I won't be able to edit the web server settings there.
If the images don't change, try something like this in .htaccess:
#Set caching on image files for 11 months
<filesMatch "\.(ico|gif|jpg|png)$">
ExpiresActive On
ExpiresDefault "access plus 11 month"
Header append Cache-Control "public"
</filesMatch>
If you think this is not the right approach, like the images may change, just eager-load the images right when the page hits (warning, definitely a hack):
(function(){
var hiddenCache = document.createElement("div");
hiddenCache.style.display = "none";
document.body.appendChild(hiddenCache);
// or for loop if ECMA 3
myEagerLoadedImageUrls.forEach(function(urlStr){
var hiddenImg = document.createElement("img");
hiddenImg.src = urlStr;
hiddenCache.appendChild(hiddenImg)
});
})()
The browser already caches images in its memory and/or disk cache as long as the headers coming from the server aren't telling it to avoid caching, and the browser cache endures across page loads. So, if your images were loaded once on the first page, they should already be in the browser cache for the second page; when requested there, they should load locally rather than be fetched over the internet.
If you're looking for client-side code that can be used to preload images, there are many examples:
How do you cache an image in Javascript
Image preloader javascript that supports events
Is there a way to load images to user's cache asynchronously?
FYI, in newer browsers it is possible to use a combination of Local Storage and data URIs to implement your own image caching, but I'd be surprised if there were any real-world situation where that was required; if you have a lot of images, you may hit Local Storage limits sooner than browser-cache size limits.