I have a web app with offline functionality.
I managed to get the manifest working properly with all the required assets: CSS, JS and images.
However, I use Angular to fetch a list of user-generated images from the server, along these lines:
$scope.images = angular.fromJson(localStorage.getItem('images')) || [];
$http
  .get('/my-rest-endpoint')
  .then(function (res) {
    $scope.images = res.data;
    localStorage.setItem('images', angular.toJson(res.data));
  })
  .catch(function () { /* offline or request failed: keep the cached list */ });
I use localStorage to keep the list even when the user is offline; however, I would like to have these pictures included in my offline cache as well...
Any idea how to achieve this?
BONUS:
In the future there may also be video files. What to do in that case?
This is one of the many offline challenges. Your options are:
1) Add image paths to manifest.appcache:
Dynamically generate your manifest.appcache, inserting references to the images you want available offline. You could use appcache-nanny to check for updates.
2) Store images in localStorage:
This might be okay for a few images, but keep in mind localStorage capacity ranges from 2.5MB to 10MB depending on the device/browser, so that rules out video.
3) Use localForage with offline-fetch:
localForage is a wrapper around IndexedDB/WebSQL that could give you up to 50MB of client-side storage. offline-fetch is a wrapper around fetch (the XMLHttpRequest replacement).
If you request your images using offline-fetch configured to use localForage, it will take care of serving them from disk when it detects the app is offline, but this means loading your images via JS rather than an img src attribute (see the sketch just below).
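A minimal sketch of the idea behind option 3, assuming localForage is loaded on the page; cachedImageUrl is a hypothetical helper, not part of either library:

function cachedImageUrl(url) {
  // Look for a previously cached blob in IndexedDB/WebSQL via localForage
  return localforage.getItem(url).then(function (blob) {
    if (blob) return blob;
    // Not cached yet: fetch it once while online and store the blob for offline use
    return fetch(url)
      .then(function (res) { return res.blob(); })
      .then(function (blob) { return localforage.setItem(url, blob); }); // resolves with the stored blob
  }).then(function (blob) {
    return URL.createObjectURL(blob); // assign this to the img element from JS
  });
}

Usage would be something like cachedImageUrl('/uploads/1.jpg').then(function (src) { imgEl.src = src; }), which is what "loading your images via JS" means in practice.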
Related
I'm working with an extremely old database system containing people and associated PDFs. I can access most data over a web browser; however, the PDFs cannot be requested via a web API. I do, however, have the liberty of loading any JavaScript library and using the Chrome webdev console.
I want to get a proof of principle working, where I can load a person's PDFs. But I'm not quite sure what the best approach is.
My idea was to upload a file to the website's local storage in my browser (since it's viewed several times). However, I seem to be lacking a good library to save/load files from the cache directory: the library I found hasn't been updated since 2016, and FileSaver.js doesn't seem to be keen on loading the files back after saving them. Using a fully-fledged database implementation seems like overkill (most files will be <= 5-10MB).
Loading a file from local storage (even if the directory is added to the workspace in Chrome) seems completely impossible; that would have been an alternative.
Adding the local file path to an <a href=""> element did not work in Chrome.
Is there a feasible approach to manage/associate PDF files on my local drive with the website I'm working with (and have full client-side control, i.e. can load any library)?
Note: access is restricted; no Chrome add-ons can be used and Chrome cannot be started with custom flags.
I don't know exactly what you are asking for, but this code will get all the PDFs in a selected directory, display them, and also build a list of all the File objects. It will only work in a "secure context" and in Chrome
(it also won't run in a sandbox like a Stack Overflow code snippet).
JavaScript:
let files = [];
async function r() {
  // Ask the user to pick a directory (requires a secure context and a user gesture)
  const dir = await window.showDirectoryPicker();
  for await (const [, handle] of dir.entries()) {
    // Skip sub-directories; only file handles have getFile()
    if (handle.kind === "file") files.push(await handle.getFile());
  }
  // Keep only the PDFs
  files = files.filter(f => f.type === "application/pdf");
  for (const f of files) {
    // Show each PDF in an <embed> backed by an object URL
    const e = document.createElement("embed");
    e.src = URL.createObjectURL(f);
    e.type = f.type;
    document.body.appendChild(e);
  }
}
HTML:
<button onclick="r()">read PDFs</button>
You can probably also build on this if you need to send the local PDF somewhere.
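If you do need to push one of those File objects to a server, a minimal sketch could look like this (the /upload endpoint is an assumption, not something from the question):

function uploadPdf(file) {
  // Wrap the File in multipart form data and POST it
  var form = new FormData();
  form.append('pdf', file, file.name);
  return fetch('/upload', { method: 'POST', body: form });
}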
Not sure this answers the question, but I hope it helps.
Since ActiveX controls are no longer available, browsers can display a PDF natively or the user can download it.
For any more control over that, I suspect you could try rendering the PDF using a JavaScript library like https://mozilla.github.io/pdf.js/
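As a rough sketch of what that looks like (API names as in the pdf.js docs for recent 2.x versions; '/path/to/file.pdf' is a placeholder), you could render page 1 of a PDF into a canvas like this:

pdfjsLib.getDocument('/path/to/file.pdf').promise
  .then(function (pdf) { return pdf.getPage(1); })
  .then(function (page) {
    var viewport = page.getViewport({ scale: 1.5 });
    var canvas = document.createElement('canvas');
    canvas.width = viewport.width;
    canvas.height = viewport.height;
    document.body.appendChild(canvas);
    // Draw the page into the canvas
    return page.render({ canvasContext: canvas.getContext('2d'), viewport: viewport }).promise;
  });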
For full control, since you won't be in a position to control the PDF viewer version, you could alternatively render the PDFs to images on the server and display image versions of the pages.
I'm working in Angular 2. In my app, I have to load an excessive number of images. But instead of loading those images on every reload, I want to put them into the browser cache, in key-value pair form.
I want something like the following:
<img src=getImageFromCache('image-key')/>
And when the API returns an image URL, I want to put that image in the browser cache like the following:
addImageToBrowserCache('image-key', 'image-url')
Any kind of help, please?
Regards
For this you can use the browser's localStorage, like so:
localStorage['img-key'] = 'img-url';
or
localStorage.setItem('img-key','img-url')
Same for retrieving.
localStorage.getItem('img-key')
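If it helps, here is a minimal sketch of the two helpers named in the question, backed by localStorage; note this only caches the URL string, not the image bytes themselves (the browser's own HTTP cache still handles the actual image data):

function addImageToBrowserCache(key, url) {
  localStorage.setItem(key, url);
}
function getImageFromCache(key) {
  return localStorage.getItem(key); // returns null if the key was never stored
}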
How can I secure the src path of an image when someone clicks on inspect element, so that the user cannot find out the actual src path? Please help me with a solution; it should be done with JavaScript only, and no other tags should be used.
You can convert the image into a base64 data URI for embedding images.
Use: http://websemantics.co.uk/online_tools/image_to_data_uri_convertor/
Code sample:
.sprite {
background-image:url(data:image/png;base64,iVBORw0KGgoAAAA... etc );
}
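If you prefer to produce the data URI yourself rather than with the online tool, here is a minimal sketch (do the conversion ahead of time or server-side, otherwise the original URL is still visible in the network tab):

function toDataUri(url) {
  return fetch(url)
    .then(function (res) { return res.blob(); })
    .then(function (blob) {
      return new Promise(function (resolve, reject) {
        var reader = new FileReader();
        reader.onload = function () { resolve(reader.result); }; // e.g. "data:image/png;base64,..."
        reader.onerror = reject;
        reader.readAsDataURL(blob);
      });
    });
}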
This is commonly done server-side, where you have an endpoint that serves the image file to you as bytes...
You can store the images in a private location on the server that IIS/<your favourite web server> doesn't serve directly, so that only a web app running on it with the required privilege can read them.
Alternatively, people also "store" the images in the database itself and load them directly from there.
In either case, the response sent back has to be a stream of bytes with the correct MIME type.
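As an illustration of the pattern only (the linked samples are ASP.NET; this is a hypothetical Node/Express equivalent, and resolveImagePath stands in for your own lookup against the file system or database):

var express = require('express');
var fs = require('fs');
var app = express();

app.get('/images/:id', function (req, res) {
  var file = resolveImagePath(req.params.id); // hypothetical: map the id to a private path
  res.type('image/jpeg');                     // the correct MIME type is what matters
  fs.createReadStream(file).pipe(res);        // stream the raw bytes back
});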
Edit:
Here are a couple of links to get you started if you are into ASP.NET:
http://www.codeproject.com/Articles/34084/Generic-Image-Handler-Using-IHttpHandler
http://aspalliance.com/1322_Displaying_Images_in_ASPNET_Using_HttpHandlers.5 <- this sample actually does it from a database.
Don't let the choice of server-side framework (ASP.NET, PHP, Django, etc.) hinder you. Search for similar techniques in your framework of choice.
Edit:
Another way, if you're thinking of HTML5 canvas, is shown here: http://www.html5canvastutorials.com/tutorials/html5-canvas-images/
However, you run into the same problem: someone can view the image URL if they can see the page source. You'll have to revert to the above approach eventually.
I have an MVC.net website which serves images based on this old article:
http://blogs.msdn.com/b/miah/archive/2008/11/13/extending-mvc-returning-an-image-from-a-controller-action.aspx
My C# controller code looks like this:
public ActionResult GetThumbnail(string code)
{
byte[] image = _dataProvider.GetThumbnailImage(code);
return this.Image(image, "image/jpeg");
}
On the client side I have an AngularJS controller which loads up a search result set from the server. The result set includes a number of image URLs like this:
<tr ng-repeat="item in data.items | filter:filter" class="fixed-height-80">
<td>
<img ng-src="{{item.thumbnailUrl}}"/>
</td>
</tr>
The thumbnailUrl points to the GetThumbnail action on my MVC controller and is constructed server side in a model factory and returned to the Angular ready for use.
The problem is that the images load very slowly even though they are only about 3KB per image. After the async search return is completed in the JavaScript, the images appear one at a time, about one per second, until they are all loaded.
I put a Stopwatch in the C# on the .NET controller and loading the image data from the dataProvider on the server side takes about 0.9ms. But even with just ten images to serve, it takes about six seconds before all the images are loaded in the page. The JS renders the links almost immediately; it's just the images that are slow.
How can I speed up image loading in this context?
Update
If I move the images into ~/images/image.jpg and route the URL to point directly at the folder using Url.Content, it doesn't seem to have the same issue. Therefore the problem seems to be with the way the controller is serving the images: sometimes it can take < 10ms and other times over 2000ms for the same image, but it's unclear why.
Several options are available:
Caching the images in the browser/client side by adding Cache-Control HTTP response headers.
Caching the images on the server side (in memory) instead of carrying out an I/O operation against the file system (IIS will cache frequently used images and serve them from memory).
Compressing the HTTP response's resources using GZIP, for example (you should do this in your web server).
Reducing image size/quality on the server (maybe while uploading them).
Using the CSS image sprite technique: an image sprite is a collection of images combined into a single image, thereby reducing the number of images the browser needs to load.
Using a dedicated CDN to serve your images faster and, in turn, reduce the load on your server.
You should decide what's the best choice in your case.
UPDATE:
Another 2 options:
With ng-repeat, each iteration of your results effectively hits the server instead of the browser cache, and there could be hundreds of records. It would be better to send your images with Cache-Control HTTP response headers (A Beginner's Guide to HTTP Cache Headers) in order to prevent hitting the server again and again for the images and to reduce the repeated round-trips (see the example headers after this list).
Pre-load your images into the browser cache: http://perishablepress.com/3-ways-preload-images-css-javascript-ajax/
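For illustration only, the response headers for a thumbnail could look something like this (the exact values and how you set them depend on your server; a 30-day max-age is just an example):

Cache-Control: public, max-age=2592000
Content-Type: image/jpeg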
Consider the option of resizing images once, when they are uploaded to the server, and storing the thumbnail images on the server or a CDN alongside the original version. This reduces server load and makes loading an image as fast as fetching one without any processing every time it is requested.
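As a sketch of that idea in Node using the sharp library (the question's stack is ASP.NET, so treat this purely as an illustration of the approach; the paths are placeholders):

var sharp = require('sharp');

// Run once at upload time, so thumbnail requests need no per-request processing
function makeThumbnail(originalPath, thumbPath) {
  return sharp(originalPath)
    .resize(200, 200, { fit: 'inside' }) // keep the aspect ratio within 200x200
    .toFile(thumbPath);
}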
Use a proper CDN, or just serve the images directly from virtual directories. Your current stack is fine: IIS is good and fast as a web server, and the .NET Framework is good as a server-side technology.
A properly configured IIS will help you save bandwidth and file system I/O. If you need resizing capabilities, check out Image Resizer.
It turns out that it's the MVC SessionState that causes this issue, although it's not entirely clear why at this stage.
Adding a new controller just for images and decorating it as below will disable the default session state behaviour and prevent the images getting stuck 'waiting' for ages.
[SessionState(System.Web.SessionState.SessionStateBehavior.Disabled)]
public class ImageController : Controller
{
public ActionResult GetThumbnail(string code)
{
byte[] image = _dataProvider.GetThumbnailImage(code);
return this.Image(image, "image/jpeg");
}
}
I'm currently trying to preload images for a webpage I'm creating as those images are quite big.
Currently I know (thanks to another post here) how to preload the images themselves via JavaScript and then display them in a canvas.
BUT whenever I switch pages, the preloaded images need to be preloaded again, so they are not cached.
So my question is: Is there any possibility to cache these images?
(or is it even best to put them into a session variable?)
The images themselves are quite big and can take up 1.5MB each (in total there are 20 images alone in the part that is currently already in existence, which takes about 4 seconds to preload).
Some info, if necessary:
I'm using an Apache server and PHP as the primary language, with JavaScript as support.
Edit:
As I forgot to mention it: the web server I will finally host the site on is an external one (hosting provider), so I won't be able to edit the web server settings themselves there.
If the images don't change, try something like this in .htaccess:
#Set caching on image files for 11 months
<FilesMatch "\.(ico|gif|jpe?g|png)$">
ExpiresActive On
ExpiresDefault "access plus 11 months"
Header append Cache-Control "public"
</FilesMatch>
If you think this is not the right approach, e.g. because the images may change, just eager-load the images right when the page loads (warning: definitely a hack):
(function(){
    // Hidden container so the images get requested (and cached) without being shown
    var hiddenCache = document.createElement("div");
    hiddenCache.style.display = "none";
    document.body.appendChild(hiddenCache);
    // myEagerLoadedImageUrls: your array of image URL strings
    // (use a plain for loop instead of forEach if you need ECMAScript 3 support)
    myEagerLoadedImageUrls.forEach(function(urlStr){
        var hiddenImg = document.createElement("img");
        hiddenImg.src = urlStr;
        hiddenCache.appendChild(hiddenImg);
    });
})();
The browser already caches images in its memory and/or disk cache as long as the headers coming from the server aren't telling it to avoid caching. The browser cache endures across page loads. So, if your images have been loaded once on the first page, they should already be in the browser cache for the second page; when requested there, they should load locally and not have to be fetched over the internet.
If you're looking for client-side code that can be used to preload images, there are many examples:
How do you cache an image in Javascript
Image preloader javascript that supports events
Is there a way to load images to user's cache asynchronously?
FYI, it is possible in newer browsers to use a combination of Local Storage and data URIs to implement your own image caching, but I'd be surprised if there was any real-world situation where that was required; and if you have a lot of images, you may run into Local Storage limits sooner than the browser cache's size limits.