Local storage of files using HTML5 - JavaScript

I was fascinated by Google Gears and its potential use in online game development, particularly massive online game development. One could take the game resources and store them locally using ResourceStore, thus reducing game load time, server bandwidth issues, etc. I have therefore welcomed the news that HTML5 supports offline storage.
However, from what I can tell, it only supports a manifest file for resource caching, which looks like what ManagedResourceStore is supposed to offer. I didn't study either in detail.
I also didn't find anything that would allow programmatically loading and caching resources (as ResourceStore apparently allows).
Is it possible to programmatically control which resources should be cached? Or would I have to store each "map" on a separate page with a separate manifest file (with this in fact being done through a server-side script, instead of literally creating pages and manifests -- this is still fugly)?
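For reference, the declarative approach I'm describing would look something like this -- one manifest per page, referenced via <html manifest="level1.appcache">, with the file names here being purely hypothetical placeholders generated by a server-side script:

    CACHE MANIFEST
    # level1.appcache - hypothetical per-map manifest
    CACHE:
    maps/level1/tiles.png
    maps/level1/music.ogg
    js/game.js
    NETWORK:
    *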

If you're asking about design work, it sounds like you're looking for Programmable HTTP Caching and Serving. If you actually want to know whether something like this exists in shipping browsers, I don't know, but I doubt it.

Related

Load libraries, scripts, etc for the website from external or internal sources?

I've just started my adventure with frontend development, most likely web design. I've been struggling with one technical question and haven't yet found a reasonable answer.
There are so many libraries you can load or download to make your web development faster. Hence my question:
Is it better to link to these libraries (e.g. Bootstrap, jQuery, Angular, fonts from Google and so on) externally from the official source, or to download them, upload them to your own server, and link to the file there (an internal source)?
My intuition tells me that downloading them, uploading them to my server, and linking to them there would make the whole website load quicker. Is that good thinking?
Pros of linking to externally hosted resources (be they JS libraries, images or whatever):
Spreading the load: your server doesn't have to serve all the content; it can concentrate on its main functionality and serve the application itself.
Spreading the HTTP connections: since applications work more and more asynchronously, it is good to use the maximum number of parallel HTTP connections per site/subdomain for application data and to load all additional resources from other servers.
As Rafael mentioned above, CDNs scale very well and seldom go offline.
Cons
Even with fast Internet connections, there is a good chance that resources will be served faster when they are located on the same intranet. That's why some companies run their own "micro-CDNs" inside their local networks, combining the advantages of multiple servers with local availability.
External dependency: as soon as the Internet connection becomes unavailable or a proxy server goes down, all external resources become unavailable, leaving the application in a broken state.
Sometimes it may actually be faster to link to an external source. That's because the browser caches recently accessed data, and many sites use Bootstrap, jQuery and the like, so a popular library may already be in the user's cache. This is less likely with less popular libraries.
Keep in mind, though, that since you're loading from external sources, you're at the mercy of their servers. If for one reason or another a source goes offline, your page won't work correctly. CDNs are not supposed to go offline for that very reason, but it's good to be aware of the risk. Also, if you're working on your page while offline, you won't be able to load those resources during development.
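One common middle ground is to load from the CDN and fall back to a local copy when it fails; here's a minimal sketch (the paths and version are placeholders, not anything prescribed):

    <script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
    <script>
      // If the CDN request failed, window.jQuery is undefined;
      // fall back to the locally hosted copy.
      window.jQuery || document.write('<script src="/js/jquery-3.6.0.min.js"><\/script>');
    </script>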
For security, it is generally better to host these files locally when developing an application, so that you do not have to depend on the third-party server hosting the CDN.
Performance-wise, using a CDN can be beneficial because the libraries you require may already be cached in the user's browser, saving the time needed to fetch the file; locally hosted files will definitely have to be downloaded at least once, costing time and bandwidth.
https://halfelf.org/2015/cdn-vs-local/
https://www.sitepoint.com/7-reasons-not-to-use-a-cdn/
I agree with Rafael's answer above, but wanted to note a few benefits of serving up these libraries locally that he or she omitted.
It is still considered best practice (until HTTP/2 becomes widespread) to minimize the number of downloads your site requires by concatenating many files into a single file. So, if you are using three JavaScript libraries/frameworks (for instance, Angular, jQuery and Moment.js) from a CDN, that is three separate script elements pulling down three separate .js files. If you host them locally, however, your build process can include a step that bundles the three libraries into a single file called "vendor.js" or something along those lines, as sketched below. This has the added bonus of simplifying dependency loading to some degree, since you can concatenate them in a specific order if the need arises.
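A minimal sketch of such a bundling step in Node (the file names are hypothetical, and a real build would also minify):

    // build.js - naive concatenation sketch; order matters for dependencies.
    const fs = require('fs');

    const libs = [
      'vendor/jquery.min.js',   // jQuery first, since others may depend on it
      'vendor/angular.min.js',
      'vendor/moment.min.js',
    ];

    // Join with a semicolon to guard against files missing a trailing one.
    const bundle = libs.map((p) => fs.readFileSync(p, 'utf8')).join('\n;\n');
    fs.writeFileSync('dist/vendor.js', bundle);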
Finally, although it is a little advanced if you are just getting started, if you are considering hosting your library files with your project, it is definitely worth looking into Bower (https://bower.io/docs/api/) -- a Node-based package manager that lets you declare which packages your project needs and install them with a single command. It is particularly useful for keeping unnecessary library files out of your version control. Good luck!
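For instance, a minimal bower.json might look like this (the name and versions are purely illustrative); running bower install then pulls everything into bower_components, which you can keep out of version control:

    {
      "name": "my-app",
      "dependencies": {
        "jquery": "^3.1.0",
        "angular": "^1.5.0",
        "moment": "^2.15.0"
      }
    }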

Building web-apps with offline asset capabilities

Since being asked to develop a web-app for someone, I have been thinking about the whole project. One of the main things the frontend needs is the ability to operate offline. At first, it seemed that maintaining the application offline would be easy:
Important information from the database could be replicated into IndexedDB.
Storage APIs would be useful for storing tidbits of info.
The Application Cache could handle storing assets offline.
My ideas seemed solid, until I did some research. The application cache has been deprecated. Apparently, it had some issues and wasn't as great as I thought. Now it seems nearly impossible to build offline apps. Through research and thought, I have considered a few solutions, but they all have some sort of flaw.
One article I read considered using localStorage for storing assets. This seems OK, I guess, since the application would be single-paged, but assets such as CSS, JavaScript libraries, and images would be large, and while I could compress them, it seems kind of hacky to store them as strings in localStorage.
MDN pointed me to service workers. These seem good, but also overcomplicated, and their browser support just won't work for me.
I considered using the File API instead of localStorage to handle assets. The problem is that the File API only seems to work with user interaction such as file upload or drag and drop, which is not what I need; I would need to write to files from JavaScript behind the scenes. Even then, I would expect a performance hit, especially for users with slower disks.
As you can see from my solutions, one of the main factors is speed. I suppose a procedure like this could be isolated from the main application using Web Workers, but even then, storing files in localStorage doesn't feel right.
I don't believe that any of these solutions are viable, but I cannot be sure. How should I go about storing assets for offline applications? Ideally I would like mobile support, but for now I am looking for a solution that:
Will not seriously degrade performance
and
Looks semantically sound and does not rely on hacks or bad practices.
What solutions do I have available? Are any of my above solutions decent?
Application Cache was only deprecated by Firefox a couple of weeks ago, which seems a rather rash move on their part, as they haven't finished the replacement yet! See https://www.fxsitecompat.com/en-US/docs/2015/application-cache-api-has-been-deprecated/ and https://bugzilla.mozilla.org/show_bug.cgi?id=1204581 (in particular: "We have not shipped service workers yet, and our implementation of Cache API without service workers is not quite useful for replacing appcache yet.")
I reckon it will be at least a couple of years until AppCache is removed from browsers, and as you've discovered, right now it's your only real choice for cross-browser compatibility. As Service Workers become more mature, a wrapper will probably be developed to ease transition from AppCache to the SW equivalent. (Sounds possible, at least: Is Service Worker intended to replace or coexist with Appcache?)
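For reference, the Service Worker equivalent of a simple AppCache manifest is roughly a pre-cached asset list plus a cache-first fetch handler; a minimal sketch (the cache name and asset list are placeholders):

    // sw.js - minimal cache-first service worker sketch
    const CACHE_NAME = 'offline-assets-v1';
    const ASSETS = ['/', '/app.css', '/app.js'];

    self.addEventListener('install', (event) => {
      // Pre-cache the asset list when the worker is installed.
      event.waitUntil(
        caches.open(CACHE_NAME).then((cache) => cache.addAll(ASSETS))
      );
    });

    self.addEventListener('fetch', (event) => {
      // Serve from the cache first, falling back to the network.
      event.respondWith(
        caches.match(event.request).then((hit) => hit || fetch(event.request))
      );
    });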
Which brings me to my next point: for offline database stuff, I recommend localForage, Mozilla's wrapper for various offline storage options. It chooses the best available option in the user's browser, saving you the hassle of deciding. I've just used it on a project, and it's really simple to use; see the sketch below. https://mozilla.github.io/localForage/
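A quick sketch of the API (the key and value here are made up): it mirrors localStorage, but is asynchronous and can store objects directly:

    // localForage picks IndexedDB or WebSQL when available,
    // falling back to localStorage otherwise.
    localforage.setItem('session', { user: 'alice', started: Date.now() })
      .then(() => localforage.getItem('session'))
      .then((session) => console.log('restored', session))
      .catch((err) => console.error('storage failed', err));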
Speed-wise you'll probably be pleasantly surprised. Even using LocalStorage (which is synchronous, so blocks execution while running) you might never notice any delay in real-world use.

Protect contents of Cordova Android app

I'm developing a Cordova app for Android (so it's all HTML/CSS/JavaScript code).
This app is going to feature content that I don't want to be freely distributed on the Internet, mostly audio, video and some XML files.
Although that content will be loaded from a server and other content providers, a user could unzip the APK, look into the www folder, analyze the source code (mostly jQuery and jQuery Mobile stuff), find the direct paths to all of it, and then easily download it. Those paths might be inside the JavaScript code or inside XML files.
Is there any way to prevent this? I know of JS obfuscators, but I believe they're pretty easy to reverse.
I think you've pretty much answered your own question. Obfuscation is the only way to "protect" the JavaScript code, and there really is no way to protect the content. You could try encryption, but the JavaScript code to decrypt it would itself be exposed, making that solution practically useless.
Perhaps one option is to encrypt the content on the server with a key provided by the user, then download it on the app's first run. This has obvious drawbacks as well: some kind of separate user registration or account is required, entering a password every time the app starts is inconvenient, lost passwords have to be dealt with, et cetera.
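A rough sketch of the client-side half of that idea using the Web Crypto API (all names and parameters here are illustrative, and note that the decrypted data still ends up in memory on the device):

    // Derive an AES key from the user's password and decrypt a downloaded
    // asset. The salt and IV would be shipped alongside the ciphertext.
    async function decryptAsset(cipherBuf, password, salt, iv) {
      const enc = new TextEncoder();
      const baseKey = await crypto.subtle.importKey(
        'raw', enc.encode(password), 'PBKDF2', false, ['deriveKey']);
      const key = await crypto.subtle.deriveKey(
        { name: 'PBKDF2', salt, iterations: 100000, hash: 'SHA-256' },
        baseKey, { name: 'AES-GCM', length: 256 }, false, ['decrypt']);
      return crypto.subtle.decrypt({ name: 'AES-GCM', iv }, key, cipherBuf);
    }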
There are lots of obfuscation libraries for Javascript, just Google for them.
"Resources are world-readable by design.
Even if you were to not package the ""images or soundFX files"" as resources but were to download them on first run,
users with root access could still get to the files.
Since this is not significantly different than any other popular operating system humanity has developed,
it is unclear why you think this is an Android problem.
Sufficiently interested users can get at your ""images or soundFX files"" on iOS, Windows, OS X, Linux, and so on."

Is there any way to automatically synchronize HTML5 localStorage between computers

I have a simple offline HTML5/JavaScript single-html-file web application that I store in my Dropbox. It's a sort of time-tracking tool I wrote, and it saves the application data to local storage. Since it's for my own use, I like the convenience of an offline app.
But I have several computers, and I've been trying to come up with any sort of hacky way to synchronize this app's data (which is currently using local storage) between my various machines.
It seems that Chrome allows synchronization of data, but only for Chrome extensions. I also thought I could have the web page automatically save/load its data from a file in a Dropbox folder, but there doesn't appear to be a way to automatically sync with a specific file without prompting the user.
I suppose the "obvious" solution is to put the page on a server and store the data in a database. But suppose I don't want a solution which requires me to maintain apps on a server - is there another way, however hacky, to cobble together synchronization?
I even looked for a while to see if there was a vendor offering a web database service - where I could, say, post/get a blob of JSON on demand and somehow have my offline app sync with that service - but the same-origin policy seems to invalidate that plan (and besides, I couldn't find such a service).
Is there a tricky/sneaky solution to this problem using chrome, or google drive, or dropbox, or some other tool I'm not aware of? Or am I stuck setting up my own server?
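For what it's worth, the save/load half of such a hack is straightforward; here's a sketch of dumping localStorage to JSON and restoring it (the transport -- Dropbox, a server, whatever -- is the hard part, and this naive version is simply last-write-wins):

    // Serialize every localStorage entry into one JSON blob.
    function exportLocalStorage() {
      const dump = {};
      for (let i = 0; i < localStorage.length; i++) {
        const key = localStorage.key(i);
        dump[key] = localStorage.getItem(key);
      }
      return JSON.stringify(dump);
    }

    // Restore a blob produced by exportLocalStorage.
    function importLocalStorage(json) {
      for (const [key, value] of Object.entries(JSON.parse(json))) {
        localStorage.setItem(key, value);
      }
    }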
I have been working on a project that basically gives you versioned localStorage with support for conflict resolution when the same resource ends up being edited by two different clients. At this point there are no drivers for server or client (they are async in-memory at the moment, for testing purposes), but there is a lot of code and abstraction to make writing your own drivers really easy. I was even thinking of doing a Dropbox/Google Docs driver myself, except I want DynamoDB/MongoDB and Lawnchair done first.
The code does not depend on jQuery or any other library, and there's a pretty full-featured (though ugly) demo for it as well.
Anyway the URL is https://github.com/forbesmyester/SyncIt
Apparently I have exactly the same issue, and I investigated it thoroughly. The best choice would be remoteStorage, if you can manage to make it work. It lets you use a third-party server for data storage or run your own instance.

Is it really possible to achieve grade "A" in Yahoo's YSlow for everything on a dynamic/CMS website?

Is it really possible to achieve grade "A" in YSlow for everything on a dynamic, CMS-based (PHP/ASP.NET) website, using the same server?
[YSlow grade screenshot] (source: haacked.com)
http://developer.yahoo.com/yslow/help/index.html#performance_view
Yes, I guess it is possible to achieve this on one server, except of course for the CDN part, which relies on an external service. You'll probably need full control over your server to configure things like ETags and such.
I think it's rarely worth the effort to fulfil all of this literally down to the last percent, unless you're a huge site like Google or Yahoo themselves, where every saved byte can mean tens or hundreds of thousands in savings. Just get a proper grade so things work fast and reliably - much like in school :)
Yes. First of all, try making any and all of your JS external and loading it on demand, preloading only the components you really need. Then monitor the order in which each JavaScript file is loaded, and run them through JSBuilder (a JavaScript packaging and compression tool).
Turn on gzip on your server. Gzip compression reduced my static file sizes (CSS, JS, etc.) by 73.43%.
Cache, cache, cache. Anything that doesn't change between application deployments should get a far-future Expires header.
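On Apache, for instance, both of those last two points take only a few lines of configuration (a sketch, assuming mod_deflate and mod_expires are enabled; adjust the content types to taste):

    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>
    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresByType text/css "access plus 1 year"
      ExpiresByType application/javascript "access plus 1 year"
    </IfModule>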
If you can afford it, serve your files from a CDN. CDNs are distributed networks that make delivering content faster.
Get rid of your cookies, or combine them by encoding their values in JSON, or use a server-side caching service to hold the values and store only the cache key in the cookie. That way you have one cookie instead of hundreds.
Put your CSS at the top, and optimize it by stripping out any unused selectors and properties.
Oh, and consider switching to a thin client... reloading the web page is so 1999. A thin client lets you try different page-download optimization techniques and decouples your view (the web client) from your server and middleware API, allowing you to develop the front end in just about any RIA environment of your choice. You can go extremely lightweight with jQuery, or go with the more robust pre-built UIs of Ext or Dojo.
Reduce the amount of unused HTML. Tables are evil unless absolutely necessary or inserted into the DOM after page load.
I'm sure some of this will require major rework, which your application architecture and developer skill set may not be geared for at this time. The good news is that you can improve the user experience just by caching cookie values server-side (as mentioned above), gzipping your static components, combining and minifying any and all JS, and optimizing your CSS and layout, without restructuring your web application.
Sure, why not?
Each item links through to a page with more details on how to achieve a higher grade.
Yes, it is possible.
Here is a walkthrough of a successful optimisation of a Typo3 installation. Take a look at this Yahoo! page for optimisation goodies.
Is it worth trying to optimise for bandwidth reduction and server responsiveness? Sure!
A lot of people connect through handheld devices with expensive mobile phone plans. The school system where I live even has limited and expensive broadband plans. Waiting for a website to load is a waste of time.
