Ajax "caching", the good, the bad, the indifferent? - javascript

So I don't actually mean browser caching of an Ajax request using the GET method, but storing the results of large queries (any number, likely double digits, of 40 - 300kb responses) in the browser's memory.
What are the unseen benefits, risks associated with this?
var Cache = {}; // Global cache of parsed responses, keyed by query

var response = JSON.parse(xhr.responseText);
Cache[query] = response; // Store the parsed JSON object under its query key

// Time passes, stuff is done ...

if (Cache[query]) {
    load(Cache[query]);
} else {
    Ajax(query, cache_results); // cache_results stores the response as above
}

Is there an actual need? Or is it just optimization for its own sake? I'd suggest doing some profiling first to see where the bottlenecks lie. Remember that a web page session typically doesn't last that long, so unless you're using some kind of offline storage, the cache won't last that long either.

Without the full view of your system it is hard to tell, but I would think that playing with stale data will potentially be a concern.
Of course, if you have a protocol in place to resolve "cache freshness" you are on the right track... but then, why not rely on the HTTP protocol to do this? (HTTP GET with ETag/Last-Modified headers.)
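For illustration, here is a minimal sketch of what that HTTP-level freshness check looks like when done by hand; the cache variables are hypothetical, and note that browsers already perform this conditional GET transparently for plain requests:

var etags = {};   // url -> last ETag seen
var bodies = {};  // url -> last response body seen

function conditionalGet(url, callback) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url);
    if (etags[url]) {
        xhr.setRequestHeader('If-None-Match', etags[url]);
    }
    xhr.onload = function () {
        if (xhr.status === 304) {        // Not Modified: reuse the cached body
            callback(bodies[url]);
        } else if (xhr.status === 200) { // fresh data: update the cache
            etags[url] = xhr.getResponseHeader('ETag');
            bodies[url] = xhr.responseText;
            callback(xhr.responseText);
        }
    };
    xhr.send();
}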

You'll probably want to stress-test the memory usage and general performance of various browsers when storing many 300kb strings. You can monitor them in the task manager, and also use performance tools like Speed Tracer and Dynatrace AJAX Edition.
If it turns out that caching is a performance win but it starts to get bogged down when you have too many strings in memory, you might try HTML5 storage or Flash storage, so that you can cache things across sessions as well. Dojo storage is a good library for this.
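As a minimal sketch of the cross-session idea, assuming HTML5 localStorage is available (the key naming is illustrative):

function cacheResult(query, jsonText) {
    try {
        localStorage.setItem('cache:' + query, jsonText);
    } catch (e) {
        // Quota exceeded (commonly ~5MB per origin): evict old entries or skip caching.
    }
}

function getCached(query) {
    var text = localStorage.getItem('cache:' + query);
    return text ? JSON.parse(text) : null;
}

Storing the raw JSON string and parsing on read keeps a single copy in storage; the parse cost is paid per read, which is usually acceptable for 40-300kb payloads.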

Related

Node.js, store object in memory or database?

I am developing a Node.js application which reads a JSON list from a centralised DB.
The list object is around 1.2MB (if kept in a txt file).
The requirement is that the data is refreshed every 24 hours, so I set up a cron job for it.
After fetching the data I keep it in a DB (Couchbase) which runs locally on my server.
Data access is very frequent; I get around 1 or 2 requests per second, and nearly all requests need that object.
Is it better to keep that object as an in-memory object in Node.js, or to keep it in the local DB?
What are the advantages and disadvantages of both?
The object is only read by all requests, and only written once by the cron job.
It's a high-end system: i7 quad core, 16GB RAM.
It depends on your hardware.
If this object is immutable across requests, it's better to keep it in memory. If not, it depends.
In any case, the workflow of open connection to DB - fetch data - return result - free data will consume more resources than caching in memory.
For example, in our project we process high-definition images and keep all objects in memory, 3-7MB in raw format. Tests show that this is much more efficient than using any caching system, such as Redis or Couchbase.
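A minimal sketch of the in-memory approach, where fetchListFromCentralDb() stands in for your existing query against the centralised DB (a hypothetical name) and setInterval stands in for the 24-hour cron job:

let cachedList = null;

async function refresh() {
    cachedList = await fetchListFromCentralDb(); // ~1.2MB object held in RAM
}

refresh();                                   // warm the cache at startup
setInterval(refresh, 24 * 60 * 60 * 1000);   // refresh every 24 hours

function handleRequest(req, res) {
    // Every read is a plain property lookup: no DB round trip, no I/O.
    res.end(JSON.stringify(cachedList));
}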
I would keep the most recent version as a memory object, and store it as well. That way you have a backup if anything crashes. If you edit the data, however, I would keep it only as a database object.
Accessing the DB for that object on every request would probably work fine, but 1.2MB of memory is not that much, and if you can keep it contained, your server won't likely run into problems.
The DB is a little slow compared to memory, but has the advantage of (most likely) being thread-safe. If you were to edit the document, you could run into concurrency problems with a memory object.
You know the application and the requirements; you should be able to tell whether you need a thread-safe database, or whether you need to save memory on the server. If you don't know, we'd need to see the actual code and use cases to tell you what would work best.

Should I use a cache for this?

I wrote some code over the summer holidays, and today I'm looking at it again for the first time; I'm struggling with one thing I did.
My system is a system with multiple types (pages, newsletters, etc.) and multiple subtypes (items, archive, concepts, etc.). The idea is that I have an object like this:
object { 1: { normal: { 1: { content: 'somecontent', title: 'sometitle' } } } }
Another example:
object { 1: { normal: { 1: { content: 'somecontent', title: 'sometitle' } }, archive: {} }, 2: { normal: {} } }
The data originally comes from the database. I'm making a system to edit pages on the website and other things like newsletters, and I have multiple types and subtypes.
I made the cache because I don't want to fetch all items from the database every time. But the problem now is that whenever I add, edit or remove an item, I also have to add it to, edit it in, or delete it from the cache.
My question: is this a good way? I thought it was, because you don't have to make an AJAX call to get the data from the database.
I'm sorry if I'm not allowed to ask this here.
My question: is this a good way? I thought it was, because you don't have to make an AJAX call to get the data from the database.
The answer is that "it depends". There is no always right and always wrong answer for caching because caching is a tradeoff between efficiency and timeliness of data.
If you want maximum efficiency, you cache like crazy, but your data may not be perfectly up to date because you're using old data from the cache.
If you want the most up-to-date data, you don't cache anything so you always get the latest data, but obviously efficiency may suffer if you are regularly requesting the same data over and over.
So, it's a tradeoff and the tradeoff depends entirely upon the application, its needs, how often the data is modified and what the consequences are for having stale data or for not caching. There is no single right or wrong answer for that tradeoff. It depends entirely upon the particular situation for your application and the tradeoff may even be different for some types of data vs. others within the same application.
For example, let's suppose you were writing an online bidding site that offered some functionality like eBay. You would probably be fine caching the item description for at least several hours because that almost never changes and, even if it does, the consequences of being a bit tardy in seeing a new item description are fairly low. But, you could never cache the data on the current bid because the timeliness of that information is critical. The user needs to always see the latest info on the current bid, even if you have to make some sacrifices in efficiency.
Also, remember that caching isn't completely all or none. You can set a lifetime for a cached value such that it can only be used for a certain period of time that is appropriate for the type of data. For example, you might cache an item description in the above auction for up to 2 hours. This allows you to achieve some efficiency gains, but also to eventually see the new data if it happens to change.
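A minimal sketch of such a lifetime-based cache, with the two-hour figure mirroring the auction example above:

var cache = {};

function put(key, value, ttlMs) {
    cache[key] = { value: value, expires: Date.now() + ttlMs };
}

function get(key) {
    var entry = cache[key];
    if (!entry) return null;
    if (Date.now() > entry.expires) { // stale: drop it so the caller refetches
        delete cache[key];
        return null;
    }
    return entry.value;
}

// put('item:42:description', description, 2 * 60 * 60 * 1000); // cache for 2 hours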
In general, you have to review the consequences of showing stale data. If the consequences for having data that is even minutes out of date are high (like the latest price in a live auction), then you can't cache that data at all.
If the consequences of having data that is even hours out of date are low, then you can likely cache that value for at least several hours - maybe even longer.
And, when considering what to cache, you obviously want to first look at the items that are most requested and are the most expensive on your server to retrieve. Some analysis of the usage pattern on your server would give you a prioritized list of candidates to consider for caching.
My question: is this a good way? I thought it was, because you don't have to make an AJAX call to get the data from the database.
This is fine if
1) You want to provide offline reading continuity to the user. The user doesn't have to wait for an internet connection to be available and can read at any time.
2) Your data-service is quite heavy and you want to avoid multiple/frequent visits to the server to get the same data over and over again.
3) You want your app to be bundled with a native package (like phonegap) to become a hybrid app and give a complete offline experience to the user.
This is not a comprehensive list, just something to get you started on when to go for offline support and when not to.
So, on the other hand, this is a bad idea if
1) Your local storage structure is going to change frequently, requiring the user to re-install (unless you can figure out auto-upgrade of local storage).
2) All your features are transactional and require syncing with other users as well.
Nothing is wrong with your approach; just make sure you keep these points in mind while managing a client-side cache:
Maintain one 'version' variable, increased whenever there's any change in structure. This version is sent to the client every time; the client is responsible for comparing versions and emptying the client cache if the server version is greater than the client version.
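A minimal sketch of that version check, assuming the server sends its current structure version with each response and the client cache lives in localStorage:

function syncCacheVersion(serverVersion) {
    var clientVersion = parseInt(localStorage.getItem('cacheVersion') || '0', 10);
    if (serverVersion > clientVersion) {
        localStorage.clear(); // structure changed: empty the client cache
        localStorage.setItem('cacheVersion', String(serverVersion));
    }
}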
You can implement, or find an open-source library to handle, caching of your ajax responses; this one might be useful - https://github.com/SaneMethod/jquery-ajax-localstorage-cache.
You can set a proper expiry header from the server, which also lets the browser cache the response for you if it is a GET request.
You can also implement a server-side cache that avoids calls to the database by caching the response against the request URL. Note: if different users are supposed to receive different responses, this approach won't work. You can delete the cache entry whenever a change (delete/update) happens to that particular data set.
In your case you can also maintain flags on the server that simply record whether the data has been updated and the time of the last article update; if the stored version is older you can make a server request, otherwise just use the local version.
I hope it helps.

Parameter passing vs local storage

I have a lobby written in HTML5 / JavaScript. A .json file provides a few config parameters for the lobby and for the various other HTML5 games that can be launched from it. These parameters can either be passed to the games in the window.open string, in the form of:
window.open("http://www.myLovelyDomain.com/index.html?username=bob&token=aaaXXX")
or could be held in localStorage and accessed by the game following its launch.
My question is, what is the best (most secure/likely to cause least errors/etc) method? I know users can turn off localStorage, but I don't know how many do. Any thoughts?
Advantages of localStorage over URL query strings
Less likely to be user edited
Less likely to be copy&pasted to someone else
Can persist across sessions
Wider choice of characters
(Marginally) less bandwidth usage (shorter GETs)
Can store whole files
Invisible to basic user
Disadvantages
Server doesn't get access to the variables without additional ajax
May be harder to debug
May need extra checks if things change every session (or consider sessionStorage)
Not supported by old browsers
Can't use cross-domain directly (may be advantage, depending on how you look at it)
For supported list and max sizes see here.
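For comparison, a minimal sketch of the two hand-off styles (URLSearchParams is available in current browsers; older ones need manual query-string parsing, and the localStorage route requires the lobby and game to be same-origin):

// Reading from the query string, inside the game window:
var params = new URLSearchParams(window.location.search);
var username = params.get('username');
var token = params.get('token');

// Reading from localStorage, written by the lobby before window.open:
// localStorage.setItem('username', 'bob');          // in the lobby
var storedUser = localStorage.getItem('username');   // in the game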

Better for loading speed: large JSON file, or small CSV file that requires processing?

For maximum load speed and page efficiency, is it better to have:
An 18MB JSON file, containing an array of dictionaries, that I can load and start using as a native JavaScript object (e.g. var myname = jsonobj[1]['name']).
A 4MB CSV file, that I need to read using the jquery.csv plugin, and then use lookups to refer to: var nameidx = titles.getPos('name'); var myname = csvobj[1][nameidx].
I'm not really expecting anyone to give me a definitive answer, but a general suspicion would be very useful. Or tips for how to measure - perhaps I can check the trade-off between load speed and efficiency using Developer Tools.
My suspicion is that any extra efficiency from using a native JavaScript object in (1) will be outweighed by the much smaller size of the CSV file, but I would like to know if others think the same.
Did you consider delivering the JSON content using gzip? Here are some benchmarks on gzip: http://www.cowtowncoder.com/blog/archives/2009/05/entry_263.html
What is your situation? Are you writing some intranet site where you know what browser users are using and have something like a reasonable expectation of bandwidth, or is this a public-facing site?
If you have control of what browsers people use, for example because they're your employees, consider taking advantage of client-side caching. If you're trying to convince people to use this data you should probably consider breaking the data up into chunks and serving it via XHR.
If you really need to serve it all at once then:
Use gzip
Are you doing heavy processing of the data on the client side? How many of the items are you actually likely to go through? If you're only likely to access fewer than 1,000 of them in any given session then I would imagine that the 14MB savings would be worth it. If on the other hand you're comparing all kinds of things against each other all the time (because you're doing some sort of visualization or... anything) then I imagine that the JSON would pay off.
In other words: it depends. Benchmark it.
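A minimal sketch of that benchmark, assuming both payloads are already downloaded as strings; $.csv.toArrays comes from the jquery.csv plugin mentioned in the question:

var t0 = performance.now();
var fromJson = JSON.parse(jsonText);     // the 18MB JSON string
var t1 = performance.now();
var fromCsv = $.csv.toArrays(csvText);   // the 4MB CSV string
var t2 = performance.now();
console.log('JSON.parse: ' + (t1 - t0).toFixed(1) + ' ms');
console.log('CSV parse:  ' + (t2 - t1).toFixed(1) + ' ms');

Measure the downloads separately in the DevTools Network panel; on slow connections the transfer time usually dominates the parse time.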
4MB vs 18MB? Where is the problem? JSON is just the standard format now; CSV may be just as good, and it's fine if you're using it. My opinion.
14MB of data is a HUGE difference, but I would first try serving both with GZIP/Deflate server-side compression and then compare the requests (the CSV request will probably still be better in content length).
Then, I would also create some data-manipulation tests on jsperf, with both CSV and JSON data, using a real test case of common usage.
That depends a lot on the bandwidth of the connection to the user.
Unless this is only going to be used by people who have a super fast connection to the server, I would say that the best option would be an even smaller file that only contains the actual information that you need to display right away, and then load more data as needed.
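A minimal sketch of that load-as-needed approach, assuming the server exposes paging parameters (the offset/limit endpoint here is hypothetical):

function loadChunk(offset, limit, onData) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/data?offset=' + offset + '&limit=' + limit);
    xhr.onload = function () {
        onData(JSON.parse(xhr.responseText));
    };
    xhr.send();
}

loadChunk(0, 100, render);    // show the first screenful immediately
// loadChunk(100, 100, ...)   // fetch more when the user scrolls or pages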

Is it possible to serialize Javascript object variable and store into cookies?

Is it possible to serialize a JavaScript object variable and store it in cookies? Or is there another way to accomplish the same thing?
If these objects aren't sensitive (i.e., you don't care if your users modify them), then serializing them into cookies is fine, provided that your objects are small enough not to cause issues.
If your cookies ARE sensitive (you need to depend on them to a level of integrity) or you have large structures, then consider storing these serialized objects in a persistent session that is stored on your server. You can then use the cookie as a key or ID to know which session to restore when your visitor returns. In this manner, the size of your serialized objects, and whether they might 'fit' in a cookie, is no longer relevant.
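A minimal sketch of that server-side-session idea, assuming a Node/Express stack with express-session (one common implementation; the answer doesn't prescribe a stack). The session ID travels in a cookie while the object itself stays on the server:

const express = require('express');
const session = require('express-session');

const app = express();
app.use(session({ secret: 'change-me', resave: false, saveUninitialized: false }));

app.post('/save', express.json(), (req, res) => {
    req.session.state = req.body; // arbitrary object; no cookie size limit applies
    res.sendStatus(204);
});

app.get('/restore', (req, res) => {
    res.json(req.session.state || {});
});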
Another possibility, if you're not fussy about users modifying things but do require ample space (although it may not work for all browsers), is to create an HTML5 'local database' or other client-side storage. In this manner, you eliminate both the concern about cookie size and the growing size of your own server-side database. This is probably the best option for sites where you want to store a lot of data per user but you're not sure they'll ever come back. You can always fall back to server-side storage (see above) for older browsers.
Here's a particularly good tutorial for getting started with HTML5 local databases: http://blog.darkcrimson.com/2010/05/local-databases/
I hope this is helpful & good luck!
I don't see why not, if it fits into the length limit of the cookie. I would convert the serialized object into, say, Base64 though.
What problem are you solving?
Yes, it's possible, if the resulting string doesn't exceed the cookie size limit (4KB).
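A minimal sketch of that round trip; encodeURIComponent keeps the value cookie-safe, and the helper names are illustrative:

function saveToCookie(name, obj, days) {
    var expires = new Date(Date.now() + days * 864e5).toUTCString();
    document.cookie = name + '=' + encodeURIComponent(JSON.stringify(obj)) +
                      '; expires=' + expires + '; path=/';
}

function loadFromCookie(name) {
    var match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
    return match ? JSON.parse(decodeURIComponent(match[1])) : null;
}

// saveToCookie('prefs', { theme: 'dark' }, 7);
// loadFromCookie('prefs'); // => { theme: 'dark' }

Remember that everything stored this way is sent to the server with every request, which is another reason to stay well under the 4KB limit.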
