I am developing a native iPhone app in Titanium.
Within this app I am using data from a remote API (which I have developed in Rails 3).
I want the user to cache the API data on their phones as much as possible.
What I need help with is the concept of caching. What is the best way of doing it? The nature of the data in the API is that it needs to be up to date. Because it is contact data that can change anytime.
I have no clue how the caching process would work. If someone can explain
the best way of managing a caching process for the API, I would be more than happy!
I am using JSON and Javascript.
"The nature of the data in the API is that it needs to be up to date. Because it is contact data that can change anytime"
If that's true then it makes any kind of caching redundant, as you would need to compare the cache to live data to check for changes, thus making the cache itself pointless.
The only reason you may still want to cache the data is to have it available offline. That being the case, I would use an SQLite database, which is native to the iPhone.
titanium-cache is clean code with unit tests and provides some sample code in the README. I integrated this with my own project in a matter of minutes and it worked nicely.
I think the type of cache is application-dependent.
You can cache data on:
client;
server;
other network element.
The critical point is the refresh of the data. A bad algorithm produces inconsistent data.
You can find interesting information in the literature on distributed systems.
Bye
A couple options here.
1) You can use ASIHTTPRequest and ignore cache headers to cache everything. When your app is being used, you can detect whether the cache is being hit. If it is, fire off a request to the server after the cache hit to ask for any new data. You can do this by appending a random URL param to the end of the URL, since the cache keys off the URL. If you have a good connection and new data, load it in. Otherwise do nothing, and your user has the latest data when using the app under a good connection.
2) Do most of #1 by always hitting the cache, but instead of firing a non-cacheable version of the same request to the server after hitting the cache, fire off a non-cacheable timestamp check to see if the data was updated. If it has been, fire off the non-cacheable full API request. If it hasn't, or the check fails, do nothing.
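To make option 1 concrete, here is a minimal sketch of the cache-then-refresh flow in plain JavaScript. `readCache`, `writeCache`, and `fetchJson` are hypothetical helpers standing in for whatever cache layer and HTTP client you use:

```javascript
// Sketch: serve cached data immediately, then refresh from the server.
// readCache/writeCache/fetchJson are hypothetical helpers, not a real API.
function loadContacts(url, readCache, writeCache, fetchJson, onData) {
  var cached = readCache(url);
  if (cached) {
    onData(cached); // show cached data right away
  }
  // Append a random param so any URL-keyed cache is bypassed for this fetch.
  var bustUrl = url + (url.indexOf('?') === -1 ? '?' : '&') + '_=' + Math.random();
  fetchJson(bustUrl).then(function (fresh) {
    writeCache(url, fresh); // store back under the canonical URL
    onData(fresh);          // update the UI with the latest data
  });
}
```

The random query parameter only matters because the cache keys off the full URL; the fresh result is stored back under the canonical URL so the next launch hits the cache again.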
I am working on a chrome extension which has to store its users' data. For that I am using a hosted server running a MySQL database. But currently any addition or change to the data fires a request to the hosted server.
Chrome extensions provide the chrome.storage.local API, which is suitable for storing up to 5 MB of data. I want to take advantage of this storage API to reduce the number of requests to my hosted server by using it as temporary storage.
I am planning to use chrome.storage.onChanged.addListener and chrome.storage.local.getBytesInUse to check whether the stored data crosses a certain threshold value, and only then fire an AJAX request to the remote server to save the data. Upon a successful response, the old data in chrome.storage will be flushed.
But there is a chance of losing new data created during the request/response cycle with the server.
How can I prevent any loss of data? Is there an alternative solution to this optimization problem of reducing the number of requests to the remote server from the extension?
Thanks.
This isn't really a question about chrome extensions. It's more about persistent databases that work offline and synchronize intelligently. Which happens to be a very hard problem to do right.
The easiest solution is to use chrome.storage.sync. That buys you persistence for free with the caveat of limited storage. You should definitely see if this is feasible before trying other options.
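For reference, a minimal sketch of reading and writing through chrome.storage.sync. The storage area is passed in as a parameter (use chrome.storage.sync in the extension); `saveRecord` and `loadRecord` are made-up helper names, not part of the chrome API:

```javascript
// Sketch: persist a value via a chrome.storage area (e.g. chrome.storage.sync).
// The area is injected so the helpers are easy to test outside the browser.
function saveRecord(area, key, value, done) {
  var item = {};
  item[key] = value;
  area.set(item, done); // chrome.storage areas take an object of key/value pairs
}

function loadRecord(area, key, done) {
  area.get([key], function (items) {
    done(items[key]); // undefined if the key was never stored
  });
}

// In the extension you would call:
//   saveRecord(chrome.storage.sync, 'contacts', data, function () { /* saved */ });
//   loadRecord(chrome.storage.sync, 'contacts', function (data) { /* use data */ });
```

Keep in mind chrome.storage.sync has tighter quotas than chrome.storage.local, so check that your data fits before committing to it.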
Otherwise, I recommend looking into 3rd party options before rolling your own solution. You might have heard of progressive web apps, which work offline, and sync when internet is available.
An article about the advantages of progressive web apps
Google Tutorial
PouchDB, a well regarded web database that works offline and syncs to other databases
Look into those. It'll be well worth the trouble. Otherwise you'll just end up building hacks on top of hacks trying to get syncing to work.
... one last thing... make sure to add your remote database's URL to your manifest's permissions.
Is there a library/project which smooths over the process of caching some json data in the browser when using socket.io? I guess what I really mean is, is there a github/opensource project already out there focused on this task so that a developer could more or less drop it into any socket.io project?
For example, let's say I am getting tabular data for a page and the data is received by using socket.io. I want the data to remain cached so that I can save a server request if the user reloads the browser.
Additionally, I'd want this to happen more or less without me having to manually create cache variables, like: http://davidwalsh.name/cache-ajax . I want the socket.io cache library to be able to do this for me.
I want to occasionally and easily be able to clear the cache if the data changed on the server. So let's assume that what I'm looking for has a method for analyzing timestamps on when the remote data was modified. That is, assume for now there is a way of notifying the browser when database tables/rows/documents have been modified, so that it knows when to clear the socket.io cache (perhaps by sending meta information about table modification timestamps along with data requests, or with an occasional comet-type message).
Why not use local storage? Read about it on the Mozilla Developer Network.
It is easier to use.
To set a value:
localStorage['key'] = strValue;
To retrieve it:
strVal = localStorage['key'];
Yup, just simple associative arrays
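One caveat worth knowing: localStorage only stores strings, so JSON data has to be serialized on the way in and parsed on the way out. A tiny sketch (the storage object is passed in; use window.localStorage in the browser, and note `cacheJson`/`readJson` are made-up names):

```javascript
// localStorage values are always strings, so serialize JSON before storing.
function cacheJson(storage, key, value) {
  storage[key] = JSON.stringify(value);
}

function readJson(storage, key) {
  var raw = storage[key];
  // Plain bracket access yields undefined for missing keys; normalize to null.
  return raw === undefined ? null : JSON.parse(raw);
}
```

With this in place, the socket.io handler can write its payload once on arrival and the page can read it back after a reload without another round trip.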
I would like to retrieve table data from a REST backend server with angularjs. The data changes by the second. I would like to use angularjs to refresh the data as it changes in real-time. How can this be done? Do I force angularjs to make ajax calls at regular time intervals to refresh the data display?
Yeah, with REST there's no other way than polling. To refresh the data in the browser view you can use $rootScope.$apply() in the service (I presume you're using a service to get the data); first you have to inject the $rootScope dependency, of course.
EDIT
Actually you shouldn't use $rootScope.$apply(), because it can lead to ugly "$digest already in progress" errors. Instead use
$timeout(function () {
    // set data here...
});
A tip for improvement, if you're not happy with the polling and you have programmed the backend or are in a position to change it:
Try to use WebSockets:
It gets rid of the polling, because your browser can then communicate directly with the server.
It has less overhead.
It is supported in the major browsers, and you can use libraries with fallback mechanisms like Socket.IO.
The server can then always push the data right then, when it's available. The server-side implementation depends on the backend you are using, but most backend frameworks/servers nowadays also support websockets.
You have to use either the poll or the push strategy. However, since you already know that the resource changes regularly, it should be enough to set up a recurring timeout so that your application polls the resource on the REST server every second. Once the data is retrieved, AngularJS updates the view. So yes, you have to force AngularJS to make calls to the service.
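The recurring poll both answers describe can be sketched like this. It is plain JavaScript for brevity; in AngularJS you would use $interval and $http instead of setInterval and a promise-returning fetch, but the shape is the same. `fetchData` and `onUpdate` are hypothetical callbacks you supply:

```javascript
// Sketch: poll a resource on a fixed interval and hand each result to the view.
// In AngularJS, fetchData would wrap $http.get(...) and onUpdate would set
// scope data inside the digest cycle ($interval handles that for you).
function startPolling(fetchData, onUpdate, intervalMs) {
  var timer = setInterval(function () {
    fetchData().then(function (data) {
      onUpdate(data); // refresh the table/view with the latest data
    });
  }, intervalMs);
  // Return a stop function so the caller can cancel (e.g. on $destroy).
  return function stopPolling() {
    clearInterval(timer);
  };
}
```

Remember to cancel the poll when the view goes away (in AngularJS, on the scope's $destroy event), or the requests keep firing in the background.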
I'm currently experimenting with WebSockets in a bid to reduce / remove the need for constant AJAX requests in a potentially low bandwidth environment. All devices are WebSocket compliant so there's no issue there, and I'm trying to keep it to native PHP WebSockets, no node.js or other frameworks / libraries (Which so far has been fine).
What I'm looking to do is decide how to notify connected clients about an update to the database by another client. The use case in question is a person pressing a button on their device, which then alerts that person's manager(s) to the press. The options I have thought of are as follows:
1. Looping a Database Query (PHP)
My first thought was to insert a query into the WebSocket server that effectively asks: "Has the alert field changed? If so, notify the manager(s)." While this is the most straightforward and sensible approach (that I can think of), it seems wasteful for a PHP script designed to reduce strain on the server to now be running a query every second. However, at least this would ensure that when a database update is detected, the update is sent.
2. Sending a notification from the Client
Another thought I had was that when the client updates the database, they could in fact send a WebSocket notification themselves. This has the advantage of avoiding intensive, looped queries, but it also means that I'd need to send a WebSocket message every time I want to change any data, such as:
$.post("AttemptDatabaseUpdate.php", Data, function (Result) // Don't worry about the semantics of this, it's not actual code
{
    if (Result == "Successful")
    {
        SendWebSocketNotification(OtherData);
    }
});
Maybe this is the best option, as it is the most efficient, but I worry that the connection may drop between updating the database and sending the WebSocket notification, which may create the need for a fallback check in the PHP file, much like the one in the first solution, albeit at a longer interval (say, every 30 seconds).
3. MySQL Trigger?
This is purely a guess, but perhaps another option is to create a MySQL trigger, which can somehow notify the server.php file directly? I've no idea how this would work, and would hazard a guess that it may end up with the same or similar query requirements as solution #1, but it's just a thought...
Thank you in advance for your help :)
EDIT: Solution possibility 4
Another thought has just popped into my head: the PHP file used to update the database could in fact have a WebSocket message built into it, so that when the PHP file updates the database, the WebSocket server is notified via PHP. Is this possible?
If you use websockets, you should use notifications from client. That's one of their main use cases.
If you're worried about inconsistencies due to the connection dropping or something changing in between, you could implement a system similar to HTTP ETags, where the client sends a hash code and the server side can respond with a conflict when the update no longer applies cleanly.
Update: I guess I understood your initial issue a bit wrong. If I understand your use case correctly, you are sending database updates from a client, and after that all connected clients need to be updated. In that case, I think the server should send the update messages after the DB updates have been done, so I agree with solution 4. I am assuming here that your websocket server is the same server running PHP and doing the DB updates.
However, depending on your use case, client should still send a hash value on the next request identifying its "view of the world", so you would not be doing identical updates multiple times if a connection gets broken.
Update 2: so it is now understood that you indeed use a separate, standalone websocket server. Basically you have two different web servers on the server side, and the issue is how to communicate between the two. This is a real issue, and I'd recommend only using one server at a time: either take a look at using Apache websocket support (experimental and not really recommended) or migrate your PHP scripts to the websocket instance.
Neither PHP nor Apache was really built with websockets in mind. It is quite easy to set up a standalone websocket server using only PHP, but it might not be so easy to migrate the rest of the PHP stack to it if the code relies on Apache or another web server. Apache's websocket support is also hardly optimal. For a real websocket solution, unfortunately, best practice is to use a technology built for it from the ground up.
The better approach is to send the notification from the server side when the database is updated by the PHP script; the script then uses the WebSocket to push the notification directly to all registered WebSocket clients.
The flow: the user sends content -> the PHP script processes the content and saves the data according to the conditions -> check that the database was updated (by checking the return of mysql_query or an alternative) -> if true, use the WebSocket and send the notification to all users.
This is easier, handier, and saves bandwidth.
I want to store JSON objects on the client side using any Java-based implementation. What are the possible ways I can try?
These objects are created and stored at form-submission time when the network is not available, and will be sent to the server the next time it is connected.
How can I achieve that? Thanks in advance.
Look at jstorage; it uses a few strategies to store values. The basic one is HTML5 localStorage, which gives you up to 5 MB of storage and is widely supported.
However, offline storage is what you need if the user's actions must survive a page refresh or close. If they only need to survive a temporary network breakdown, it is enough to keep them in JavaScript memory and retry the JSON requests until they succeed.
I recommend starting with frequent retries and then increasing the timeout (if the network or server app is down for hours, there's no need to ping it every second).
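The retry-with-increasing-timeout idea might be sketched like this; `retryWithBackoff` is a made-up name, and `sendRequest` stands in for whatever function fires the JSON request and returns a promise:

```javascript
// Sketch: retry a failing request with an increasing (capped) delay,
// so a long outage isn't pinged every second.
function retryWithBackoff(sendRequest, initialDelayMs, maxDelayMs) {
  return new Promise(function (resolve) {
    var delay = initialDelayMs;
    function attempt() {
      sendRequest().then(resolve, function () {
        setTimeout(attempt, delay);              // try again later
        delay = Math.min(delay * 2, maxDelayMs); // double the wait, up to a cap
      });
    }
    attempt();
  });
}
```

Pair this with the in-memory queue of pending form submissions: each queued object gets its own retry loop, and is dropped from the queue once its promise resolves.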