I have to design a web app that has to be capable of working offline.
There are many workstations that, in normal mode, work online, connected to a central server.
There are moments when, for different reasons (no internet, server crash, etc.), the central server is not available, and the workstations then need to enter an offline work mode.
The app doesn't need to be fully functional offline, just functional enough that clients don't have to wait; in my concrete case, invoicing should still be possible.
I already have a custom solution in mind, but I am wondering if you know of a framework or something that already does such things.
Thank you!
We wrote a desktop app for hundreds of employees to use on their laptops. It used database replication to merge the data from the laptop copy of the database into the server copy. The amount of data contained in the database was significant -- product information, customer contact information, and so on. That was all needed for a rep to be able to create sales orders, invoices, and the like. It was crucial that the rep be able to use the software all the time, not just once in a while when they had connectivity. However, this approach does have its challenges -- if the local databases don't get synced up frequently, data at both ends becomes stale, plus you have to deal with conflicting updates.
If the amount of database information needed locally for working disconnected isn't huge, you can definitely take advantage of the new HTML5 offline storage and use a website.
I think that the critical factors here are how much data the user needs when they are working offline, how fresh the data needs to be, and what percentage of time they will be working online vs. offline.
If your app is HTML/JavaScript, use the HTML5 application manifest. See the following resources:
http://www.w3.org/TR/offline-webapps/
http://www.webreference.com/authoring/languages/html/HTML5-Application-Caching/
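For illustration, here is a minimal sketch of a cache manifest and how a page references it (the file names are placeholders):

    CACHE MANIFEST
    # v1 - change this comment to force clients to re-download the cache

    CACHE:
    index.html
    app.js
    app.css

    NETWORK:
    *

The page opts in by pointing at the manifest, e.g. <html manifest="offline.appcache">; a page that references a manifest is itself cached implicitly.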
You can use Google Gears.
Here is a link with more details: http://www.scriptol.com/ajax/google-gears.php
Currently I'm trying to learn NativeScript, and to practice I thought about building an app like 'Anki'.
But while thinking about data storage, I stumbled upon the problem of how to save my flashcards locally so the app works offline (for example with SQLite), save each card's review schedule (e.g. show it again in 10 minutes or in 1 day), AND have an update mechanism that adds new cards to the database without deleting the user's data.
What's the best way to solve that problem, especially if I want to ship the updates with an app update rather than fetching everything from an external database?
I don't have any code yet, so a recommendation on how to approach this would be nice.
There are several methods in NativeScript you can use:
NativeScript-Sqlite (disclaimer: I'm the author)
This allows full access to SQLite for saving and loading items; you can have databases as big as you need, and SQLite is very fast. SQLite's biggest drawback is write speed; if you do a LOT of writing, it can be slower than just writing to a file yourself. (See the sketch after this list.)
NativeScript-LocalStorage (disclaimer again: I'm the author)
This is geared more toward smaller data sizes, since when the app starts it has to load the entire JSON-backed data store into memory. It is really fast overall, but not something you want to use for tens of thousands of records.
NativeScript-Couchbase
This uses SQLite for the local storage and can use Couchbase for the remote storage; very nice for having syncable storage - Couchbase can run on your own server or on a leased/rented server.
NativeScript-Firebase
This is also very useful for syncable storage; however, Google charges for Firebase beyond a certain usage level.
Built-in AppSettings.
This is really designed for a few application settings, not for lots of data, but it is useful for the smaller amounts.
Roll your own on the file system.
I have done this in a couple of my projects; basically a hybrid between my localstorage plugin and a mini-SQL-type system. One project was very write-heavy, so it made more sense to generate the 20 or so separate files on the phone (one per table) because I could save them much quicker than inserting/replacing >100,000 records into SQLite each time the app started. That project had minimal searching needs.
Your storage really needs to depend on what you are doing; it is a balancing act. Lots of searchable data: SQLite wins in almost all cases. Lots of frequent writing: something you create yourself might be a lot faster.
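As a concrete illustration of the SQLite option mentioned above, here is a minimal sketch using nativescript-sqlite's promise API; the table and column names are made up for the flashcard case:

    // Open (or create) a database, create a table, insert and query.
    const Sqlite = require("nativescript-sqlite");

    new Sqlite("cards.db").then(function (db) {
        return db.execSQL(
            "CREATE TABLE IF NOT EXISTS cards (id INTEGER PRIMARY KEY, front TEXT, back TEXT, due INTEGER)"
        ).then(function () {
            // Store the next review time alongside the card itself.
            return db.execSQL(
                "INSERT INTO cards (front, back, due) VALUES (?, ?, ?)",
                ["hello", "hallo", Date.now() + 10 * 60 * 1000] // due again in 10 minutes
            );
        }).then(function () {
            // An app update can ship new cards as INSERTs; since it never drops
            // the table, the user's review times survive the update.
            return db.all("SELECT * FROM cards WHERE due <= ?", [Date.now()]);
        }).then(function (rows) {
            console.log(rows); // cards due for review right now
        });
    });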
I'm still new to web development. To learn more about JavaScript (JS) and web development, I am thinking of writing a simple web app which periodically pulls and records time-series data (say, the price of a stock) and draws a live chart showing the historical data. In addition to price data, I would like the app to record/maintain some user-related info, such as the ticker of the stock(s) associated with each user.
Ideally, I would like to keep the app lightweight and portable/standalone (meaning, reduce the dependencies as much as possible, so the end user hopefully doesn't have to do a lot of configuration or installing of dependencies). The issue I cannot figure out is where to store the historical data. I looked around for database solutions which would allow the app to write data directly from the browser (that is, using JS) to the client's machine. LocalStorage and IndexedDB are non-persistent as far as I understand. Some suggested using PouchDB, but upon looking at it closer, it seems the user needs to install CouchDB or some compatible DB (say, SQLite). But that means I cannot share my app with users who aren't technical enough to install and configure CouchDB or SQLite on their machines before using my app.
If anyone could share some insights as to which DB might allow a JS-based app to write persistent data to the client's machine (if such a thing even exists), that would be greatly helpful. If there is no such DB solution, please feel free to suggest alternatives that would achieve the goal of a simple, portable, JS-based web app. Thank you!
I think the best solution is to use Electron.js. The whole idea of this framework is to create web apps that can reside on client machines. You could package up any DB option you want, or even better, just include an API to your backend in the web app, and it will work on the client machine the way I think you want it to.
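For a sense of scale, a minimal Electron entry point looks roughly like this (the file names are placeholders):

    // main.js - minimal Electron main process
    const { app, BrowserWindow } = require('electron');

    app.whenReady().then(() => {
        const win = new BrowserWindow({ width: 800, height: 600 });
        win.loadFile('index.html'); // your existing web app, now running locally
    });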
As for DB options, there is a great thread on S.O. that talks about what is possible. It looks like knex.js is your best bet (full disclosure - I haven't used knex).
I'm building a relatively complex and data-heavy web application in AngularJS. I'm planning to use PHP as a RESTful backend (with Symfony2 and FOSRestBundle). I have spent weeks looking around for different on/offline synchronization solutions, and there seem to be many half-solutions (see the list below for some examples), but none of them seem to fit my situation perfectly. How do I go about deciding which strategy will suit me?
Determining "best practices" for building an on/offline synchronization system in AngularJS and Symfony2 needs some research, but off the top of my head I want to consider things like speed, ease of implementation, future-proofing (a lasting solution), extensibility, resource usage/requirements on the client side, multiple offline users editing the same data, and how much and what type of data to store.
Some of my requirements that I'm presently aware of are:
The users will often be offline and then need to synchronize (locally created) data with the database.
Multiple users share some of the editable data (potential merging issues need to be considered).
Users might be logged in from multiple devices at the same time.
Allowing a large amount of data to be stored offline (up to a gigabyte).
I probably want the user to be able to decide what he wants to store locally.
Even if the user is online, I probably want the user to be able to choose whether to use all (backend) data or only what's available locally.
Some potential example solutions:
PouchDB - Interesting strategies for synchronizing changes from multiple sources
Racer - Node lib for realtime sync, built on ShareJS
Meteor - DDP and strategies for sync
ShareJS - Node.js operational transformation, inspired by Google Wave
Restangular - Alternative to $resource
EmberData - EmberJS’s ORM-like data persistence library
ServiceWorker
IndexedDB Polyfill - polyfills IndexedDB on browsers that only support WebSQL (Safari)
BreezeJS
JayData
Loopback’s ORM
ActiveRecord
BackBone Models
lawnchair - Lightweight client-side DB lib from Brian Leroux
TogetherJS - Mozilla Labs’ multi-client state sync/collaboration lib.
localForage - Mozilla’s DOMStorage improvement library.
Orbit.js - Content synchronization library
(https://docs.google.com/document/d/1DMacL7iwjSMPP0ytZfugpU4v0PWUK0BT6lhyaVEmlBQ/edit#heading=h.864mpiz510wz)
Any help would be much appreciated :)
You seem to want a lot of stuff; the sync stuff is hard... I have a solution to some of this in an OSS library I am developing. The idea is that it does versioning of local data, so you can figure out what has changed and therefore do a meaningful sync, which also includes conflict resolution, etc. This is sort of an offline Meteor, as it is really tuned for offline use (for the London Underground, where we have no mobile data signal).
I have also developed an eco system around it which includes a connection manager and server. The main project is at https://github.com/forbesmyester/SyncIt and is very well documented and tested. The test app for the ecosystem will be at https://github.com/forbesmyester/SyncItTodoMvc but I have yet to write virtually any docs for it.
It is currently using LocalStorage, but it will be easy to move to localForage, as it already uses a wrapper around localStorage to give it an async API... Another one for the list, maybe?
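For reference, here is a minimal sketch of the async API that localForage exposes (the key and value are made up):

    // localForage mirrors the localStorage API but is asynchronous,
    // backed by IndexedDB or WebSQL where available.
    import localforage from 'localforage';

    async function demo() {
        await localforage.setItem('todo:1', { title: 'buy milk', done: false });
        const todo = await localforage.getItem('todo:1');
        console.log(todo.title); // 'buy milk'
    }
    demo();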
To work offline with your requirements, I suggest dividing the problem into two scenarios: content (HTML, JS, CSS) and data (the REST API).
The content
This can be stored offline via AppCache for small apps, or, for more advanced cases, with the awesome Service Workers (Chrome 40+).
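A minimal Service Worker sketch that caches the app shell (the file names are placeholders):

    // sw.js - cache the app shell at install time, serve cache-first afterwards
    const CACHE = 'app-shell-v1';

    self.addEventListener('install', (event) => {
        event.waitUntil(
            caches.open(CACHE).then((cache) =>
                cache.addAll(['/', '/app.js', '/app.css']))
        );
    });

    self.addEventListener('fetch', (event) => {
        event.respondWith(
            caches.match(event.request).then((hit) => hit || fetch(event.request))
        );
    });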
The data
This requires solving both storage and synchronization, and it becomes a more difficult problem.
I suggest a deep reading of the Differential Synchronization algorithm, and take the following tips into consideration:
Frontend
Store the resource and its shadow copy (using, for example, the URL as the key) in localStorage for small apps, or in more advanced alternatives (PouchDB, IndexedDB, ...). With the local resource you can work offline; when you need to synchronize with the server, compute the diff between the resource and its shadow (e.g. as a JSON Patch) and send it to the server in a PATCH request.
Backend
On the backend, consider storing the shadow copies in Redis.
Both sides (frontend/backend) need to identify the client node; to do so, you could use an x-syn-token HTTP header (sent with every client request via Angular interceptors).
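A rough client-side sketch of the shadow/diff idea, assuming a JSON-diff utility such as fast-json-patch; the key scheme and node id are made up for illustration:

    // Keep a working copy and a shadow (the last state acknowledged by the server).
    import { compare } from 'fast-json-patch'; // produces RFC 6902 operations

    const clientNodeId = 'node-42'; // hypothetical; assign one per device in practice

    function saveLocal(url, resource) {
        localStorage.setItem(url, JSON.stringify(resource));              // working copy
        localStorage.setItem(url + '::shadow', JSON.stringify(resource)); // shadow
    }

    async function sync(url) {
        const resource = JSON.parse(localStorage.getItem(url));
        const shadow = JSON.parse(localStorage.getItem(url + '::shadow'));
        const patch = compare(shadow, resource); // what changed while offline
        if (patch.length === 0) return;          // nothing to send

        await fetch(url, {
            method: 'PATCH',
            headers: {
                'Content-Type': 'application/json-patch+json',
                'x-syn-token': clientNodeId, // identifies this client node
            },
            body: JSON.stringify(patch),
        });
        // The server has accepted the change; advance the shadow.
        localStorage.setItem(url + '::shadow', JSON.stringify(resource));
    }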
https://www.firebase.com/
It's reliable and proven, and can be used as a backend and sync library for what you're after. But it costs money and requires some integration coding.
https://goinstant.com/ is also a good hosted option.
In some of my apps, I prefer to have both: a syncing DB source AND another main database (Mongo/Express, PHP/MySQL, etc.). Then each DB handles what it's best at, playing to its features (real-time vs. security, etc.). This holds regardless of the sync-DB provider (be it Racer or Firebase or GoInstant...).
The app I am developing has many of the same requirements and is being built in AngularJS. In terms of future-proofing, there are two main concerns I have found: one is hacking attempts, requiring encryption, possibly one-time keys, and a backend key manager; the other is support for WebSQL being dropped by the standards consortium in preference to IndexedDB, so finding an abstraction layer that can support both is important. The solution set I have come up with is fairly straightforward: offline data is loaded into the UI first, and a request goes out to the REST server if the app is in an online state. As for resolving data conflicts in a multi-user environment, that becomes a business-rule decision. My decision was to simplify the matter and not delve into data merges, but to use a microtimestamp comparison to determine which version should be kept and pushed out to clients. When in offline mode, data is stored as a dirty write and then pushed to the server when returning to an online state.
Or use ydn-db, which I am evaluating now, as it has built-in support for AWS and Google Cloud Storage.
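A minimal sketch of the last-write-wins plus dirty-write-queue approach described above; the endpoint is hypothetical:

    // Each record carries a timestamp; on conflict, the newer version wins.
    function resolve(local, remote) {
        return local.updatedAt > remote.updatedAt ? local : remote;
    }

    // Queue dirty writes while offline; flush when connectivity returns.
    const dirty = [];

    function save(record) {
        record.updatedAt = Date.now(); // stands in for the microtimestamp
        dirty.push(record);
        if (navigator.onLine) flush();
    }

    async function flush() {
        while (dirty.length) {
            const rec = dirty.shift();
            await fetch('/api/records/' + rec.id, { // hypothetical REST endpoint
                method: 'PUT',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify(rec),
            });
        }
    }

    window.addEventListener('online', flush);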
Another suggestion:
Yjs leverages an OT-like algorithm to share a wide range of supported data types, and you have the option to store the shared data in IndexedDB (so it is available for offline editing).
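A minimal sketch of offline-capable shared data with Yjs and its IndexedDB persistence provider (module and document names follow those projects' docs; exact APIs may differ by version):

    import * as Y from 'yjs';
    import { IndexeddbPersistence } from 'y-indexeddb';

    const doc = new Y.Doc();
    // Persist document updates to IndexedDB so offline edits survive reloads.
    const persistence = new IndexeddbPersistence('my-app-doc', doc);

    const todos = doc.getArray('todos'); // a shared, conflict-free type
    persistence.whenSynced.then(() => {
        todos.push(['buy milk']); // offline edits merge when a network provider connects
    });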
I have a simple offline HTML5/JavaScript single-HTML-file web application that I store in my Dropbox. It's a sort of time-tracking tool I wrote, and it saves the application data to local storage. Since it's for my own use, I like the convenience of an offline app.
But I have several computers, and I've been trying to come up with any sort of hacky way to synchronize this app's data (which is currently using local storage) between my various machines.
It seems that Chrome allows synchronization of data, but only for Chrome extensions. I also thought I could perhaps have the web page automatically save/load its data from a file in a Dropbox folder, but there doesn't appear to be a way to automatically sync with a specific file without prompting the user.
I suppose the "obvious" solution is to put the page on a server and store the data in a database. But suppose I don't want a solution which requires me to maintain apps on a server - is there another way, however hacky, to cobble together synchronization?
I even looked for a while to see if there was a vendor offering a web database service - where I could, say, post/get a blob of JSON on demand - so my offline app could sync with that service, but the same-origin policy seems to invalidate that plan (and besides, I couldn't find such a service).
Is there a tricky/sneaky solution to this problem using chrome, or google drive, or dropbox, or some other tool I'm not aware of? Or am I stuck setting up my own server?
I have been working on a project that basically gives you versioned localStorage with support for conflict resolution if the same resource ends up being edited by two different clients. At this point there are no drivers for server or client (they are async in-memory at the moment, for testing purposes), but there is a lot of code and abstraction to make writing your own drivers really easy... I was even thinking of doing a Dropbox/Google Docs driver myself, except I want DynamoDB/MongoDB and Lawnchair done first.
The code does not depend on jQuery or any other libraries, and there's a pretty full-featured (though ugly) demo for it as well.
Anyway the URL is https://github.com/forbesmyester/SyncIt
Apparently I have exactly the same issue, and I investigated it thoroughly. The best choice would be remoteStorage, if you can manage to make it work. It allows you to use a 3rd-party server for data storage or to run your own instance.
I have an unusual web application, similar in some ways to the live versions of DriveWorks and KBmax. The Logic, Interface, and Tables are created in a Windows application by sales teams and engineers throughout the company and stored in SQL Server. In the web application, when a user clicks an item in the product menu, the Logic and Interface are pulled from the database, compiled using CodeDOM into an in-memory executable, and stored in a Session variable. The front-end interface in the web application is dynamic and invokes the executable in Session for the control-logic events. The Tables are also stored in a DataSet in Session. At any rate, this all actually works, but it seems to have random quirks that are hard to pinpoint. I'm wondering whether this is way too much to put in Session, and what the alternatives are.
Note: I am using JavaScript where I can; however, since the code is actually created by other users, there are also a lot of postbacks going on. When I run this on my localhost it seems to perform acceptably, but on the host server it seems clunky.
I have this same application as a Windows application and it works great; I'm just trying to make the web version.
If a 64-bit process is used, using Session should not be a problem. If a 32-bit process is used, you should look at out-of-process session state (a database, or Windows Server AppFabric). Regarding the random quirks, I guess they're something to do with the code.