Offline / Online Data Synchronization Design (Javascript) [closed] - javascript

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers. Closed 4 years ago.
I'm currently in the process of writing an offline web app using all the HTML5 goodies for offline support. Now I'm starting to think about writing the sync module that will ensure any offline data gets sent to the server and server data gets back to the client. I'm sure this has been done before; it's a pretty classic design issue that affects mobile devices and a plethora of other things. So can anyone point me to some good design resources for this kind of thing?
Now I really do not need to be too sophisticated with this: I'm not handling multiple users accessing the same data, and I'm happy not to merge conflicts (just take the latest), but I would still like a design that leaves those options open for the future.
Also, are there any open source projects implementing this type of thing? I'm not above ripping off someone else's code (if license allows) and I'm happy to port.

I had a similar problem. I decided to use a purely JSON in and out approach. The solution I'm taking on form submission is:
catch the form submit event
check whether or not the user is online
if user is online then submit the form as normal form POST
if user is offline then stringify a JSON request and store it locally (I decided to use the Web SQL Database); the queue table is simply a URI and a payload
Then I have global event hooks for the online / offline events. When the user comes back online, it checks the queue, and if the queue has items in it, it then sends them through as JSON POST requests.
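A minimal sketch of that queue-and-flush idea (all names here, like OfflineQueue, are illustrative; the storage object is injected so the same logic works over localStorage, Web SQL, or anything with getItem/setItem):

```javascript
// Queue of pending requests, persisted through an injected storage object.
class OfflineQueue {
  constructor(storage, key = "requestQueue") {
    this.storage = storage;
    this.key = key;
    this.items = JSON.parse(storage.getItem(key) || "[]");
  }
  // Called from the offline branch of the form-submit handler.
  enqueue(uri, payload) {
    this.items.push({ uri, payload: JSON.stringify(payload) });
    this.storage.setItem(this.key, JSON.stringify(this.items));
  }
  // send(uri, body) should return a promise; on failure we stop and
  // keep the remaining items for the next "online" event.
  async flush(send) {
    while (this.items.length > 0) {
      const { uri, payload } = this.items[0];
      await send(uri, payload);
      this.items.shift();
      this.storage.setItem(this.key, JSON.stringify(this.items));
    }
  }
}
```

In the browser you would call flush() from a window "online" event listener, passing a function that does the JSON POST.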
If you are primarily interested in getting JSON data and caching it for offline usage, then take a look at jquery.offline.
The challenge with synchronizing in both direction is that you need to update the local cached lists with any CRUD work that you have queued.
I'd like to find a more generic way to do this.

My plan for a similar design (not yet tried) is to use something like PouchDB to store the data locally and then sync it with a remote CouchDB instance.

Check out Derby, a Node MVC framework that has some pretty sweet synchronization and conflict resolution features. http://derbyjs.com/

In our team we have already developed an app with offline/online support.
We are using the following libraries:
rack-offline
jquery
backbonejs
backbonejs-localStorage
backbonejs-queues
jammit
Using rack-offline we cache all the resource files and JST templates for rendering content on the page. Backbone.js and its localStorage adapter help build an MVC app on the client; it's pretty awesome, you should try it. We always use localStorage for saving data. When we create a model object (a post, for example) and save it to localStorage, we trigger the sync queues (we also have a timer-based background worker that runs the sync process automatically). Each model has a separate sync class that is run by the queue's sync trigger. If navigator.onLine is true, we send requests to the server with the data to update. Even if you close the browser you don't lose your data, because the queues are in localStorage; the next time the client loads with navigator.onLine true, it will sync the data.
To see how to use rack-offline, you can check my small project on GitHub:
pomodoro-app
Good luck!
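The per-model sync-class idea above can be sketched roughly like this (the model names, endpoints, and the injected send callback are my own illustrations, not part of rack-offline or Backbone):

```javascript
// One sync class per model; the queue trigger looks up the right one.
const syncClasses = {
  Post:    { endpoint: "/posts" },
  Comment: { endpoint: "/comments" },
};

// Drain the queue when online; items that fail stay queued for the
// next run (e.g. the timer-based background worker).
async function runSync(queue, isOnline, send) {
  if (!isOnline()) return queue; // offline: keep everything queued
  const remaining = [];
  for (const item of queue) {
    try {
      await send(syncClasses[item.model].endpoint, item.data);
    } catch (e) {
      remaining.push(item); // retry on the next sync run
    }
  }
  return remaining;
}
```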

I faced the same problem and ended up using an XML file for storage and Git to track changes and commit them automatically as soon as a connection is available. The sync is done with the usual git commit / push / pull commands in a shell script, with a cron job starting the script. This would also work if you stored JSON in a text file.

I'm currently working on a similar web app. I've decided on the following workflow:
The form isn't really submitted - the "Submit" button actually saves the serialized form data to localStorage (in a queue). This avoids the trouble of capturing the submit and of writing extra error-handling code for a disconnect during form submission.
A transport script is triggered after the data is saved. It checks the online/offline state.
When online, it tries to send the latest data from the queue to the server (as an AJAX request), deletes it from the queue on success, and continues with the next item after a short timeout.
It schedules a re-check after some period of time (via setTimeout()).
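That transport step might look something like this sketch (isOnline, loadQueue, saveQueue, and send are placeholder dependencies, injected so the queue store and the AJAX layer can be anything):

```javascript
// One pass of the transport script: try to send a single queued entry.
async function drainOnce({ isOnline, loadQueue, saveQueue, send }) {
  const queue = loadQueue();
  if (!isOnline() || queue.length === 0) return false;
  try {
    await send(queue[0]);      // AJAX request with one queued entry
    saveQueue(queue.slice(1)); // delete from the queue only on success
    return true;
  } catch (e) {
    return false;              // keep the entry and retry on the next tick
  }
}

// Schedule a re-check after some period of time, as in the last step.
function schedule(deps, delayMs) {
  drainOnce(deps).then(() => setTimeout(() => schedule(deps, delayMs), delayMs));
}
```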

If you are up for using the potentially heavy Ext JS / Sencha framework, it has a nice data API with offline (e.g. localStorage) support and a proxy approach for write-through to local storage and then the server. I use Sencha Touch (mobile-specific).
For debugging web storage, check out Weinre.

DerbyJS would probably be the best solution. However, Derby is still in development, and offline support is only planned and has not yet been implemented. In the Google Group ( http://groups.google.com/group/derbyjs/browse_thread/thread/7e7f4d6d005c219c ) you can find additional information about what is planned for the future.

I'd personally recommend you write a wrapper on top of the IndexedDB API that checks whether you are online or offline.
if offline, just store the data in IndexedDB and set a persisted flag to false on all documents
if online, get all documents where persisted is false and store them in MongoDB (or something equivalent) on the backend, then store new documents both in IndexedDB and on the server with the persisted flag set to true
I've written a small one
You would have to augment the tunnel to set the persisted flag automatically, and also tunnel the synchronization of these documents to the backend.
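A rough sketch of those two branches, with a plain Map standing in for IndexedDB and pushToServer standing in for the MongoDB-backed API (both stand-ins are assumptions, not any real wrapper's API):

```javascript
// Document store that tracks a persisted flag per document.
function makeSyncStore({ isOnline, pushToServer }) {
  const docs = new Map(); // stand-in for IndexedDB
  return {
    async save(id, doc) {
      if (isOnline()) {
        await pushToServer(id, doc);              // online: write through
        docs.set(id, { ...doc, persisted: true });
      } else {
        docs.set(id, { ...doc, persisted: false }); // offline: sync later
      }
    },
    // Called when the browser comes back online.
    async syncPending() {
      for (const [id, doc] of docs) {
        if (!doc.persisted) {
          await pushToServer(id, doc);
          docs.set(id, { ...doc, persisted: true });
        }
      }
    },
    get: (id) => docs.get(id),
  };
}
```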

Related

Fastest way to send data from C# to javascript? [closed]

Closed. This question is opinion-based. It is not currently accepting answers. Closed 1 year ago.
I have two separate apps. One of them is a UI web application written in pure JS; the other is a console application written in C#.
Currently I'm calculating some variables (which cannot be done in JS because of browser limitations) with the C# console app, which then writes the results to a txt file.
Then I read the file with the JS application to bring the results to the UI. But the variable often changes within milliseconds, and writing results to disk and retrieving them again is pretty slow.
What can I do? Any suggestions?
The console application is effectively a server. Communicating between a web app and a server by means of a local text file is, well, unconventional! If this is not just for your own use on the one machine, it will be very difficult to deploy for another user. Write a small server application and communicate with it the usual way, i.e., by posting the data to the server's IP address and receiving the server's response. You can remove any connection latency (after the initial connection) by communicating over a WebSocket.
As others pointed out WebSockets seem to be a great choice for this task.
Mozilla has a mini tutorial that seems perfect for this task: Writing a WebSocket server in C#.
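On the browser side, the client for such a server could look like the sketch below; the ws://localhost:8080 address and the { value: ... } JSON payload shape are assumptions that would have to match what the C# server actually sends.

```javascript
// Parsing is kept in a plain function so it is testable without a server.
function parseResult(text) {
  const msg = JSON.parse(text);
  if (typeof msg.value !== "number") throw new Error("unexpected payload");
  return msg.value;
}

// Browser wiring (needs the C# WebSocket server running):
// const ws = new WebSocket("ws://localhost:8080");
// ws.onmessage = (ev) => {
//   document.querySelector("#result").textContent = parseResult(ev.data);
// };
```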
RE: Comment:
Good point! There is also a MSFT guide for SignalR: Tutorial: Get started with ASP.NET Core SignalR
It's open to some debate what you mean by "fastest" - fastest for you to write, or fastest in terms of the performance of the app.
It's relatively simple to turn your C# code into an API - Visual Studio has templates for API-type projects; your logic will then get a URL and can be triggered simply by visiting it in any browser or by having JavaScript fetch the URL. The URL itself can be used to pass variable data; C# knows how to parse it and present it in code, so a method like (attributes etc. removed for clarity)

public class CalcController : ApiController {
    public int Add(int a, int b) {
        return a + b;
    }
}

can be triggered by visiting a URL like `http://host/api/add/1/2`, and you get JSON back, which JS understands out of the box. If your web app serves your JS, the JS can automatically talk to the web app without any CORS setup, because it was served by the same server.
See https://learn.microsoft.com/en-us/aspnet/core/tutorials/web-api-javascript?view=aspnetcore-5.0 for a full tutorial (it's pretty involved - it uses databases and everything - but you can boil it down a lot; probably even just creating a web app project from a SPA template in VS will create everything you need to have front-end JS and back-end C#).
Another option to look at is Blazor; you can dump the JS entirely and put C# in the browser (or leave the C# on the server and let Blazor handle transiting the UI changes between client and server), or you can interoperate with JS.
Finally, SignalR has been mentioned in the comments - a tech from MS that builds on top of WebSockets and handles connectivity management and finding and calling code on either end. It helps create event-driven apps where the events happen at either end (like a chat app: one person speaks, JS in their browser calls a method on the server and transmits what they typed, then the server pushes that message out to a group of other connected clients). You set up events in your JS like "onChatReceived", so when the server pushes data, you respond to it. SignalR deals with firewalls automatically; it uses either WebSockets (a long-lived bidirectional data flow), long polling (make a request and the server doesn't answer until a message is available to send), or repeated polling ("any update? any update? any update?"), chosen automatically, so it can make your app very portable - especially if one day you want to host your calculations on a server you control, to protect your intellectual property.
Full MS tutorial on SignalR here - https://learn.microsoft.com/en-us/aspnet/core/tutorials/signalr?view=aspnetcore-5.0&tabs=visual-studio - you're essentially setting up a group of eventing mechanisms; your JS can trigger your server to do something, and later your server can trigger your JS to do something.
This is slightly different from your JS requesting the server do something and waiting for the response - that model might work fine and be simpler for you to implement, as it's closest to what you already have: you're just swapping the file reading for a network call, which JS is allowed to do autonomously.

Best way to periodically save javascript client data to server database and stay in sync?

I have created an application using javascript library D3. Users will constantly click and drag to frequently change graphical elements and I currently save the data in 3-4 local javascript objects and arrays. I want to save the data to the server periodically rather than after each change. Also I want them to be able to work if they are not connected. From twenty years ago, I imagine doing this manually where on the client side records are flagged as “new”, “revised”, and “deleted”. Every 10 seconds client data is saved via AJAX and either an object is updated or a SQL statement is executed. An id is returned from the database and saved on the client side to track each record for future modifications.
Note the data must be organized in a database for ease of separating elements for reuse. When the user is connected, updates every 5-10 seconds are fine. Then I can use an inexpensive and slow server. Of course a tool that deals with records that might not fully update is good, perhaps some transactional functionality.
There will be no separate mobile application. I can modify my javascript objects to be json compliant if need be. I see there are “offline-first” frameworks and javascript "state containers". Redux caught my eye, especially when I saw its use climbing over the years according to Google Trends. I’ve read about so many options and am thoroughly confused by all these. Here is a mish mash of tools I looked at: Store.js, now.js, indexedDB, couchDB, pouchDB, Cloudant, localForage, WebSQL, Polymer App Toolbox, Hoodie framework, Ionic and angular, and Loopback. Not to mention XHR, web sockets.
I have used MVC like Laravel and Zend, both are with PHP and MySql. I wonder if I could integrate the suggested solution. Thanks.
Related: How do I sync data with remote database in case of offline-first applications?
Saving the data locally using PouchDb and then syncing it with a CouchDb database (or IBM's Cloudant service) when a network connection is available is a well-trodden path for this sort of requirement. But your question is asking for an opinion, so there will be many other perfectly valid solutions to this.

Queue in webbrowser on top of database?

In a web application the user is able to perform some tasks I need to send to the server asynchronously. Basically, this is really easy, but now I would like it to be also working fine in offline-mode.
My idea is to use a client-side queue, and transfer elements from that queue to the server if the network connection is available.
I could use PouchDB, but I don't need all the tasks on the client-side, so I don't want a full client-side database with all the elements the server has as well. I only need some kind of queue: Put it in there, and try to send it to the server: If it worked, dequeue, otherwise try again after a short pause.
How could I implement this? Is there something such as RabbitMQ (conceptually!) available for browsers? A queue on top of the browsers' built-in database? Something like that?
Or can this issue be solved using PouchDB?
PouchDB does support one-way replication (just do clientDb.replicate.to("http://server/")), so if you are already running CouchDB on your server, it might be a quick & easy way to implement a queueing of tasks type of system.
You will probably want to use a filter on your replication, because when you "dequeue" or delete a task from the client side db, you probably don't want to replicate that delete to the server :) This answer is specific to CouchDB, but it should work in PouchDB too, as I think PouchDB does support filtered replication: CouchDB replicate without deleting documents.
That said, using PouchDB like this seems a little awkward, and the full replication system might be a little more overhead than is necessary for a simple queueing of tasks. Depends on what the needs of your app are though, and the exact nature of the tasks you are queueing! It could be as simple as an array that you push tasks into, and periodically check if there are tasks in there, which you can pop or shift off the array and send to the server.
There's also async.queue, which is commonly used in node.js but also works in the browser (this queue is not backed by any type of storage, but you could add persistent storage using PouchDB or another client-side db).
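The simplest version of that "array plus retry" idea can be sketched like this (makeTaskQueue and retryDelayMs are made-up names; for persistence across page loads you would still back the array with localStorage or a client-side db, as noted above):

```javascript
// Minimal in-memory retry queue: dequeue on success, pause and retry on failure.
function makeTaskQueue(send, retryDelayMs = 2000) {
  const tasks = [];
  let running = false;
  async function run() {
    if (running) return; // only one drain loop at a time
    running = true;
    while (tasks.length > 0) {
      try {
        await send(tasks[0]);
        tasks.shift(); // dequeue only on success
      } catch (e) {
        // short pause, then try the same task again
        await new Promise((resolve) => setTimeout(resolve, retryDelayMs));
      }
    }
    running = false;
  }
  return {
    push(task) { tasks.push(task); run(); },
    size: () => tasks.length,
  };
}
```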

Dynamic Wall updating: NodeJS

Basically, I have created a Node.js app that uses Jade as its templating engine, along with Express and a MySQL database.
I am looking to create a new page that allows users to share a portion of text, after which a div underneath it named "Wall" updates dynamically with the new status.
Ideally it would be similar to Facebook, where something is typed and shared, and then the page updates below dynamically. I'm also looking to have the wall page update when a new post has been shared by a user's friend. All updates shared by the users would be sent to a database.
I have conducted a lot of searches but seem unable to gather a right answer.
I have narrowed it down to the use of either of the following: JQuery, Ajax, PHP.
Since the site I am building is built in JS - what is my best option?
I'm pretty new to all of this, but I assume when a user clicks share it calls a JS file which then stores the update in the database. But how do I get my "Wall" to refresh upon new content?
Any help greatly appreciated.
You've posed a conceptual question. So I'll do my best to explain some of the conceptual options you can choose to further explore and do your own research on how to best implement it with your project.
You have two paths to go here.
You can have your own wall update (do a refresh / re-render on the UI side) upon a successful AJAX write to your database, this would be something you implement in your AJAX callback function - basically the JS function that gets executed after your write request (submitting the new post) to the database returns successful.
A whole other branch of options you could explore, is implement either of the following options to basically "listen" in for changes server-side, and have the re-rendering react as the callback you use:
Polling - basically issue a request every X seconds to check whether there have been updates or a change of state on the server side.
WebSockets - check out Socket.io. Through it you can "push" messages from the server side to your clients. As a note, WebSockets are not universally supported in every browser, and from past experience I've found the WebSocket protocol even differs between browser versions. So if you need universal support, I'd go with a polling method.
Good luck on your project, hope this helps!
Use...
function poll() {
    /* fetch new posts and update the wall here */
    setTimeout(poll, 1000);
}
poll();
...to poll your "Wall" backend and update the content. (A bare setTimeout fires only once, so re-arm it inside the callback as above, or use setInterval.)

Optimistic synchronization of replicated objects in javascript

I'm programming a browser application (html5+websockets+css3+js preferred) that enables users to concurrently access (read, write) attributes of the same object. To create a real-time experience I'd like to use optimistic synchronization. I read about Timewarp and Trailing State algorithms and I wonder if there is a javascript library, which already implements these or similar algorithms.
I found this question, but unfortunately it has not been answered yet. It seems XSTM only supports pessimistic synchronization.
Do you have any idea for me?
I am working on a real-time HTML5 web browser application now too. Maybe my choice of weaponry can inspire you - who knows. I am using:
Frontend:
KnockoutJS - it takes care of displaying the data I send to every connected client as JSON (view models); you can easily subscribe to changes in the client data and push the changes back to the server, though I am having problems displaying pages with KnockoutJS on mobile browsers
on the server side I run a custom-made server based on Fleck
Since JSON is my favourite data format, I ditched SQL databases in favour of RavenDB, which stores data almost exactly as they are sent via the WebSocket protocol, and it is also pretty quick
