How can a web application store a very large amount of data client-side? (Concretely, I'm talking about a capacity of some millions of records.)
What I want to do here is to allow searching these records offline.
All of the users are using Chrome.
I was opting for IndexedDB until I read that it becomes almost unusable at around 400k records.
Then there is Web SQL, but it has been deprecated.
I was then thinking that my last option would be to install a web server like Apache locally, with a small script that would interact with my web application and store the records in a DB like MySQL. With AJAX I could reach that script on localhost, but then there is the cross-domain problem.
I've run out of ideas.
Update (clarification):
The main web application runs on a remote server. It has to be on a server because the application is shared by several people at different locations, and it needs to be accessible from smartphones, etc. The last idea was to install a web application locally (on each user's computer) that would interact with the remote web application, fetch the records from it and store them locally. But I guess that wouldn't work either, because of cross-domain issues.
I see a few alternatives:
don't you actually need a desktop application? I know, I know, it is so 1990's...
installing a local web server and accessing your application via the web browser is an option as well (see the sketch after this list), but this is dangerously close to point 1.
you might consider developing a Java applet and permitting it to use the file system
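For the local-server route, the cross-domain problem mentioned in the question is normally solved with CORS response headers rather than a new architecture. A minimal sketch in Node.js (the port and the JSON payload are made up for illustration; any local server that can set Access-Control-Allow-Origin would do):

    // save as server.js, run with: node server.js
    const http = require('http');

    http.createServer((req, res) => {
      // Let the remote web app's pages call this local endpoint.
      // Use the app's actual origin instead of '*' in anything real.
      res.setHeader('Access-Control-Allow-Origin', '*');
      res.setHeader('Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
      res.setHeader('Access-Control-Allow-Headers', 'Content-Type');
      if (req.method === 'OPTIONS') { res.end(); return; } // CORS preflight

      // Hypothetical endpoint: query the local record store here.
      res.setHeader('Content-Type', 'application/json');
      res.end(JSON.stringify({ records: [] }));
    }).listen(8080, () => console.log('local record service on :8080'));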
For my job, I am researching how a web application running locally from file:// in IE11, created with HTML5 and JavaScript, can access the raw data from, or listen to, a computer's serial port as it is sent out by a Windows service or proxy. The situation is that we have a proxy designed to collect data from a computer's serial port, and it sends that data out on our network to localhost.
What we want our web application to do is catch the data the proxy sends directly from the service on the computer, removing the need to have the proxy send the data to a server and having the web application fetch it from that server. So far, googling for a solution has been difficult. Does anyone know a solution to our problem, or where to find one?
Lazy people, why don't you use the Google search bar (!?!)...
Here: https://github.com/garrows/browser-serialport
Note: you cannot use this in a web page, i.e. you cannot put it on a web server, and it is supported only by Chrome.
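For reference, browser-serialport wraps Chrome's chrome.serial API (which is why it is Chrome-only and must run in a packaged Chrome App rather than a web page). Reading a port with the underlying API looks roughly like this (the device path and bitrate are made-up examples):

    // Runs inside a packaged Chrome App with the "serial" permission.
    chrome.serial.getDevices(devices => {
      console.log('available ports:', devices.map(d => d.path));
    });

    // Hypothetical device path and bitrate.
    chrome.serial.connect('COM3', { bitrate: 9600 }, connectionInfo => {
      console.log('connected, id:', connectionInfo.connectionId);
    });

    // info.data is an ArrayBuffer holding the raw bytes from the port.
    chrome.serial.onReceive.addListener(info => {
      const bytes = new Uint8Array(info.data);
      console.log('received', bytes.length, 'bytes');
    });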
I am working on a Chrome extension which has to store its users' data. For that I am using a hosted server running a MySQL database. But currently, every addition or change to the data fires a request to the hosted server.
Chrome extensions provide the chrome.storage.local API, which is suitable for storing up to 5 MB of data. I want to take advantage of this storage API to reduce the number of requests to my hosted server by using it as temporary storage.
I am planning to use chrome.storage.onChanged.addListener and chrome.storage.local.getBytesInUse to check whether the stored data crosses a certain threshold, and only then fire an AJAX request to save the data to the remote server. Upon a successful response, the old data in chrome.storage will be flushed.
But there is a chance of losing new data that is created during the request/response cycle to the server.
How can I prevent any loss of data? Is there an alternative solution to this problem of reducing the number of requests from the extension to the remote server?
Thanks.
This isn't really a question about chrome extensions. It's more about persistent databases that work offline and synchronize intelligently. Which happens to be a very hard problem to do right.
The easiest solution is to use chrome.storage.sync. That buys you persistence for free with the caveat of limited storage. You should definitely see if this is feasible before trying other options.
Otherwise, I recommend looking into third-party options before rolling your own solution. You might have heard of progressive web apps, which work offline and sync when internet is available.
An article about the advantages of progressive web apps
Google Tutorial
PouchDB, a well-regarded web database that works offline and syncs to other databases
Look into those. It'll be well worth the trouble. Otherwise you'll just end up building hacks on top of hacks trying to get syncing to work.
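For a taste of PouchDB, a minimal setup looks something like this (the remote URL is hypothetical and would point at a CouchDB-compatible endpoint):

    // npm install pouchdb, or include the prebuilt browser bundle.
    const db = new PouchDB('records');

    // Writes land in the local, offline-capable store first.
    db.put({ _id: 'record-1', value: 42 }).catch(console.error);

    // Continuous two-way sync whenever the remote is reachable.
    db.sync('https://db.example.com/records', { live: true, retry: true })
      .on('error', console.error);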
... one last thing... make sure to add your remote database's URL to your manifest's permissions.
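And if you do keep the manual chrome.storage.local buffer described in the question, the usual way to avoid losing writes that happen mid-upload is to snapshot the stored keys before sending and to remove only those keys on success; anything written while the request is in flight simply stays for the next flush. A sketch (the threshold, endpoint, and payload shape are made up):

    const THRESHOLD = 4 * 1024 * 1024; // flush at ~4 MB, below the 5 MB cap

    chrome.storage.onChanged.addListener((changes, areaName) => {
      if (areaName !== 'local') return;
      chrome.storage.local.getBytesInUse(null, bytes => {
        if (bytes >= THRESHOLD) flush();
      });
    });

    function flush() {
      chrome.storage.local.get(null, items => {
        const keys = Object.keys(items); // snapshot: only these get removed
        fetch('https://api.example.com/save', { // hypothetical endpoint
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify(items),
        }).then(res => {
          if (!res.ok) throw new Error('save failed: ' + res.status);
          // Data written while the request was in flight is not in
          // `keys`, so it survives for the next flush.
          chrome.storage.local.remove(keys);
        }).catch(console.error); // on failure, keep the data and retry later
      });
    }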
HTML5 supports client-side web storage, which can help make a website work offline. But how can one share data between systems connected through a LAN while offline?
The requirement is:
If offline, there will be a centralised system through which all the systems of a particular group are connected. Any update on one system will be reflected on all the systems in that group. When the centralised system goes online, the data will be synced to a remote MySQL DB.
If online, all systems will update the remote MySQL DB directly, and hence always be in sync.
How to get started for such a system?
You can't. This isn't a thing that HTML5 applications can do.
Specifically, there is no way for such an application to "discover" other instances of that application on the network, or to communicate with them, while offline.
Communicating with the "centralized system" you're describing in your question would require your application to be online. And if you're able to do that, the application doesn't need to operate in that fashion anyway!
What would be the most appropriate way to make a real-time web app that works on PHP (Apache web server)?
The idea of the web application is to let two users edit the same HTML form at the same time, with the form regularly saved to a MySQL DB.
I am thinking about an AngularJS + Laravel approach with lots of AJAX requests, but maybe there is a more appropriate way to do this (maybe WebSockets)?
There are no browser-compatibility requirements, except that it should work in the latest version of Chrome.
Basically, if you want a real-time web application, your best bet would be WebSockets.
They are event-driven, so the client doesn't have to poll for updates; the server pushes them.
Otherwise, the client would have to constantly poll a REST API for updates.
A quick search on Google led me to Ratchet, which is a PHP WebSocket library.
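On the browser side the client code is the same whatever the server; a sketch of how the shared form edits could flow over a WebSocket (the URL, field id, and message shape are made up):

    // Hypothetical endpoint served by the Ratchet app.
    const socket = new WebSocket('ws://example.com:8080/form');

    // Push a field change to the server as the user types.
    document.querySelector('#title').addEventListener('input', e => {
      socket.send(JSON.stringify({ field: 'title', value: e.target.value }));
    });

    // Apply edits the server pushes from the other user.
    socket.addEventListener('message', event => {
      const { field, value } = JSON.parse(event.data);
      document.querySelector('#' + field).value = value;
    });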
Good luck!
We have an application which consumes a large amount of data. Currently a desktop app, but we would like to deliver it via the browser.
It doesn't make sense to me to create a web app where we would need to transfer all the data used for the visualizations.
We're looking at RDP and some products out there that provide RDP access via a fully javascript client. They seem to work well with our app, but I've been thinking about what it would take to move off Windows.
Switching the front end so that it could run under Linux would not be trivial, but not impossible, so the main stumbling block would be delivery.
I was wondering if there are any X11 JavaScript servers out there, but I have not found any leads.
Use xpra's built-in HTML5 client; it supports any application you can run on an X11 desktop.
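Roughly, usage looks like this (display number and port are arbitrary; check xpra --help for the exact flags in your version):

    # Start the app under xpra with the HTML5 client enabled.
    xpra start :100 --start-child=yourapp --bind-tcp=0.0.0.0:10000 --html=on

    # Then point any HTML5-capable browser at http://your-server:10000/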
You can use an HTML5 VNC viewer like https://github.com/kanaka/noVNC coupled with a VNC server like RealVNC
AFAIK, recent GTK has been ported to HTML5 + JavaScript in GTK's Broadway backend.
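With Broadway, a stock GTK 3 application can be redirected to the HTML5 backend without code changes; roughly (display number arbitrary, and details vary by GTK version):

    # Start the Broadway display server; it serves the UI over HTTP.
    broadwayd :5 &

    # Run any GTK 3 application against it; it renders in the browser.
    GDK_BACKEND=broadway BROADWAY_DISPLAY=:5 gtk3-demo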
And you could make your application a web application, for instance by using Wt, or by making it an HTTP server through specialized HTTP server libraries like libonion, libmicrohttpd, etc.
By using AJAX techniques (e.g. through jQuery), your application won't transmit all the display data to the browser at once, but only incrementally, and only the data actually shown.
You might also consider FastCGI as a way to connect your application to some web server.
I know of two, both in their infancy:
https://github.com/GothAck/javascript-x-server
and
https://github.com/ttaubert/x-server-js
Both need a simple TCP-to-WebSocket proxy in front (a sketch follows below), but all the X11 logic happens inside the web page, and all X11 objects exist and interact within the browser (so it's not just a remote framebuffer but a real server).
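The proxy itself can be tiny. A Node.js sketch using the ws package (ports arbitrary; this toy version bridges a single X11 client per page, while the real projects multiplex several):

    // npm install ws
    const net = require('net');
    const WebSocket = require('ws');

    // The browser page (the in-page X server) connects here first.
    const wss = new WebSocket.Server({ port: 8080 });

    wss.on('connection', ws => {
      // X11 clients then connect over plain TCP (display :0 = port 6000)
      // and their bytes are shuttled to the page and back.
      const tcpServer = net.createServer(socket => {
        socket.on('data', data => ws.send(data));
        ws.on('message', msg => socket.write(msg));
        socket.on('close', () => ws.close());
      });
      tcpServer.listen(6000);
      ws.on('close', () => tcpServer.close());
    });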
You can even run a full Linux distribution in the web browser, but that requires running an x86/ARM emulator with GNU/Linux inside it. That provides an X server, with a possible web connection, too.
For very simple applications you can use the libgreattao toolkit and tao-network-client to connect to it. I'm the author of both projects. The API isn't frozen yet, but it behaves rather stably. You can read about it here:
https://nintyfan.wordpress.com/2015/04/30/server-buildin-into-libgreattao-and-tao-network-client/
It can have problems with applications that hold a lot of data, because every element must be sent to the client when it is created; on the other hand, full graphics are not sent (only icons are), and the user interface can be updated quickly. It also doesn't support mouse enter/leave/move events.
I must add: do not download the tarball; get the version from svn instead.
Sounds like the easiest approach for you is to get xrdp, which is an RDP server for X. Then you would use your RDP client to connect to it. I think NoMachine NX supports HTML directly now, but I'm not sure. There was talk of an HTML X2Go client, but I don't know anything about that either.