Using Axios in Electron - Best Practice? - javascript

I am creating an app in Electron using React. The first version will be a purely desktop-based app using a local database for data storage etc. However, eventually I would like to reuse as much of the same code base as possible and deploy the same app as a cloud-based alternative (by just leaving out the Electron component).
I'd like some advice, thoughts, and opinions from the community on whether I should use axios to build internal REST API infrastructure, rather than an API made of plain JavaScript modules/functions, for performing requests such as getting data from a database.
My thinking is that when I port the application to the cloud, I would then just need to change the REST API locations to point to a cloud version, which would get its data from a cloud database rather than a local one.
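To make that concrete, here is a rough sketch of what I have in mind (module name, port and endpoint are made up):

```js
// api.js - all data access for the renderer goes through this module
const axios = require('axios');

// For the desktop build this points at a small server run by the Electron
// main process; for the cloud build it points at the hosted API instead.
const client = axios.create({
  baseURL: process.env.API_BASE_URL || 'http://localhost:3001/api',
  timeout: 5000,
});

// The rest of the app only ever calls functions like this one, so swapping
// the local database for a cloud one becomes a configuration change.
async function getCustomers() {
  const response = await client.get('/customers');
  return response.data;
}

module.exports = { getCustomers };
```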
Would exposing such APIs locally within Electron using axios pose any security risks or other issues that I may need to consider?
Perhaps this answer/discussion may also prove useful for the community in the future.
Looking forward to thoughts, or even new suggestions on how this may be best approached.
Thanks.

Related

Security of .env file (environment variables) in React JS App

I heard that the .env file was for securing variables (like API keys) on the front end, but then I read that environment variables are embedded in the build.
So, what are environment variables actually for in React, for example?
I know this is old, but people are still arguing in 2022.
Just reiterating what Jayce444 stated, because he knows what he's talking about.
React JS is a Frontend technology.
Do NOT place your API keys in your react app's .env file.
They WILL be visible to anyone who inspects your app.
Instead, you can create a backend app with an API for your front-end app.
Send any requests, like logging in or database interactions, from your front end to the back-end app.
It's a bit of extra work, but this is how you secure it.
Also, if you plan to easily port your React app to a React Native app,
you'll need the backend API to support multi-platform apps anyway!
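For instance, a minimal Express sketch of that idea (route, environment variable and upstream URL are hypothetical) looks roughly like this:

```js
// server.js - the secret stays on the server; the React app only calls /api/weather
const express = require('express');
const axios = require('axios');

const app = express();
const WEATHER_API_KEY = process.env.WEATHER_API_KEY; // read from the server's environment

app.get('/api/weather', async (req, res) => {
  try {
    // The backend makes the authenticated call, so the key is never shipped to the browser.
    const upstream = await axios.get('https://api.example.com/weather', {
      params: { city: req.query.city, apiKey: WEATHER_API_KEY },
    });
    res.json(upstream.data);
  } catch (err) {
    res.status(502).json({ error: 'Upstream request failed' });
  }
});

app.listen(3001);
```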
Here is a guide that helped me with examples.
https://youtu.be/yICiz12SdI4
The guy isn't the best at explaining, and the current version of Passport (at the time of writing) has some bugs with cookie-sessions, but I was able to troubleshoot through them.
Hope this helps!
Doing something like this is probably one of the best projects to have in a portfolio for web dev.

Embedded database for electron, react-native and NodeJS apps?

I have a javascript application that I've implemented for mobile apps using react-native and its desktop counterpart using the electron framework. The mobile application uses react-native-sqlite-storage native module to save preferences and data (5 - 6 tables) whereas I use node-sqlite3 for the electron app.
Both, the mobile and desktop apps share a lot of functionality but due to the use of different database plugins, have a lot of differences. Also, for the desktop app, as node-sqlite3 is a native dependency, I have to build the app installers for Windows and macOS separately. That's a pain!
So, what I need is a database solution that is:
embeddable into the app
efficient and performant compared to sqlite3
supports syncing to a remote database
supports macOS, Windows, and Linux
encrypts the data written within the database
consistent API across JS runtimes (Browser / NodeJS / JavascriptCore)
Here's a list of those that I've come across and that seem appealing:
NeDB
RxDB
PouchDB
So, what are your suggestions and how have you implemented anything similar for your apps?
I've been testing PouchDB, RxDB (which relies on PouchDB with RxJS streams for queries), Realm-JS (native database like sqlite3), FireStore. NeDB doesn't support remote sync.
I won't go into detailed performance metrics for each database, but PouchDB has been very slow and heavy on memory when querying more than 20,000 items (tried with the indexeddb/websql adapters).
RxDB was generally much faster with that many items, especially when subscribing to query changes (tried with indexeddb/websql adapter too). Also schema and migrations are very handy.
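To give a feel for that, here is a minimal RxDB sketch (collection and fields are made up; note that the exact API depends on the RxDB version, and this targets a recent release with the Dexie storage rather than the older PouchDB adapters mentioned above):

```js
import { createRxDatabase } from 'rxdb';
import { getRxStorageDexie } from 'rxdb/plugins/storage-dexie';

const db = await createRxDatabase({ name: 'appdb', storage: getRxStorageDexie() });

await db.addCollections({
  items: {
    schema: {
      title: 'item schema',
      version: 0,
      type: 'object',
      primaryKey: 'id',
      properties: {
        id: { type: 'string', maxLength: 64 },
        name: { type: 'string' },
      },
      required: ['id', 'name'],
    },
  },
});

// The observable emits again whenever documents matching the query change.
db.items.find({ selector: { name: 'abc' } }).$
  .subscribe((docs) => console.log('matching items:', docs.length));
```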
FireStore is a good choice and comes with very easy setup of server and client components, but you'll need to be comfortable to run on a google platform. There is some flexibility with firebase functions if you want some control over server logic and there is customizable ACLs for your collections. Speed has been good and on par with RxDB. Comes with a very good auth module if you want it.
If you want to be able to scale much, much more on the client side, which you probably don't, or if you want to make complex queries, I'd really recommend using something like Realm. You'll need to compile for each platform like you've experienced with sqlite3, but there is sync, offline persistence, rich queries and great performance. There is just no way that JavaScript-based solutions like PouchDB, RxDB or FireStore can compete, even with sqlite backends, since much of the computation will still happen in your precious JS thread. Realm does much of its heavy lifting in the native library. I've been able to do LIKE "abc" queries on 100,000 items, returning hundreds of results, in less than a hundred milliseconds and without noticeably freezing my UI or pumping up memory usage heavily. It supports client migrations too, which is nice.
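For reference, the kind of query I mean looks roughly like this in Realm-JS (schema and field names are made up):

```js
import Realm from 'realm';

const ItemSchema = {
  name: 'Item',
  primaryKey: 'id',
  properties: { id: 'string', name: 'string' },
};

async function search(term) {
  const realm = await Realm.open({ schema: [ItemSchema] });
  // The query executes inside the native library; results are live and lazily
  // loaded, so the JS thread never has to walk all the objects itself.
  return realm.objects('Item').filtered('name CONTAINS[c] $0', term);
}
```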
In the end, there are multiple answers:
1. If you want to host everything yourself and don't need massive scale on the client side (you can sync against subsets of your server data with filters), RxDB is very good and comes with a nice set of features.
2. If you want a very easy setup, nice modules like auth, server functions etc., and don't need massive scale on the client (you can also sync against server data subsets with filters), FireStore is great.
3. If you need lots and lots of data on the client, Realm is my preference. I personally really dislike the Realm sync platform for its pricing model (though technically it's cool), but the database itself is free and maybe you could try and implement a custom sync.
Take my results with a grain of salt, I've had some very specific challenges like large, non-relational collections and fulltext-search, your use-case will probably differ a lot.

Suggestion for the best way to store persistent data for a light-weight, portable JS-based web app

I'm still new to web development. To learn more about JavaScript (JS) and web development, I am thinking of writing a simple web app which pulls and records time-series data (say, the price of a stock) periodically and draws a live chart showing the historical data. In addition to price data, I would like the app to record/maintain some user-related info, such as the ticker of the stock(s) associated with each user.
Ideally, I would like to keep the app lightweight and portable/standalone (meaning, reduce the dependencies as much as possible, so the end user hopefully doesn't have to do a lot of configuration or installation of dependencies). The issue that I cannot figure out is where to store the historical data. I looked around for database solutions which would allow the app to write data directly from the browser (that is, using JS) to the client's machine. LocalStorage and IndexedDB are non-persistent as far as I understand. Some suggested using PouchDB, but upon looking at it closer, it seems like the user needs to install CouchDB or some compatible DB (say, SQLite). But that means I cannot share my app with users who aren't technical enough to install and configure CouchDB or SQLite on their machines before using my app.
If anyone could share some insights as to which DB might allow a JS-based app to write persistent data to the client's machine (if such a thing even exists), that would be greatly helpful. If there is no such DB solution, please feel free to suggest alternative approaches that would achieve the goal of building a simple, portable, JS-based web app. Thank you!
I think the best solution is to use Electron.js. The whole idea of this framework is to create web apps that can reside on client machines. You could package up any DB option you want, or even better, just include an API to your backend through the web app and it will work on your client machine like I think you want it to.
As for DB options, there is a great thread on S.O. that talks about what is possible. It looks like knex.js is your best bet (full disclosure - I haven't used knex).
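Something like this is roughly what a knex.js + SQLite setup could look like on the Electron/Node side (table and column names are made up, and I haven't verified this exact snippet):

```js
const knex = require('knex')({
  client: 'sqlite3',
  connection: { filename: './prices.sqlite' }, // a plain file on the user's machine
  useNullAsDefault: true,
});

async function recordPrice(ticker, price) {
  // Create the table on first run.
  if (!(await knex.schema.hasTable('prices'))) {
    await knex.schema.createTable('prices', (t) => {
      t.increments('id');
      t.string('ticker');
      t.float('price');
      t.timestamp('recorded_at').defaultTo(knex.fn.now());
    });
  }
  await knex('prices').insert({ ticker, price });
}

async function priceHistory(ticker) {
  return knex('prices').where({ ticker }).orderBy('recorded_at');
}
```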

Connecting DB with my Firefox OS app using JS

I'm developing an app for Firefox OS and I need to retrieve/sent data from/to my DB. I also need to use this data in my logic implementation which is in JS.
I've been told that I cannot implement PHP in Firefox OS, so is there any other way to retrieve the data and use it?
PS: This is my first app that I'm developing, so my programming skills are kind of rough.
You can use a local database in JS, e.g. PouchDB, TaffyDB, PersistenceJS, LokiJS or jStorage.
You can also save data to a backend server e.g. Parse or Firebase, using their APIs.
Or you can deploy your own backend storage and save data to it using REST.
You should stick to the basic communication paradigms when sending/receiving data to/from a DB. In your case you need to pass data between your application and the DB over the web.
Never, ever let an app communicate with your DB directly!
So what you need to do first is to implement a wrapper application that gives controlled access to your DB. That is, for example, often done in PHP. Your PHP application then offers the interfaces by which external applications (like your FFOS app) can communicate with the DB.
Since this comes down to very basic programming knowledge, please give an idea of how much you know about programming; I'll then consider offering further details.
It might be a bit harder to do than you expect, but it can also be easier than you think. Using MySQL as a backend has serious implications. For example, MySQL doesn't provide any HTTP interface as far as I know. In other words, for most SQL-based databases, you'll have to use some kind of middleware to connect your application to the database.
Usually the middleware is a server that publishes some kind of HTTP API, probably in a REST style, or even RPC such as JSON-RPC. The language in which you write the middleware doesn't really matter. The serious problem you'll face with this variant is restricting data: preventing users from accessing data they shouldn't have access to.
There is also another variant, if you want a database plus synchronization with the server: CouchDB + PouchDB give you that for free. I mean it's really easy to set up, but you'll have to redesign some parts of your application. If your application makes a lot of data changes it might end up filling your disks, but if you're just starting, it's possible that this setup will be more than enough.
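As a rough sketch, the PouchDB side of that setup is only a few lines (the remote URL is hypothetical):

```js
const PouchDB = require('pouchdb');

const local = new PouchDB('app_data');                          // lives on the device
const remote = new PouchDB('https://db.example.com/app_data');  // your CouchDB instance

// Two-way, continuous replication that retries after connection drops.
local.sync(remote, { live: true, retry: true })
  .on('change', (info) => console.log('replicated', info.direction))
  .on('error', (err) => console.error('sync error', err));

// Normal reads and writes go to the local DB and replicate in the background.
local.put({ _id: 'prefs:theme', value: 'dark' });
```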

Best practice for on/off line data synchronization using AngularJS and Symfony 2

I'm building a relatively complex and data-heavy web application in AngularJS. I'm planning to use PHP as a RESTful backend (with Symfony2 and FOSRestBundle). I have spent weeks looking around for different on/offline synchronization solutions and there seem to be many half solutions (see the list below for some examples). But none of them seem to fit my situation perfectly. How do I go about deciding which strategy will suit me?
Working out which issues determine “best practices” for building an on/offline synchronization system in AngularJS and Symfony 2 needs some research, but off the top of my head I want to consider things like speed, ease of implementation, future-proofing (a lasting solution), extensibility, resource usage/requirements on the client side, having multiple offline users editing the same data, and how much and what type of data to store.
Some of my requirements that I'm presently aware of are:
The users will often be offline and will then need to synchronize (locally created) data with the database
Multiple users share some of the editable data (potential merging issues needs to be considered).
User's might be logged in from multiple devices at the same time.
Allowing a large amount of data to be stored offline (up to a gigabyte)
I probably want the user to be able to decide what he wants to store locally.
Even if the user is online I probably want the user to be able to choose whether he uses all (backend) data or only what's available locally.
Some potential example solutions
PouchDB - Interesting strategies for synchronizing changes from multiple sources
Racer - Node lib for realtime sync, build on ShareJS
Meteor - DDP and strategies for sync
ShareJS - Node.js operational transformation, inspired by Google Wave
Restangular - Alternative to $resource
EmberData - EmberJS’s ORM-like data persistence library
ServiceWorker
IndexedDB Polyfill - Polyfill IndexedDB with browsers that support WebSQL (Safari)
BreezeJS
JayData
Loopback’s ORM
ActiveRecord
BackBone Models
lawnchair - Lightweight client-side DB lib from Brian Leroux
TogetherJS - Mozilla Labs’ multi-client state sync/collaboration lib.
localForage - Mozilla’s DOMStorage improvement library.
Orbit.js - Content synchronization library
(https://docs.google.com/document/d/1DMacL7iwjSMPP0ytZfugpU4v0PWUK0BT6lhyaVEmlBQ/edit#heading=h.864mpiz510wz)
Any help would be much appreciated :)
You seem to want a lot of stuff, the sync stuff is hard... I have a solution to some of this stuff in an OSS library I am developing. The idea is that it does versioning of local data, so you can figure out what has changed and therefore do meaningful sync, which also includes conflict resolution etc. This is sort-of the offline meteor as it is really tuned to offline use (for the London Underground where we have no mobile data signals).
I have also developed an eco system around it which includes a connection manager and server. The main project is at https://github.com/forbesmyester/SyncIt and is very well documented and tested. The test app for the ecosystem will be at https://github.com/forbesmyester/SyncItTodoMvc but I have yet to write virtually any docs for it.
It is currently using LocalStorage but will be easy to move to localForage as it actually is using a wrapper around localStorage to make it an async API... Another one for the list maybe?
To work offline with your requirements, I suggest dividing the problem into two scenarios: content (HTML, JS, CSS) and data (REST API).
The content
This will be stored offline via AppCache for small apps or, for more advanced cases, with the awesome Service Workers (Chrome 40+).
The data
This requires solving storage and synchronization, and it becomes a more difficult problem.
I suggest a deep reading of the Differential Synchronization algorithm, and take the following tips into consideration:
Frontend
Store the resource and its shadow (using, for example, the URL as the key) in localStorage for small apps, or in more advanced alternatives (PouchDB, IndexedDB, ...). With the resource you can work offline, and when you need to synchronize with the server, use jsonpath to compute the diffs between the resource and its shadow and send them to the server in a PATCH request.
Backend
On the backend, consider storing the shadow copies in Redis.
Both sides (frontend/backend) need to identify the client node; to do that you could use an x-sync-token HTTP header (sent in every client request via Angular interceptors).
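As a sketch, an AngularJS interceptor that attaches such a token to every request could look like this (the header name and token storage are assumptions):

```js
angular.module('app').factory('syncTokenInterceptor', function () {
  return {
    request: function (config) {
      // Identify this client node on every call to the backend.
      config.headers = config.headers || {};
      config.headers['X-Sync-Token'] = window.localStorage.getItem('syncToken') || '';
      return config;
    }
  };
});

angular.module('app').config(['$httpProvider', function ($httpProvider) {
  $httpProvider.interceptors.push('syncTokenInterceptor');
}]);
```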
https://www.firebase.com/
It's reliable and proven, and can be used as a backend and sync library for what you're after. But it costs, and requires some integration coding.
https://goinstant.com/ is also a good hosted option.
In some of my apps, I prefer to have both: a syncing DB source AND another main database (mongo/express, php/mysql, etc.). Then each DB handles what it's best at, with its own features (real-time vs. security, etc.). This is true regardless of the sync-DB provider (be it Racer, Firebase or GoInstant...).
The app I am developing has many of the same requirements and is being built in AngularJS. In terms of future-proofing, there are two main concerns I have found: one is hacking attempts, which require encryption and possibly one-time keys and a backend key manager; the other is support for WebSQL being dropped by the standards consortium in preference to IndexedDB. So finding an abstraction layer that can support both is important. The solution set I have come up with is fairly straightforward: offline data is loaded into the UI first, and a request goes out to the REST server if in an online state. As for resolving data conflicts in a multi-user environment, that becomes a business rule decision. My decision was to simplify the matter and not delve into data merges, but to use a microtime-stamp comparison to determine which version should be kept and pushed out to clients. When in offline mode, data is stored as a dirty write and then pushed to the server when returning to an online state.
Or use ydn-db, which I am evaluating now as it has built-in support for AWS and Google Cloud Storage.
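A bare-bones sketch of that last-write-wins, dirty-write approach (endpoint and field names are made up):

```js
// Keep whichever copy was modified most recently (timestamp comparison).
function resolveConflict(localRecord, serverRecord) {
  return localRecord.updatedAt > serverRecord.updatedAt ? localRecord : serverRecord;
}

const dirtyWrites = [];

function save(record) {
  record.updatedAt = Date.now();
  if (navigator.onLine) {
    return fetch('/api/records/' + record.id, {
      method: 'PUT',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(record),
    });
  }
  // Offline: queue the dirty write and push it when connectivity returns.
  dirtyWrites.push(record);
}

window.addEventListener('online', () => {
  dirtyWrites.splice(0).forEach(save);
});
```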
Another suggestion:
Yjs leverages an OT-like algorithm to share a wide range of supported data types, and you have the option to store the shared data in IndexedDB (so it is available for offline editing).
