I want to create a really basic CRUD (sort-of) example app, to see how things work.
I want to store items (items of a shopping-list) in an array, using functions defined in my listService.js such as addItem(item), getAllItems() and so on.
My problem comes when using the same module (listService.js) in different files: each file gets its own copy of the array in which the data is stored, and I want it to behave like a single shared ("static") array, but without making it a global variable.
listService.js looks like this:
const items = [];

function addItem(item) {
  items.push(item);
}

function getItems() {
  return items;
}

module.exports = {addItem, getItems};
and I want to use it in mainWindowScript.js and addWindowScript.js: in addWindowScript.js to add elements to the array, and in mainWindowScript.js to get the elements and put them in a table. (Later on I will implement the Observer pattern to handle adding rows to the table when needed.)
addWindowScript.js looks something like this:
const electron = require('electron');
const {ipcRenderer} = electron;
const service = require('../../service/listService.js');
const form = document.querySelector('form');
form.addEventListener('submit', submitForm);
function submitForm(e) {
  e.preventDefault();
  const item = document.querySelector("#item").value;
  service.addItem(item);
  console.log(service.getItems());
  // This correctly prints all the items I add
  // ...
}
and mainWindowScript.js like this:
const electron = require('electron');
const service = require('../../service/listService.js');
const buttonShowAll = document.querySelector("#showAllBtn");

buttonShowAll.addEventListener("click", () => {
  console.log(service.getItems());
  // This just shows an empty array, even after I add items in the add window
});
In Java, C#, or C++ I would just create a class for each of those, and in main I'd create an instance of the service and pass a reference to it to the windows. How can I do something similar here?
When I first wrote the example (from a YouTube video) I handled this by sending messages through ipcRenderer to the main module and then forwarding them to the other window, but I don't want to deal with this every time there's a signal from one window to another.
ipcRenderer.send('item:add', item);
and in main
ipcMain.on('item:add', (event, item) => {
  mainWindow.webContents.send('item:add', item);
});
So, to sum up, I want to do something like: require the module, use its functions wherever needed, and have only one instance of the object.
require the module, use the function wherever the place and have only one instance of the object.
TL;DR - no, that isn't possible.
Long version: Electron is multi-process by nature: code in the main process (the Node.js side) and code in each renderer (the Chromium browser) run in different processes. So even if you require the same module file, the object is created in each process's own memory, and each process gets a different copy. There is no way to share an object between processes except by synchronizing it via IPC communication. There are a couple of handy synchronization modules out there, or you could write your own module to do that job, like:
// module.js
if (/* main process */) {
  // set up the object
  // listen for changes from renderers, update the object,
  // then broadcast it back to the renderers
} else { /* renderer process */
  // send changes to main
  // listen for changes from main
}
but in either case you can't get away from IPC.
Related
I need to consume a Context API value within modules (not components).
In my app, when a user logs in, I store their data in a context (just to sync the data between components).
Storing this data somewhere else, rather than in the state of a Context, is not the best option for synchronizing the data between all the routes of my app.
I also have my "API" to connect with the server and make calls like "getContent", "getMessage"... Until now, I was passing a "parse callback" (just to parse the server responses) to the "API" methods.
function ComponentA() {
  const parseContent = (contentDoc) => {
    // get current user data from context (the component is a consumer)
    // parse the data
  }
  ...
  const content = await api.getContent(contentId, parseContent)...
  ...
}
function ComponentB() {
  const parseMessage = (contentDoc) => {
    // get current user data from context (the component is a consumer)
    // parse the message data...
    // if the message has "content", then parse the content (same as parseContent)
  }
  ...
  const messages = await api.getMessages(messageId, parseMessage)...
  ...
}
As you can see, this makes me duplicate code, because "parseContent" could perfectly well be reused inside "parseMessage".
Now I am trying to move the "parser" methods to other modules, but some of these methods need to consume the current user data (which lives in a context), which makes my idea impossible to implement as-is.
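For illustration, the extraction I have in mind would look something like this (a sketch with invented names; in a real component the `user` object would come from `useContext`, and the parsers mirror the ones above):

```javascript
// parsers.js -- hypothetical module: pure functions that receive the user
// explicitly as an argument instead of reading it from React context.
function parseContent(user, contentDoc) {
  // illustrative parsing: tag the content with the current user's name
  return { ...contentDoc, owner: user.name };
}

function parseMessage(user, messageDoc) {
  const parsed = { ...messageDoc, reader: user.name };
  // reuse parseContent instead of duplicating its logic
  if (messageDoc.content) {
    parsed.content = parseContent(user, messageDoc.content);
  }
  return parsed;
}

// In a component: const user = useContext(UserContext);
// then pass it down to the module functions.
const user = { name: 'alice' };
const msg = parseMessage(user, { text: 'hi', content: { title: 't' } });
console.log(msg.content.owner); // the content was parsed via parseContent
```

Because the functions are pure, only the components touch the context; the modules just take the user data as a parameter.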
Is it common in React to pass "parse" callbacks to API methods? (Honestly, this seems like sloppy coding to me.)
Is there any way to consume the context within a module? Any ideas?
Thank you.
I'm trying to create a tool for editing files containing an object that is related to my company's business logic. I'm using Electron to do so.
I've created a JavaScript class which represents the object, handles its internals, and provides business functions on it:
class Annotation {
  constructor() {
    this._variables = []
    this._resourceGenerators = []
  }

  get variables() {
    return this._variables
  }

  get resourceGenerators() {
    return this._resourceGenerators
  }

  save(path) {
    ...
  }

  static load(path) {
    ...
  }
}
module.exports = Annotation;
I create the object in my main process, and I have an event handler which gives renderer processes access to it:
const {ipcMain} = require('electron')
const Annotation = require('./annotation.js')

// ... do Electron window stuff here ...

var annotation = new Annotation()

ipcMain.on('getAnnotation', (event, path) => {
  event.returnValue = annotation
})
I've just found out that sending an object through the synchronous IPC reply uses JSON.stringify to pass the annotation, meaning it loses its getters and functions.
I'm fairly new to web/Electron development; what is the proper way of handling this? Previously I had handlers in main for most of the functions the renderer processes needed, but main started to become very bloated, so I'm trying to refactor it somewhat.
TL;DR: reconstruct the object on the receiver side.
Description: Electron's architecture is based on multiple processes, separating the main (Node.js) process from each renderer (Chromium) process, and allowing communication between processes via its IPC mechanism. For several reasons (efficiency, performance, security, etc.) Electron's out-of-the-box IPC only allows serializable plain objects (POJOs) to be sent and received. Once the receiver has that data, you may need to reconstruct the desired object from it.
If your intention is to share references like a true singleton, that's not available.
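The reconstruction idea can be sketched by giving the class a `toJSON`/`fromJSON` pair (these method names are an addition for illustration, not part of the question's code) and simulating the IPC round trip with JSON:

```javascript
// Trimmed sketch of the Annotation class with (de)serialization helpers.
class Annotation {
  constructor() {
    this._variables = [];
  }

  get variables() {
    return this._variables;
  }

  toJSON() {
    // the plain object that survives Electron's JSON serialization
    return { variables: this._variables };
  }

  static fromJSON(pojo) {
    const a = new Annotation();
    a._variables = pojo.variables || [];
    return a;
  }
}

// Simulate the IPC round trip: main sends, renderer reconstructs.
const original = new Annotation();
original._variables.push('x');

const overTheWire = JSON.parse(JSON.stringify(original)); // loses the class
const restored = Annotation.fromJSON(overTheWire);        // getters are back
console.log(restored.variables); // the data survives, as a real Annotation
```

On the main side you would send `annotation` (Electron stringifies it, invoking `toJSON`), and in the renderer call `Annotation.fromJSON(ipcRenderer.sendSync('getAnnotation'))` to get a working instance back.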
The first thing I would suggest is that in most cases you don't need to transfer anything to the main process. The main process is mostly for creating windows and accessing Electron APIs which are restricted to the main process. Everything else can and should be done from the renderer, including access to all Node modules: you can write files, access databases, etc., all from the renderer.
Read this article about the differences between the main and renderer processes and what you should be using each for.
I'm developing a web app backed by the Firebase Realtime Database.
The app's frontend is quite complex, and there are several methods that write data to the DB. I have several utils that look like this:
var utils = {
  setSomething: function(id, item) {
    var myRef = firebase.database().ref('my/path');
    myRef.set(item).then(something);
  }
}
The question here is: is it okay to create a new ref inside the method (thereby creating a new ref with each call), or should I "cache" the ref somewhere else (just like we cache jQuery objects)?
I could do something like this first:
var cachedRefs = {
  myRef: firebase.database().ref('my/path'),
  yourRef: firebase.database().ref('your/path'),
  herRef: firebase.database().ref('her/path')
}
And then the former method could be rewritten as:
var utils = {
  setSomething: function(id, item) {
    cachedRefs.myRef.set(item).then(something);
  }
}
Is there any performance gain besides having less code repetition?
firebaser here
References just contain the location in the database. They are cheap.
Adding the first listener to a reference requires that we start synchronizing the data, so that is as expensive as the data you listen to. Adding extra listeners is then relatively cheap, since we de-duplicate the data synchronization across listeners.
// Main class
function App() {
  this.task = new Task(this); // pass the instance of this class to Task so
                              // it has access to doSomething
}

App.prototype.doSomething = function () {
  alert("I do something that Task() needs to be able to do!");
};

function Task(app) {
  // This class needs access to App()'s doSomething method
  this.appInstance = app;
  this.appInstance.doSomething(); // Great, now Task can call the method
}
var app = new App();
The aim of the code above is to give Task access to one of App's methods, doSomething. The code shows the way I currently go about it, and I'm posting this to see if it's the best way.
To give Task access I simply pass in the whole instance of App. Is this efficient, or is there a better way to go about it? Is the code above general practice for doing something like this?
Yes, what you have is fine. It is a circular dependency; however, because of JavaScript's dynamic nature there aren't really any issues.
Another way you could reference App from Task would be a Singleton pattern or something similar, but that would probably be harder to test.
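The singleton alternative mentioned above could look roughly like this (a sketch; the names and the `doSomething` body are illustrative, and `alert` is replaced by a return value so the code runs outside a browser):

```javascript
// A module-level singleton: everything that accesses App gets the same instance.
var App = (function () {
  var instance;

  function createInstance() {
    return {
      doSomething: function () {
        return "I do something that Task() needs to be able to do!";
      }
    };
  }

  return {
    getInstance: function () {
      if (!instance) instance = createInstance();
      return instance;
    }
  };
})();

function Task() {
  // no constructor argument needed; Task reaches the shared instance itself
  this.result = App.getInstance().doSomething();
}

var t = new Task();
console.log(t.result);
console.log(App.getInstance() === App.getInstance()); // always the same object
```

The trade-off, as noted, is testability: Task now depends on the global `App` rather than on whatever instance you pass in.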
jsFiddle Demo
Generally bind would be used in this scenario, assuming that the Task "class" didn't also set up other facilities not shown here.
bind allows the context to be provided for a function. This could be done in App's constructor, at which point only a function task would be required to call "someMethod".
function task() {
  return this["someMethod"]();
}

function App() {
  task.bind(this)();
}

App.prototype.someMethod = function () {
  alert("Task needed access to this");
};

var a = new App();
However, if Task must be a "class" and have other responsibilities, then the prototype function could be shared:
function Task() {}
function App() {}

App.prototype.someMethod = Task.prototype.someMethod = function () {
  alert("Task needed access to this");
};

var a = new App();
a.someMethod(); //-> "Task needed access to this"

var t = new Task();
t.someMethod(); //-> "Task needed access to this"
Your App instances and Task instances are tightly coupled. App instances have tasks, and this can be fine.
A design of loosely coupled objects is more flexible and easier to extend, but more complicated to create initially. One such pattern is using a mediator (publish/subscribe): have App raise an event or publish a message, and any other object or function can listen for it and take action on the event.
For example: your app creates an Ajax instance, and when that instance is done it raises some event (fetchedData, for example). A listener could be the domDependent.updateView function, but later you may want to add, remove, or change the order of tasks to do after data is fetched. This can all be configured in an app.init function, or per procedure in a controller that kicks off certain procedures (like log in, search, ...).
Instead of creating a whole bunch of specific functions in Ajax (fetchUserPrefs, login, search, ...) you can create one general function and have the controller add listeners, or pass the next event to trigger when fetchData completes, so the correct next function runs.
Here is some pseudo code:
var app = {
  init: function () {
    mediator.add("updateLogin", domDependent.updateView);
    mediator.add("updateLogin", app.loadUserPrefs);
    mediator.add("failLogin", domDependent.updateView);
  },
  login: function () {
    mediator.trigger("loadingSomething", {type: "login"});
    ajax.fetch({
      onComplete: "updateLogin", // what listens to updateLogin is decided in init
      onFail: "failLogin",
      loginDetails: domDependent.getLogin(),
      url: settings.loginUrl,
      type: "post"
    });
  }
};

var ajax = {
  fetch: function (data) {
    data = data || {};
    // simple check for onComplete, it's mandatory
    var complete = data.onComplete || app.raiseError("ajax.fetch needs onComplete");
    // ... other code to validate data and make the ajax request ...
    var onSuccess = function (resp) {
      // Mutate the data object, as the mediator will pass it to whatever
      // other function is called next. You don't hard-code
      // domDependent.updateView and app.loadUserPrefs here because fetch
      // is used generally and success may have to do completely different
      // things after it's done; you want to define procedures in init,
      // not all over your code.
      data.response = resp;
      // trigger the event to do whatever needs to be done next
      mediator.trigger(complete, data);
    };
  }
};
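The `mediator.add`/`mediator.trigger` calls above are never defined in the pseudo code; a minimal version (a sketch, not a full pub/sub library) could look like:

```javascript
var mediator = {
  channels: {},
  // register a listener for a named channel
  add: function (channel, fn) {
    (this.channels[channel] = this.channels[channel] || []).push(fn);
  },
  // call every listener on the channel, in registration order
  trigger: function (channel, data) {
    (this.channels[channel] || []).forEach(function (fn) {
      fn(data);
    });
  }
};

// usage mirroring the pseudo code above
var log = [];
mediator.add("updateLogin", function (data) { log.push("view:" + data.user); });
mediator.add("updateLogin", function (data) { log.push("prefs:" + data.user); });
mediator.trigger("updateLogin", { user: "bob" });
console.log(log); // both listeners ran, in the order they were added
```

A real implementation would also want a `remove` method and error isolation between listeners, but this is enough to run the procedure wiring shown in `app.init`.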
As you can see, it gets complicated and maybe doesn't look like code you're used to, but it's highly configurable.
I may have misunderstood the advantages of the mediator pattern for loose coupling, and if so, please comment. I use it to:
- Make methods more general instead of copying a lot of logic only because what to do after it's done is different. In fetch, the ajax object just fetches; this would be the same for login or getting user preferences. The only thing different is what function to call next (or on error) when it's done.
- A procedure like login involves multiple functions in multiple objects. If this function chain hard-codes what to do next once a particular function is done, your login procedure ends up defined all over your code. When defining it in init/config you can easily change the order or add/remove functions in the chain.
I'm writing an app using Express. I have my views in the usual /views folder. They cover 90% of my customers' needs, but sometimes I have to override one view or another to add custom-tailored features. I really wonder whether I can build a folder structure like:
*{ ...other expressjs files and folders...}*
/views
view1.jade
view2.jade
view3.jade
/customerA
view2.jade
/customerB
view3.jade
What I'd like is to override the behaviour of expressjs' response.render() function to apply the following algorithm:
1. a customer requests a view
2. if /{customer_folder}/{view_name}.jade exists, than
render /{customer_folder}/{view_name}.jade
else
render /views/{view_name}.jade
Thus, for customerA, response.render('view1') will refer to /views/view1.jade, while response.render('view2') will refer to /customerA/view2.jade.
(Those who use Appcelerator's Titanium may find this familiar.)
I'd like an elegant way to implement this behavior without the hassle of modifying Express's core functionality, which could bite me when upgrading the framework. I guess it's a common problem, but I can't find any article on the web.
I would create a custom View class:
var express = require('express');
var app = express();
var View = app.get('view');

var MyView = function(name, options) {
  View.call(this, name, options);
};
MyView.prototype = Object.create(View.prototype);

MyView.prototype.lookup = function(path) {
  // `path` contains the template name to look up, so here you can perform
  // your customer-specific lookups and change `path` so that it points to
  // the correct file for the customer...
  ...
  // when done, just call the original lookup method.
  return View.prototype.lookup.call(this, path);
};

app.set('view', MyView);
You can hook the response object's render method.
Here's some code off the top of my head, to be used as middleware:

app.use(function (req, res, next) {
  var backup = res.render;
  res.render = function () {
    // Do your thing with the arguments array, maybe use environment variables.
    // Function.prototype.apply calls a function in the context of argument 1,
    // with argument 2 being the argument array for the actual call.
    backup.apply(res, arguments);
  };
  next();
});