I'm a newbie looking for a way to share a variable among files (modules) by reference. For example, here is my app.js:
const users = {},
      waitingQueue = []

module.exports = function (io) {
    io.on('connection', (socket) => {
        users[socket.id] = socket
        socket.on("Search", (d, ack) => {
            waitingQueue.push(socket)
            // logic with waitingQueue
            // logic with users {}
        })
        socket.on("otherEvent", (d, ack) => {
            // logic with waitingQueue
            // logic with users {}
        })
    })
}
Now I'd like to divide it into modules. The new app.js:
const users = {},
      waitingQueue = []

const Search = require('./search')

module.exports = function (io) {
    io.on('connection', (socket) => {
        users[socket.id] = socket
        socket.on("Search", Search(users, waitingQueue))
    })
}
Now my trouble is that in

socket.on("Search", ...)

the ... should be a function. If I use a closure,

socket.on("Search", () => ...)

and export the modified {users, waitingQueue} from Search.js, that part is OK, but I then need to override the users & waitingQueue variables, and that is not working.
I need to share these variables with other modules too, tightly coupled. I also tried an event-emitter-based approach and Object.observe(), but I was unable to fix the problem. Can anyone help me with this?
I would use Redis to store the shared values.
You can use the redis client to get/set the values; they will be stored in the Redis database.
Those values can then be get/set from any code in any process on the same local machine (as the db) or on a set of remote machines (that's why I mention scaling later on...).
Redis can be benchmarked and, in my experience, compared to other db solutions it is amazingly fast.
It's quite simple to use: An introduction to Redis data types and abstractions
Binary-safe strings.
Lists: collections of string elements sorted according to the order of insertion.
Sets: collections of unique, unsorted string elements.
Sorted sets: similar to Sets.
Hashes: maps composed of fields associated with values. Both the field and the value are strings. This is very similar to Ruby or Python hashes.
...and some more...
I suggest you install the database server and client and try to get and set some values; I'm sure it will be useful knowledge for the future.
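For instance, a minimal sketch using the redis npm client's callback-style API (the key name and the JSON encoding are just examples):

const redis = require('redis')
const client = redis.createClient()

// Store the shared value as a JSON string...
client.set('users', JSON.stringify({ someId: { name: 'a user' } }))

// ...and read it back from any process connected to the same Redis server.
client.get('users', (err, value) => {
    if (err) throw err
    const users = JSON.parse(value)
    // logic with users
})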
Extra reasons:
It will give you even more power in your arsenal.
It's really good for scaling purposes (both vertical & horizontal scaling).
socket.io-redis (GitHub)
By running socket.io with the socket.io-redis adapter you can run multiple socket.io instances in different processes or servers that can all broadcast and emit events to and from each other.
socket.io-emitter (GitHub)
If you need to emit events to socket.io instances from a non-socket.io process.
EDIT: I know you want to share variables between your code, but since I saw you are using socket.io, you might also want to share those variables across your slaves...
Socket.IO : passing events between nodes
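For example, a minimal sketch of plugging the adapter in (the port and the Redis host/port are assumptions for a local setup):

// Attach the socket.io-redis adapter so multiple socket.io instances
// can broadcast and emit events to each other through Redis.
const io = require('socket.io')(3000)
const redisAdapter = require('socket.io-redis')
io.adapter(redisAdapter({ host: 'localhost', port: 6379 }))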
Well, my presumption is that in a real-world application users & waitingQueue would come from some sort of persistent storage, e.g. a database, so the notion of "sharing" data across modules would be a non-issue as you'd most likely fetch directly from the DB.
However, if I did have in-memory "global" data that I wanted to share across different modules, then I'd most likely move it into its own module, e.g.
data.js
module.exports = {
    users: {},
    waitingQueue: []
}
app.js
const Search = require('./search');
const data = require('./data');
module.exports = io => {
    io.on('connection', (socket) => {
        data.users[socket.id] = socket;
        socket.on('Search', Search(data.users, data.waitingQueue));
    });
}
As its own module, data.js can then be shared across various other modules.
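For example, a hypothetical stats.js (the module and function names are made up) could read the same data:

// stats.js - any module that requires data.js receives the same cached
// object, so mutations made in app.js are visible here too.
const data = require('./data');

module.exports = function onlineCount() {
    return Object.keys(data.users).length;
};

This works because Node caches a module the first time it is required, so every consumer shares one instance, as long as you mutate the object rather than reassign it.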
I'd like to make re-usable functions that get a Firestore Document/Collection reference across web and admin (Node.js). For example:
getUserDocumentReference(company: string, user: string) {
    return firebase.collection("companies")
        .doc(company)
        .collection("users")
        .doc(user);
}
This will reduce errors and coordinate changes across both environments.
Problem: Admin imports firestore from firebase-admin, and web imports from firebase.
I've tried making some class/function where I pass in my firestore reference, but it becomes a pain where I have to declare the return types:
const ref = (
    getUserDocumentReference("a", "1") as firebase.firestore.DocumentReference
).withConverter(converter)
Is there a smarter/cleaner way to do this without re-inventing the wheel (i.e. somehow passing an array or re-creating paths in a complex way)?
My current approach:
class FirestoreReferences {
    private firestore: firebase.firestore.Firestore | admin.firestore.Firestore;

    constructor(firestore: firebase.firestore.Firestore
        | admin.firestore.Firestore) {
        this.firestore = firestore;
    }

    getUserDocumentReference(company: string, user: string):
        FirebaseFirestore.DocumentReference | firebase.firestore.DocumentReference {
        return this.firestore.collection(...).doc(...);
    }
}
Just found out about Typesaurus, which provides generic types to share across web/admin!
The simplest answer: DO NOT share the .doc() reference itself. Share the document's .path - a string with the FULL PATH to the document. Save/share it as let refPath = whatever.doc().path and re-build it as .doc(refPath) in either environment.
I DO NOT actually RECOMMEND this - it exposes your internal structure - but it isn't inherently insecure (your security rules better be taking care of that).
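A hedged sketch of the round trip (the collection and document names are made up):

// In one environment: derive the shareable path string.
const refPath = db.collection("companies").doc("acme")
    .collection("users").doc("u1").path; // "companies/acme/users/u1"

// In the other environment (web or admin): rebuild the reference.
const ref = otherDb.doc(refPath);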
btw, I'm building an entire wrapper npm package (@leaddreamer/firebase-wrapper) for this specific purpose.
You should not do this. The Admin SDK is meant for server-side usage because it has full control over your entire project. If a user gets access to this, they have control over your app. Keep firebase and firebase-admin separate.
I cannot find clear information on how to manage database connections (MongoDB in my case) from an Azure Function written in JavaScript.
The Microsoft document below says not to create a new connection on each invocation of the function, but instead to hold it in a static variable, illustrated in C# with the .NET Framework Data Provider for SQL Server, where pooling is handled by the client connection. It does not describe how to do this in JavaScript.
https://learn.microsoft.com/en-us/azure/azure-functions/manage-connections
A solution of creating a global variable to hold the database client between invocations is described here, but the author is not confident this is the correct way to do it.
http://thecodebarbarian.com/getting-started-with-azure-functions-and-mongodb.html
Has anyone used this in production, or does anyone know whether this is the correct approach?
Yes, there's a very close equivalence between C#/SQL storing a single SqlConnection instance in a static variable and JS/MongoDB storing a single Db instance in a global variable. The basic pattern for JS/MongoDB in Azure Functions is (assuming you're up to date for async/await - alternatively you can use callbacks as per your linked article):
// getDb.js
const { MongoClient } = require('mongodb');
const uri = process.env.MONGODB_URI; // connection string, e.g. from app settings

let dbInstance;

module.exports = async function() {
    if (!dbInstance) {
        // With driver v3+, connect() resolves to a MongoClient; grab the Db from it
        const client = await MongoClient.connect(uri);
        dbInstance = client.db();
    }
    return dbInstance;
};
// function.js
const getDb = require('./getDb.js');

module.exports = async function(context, trigger) {
    let db = await getDb();
    // ... do stuff with db ..
};
This will mean you only instantiate one Db object per host instance. Note this isn't one per Function App - if you're using a dedicated App Service Plan then there will be the number of instances you've specified in the plan, and if you're using a Consumption Plan then it'll vary depending on how busy your app is.
I'm trying to create a tool for editing files containing an object related to my company's business logic. I'm using Electron to do so.
I've created a JavaScript class which represents the object, handles its internals, and provides business functions on it:
class Annotation {
    constructor() {
        this._variables = []
        this._resourceGenerators = []
    }

    get variables() {
        return this._variables
    }

    get resourceGenerators() {
        return this._resourceGenerators
    }

    save(path) {
        ...
    }

    static load(path) {
        ...
    }
}

module.exports = Annotation;
I create the object in my main process, and I have an event handler which gives renderer processes access to it:
const {ipcMain} = require('electron')
const Annotation = require('./annotation.js');
... Do electron window stuff here ...
var annotation = new Annotation()
ipcMain.on('getAnnotation', (event, path) => {
    event.returnValue = annotation
})
I've just found out that sending an object through ipcRenderer.sendSync uses JSON.stringify to pass the annotation, meaning it loses its getters/functions.
I'm fairly new to web/Electron development; what is the proper way of handling this? Previously I had handlers in main for dealing with most of the functions that the renderer processes needed, but main started to become very bloated, so I'm trying to refactor it somewhat.
TL;DR: RECONSTRUCT THE OBJECT ON THE RECEIVER SIDE.
Description: Electron's architecture is multi-process by design, separating the main (Node.js) process from each renderer (Chromium) process and allowing them to communicate via its IPC mechanism. For several reasons (efficiency, performance, security, etc.) Electron's out-of-the-box IPC only allows serializable POJOs to be sent / received. Once the receiver has that data, you may need to reconstruct the desired object from it.
If your intention is to share a reference like a true singleton, that's not available.
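For example, a hedged sketch of the receiver side (this assumes the underscore-prefixed fields survive serialization, as they do for the Annotation class above):

// renderer.js - the IPC payload is a plain object, so rebuild a real
// Annotation from it to get the getters and methods back.
const { ipcRenderer } = require('electron')
const Annotation = require('./annotation.js')

const plain = ipcRenderer.sendSync('getAnnotation') // POJO: getters/methods are gone
const annotation = Object.assign(new Annotation(), plain) // copy data onto a real instance
annotation.variables // getter works again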
The first thing I would suggest is that in most cases, you don't need to transfer anything to the main process. The main process is mostly for creating windows and accessing Electron APIs which are restricted to the main process. Everything else can and should be done from the renderer, including access to all node modules. You can write files, access databases, etc. all from the renderer.
Read this article about the differences between the main and renderer processes and what you should be using each for.
I am creating a caching module for Node.js that needs to utilize multiple CPUs using subprocesses.
I would like to store data in an index in the master process or, preferably, in a subprocess, like so:
var index = { key1: 21, key2: 22, key3: 55 }
and another process should be able to search in that index as efficiently as:
if('key2' in index) // do stuff
I assume using IPC would be significantly slower than real shared objects. Is this even possible? Thanks.
You might want to try mmap-object. It coordinates shared memory based on a memory-mapped file, which means the data is persistent as well as shared among processes. I wrote it to avoid both the performance and management overhead of redis or a similar solution.
For your application the read-write version may be what you want.
I would use redis in your case. You can configure how frequently redis stores its in-memory data on disk, based on how many entries are saved or simply on time (http://redis.io/topics/persistence).
Another option might be to send requests between instances so that the data is kept in memory only on the master.
When a non-master instance wants to save or load data, it will request the master.
Here is some pseudocode:
var SharedMemory = {
    storeObject: function(args, done) {
        if (IAmMaster()) {
            storeObjectToMemory(args);
            done();
        } else {
            sendRequestToMasterForSave(args, done);
        }
    },
    getObject: function(args, done) {
        if (IAmMaster()) {
            done(getObjectFromMemory(args));
        } else {
            getResponseFromMasterForLoad(args, done);
        }
    }
}
However, this is probably gonna be a painful process
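As a rough illustration, a runnable sketch of that master-mediated approach using Node's built-in cluster IPC (all helper names and keys are made up):

const cluster = require('cluster');

if (cluster.isMaster) {
    const index = {}; // lives only in the master process
    const worker = cluster.fork();
    worker.on('message', (msg) => {
        if (msg.op === 'set') index[msg.key] = msg.value;
        if (msg.op === 'get') worker.send({ key: msg.key, value: index[msg.key] });
    });
} else {
    process.send({ op: 'set', key: 'key2', value: 22 });
    process.send({ op: 'get', key: 'key2' });
    process.on('message', (msg) => console.log(msg.key, msg.value)); // key2 22
}

Note that every lookup costs a message round trip, which is exactly the IPC overhead the question worries about; redis or shared memory avoids you having to layer this yourself.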
Can't find any docs or posts for this, which may indicate I'm trying to do something wrong.
Is it possible to use a Mongoose schema that is entirely virtual, i.e. not persisted to the db?
I have a number of models, most of which are persisted to the db, but I would like to consistently include models that are only retained in memory, not persisted.
The closest I can come up with is along these lines, but it will still persist objects with only an id attribute in the database. Simplified here:
// access_token.js
var mongoose = require('mongoose');

var schema = mongoose.Schema({});

schema.virtual('token').get(function() {
    return 'abcde12345';
});

module.exports = mongoose.model('AccessToken', schema);
The idea in doing this is to abstract models so that the consuming part of the app does not need to be aware of whether a model is persisted to the database or only held in memory. Of course this could be achieved by creating the same object and methods as a plain object, but that approach would quickly become repetitive.
You could override (monkey patch) the Mongoose methods which save data (e.g. .save) but I suspect what you are trying to do is difficult/impossible.
You could take a look at sift.js, which is a query library to do in-memory querying.
https://github.com/crcn/sift.js
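For instance, a hedged sketch of the kind of in-memory querying sift.js enables (depending on the sift version, the import may be require('sift') or require('sift').default):

const sift = require('sift');

const tokens = [
    { token: 'abcde12345', expired: false },
    { token: 'zyxwv98765', expired: true }
];

// sift(query) returns a predicate usable with Array.prototype.filter
const active = tokens.filter(sift({ expired: false }));
console.log(active); // [ { token: 'abcde12345', expired: false } ]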
You can set a pre middleware for this model which always fails.
schema.pre('save', function (next) {
    next(new Error("This can't be saved!"));
});
That way you will know when you are doing something wrong.