Share Firestore collection paths across admin and web - JavaScript

I'd like to make re-usable functions that get the Firestore Document/Collection reference across web and admin (node.js).
for example:
function getUserDocumentReference(company: string, user: string) {
  return firebase.firestore()
    .collection("companies")
    .doc(company)
    .collection("users")
    .doc(user);
}
This will reduce errors and coordinate changes across both environments.
Problem: Admin imports firestore from firebase-admin, and web imports from firebase.
I've tried making a class/function where I pass in my Firestore reference, but it becomes a pain because I have to declare the return types:
const ref = (
  getUserDocumentReference("a", "1") as firebase.firestore.DocumentReference
).withConverter(converter)
Is there a smarter/cleaner way to do this without re-inventing the wheel (i.e. somehow passing an array or re-creating paths in a complex way)?
my current approach:
class FirestoreReferences {
  private firestore: firebase.firestore.Firestore | admin.firestore.Firestore;

  constructor(firestore: firebase.firestore.Firestore | admin.firestore.Firestore) {
    this.firestore = firestore;
  }

  getUserDocumentReference(company: string, user: string):
      FirebaseFirestore.DocumentReference | firebase.firestore.DocumentReference {
    return this.firestore.collection(...).doc(...);
  }
}

Just found out about Typesaurus which provides generic types to share across web/admin!

The simplest answer: DO NOT share the .doc() reference itself. Use its .path property - which is a string with the FULL PATH to the document. Save/share it as let refPath = whatever.doc().path and re-build it as .doc(refPath) in either environment.
I DO NOT actually RECOMMEND this - it exposes your internal structure - but it isn't inherently insecure (your security rules had better be taking care of that).
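For illustration, a minimal sketch of that path-string approach (the module and function names here are hypothetical): because the builder returns plain strings, it needs no import from either SDK, and each environment resolves the path with its own .doc().

// paths.ts - shared by both builds; depends on neither SDK (hypothetical module).
export function userDocPath(company: string, user: string): string {
  return `companies/${company}/users/${user}`;
}

// web:   firebase.firestore().doc(userDocPath("a", "1")).withConverter(converter)
// admin: admin.firestore().doc(userDocPath("a", "1"))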
btw, I'm building an entire wrapper npm package (@leaddreamer/firebase-wrapper) for this specific purpose.

You should not do this. The Admin SDK is meant for server-side usage because it has full control over your entire project. If a user gets access to it, they have control over your app. Keep firebase and firebase-admin separate.


Clean Architecture in NodeJS, how do useCases interact with each other?

I am trying to implement the Clean Architecture by Bob Martin in my project and I have a question.
How do use-cases interact with each other?
For example:
I have a Department entity and Employee entity.
Department entity has a peopleCount field
Whenever a new Employee is created it is also assigned to a Department, which means that peopleCount must increase by 1.
So how should that interaction between say addEmployee.js and editDepartment.js use-cases be?
Do I const editDepartment = require("../departments"); within my addEmployee.js and use it within addEmployee.js?
Do I inject it as a dependency and then use it?
Do I create a separate use-case increasePeopleCountInDepartmentById.js and require/inject that one? So that it's something with a specific purpose and not the "general" editing.
How do use-cases interact with each other?
A use-case is a scenario in which a system receives an external request (such as user input) and, following a list of actions, responds to it (Wikipedia). Therefore, use-cases by definition cannot interact with each other. Moreover, they have no interest in interacting with each other.
A use-case, be it addEmployee or editDepartment (depending on your system design), should orchestrate participating domain entities (employee and department). Again, mixing use-cases is irrelevant.
Here's how you can implement addEmployee:
// TODO: start database transaction
const newEmployee = employeeFactory.create(id, name, age, targetDepartmentId);
const department = departmentRepository.get(targetDepartmentId);
department.peopleCount = department.peopleCount + 1;
departmentRepository.save(department);
employeeRepository.add(newEmployee);
// TODO: commit transaction
Do I inject it as a dependency and then use it?
As can be inferred from my example, three objects are to be injected into use-case: employeeFactory, departmentRepository, employeeRepository.
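To make the injection concrete, here is a rough sketch of the wiring (the factory/repository shapes are assumptions, not a prescribed API):

// addEmployee.js - a sketch; collaborators are injected at the composition root.
function makeAddEmployee({ employeeFactory, departmentRepository, employeeRepository }) {
  return async function addEmployee(id, name, age, targetDepartmentId) {
    // TODO: start database transaction
    const newEmployee = employeeFactory.create(id, name, age, targetDepartmentId);
    const department = await departmentRepository.get(targetDepartmentId);
    department.peopleCount = department.peopleCount + 1;
    await departmentRepository.save(department);
    await employeeRepository.add(newEmployee);
    // TODO: commit transaction
    return newEmployee;
  };
}

// Composition root:
// const addEmployee = makeAddEmployee({ employeeFactory, departmentRepository, employeeRepository });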

Is there any way to access `__filename` in Next.js?

I'm working on a custom i18n module and would love to replace this code (this is an "about-us" page):
const messages = (await import(`./about-us.${locale}.json`))
.default as Messages;
By
const messages = (
await import(`./${__filename.replace('.tsx', `.${locale}.json`)}`)
).default as Messages;
Unfortunately __filename resolves to /index.js (I guess because of Webpack?) - is there any way to achieve what I am trying to do in my example, or would this need to be built into Next.js directly to work?
Refactor this so consumers don't know about the filesystem
Spoiler: I'm not going to tell you how to access __filename with Next.js; I don't know anything about that.
Here's a pattern that is better than what you propose, and which evades the problem entirely.
First, setup: it sounds like you've got a folder filled with these JSON files. I imagine this:
l10n/
about-us.en-US.json
about-us.fr-FR.json
contact-us.en-US.json
contact-us.fr-FR.json
... <package>.<locale>.json
That file organization is nice, but it's a mistake to make every would-be l10n consumer know about it.
What if you change the naming scheme later? Are you going to hand-edit every file that imports localized text? Why would you treat Future-You so poorly?
If a particular locale file doesn't exist, would you prefer the app crash, or just fall back to some other language?¹
It's better to create a function that takes packageName and localeCode as arguments, and returns the desired content. That function then becomes the only part of the app that has to know about filenames, fallback logic, etc.
// l10n/index.js
import FS from 'fs'

export default function getLang( packageName, localeCode ) {
  let contentPath = `${packageName}.${localeCode}.json`
  // TODO: fallback logic
  return JSON.parse(FS.readFileSync(contentPath, 'utf8'))
}
It is a complex job to locate and read the desired data while also ensuring that no request ever gets an empty payload and that every text key resolves to a value. Dynamic import + a sane filesystem layout is a good start (:applause:), but that combination is not nearly robust enough on its own.
At a previous job, we built an entire microservice just to do this one thing. (We also built a separate service for obtaining translations, and a few private npm packages to allow webapps to request and use language packs from our CMS.) You don't have to take it that far, but it hopefully illustrates that the problem space is not tiny.
¹ Fallback logic: e.g. en-UK & en-US are usually interchangeable; some clusters of Romance languages might be acceptable in an emergency (Spanish/Portuguese/Brazilian come to mind); also Germanic languages, etc. What works and doesn't depends on the content and context, but no version of fallback will fit into a dynamic import.
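To make that TODO concrete, one rough shape the fallback could take (the candidate chain and the l10n/ layout are assumptions):

// Sketch: try the exact locale, then its base language, then a default.
import FS from 'fs'
import Path from 'path'

const DEFAULT_LOCALE = 'en-US'

function resolveContentPath(packageName, localeCode) {
  const candidates = [localeCode, localeCode.split('-')[0], DEFAULT_LOCALE]
  for (const code of candidates) {
    const contentPath = Path.join('l10n', `${packageName}.${code}.json`)
    if (FS.existsSync(contentPath)) return contentPath
  }
  throw new Error(`no language pack found for ${packageName}`)
}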
You can access __filename in getStaticProps and getServerSideProps if that helps?
I pass __filename to a function that needs it (which has a local API fetch in it), before returning the results to the render.
export async function getStaticProps(context) {
  return {
    props: {
      html: await getData({ page: __filename })
    }, // will be passed to the page component as props
  };
}
After a long search, the solution was to write a useMessages hook that would be "injected" with the correct strings using a custom Babel plugin.
A Webpack loader didn't seem like the right option as the loader only has access to the content of the file it loads. By using Babel, we have a lot more options to inject code into the final compiled version.

How do I access an object from the main process from a render process [Electron]

I'm trying to create a tool for editing files containing an object that is related to my company's business logic. I'm using Electron to do so.
I've created a JavaScript class which represents the object, handles its internals, and provides business functions on it:
class Annotation {
  constructor() {
    this._variables = []
    this._resourceGenerators = []
  }
  get variables() {
    return this._variables
  }
  get resourceGenerators() {
    return this._resourceGenerators
  }
  save(path) {
    ...
  }
  static load(path) {
    ...
  }
}
module.exports = Annotation;
I create the object in my main process, and I have an event handler which gives render processes access to it:
const {ipcMain} = require('electron')
const Annotation = require('./annotation.js');

// ... Do electron window stuff here ...

var annotation = new Annotation()
ipcMain.on('getAnnotation', (event, path) => {
  event.returnValue = annotation
})
I've just found out that sending an object through ipcRenderer.sendSync uses JSON.stringify to pass the annotation, meaning it loses the getters/functions on it.
I'm fairly new to web/electron development; what is the proper way of handling this? Previously I had handlers in main for dealing with most of the functions that the render processes needed, but main started to become very bloated, so I'm trying to refactor it somewhat.
TL;DR: RECONSTRUCT THE OBJECT ON THE RECEIVER SIDE.
Description: Electron's architecture is based on multiple processes, separating the main (Node.js) process from each renderer (Chromium) process and allowing them to communicate via the IPC mechanism. For several reasons (efficiency, performance, security, etc.) Electron's out-of-the-box IPC only allows serializable POJOs to be sent/received. Once the receiver has that data, you may need to reconstruct the desired object from it.
If your intention is to share references like a true singleton, that's not available.
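As a sketch of that reconstruction (the fromJSON helper is an illustrative addition, not part of the asker's class):

// annotation.js - rebuild a real instance from the POJO the IPC layer delivers.
class Annotation {
  constructor() {
    this._variables = []
    this._resourceGenerators = []
  }
  // IPC serialization keeps own enumerable fields like _variables, so a
  // static factory can restore a full instance on the receiver side.
  static fromJSON(data) {
    const annotation = new Annotation()
    annotation._variables = data._variables
    annotation._resourceGenerators = data._resourceGenerators
    return annotation
  }
}
module.exports = Annotation;

// In the renderer:
// const annotation = Annotation.fromJSON(ipcRenderer.sendSync('getAnnotation', path))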
The first thing I would suggest is that in most cases, you don't need to transfer anything to the main process. The main process is mostly for creating windows and accessing Electron APIs that are restricted to the main process. Everything else should and can be done from the renderer, including access to all node modules. You can write files, access databases, etc. all from the renderer.
Read this article about the differences between the main and renderer processes and what you should be using each for.

What is the way to create unit tests for Sails.js (or another framework)

Well, I embraced test-driven development in the past year while learning C# (those seem to go hand in hand). In JavaScript, however, I am struggling to find a good workflow for TDD. This is mainly due to the combination of many frameworks which seemingly treat testing as a second-class citizen.
As an example, consider a Worker class. This class would have some functionality to act upon a database. So how would I write unit tests for the functionality of this class?
In C# (and the rest of the C/Java family) I'd write this class in such a way that the constructor takes a database-connection parameter. Then during test runs the object is constructed with a mock database-connection object instead of the real one. Thus no modification of the source.
In Python a similar approach can be used; however, apart from providing a mock object to the constructor to handle HAS_A dependencies, we can also use dependency injection to mock IS_A dependencies.
Now apply this to JavaScript, and Sails.js in particular (though a similar problem occurs with Sencha and other frameworks). It seems that the code is so tightly coupled to the library/framework that I can't create manual stubs/mocks - other than by using a pre-run task to modify the source/config.js.
In Sails an object (say worker, a controller) has to reside in a specific folder to work, and it "connects" automatically to the database, without me providing any notion of a database object (thus preventing me from supplying my own object).
Say I have a database with a table "Students"; then a controller would look something like this (with Students being a model defined in api/models):
const request = require('request');

module.exports = {
  updateData: function (req, res) {
    let idx = req.params.jobNumber;
    Students.find({ Nr: idx })
      .exec(function (err, result) {
        //....
      });
  },
};
So how would I load the above function into a (Mocha) test? And how would I decouple the database (used implicitly by Sails) so that I can mock the functionality? And what should I actually mock?
I of course don't wish to do integration tests, so I shouldn't build a "development database": I don't wish to test the connection, I wish to test the controller functions.
In the documentation, they provide a nice quick example of how to set up testing using Mocha: https://sailsjs.com/documentation/concepts/testing
In the bootstrap.test.js file, all they're doing is lifting and lowering your application with Sails, just so your application has access to controllers/models/etc. within its test environment. They also show how to test individual controllers, which is essentially just making requests that hit the endpoints to fire off the controllers' actions.
To avoid testing the full lifecycle of a request, you can instead require the controller file within a *.test.js file and test any exported action directly (sketched below). Remember, though, that Sails builds the request and response objects that get passed to the controllers. So if you want all of the correct data and valid objects, it's best to let Sails handle that and just make a request to the endpoint, unless you know exactly how to build the request and response objects yourself.
But that's the point of a framework: you use it as intended, and test against/with it rather than against your own version of how it might work. TDD exists in all languages and frameworks; you just need to fit it to your technology.
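For example, a minimal sketch of requiring a controller directly (assuming bootstrap.test.js has lifted Sails so globals like Students exist, and assuming the action eventually calls res.ok() or res.serverError() - the req/res stubs here are assumptions):

// test/controllers/StudentController.test.js - a sketch, not a prescribed setup.
const StudentController = require('../../api/controllers/StudentController');

describe('StudentController.updateData', function () {
  it('runs the action without a server error', function (done) {
    const req = { params: { jobNumber: 42 } };
    const res = {
      ok: () => done(),
      serverError: (err) => done(err),
    };
    StudentController.updateData(req, res);
  });
});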
If you don't want to use a database for your test environment, you can tell it to use the sails-disk adapter by creating an environment file under config/env/ for the test environment and forcing that environment to use sails-disk.
For example...
config/env/test.js --> test environment file
module.exports = {
  models: {
    connection: 'localDiskDb',
    migrate: 'drop',
  },
  port: 1337,
  host: '127.0.0.1',
};
In config/connections.js, add the below to the connections object (if not already there)...
localDiskDb: {
  adapter: 'sails-disk'
},
And finally, we need to tell it to use that environment when running the tests. Modify test/bootstrap.test.js like the following...
var sails = require('sails');

before(function(done) {
  // Increase the Mocha timeout so that Sails has enough time to lift.
  this.timeout(10000);
  // Set environment to testing
  process.env.NODE_ENV = 'test';
  sails.lift({
    // configuration for testing purposes
  }, function(err) {
    if (err) {
      return done(err);
    }
    //...
    done(null, sails);
  });
});

after(function(done) {
  // Here you can clear fixtures, etc.
  // This will "refresh" the memory store so you
  // have a clean test datastore every time you run tests.
  sails.once('hook:orm:reloaded', () => {
    sails.lower((err) => {
      done(err);
      process.exit(err ? 1 : 0);
    });
  });
  sails.emit('hook:orm:reload');
});
Adding Jason's suggestion in an "answer" format, so that others may find it more easily.
sails-mock-models allows simple mocking for Sails model queries, based on sinon.
Mock any of the standard query methods (e.g. 'find', 'count', 'update'). They will be called with no side effects.
I haven't actually tried it yet (just found this question), but I'll edit this if/when I have any problems.
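If you'd rather stub by hand with plain sinon, something along these lines might work (the { exec } shape mimics Waterline's chainable deferred and is an assumption):

const sinon = require('sinon');

// Make Students.find return canned data with no database involved.
const fakeStudents = [{ Nr: 42, name: 'Ada' }];
const findStub = sinon.stub(Students, 'find').returns({
  exec: (cb) => cb(null, fakeStudents),
});

// ...exercise the controller action under test...

findStub.restore();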
Sails unit testing is explained well in the following blog post:
https://www.packtpub.com/books/content/how-add-unit-tests-sails-framework-application
Please refer to it.

Sails.js - Postgresql Adapter multiple schemas

I've been searching a lot about Sails.js multi-tenancy capabilities, and I know that such a feature is not yet implemented. My initial idea was to build a multi-tenant app by creating one database per tenant.
Since I realized that I can't do such a thing in Sails.js yet, I tried a different approach: creating only one database (Postgres) but with lots of schemas, each one representing a tenant. My problem is that I can't work out (and don't even know if it is possible with the Sails/Postgres adapter) how to dynamically (at runtime) define which schema a given object should query against, based on the logged-in user.
Has anyone faced a problem like this? How can I proceed?
Sorry for my English, and thanks.
In my experience, adding it in the model does not work.
The only thing that worked for me was using the meta call to specify the schema.
await Users.create(newUser).meta({ schemaName: 'admin' });
A bit cumbersome, but it is working.
Hope this helps someone.
I think this is an issue with the Waterline sequel adapter, based on this answer.
The way to do it is to add a property in the model:
meta: {
  schemaName: 'schema'
},
but it is not working: you can't define multiple schemas; it only takes the user as a schema. And if the schema property is set to true in config/models.js, defining a schema for every table does not work either.
The clue is inside the sails-postgresql adapter code - several of its helpers include this bit:
var schemaName = 'public';
if (inputs.meta && inputs.meta.schemaName) {
  schemaName = inputs.meta.schemaName;
} else if (inputs.datastore.config && inputs.datastore.config.schemaName) {
  schemaName = inputs.datastore.config.schemaName;
}
So indeed the driver looks for a schema named public by default, unless a different value is provided via calls to meta() as described above, OR the schema name is configured application-wide.
To configure the schema name for all models, a schemaName property needs to be included in the configuration of the postgresql datastore, which lives in config/datastores.js:
...
default: {
  adapter: 'sails-postgresql',
  url: 'postgresql://username:password@localhost:5432/your_database_name',
  schemaName: 'your_schema_name_here'
}
Once this is in place, you don't have to append meta({ schemaName: 'blah'}) to any of the queries. I struggled with this for a couple of days and have finally solved it in this manner.
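And for the original multi-tenant question - choosing the schema at runtime from the logged-in user - a rough sketch that derives the meta() argument per request (where the session keeps tenantSchema is purely an assumption):

// Sketch: request-scoped schema selection (Sails 1.x style).
function tenantMeta(req) {
  return { schemaName: req.session.user.tenantSchema };
}

module.exports = {
  create: async function (req, res) {
    const newUser = await Users.create(req.body)
      .meta(tenantMeta(req))
      .fetch();
    return res.json(newUser);
  },
};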
