extend existing API with custom endpoints - javascript

I'm creating an API for multiple customers. The core endpoints like /users are used by every customer, but some endpoints rely on individual customization. So it might be that customer A wants a special /groups endpoint that no other customer will have. As a side note, each customer would also use its own database schema because of those extra features.
I personally use NestJS (Express under the hood), so the app.module currently registers all my core modules (with their own endpoints etc.):
import { Module } from '@nestjs/common';
import { UsersModule } from './users/users.module'; // core module
@Module({
  imports: [UsersModule]
})
export class AppModule {}
I think this problem is not specific to NestJS, so how would you handle it in theory?
I basically need an infrastructure that provides a base system. There are no core endpoints anymore, because each extension is unique and multiple /users implementations could exist. When developing a new feature, the core application should not be touched. Extensions should integrate themselves or get integrated on startup. The core system ships with no endpoints but gets extended by those external files.
Some ideas come to my mind
First approach:
Each extension represents a new repository. Define a path to a custom external folder holding all those extension projects. This custom directory would contain a folder groups with a groups.module:
import { Module } from '@nestjs/common';
import { GroupsController } from './groups.controller';
@Module({
  controllers: [GroupsController],
})
export class GroupsModule {}
My API could loop through that directory and try to import each module file.
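For illustration, a rough sketch of that startup scan could look like the following. It assumes the extensions have already been compiled to plain JavaScript, live in an extensions/ folder next to the compiled API, and each exposes exactly one Nest module class per <name>.module.js file; none of these names come from an existing convention.
import { Module } from '@nestjs/common';
import * as fs from 'fs';
import * as path from 'path';

// assumed location of the compiled extension modules
const EXTENSIONS_DIR = path.resolve(__dirname, 'extensions');

function loadExtensionModules(): any[] {
  if (!fs.existsSync(EXTENSIONS_DIR)) return [];
  return fs
    .readdirSync(EXTENSIONS_DIR)
    .map((dir) => path.join(EXTENSIONS_DIR, dir, `${dir}.module.js`))
    .filter((entry) => fs.existsSync(entry))
    // each file is expected to export exactly one Nest module class
    .map((entry) => Object.values(require(entry))[0]);
}

@Module({
  imports: [...loadExtensionModules()],
})
export class AppModule {}
The decorator argument is evaluated when the class is defined, so the directory scan would run once at startup.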
pros:
The custom code is kept away from the core repository
cons:
NestJS uses TypeScript, so I have to compile the code first. How would I manage the API build and the builds of the custom apps (plug-and-play system)?
The custom extensions are very loose because they just contain some TypeScript files. Since they don't have access to the API's node_modules directory, my editor will show errors because it can't resolve external package dependencies.
Some extensions might fetch data from another extension. Maybe the groups service needs to access the users service. Things might get tricky here.
Second approach:
Keep each extension inside a subfolder of the API's src folder, but add that subfolder to the .gitignore file. That way you can keep your extensions inside the API without committing them to its repository.
pros:
Your editor is able to resolve the dependencies
Before deploying your code you can run the build command and will have a single distribution
You can access other services easily (/groups needs to find a user by id)
cons:
When developing, you have to copy the extension repository's files into that subfolder. After changing something, you have to copy those files back, overwriting the repository files with the updated ones.
Third approach:
Inside an external custom folder, all extensions are fully fledged standalone APIs. Your main API would just provide the authentication stuff and could act as a proxy to redirect the incoming requests to the target API.
pros:
New extensions can be developed and tested easily
cons:
Deployment will be tricky. You will have a main API and n extension APIs, each starting its own process and listening on its own port.
The proxy system could be tricky. If the client requests /users, the proxy needs to know which extension API serves that endpoint, call that API, and forward the response back to the client (a rough sketch follows below).
To protect the extension APIs (authentication is handled by the main API), the proxy needs to share a secret with those APIs, so an extension API only accepts incoming requests that carry the matching secret from the proxy.
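For illustration, a minimal sketch of such a proxy, assuming a hand-written route map and a made-up x-gateway-secret header (Express with http-proxy-middleware; none of these names come from the original post):
import express from 'express';
import { createProxyMiddleware } from 'http-proxy-middleware';

const app = express();

// assumed mapping of path prefixes to the extension APIs that serve them
const routes: Record<string, string> = {
  '/groups': 'http://localhost:4001',
  '/users': 'http://localhost:4002',
};

// authentication would run here before anything is forwarded
app.use((req, res, next) => {
  // verify the token / session here
  next();
});

for (const [prefix, target] of Object.entries(routes)) {
  app.use(
    prefix,
    createProxyMiddleware({
      target,
      changeOrigin: true,
      // shared secret so an extension API only accepts traffic from the gateway
      headers: { 'x-gateway-secret': process.env.GATEWAY_SECRET ?? '' },
    }),
  );
}

app.listen(3000);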
Fourth approach:
Microservices might help. I followed the guide here: https://docs.nestjs.com/microservices/basics
I could have a microservice for user management, group management, etc. and consume those services by creating a small API/gateway/proxy that calls those microservices (see the sketch after this list).
pros:
New extensions can be developed and tested easily
Separated concerns
cons:
Deployment will be tricky. You will have a main API and n microservices, each starting its own process and listening on its own port.
It seems I would have to create a new gateway API for each customer if I want it to be customizable. So instead of extending an application, I would have to create a customized consuming API each time. That wouldn't solve the problem.
To protect the extension APIs (authentication is handled by the main API), the proxy needs to share a secret with those APIs, so an extension API only accepts incoming requests that carry the matching secret from the proxy.
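To illustrate the gateway idea from this fourth approach, here is a sketch using Nest's ClientsModule over TCP; the port, the injection token and the message pattern are made up for the example:
import { Controller, Get, Inject, Module } from '@nestjs/common';
import { ClientProxy, ClientsModule, Transport } from '@nestjs/microservices';

@Controller('groups')
class GroupsGatewayController {
  constructor(@Inject('GROUPS_SERVICE') private readonly client: ClientProxy) {}

  @Get()
  findAll() {
    // forward the HTTP request to the groups microservice as a message
    return this.client.send({ cmd: 'find_all_groups' }, {});
  }
}

@Module({
  imports: [
    ClientsModule.register([
      { name: 'GROUPS_SERVICE', transport: Transport.TCP, options: { port: 4001 } },
    ]),
  ],
  controllers: [GroupsGatewayController],
})
export class GatewayModule {}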

There are several approaches to this. What you need to do is figure out what workflow is best suited for your team, organization, and clients.
If this were up to me, I would consider using one repository per module, and a package manager like npm with private or organization-scoped packages to handle the configuration. Then set up build/release pipelines that push to the package registry on new builds.
This way all you need is the main file, and a package manifest file per custom installation. You can independently develop and deploy new versions, and you can load new versions when you need to on the client-side.
For added smoothness, you could use a configuration file to map modules to routes and write a generic route generator script to do most of the bootstrapping.
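As an illustration of that idea, here is a tiny sketch of such a generator; the package names and the manifest shape are placeholders, not an existing convention:
import express from 'express';

// per-customer manifest, e.g. loaded from a modules.json file (contents are made up)
const manifest: { package: string; route: string }[] = [
  { package: '@acme/users', route: '/users' },
  { package: '@acme/groups', route: '/groups' }, // only for customers that bought the feature
];

const app = express();

for (const entry of manifest) {
  // each feature package is assumed to export an Express router
  const feature = require(entry.package);
  app.use(entry.route, feature.default ?? feature);
}

app.listen(3000);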
Since a package can be anything, cross dependencies within the packages will work without much hassle. You just need to be disciplined when it comes to change and version management.
Read more about private packages here:
Private Packages NPM
Private npm registries cost money, but if that is an issue there are several other options as well. Please review this article for some alternatives, both free and paid.
Ways to have your private npm registry
Now, if you want to roll your own manager, you could write a simple service locator that takes in a configuration file containing the necessary information to pull the code from the repo, loads it up, and then provides some method to retrieve an instance of it.
I have written a simple reference implementation for such a system:
The framework: locomotion service locator
An example plugin checking for palindromes: locomotion plugin example
An application using the framework to locate plugins: locomotion app example
You can play around with this by getting it from npm using npm install -s locomotion. You will need to specify a plugins.json file with the following schema:
{
  "path": "relative path where plugins should be stored",
  "plugins": [
    {
      "module": "name of service",
      "dir": "location within plugin folder",
      "source": "link to git repository"
    }
  ]
}
example:
{
  "path": "./plugins",
  "plugins": [
    {
      "module": "palindrome",
      "dir": "locomotion-plugin-example",
      "source": "https://github.com/drcircuit/locomotion-plugin-example.git"
    }
  ]
}
load it like this:
const loco = require("locomotion");
It then returns a promise that resolves to the service locator object, which has a locate method to get hold of your services:
loco.then((svc) => {
  let pal = svc.locate("palindrome"); // get the palindrome service
  if (pal) {
    console.log("Is: no X in Nixon! a palindrome? ", (pal.isPalindrome("no X in Nixon!")) ? "Yes" : "no"); // test if it works :)
  }
}).catch((err) => {
  console.error(err);
});
Please note that this is just a reference implementation and is not robust enough for a serious application. However, the pattern is still valid and shows the gist of writing this kind of framework.
Now, this would need to be extended with support for plugin configuration, initialization, error checking, maybe dependency injection, and so on.

I would go for the external packages option.
You can structure your app to have a packages folder. I would keep UMD-compiled builds of the external packages in that folder so that your compiled TypeScript won't have any issues with them. Every package should have an index.js file in its root folder.
Your app can then loop through the packages folder using fs and require each package's index.js into your app.
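A sketch of that loader, assuming a packages/ folder next to the compiled app and a CommonJS/UMD index.js per package (the folder name and export shape are assumptions):
import * as fs from 'fs';
import * as path from 'path';

const PACKAGES_DIR = path.resolve(__dirname, 'packages');

export function loadPackages(): Record<string, any> {
  const loaded: Record<string, any> = {};
  for (const name of fs.readdirSync(PACKAGES_DIR)) {
    const entry = path.join(PACKAGES_DIR, name, 'index.js');
    if (fs.existsSync(entry)) {
      // UMD/CommonJS builds can be pulled in with a plain require
      loaded[name] = require(entry);
    }
  }
  return loaded;
}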
Dependency installation is still something you have to take care of. A configuration file in each package could solve that too: you can have a custom npm script on the main app that installs all the package dependencies before starting the application.
This way, you can add new packages to your app simply by copy-pasting a package into the packages folder and rebooting the app. Your compiled TypeScript files won't be touched, and you don't have to use a private npm registry for your own packages.

Related

How to bundle a plugin-based JavaScript platform

I'm trying to build a plugin system for my TypeScript SPA, but I'm not sure how to approach this in terms of code bundling/splitting.
Current setup
The frontend and backend are implemented with TypeScript / ESM.
Frontend:
Vuejs 3
Vitejs/Rollup
Pinia
Backend/API:
NestJs with Express/RestAPI
The app currently consists of 3 npm packages:
Server: NestJs Server Application
Web: Vuejs Frontend / UI Components
Common: Shared models and interfaces and utilities between frontend and backend
Requirements
I want to enable third parties to implement additional modules for this platform, installable e.g. via npm or, ideally, by just copying the module into a modules directory, or through some kind of marketplace in the future.
A third-party module should be able to import and use classes from the core frontend/backend and common packages, just like the core modules themselves. For example, a third-party module should be able to access the services/models/UI components and stores of a core calendar module.
Ideally this should work without rebuilding the whole frontend/backend. I'm coming from languages like Java and PHP, which have namespaces, packages and classloaders. I'm not sure how to bundle something like this in a JavaScript application, especially for the frontend.
So my questions are:
How to bundle/split the core packages
How to import core packages within third party modules
How to build, bundle/split the third party packages
I assume I have to somehow add those core dependencies as externals within the third party module and provide the core bundles as UMD or ESM.
It would be great if someone could point me in the right direction and provide some hints, resources, best practices, or even existing/similar solutions.
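As an illustration of the externals assumption above, a third-party module's build config could mark the core packages as external roughly like this (Vite library mode; all package names are placeholders and not part of the question):
import { defineConfig } from 'vite';

export default defineConfig({
  build: {
    lib: {
      entry: 'src/index.ts',
      formats: ['es'],
      fileName: 'my-extension',
    },
    rollupOptions: {
      // core packages are provided by the host application at runtime, not bundled
      external: ['vue', 'pinia', '@myapp/common', '@myapp/web'],
    },
  },
});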

How to share code between client and cloud functions [duplicate]

I have a Node server with multiple controllers that perform DB operations, plus helpers (for e-mail, for example), within that directory.
I'd like to use source code from that directory within my Cloud Functions. Assuming the following directory structure:
src/
  server/
    app/controllers/email_helper.js
  fns/
    send-confirm/
What's the best way to use email_helper within the send-confirm function?
I've tried:
Symbolically linking the 'server' directory
Adding a local repo to send-confirm/package.json
Neither of the above works.
In principle, your Cloud Functions can use any other Node.js module, the same way any standard Node.js server would. However, since Cloud Functions needs to build your module in the cloud, it needs to be able to locate those dependency modules from the cloud. This is where the issue lies.
Cloud Functions can load modules from any one of these places:
Any public npm repository.
Any web-visible URL.
Anywhere in the functions/ directory that firebase init generates for you, and which gets uploaded on firebase deploy.
In your case, from the perspective of functions/package.json, the ../server/ directory doesn't fall under any of those categories, and so Cloud Functions can't use your module. Unfortunately, firebase deploy doesn't follow symlinks, which is why that solution doesn't work.
I see two possible immediate fixes:
Move your server/ directory to be under functions/. I realize this isn't the prettiest directory layout, but it's the easiest fix while hacking. In functions/package.json you can then have a local dependency on ./server.
Expose your code behind a URL somewhere. For example, you could package up a .tar and put that on Google Drive, or on Firebase Cloud Storage. Alternatively, you can use a public git repository.
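For the first fix, the local dependency in functions/package.json could be declared with npm's file: protocol, roughly like this (the dependency name is just an example):
{
  "name": "functions",
  "dependencies": {
    "server": "file:./server"
  }
}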
In the future, I'd love it if firebase deploy followed symlinks. I've filed a feature request for that in Firebase's internal bug tracker.

How to organize Javascript project to allow sharing of code between client SPA and Node server

I am developing an application which comprises an SPA front end and a REST back end.
To implement the REST back end I am using Node and Express.
Considering that both the front end and back end are written in JavaScript (and TypeScript), I would like to share some code between these two parts (namely interfaces and simple utils).
So basically my project is composed of three parts: Client, Server, Shared. Therefore I am inclined to have a project directory structure similar to this:
ProjectFolder
  ClientFolder
    .....
  ServerFolder
    .....
  SharedFolder
    .....
I am looking for suggestions on how best to organize my project. I have done some research and found this interesting article, which suggests a mechanism based on Gulp tasks that copies all files from SharedFolder into both ClientFolder and ServerFolder and then runs transpilation.
I am wondering whether there is an alternative approach, or tools that can do what otherwise has to be configured as a Gulp workflow.
My recommendation is to use a package manager. When you have dependencies and the requirements of the server change, you have to change the module. You don't want the SPA (frontend) to break when you need to make changes to the server.
This is why package managers give you versions. Each module that depends on your shared code can use a different version of it. You can use npm for that: build a module, publish it, and install it on your frontend and backend.
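As a rough sketch, the shared package's manifest could look like this (name, version and paths are placeholders); both the SPA and the server would then install it as a normal versioned dependency, so each side can upgrade independently:
{
  "name": "@myorg/shared",
  "version": "1.2.0",
  "main": "dist/index.js",
  "types": "dist/index.d.ts"
}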
In production you often split the frontend and backend: the frontend may live in a file storage system (S3, Google Cloud Storage and similar) while the backend runs on your servers. In that case it is harder to use the same files on both systems.

Loading modules in qooxdoo desktop (browser environment)

I'm struggling with how to integrate client-side modules (Apollo Client, just as an example) into the qooxdoo-specific generate.py workflow so that they become available in the browser.
According to the installation notes:
To use this client in a web browser or mobile app, you'll need a build system capable of loading NPM packages on the client. Some common choices include Browserify, Webpack, and Meteor 1.3. [...]
Side note: I currently use Babel 6 to recursively transpile all my sources from a separate folder source.es6/ into the "official" source/ folder, which is then watched and processed by generate.py. Is it possible to use this somehow as a solution to my question?
OTOH, I would love to see at least some kind of integration with Webpack, Browserify or SystemJS.
I suggest you do the following. First, create loadable package(s) from Apollo Client and its dependencies, e.g. using Webpack. Then make sure these package(s) are loaded in your web page before you load your qooxdoo app. Then the Apollo API is available to your qooxdoo code.
If you choose to deploy the Apollo packages with <script> tags you can let generate.py do that by using the add-script config key.
I suggest you place the output of the Webpack run in your qooxdoo project's resource path and add @asset hints for those files in your main qooxdoo class. This will make sure they are copied into the build version of your app, and you can use the relative URI to these files, either in your index.html directly or in the add-script config settings.
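As an illustration only (the URI and the job it is attached to are assumptions and should be checked against the generate.py documentation), the add-script entry in config.json could look roughly like this:
{
  "jobs": {
    "source": {
      "add-script": [
        { "uri": "resource/myapp/js/apollo-bundle.js" }
      ]
    }
  }
}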
I don't think your transpiling with Babel 6 will help here. The Apollo code is already consumable, and you wouldn't want to dissect it and make it part of your qooxdoo (ES6) source tree, let alone its dependencies. I would rather treat it as a shrink-wrapped JS library, as described above, that is added like a resource.

How to symlink Javascript projects repos into a Meteor app

Meteor has a great file loading policy for general development. It automatically loads files from the app directory with some special treatment for public, private, client and server directories. (See http://docs.meteor.com/#structuringyourapp)
When loading third-party Javascript libraries into a Meteor app, I usually put them in a <head> script or directly in the client/compatibility directory, which works well for released files.
However, sometimes I need to link a developing version of a project directly from a GitHub repository from a certain branch, when testing patches or pull requests. I already do this all the time for Meteor smart packages which are picked up transparently. However, I'm not sure how to do this for general (client-side) Javascript libraries. Moreover, it's the linking in of a repo rather than a listed version that is tricky. Can someone who has had to do this give suggestions?
One approach to this was briefly described in https://github.com/meteor/meteor/issues/1229.
I found that this can be cleanly implemented as a resident smart package in your app. This approach works well in Meteor 0.6.5 and any future versions until this API changes. First create the following in package.js:
Package.on_use(function (api) {
  api.use(['routepolicy', 'webapp'], 'server');
  api.add_files('client.html', 'client');
  api.add_files('server.js', 'server');
});
and in server.js, you declare that you want Meteor to serve up an entire directory (the appropriate part of the repo) as part of the app (in my case, OpenLayers):
connect = Npm.require('connect');
// let requests to /lib bypass Meteor's normal routing
RoutePolicy.declare('/lib', 'network');
// serve the checked-out repository directory under /lib
WebApp.connectHandlers
  .use(connect.bodyParser())
  .use('/lib', connect.static("/home/mao/projects/openlayers/lib"));
finally, client.html tells your app to load up the code in the right path:
<head>
  <script src="/lib/OpenLayers.js"></script>
</head>
Assuming the above package was in a directory named openlayers, commenting or uncommenting openlayers in the package file of my app allows me to switch really easily between compiled releases and running from repo for this package.
