How to bundle a plugin-based JavaScript platform - javascript

I'm trying to build a plugin system for my TypeScript SPA, but I'm not sure how to approach this in terms of code bundling/splitting.
Current setup
The frontend and backend are implemented with TypeScript / ESM.
Frontend:
Vuejs 3
Vitejs/Rollup
Pinia
Backend/API:
NestJs with Express/RestAPI
The app currently consists of 3 npm packages:
Server: NestJs Server Application
Web: Vuejs Frontend / UI Components
Common: shared models, interfaces, and utilities used by both frontend and backend
Requirements
I want to enable third parties to implement additional modules for this platform, installable e.g. via NPM or, ideally, by just copying the module into a modules directory (or through some kind of marketplace in the future).
A third-party module should be able to import and use classes from the core frontend/backend and common packages, just like the core modules themselves. For example, a third-party module should be able to access the services/models/UI components and stores of a core calendar module.
Ideally this should work without rebuilding the whole frontend/backend. I'm coming from languages like Java and PHP, which have namespaces, packages, and classloaders. I'm not sure how to bundle something like this in a JavaScript application, especially for the frontend.
So my questions are:
How to bundle/split the core packages
How to import core packages within third party modules
How to build, bundle/split the third party packages
I assume I have to somehow add those core dependencies as externals within the third-party module and provide the core bundles as UMD or ESM.
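For illustration, a third-party module's build config along those lines might look like this (a minimal sketch; the @platform/* package names are placeholders I made up, not part of the actual setup):

// rollup.config.js of a hypothetical third-party module
import typescript from '@rollup/plugin-typescript';

export default {
  input: 'src/index.ts',
  output: {
    file: 'dist/my-plugin.es.js',
    format: 'es', // ESM, so the host app can load it via dynamic import()
  },
  // Core packages are externals: they are not bundled into the plugin;
  // the host application provides them at runtime (e.g. via an import map).
  external: ['vue', 'pinia', '@platform/common', '@platform/web'],
  plugins: [typescript()],
};

The host would then have to map those bare specifiers to its own bundles, e.g. with an import map in index.html.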
It would be great if someone could point me in the right direction and provide some hints, resources, best practices, or even existing/similar solutions.

Related

Alternatives to structure and use javascript libraries without npm

We have been working with all our projects tied to a single Lerna repository, like this:
lerna
|----- app1 - application 1
|----- app2 - application 2
|----- appN - application N
|----- commondb (common database libraries for app1, app2 to appN)
|----- commonux (common ux libraries for app1, app2 to appN)
|----- commonauth (common authentication libraries for app1, app2 to appN)
As the code grew, the Lerna repo filled up with packages (40+) and became too much code to manage in one place.
We're now trying to split the Lerna repo into smaller pieces, and we're looking for alternatives. Doing that, applications would still need a way to import common libraries as they do today.
NPM certainly seems to be a solution (making each common package independent and publishing it on NPM), but we want to keep our code in our own environment, without third-party services or clouds (we have our own git server instance).
What are the current options for managing JavaScript libraries that we could make use of? Which would be recommended in such a scenario?
Your decision can be greatly affected by answering the following: do you want your apps to run the same version of the shared libraries? Or do you want autonomy within the libraries, with the ability to publish and manage different versions, where it is the responsibility of the consumer app to manage which version of each library it uses?
If it is the former, my suggestion would be to stick with a mono-repo approach and maybe consider something like Nx, which has nice tooling for linting, testing, building, and deploying only the affected modules, while sharing a single common package.json and therefore common libraries across multiple apps and libs.
Otherwise, you are looking at potentially managing multiple repos, multiple versions of each library, multiple pipelines, and multiple workspace configs.
You could simply use the pnpm workspace: protocol, which gives you a monorepo structure without even requiring you to publish: "foo": "workspace:*". With this approach you can keep local packages, and pnpm will use the local code without downloading anything. I used this concept in a Vue pnpm Workspace demo. You can stay with local packages and/or bring in Lerna for publishing if you want to (I prefer Lerna-Lite, which is what I use), since both now support the pnpm/yarn workspace: protocol. This approach is flexible and lets you switch to publishing (to npm or a private registry) at any point in the future; in other words, there is no major code refactoring needed to add Lerna after the fact when using a pnpm workspace structure.
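To sketch what that looks like with the repo layout from the question (a minimal illustration, assuming a pnpm-workspace.yaml at the root that lists the package folders), app1's package.json would reference the shared libraries like this:

{
  "name": "app1",
  "dependencies": {
    "commondb": "workspace:*",
    "commonux": "workspace:*",
    "commonauth": "workspace:*"
  }
}

pnpm then links the local folders directly, and when you do publish (e.g. via Lerna-Lite), the workspace:* ranges are replaced with real version numbers.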

How to package a service worker file inside a frontend Javascript library

Let's say I am building a frontend library that needs to use a service worker.
Normally you would store the file in sw.js and load it from the frontend, and I already know how to do that.
But now I'm writing a library that does something in the frontend and relies on service workers, which means I need two files:
The main library JS
The sw.js file
How do I package them in a single NPM package so people can use them easily in their apps (both through npm install and through CDN loads)? I looked all over the internet but couldn't find anything.
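For what it's worth, one common pattern (a sketch of one possible approach, with hypothetical file names) is to ship sw.js as a second file in the package and have the main entry register it relative to itself; bundlers that understand the new URL(..., import.meta.url) idiom (e.g. Webpack 5, Vite) will then emit sw.js as an asset, and on a CDN the file simply resolves next to the main bundle:

// index.ts - main entry of the hypothetical library
export async function init(): Promise<ServiceWorkerRegistration | undefined> {
  if (!('serviceWorker' in navigator)) return undefined;
  // Resolved relative to this module, so it works both from a
  // node_modules build and from a CDN where sw.js sits next to index.js.
  const swUrl = new URL('./sw.js', import.meta.url);
  return navigator.serviceWorker.register(swUrl);
}

Note that a service worker must be served from the same origin as the pages it controls, which limits the pure-CDN case.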

extend existing API with custom endpoints

I'm creating an API for multiple customers. The core endpoints like /users are used by every customer, but some endpoints rely on individual customization. So it might be that User A wants a special endpoint /groups that no other customer will have. As a side note, each customer would also use their own database schema because of those extra features.
I personally use NestJS (Express under the hood), so the app.module currently registers all my core modules (with their own endpoints etc.):
import { Module } from '@nestjs/common';
import { UsersModule } from './users/users.module'; // core module

@Module({
  imports: [UsersModule]
})
export class AppModule {}
I think this problem is not specific to NestJS, so how would you handle it in theory?
I basically need an infrastructure that provides a basic system. There are no core endpoints anymore, because each extension is unique and multiple /users implementations could be possible. When developing a new feature, the core application should not be touched. Extensions should integrate themselves or get integrated on startup. The core system ships with no endpoints but is extended by those external files.
Some ideas come to my mind
First approach:
Each extension represents a new repository. Define a path to a custom external folder holding all those extension projects. This custom directory would contain a folder groups with a groups.module file:
import { Module } from '@nestjs/common';
import { GroupsController } from './groups.controller';

@Module({
  controllers: [GroupsController],
})
export class GroupsModule {}
My API could loop through that directory and try to import each module file.
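A rough sketch of what that loader could look like (assuming the extensions are already compiled to JavaScript and each folder follows a <name>.module.js convention; the folder layout and naming are my assumptions):

import { Module } from '@nestjs/common';
import { NestFactory } from '@nestjs/core';
import { readdirSync } from 'fs';
import { join } from 'path';

const EXTENSIONS_DIR = join(__dirname, 'extensions'); // hypothetical location

async function loadExtensionModules(): Promise<any[]> {
  const modules: any[] = [];
  for (const name of readdirSync(EXTENSIONS_DIR)) {
    // convention: each extension folder exports its NestJS module
    // from a file named <folder>.module.js
    const exported = await import(join(EXTENSIONS_DIR, name, `${name}.module.js`));
    modules.push(...Object.values(exported));
  }
  return modules;
}

async function bootstrap() {
  const extensions = await loadExtensionModules();

  @Module({ imports: extensions }) // core modules would be listed here too
  class AppModule {}

  const app = await NestFactory.create(AppModule);
  await app.listen(3000);
}
bootstrap();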
pros:
The custom code is kept away from the core repository
cons:
NestJS uses TypeScript, so I have to compile the code first. How would I manage the API build and the builds of the custom apps (plug-and-play system)?
The custom extensions are very loose, since they just contain some TypeScript files. Because they don't have access to the API's node_modules directory, my editor will show errors, as it can't resolve external package dependencies.
Some extensions might fetch data from another extension. Maybe the groups service needs to access the users service. Things might get tricky here.
Second approach:
Keep each extension inside a subfolder of the API's src folder, but add this subfolder to the .gitignore file. That way you can keep your extensions inside the API without tracking them in its repository.
pros:
Your editor is able to resolve the dependencies
Before deploying your code, you can run the build command and end up with a single distribution
You can access other services easily (/groups needs to find a user by id)
cons:
While developing, you have to copy your repository files into that subfolder. After changing something, you have to copy those files back, overwriting your repository files with the updated ones.
Third approach:
Inside an external custom folder, all extensions are fully fledged standalone APIs. Your main API would just provide the authentication stuff and could act as a proxy to redirect the incoming requests to the target API.
pros:
New extensions can be developed and tested easily
cons:
Deployment will be tricky. You will have a main API and n extension APIs, each starting its own process and listening on its own port.
The proxy system could be tricky. If the client requests /users, the proxy needs to know which extension API serves that endpoint, call that API, and forward the response back to the client.
To protect the extension APIs (authentication is handled by the main API), the proxy needs to share a secret with those APIs, so an extension API only accepts incoming requests that carry the matching secret from the proxy.
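To make the proxy part concrete, here is a minimal sketch using express and http-proxy-middleware (the route table, ports, and secret header are invented for illustration):

import express from 'express';
import { createProxyMiddleware } from 'http-proxy-middleware';

const app = express();
const INTERNAL_SECRET = process.env.INTERNAL_SECRET ?? 'change-me';

// Hypothetical route table: which extension API owns which prefix.
const routes: Record<string, string> = {
  '/users': 'http://localhost:4001',
  '/groups': 'http://localhost:4002',
};

// ...the main API's authentication middleware would run before this...

for (const [prefix, target] of Object.entries(routes)) {
  app.use(prefix, createProxyMiddleware({
    target,
    changeOrigin: true,
    // shared secret so the extension APIs only accept proxied requests
    onProxyReq: (proxyReq) => proxyReq.setHeader('x-internal-secret', INTERNAL_SECRET),
  }));
}

app.listen(3000);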
Fourth approach:
Microservices might help. I took this guide as a starting point: https://docs.nestjs.com/microservices/basics
I could have one microservice for user management, one for group management, etc., and consume those services from a small API/gateway/proxy that calls the microservices.
pros:
New extensions can be developed and tested easily
Separated concerns
cons:
Deployment will be tricky. You will have a main API and n microservices, each starting its own process and listening on its own port.
It seems I would have to create a new gateway API for each customer to make it customizable. So instead of extending an application, I would have to create a customized consuming API each time. That wouldn't solve the problem.
To protect the extension APIs (authentication is handled by the main API), the proxy needs to share a secret with those APIs, so an extension API only accepts incoming requests that carry the matching secret from the proxy.
There are several approaches to this. What you need to do is figure out what workflow is best suited for your team, organization, and clients.
If this were up to me, I would consider using one repository per module and a package manager like NPM with private or organization-scoped packages to handle the configuration. Then set up build/release pipelines that push to the package registry on new builds.
This way all you need is the main file and a package manifest file per custom installation. You can independently develop and deploy new versions, and you can load new versions when you need to on the client side.
For added smoothness, you could use a configuration file to map modules to routes and write a generic route generator script to do most of the bootstrapping.
Since a package can be anything, cross-dependencies between the packages will work without much hassle. You just need to be disciplined when it comes to change and version management.
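As a sketch of that generic bootstrapping (the manifest shape and package names are made up for illustration):

import express from 'express';

// Hypothetical per-customer manifest mapping private packages to route prefixes.
const manifest = [
  { pkg: '@acme/users-module', route: '/users' },
  { pkg: '@acme/groups-module', route: '/groups' },
];

async function bootstrap() {
  const app = express();
  for (const entry of manifest) {
    // assumption: each module package default-exports an express.Router
    const { default: router } = await import(entry.pkg);
    app.use(entry.route, router);
  }
  app.listen(3000);
}
bootstrap();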
Read more about private packages here:
Private Packages NPM
Now, private NPM registries cost money, but if that is an issue, there are several other options as well. Please review this article for some alternatives, both free and paid.
Ways to have your private npm registry
Now, if you want to roll your own manager, you could write a simple service locator that takes in a configuration file containing the necessary information to pull the code from the repo, load it up, and then provide some method to retrieve an instance of it.
I have written a simple reference implementation for such a system:
The framework: locomotion service locator
An example plugin checking for palindromes: locomotion plugin example
An application using the framework to locate plugins: locomotion app example
You can play around with this by getting it from npm using npm install -s locomotion. You will need to specify a plugins.json file with the following schema:
{
  "path": "relative path where plugins should be stored",
  "plugins": [
    {
      "module": "name of service",
      "dir": "location within plugin folder",
      "source": "link to git repository"
    }
  ]
}
example:
{
  "path": "./plugins",
  "plugins": [
    {
      "module": "palindrome",
      "dir": "locomotion-plugin-example",
      "source": "https://github.com/drcircuit/locomotion-plugin-example.git"
    }
  ]
}
load it like this:
const loco = require("locomotion");
It then returns a promise that resolves to the service locator object, which has a locate method to get hold of your services:
loco.then((svc) => {
  let pal = svc.locate("palindrome"); // get the palindrome service
  if (pal) {
    console.log("Is: no X in Nixon! a palindrome? ", (pal.isPalindrome("no X in Nixon!")) ? "Yes" : "no"); // test if it works :)
  }
}).catch((err) => {
  console.error(err);
});
Please note that this is just a reference implementation and is not robust enough for serious applications. However, the pattern is still valid and shows the gist of writing this kind of framework.
It would need to be extended with support for plugin configuration, initialization, error checking, and perhaps dependency injection, and so on.
I would go for the external packages option.
You can structure your app to have a packages folder. I would keep UMD-compiled builds of the external packages in that folder, so that your compiled TypeScript won't have any issues with them. Every package should have an index.js file in its root folder.
Your app can then loop through the packages folder using fs and require each package's index.js, as sketched below.
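That loading loop could be as small as this (a sketch, assuming the folder layout described above and a CommonJS build where require() is available):

import { readdirSync } from 'fs';
import { join } from 'path';

const PACKAGES_DIR = join(__dirname, 'packages');

// Require the UMD build of each package; every package is expected
// to expose its API from an index.js in its root folder.
const packages = readdirSync(PACKAGES_DIR).map((name) => ({
  name,
  api: require(join(PACKAGES_DIR, name, 'index.js')),
}));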
Then again, dependency installation is something you have to take care of. A configuration file in each package could solve that too: the main app can run a custom npm script that installs all the package dependencies before starting the application.
This way, you can add new packages to your app simply by copying the package into the packages folder and rebooting the app. Your compiled TypeScript files won't be touched, and you don't have to use a private npm registry for your own packages.

How to organize Javascript project to allow sharing of code between client SPA and Node server

I am developing an application which comprises an SPA front-end and a REST back-end.
To implement the REST back-end I am using Node and Express.
Considering that both front-end and back-end are written in JavaScript (and TypeScript), I would like to share some code between these two parts (namely interfaces and simple utils).
So basically my project is composed of three parts: Client, Server, Shared. Therefore I am inclined to have a project directory structure similar to this:
ProjectFolder
  ClientFolder
    .....
  ServerFolder
    .....
  SharedFolder
    .....
I am looking for suggestions on how best to organize my project. I have done some research and found this interesting article, which suggests a mechanism based on Gulp tasks that copies all files from SharedFolder into both ClientFolder and ServerFolder and then runs transpiling.
I am wondering whether there is an alternative approach, or tools that perform what otherwise has to be configured as a Gulp workflow.
My recommendation is to use a package manager. When you have dependencies and the requirements of the server change, you have to change the module, and you don't want the SPA (frontend) to break when you make changes to the server.
This is why package managers give you versions. Each module that depends on your shared code can use a different version of it. You can use NPM for that: build a module, publish it, and install it on your frontend and backend.
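For example (package name and versions invented), the two sides can even pin different versions of the shared module:

// in the SPA's package.json
"dependencies": { "@myorg/shared": "^1.2.0" }

// in the server's package.json
"dependencies": { "@myorg/shared": "^1.4.0" }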
In production you often split the frontend and backend: the frontend may live in a file storage system (S3, Google Cloud Storage, and similar) while the backend runs on your servers. That makes it harder to use the same files on both systems.

Loading modules in qooxdoo desktop (browser environment)

I'm struggling with how to integrate client-side modules (like, just as an example, Apollo Client) into the qooxdoo-specific generate.py workflow so that they become available in the browser.
According to the installation notes:
To use this client in a web browser or mobile app, you'll need a build system capable of loading NPM packages on the client. Some common choices include Browserify, Webpack, and Meteor 1.3. [...]
Side note: I currently use Babel 6 to recursively transpile all my sources from a separate folder source.es6/ into the "official" source/ folder, which is then watched and processed by generate.py. Could this somehow serve as a solution to my question?
OTOH, I would love to see at least some kind of integration with Webpack, Browserify or SystemJS.
I suggest you do the following. First, create loadable package(s) from the Apollo Client and its dependencies, e.g. using Webpack. Then make sure these package(s) are loaded in your web page before you load your qooxdoo app. The Apollo API will then be available to your qooxdoo code.
If you choose to deploy the Apollo packages with <script> tags you can let generate.py do that by using the add-script config key.
I suggest you place the output of the Webpack run in your qooxdoo project's resource path and add @asset hints for those files in your main qooxdoo class. This makes sure they are copied into the build version of your app, and you can use the relative URIs to these files, either in your index.html directly or in the add-script config settings.
I don't think your transpiling with Babel 6 will help here. The Apollo code is already consumable, and you wouldn't want to dissect it and make it part of your qooxdoo (ES6) source tree, let alone its dependencies. I would rather treat it as a shrink-wrapped JS library that is added as a resource, as described above.
