My team has a project and 4-5 microservices on Node.js. We have an important JavaScript file that we use in the application, but now we need to start using this file in the services as well.
I want to create a private npm package that represents this file.
I'll add this new package to the package.json of the application and the services, using a link to the repo.
I have doubts about local development. Let's imagine that a developer needs to make a change to this file in its separate repository and a corresponding change in the application. I see this workflow:
1. Commit the changes to the file and push them to a branch "my-new-feature-001".
2. Change the package.json of my application to use this branch of the common file (see the sketch after this list).
3. Develop the feature in the application.
4. Merge "my-new-feature-001" into the main branch of the common file's repo.
5. Run git checkout package.json in the application.
6. Commit and merge the changes in the application.
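For reference, step 2 might look like this in the application's package.json (the package name and repository URL are placeholders, not from the original setup):

{
  "dependencies": {
    "common-file": "git+ssh://git@github.com/yourorg/common-file.git#my-new-feature-001"
  }
}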
It's so cumbersome. Is there a simpler solution? I mean, using the remote repository for the deploy servers and a local copy of the common file for local development?
I'm creating an API for multiple customers. The core endpoints like /users are used by every customer, but some endpoints rely on individual customization. It might be that customer A wants a special endpoint /groups and no other customer will have that feature. As a side note, each customer would also use their own database schema because of those extra features.
I personally use NestJS (Express under the hood), so app.module currently registers all my core modules (with their own endpoints, etc.):
import { Module } from '@nestjs/common';
import { UsersModule } from './users/users.module'; // core module

@Module({
  imports: [UsersModule]
})
export class AppModule {}
I think this problem is not specific to NestJS, so how would you handle it in theory?
I basically need an infrastructure that can provide a basic system. There are no core endpoints anymore, because each extension is unique and multiple /users implementations are possible. When developing a new feature, the core application should not be touched. Extensions should integrate themselves or get integrated on startup. The core system ships with no endpoints but is extended by those external files.
Some ideas that come to mind:
First approach:
Each extension is a separate repository. Define a path to a custom external folder holding all those extension projects. This custom directory would contain a folder groups with a groups.module file:
import { Module } from '@nestjs/common';
import { GroupsController } from './groups.controller';

@Module({
  controllers: [GroupsController],
})
export class GroupsModule {}
My API could loop through that directory and try to import each module file.
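A minimal sketch of that loop, assuming the extensions are already compiled to JavaScript and each ships a <name>.module.js file at its root (the folder layout and naming convention are assumptions):

const fs = require('fs');
const path = require('path');

// Assumed folder holding one subfolder per compiled extension
const EXTENSIONS_DIR = path.resolve(__dirname, 'extensions');

function loadExtensionModules() {
  return fs.readdirSync(EXTENSIONS_DIR)
    .filter((entry) => fs.statSync(path.join(EXTENSIONS_DIR, entry)).isDirectory())
    .map((entry) => {
      // Convention (assumed): each extension exposes <name>.module.js at its root
      return require(path.join(EXTENSIONS_DIR, entry, `${entry}.module.js`));
    });
}

The loaded modules could then be fed into the root module's imports array at bootstrap time.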
pros:
The custom code is kept away from the core repository
cons:
NestJS uses TypeScript, so I have to compile the code first. How would I manage the API build and the builds of the custom apps? (plug-and-play system)
The custom extensions are very loose, because they just contain some TypeScript files. Since they don't have access to the node_modules directory of the API, my editor will show errors because it can't resolve external package dependencies.
Some extensions might fetch data from another extension. Maybe the groups service needs to access the users service. Things might get tricky there.
Second approach:
Keep each extension inside a subfolder of the API's src folder, but add this subfolder to the .gitignore file. That way you can keep your extensions inside the API.
pros:
Your editor is able to resolve the dependencies
Before deploying your code you can run the build command and end up with a single distribution
You can access other services easily (/groups needs to find a user by id)
cons:
When developing, you have to copy your repository files into that subfolder. After changing something, you have to copy these files back, overriding your repository files with the updated ones.
Third approach:
Inside an external custom folder, all extensions are fully fledged standalone APIs. Your main API would just provide the authentication layer and act as a proxy, redirecting incoming requests to the target API.
pros:
New extensions can be developed and tested easily
cons:
Deployment will be tricky. You will have a main API and n extension APIs, each starting its own process and listening on a port.
The proxy system could be tricky. If the client requests /users, the proxy needs to know which extension API listens for that endpoint, call that API, and forward the response back to the client (see the sketch after this list).
To protect the extension APIs (authentication is handled by the main API), the proxy needs to share a secret with those APIs. An extension API will then only accept incoming requests that carry the matching secret from the proxy.
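A rough sketch of such a proxy in Express using http-proxy-middleware (the route table, ports, and secret header name are all assumptions):

const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();
const SHARED_SECRET = process.env.PROXY_SECRET; // shared with the extension APIs

// Assumed route table: which extension API serves which endpoint prefix
const routes = {
  '/users': 'http://localhost:4001',
  '/groups': 'http://localhost:4002',
};

for (const [prefix, target] of Object.entries(routes)) {
  app.use(prefix, createProxyMiddleware({
    target,
    changeOrigin: true,
    // The extension APIs would reject requests missing this header
    headers: { 'x-proxy-secret': SHARED_SECRET },
  }));
}

app.listen(3000);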
Fourth approach:
Microservices might help. I followed the guide at https://docs.nestjs.com/microservices/basics
I could have a microservice for user management, group management, etc., and consume those services from a small API/gateway/proxy that calls those microservices.
pros:
New extensions can be developed and tested easily
Separated concerns
cons:
Deployment will be tricky. You will have a main API and n microservices, each starting its own process and listening on a port.
It seems I would have to create a new gateway API for each customer if I want it to be customizable. So instead of extending an application, I would have to create a customized consuming API each time. That wouldn't solve the problem.
To protect the extension APIs (authentication is handled by the main API), the proxy needs to share a secret with those APIs. An extension API will then only accept incoming requests that carry the matching secret from the proxy.
There are several approaches to this. What you need to do is figure out what workflow is best suited for your team, organization, and clients.
If it were up to me, I would consider using one repository per module, and use a package manager like npm with private or organization-scoped packages to handle the configuration. Then set up build/release pipelines that push to the package registry on new builds.
This way, all you need is the main file and a package manifest per custom installation. You can independently develop and deploy new versions, and you can load new versions on the client side when you need to.
For added smoothness, you could use a configuration file that maps modules to routes, and write a generic route-generator script to do most of the bootstrapping.
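A sketch of what that could look like (the routes.json shape, the package names, and the convention that each package exports an Express router are all assumptions):

// routes.json (assumed shape):
// { "routes": [{ "package": "@myorg/groups", "prefix": "/groups" }] }
const express = require('express');
const { routes } = require('./routes.json');

const app = express();
for (const { package: pkgName, prefix } of routes) {
  // Convention (assumed): each extension package exports an Express router
  app.use(prefix, require(pkgName).router);
}
app.listen(3000);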
Since a package can be anything, cross-dependencies within the packages will work without much hassle. You just need to be disciplined about change and version management.
Read more about private packages here:
Private Packages NPM
Private npm registries cost money, but if that is an issue there are several other options as well. Please review this article for some alternatives, both free and paid:
Ways to have your private npm registry
Now, if you want to roll your own manager, you could write a simple service locator that takes a configuration file containing the information needed to pull the code from the repo and load it, and then provides a method for retrieving instances of your services.
I have written a simple reference implementation for such a system:
The framework: locomotion service locator
An example plugin checking for palindromes: locomotion plugin example
An application using the framework to locate plugins: locomotion app example
You can play around with this by getting it from npm with npm install -s locomotion. You will need to specify a plugins.json file with the following schema:
{
  "path": "relative path where plugins should be stored",
  "plugins": [
    {
      "module": "name of service",
      "dir": "location within plugin folder",
      "source": "link to git repository"
    }
  ]
}
Example:

{
  "path": "./plugins",
  "plugins": [
    {
      "module": "palindrome",
      "dir": "locomotion-plugin-example",
      "source": "https://github.com/drcircuit/locomotion-plugin-example.git"
    }
  ]
}
Load it like this:
const loco = require("locomotion");
This returns a promise that resolves to the service locator object, which has a locate method to get hold of your services:
loco.then((svc) => {
  let pal = svc.locate("palindrome"); // get the palindrome service
  if (pal) {
    console.log("Is: no X in Nixon! a palindrome? ", (pal.isPalindrome("no X in Nixon!")) ? "Yes" : "no"); // test if it works :)
  }
}).catch((err) => {
  console.error(err);
});
Please note that this is just a reference implementation and is not robust enough for serious applications. However, the pattern is still valid and shows the gist of writing this kind of framework.
It would need to be extended with support for plugin configuration, initialization, error checking, and perhaps dependency injection, and so on.
I would go for the external-packages option.
You can structure your app to have a packages folder. I would keep UMD-compiled builds of the external packages in that folder so that your compiled TypeScript won't have any issues with them. Every package should have an index.js file in its root folder.
Your app can then loop through the packages folder using fs and require each package's index.js.
Dependency installation is something you still have to take care of. A configuration file in each package could solve that: you can have a custom npm script in the main app that installs all the package dependencies before starting the application (a sketch follows).
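A minimal sketch of such a script, assuming each package ships its own package.json (the file names and paths are placeholders):

// scripts/install-packages.js, wired up via "prestart": "node scripts/install-packages.js"
const fs = require('fs');
const path = require('path');
const { execSync } = require('child_process');

const packagesDir = path.resolve(__dirname, '../packages');

for (const name of fs.readdirSync(packagesDir)) {
  const pkgDir = path.join(packagesDir, name);
  // Only install for folders that ship their own package.json
  if (fs.existsSync(path.join(pkgDir, 'package.json'))) {
    execSync('npm install --production', { cwd: pkgDir, stdio: 'inherit' });
  }
}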
This way, you can add new packages to your app by copy-pasting them into the packages folder and rebooting the app. Your compiled TypeScript files won't be touched, and you don't have to use a private npm registry for your own packages.
I have a JavaScript library that I wrote for a client. It is written in TypeScript using webpack and "compiles" into JavaScript. I want to give the client access to the distribution files but not the whole source. Ideally they could install it from the command line to make installing updates easy.
The library provides some JavaScript functions. The client would install the script in one location on their server and could then include the JavaScript files in their web surveys as needed.
+ project
  + dist
    - main.js
    - vendor.js
    - index.html
    - README.md
    - LICENSE.md
  + src
    - index.js
    - index.html
    ...
My initial thought is to give them access to a private git repository that contains only the distribution files. So my project would be one git repository that only I have access to. I would then copy the contents of the dist directory into a release directory, and the release directory would be another git repo I could supply to the client.
I'm not sure this is the best approach.
It was suggested that GitHub releases may be an option, but I don't use GitHub; I use GitLab and would like to continue to do so.
npm also doesn't seem like a good choice. It installs files into the node_modules directory and creates a package.json file. That's going to be confusing to my client and isn't "clean".
It sounds like a second git repository as a submodule could work for you. On your side it would receive the built files, and on the client's side they could consume them.
I'd suggest making use of tags to indicate significant versions in the submodule.
By using a separate repository there is no risk of leaking the original source files.
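Sketched out, the release flow on your side might look like this (the repository URL, paths, and version number are placeholders):

# One-time setup: add the release repo as a submodule of the main project
git submodule add git@gitlab.com:you/library-release.git release
# After each build: copy the dist output in, commit, and tag the version
cp -r dist/* release/
cd release
git add .
git commit -m "Release v1.2.0"
git tag v1.2.0
git push origin main --tags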
Alternatively, you could package the files as a zip and upload them somewhere like S3 as part of your CI process, then write a script for the client that automatically downloads the distribution files. But this seems more complex than just using a package manager like npm.
Hello StackOverflowers :D
I have a simple Node.js web app that uses Lerna to manage the project. So I have a packages directory that contains n different projects, each using different task runner tools.
I have always used Maven build profiles in Java environments, but Maven will not be used for this Node.js project.
So the question is...
Is there a way to reproduce the Maven build profile concept without using Maven?
In a nutshell, I need to use a build profile in Node.js, without using Maven, to customize the build for different environments, such as production vs. development.
Is there a way to do that?
Thanks to all
You can do it by storing your configurations as key-value pairs in a JSON file, in the same way you would in a properties file in Java.
Then, one way or another, load properties from the environment-specific configuration file, such as production.json, stage.json, or qa.json.
One of the easy ways to do this is to use the module named config.
Using it, you can pass NODE_ENV=production (or dev, qa, whatever) and access the relevant configuration. This will help you achieve environment profiling.
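For instance, with the config module's standard layout, the values in an environment-specific file are merged over default.json (the keys below are made-up examples):

config/default.json:
{
  "dbConfig": { "host": "localhost", "password": "dev-secret" }
}

config/production.json (applied on top of default.json when NODE_ENV=production):
{
  "dbConfig": { "host": "db.example.com", "password": "prod-secret" }
}

Starting the app with NODE_ENV=production node app.js picks up the production overrides.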
You can also store configurations in a JS file, but I personally prefer a JSON file for storing configurations.
If you're wondering about dependency management, that is handled by the package.json file, which is somewhat similar to your pom.xml.
For more details you might want to read the official documentation.
My solution, following TGW's advice, works!
Just install the config module and create a directory containing .json files.
$ npm install config
$ mkdir config
$ vi config/default.json
Then set your NODE_ENV (on a Windows machine: set NODE_ENV=production) and run your web app.
In your .json config files, put information such as your DB connection host and password. To retrieve it, use:
var config = require('config');
var dbConfig = config.get('JsonItem.dbConfig');
More details at https://github.com/lorenwest/node-config
I am developing an Ionic application and using the gulp utility to minify JS files.
To avoid having gulp configured on every team member's system, I am planning to have a remote system that communicates with TFS, takes the project, and puts it in some folder; then I will install gulp on that system and run the tasks there.
But my question is: how do I make a connection from the remote system to TFS to get the project? I know I can pull the project through Eclipse, but is there an independent approach, like running a batch file to get things done?
The overall aim is to avoid installing gulp on every team member's system; instead they can connect to the remote system and take the minified versions.
Help is appreciated!
According to your description, you just want to get a version from TFS to the workspace on your remote machine.
There are several ways to achieve this:
Install VS/Team Explorer on the remote machine, then perform Get Latest or Get Specific Version from the history.
Go to the TFS web portal, navigate to the CODE tab, select the project, and choose Download as Zip.
Use the tf get command to get (download) either the latest version or a specified version of one or more files or folders from Team Foundation Server to the workspace.
Syntax:
tf get [itemspec] [/version:versionspec] [/all] [/overwrite] [/force] [/remap]
       [/recursive] [/preview] [/noautoresolve] [/noprompt]
       [/login:username,[password]]
Write a batch script to invoke the command line (a sketch follows). Check http://n3wjack.net/2014/02/02/how-to-pull-from-tfs-from-the-command-line/
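A minimal batch sketch, assuming an already-mapped workspace, a Visual Studio 2015 installation, and placeholder paths and project names:

@echo off
rem Pull the latest sources into the mapped workspace (paths are placeholders)
cd /d C:\build\myproject
"%VS140COMNTOOLS%..\IDE\tf.exe" get $/MyProject /recursive /noprompt
rem Run the build steps on the freshly pulled sources
call npm install
call gulp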
I need to separate my frontend and backend into two different repos, because one developer can't install Rails and doesn't need it (we can make a stub for the API).
How can I do this with respect to deployment? Do I need a git submodule? How do I use it (with GitHub and Ninefold)?
I found information about how to develop a standalone frontend app (thanks, I can use grunt) and how to use submodules, but I can't combine the two. Please help! Does anyone have such experience?
Having your Rails app provide a RESTful API is a good idea here. Your standalone frontend app can then interact with the API over HTTP(S).
If you want the front-end app within the rails app but need repository separation (i.e. don't want the front-end developer to access the code of the rails app), using a git submodule may work but probably needs some organisational thought.
This is what I'd do:
First, clone your Rails app from GitHub or Bitbucket (or git init one locally), then configure a git submodule.
git clone git@github.com:pathto/myawesomerailsapp.git
cd myawesomerailsapp
git submodule add git@github.com:pathto/mystandalonejsapp.git app/assets/standalone
Now when you cat .gitmodules you'll notice there's a new submodule configured in your repo.
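The resulting .gitmodules entry would look something like this (matching the example paths above):

[submodule "app/assets/standalone"]
    path = app/assets/standalone
    url = git@github.com:pathto/mystandalonejsapp.git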
Commit and push your changes. Ninefold will detect the submodules and use them, but if you have any problems just get in touch.
Good luck!