I am using react-scripts v2 (beta) and I have read the documentation here.
We need to create many environments.
We want to store env file under folder config/env.
We might use a JavaScript file such as config/env/staging.js, because .env seems to only work from the root directory.
We have these environments:
default
development
staging
preproduction
production
We expect the default environment to be a default config under version control in config/env/default.js; it should be the configuration used when running npm start.
We expect the user to be able to override it with a file that is not under version control (something like config/env/default.local.js).
Basically, that can be reduced to two issues:
Is it possible to relocate the env folder?
How can we create and select a new environment on npm start/build, without ejecting?
Just copy the environment file to .env before starting or building. You can even put .env in .gitignore that way:
"start": "cp $ENV .env && react-scripts start"
Then run it:
ENV=config/staging/.env npm start
There are lots of ways of doing what you want without ejecting, since it's all preprocessing (before your app starts / builds).
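A minimal sketch of that preprocessing step (the file name and the REACT_APP_API_URL variable are made up for the demo):

```shell
# Create a per-environment file (placeholder name and contents).
mkdir -p config/env
printf 'REACT_APP_API_URL=https://staging.example.com\n' > config/env/staging.env

# This is all the "start" script above does before react-scripts runs:
ENV=config/env/staging.env
cp "$ENV" .env
cat .env
```

Because the copy happens before react-scripts starts, create-react-app itself never needs to know where the files really live.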
Related
I am currently deploying a React application (front-end only) using Cloud Run. I have created a Cloud Build trigger that deploys the app to Cloud Run.
However, when I try to create environment variables through the Cloud Run UI, I cannot access them in my components. From my understanding, this is because those variables are defined on the app's instance, not in the user's browser.
So my question is - How should I properly approach this issue?
I've tried building a sort of "config.json", but I wasn't able to find a proper way to mount the file in different environments.
You'll want to do the following. First, install env-cmd (https://www.npmjs.com/package/env-cmd).
Next, specify the .env file to use for each environment, as shown below. This lets you declare configuration per environment rather than wiring it up imperatively. When you reference process.env.REACT_APP_ENV1, process.env.REACT_APP_ENV2, or process.env.REACT_APP_ENV3, the correct values for the active environment are picked up transparently.
"scripts": {
"start": "env-cmd -f env_anotherenv.env react-scripts start",
"start_vscode": "env-cmd -f env_vscode.env react-scripts start",
"build_staging": "env-cmd -f env_staging.env react-scripts build --profile",
"build_production": "env-cmd -f env_production.env react-scripts build --profile"
}
An example env_production.env would be:
REACT_APP_ENV1=someenv1
REACT_APP_ENV2=someenv2
REACT_APP_ENV3=someenv3
An example env_staging.env would be:
REACT_APP_ENV1=someotherenv1
REACT_APP_ENV2=someotherenv2
REACT_APP_ENV3=someotherenv3
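To illustrate how the values are consumed, here is a plain Node sketch (in a real CRA build, webpack inlines the REACT_APP_* values at build time; the variable names are the placeholders from the files above):

```javascript
// Simulate what `env-cmd -f env_production.env` does before react-scripts runs:
process.env.REACT_APP_ENV1 = 'someenv1';
process.env.REACT_APP_ENV2 = 'someenv2';

// Component code just reads process.env; no imperative branching on the env name.
const config = {
  env1: process.env.REACT_APP_ENV1,
  env2: process.env.REACT_APP_ENV2,
  // a sensible fallback for a value missing from the .env file
  env3: process.env.REACT_APP_ENV3 || 'fallback-env3',
};

console.log(config.env1); // → someenv1
console.log(config.env3);
```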
I would suggest you bake your environment variables into your build artifacts, which you then access using process.env.VARIABLE_1, as described here.
Use a multi-stage Docker build. In your build stage, populate a .env file from build args (e.g. using envsubst) before running npm run build. Here is a sample using a template that you can push to your git repo.
ARG VARIABLE_1
ARG VARIABLE_2
COPY .env.template .
RUN /bin/sh -c "envsubst '\$VARIABLE_1 \$VARIABLE_2' < .env.template > .env"
.env.template
VARIABLE_1=${VARIABLE_1}
VARIABLE_2=${VARIABLE_2}
Depending on how your cloud build triggers are set up, the values for the build args can be sourced from substitution variables.
The final stage can be based on the nginx image; there you copy the built static file bundle and serve it with nginx. See a full Docker example here.
This way you do not expose a .env file, and there is no environment variable configuration to be done on the Cloud Run side.
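Putting the pieces together, the whole Dockerfile could look roughly like this sketch (the base image tags, the build output folder, and the file names are assumptions, not from the original setup):

```dockerfile
# build stage: bake the env vars into the static bundle
FROM node:18 AS build
ARG VARIABLE_1
ARG VARIABLE_2
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
COPY .env.template .
RUN /bin/sh -c "envsubst '\$VARIABLE_1 \$VARIABLE_2' < .env.template > .env"
RUN npm run build

# final stage: serve the built files with nginx
FROM nginx:alpine
COPY --from=build /app/build /usr/share/nginx/html
```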
Is there a way to have a tsconfig sitting in my prod folder that only creates source maps if process.env is development? I don't want source maps created during my CI/CD build or pushed to production.
tsconfig was not meant to contain things like environment checks; it's just a simple JSON file. To do what you want, you should use your own config for each environment, i.e. tsconfig.production.json and tsconfig.development.json.
You can then use the --project/-p flag when running tsc in your build pipeline to specify the config location. So if you only want prod builds created in your CI/CD, then in the step that executes tsc you can just run tsc -p ./tsconfig.production.json, where ./tsconfig.production.json is the path to that file. If you want it driven by the environment itself, most CI/CD systems let you execute a different script depending on the environment, so again you just execute this one for prod builds.
Another quick note: most people structure their tsconfig files with a base file, say tsconfig.base.json, which both tsconfig.development.json and tsconfig.production.json extend. In tsconfig.base.json you put all the shared settings, with source maps turned off, so you don't repeat yourself. Then tsconfig.development.json extends it and turns source maps on, to keep nice debugging in development.
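A minimal sketch of that layout (the compiler options shown are illustrative; tsconfig files allow comments):

```json
// tsconfig.base.json: shared settings, source maps off
{ "compilerOptions": { "target": "es2019", "outDir": "dist", "sourceMap": false } }

// tsconfig.development.json: extends the base and turns source maps on
{ "extends": "./tsconfig.base.json", "compilerOptions": { "sourceMap": true } }

// tsconfig.production.json: inherits sourceMap: false from the base
{ "extends": "./tsconfig.base.json" }
```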
Anyway, I hope this helps.
I want to do something like this: I want to keep all my packages global, just like Node itself. For example, my package.json has a package called "highcharts". I want to install it globally instead of creating a local node_modules folder, so that the next time I create a copy of my project folder I can use Highcharts directly without running npm install. Is it possible?
globally installed node_modules - > Users/user/AppData/Roaming/node_modules/highcharts
app
src
node_modules (I don't want to keep it)
package.json
tsconfig.json
angular.json
How can I link these globally installed node_modules to the current app, or any app I want to create?
Any help will be appreciated. Thank you so much :)
Local packages are installed in the project directory.
Global packages are installed in a single place on your system.
Usually it is a good idea to have all npm packages required for your project installed locally (in the project folder). This ensures that you can have dozens of applications running different versions of each package if needed.
export NODE_PATH='yourdir'/node_modules
Hello, if I am getting this right, you want to keep all dependencies global.
You can just run install with the -g flag. Those libraries will be available in the Node installation folder.
From the Node docs
If the NODE_PATH environment variable is set to a colon-delimited list of absolute paths, then node will search those paths for modules if they are not found elsewhere. (Note: On Windows, NODE_PATH is delimited by semicolons instead of colons.)
Additionally, node will search in the following locations:
1: $HOME/.node_modules
2: $HOME/.node_libraries
3: $PREFIX/lib/node
Where $HOME is the user's home directory, and $PREFIX is node's configured node_prefix.
These are mostly for historic reasons. You are highly encouraged to place your dependencies locally in node_modules folders. They will be loaded faster, and more reliably.
I hope that answers it; you just need to manage the paths to node_modules wherever you keep it.
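As a quick illustration of the NODE_PATH fallback described in the docs quote above (the directory and module name are made up for the demo):

```shell
# Create a fake "globally located" module outside any node_modules folder.
mkdir -p /tmp/shared_modules/mylib
printf 'module.exports = 42;\n' > /tmp/shared_modules/mylib/index.js

# Node falls back to NODE_PATH when the module is not found elsewhere.
NODE_PATH=/tmp/shared_modules node -e "console.log(require('mylib'))"
```

Note that this fallback applies to CommonJS require; as the docs say, local node_modules folders remain the faster and more reliable option.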
My Nuxt project uses system environment variables to set client ids, secrets, urls, etc...
An example is in my nuxt.config.js where I set several properties with the following formula:
{
something: process.env.SOMETHING || 'something_for_dev'
}
The Nuxt dev version works fine because it looks up process.env.SOMETHING and correctly falls back to something_for_dev.
Nuxt on staging has its own configuration on Azure, and the SOMETHING env var is correctly set there, yet it still continues using something_for_dev...
What should I do to make Nuxt use the system env vars I set on my server rather than the defaults used for dev? Thanks
Env variables are set at build time, not runtime. So the variables in effect are the ones set during your build, which it seems you run on your dev machine.
So you can either build with the proper env variables, or use the nuxt-env module, which allows runtime variables. Keep in mind that it won't let webpack eliminate dead code, and environment variables used in nuxt-env are exposed client-side, so if you store secrets, use the secret config option.
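A small sketch of why the fallback wins: the expression from nuxt.config.js is evaluated when the build runs, not when the deployed instance starts, so a variable set only on the running instance never reaches it (SOMETHING is the variable from the question):

```javascript
// Evaluated at build time in Nuxt; here we just run it in plain Node.
// If SOMETHING is absent from the *build* environment, the dev fallback
// is what gets baked into the bundle, no matter what Azure sets at runtime.
const something = process.env.SOMETHING || 'something_for_dev';
console.log(something);
```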
In addition to Aldarund's comment above, you could set up proper env variables in Nuxt as follows:
Use cross-env: npm install cross-env
In your project, add a folder named environment, and under that folder create configs for your different environments (e.g. development, staging, production).
Your environment configs will hold your baseUrl and other settings; for development you could point at localhost, while for staging or production you could point at your API_URL.
defaults.json (development)
defaults.prod.json (production)
In nuxt.config.ts, under build > extend, configure your different environments.
This will replace defaults.json depending on which environment the script in package.json runs with.
In package.json, configure a script for each environment (e.g. npm run start uses NODE_ENV=development, which picks defaults.json with baseUrl http://localhost:3000, and npm run build uses defaults.prod.json with baseUrl http://www.API_URL.com and the other configs).
For more detail you could see cross-env
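For instance, the package.json scripts could look like this (the nuxt commands and script names are illustrative):

```json
"scripts": {
  "start": "cross-env NODE_ENV=development nuxt",
  "build": "cross-env NODE_ENV=production nuxt build"
}
```

cross-env just makes the NODE_ENV=... assignment work the same way on Windows and Unix shells.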
Right now I have a NodeJS backend for a system I built. The problem is that I also need to maintain another instance of the backend for some client-specific requirements; the two instances share about 70-80% of the code. At the moment I'm using git branches to keep them apart. I know git is not meant for this, so I would like to know if there is something that allows me to have two separate projects sharing a codebase, similar to flavors in Android.
There are a few options to do this:
1. Install your own module as a separate dependency with npm via package.json.
create your own reusable code as a separate project in your git space
specify it as a dependency in package.json
use it by installing it with npm install, the same way you do with regular npm modules
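For option 1, the package.json entry for a git-hosted module might look like this (the package name, repository URL and tag are placeholders):

```json
"dependencies": {
  "my-reusable-code": "git+https://github.com/your-org/my-reusable-code.git#v1.0.0"
}
```

Pinning a tag or commit keeps both backend instances on a known version of the shared code.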
2. Use docker.
Docker is a container virtualisation engine that lets you create images of virtual environments with a pre-installed infrastructure/file system.
You just create an image with some Linux OS, Node and your module preinstalled, and all you need is to mount your unique code as a "volume" into the container, and that's it.
Use the official Node.js image (it has everything a basic Node.js environment would need) to create your own image. In the folder containing your /reusable_code folder and package.json, create a Dockerfile.
Dockerfile:
FROM node:6.9.2
RUN mkdir app
COPY ./reusable_code /app/reusable_code
COPY ./package.json /app/package.json
WORKDIR /app
RUN npm install -g <your global modules> && npm install
Now run docker build -t base-image-with-reusable-code .
It will create and tag the image as base-image-with-reusable-code.
Now, whenever you want to use it with some unique code, run the following from the folder where that code lives (this assumes all unique code uses the same package.json dependencies as in the previous step; if not, an extra step is needed):
docker run -ti -v "$(pwd)/app.js":/app/app.js -v "$(pwd)/unique_code":/app/unique_code base-image-with-reusable-code node app.js
Note that docker -v bind mounts need absolute paths, hence $(pwd). Of course the names should be corrected, and if you have a different project structure then the changes should reflect that.
3. Link the reusable code module folder via OS
Simply put, just run ln -s /path/to/reusable/code ./reusable_code from your unique code project's root folder, and then use it as if it resides at the root of every unique project you have linked it to.
4. Link the reusable code module folder via npm link
As suggested by @Paul:
the Node-native way to do #3 is via npm link, which both sets up the symlink and references it in your package.json so that your code can treat it as an external module
Assuming your reusable code folder is in the same folder as your unique code folders:
Modified example of the npm link docs:
cd ~/projects/unique_project1 # go into the dir of your main project
npm link ../reusable_code # link the dir of your dependency
Note: all solutions assume you have a separate git project for your reusable code. Generally speaking, that's good practice.