Why don't I get any value from process.env? - javascript

I want to use environment variables in my Angular project to hide sensitive information. I have followed this tutorial using dotenv https://javascript.plainenglish.io/setup-dotenv-to-access-environment-variables-in-angular-9-f06c6ffb86c0 but I can't seem to get any value from process.env, and I can't figure out why.
setenv.ts file
const { writeFile } = require('fs');
const { argv } = require('yargs');

// read environment variables from .env file
require('dotenv').config();

// read the command line arguments passed with yargs
const environment = argv.environment;
const isProduction = environment === 'prod';
const targetPath = isProduction
  ? `./src/environments/environment.prod.ts`
  : `./src/environments/environment.ts`;

if (!process.env.API_KEY || !process.env.ANOTHER_API_KEY) {
  console.error('All the required environment variables were not provided!');
  process.exit(-1);
}

// we have access to our environment variables
// in the process.env object thanks to dotenv
const environmentFileContent = `
export const environment = {
  production: ${isProduction},
  APP_ID: "${process.env.APP_ID}",
  API_KEY: "${process.env.API_KEY}"
};
`;

// write the content to the respective file
writeFile(targetPath, environmentFileContent, function (err) {
  if (err) {
    console.log(err);
  }
  console.log(`Wrote variables to ${targetPath}`);
});
package.json
"ng": "ng",
"config": "npx ts-node ./scripts/setenv.ts",
"start": "npm run config -- --environment=dev && ng serve",
"build": "npm run config -- --environment=prod && ng build",
.env
APP_ID=myAppId
API_KEY=myApi_Key

I believe your path is wrong. The way I do it is to import environment at the top of the file, using a relative path; for example, from src/app/services/api.service.ts to src/environments/environment.ts:
import { environment } from '../../environments/environment';
And then just extract values you need:
BACKEND_URL = environment.apiUrl;

I suspect you don't have the .env file in the right place. Can you try this:
require('dotenv').config({ path: '/custom/path/to/.env' })
See here https://www.npmjs.com/package/dotenv

Related

How can I dynamically import a directory of files from within a loop

I am attempting to import files from a directory from within a loop, to dynamically generate routes. As you can see in the "Working from index file (my initial code)" section below, I attempted to use import.meta.url because I wanted this to be a sync operation. The error I got was as follows:
First Error:
[ERROR] 11:51:22 Unable to compile TypeScript:
src/routes/index.ts(26,33): error TS1343: The 'import.meta'
meta-property is only allowed when the '--module' option is 'es2020',
'es2022', 'esnext', 'system', 'node12', or 'nodenext'.
This is what I tried.
After some Google searches, I tried the code in the "One of the things I tried" section below. I ran into this error, which is strange to me since I am actually using a dynamic import.
[ERROR] 11:52:20 Error: require() of ES Module
/app/src/routes/module-streams.js from /app/src/routes/index.ts not
supported. Instead change the require of module-streams.js in
/app/src/routes/index.ts to a dynamic import() which is available in
all CommonJS modules.
Honestly, even if I got the second approach working, I am not thrilled about having to deal with promises. I want to import the function of each of the many routes, invoke it, and create the routes during startup. This is a mock server, and I am attempting to reduce the boilerplate code for adding new JSON. The funny thing is that I was able to get my first approach working in another place in the app, importing JSON; the code is almost identical and I get no complaint.
Really, all I want to do is dynamically get the files and invoke the functions synchronously if possible. I never imagined this would be so problematic. It certainly was not with require.
package.json
{
  "name": "mock-server",
  "version": "1.0.0",
  "description": "",
  "main": "server.js",
  "type": "module",
  "scripts": {
    "start": "ts-node-dev src/server.ts",
    "test": "jest --watchAll --no-cache"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "@types/express": "^4.17.13",
    "@types/mongoose": "^5.11.97",
    "express": "^4.17.3",
    "mongoose": "^6.3.0",
    "typescript": "^4.6.3"
  },
  "devDependencies": {
    "ts-node-dev": "^1.1.8"
  }
}
tsconfig
{
  "compilerOptions": {
    "target": "es5",
    "module": "commonjs",
    "strict": true,
    "esModuleInterop": true,
    "forceConsistentCasingInFileNames": true
  }
}
Dockerfile
FROM node:alpine
WORKDIR /app
COPY package.json .
RUN npm install --only=prod
COPY . .
CMD ["npm", "start"]
One of the route files I am attempting to import
import express from 'express';
import mongoose from "mongoose";

const ModuleStreams = mongoose.model('ModuleStreams', new mongoose.Schema({
  moduleName: String,
  streamName: String,
  status: String,
  softwareSourceId: String,
  profiles: Array
}, { collection: 'module-streams' }));

export default (services) => {
  let router = express.Router();
  router.route('/')
    .get((req, res, next) => {
      ModuleStreams.find({}, function(err, data) {
        console.log(data);
        res.send(data);
      });
    });
  return router;
};
Working from index file (my initial code)
import fs from 'fs';
import { createRequire } from "module";

export const routes = () => {
  const require = createRequire(import.meta.url);
  const dir = process.cwd() + '/src/routes';
  const paths = fs.readdirSync(dir, { withFileTypes: true })
    .filter(item => !item.isDirectory())
    .map(item => item.name);
  paths.forEach(filePath => {
    const fileNameSegments = filePath.split('.');
    const routeName = fileNameSegments[0];
    if (routeName !== 'index') {
      const content = require(`./${filePath}`);
      const test = import(`./${filePath}`);
    }
  });
};
One of the things I tried
import fs from 'fs';

export const routes = () => {
  const dir = process.cwd() + '/src/routes';
  const paths = fs.readdirSync(dir, { withFileTypes: true })
    .filter(item => !item.isDirectory())
    .map(item => item.name);
  paths.forEach(filePath => {
    const fileNameSegments = filePath.split('.');
    const routeName = fileNameSegments[0];
    if (routeName !== 'index') {
      const test = import(`./${filePath}`);
    }
  });
};

Load separate .env from package.json

"start:dev": "set NODE_ENV=development&&nodemon ./bin/www",
"start:test": "set NODE_ENV=testing&&nodemon ./bin/www",
I have two separate .env files dev.env and test.env
I want to load dev.env on npm run start:dev and load test.env on npm run start:test
I have searched everywhere on the Internet, but found no help.
Any help is appreciated.
You can only set NODE_ENV in the npm script; to load a particular env file you need to write code in your server file.
Import dotenv in your server file:
import dotenv from "dotenv";
or
const dotenv = require("dotenv");
Use the code below to load the particular env file:
const fs = require("fs"); // needed for existsSync / readFileSync

let envConfig = {};
if (process.env.NODE_ENV === "development") {
  if (fs.existsSync(".env.development")) {
    envConfig = dotenv.parse(fs.readFileSync(".env.development"));
  }
} else if (process.env.NODE_ENV === "testing") {
  if (fs.existsSync(".env.test")) {
    envConfig = dotenv.parse(fs.readFileSync(".env.test"));
  }
}
for (const k in envConfig) {
  process.env[k] = envConfig[k];
}
The dotenv NPM package loads a file called .env by default, but this behaviour can be overridden. So you can do something like:
const { config } = require('dotenv')

if (process.env.NODE_ENV === 'development') {
  config({ path: '/full/path/to/your/dev.env' })
} else if (process.env.NODE_ENV === 'testing') {
  config({ path: '/full/path/to/your/test.env' })
}
I believe this answer has a solution for you:
"scripts": {
  "set-env:production": "export $(cat .production.env | grep \"^[^#;]\" | xargs)",
  "set-env:development": "export $(cat .env | grep \"^[^#;]\" | xargs)"
}

Proxy to backend with default Next.js dev server

Before, when I made apps with create-react-app, I would have a setupProxy.js file that would route API requests similar to this
const proxy = require('http-proxy-middleware');

module.exports = function(app) {
  app.use('/api',
    proxy({
      target: 'http://localhost:8000',
      changeOrigin: true,
    })
  );
};
But that doesn't seem to work with Next.js. When I do the same thing, I get various errors.
Googling for a solution, a lot of answers say to use a custom server of some kind. But how would I accomplish a proxy like the one above using the default Next.js dev server? (The equivalent of npm run dev when dev in my package.json is next dev.)
There is now an official feature for this in the config: Rewrites.
Besides normal path rewrites, they can also proxy requests to another web server.
next.config.js:
module.exports = {
  async rewrites() {
    return [
      {
        source: '/api/:path*',
        destination: 'http://localhost:8000/:path*' // Proxy to Backend
      }
    ]
  }
}
See Next.js Docs Rewrites
My server.js setup; hope it helps:
import express from 'express';
import next from 'next';
import proxy from 'http-proxy-middleware';

const port = parseInt(process.env.PORT, 10) || 3000;
const dev = process.env.NODE_ENV !== 'production';
const app = next({ dev });
const handle = app.getRequestHandler();

app.prepare().then(() => {
  const server = express();
  server.use(
    '/api',
    proxy({
      target: process.env.API_HOST,
      changeOrigin: true,
    }),
  );
  server.all('*', (req, res) => handle(req, res));
  server.listen(port, err => {
    if (err) throw err;
    console.log(`> Ready on http://localhost:${port}`);
  });
});
package.json:
"scripts": {
  "dev": "NODE_ENV=development node -r esm server.js",
  "build": "NODE_ENV=production next build",
  "start": "NODE_ENV=production node -r esm server.js"
},
Another solution with catch-all routes + http-proxy-middleware:
// pages/api/proxy/[...slug].js
import { createProxyMiddleware } from "http-proxy-middleware"; // @2.0.6

const proxy = createProxyMiddleware({
  target: process.env.BACKEND_URL,
  secure: false,
  pathRewrite: { "^/api/proxy": "" }, // remove `/api/proxy` prefix
});

export default function handler(req, res) {
  proxy(req, res, (err) => {
    if (err) {
      throw err;
    }
    throw new Error(
      `Request '${req.url}' is not proxied! We should never reach here!`
    );
  });
}
see: https://stackoverflow.com/a/72310680
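One caveat, based on my own experience rather than anything in the answer above: the catch-all route may also need Next.js's built-in body parsing disabled, so the raw request stream reaches http-proxy-middleware untouched (otherwise POST bodies can hang). The route-level config for that looks like:

```javascript
// pages/api/proxy/[...slug].js (same file, after the handler)
// Disable Next.js body parsing and its "response resolved" check so
// http-proxy-middleware can stream the request/response itself.
export const config = {
  api: {
    bodyParser: false,
    externalResolver: true,
  },
};
```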
Rewrites didn't work for me, and neither did using axios config.proxy.
I've resorted to a good old constant:
const SERVER_URL =
  process.env.NODE_ENV === 'production' ? 'https://produrl.com' : 'http://localhost:8000';

export async function getStaticProps() {
  const data = await axios.get(`${SERVER_URL}/api/my-route`)
  // ...
}
I would much rather proxy requests and keep my requests cleaner, but I don't have a day to spend wrestling with this.
Maybe this very simple setup will help others.

env-vars in React using Dotenv and Webpack

I want to access some of my environment variables defined in the frontend (React).
I have the following setup:
React + NodeJS (I do NOT use create-react-app)
Webpack 4
Dotenv
I have tried to follow https://medium.com/@trekinbami/using-environment-variables-in-react-6b0a99d83cf5#0618 but it does not work, and no error is thrown.
webpack.config.js
const dotenv = require("dotenv");
const webpack = require("webpack"); // required for DefinePlugin

module.exports = () => {
  // call dotenv and it will return an Object with a parsed key
  const env = dotenv.config().parsed;

  // reduce it to a nice object, the same as before
  const envKeys = Object.keys(env).reduce((prev, next) => {
    console.log(prev);
    prev[`process.env.${next}`] = JSON.stringify(env[next]);
    return prev;
  }, {});

  return {
    ...,
    plugins: [
      new webpack.DefinePlugin(envKeys)
    ],
    ...
  };
};
With the above Webpack config, I think I should be able to do <h4>My var: {process.env.REACT_APP_MY_VAR}</h4> in file.js; of course, I have defined REACT_APP_MY_VAR in my .env file located in the project root.
With the above, I expect file.js to render the value of REACT_APP_MY_VAR, but it renders nothing: neither the value nor an error.
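For what it's worth, DefinePlugin does plain compile-time text substitution: every value must be JSON-stringified, and the full expression process.env.REACT_APP_MY_VAR must appear literally in the source for it to be replaced. The reduce step can be sanity-checked in isolation (the env object below is a stand-in for dotenv.config().parsed):

```javascript
// Stand-in for dotenv.config().parsed, just for illustration.
const env = { REACT_APP_MY_VAR: 'hello' };

// Same reduce as in the webpack config above.
const envKeys = Object.keys(env).reduce((prev, next) => {
  prev[`process.env.${next}`] = JSON.stringify(env[next]);
  return prev;
}, {});

console.log(envKeys);
// { 'process.env.REACT_APP_MY_VAR': '"hello"' }
```

If envKeys comes out right, the next suspect is the .env file not being found relative to the directory webpack runs from.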
I would recommend using dotenv-webpack instead of dotenv package for easy configuration.
Four simple steps:
1) Install dotenv-webpack:
npm install dotenv-webpack --save
2) Create .env file at root of application
API_URL=http://localhost:8000
3) Add this to your webpack config file.
const Dotenv = require('dotenv-webpack');

module.exports = {
  ...
  plugins: [
    new Dotenv()
  ]
  ...
};
4) Use env variable inside your application anywhere.
import React from 'react';

const App = () => {
  return (
    <h1>{process.env.API_URL}</h1>
  );
}

export default App;
Hope that helps!!!

How to view files created within a Zeit Docker/Node container

According to the Zeit docs
There are no limitations inside Docker deployments when it comes to the file system. It's always writable and readable.
And indeed my little test seems to write files successfully:
app.get('/write', (req, res) => {
  console.log({
    __dirname,
    cwd: process.cwd()
  })
  const text = `some bit of text`
  const dirpath = path.resolve(process.cwd(), 'uploads')
  const fullpath = path.resolve(dirpath, `file-${+new Date()}.txt`)
  mkdirp(dirpath, function(error) {
    if (error) {
      console.error(error)
    } else {
      fs.writeFile(fullpath, text, error => {
        if (error) {
          console.error('error writing', error)
        } else {
          console.log(`file written at ${fullpath}`)
          fs.readdir(dirpath, function(err, items) {
            for (var i = 0; i < items.length; i++) {
              console.log(items[i])
            }
          })
          res.send('File written')
        }
      })
    }
  })
})
After several refreshes of the /write route, this will print the list of files. However, within the Zeit "source" panel, I only see the files copied by my Dockerfile.
For reference, my Dockerfile:
FROM node:carbon
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD [ "npm", "start" ]
# ^^^^^^^^^^^^ "start": "node ./build/server"
Within the Zeit/Now environment, is there any way to view/interact with these files, via SSH or some other method?
Nope. And that is because you can't access the state of the deployment, but only its source and logging!
It makes sense; after all, you should be running a stateless application...
