I am a newbie to Node.js. I have different configs for different environments (dev, prod, etc.). Currently, while creating my app package, I copy the environment-specific .json file to config.json, then export config.json as a global config variable and use it throughout the app.
config = require(__dirname + '/config/config');
(config.httpProxy && config.httpProxy.enabled);
I want to load the environment-specific .json based on an environment variable (for dev, all keys from dev.json would be exported as a global variable in the app) instead of copying it into the app's config.json, so that the same app package can be used in different environments. How can I do that?
PS: for application packaging support and dependency management I use gulp and npm.
Please help.
You can name your files like this:
config.development.json
config.production.json
config.test.json
Then load files as:
config = require(__dirname + '/config/config.' + process.env.NODE_ENV);
where the value of process.env.NODE_ENV can be development/production/test.
You have to start your application as
NODE_ENV=development node app.js
for this to work.
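For example, with NODE_ENV=development this picks up config/config.development.json; the httpProxy keys below are just an illustration, and the fallback to 'development' is an optional safeguard for when NODE_ENV is unset:
// config/config.development.json (illustrative content):
// { "httpProxy": { "enabled": true, "host": "localhost", "port": 8080 } }

var config = require(__dirname + '/config/config.' + (process.env.NODE_ENV || 'development'));

if (config.httpProxy && config.httpProxy.enabled) {
  // proxy-specific setup goes here, e.g. using config.httpProxy.host
}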
I suggest you use the module called config; it handles all your per-environment config files.
https://www.npmjs.com/package/config
Just make a folder named config and create files in it such as:
1. development.json
2. qa.json
3. production.json
While starting the server, provide the relevant environment as others mentioned.
Then you can use any property mentioned in your config files.
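As a rough sketch of how values are read with that module (the file contents and keys here are just examples):
// config/development.json (example): { "db": { "host": "localhost", "port": 27017 } }
const config = require('config');

const dbHost = config.get('db.host');   // throws if the key is missing
if (config.has('db.port')) {
  // optional keys can be guarded with config.has()
}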
If you are running your project from an npm script, then set NODE_ENV in your package.json file.
{
...
"scripts": {
"nodemon:server": "NODE_ENV=dev NODE_PATH=. nodemon --exec \"babel-node --stage 1\" server.js",
"prod:server": "NODE_ENV=prod NODE_PATH=. nodemon --exec \"babel-node --stage 1\" server.js"
},
"author": "'Shubham Batra'",
"license": "MIT",
....
}
"prod:server": "**NODE_ENV=prod** NODE_PATH=. nodemon --exec \"babel-node --stage 1\" server.js"
Then use it in a config.js file, e.g.:
let url, dbUrl, password;

if (process.env.NODE_ENV === 'test') {
  url = 'http://abc:3000';
  dbUrl = 'xyz';
  password = '***';
} else if (process.env.NODE_ENV === 'prod') {
  url = 'http://def:3000';
  ...
}

export default {
  conf: {
    url,
    dbUrl,
    ...
  }
}
After that, you can import this config file anywhere in your project and use conf.
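For example (assuming the file above is saved as config.js), importing it elsewhere could look like:
import config from './config';

console.log(config.conf.url); // the value chosen for the current NODE_ENV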
Related
I am currently developing my own npm package, and I created a separate project that downloads this package from npm for an independent test. The package is being developed in TypeScript, and I have a main file with several additional module files. In my main file, I import all of the classes from the other modules and then export all of them from the main file. I don't know if this is good practice, but when I run the test project, it says it can't find the module, even though the path it specifies exists in the working directory.
Code Snippets:
Main file:
import { EventBus } from "./modules/eventbus/eventbus";
import { EventHandler } from "./modules/eventbus/eventhandler";
import { EventType } from "./modules/eventbus/eventtype";
import { Event } from "./modules/eventbus/event";
import { SemVer } from "./modules/semver";
export { SemVer, Event, EventBus, EventHandler, EventType };
Error:
Error [ERR_MODULE_NOT_FOUND]: Cannot find module '/workspaces/epic-engine-testing/node_modules/epic-engine/lib/modules/eventbus/eventbus' imported from /workspaces/epic-engine-testing/node_modules/epic-engine/lib/index.js
Testing file:
import { EventBus, EventHandler, EventType, Event } from "epic-engine";
class SomeType extends EventType {
constructor() {
super();
}
}
const eventbus = new EventBus();
const handler = new EventHandler<SomeType>(eventbus, "type", () => {});
eventbus.createHandler(handler);
const event = new Event<SomeType>(eventbus, new SomeType(), "type");
package.json:
{
"devDependencies": {
"#tsconfig/esm": "^1.0.2",
"#types/jest": "^29.2.3",
"jest": "^29.3.1",
"ts-jest": "^29.0.3",
"tslint": "^6.1.3",
"typescript": "^4.9.3"
},
"name": "epic-engine",
"description": "Pure TS engine developed by EpicPuppy613",
"version": "0.1.0-dev.5",
"main": "lib/index.js",
"types": "lib/index.d.ts",
"type": "module",
"scripts": {
"test": "jest --config jestconfig.json",
"build": "tsc",
"prepare": "npm run build",
"lint": "tslint -p tsconfig.json",
"prepublishOnly": "npm test && npm run lint",
"preversion": "npm run lint",
"version": "git add -A src",
"postversion": "git push && git push --tags"
},
"repository": {
"type": "git",
"url": "git+https://github.com/EpicPuppy613/epic-engine.git"
},
"author": "EpicPuppy613",
"license": "MIT",
"bugs": {
"url": "https://github.com/EpicPuppy613/epic-engine/issues"
},
"homepage": "https://github.com/EpicPuppy613/epic-engine#readme",
"files": [
"lib/**/*"
]
}
I tried a bunch of things including changing the references to use .js, using absolute paths instead, and changing some settings in tsconfig.json.
Why is Node.js not finding the submodules or would it be better to export the modules in a different way?
Save time with npm link command
I created a separate project to download this package from npm for an independent test.
First, you can use the handy npm link command to save yourself the trouble of uploading your package just so you can test. As per the docs the npm link command:
...is handy for installing your own stuff, so that you can work on it and test iteratively without having to continually rebuild.
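As a quick sketch with your package name, you would link the package globally from its own folder and then link it into the test project:
# in the epic-engine package folder
npm link

# in the epic-engine-testing project folder
npm link epic-engine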
Install your package as a dependency
With that out of the way, I think the hint is in the error message. Note it says:
Cannot find module /workspaces/epic-engine-testing/node_modules/...
Here it seems Node is looking for the file in the epic-engine-testing project, so you must have the package.json file for the test project reference your package (i.e. the one you want to test). So go into your epic-engine-testing project folder and at the terminal type npm install epic-engine@0.1.0-dev.5. That should install your package so it can be found. If that doesn't resolve it, you'll need to share the package.json file for the epic-engine-testing project to help us see what's going on.
Using the export { ... } from '...' syntax
Your main file can use the re-export syntax and be simplified to this when exporting:
export { EventBus } from "./modules/eventbus/eventbus";
export { EventHandler } from "./modules/eventbus/eventhandler";
export { EventType } from "./modules/eventbus/eventtype";
export { Event } from "./modules/eventbus/event";
export { SemVer } from "./modules/semver";
// the line below is not necessary when using above syntax.
// export { SemVer, Event, EventBus, EventHandler, EventType };
After the .ts files are compiled into .js files, wherever you have import syntax, Node.js looks for .js files to resolve the imports. So each import needs to be given the module name with an explicit .js extension.
You may need to read this doc on how the import mechanism works in Node.js.
To fix this issue, you have multiple choices (since the target you've defined is ES6):
Change moduleResolution to NodeNext and add the .js extension whenever you import modules in TypeScript:
import { EventBus } from "./modules/eventbus/eventbus.js";
import { EventHandler } from "./modules/eventbus/eventhandler.js";
import { EventType } from "./modules/eventbus/eventtype.js";
...
You don't need to worry about those extensions; TypeScript is smart enough about them. Based on this comment from one of the TypeScript contributors:
.js file extensions are now allowed
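For reference, a minimal tsconfig.json sketch for that first option (outDir and declaration here are assumptions based on the package.json above; your other settings stay as they are):
{
  "compilerOptions": {
    "target": "ES6",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "outDir": "lib",
    "declaration": true
  }
}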
Use the rollup package. Rollup won't manipulate your source files; instead, it bundles your output files.
Here is my excel.js:
let test = async () => {
console.log(process.env.DATABASE_HOST);
.......
}
test();
Here is my package.json fragment:
"scripts": {
.............
"excel": "cross-env NODE_ENV=development node ./server/excel.js",
"test": "react-scripts test"
}
My .env.development is stored in the application root folder.
Here is my .env.development:
DATABASE_HOST=dbServer
When I execute the following command line in the application root folder:
npm run excel
It should print "dbServer"; unfortunately, it prints undefined.
How can I fix it?
Install the dotenv package and require it at the top of your script: require('dotenv').config().
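Note that dotenv loads a file named .env by default, so a file called .env.development will not be picked up unless you point dotenv at it via the path option. A minimal sketch, assuming the script is run from the project root as in your npm script:
// at the top of server/excel.js, before process.env.DATABASE_HOST is read
require('dotenv').config({ path: `.env.${process.env.NODE_ENV}` });

let test = async () => {
  console.log(process.env.DATABASE_HOST); // should now print "dbServer"
};

test();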
I am setting up my React app project using create-react-app.
I was wondering if there is a way to turn off the chunking mechanism that is built into react-scripts. The thing is that I need to fix the name of the bundle created by the build.
It can be done by extending your CRA with the react-app-rewired package, which allows you to modify the webpack config.
The changes needed to remove the hash from build file names:
Install react-app-rewired
npm install react-app-rewired --save-dev
Create a config-overrides.js file in your root folder (where package.json is).
Place the following code in the config-overrides.js file. It keeps all CRA settings and only removes the hash part from filenames.
module.exports = function override(config, env) {
config.output = {
...config.output, // copy all settings
filename: "static/js/[name].js",
chunkFilename: "static/js/[name].chunk.js",
};
return config;
};
Use the new config: in the scripts section of package.json, replace "build": "react-scripts build" with "build": "react-app-rewired build".
Unless you are going to change more configuration, it is enough to use react-app-rewired only in build. Otherwise, replace react-scripts with react-app-rewired in the other scripts except eject.
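For reference, the resulting scripts section would then look roughly like this:
"scripts": {
  "start": "react-scripts start",
  "build": "react-app-rewired build",
  "test": "react-scripts test",
  "eject": "react-scripts eject"
}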
I've found that you can disable chunking by setting the splitChunks webpack configuration. For more details check https://github.com/facebook/create-react-app/issues/5306#issuecomment-431431877
However, this does not remove the contenthash part from the bundle name and you will still have that random string in the name.
To remove this, go to your webpack.config and edit the bundle name
'static/js/[name].[contenthash:8].js' => 'static/js/[name].js'
This is an extended and improved version of Darko's answer. I created it mostly to save time for others who are not fully satisfied with the solution mentioned in this comment and didn't have the patience to dig down to the comment that solved the issue in a much nicer way.
The main idea of this "hacky" approach is to rewrite the standard react-scripts webpack configuration on the fly and inject it back into the original build script.
For that you would need to install rewire package from npmjs.org, like so:
npm install rewire --save-dev
Then you create a separate build script that will "wrap" the original React build script and make sure it receives the corrected webpack configuration. The conventional way is to save this file inside the ./scripts folder, so let's call it ./scripts/build.js. Its content:
const rewire = require('rewire');
const path = require('path');
// Pointing to file which we want to re-wire — this is original build script
const defaults = rewire('react-scripts/scripts/build.js');
// Getting configuration from original build script
let config = defaults.__get__('config');
// If we want to move build result into a different folder, we can do that!
// Please note: that should be an absolute path!
config.output.path = path.join(path.dirname(__dirname), 'custom/target/folder');
// If we want to rename resulting bundle file to not have hashes, we can do that!
config.output.filename = 'custom-bundle-name.js';
// And the last thing: disabling splitting
config.optimization.splitChunks = {
cacheGroups: {
default: false,
},
};
config.optimization.runtimeChunk = false;
Then, we should use this build script instead of the standard one in our package.json, something like so:
...
"scripts": {
"start": "react-scripts start",
"build": "node ./scripts/build.js",
"test": "react-scripts test",
"eject": "react-scripts eject"
},
...
As others have pointed out you can try this with react-app-rewired instead of ejecting. Here is a version that also handles css and media files:
After installing npm install react-app-rewired --save-dev I created a config-overrides.js with the following content:
module.exports = function override(config, env) {
if (env !== "production") {
return config;
}
// Get rid of hash for js files
config.output.filename = "static/js/[name].js"
config.output.chunkFilename = "static/js/[name].chunk.js"
// Get rid of hash for css files
const miniCssExtractPlugin = config.plugins.find(element => element.constructor.name === "MiniCssExtractPlugin");
miniCssExtractPlugin.options.filename = "static/css/[name].css"
miniCssExtractPlugin.options.chunkFilename = "static/css/[name].css"
// Get rid of hash for media files
config.module.rules[1].oneOf.forEach(oneOf => {
if (!oneOf.options || oneOf.options.name !== "static/media/[name].[hash:8].[ext]") {
return;
}
oneOf.options.name = "static/media/[name].[ext]"
});
return config;
};
I don't know how to turn off chunking, but here is what you could do to try to achieve your goal:
Update to the latest react and react-dom: run 'yarn add react@next react-dom@next' (or the equivalent npm command).
You should now have the latest react versions - so you can code split using React.lazy/React.Suspense, use hooks and so on.
So now you can name your chunks using (component or dependency examples below)
const MyComp = lazy(() => import(/* webpackChunkName: "MyChunkName" */ './MyComp'));
const myLib = await import(/* webpackChunkName: "myLib" */ 'myLib');
If you get errors when using the dynamic import syntax, you need the babel-plugin-syntax-dynamic-import plugin. Add it to the "babel" field in your package.json.
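For example, a minimal package.json fragment for that plugin (a sketch assuming the Babel 6-style setup this answer refers to; install babel-plugin-syntax-dynamic-import as a dev dependency first):
"babel": {
  "plugins": ["syntax-dynamic-import"]
}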
Now you can name your chunks and implement the latest way to code split - hope that helps. Here is a link to React.lazy React.Suspense - https://reactjs.org/blog/2018/10/23/react-v-16-6.html
There is a hack without needing eject:
yarn add --dev rewire
Create a file at ./scripts/build-non-split.js (matching the build script below) and fill it with the following code:
const rewire = require('rewire');
const defaults = rewire('react-scripts/scripts/build.js');
let config = defaults.__get__('config');
config.optimization.splitChunks = {
cacheGroups: {
default: false,
},
};
config.optimization.runtimeChunk = false;
change the build script inside your package.json to:
"build": "node ./scripts/build-non-split.js",
yarn build
I am trying to build a different bundle based on an argument passed to webpack.
I have a create-react-app that I have ejected from, and currently if I do npm run build it builds a bundle using webpack. As I have both an English and a Spanish version of the site, I was hoping that I could pass an argument here, i.e. to build a Spanish version something like npm run build:es.
My webpack file currently just builds the English bundle. There is a separate process during the application to pull in translations, but during the building of the bundle it would be great if I could stipulate which language to build the bundle for.
Anyone have any ideas?
The relevant webpack code is below:
//default messages for translations
var defaultMessages = require('/translations/en.json');
//more webpack stuff......
{
test: /\.(js|jsx)$/,
loader: require.resolve('string-replace-loader'),
query: {
multiple: Object.keys(defaultMessages).map(key => ({
search: `__${key}__`,
replace: defaultMessages[key]
}))
}
},
Webpack can receive a --env argument, which is then passed to the webpack.config file. The key is to export a function returning the configuration from your webpack.config.js, not the raw configuration.
$ webpack --env=lang_es
And in webpack.config.js:
module.exports = function(env) {
if (env === 'lang_es') {
// ...
}
return {
module: {
// ...
},
entry: {
// ...
}
}
}
And in package.json:
"scripts": {
"build_es": "webpack --env=lang_es",
}
This was originally really meant to distinguish between build types, e.g. development or production, but it's just a string passed into the config file - you can give it any meaning you want.
As hinted in the comments, using environment variables is the second, webpack-independent, approach. You can set the environment variable directly in package.json's scripts section:
"scripts": {
"build_es": "BUILD_LANG=es webpack",
}
(Use cross-env to set the environment when developing on Windows).
And in webpack.config.js:
if (process.env.BUILD_LANG === 'es') {
// ...
}
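Applied to the translation loading from your original webpack config, that check could look something like this (the translations path is an assumption based on your snippet):
// webpack.config.js: pick the messages file from the environment variable
var lang = process.env.BUILD_LANG || 'en';
var defaultMessages = require('./translations/' + lang + '.json');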
This environment-based approach has been used in a few places already (for example Babel's BABEL_ENV variable), so I'd say that it has gathered enough mileage to consider it proven and tested.
Edit: fixed the cross-env part to mention that it's only necessary on Windows.
Is there a way to automatically zip certain files at the build time with Node.js and npm?
For example, I have a project, that file structure looks like this:
Project/
--lib/
--node_modules/
--test/
--index.js
--package.json
I want to be able to zip the lib folder, certain modules from node_modules, and index.js into a zip archive so I can upload it to AWS Lambda, for example. I do not need the test folder or the test-only Node.js modules (mocha and chai) to be zipped. I have even created a bash script for generating the zip file, but is there a way to automatically execute this script when 'npm install' is called?
This seems like a standard problem that should have a standard solution, but I was unable to find one.
UPDATE
Thanks to michael, I decided to use gulp. This is my script, in case someone else needs it for AWS Lambda:
var gulp = require('gulp');
var clean = require('gulp-clean');
var zip = require('gulp-zip');
var merge = require('merge-stream');
gulp.task('clean', function () {
var build = gulp.src('build', {read: false})
.pipe(clean());
var dist = gulp.src('dist', {read: false})
.pipe(clean());
return merge(build, dist);
});
gulp.task('build', function() {
var index = gulp.src('index.js')
.pipe(gulp.dest('build'));
var lib = gulp.src('lib/**')
.pipe(gulp.dest('build/lib'));
var async = gulp.src('node_modules/async/**')
.pipe(gulp.dest('build/node_modules/async'));
var collections = gulp.src('node_modules/collections/**')
.pipe(gulp.dest('build/node_modules/collections'));
var underscore = gulp.src('node_modules/underscore/**')
.pipe(gulp.dest('build/node_modules/underscore'));
var util = gulp.src('node_modules/util/**')
.pipe(gulp.dest('build/node_modules/util'));
var xml2js = gulp.src('node_modules/xml2js/**')
.pipe(gulp.dest('build/node_modules/xml2js'));
return merge(index, lib, async, collections, underscore, util, xml2js);
});
gulp.task('zip', ['build'], function() {
return gulp.src('build/*')
.pipe(zip('archive.zip'))
.pipe(gulp.dest('dist'));
});
gulp.task('default', ['zip']);
I realize this answer comes years too late for the original poster. But I had virtually the same question about packaging up a Lambda function, so for posterity, here's a solution that doesn't require any additional devDependencies (like gulp or grunt) and just uses npm pack along with the following package.json (but does assume you have sed and zip available to you):
{
"name": "my-lambda",
"version": "1.0.0",
"scripts": {
"postpack": "tarball=$(npm list --depth 0 | sed 's/#/-/g; s/ .*/.tgz/g; 1q;'); tar -tf $tarball | sed 's/^package\\///' | zip -#r package; rm $tarball"
},
"files": [
"/index.js",
"/lib"
],
"dependencies": {
"async": "*",
"collections": "*",
"underscore": "*",
"util": "*",
"xml2js": "*"
},
"bundledDependencies": [
"async",
"collections",
"underscore",
"util",
"xml2js"
],
"devDependencies": {
"chai": "*",
"mocha": "*"
}
}
Given the above package.json, calling npm pack will produce a package.zip file that contains:
index.js
lib/
node_modules/
├── async/
├── collections/
├── underscore/
├── util/
└── xml2js/
The files array is a whitelist of what to include. Here, it's just index.js and the lib directory.
However, npm will also automatically include package.json, README (and variants like README.md), CHANGELOG (and its variants), and LICENSE (and the alternative spelling LICENCE) unless you explicitly exclude them (e.g. with .npmignore).
The bundledDependencies array specifies what packages to bundle. In this case, it's all the dependencies but none of the devDependencies.
Finally, the postpack script is run after npm pack because npm pack generates a tarball, but we need to generate a zip for AWS Lambda.
A more detailed explanation of what the postpack script is doing is available at https://hackernoon.com/package-lambda-functions-the-easy-way-with-npm-e38fc14613ba (and is also the source of the general approach).
If you're on a UNIX-based system, you could also just use the zip command in one of your scripts:
"scripts": {
"zip": "zip -r build.zip build/"
"build": "build",
"build-n-zip": "build && zip
}
The above creates a build.zip at the root, which is a zipped up version of the /build folder.
If you wanted to zip multiple folders/files, just add them to the end:
"scripts": {
"zip": "zip -r build.zip build/ some-file.js some-other-folder/"
}
Note
If a build.zip already exists in the folder, the default behaviour is for zip to add files to that existing archive. So many people who are continuously building will probably want to delete the build.zip first:
"scripts": {
"zip": "rm -f build.zip && zip -r build.zip build",
"build": "build",
"build-n-zip": "yarn build && yarn zip"
}
I would go with gulp, using gulp-sftp, gulp-tar and gulp-gzip, plus an alias as the command. Create a file called .bash_aliases in your user's home folder containing:
alias installAndUpload='npm install && gulp runUploader'
After reloading your shell (or opening a new terminal) you can call both actions at once with this alias.
A gulp file could look something like this
var gulp = require('gulp');
var watch = require('gulp-watch');
var sftp = require('gulp-sftp');
var tar = require('gulp-tar');   // needed for the tar() step below
var gzip = require('gulp-gzip');

gulp.task('runUploader', function () {
  return gulp.src('./path/to/folder/to/compress/**')
    .pipe(tar('archive.tar'))
    .pipe(gzip())
    .pipe(gulp.dest('path/to/folder/to/store')) // if you want a local copy
    .pipe(sftp({
      host: 'website.com',
      user: 'johndoe',
      pass: '1234'
    }));
});
Of course, you can also add gulp-watch to automatically create the tar/zip and upload it whenever there is a change in the directory.
You should take a look at npm scripts.
You'll still need a bash script lying around in your repository, but it will be automatically triggered by some npm tasks when they are executed.
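For instance, assuming your existing bash script is saved as scripts/zip.sh and is executable, a postinstall hook will run it automatically right after npm install finishes:
"scripts": {
  "postinstall": "./scripts/zip.sh"
}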
npm-pack-zip worked for me.
npm install --save-dev npm-pack-zip
To publish the whole lambda using the AWS CLI, I used this npm script in package.json:
"publish": "npm-pack-zip && aws lambda update-function-code --function-name %npm_package_name% --zip-file fileb://%npm_package_name%.zip && rm %npm_package_name%.zip"
You can use Zip-Build; this little package will use the data in your package.json file and create a compressed file named project-name_version.zip.
Disclaimer: I am a developer of this library.
How to use zip-build
Just install it in your project as a dev dependency with:
$ npm install --save-dev zip-build
Then modify the build script in your package.json, adding && zip-build at the end, like this:
"scripts": {
"build": your-build-script && zip-build
}
If your build directory is named something other than build and your desired directory for compressed files is named something other than dist, you can provide the directory names as arguments to zip-build:
"scripts": {
"build": your-build-script && zip-build build-dirname zip-dirname
}
If you need to automate tasks, take a look at Grunt or Gulp.
In the case of Grunt, the needed plugins are:
https://www.npmjs.com/package/grunt-zip
https://www.npmjs.com/package/grunt-aws-lambda
Check out my gist at https://gist.github.com/ctulek/6f16352ebdfc166ce905
This uses gulp for all the tasks you mentioned except creating the lambda function initially (it only updates the code)
It assumes every lambda function is implemented in its own folder, and you need to define your AWS credential profile.