I currently have a package.json file that includes this script:
"build": "webpack --inline --colors --progress --display-error-details --display-cached",
I also have a webpack.config.js at the root of my repository.
I'm trying to make my npm scripts less verbose by moving them to separate .js files. I've done some simple ones (clean with rimraf and a simple copy), but I'm struggling with calling webpack from a (node) javascript file. This is what I've tried:
// contents of ./build/compile.js
var webpack = require('webpack');
var webpackConfig = require('../webpack.config.js');
webpack(webpackConfig);
This does nothing. I've also tried:
// contents of ./build/compile.js
var webpack = require('webpack');
webpack({
inline: true,
colors: true,
// and so on
});
This also does nothing. Just calling webpack() also does nothing...
And by nothing, I mean it also doesn't throw an error.
So how can I call webpack, make it use my config file, but also pass along the flags like --inline --colors --progress ...?
Regarding progress, this answer worked for me:
https://stackoverflow.com/a/31069781
var webpack = require('webpack');
var ProgressPlugin = require('webpack/lib/ProgressPlugin');
var cfg = require('../webpack.config.js');

var compiler = webpack(cfg);
compiler.apply(new ProgressPlugin(function (percentage, msg) {
  console.log((percentage * 100) + '%', msg);
}));

// compiler.run() is what actually starts the build; wrapping it in a
// Promise gives resolve/reject something to belong to.
new Promise(function (resolve, reject) {
  compiler.run(function (err, stats) {
    if (err) return reject(err);
    resolve(stats);
  });
});
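Note that even when run() succeeds, the build can still contain compilation errors; they live on the stats object. Below is a minimal sketch of a promisified helper that also checks stats (the helper name runCompiler is mine, not part of webpack's API):

```javascript
// Sketch: webpack(config) without a callback only *creates* a compiler;
// nothing happens until you call compiler.run(). This helper wraps run()
// in a Promise and treats compilation errors as failures too.
function runCompiler(compiler) {
  return new Promise(function (resolve, reject) {
    compiler.run(function (err, stats) {
      if (err) return reject(err);           // fatal webpack error
      if (stats.hasErrors()) {               // compilation errors
        return reject(new Error(stats.toString()));
      }
      resolve(stats);
    });
  });
}

module.exports = runCompiler;

// Usage (assumes webpack and a webpack.config.js are present):
// const webpack = require('webpack');
// runCompiler(webpack(require('../webpack.config.js')))
//   .then(stats => console.log(stats.toString({ colors: true })))
//   .catch(err => { console.error(err); process.exit(1); });
```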
Here is my excel.js:
let test = async () => {
console.log(process.env.DATABASE_HOST);
.......
}
test();
Here is my package.json fragment:
"scripts": {
.............
"excel": "cross-env NODE_ENV=development node ./server/excel.js",
"test": "react-scripts test"
}
My .env.development is stored in the application root folder.
Here is my .env.development:
DATABASE_HOST=dbServer
When I execute the following command line in the application root folder:
npm run excel
It should print "dbServer"; unfortunately, it prints undefined.
How can I fix it?
Install the dotenv package and require it at the top of your script: require('dotenv').config(). Note that dotenv loads .env by default; to load .env.development, pass the path explicitly: require('dotenv').config({ path: '.env.development' }).
I am setting up my React app project using create-react-app.
I was wondering if there is a way to turn off the chunking mechanism that is built into the react scripts. The thing is that I need to fix the name of the bundle created by the build.
It can be done by extending your CRA with react-app-rewired package which allows you to modify webpack config.
Here are the changes needed to remove the hash from the build file names.
Install react-app-rewired
npm install react-app-rewired --save-dev
create config-overrides.js file in your root folder (where package.json is)
place the following code into the config-overrides.js file. It keeps all CRA settings and only removes the hash part from filenames.
module.exports = function override(config, env) {
config.output = {
...config.output, // copy all settings
filename: "static/js/[name].js",
chunkFilename: "static/js/[name].chunk.js",
};
return config;
};
use the new config. In the scripts section of the package.json file, replace "build": "react-scripts build", with "build": "react-app-rewired build",
Unless you are going to change more configuration, it is enough to use react-app-rewired only in build. Otherwise, replace react-scripts with react-app-rewired in the other scripts, except eject.
I've found that you can disable chunking by setting splitChunks webpack configuration. For more details check https://github.com/facebook/create-react-app/issues/5306#issuecomment-431431877
However, this does not remove the contenthash part from the bundle name and you will still have that random string in the name.
To remove this, go to your webpack.config and edit the bundle name
'static/js/[name].[contenthash:8].js' => 'static/js/[name].js'
This is an extended and improved version of Darko's answer. I created it mostly to save time for others who are not fully satisfied with the solution mentioned in this comment and don't have the patience to dig down to this comment, which solved the issue in a much nicer way.
The main idea of this "hacky" approach is to rewrite the standard react-scripts webpack configuration on the fly and inject it back into the original scripts.
For that you would need to install rewire package from npmjs.org, like so:
npm install rewire --save-dev
Then you create a separate build script that "wraps" the original react build script and makes sure it receives the corrected webpack configuration. The conventional place to save this file is the ./scripts folder, so let's call it ./scripts/build.js. Its content:
const rewire = require('rewire');
const path = require('path');
// Pointing to file which we want to re-wire — this is original build script
const defaults = rewire('react-scripts/scripts/build.js');
// Getting configuration from original build script
let config = defaults.__get__('config');
// If we want to move build result into a different folder, we can do that!
// Please note: that should be an absolute path!
config.output.path = path.join(path.dirname(__dirname), 'custom/target/folder');
// If we want to rename resulting bundle file to not have hashes, we can do that!
config.output.filename = 'custom-bundle-name.js';
// And the last thing: disabling splitting
config.optimization.splitChunks = {
cacheGroups: {
default: false,
},
};
config.optimization.runtimeChunk = false;
Then we use this build script instead of the standard one in our package.json, like so:
...
"scripts": {
"start": "react-scripts start",
"build": "node ./scripts/build.js",
"test": "react-scripts test",
"eject": "react-scripts eject"
},
...
As others have pointed out, you can try this with react-app-rewired instead of ejecting. Here is a version that also handles css and media files:
After installing it (npm install react-app-rewired --save-dev), I created a config-overrides.js with the following content:
module.exports = function override(config, env) {
if (env !== "production") {
return config;
}
// Get rid of hash for js files
config.output.filename = "static/js/[name].js"
config.output.chunkFilename = "static/js/[name].chunk.js"
// Get rid of hash for css files
const miniCssExtractPlugin = config.plugins.find(element => element.constructor.name === "MiniCssExtractPlugin");
miniCssExtractPlugin.options.filename = "static/css/[name].css"
miniCssExtractPlugin.options.chunkFilename = "static/css/[name].css"
// Get rid of hash for media files
config.module.rules[1].oneOf.forEach(oneOf => {
if (!oneOf.options || oneOf.options.name !== "static/media/[name].[hash:8].[ext]") {
return;
}
oneOf.options.name = "static/media/[name].[ext]"
});
return config;
};
I don't know how to turn off chunking, but here is what you could do to try to achieve your goal:
Update to the latest react and react-dom: run yarn add react@next react-dom@next (or the equivalent npm command).
You should now have the latest react versions, so you can code split using React.lazy/React.Suspense, use hooks, and so on.
So now you can name your chunks using (component or dependency examples below)
const MyComp = lazy(() => import(/* webpackChunkName: 'MyChunkName' */ './MyComp'));
const myLib = await import(/* webpackChunkName: "myLib" */ 'myLib');
If you get errors when using the dynamic import syntax, you need the babel-plugin-syntax-dynamic-import plugin; put it in the "babel" field of your package.json.
Now you can name your chunks and implement the latest way to code split - hope that helps. Here is a link to React.lazy React.Suspense - https://reactjs.org/blog/2018/10/23/react-v-16-6.html
There is a hack without needing eject:
yarn add --dev rewire
Create a file in the root and name it build-non-split.js.
Fill it with the code below:
const rewire = require('rewire');
const defaults = rewire('react-scripts/scripts/build.js');
let config = defaults.__get__('config');
config.optimization.splitChunks = {
cacheGroups: {
default: false,
},
};
config.optimization.runtimeChunk = false;
change the build script inside your package.json to:
"build": "node ./build-non-split.js",
yarn build
I am a newbie to Node.js. I have different configs for different environments, viz. dev, prod, etc. Currently, while creating my app package, I copy .json to config.json and then export config.json as a global config variable and use it throughout the app.
config = require(__dirname + '/config/config');
(config.httpProxy && config.httpProxy.enabled);
I want to load a specific env.json based on an environment variable (for dev, all keys of dev.json are exported as globals in the app) instead of copying it into the app's config.json, so that the same app can be used in different environments. How can I do that?
PS: for application packaging support and dependency management I use gulp and npm.
Please help.
you can name your files like this:
config.development.json
config.production.json
config.test.json
Then load files as:
config = require(__dirname + '/config/config.' + process.env.NODE_ENV);
where process.env.NODE_ENV value can be development/production/test
you have to start your application as
NODE_ENV=development node app.js
for this to work.
I suggest you use this module called config it handles all your env config files.
https://www.npmjs.com/package/config
Just make a folder named config and make files in it like:
1. development.json
2. qa.json
3. production.json
While starting the server, provide the relevant environment, as others mentioned.
Then you can use any property mentioned in your config files.
If you are running your project from an npm script, set NODE_ENV in your package.json file.
{
...
"scripts": {
"nodemon:server": "NODE_ENV=dev NODE_PATH=. nodemon --exec \"babel-node --stage 1\" server.js",
"prod:server": "NODE_ENV=prod NODE_PATH=. nodemon --exec \"babel-node --stage 1\" server.js"
},
"author": "'Shubham Batra'",
"license": "MIT",
....
}
Then use it in a config.js file, e.g.:
if (process.env.NODE_ENV === 'test' ) {
url = 'http://abc:3000';
dbUrl= 'xyz';
password = '***';
} else if (process.env.NODE_ENV === 'prod') {
url = 'http://def:3000';
...
}
export default {
conf: {
url,
dbUrl,
...
}
}
after that, you can import this config file anywhere in your project and use conf
How to run a npm script command from inside a gulp task?
package.json
"scripts":
{
"tsc": "tsc -w"
}
gulpfile.js
gulp.task('compile:app', function(){
return gulp.src('src/**/*.ts')
.pipe(/*npm run tsc*/)
.pipe(gulp.dest('./dist'))
.pipe(connect.reload());
});
I want to do this because running npm run tsc does not give me any error but if I use gulp-typescript to compile .ts then I get bunch of errors.
You can get the equivalent using gulp-typescript:
var gulp = require('gulp');
var ts = require('gulp-typescript');
gulp.task('default', function () {
var tsProject = ts.createProject('tsconfig.json');
var result = tsProject.src().pipe(ts(tsProject));
return result.js.pipe(gulp.dest('release'));
});
gulp.task('watch', ['default'], function() {
gulp.watch('src/*.ts', ['default']);
});
Then on your package.json
"scripts": {
"gulp": "gulp",
"gulp-watch": "gulp watch"
}
Then run
npm run gulp-watch
Alternatively using shell
var gulp = require('gulp');
var shell = require('gulp-shell');
gulp.task('default', function () {
return gulp.src('src/**/*.ts')
.pipe(shell('npm run tsc'))
.pipe(gulp.dest('./dist'))
.pipe(connect.reload());
});
gulp-shell has been blacklisted; you can see why here.
Another alternative would be setting up webpack.
I wasted about an hour on this simple thing, looking for a ~complete answer, so I am adding another one here:
If your question is only about TypeScript (tsc), see https://stackoverflow.com/a/36633318/984471
Otherwise, see below for a generic answer.
The question title is generic, so a generic example is given below first, then the answer.
Generic example:
Install nodejs, if you haven't, preferably LTS version, from here: https://nodejs.org/
Install the following:
npm install --save-dev gulp gulp-run
The package.json file has the contents below (other contents can be there):
{
"name": "myproject",
"scripts": {
"cmd1": "echo \"yay! cmd1 command is run.\" && exit 1"
}
}
Create a file gulpfile.js with the contents below:
var gulp = require('gulp');
var run = require('gulp-run');
gulp.task('mywatchtask1', function () {
// watch for javascript file (*.js) changes, in current directory (./)
gulp.watch('./*.js', function () {
// run the npm script called `cmd1` when one of the above js files changes
return run('npm run cmd1').exec();
// uncomment below, and comment above, if you have problems
// return run('echo Hello World').exec();
});
});
Run the task mywatchtask1 using gulp:
gulp mywatchtask1
Now gulp is watching for js file changes in the current directory.
If any change happens, the npm script cmd1 is run; it will print yay! cmd1 command is run. every time one of the js files changes.
For this question, as another example:
a) package.json will have
"tsc": "tsc -w",
instead of the below:
"cmd1": "echo \"yay! cmd1 command is run.\" && exit 1",
b) and, gulpfile.js will have:
return run('npm run tsc').exec();
instead of below:
return run('npm run cmd1').exec();
Hope that helps.
You can try to implement it using Node's built-in child_process module, or
use https://www.npmjs.com/package/gulp-run
var run = require('gulp-run');
gulp.task('compile:app', function(){
return gulp.src(['src/**/*.js','src/**/*.map'])
.pipe(run('npm run tsc'))
.pipe(gulp.dest('./dist'))
.pipe(connect.reload());
});
Is there a way to automatically zip certain files at the build time with Node.js and npm?
For example, I have a project, that file structure looks like this:
Project/
--lib/
--node_modules/
--test/
--index.js
--package.json
I want to be able to zip the lib folder, certain modules from node_modules, and index.js into a zip archive, to upload it to AWS Lambda, for example. I do not need the test folder or the test-only Node.js modules (mocha and chai) to be zipped. I have even created a bash script for generating the zip file, but is there a way to automatically execute this script when 'npm install' is called?
This should be a standard problem and it should have a standard solution, but I was unable to discover such.
UPDATE
Thanks to michael, I decided to use gulp. This is my script, in case someone else needs it for AWS Lambda:
var gulp = require('gulp');
var clean = require('gulp-clean');
var zip = require('gulp-zip');
var merge = require('merge-stream');
gulp.task('clean', function () {
var build = gulp.src('build', {read: false})
.pipe(clean());
var dist = gulp.src('dist', {read: false})
.pipe(clean());
return merge(build, dist);
});
gulp.task('build', function() {
var index = gulp.src('index.js')
.pipe(gulp.dest('build'));
var lib = gulp.src('lib/**')
.pipe(gulp.dest('build/lib'));
var async = gulp.src('node_modules/async/**')
.pipe(gulp.dest('build/node_modules/async'));
var collections = gulp.src('node_modules/collections/**')
.pipe(gulp.dest('build/node_modules/collections'));
var underscore = gulp.src('node_modules/underscore/**')
.pipe(gulp.dest('build/node_modules/underscore'));
var util = gulp.src('node_modules/util/**')
.pipe(gulp.dest('build/node_modules/util'));
var xml2js = gulp.src('node_modules/xml2js/**')
.pipe(gulp.dest('build/node_modules/xml2js'));
return merge(index, lib, async, collections, underscore, util, xml2js);
});
gulp.task('zip', ['build'], function() {
return gulp.src('build/*')
.pipe(zip('archive.zip'))
.pipe(gulp.dest('dist'));
});
gulp.task('default', ['zip']);
I realize this answer comes years too late for the original poster. But I had virtually the same question about packaging up a Lambda function, so for posterity, here's a solution that doesn't require any additional devDependencies (like gulp or grunt) and just uses npm pack along with the following package.json (but does assume you have sed and zip available to you):
{
"name": "my-lambda",
"version": "1.0.0",
"scripts": {
"postpack": "tarball=$(npm list --depth 0 | sed 's/#/-/g; s/ .*/.tgz/g; 1q;'); tar -tf $tarball | sed 's/^package\\///' | zip -#r package; rm $tarball"
},
"files": [
"/index.js",
"/lib"
],
"dependencies": {
"async": "*",
"collections": "*",
"underscore": "*",
"util": "*",
"xml2js": "*"
},
"bundledDependencies": [
"async",
"collections",
"underscore",
"util",
"xml2js"
],
"devDependencies": {
"chai": "*",
"mocha": "*"
}
}
Given the above package.json, calling npm pack will produce a package.zip file that contains:
index.js
lib/
node_modules/
├── async/
├── collections/
├── underscore/
├── util/
└── xml2js/
The files array is a whitelist of what to include. Here, it's just index.js and the lib directory.
However, npm will also automatically include package.json, README (and variants like README.md), CHANGELOG (and its variants), and LICENSE (and the alternative spelling LICENCE) unless you explicitly exclude them (e.g. with .npmignore).
The bundledDependencies array specifies what packages to bundle. In this case, it's all the dependencies but none of the devDependencies.
Finally, the postpack script is run after npm pack because npm pack generates a tarball, but we need to generate a zip for AWS Lambda.
A more detailed explanation of what the postpack script is doing is available at https://hackernoon.com/package-lambda-functions-the-easy-way-with-npm-e38fc14613ba (and is also the source of the general approach).
If you're UNIX-based you could also just use the zip command in one of your scripts:
"scripts": {
"zip": "zip -r build.zip build/",
"build": "build",
"build-n-zip": "npm run build && npm run zip"
}
The above creates a build.zip at the root, which is a zipped up version of the /build folder.
If you wanted to zip multiple folders/files, just add them to the end:
"scripts": {
"zip": "zip -r build.zip build/ some-file.js some-other-folder/"
}
Note
If a build.zip already exists in the folder, the default behaviour is for zip to add files to that existing archive. So many people who are continuously building will probably want to delete the build.zip first:
"scripts": {
"zip": "rm -f build.zip && zip -r build.zip build",
"build": "build",
"build-n-zip": "yarn build && yarn zip"
}
I would go with gulp, using gulp-sftp, gulp-tar and gulp-gzip, and an alias as the command. Create a file called .bash_aliases in your user's home folder containing:
After restarting your shell, you can call both actions at once with this alias.
A gulp file could look something like this
var gulp = require('gulp');
var watch = require('gulp-watch');
var sftp = require('gulp-sftp');
var tar = require('gulp-tar');
var gzip = require('gulp-gzip');
gulp.task('runUploader', function () {
  return gulp.src('path/to/folder/to/compress/**')
    .pipe(tar('archive.tar'))
    .pipe(gzip())
    .pipe(gulp.dest('path/to/folder/to/store')) // if you want a local copy
    .pipe(sftp({
      host: 'website.com',
      user: 'johndoe',
      pass: '1234'
    }));
});
Of course, you can also add gulp-watch to automatically create the tar/zip and upload it whenever there is a change in the directory.
You should take a look at npm scripts.
You'll still need a bash script lying around in your repository, but it will be automatically triggered by some npm tasks when they are executed.
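For the "run my zip script on npm install" part specifically, npm's lifecycle hooks do exactly this; a postinstall script runs automatically after npm install completes. The script name create-zip.sh below is a placeholder for whatever your bash script is called:

```json
{
  "scripts": {
    "postinstall": "./create-zip.sh"
  }
}
```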
npm-pack-zip worked for me.
npm install --save-dev npm-pack-zip
To publish the whole lambda using the AWS CLI, I used this npm script in package.json:
"publish": "npm-pack-zip && aws lambda update-function-code --function-name %npm_package_name% --zip-file fileb://%npm_package_name%.zip && rm %npm_package_name%.zip"
You can use zip-build; this little package will use the data in your package.json file and create a compressed file named project-name_version.zip.
Disclaimer: I am a developer of this library.
How to use zip-build
Just install in your project as dev dependency with:
$ npm install --save-dev zip-build
Then modify the build script in your package.json, adding && zip-build at the end, like this:
"scripts": {
"build": "your-build-script && zip-build"
}
If your build directory is named different than build and your desired directory for compressed files is named different than dist, you can provide the directory names as arguments for zip-build:
"scripts": {
"build": "your-build-script && zip-build build-dirname zip-dirname"
}
If you need to automate tasks, take a look at Grunt or Gulp.
In the case of Grunt, these are the needed plugins:
https://www.npmjs.com/package/grunt-zip
https://www.npmjs.com/package/grunt-aws-lambda
Check out my gist at https://gist.github.com/ctulek/6f16352ebdfc166ce905
This uses gulp for all the tasks you mentioned except creating the lambda function initially (it only updates the code)
It assumes every lambda function is implemented in its own folder, and you need to define your AWS credential profile.