What is the reason for running "rimraf dist" command in the build script in package.json file?
"scripts": {
"build": "rimraf dist ..."
},
The rimraf package, which you will probably find in the devDependencies section of your package.json, is used to safely remove files and folders on all platforms (Unix and Windows).
When you have a build system, you also want to make sure you remove the previous output files before re-emitting the build output (especially in case you have removed some source files whose output is no longer needed). You could go ahead and do:
"scripts": {
"clean": "rm ./dist/* -Recurse -Force"
}
But that is PowerShell syntax, so it will work on Windows only (and only when the script actually runs in PowerShell), and it can also give you problems around the use of *. On Unix the command would be different (something like rm -rf ./dist). To make things simpler, rimraf does the removal for you on every platform, so you can simply invoke it in your scripts/clean entry:
"scripts": {
"clean": "rimraf dist"
}
This keeps your package.json (your build system) cross-platform.
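For example, a common pattern (just a sketch; the webpack command here is illustrative) is to hook the clean step into npm's prebuild hook, so it runs automatically before every build:
"scripts": {
  "clean": "rimraf dist",
  "prebuild": "npm run clean",
  "build": "webpack --config webpack.config.js"
}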
rimraf
A rm -rf util for nodejs
$ rimraf dist removes the dist file or folder.
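rimraf also has a small programmatic API, so you can call it from your own Node build scripts instead of the command line. A minimal sketch against the rimraf 2.x callback API (the major version used elsewhere on this page):
// clean.js - remove the dist folder before a build (rimraf 2.x API)
const rimraf = require("rimraf");

rimraf("dist", (err) => {
  if (err) throw err;
  console.log("dist removed");
});

// or synchronously: rimraf.sync("dist");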
I guess the build script puts its output inside the dist directory and wants to remove the old output from the last time you built it.
It makes sure that all old files are deleted from the folder before the script compiles the new ones.
What actually needs to be known is what rimraf dist does, and that's simple: it cleans out the dist folder to give you a clean start for what is assumed to be a web app or TypeScript app that requires some level of compilation.
But it isn't limited to that alone. Rimraf is a package that gives you the power of rm -rf ./dist, which is specific to Unix-like systems. As suggested in the answer above, that command will give you problems on Windows, especially when a directory isn't empty or a file path becomes too long (very common with nested npm dependencies). Rimraf gets around those issues as if you were running on Linux, which makes it a very powerful tool.
Related
I am developing a Vue project and syncing the dist folder with git. This worked well while using webpack. However, I have moved to @vue/cli --- using vue create myProject instead of vue init webpack myProj.
The issue is that every time I run npm run build, it deletes the dist folder and recreates it -- all .git and other files are gone.
How do I prevent a new build from deleting required files in the dist folder and only update what has changed?
Assuming you have your own mechanism for cleaning up the old resources, vue-cli-service build comes with an option called --no-clean that instructs the compiler not to remove the "dist" directory before building the project.
So, add the switch/option to the build script on package.json:
{
"scripts": {
"build": "vue-cli-service build --no-clean"
}
}
Alternatively, if you use Yarn, you can pass additional argument(s) right after the script name. So there's no need to make any change to the script. To run it:
yarn build --no-clean
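With npm, the same effect is usually achieved by separating the extra flag from the script name with --:
npm run build -- --no-clean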
Thanks to the answer by Yom S., the documentation here does provide a way to keep the older dist contents.
However, you can't use --no-clean like npm build --no-clean. To use no-clean mode from the terminal, you need to run the following command instead:
./node_modules/.bin/vue-cli-service build --no-clean
Update
Instead, you can also add --no-clean to the build script in package.json.
First of all, I'm not after a technique to package my application into a single executable.
I'm curious to find out what the approach is for creating a copy of the app within the current project that is ready to be moved to a server-enabled location.
Right now my server app is a git repo and it has all the usual files in there, together with the source folder:
./src/server/index.js
Do we simply copy everything that is in the ./src/server folder to ./dist/ and then also copy package.json into ./dist?
Then we copy the contents of the ./dist folder to a location that can serve the application, like /www/app2/, and inside that location we make sure NODE_ENV=production is set in the environment and run npm install to pull in only the production dependencies?
But then our package.json file would still have the development-related scripts and other things we don't need in production?
What is a best-practices way to deploy a NodeJs app?
--- UPDATE ---
This is what I have prototyped so far and it is working:
"scripts": {
"clean:dist": "./node_modules/.bin/rimraf dist",
"prep:dist": "./node_modules/.bin/mkdirp ./dist",
"copy:server": "./node_modules/.bin/ycopy ./src/server/ ./dist/ -r '^((?!tests$).)*$' -i",
"copy:package": "./node_modules/.bin/copyfiles package-production.json ./dist/",
"build": "npm run clean:dist && npm run prep:dist && npm run copy:server",
"start:dev": "nodemon src/server/index.js",
"start:server": "node dist/server/index.js",
"prompt": "echo 'No prompt functionality available'",
"greet": "echo 'Welcome to my project.'"
},
So the idea is to selectively move bits from the dev ./src folder to a production-ready ./dist folder. The idea behind having a simpler package.json file is that we will not need the dev dependencies in there, and we will not need most of the dev scripts either. So probably something like the following will be enough:
"scripts": {
"setup:server": "NODE_ENV='production' && npm install"
"start:server": "pm2 start index.js"
}
... or maybe we would like to have some csh/bash scripts inside ./dist/bin that will streamline the start process.
"scripts": {
"start:server": "./bin/launcher"
}
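A minimal sketch of such a launcher (the paths and the pm2 invocation are illustrative, assuming pm2 is installed on the target):
#!/bin/sh
# ./dist/bin/launcher -- illustrative sketch only
export NODE_ENV=production
cd "$(dirname "$0")/.."   # run from the dist root
exec pm2 start server/index.js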
I can definitely see a need for a custom project tree structure existing within the ./dist folder, totally different from the ./src structure.
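On the package-production.json idea above: one way to avoid maintaining a second file by hand is a small Node script that derives it from the main package.json by dropping the dev-only fields. A rough sketch (the output location and the list of scripts to keep are assumptions):
// tools/make-production-package.js - derive a lean package.json for ./dist
const fs = require("fs");

const pkg = JSON.parse(fs.readFileSync("package.json", "utf8"));

// drop everything that is only needed during development
delete pkg.devDependencies;

// keep only the scripts the server actually needs (assumed names)
pkg.scripts = {
  "setup:server": "NODE_ENV=production npm install",
  "start:server": "pm2 start index.js"
};

fs.writeFileSync("dist/package.json", JSON.stringify(pkg, null, 2));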
I am not sure why the "development" contents of your package.json are a problem, so perhaps I am not getting the crux of your question. However, in our environment we deploy all of our Node apps (primarily microservices) with Ansible. The deployment package is just a tarball (it could be a zip file). The Ansible setup includes templates for a config.js file and a pm2 startup.json file, both customized based on the environment of the target (staging/test vs production).
Let me know if you are interested in this approach and would like a few more details.
Console output on npm run build failure:
'.' is not recognized as an internal or external command, operable
program or batch file.
And the relevant scripts from package.json:
"scripts": {
"postinstall": "jspm install && npm run build",
"start": "http-server",
"build": "./bin/build-code",
"build-home": "./bin/build-home -dw",
"build-common-deps": "./bin/build-common-deps -dw",
"build-navbar": "./bin/build-navbar -dw",
"build-root": "./bin/build-root -dw",
"build-angular1": "./bin/build-angular1 -dw",
"build-react": "./bin/build-react -w",
"build-preact": "./bin/build-preact -dw",
"build-vanilla": "./bin/build-vanillajs",
"build-angular2": "./bin/build-angular2 -dw"
}
Looks like it's not understanding the path to the ./bin/build-code script location. From what I understand, it resolves files relative to package.json's location? So, if the app has a bin folder in the same directory as package.json, then this should be the correct path to the build-code script inside that bin folder. What gives? I'm using PowerShell to run npm run build, if it matters.
P.S. I tried with the basic Command Prompt - no change. Someone running the same build (we both just pulled from the repo) on Cygwin said they had to "change DOS line endings to Unix", which doesn't tell me much and doesn't seem to be the issue.
It looks to me like npm is invoking batch scripts here. Batch files are run by cmd.exe (even when invoked from PowerShell), which doesn't recognize / as a path separator in the command position. That's where the error message comes from.
Replace the forward slashes with \ (written as \\ inside the JSON strings, where backslashes require escaping).
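Applied to the scripts above, the entries would look roughly like this (only two shown):
"build": ".\\bin\\build-code",
"build-home": ".\\bin\\build-home -dw",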
I am trying to add a second part to my npm bundle script. The first part runs great; however, I am now trying to copy 3 files in along with the bundle.
So right now I have :
"bundle": "NODE_ENV=production webpack --output-file bundledFile.js && cp package.json dist/",
The NODE_ENV=production webpack --output-file bundledFile.js part works great by itself. The part that is not working is && cp package.json dist/; I would like the script to copy my package.json (along with 2 other files, actually, but I'm just starting with this one) into the dist folder. I'm brand new to these scripts, any idea how to fix this? I appreciate any advice, thanks!
The syntax should work (and seems to, judging by your comments). I would suggest splitting your npm scripts across multiple entries, though:
{
"bundle": "NODE_ENV=production webpack --output-file bundledFile.js",
"copy": "cp package.json dist/ && cp README.md dist/ && cp .npmrc dist/",
"build": "npm run bundle && npm run copy"
}
In order to be cross-platform compatible (cp is not typically available on Windows), I would also suggest adding a build file somewhere, such as ./tools/copy-distribution-files.js, which would use fs to copy the necessary files; then call it in the npm scripts with node ./tools/copy-distribution-files.js. That will be (mostly) platform-independent (you still have to assume that node is available as the Node.js executable, but that seems fairly reasonable to me).
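A minimal sketch of such a helper, assuming Node 8.5+ for fs.copyFileSync and Node 10.12+ for the recursive mkdir (the file list is taken from the example above):
// tools/copy-distribution-files.js - copy release files into dist/
const fs = require("fs");
const path = require("path");

const files = ["package.json", "README.md", ".npmrc"];

// make sure the output folder exists
fs.mkdirSync("dist", { recursive: true });

for (const file of files) {
  fs.copyFileSync(file, path.join("dist", path.basename(file)));
  console.log(`copied ${file} -> dist/`);
}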
The quickest way for me was to reference powershell in a package.json script like this:
"copyFile": "#powershell copy './source/package.json' './deploy'",
If you're running on windows use the following command:
"copy": "copy \"package.json\" \"dist\" && copy \"README.md\" \"dist\" && copy \".npmrc\" \"dist\""
Use copy instead of cp.
Don't forget to wrap each path in "" (escaped as \" within the script string), and if you need to write a long path, don't use / (slashes) but \ (backslashes),
like:
copy "devices\\VS-88UT\\index.html" "devices\\VS-88UT\\dist"
Also, if you prefer, there is a nice plugin to run bash commands before and after each build.
To copy folders and files on Windows, just use
xcopy git\\* dist\\ /e /i /h
I think this might help someone.
I am trying to get away from using Grunt or Gulp in my projects. I thought that a good way to replace them is by using npm-scripts.
I know that npm-scripts leverage package.json, but I noticed that to run more advanced build processes you need to invoke command-line tools. However, this is not a cross-platform solution, since Windows doesn't support the wide variety of commands that an OS like Linux supports.
So, I was wondering if you could just run npm-scripts on Node, and just reference any npm package you want to with a require statement.
Is this possible? If not, are there any good cross-platform solutions that exist for npm-scripts excluding Grunt and Gulp?
Yes, you can run npm scripts on Node. For instance, I have this in my package.json (irrelevant parts removed for clarity), and both rimraf and webpack are implemented in pure JS and interpreted by Node.js. In fact, rimraf is a good example of a cross-platform rm -rf. This solution runs fine on Windows, Mac or Linux boxes just by issuing npm run-script build.
{
"scripts": {
"build": "rimraf dist && webpack --config ./blah.js"
},
"devDependencies": {
"rimraf": "^2.5.0",
"webpack": "^1.12.10"
}
}
Or you could do something like:
"scripts": {
"hello": "node hello"
}
And implement everything you want in hello.js in the same dir as your package.json, and include whatever you need in that script like:
const hello = require("debug")("hello"); // require whatever module you need
console.log("hello world");
It would run just fine with npm run-script hello
> your-module#1.0.0 hello D:\dev\tmp
> node hello
hello world
I would stick with webpack and npm only. Using webpack is much simpler than writing custom Grunt or Gulp tasks, the dev server gives you a stable local environment to work in, and it is just a config setup and you are golden. I would also move away from using global installs, especially if you work in a team environment or run continuous delivery, as you won't be able to control the globally installed version in any of those environments.