I am trying to get away from using Grunt or Gulp in my projects. I thought that a good way to replace them is by using npm-scripts.
I know that npm-scripts leverages package.json, but I noticed that to run more advanced build processes, you need to invoke shell commands. However, this is not a cross-platform solution, since Windows doesn't support the wide variety of commands that an OS like Linux does.
So, I was wondering if you could just run npm-scripts on Node, and just reference any npm package you want to with a require statement.
Is this possible? If not, are there any good cross-platform solutions that exist for npm-scripts excluding Grunt and Gulp?
Yes, you can run npm-scripts on Node. For instance, I have this in my package.json (irrelevant parts removed for clarity); both rimraf and webpack are implemented in pure JS and interpreted by Node.js. In fact, rimraf is a good example of a cross-platform rm -Rf. This solution runs fine on Windows, Mac, or Linux boxes by just issuing npm run-script build.
{
  "scripts": {
    "build": "rimraf dist && webpack --config ./blah.js"
  },
  "devDependencies": {
    "rimraf": "^2.5.0",
    "webpack": "^1.12.10"
  }
}
Or you could do something like:
"scripts": {
"hello": "node hello"
}
And implement everything you want in hello.js, in the same directory as your package.json, requiring whatever modules you need in that script, like:
const hello = require("debug")("hello"); // require whatever module you need
console.log("hello world");
It runs just fine with npm run-script hello:
> your-module#1.0.0 hello D:\dev\tmp
> node hello
hello world
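Taking this further, the whole build from the first example could also live in a plain Node script instead of a shell one-liner. A minimal sketch, assuming the rimraf ^2 and webpack ^1 versions from the package.json above (build.js is a hypothetical file name; ./blah.js is the config from that example):
// build.js — wire it up as "build": "node build"
const rimraf = require("rimraf");
const webpack = require("webpack");
const config = require("./blah.js");

rimraf.sync("dist"); // cross-platform equivalent of rm -Rf dist
webpack(config, (err, stats) => {
  if (err) throw err; // fail loudly on fatal build errors
  console.log(stats.toString()); // print the usual webpack summary
});
Because everything runs inside Node, the same script works identically on Windows, Mac, and Linux.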
I would stick with webpack and npm only. Using webpack is much simpler than writing custom Grunt or Gulp tasks; the dev server gives you a stable local environment to work in, and it's just a config setup and you are golden. I would also move away from global installs, especially if you work in a team environment or run continuous delivery, as you won't be able to control the globally installed versions in any of those environments.
Related
For example, when I run gulp I don't have to do npx gulp.
I can omit the npx and just run gulp.
How do I do this for my own package?
I've added mycommand to the package's bin config in package.json, but I still always have to run npx mycommand for it to work.
It depends on your operating system. On a UNIX-based OS (i.e. Linux or Mac) you can use the alias command:
$ alias gulp="npx gulp"
For the rest of your terminal session, you can then run:
$ gulp
to run npx gulp. However, whenever you restart your terminal program, you'll lose the alias.
To make the alias permanent, you need to add the alias command to the appropriate start-up file (eg. .bashrc, .profile, etc.) for your OS. Simply copy/paste the exact command you used before, at the end of that file, save, and restart your terminal. You'll have the alias permanently.
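For example, assuming your shell is bash, you can append the alias to the start-up file in one line:
$ echo 'alias gulp="npx gulp"' >> ~/.bashrc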
Aliases in Windows are also possible, but are a bit trickier; see Aliases in Windows command prompt.
You don't have to use npx gulp because gulp is installed globally.
A global install puts gulp.cmd, gulp.ps1, and gulp shims somewhere in your PATH that run gulp's bin script, so you can run gulp from anywhere.
If you install gulp globally:
npm i -g gulp
you will be able to just run gulp.
I would avoid answers suggesting global installs (npm install -g). Global installs guarantee that everyone working on a project has their own unique set of tools, and they're the reason companies end up with huge wiki pages on "how to get the project working locally".
You mention the "bin config", but perhaps you're misunderstanding that one. It's not for specifying commands that your app will use. If you were making a CLI application, the bin section in package.json is where you would specify the commands exported by your project. For example, gulp exports the gulp command like:
"bin": {
"gulp": "./bin/gulp.js"
},
Instead, you would add the tools your project needs to devDependencies. This ensures that everyone using the project gets the same version of all of them, such as gulp or tsc.
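For example, a devDependencies block like this (the version ranges here are only illustrative) pins the tooling per project:
"devDependencies": {
  "gulp": "^4.0.2",
  "typescript": "^5.4.0"
}
Everyone who runs npm install then gets the same tool versions, and npx or an npm script will pick them up from node_modules/.bin.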
Using npx is a far better solution, because you can add everything the project needs to devDependencies and npx will use that version. For common command line tools, an alias such as @machineghost suggests is a better way to go, but to expand on it a bit:
Sometimes, the name of the command is different from the name of the package. In those cases, the alias can use:
alias tsc='npx --package=typescript tsc'
Normally, when running npx, it will prompt to install if there is not a version found in the current project. This is often a good safeguard, because it reminds you to add it to the devDependencies of the project. However, if you really want it to "just work" like a global install, you can add the "yes" flag:
alias gulp='npx -y gulp'
This will use the version specified in the current project if present, but install it and run it directly if not.
I have a big project that is a monorepo consisting of multiple scripts and libraries. Its structure is the following:
package.json // "private": true
\packages
  \comp1
    \package.json // an actual component
  \comp2
    \package.json // an actual component
  \comp3
    \package.json // an actual component
I've made a monorepo.tgz using yarn pack.
Then I made a test app whose package.json looks like this:
"scripts": {
// this is a script in one of the monorepo's components
"start": "ui-build --bundle --watch -p 3000"
}
"dependencies": {
"comp1": "../monorepo/monorepo.tgz",
"comp2": "../monorepo/monorepo.tgz",
"comp3": "../monorepo/monorepo.tgz",
...
But it's not working; when I run start, it complains that ui-build: command not found.
How can I test this monorepo locally to simulate a published npm package as closely as possible?
Using npm link (or yarn link), you can 'install' the packages from your local development environment.
To do this, you first run npm link in the directory of the package you want to install, so in \packages\comp1. Then in your testapp, run npm link comp1. This will install your package. Repeat for any others you want to install.
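Concretely, with the layout from the question, that would look something like this (the test-app directory name is assumed):
cd monorepo/packages/comp1
npm link            # register comp1 as a global link target
cd ../../../test-app
npm link comp1      # symlink comp1 into test-app/node_modules
Repeat the pair of commands for comp2 and comp3.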
More info:
https://docs.npmjs.com/cli/v6/commands/npm-link
https://classic.yarnpkg.com/en/docs/cli/link/
To install a local package directly without using npm link or yarn link, you have to prefix the path with file:. I believe you would have to pack each package into its own tarball, but you can also point file: at the package folder without packing it; just make sure to build the package first if you link directly to its local folder.
For your example:
"comp1": "file:../monorepo/comp1.tgz",
"comp2": "file:../monorepo/comp2.tgz",
"comp3": "file:../monorepo/comp3.tgz",
or
"comp1": "file:../path/to/monorepo/packages/comp1",
"comp2": "file:../path/to/monorepo/packages/comp2",
"comp3": "file:../path/to/monorepo/packages/comp3",
After some research I've found that https://verdaccio.org/ is the best tool to test a library without deploying it to an npm registry.
What is the reason for running "rimraf dist" command in the build script in package.json file?
"scripts": {
"build": "rimraf dist ..."
},
The rimraf package, which you will probably find in the devDependencies section of your package.json, is used to safely remove files and folders on all platforms (Unix and Windows).
When you have a build system, you also want to make sure the old output files are removed before re-emitting the build output (especially in case you have removed some source files that are no longer needed). You could go ahead and do:
"scripts": {
"clean": "rm ./dist/* -Recurse -Force"
}
But that will work on Windows only (the -Recurse and -Force flags belong to PowerShell's rm alias), and it will sometimes give you problems due to issues around the use of *. On Unix the command would be different. To make things better, rimraf does the removal for you, so you can simply invoke it in your clean script:
"scripts": {
"clean": "rimraf dist"
}
This keeps your package.json (your build system) cross-platform.
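If you'd rather not pull in a dependency at all, note that newer Node versions (14.14+) ship an equivalent in the built-in fs module, so the clean step can also be a tiny Node script; a minimal sketch (clean.js is a hypothetical file name):
// clean.js — wire it up as "clean": "node clean"
const fs = require("fs");
// recursive: true removes the directory and its contents;
// force: true avoids an error when dist does not exist
fs.rmSync("dist", { recursive: true, force: true });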
rimraf
A rm -rf util for nodejs
$ rimraf dist removes the dist file or folder.
I guess the build script puts its output inside the dist directory and wants to remove the old output from the last time you built.
It makes sure that all old files are deleted from the folder before the script compiles the files again.
What actually needs to be known is what rimraf dist is doing, and that's simple: it's cleaning out the dist folder to give you a clean start on what is assumed to be a web app or TypeScript app that requires some level of compilation.
But it isn't limited to that alone. rimraf is a package that gives you the power of rm -rf ./dist, a command specific to Unix-like systems. As suggested in the comment above, a plain removal command will give you problems on Windows, especially when a directory isn't empty or the file path becomes too long (very common with nested npm dependencies). rimraf bypasses those exceptions as if you were running on Linux, and that makes it a very powerful tool.
On the face of it, mocha being in devDependencies like the tutorials say, is logical enough, it is after all a dev dependency.
But in practice you install it -g so you can run mocha as a command. And as far as I can tell, given that, it makes no difference at all whether it's mentioned in your package.json.
So is there any need to explicitly list it?
If you're working on an open-source project, one of your goals could be to allow other developers to be able to start contributing quickly.
One of the things that helps a lot is the possibility for a new developer to quickly build and run your project, as well as run the tests. To do that, you can provide an easy way to install all the tools that a developer needs in order to contribute to your project.
This includes:
Build tools
Testing tools
Code quality tools (linter)
On the other hand, a user of your project is likely not going to need any of that, which is a good reason to split dependencies and devDependencies.
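In package.json terms the split looks like this (package names and versions are only placeholders):
"dependencies": {
  "express": "^4.18.0"
},
"devDependencies": {
  "mocha": "^10.2.0",
  "eslint": "^8.50.0"
}
A plain npm install pulls in both groups for contributors, while installing your package as a dependency of another project (or using npm install --production) skips devDependencies entirely.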
On top of that, it's useful to edit your package.json to provide useful scripts so that you can, for example, run npm test. It's common to specify something like:
{
  ...
  "scripts": {
    ...
    "test": "mocha --opts mocha.opts ...tests..."
  }
}
Then npm test is going to run the specific mocha from your node_modules.
If you install it globally, that's a single version across all your projects.
If it's a dev dependency, each project can be using a version specific to that project, and the project can migrate to newer versions in a controlled way.
Pretty much the same argument as for having other modules loaded project-specific rather than globally.
Because you don't need to run mocha as a command. You can run it from node_modules like so: ./node_modules/.bin/mocha.
Npm has special support for this. If you have the following in package.json:
"scripts": {
"test": "mocha"
},
"devDependencies": {
"mocha": "*"
}
Then you can execute npm test even if you don't have mocha globally installed.
So, what's the use of this? First of all, it's a nice thing to do if you collaborate with other developers: they don't need to do anything more than npm install to set up the development environment.
Second, and I think more useful, is this makes it easy to integrate your project with other tools like Travis etc.
I am trying to add a second part to my npm bundle script. The first part runs great, however I am trying to copy in 3 files along with the bundle.
So right now I have :
"bundle": "NODE_ENV=production webpack --output-file bundledFile.js && cp package.json dist/",
The NODE_ENV=production webpack --output-file bundledFile.js part works great by itself. The part that is not working is && cp package.json dist/. I would like the script to copy my package.json (along with 2 other files, actually, but I'm just starting with this one) to the dist folder. I'm brand new to these scripts; any idea how to fix this? Appreciate any advice, thanks!
The syntax should work (and seems to, looking at your comments). I would suggest splitting your npm scripts across multiple points, though:
{
  "bundle": "NODE_ENV=production webpack --output-file bundledFile.js",
  "copy": "cp package.json dist/ && cp README.md dist/ && cp .npmrc dist/",
  "build": "npm run bundle && npm run copy"
}
In order to be cross-platform compatible (cp is not typically available on Windows), I would also suggest adding a build file somewhere, such as ./tools/copy-distribution-files.js, which would use fs to copy the necessary files, and then calling it in the npm scripts with node ./tools/copy-distribution-files.js. That will be (mostly) platform independent (you still have to assume that node is available as the Node.js executable, but that seems fairly reasonable to me).
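A minimal sketch of such a script, using only built-in Node modules and assuming the three files from the question (the file list and the dist destination are taken from the example above):
// tools/copy-distribution-files.js
const fs = require("fs");
const path = require("path");

const files = ["package.json", "README.md", ".npmrc"];
const dest = "dist";

fs.mkdirSync(dest, { recursive: true }); // make sure dist exists first
for (const file of files) {
  fs.copyFileSync(file, path.join(dest, path.basename(file)));
  console.log(`copied ${file} -> ${dest}`);
}
The copy script then becomes "copy": "node ./tools/copy-distribution-files.js", which behaves the same on Windows, Mac, and Linux.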
The quickest way for me was to reference PowerShell in a package.json script like this:
"copyFile": "#powershell copy './source/package.json' './deploy'",
If you're running on Windows, use the following command:
"copy": "copy \"package.json\" \"dist\" && copy \"README.md\" \"dist\" && copy \".npmrc\" \"dist\""
That is, copy instead of cp.
Don't forget to wrap each path in quotes (escaping them with \ within the quoted command), and if you need to define a longer path, don't use / (slashes) but \ (backslashes), like:
copy "devices\\VS-88UT\\index.html" "devices\\VS-88UT\\dist"
Also, if you prefer, there is a nice plugin to run bash commands before and after each build.
To copy folders and files on Windows, just use:
xcopy git\\* dist\\ /e /i /h
I think this might help someone.