My problem is that when I install a package of mine with the version set to a certain git branch, my build breaks because the repository doesn't contain the dist folder that my published npm package does.
Here is how I want to install the package.
"@me/myPkg": "github:me/myPkg#myBranch"
So should my repository include the dist folder, or what's the best practice in this case? I don't want to commit my dist folder to GitHub because then I would get weird merge conflicts, since the dist folder contains all of my type declarations and minified JavaScript files, but maybe this is the preferred way?
Another thought I had was to have a postinstall script in my package.json that builds the source files, but then I would need the devDependencies installed as well, and npm install doesn't install those for dependencies.
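For reference, the build-on-install idea described above might look roughly like this in package.json (the script names and the tsc build step are assumptions for illustration; npm documents prepare, rather than postinstall, as the hook that runs when installing a dependency from a git URL, but whether devDependencies are available for it has varied between npm versions, so treat this as a sketch to verify):
{
  "name": "@me/myPkg",
  "scripts": {
    "build": "tsc",
    "prepare": "npm run build"
  }
}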
How are you publishing your package? For instance, I have a GitHub Actions setup on my repositories that, when commits are made to master, will automatically perform several steps:
- Run npm install
- Set up Git credential variables (for a later process)
- Run npm test (and fail the whole process if tests fail)
- Run npm run release (I use standard-version for automatic version incrementing and changelog creation, and do this prior to build so the version can be reflected in the built code)
- Run npm run build
- Run npm run publish to push to npm
- Commit my version update as a tag to Git
My GitHub Action does all of this in a temp container it spins up for the build process. Once it is done, the whole container is tossed. I use an .npmignore file to tell npm publish what I don't want in the final package (note files, build tools, etc.).
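As a concrete illustration, an .npmignore for that kind of setup might look something like this (these entries are just examples of "note files, build tools, etc.", not the actual file from my repo):
# .npmignore - kept out of the published npm package
notes/
docs/
*.test.js
.github/
scripts/
src/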
I download a Node.js package by commit: npm install -g package_name
But I find that some downloaded packages have different files than the same package on GitHub. Why?
A developer publishes a package when they think the code is stable, and continues developing future updates.
So the GitHub code is work-in-progress code for the next update, while the npm package is the latest stable version of the package.
In the above case, if you want to see the npm package's code on GitHub, look for the tag matching the package version.
Also, the files in an npm package can be compiled code while GitHub contains the source code. Using .gitignore a developer can exclude compiled files from the GitHub repo, and using .npmignore a developer can exclude source files from the npm package (not often the case, but it can be done).
So the difference in code can come from either of these.
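To make that .gitignore / .npmignore split concrete, a typical pairing might be (the paths are illustrative, not from any particular package):
# .gitignore - keep compiled output out of the GitHub repo
dist/
node_modules/

# .npmignore - keep sources and tests out of the npm package
src/
test/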
I have a package (package-a) that depends on another package (package-b) which is not published to npm but is on my file system. When I run npm install from package-a, package-b's dependencies are not installed. I have to navigate to package-b's directory and run npm install manually. Is there a way to install both packages' dependencies with a single npm command?
Here's my directory structure:
/
  ...
  shared/
    ...
    javascript/
      ...
      package-b/
        package.json
  package-a/
    package.json
Per the docs, I placed the following in package-a/package.json. (I'm using npm 5+)
"dependencies": {
  "package-b": "file:../shared/javascript/package-b"
}
When I navigate to /package-a and run npm install, it installs all of package-a's dependencies like normal and also copies the package-b directory to package-a/node_modules. This is presumably what lets me type require('package-b') instead of require('../shared/javascript/package-b').
However, as I stated before, package-b's dependencies are not installed, so if I try to use package-a, I get an error when package-b is required, because it is trying to use dependencies that do not exist locally.
Once again, to solve this, I can navigate to package-b and run npm install, but I'm hoping there's a better solution, as I may have many such sub-packages and I'd like to avoid having to write a shell script to go install all my dependencies if I can do this with an npm command. (Perhaps I just did something wrong and npm install should be working?)
Follow up question: when I run npm install from package-b's directory, the packages are installed there, but not in the version of package-b that got copied to /package-a/node_modules during the first npm install, yet everything still works. So now it seems like when I require('package-b') it is actually resolving to /shared/javascript/package-b and NOT /package-a/node_modules/package-b. So what's the point of copying the file in the first place?
Update
It turns out this is a bug in npm 5. It only occurs when installing from a package-lock.json file. (GitHub issue)
The files are (probably) not being copied, they're being symbolically linked (symlink). This essentially creates an alias/shortcut that looks like a real directory, but points to another path.
This is how the older npm link feature worked. The reason is that the code stays "live"; changes in the linked module are reflected whenever you run the module that's referencing them, meaning you don't have to npm update all the time.
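For reference, that older npm link workflow goes roughly like this (using the directory names from the question):
cd /shared/javascript/package-b
npm link              # register package-b globally as a link target
cd /package-a
npm link package-b    # symlink node_modules/package-b to the shared copy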
I am pretty new to Gulp and I've got a question: should I install gulp plugins (such as gulp-sass or gulp-imagemin) locally or globally? In examples around the web people mostly do it locally (with --save-dev option). As I understand, by doing so the modules are stored in the local node_modules folder, added in the local package.json as devDependencies and can be referenced to in the local gulpfile.js via require(). So, if I need to install the same modules for another project I work on, it may be accomplished by copying package.json to a new project's folder and typing in npm install in a command line tool (after getting to the project's folder). Fine.
But what if I, as a regular gulp user, don't have plans to upload and share my stuff on npm and am not interested in maintaining devDependencies: can I in this case just install gulp plugins globally, like npm install -g gulp-sass? Will they be found through a global path on my system? Is it an option at all, if I don't want to bother copying package.json and running npm install every time I create a new project, or having multiple copies of the same modules scattered around my disk?
For plugins that you are using via gulp, you have to install them locally to the project. Gulp itself, however, has to be installed both globally (to run the gulpfile from the command line) and locally (so the project can pick up the gulp-based commands and functions).
The reason you should install the plugins locally via npm is so that they are specific to the project. For example, if you then went to upload this project to your server or host it on GitHub, you would otherwise have to go and globally install all of your packages again. If they are saved, they exist in your package.json, so that when you run npm install to install all the packages for said project, npm knows what to install.
If I can further clarify anything let me know.
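For example, a typical local setup for the plugins named in the question might look like this (a sketch; it assumes an older gulp/gulp-sass, since recent gulp-sass versions also require passing in a Sass compiler):
npm install -g gulp
npm install --save-dev gulp gulp-sass

// gulpfile.js
var gulp = require('gulp');
var sass = require('gulp-sass');

gulp.task('styles', function () {
  return gulp.src('src/**/*.scss')
    .pipe(sass())
    .pipe(gulp.dest('dist/css'));
});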
A kind of "global install" for the typical case the OP described could be to have a folder where you put your package.json with all the dependencies you need for each of your projects, and run npm install from there.
Also put your gulpfiles for each project there (you can name them however you want), like gulpfile-project1.js, conf-project2.js... whatever.
Of course adjust your paths in those files accordingly.
And from this directory, run gulp using the -f flag:
gulp -f gulpfile-project1.js
So you end up with this kind of scaffolding:
my_dev_directory
- project1
  - dev
    - main.less
    - main.js
  - www
    - main.css
    - main.js
- gulp
  - node_modules
  - package.json
  - package-lock.json
  - gulpfile-project1.js
In your gulpfiles, start by declaring a const with the base path of your project to simplify all path declarations, like:
const path_dev = '../project1/dev';
const path_www = '../project1/www';
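A minimal gulpfile-project1.js built around those constants might look like this (a sketch; it assumes gulp-less is installed in the shared gulp directory, and the task name and file names are just examples matching the tree above):
const gulp = require('gulp');
const less = require('gulp-less');

const path_dev = '../project1/dev';
const path_www = '../project1/www';

// compile the project's LESS from its dev directory into its public www directory
gulp.task('styles', function () {
  return gulp.src(path_dev + '/main.less')
    .pipe(less())
    .pipe(gulp.dest(path_www));
});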
This way you have only one node_modules directory installed on your disk and your projects directories contain only public files you want to upload, no dev files.
Of course, one of the main drawbacks is that you share the same module versions across all your projects, but it answers the OP's need, and even if it's not recommended, not the "proper way to do it", you are free to do as you want with your dev tools :)
I am doing an internship in a company.
I need to create a node server.
I installed node on the computer (Windows) and I should install some plugins like:
- nodejs-webpack
- colors
- uglify
Normally I need to enter a command like: npm install "theModule"
But the software cannot access the internet (due to company restrictions), and the support service cannot authorize it (or does not want to).
Can I install modules in any other way? (For example, downloading them and dropping the archives into the correct folder.)
If the answer is no, do you know how I can get around this restriction?
You need a private npm repository.
Check out this answer:
can you host a private repository for your organization to use with npm?
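If you do get a private registry or an internal mirror set up, pointing npm at it is a one-liner (the URL here is a made-up placeholder); npm can also install directly from a local tarball that you copy over by hand:
npm config set registry http://npm.internal.example.com/
npm install ./nodejs-websocket.tgz   # install straight from a tarball file on disk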
I found it!
Just for example, we will install 'nodejs-websocket':
1) You just have to download it here.
2) Put files into your Node's directory (for me it's "C:\Program Files\nodejs\node_modules\npm\node_modules")
3) in your .js file just add this line : var ws = require("C:/Program Files/nodejs/node_modules/npm/node_modules/nodejs-websocket/")
Done! Thanks all :D
I added this as a comment on your own answer, but I figured I should add a real answer with a better explanation.
Normally when you run npm install package-name npm installs the package to a node_modules directory in the directory you are in at that moment. So if your app was located at C:\code\my-app then you would cd into that directory and run npm install package-name. This would create a node_modules directory at C:\code\my-app\node_modules if it didn't already exist. Then it would install package-name into that directory at C:\code\my-app\node_modules\package-name.
As long as the module is in the node_modules directory for your app, you can require the module in your code without entering a big long file path.
var ws = require('nodejs-websocket');
The place you manually installed your module is the global node_modules directory. It's where npm would install a module if you did npm install -g package-name. The global node_modules directory is added to your system path when you install npm. In other words, any modules you install in there would be accessible from the command line like any other command. For example:
npm install -g bower
That would install the "bower" package to the global npm module directory. Bower would then be accessible as a command line tool for you to use. For example:
bower install angularjs
The global directory is more for tools like that and not really for modules that you intend to use in your code. Technically you can require a module from anywhere by including the full path in the require call like you did, but the standard practice would be to place it in the node_modules directory in the root of your application, and then require it with just its name and not a full path.
Edit: Here's another tip that you might like to take advantage of as well.
When you normally install a module with npm install package-name, your application usually has a package.json file at the root of it. If it does, you can do npm install package-name --save and npm will add the package-name module to a list in your app's package.json file. Once a module is listed in your app's package.json file it's called a "dependency" of your app because it basically says your app depends on package-name.
Normally, when you have dependencies listed in package.json, you can completely delete your app's node_modules directory and then simply run npm install from within your app's root directory and npm will automatically install all dependencies it finds listed in your app's package.json file. Since your corporate firewall won't allow this automatic downloading of modules, you won't get that benefit. However it is still good practice to follow the same conventions.
A trick you can do to create your package.json file is to manually install your dependencies into your app's node_modules directory. Once you have the modules your app needs, you can instruct npm to create a package.json file for your app by simply running npm init. It will walk you through a few little prompts to fill out your package.json file with details about your app. It will then peek inside your node_modules directory to see if you've already installed any modules before having a package.json file. If it finds any, it will automatically add them to the dependencies field in the package.json file it creates :D
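The generated file ends up looking something like this (the name and version numbers are only placeholders):
{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "nodejs-websocket": "^1.7.0"
  }
}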
With a proper package.json in place you'll always have a nice tidy list of what dependencies your application needs, even if your node_modules directory gets deleted. Normally people add their app's node_modules directory to their .gitignore file, only checking in the package.json file. This way they don't store their dependencies in source control but they can still be easily installed on new machines that clone it by simply running npm install from inside the app's directory. In your case though you may want to just add node_modules to your source control since you can't let npm install dependencies automatically.
When I want to add a package (and check in the dependency into git), where does it belong - into package.json or into bower.json?
From what I gather,
running bower install will fetch the package and put it in the /vendor directory,
running npm install will fetch it and put it into the /node_modules directory.
This SO answer says bower is for front-end and npm is for backend stuff.
Ember-app-kit seems to adhere to this distinction at first glance... But the instructions in the gruntfile for enabling some functionality give two explicit commands, so I'm totally confused here.
Intuitively I would guess that
npm install --save-dev package-name would be equivalent to adding the package-name to my package.json
bower install --save package-name might be the same as adding the package to my bower.json and running bower install?
If that is the case, when should I ever install packages explicitly like that without adding them to the file that manages dependencies (apart from installing command line tools globally)?
npm and Bower are both dependency management tools, but the main difference between the two is that npm is used for installing Node.js modules, while Bower is used for managing front-end components like HTML, CSS, and JS.
A fact that makes this more confusing is that npm provides some packages which can be used in front-end development as well, like grunt and jshint.
These lines add more meaning:
Bower, unlike npm, can have multiple files (e.g. .js, .css, .html, .png, .ttf) which are considered the main file(s). Bower semantically considers these main files, when packaged together, a component.
Edit: Grunt is quite different from npm and Bower. Grunt is a JavaScript task-runner tool. You can do a lot of things with Grunt that you would otherwise have to do manually. Highlighting some of the uses of Grunt:
Zipping some files (e.g. zipup plugin)
Linting on js files (jshint)
Compiling less files (grunt-contrib-less)
There are Grunt plugins for Sass compilation, uglifying your JavaScript, copying files/folders, minifying JavaScript, etc.
Please note that a Grunt plugin is also an npm package.
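For instance, a minimal Gruntfile using the grunt-contrib-less plugin mentioned above might look like this (the file paths are placeholders):
module.exports = function (grunt) {
  grunt.initConfig({
    less: {
      dist: {
        // destination: source
        files: { 'build/main.css': 'src/main.less' }
      }
    }
  });

  // installed beforehand with: npm install grunt-contrib-less --save-dev
  grunt.loadNpmTasks('grunt-contrib-less');
  grunt.registerTask('default', ['less']);
};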
Question-1
When I want to add a package (and check in the dependency into git), where does it belong - into package.json or into bower.json
It really depends on where the package belongs. If it is a Node module (like grunt or request), then it goes in package.json; otherwise it goes into bower.json.
Question-2
When should I ever install packages explicitly like that without adding them to the file that manages dependencies
It does not matter whether you install packages explicitly or mention the dependency in the .json file. Suppose you are in the middle of working on a Node project and you need another package, say request; then you have two options:
Edit the package.json file and add a dependency on 'request'
npm install
OR
Use the command line: npm install --save request
The --save option adds the dependency to the package.json file as well. If you don't specify the --save option, it will only download the package, and the json file will be unaffected.
You can do this either way, there will not be a substantial difference.
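For example, with option 1 you would add an entry like this to package.json by hand before running npm install (the version range is only an illustration):
"dependencies": {
  "request": "^2.0.0"
}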
Update for mid 2016:
Things are changing so fast that if it's late 2017, this answer might not be up to date anymore!
Beginners can quickly get lost in the choice of build tools and workflows, but what's most up to date in 2016 is not using Bower, Grunt or Gulp at all! With the help of Webpack you can do everything directly in npm!
Google "npm as build tool" result:
https://medium.com/#dabit3/introduction-to-using-npm-as-a-build-tool-b41076f488b0#.c33e74tsa
Webpack: https://webpack.github.io/docs/installation.html
Don't get me wrong, people use other workflows, and I still use Gulp in my legacy project (but I'm slowly moving away from it), but this is how it's done in the best companies, and developers working in this workflow make a LOT of money!
Look at this template; it's a very up-to-date setup consisting of a mixture of the best and latest technologies:
https://github.com/coryhouse/react-slingshot
Webpack
NPM as a build tool (no Gulp, Grunt or Bower)
React with Redux
ESLint
the list is long. Go and explore!
Your questions:
When I want to add a package (and check in the dependency into git),
where does it belong - into package.json or into bower.json
Everything belongs in package.json now
Dependencies required for build are in "devDependencies" i.e. npm install require-dir --save-dev (--save-dev updates your package.json by adding an entry to devDependencies)
Dependencies required for your application during runtime are in "dependencies" i.e. npm install lodash --save (--save updates your package.json by adding an entry to dependencies)
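Put together, the relevant part of package.json ends up split like this (the packages are just the examples from above, with placeholder versions):
{
  "dependencies": {
    "lodash": "^4.0.0"
  },
  "devDependencies": {
    "require-dir": "^1.0.0"
  }
}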
If that is the case, when should I ever install packages explicitly like that without adding them to the file that manages dependencies (apart from installing command line tools globally)?
Always, simply for convenience. When you add a flag (--save-dev or --save), the file that manages deps (package.json) gets updated automatically. Don't waste time editing its dependencies manually. The shortcut for npm install --save-dev package-name is npm i -D package-name, and the shortcut for npm install --save package-name is npm i -S package-name.