I am not used to working with Linux, so maybe this is a silly question. In the lab that I am working in, I was asked to run the command "npm install fs" to read/write files.
But here is the error that I get, and I don't know how to solve it; I couldn't find similar problems either.
PS: I am working on Ubuntu 17.10.
fs is a module that doesn't have to be installed. It comes as part of the Node.js core, so you only need to require it in your code.
Instead of running npm install fs --save, just add the following line to the file you want to use it in:
const fs = require('fs');
Here are the docs: https://nodejs.dev/the-nodejs-fs-module
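For example, a minimal sketch of reading and writing a file with the built-in module (the file name and contents here are just placeholders):
const fs = require('fs');

// write a file, then read it back
fs.writeFileSync('example.txt', 'hello from fs');
const contents = fs.readFileSync('example.txt', 'utf8');
console.log(contents); // "hello from fs"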
However, to address your question, the error message you're receiving is likely due to one of the following:
You don't have a package.json in your project
You are not in the correct directory when running npm install ... and therefore it cannot find your package.json.
I believe your mistake is that you skipped the part where you initialize your npm repository (which generates your package.json).
To initialize an npm repo, do the following:
Navigate to your project root directory
Run npm init and follow the instructions when prompted.
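Running npm init (or npm init -y to accept the defaults) generates a minimal package.json roughly like the following; the values below are just placeholders:
{
  "name": "my-lab-project",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC"
}
Once that file exists in the directory you run npm from, npm install <package> will work as expected.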
Related
READ BEFORE ANSWER: I've already solved this issue. It was a caching issue on the npm servers. Everything works fine after switching to GitHub packages. I've already accepted my own answer.
I have a project which I want to deploy to Elastic Beanstalk, but sometimes the deploy fails on the npm install script with the following message:
npm ERR! code EINTEGRITY
npm ERR! Verification failed while extracting #my-package#^1.2.0:
npm ERR! Verification failed while extracting #my-package#^1.2.0:
npm ERR! sha512-lQ...HA== integrity checksum failed when using sha512: wanted sha512-lQ...HA== but got sha512-nH...ow==. (4835509 bytes)
It fails even on packages which are several weeks old.
I’ve tried:
npm cache clean --force
npm cache verify
node_modules is in .npmignore
package-lock.json is in .npmignore
Writing a mail to support@npmjs.com, but they always reply with unhelpful default answers, without any solution or intention to help.
It fails even on new Elastic Beanstalk instances.
I have no idea how to solve this problem.
EDIT: I've also tried to delete the npm cache in the preinstall script, but it doesn't work either.
EDIT2: My repo has no package-lock.json.
EDIT3: My .npmrc file has the following content
//registry.npmjs.org/:_authToken=${NPM_TOKEN}
unsafe-perm=true
package-lock=false
strict-ssl=false
EDIT4: I think it wasn't clear: it's a private package on the official npm registry. And it doesn't always fail. The current publish process includes several attempts to deploy to the AWS instance until it succeeds.
Have you tried deleting package-lock.json?
OR
Try deleting the npm and npm-cache folders
THEN
re-run npm install
Not exactly your case, but for those who run into the "integrity checksum failed" error, the following might help. But first make sure you understand what's going on: npm tells you that the checksum from https://registry.npmjs.org doesn't match the one from package-lock.json. Either it changed in the registry, or...
Consider a line from the output:
npm ERR! sha512-lQ...HA== integrity checksum failed when using sha512: wanted sha512-lQ...HA== but got sha512-nH...ow==. (4835509 bytes)
Find the package in package-lock.json by the first two integrity checksums (sha512-lQ...HA==), and put the third one (sha512-nH...ow==) into its "integrity" field.
More on it here.
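For illustration, the relevant entry in package-lock.json looks roughly like this (the package name and the truncated checksums are just the placeholders from the error above):
"my-package": {
  "version": "1.2.0",
  "resolved": "https://registry.npmjs.org/my-package/-/my-package-1.2.0.tgz",
  "integrity": "sha512-lQ...HA=="
}
In this case you would change the "integrity" value to the "but got" checksum (sha512-nH...ow==) and re-run npm install.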
It seems to be a caching issue on the npm servers. We've switched from npm to GitHub Packages; everything works fine there.
It could be that the version of npm on these instances is out of date. Could you try updating it: npm install -g npm
Have you made sure that, when this is deployed to Beanstalk, the package-lock file is not on the instance? If you have a bad lock file, it needs to be deleted and regenerated.
Short of that, would need more information as you seem to have exhausted a lot of options.
This can happen if you request a version that is not available on the registry.
With #my-package#^1.2.0 you're requesting a version between >=1.2.0 and <2.0.0. Could it be that on this registry there is only a version that is older than 1.2.0 or newer than 2.0.0? Npm will install whatever it gets and not raise an error here.
You can check the version you get in an npm install by looking into node_modules/my-package/package.json.
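For example, a quick check from the project root after an install (using the placeholder package name from the error):
// prints the version of the package that actually got installed
console.log(require('my-package/package.json').version);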
If this is not happening when doing a local npm install, check whether the npm registry Amazon uses contains your my-package package.
You could try adding the official npm registry to your Beanstalk project to check whether it was the Amazon npm registry that did not contain your package. See How to use a private npm registry on Elastic Beanstalk? for how to do this.
It seems to be a package-lock.json issue.
As in this answer
If you have not pushed package-lock.json to your repo, it will be generated while running npm install. So it is always better to commit package-lock.json to the repo, to avoid inconsistent package-lock.json files between the local machine and the deployment machine.
Could you please try pushing a fresh package-lock.json file to the repo and trying again?
In my case, as razki alludes to, the version of npm/node on the build server differed significantly from the version on the developer's local computer. Updating to a close enough version got rid of this problem.
For example:
The build server had: npm/6.13.4 node/v12.14.1
The developer has: npm/6.14.8 node/v14.15.1.
The build server now: npm/6.14.10 node/v14.15.4
It seems the different versions calculate the sha differently for the same package. This is why removing the package-lock.json file can work in this particular situation - at least for a while, until the computer with the different version tries to build the project again.
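A quick way to compare environments before and after updating is to print the runtime version on each machine (a trivial Node check; npm's own version is easiest to see with npm --version):
// versions.js - run on both the build server and the developer machine
console.log('node:', process.version);
console.log('v8:', process.versions.v8);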
Basically this is a concern about the npm registry; somehow the npm registry may have been changed to another URL.
You can run the command below to see your npm registry:
npm config get registry
It should be set to
https://registry.npmjs.org/
If it's not, then run the command below:
npm config set registry https://registry.npmjs.org/
That will set the npm registry. Now you can try again with
npm i
and it will install the packages successfully.
I am new to Node.js and GitHub. I was trying to save some work using the command git add -A, and then it printed a huge number of lines that just kept scrolling non-stop. I pressed Ctrl+C to stop it, but does anyone know what happened, or what I did wrong?
Thanks
This is because of how git treats the space character.
Find more info here: https://stackoverflow.com/a/1967986/2874959
Thanks @bcorbella for the answer. Just a small point to make sure you won't do this as a beginner: never add node_modules to your git project. Create a .gitignore file with at least:
node_modules
Use npm init and npm install <module> --save to create a package.json... then simply run npm install when you check out your project.
More info in here https://docs.npmjs.com/getting-started/using-a-package.json
Try setting the config core.eol to native and see if you get the same error. I see no reason why you should be tracking the node_modules/ folder.
> git config --global core.eol native
I have a package (package-a) that depends on another package (package-b) which is not published to npm but is on my file system. When I run npm install from package-a, package-b's dependencies are not installed. I have to navigate to package-b's directory and run npm install manually. Is there a way to install both packages' dependencies with a single npm command?
Here's my directory structure:
/
  ...
  shared/
    ...
    javascript/
      ...
      package-b/
        package.json
  package-a/
    package.json
Per the docs, I placed the following in package-a/package.json. (I'm using npm 5+)
"dependencies": {
  "package-b": "file:../shared/javascript/package-b"
}
When I navigate to /package-a and run npm install, it installs all of package-a's dependencies as normal and also copies the package-b directory to package-a/node_modules. This is presumably what lets me type require('package-b') instead of require('../shared/javascript/package-b').
However, as I stated before, package-b's dependencies are not installed, so if I try to use package-a, I get an error when package-b is required, because it is trying to use dependencies that do not exist locally.
Once again, to solve this I can navigate to package-b and run npm install, but I'm hoping there's a better solution, as I may have many such sub-packages and I'd like to avoid writing a shell script to go and install all my dependencies if I can do this with an npm command. (Perhaps I just did something wrong and npm install should be working?)
Follow-up question: when I run npm install from package-b's directory, the packages are installed there, but not in the version of package-b that got copied to /package-a/node_modules during the first npm install, yet everything still works. So now it seems that when I require('package-b') it is actually resolving to /shared/javascript/package-b and NOT /package-a/node_modules/package-b. So what's the point of copying the files in the first place?
Update
It turns out this is a bug in npm 5. It only occurs when installing from a package-lock.json file. (GitHub Issue)
The files are (probably) not being copied, they're being symbolically linked (symlink). This essentially creates an alias/shortcut that looks like a real directory, but points to another path.
This is how the older npm link feature worked. The reason is that the code stays "live": changes in the linked module are reflected whenever you run the module that references them, meaning you don't have to npm update all the time.
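If you want to confirm what happened on disk, a small check like the following tells you whether npm created a symlink or a real copy (a sketch, assuming the directory layout from the question):
const fs = require('fs');

// lstat does not follow symlinks, so it reports on the link itself
const stats = fs.lstatSync('node_modules/package-b');
console.log(stats.isSymbolicLink()); // true if it is a symlink, false if it is a real copy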
After installing node-crawler in Node.js (not in the default directory) via the npm command, I tried to run the code in the "Usage" section, but an error occurs when executing var Crawler = require("crawler"); and the Visual Studio Code debug console says Cannot find module 'crawler'.
Does it happen because I installed crawler in a custom location? How can I fix this?
npm install will install a package locally (use --save to have the package appear in your dependencies).
To have access to it from everywhere, you need to install it globally, using npm install -g
Maybe I found the solution. I replaced "crawler" in var Crawler = require("crawler"); with the path that points to the crawler.js file in the lib folder in node_modules, and now the code works. Maybe it happened because I installed crawler in a custom location, so Visual Studio Code couldn't find "crawler".
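For illustration, such a path-based require looks something like this; the path below is only a placeholder for wherever node-crawler actually ended up:
// works, but ties the code to one machine's directory layout
var Crawler = require('/path/to/custom/location/node_modules/crawler/lib/crawler.js');
Installing the package locally in the project (npm install crawler from the project root) or globally, as suggested above, avoids hard-coding paths like this.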
I just started using Laravel Mix which is using webpack. I'm having some issues resolving dependencies.
I tried to install l20n with npm install l20n, added it to my project by adding require('l20n'); and then I ran npm run dev only to be told the following:
ERROR Failed to compile with 1 errors
This dependency was not found:
* fs in ./~/l20n/dist/bundle/node/l20n.js
Alright, so I figured I had to install fs too, issuing npm install fs, and then I ran npm run dev again, but I got the exact same message. What am I doing wrong?
OK, I checked out the source and I think I know what the issue is: the lib you are using is supposed to run in a Node environment.
So, in your webpack configuration add this:
target: 'node'
For more info on targets see this
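In a Laravel Mix project you normally don't edit webpack's config file directly; here is a sketch of how this could look in webpack.mix.js, assuming Mix's webpackConfig helper and placeholder asset paths:
// webpack.mix.js
const mix = require('laravel-mix');

// merge extra options into the underlying webpack configuration
mix.webpackConfig({
    target: 'node'
});

mix.js('resources/assets/js/app.js', 'public/js');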