Setting Up Flow in An Existing Project - javascript

I am running Ubuntu 16.04.
I am setting up Flow as per the instructions in the link below. However, on the next page (Usage), we are directed to run the command:
flow init
I get the error:
No command found...
This makes sense, as Flow was not installed globally but as a dev dependency within the existing project directory. The docs also recommend, and I quote:
Flow works best when installed per-project with explicit versioning rather than globally.
So, my question is: am I missing a step in the installation of Flow that is causing the error, or should I go ahead and add Flow globally with yarn?
Flow Installation Instructions:
https://flow.org/en/docs/install/

Yarn will only install globally if you run # yarn global <add/bin/ls/remove/upgrade> [--prefix]. Using $ yarn add --dev flow-bin as the documentation states will be sufficient. Then, you should run $ yarn run flow.
The full instructions are here, and you can follow them with no problems.
You can also install it using npm instead of yarn:
$ npm install --save flow-bin
Edit
To get the flow init command to work on its own, you have to install the Flow CLI globally, as the local flow binary won't be in your $PATH environment variable. The command is almost the same:
# npm install --global flow-bin
Alternatively you can execute the binary from within your local path. Something like: $ ./node_modules/.bin/flow init
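If you prefer to keep Flow strictly per-project, a minimal sketch of that workflow (assuming yarn 1, with flow-bin as the only dev dependency involved) looks like:
$ yarn add --dev flow-bin    # install Flow locally and pin it in devDependencies
$ yarn run flow init         # yarn resolves the local binary, so no global install is needed
$ yarn run flow              # start the Flow server and type-check the project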

Related

How do I run npx commands without prefixing `npx`

For example when I run gulp I don't have to do npx gulp.
I can omit the npx and just run gulp
How do I do this for my own package?
I've added mycommand to the package npm bin config, but I still always have to do npx mycommand for it to work.
It depends on your operating system. On a UNIX-based OS (i.e. Linux or Mac) you can use the alias command:
$ alias gulp="npx gulp"
For the rest of your terminal session, you can then run:
$ gulp
to run npx gulp. However, whenever you restart your terminal program, you'll lose the alias.
To make the alias permanent, you need to add the alias command to the appropriate start-up file (eg. .bashrc, .profile, etc.) for your OS. Simply copy/paste the exact command you used before, at the end of that file, save, and restart your terminal. You'll have the alias permanently.
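For example, on a typical bash setup (assuming ~/.bashrc is your start-up file; yours may differ), making the alias permanent might look like:
$ echo 'alias gulp="npx gulp"' >> ~/.bashrc   # persist the alias
$ source ~/.bashrc                            # reload it in the current session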
Aliases in Windows are also possible, but are a bit trickier; see Aliases in Windows command prompt.
You don't have to use npx gulp for gulp because it is installed globally.
When a package is installed globally, npm puts launcher scripts (gulp, plus gulp.cmd and gulp.ps1 on Windows) in a directory that is on your PATH, so you can run the command directly from anywhere.
If you install gulp globally:
npm i -g gulp
you will be able to just run gulp.
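If you want to confirm where that global command ends up (Unix example; the exact path varies by setup):
$ which gulp        # prints the global shim, e.g. /usr/local/bin/gulp
$ gulp --version    # runs the globally installed copy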
I would avoid answers suggesting global installs (npm install -g). Global installs guarantee that everyone working on a project has their own unique set of tools, and are the reason companies end up with huge wiki pages on "how to get the project working locally".
You mention the "bin config", but perhaps you're misunderstanding that one. That's not for specifying commands that your app will use. If you were making a CLI application, the bin section in package.json is where you would specify the commands exported by your project. For example, gulp exports the gulp command like this:
"bin": {
"gulp": "./bin/gulp.js"
},
Instead, you would add dependencies needed by your application to devDependencies. This ensures that everyone using the project will get the same version of all of the tools, such as gulp and tsc.
Using npx is a far better solution because you can add everything the project needs in devDependencies, and npx will use that version. For the common command-line tools, an alias such as @machineghost suggests is a better way to go, but to expand on it a bit:
Sometimes, the name of the command is different from the name of the package. In those cases, the alias can use:
alias tsc='npx --package=typescript tsc'
Normally, when running npx, it will prompt to install if there is not a version found in the current project. This is often a good safeguard, because it reminds you to add it to the devDependencies of the project. However, if you really want it to "just work" like a global install, you can add the "yes" flag:
alias gulp='npx -y gulp'
This will use the version specified in the current project if present, but install it and run it directly if not.
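Putting that together, a minimal sketch of the devDependencies-plus-npx workflow (gulp and typescript are just example tools here):
$ npm install --save-dev gulp typescript    # pin the tools in devDependencies
$ npx gulp --version                        # resolves to ./node_modules/.bin/gulp
$ npx --package=typescript tsc --noEmit     # same idea when the binary name differs from the package name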

Can I import from node_modules of a module?

I'm not that great at JS/TS but I'm helping run a build system. I have a bit of code that was submitted, and I'm not sure if I should alter the build system or kick it back to the developer.
The code looks like this...
import * as googleAuth from "googleapis/node_modules/google-auth-library"
When I run yarn install, the various modules are installed properly in ./node_modules/, including googleapis and google-auth-library. When I run yarn build, I get something like this:
yarn run v1.19.1
$ rm -rf lib; ./node_modules/.bin/tsc -d -p tsconfig.npm.json
Error: src/foo/bar/common/oauth_helper.ts(3,29): error TS[23]: Cannot find module 'googleapis/node_modules/google-auth-library' or its corresponding type declarations.
When I test this out locally, I can make the error go away by doing cd ./node_modules/googleapis && yarn install but that installs a second set of all the modules needed there.
Is there something that I'm missing, or should I tell the developer that he needs to import directly from google-auth-library? Furthermore, if the developer is referencing this module explicitly, I feel like he should list it as a dependency in package.json, not get it implicitly.
I would suggest installing google-auth-library directly instead of going through the node_modules of another package.
Importing through another package's node_modules is risky: if a future googleapis update removes its dependency on google-auth-library for some reason, it will break your app.
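A minimal sketch of the suggested fix, assuming the project already uses yarn as in the question:
$ yarn add google-auth-library    # declare it as a direct dependency in package.json
# then, in oauth_helper.ts, import the package by its own name:
#   import * as googleAuth from "google-auth-library"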

Good way to switch between 2 versions of the same dependency in package.json?

Turns out you can't have comments in JSON files, and it's a bit awkward to have people refer to some documentation telling them what line to copy/paste in and where in order to achieve this.
I think I can make a python script to copy/paste in one of two package.json files depending on what flags they pass in, but that feels overcomplicated.
I think I can include both dependencies (under different names) but that would create a requirement for both to be available, which is not good either.
Looking for ideas/thoughts on a good way to accomplish this. I have a release and dev version of the same dependency and I often need to swap between the two. Would like to improve the workflow beyond just having a notepad on the side with the two lines pasted in it...
yarn and npm already do this job, why not use them?
Releases
Tag the dev versions when you release them (publish from the dependency's directory):
yarn publish --tag dev
npm publish --tag dev
Then reference the tag at install time: yarn add dep@dev.
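If it helps, you can check which version each tag currently points at (dep is the placeholder package name from above):
$ npm dist-tag ls dep    # lists each tag (latest, dev, ...) and its version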
Local
For local dependencies, npm and yarn provide the "link" command.
In your dependency dir run yarn link
In your project dir run yarn link dep
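For example (the paths here are hypothetical; dep is the dependency's package name):
$ cd ~/work/dep && yarn link              # register the local checkout globally
$ cd ~/work/my-project && yarn link dep   # point the project at that checkout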
You could document the commands as scripts so they're easy to run or view.
"scripts" : {
"dep:local": "yarn link dep",
"dep:dev": "yarn install dep#dev",
"dep:latest": "yarn install dep#latest"
}

`npm install` is not installing local package's sub-dependencies

I have a package (package-a) that depends on another package (package-b) which is not published to npm but is on my file system. When I run npm install from package-a, package-b's dependencies are not installed. I have to navigate to package-b's directory and run npm install manually. Is there a way to install both packages' dependencies with a single npm command?
Here's my directory structure:
/
  ...
  shared/
    ...
    javascript/
      ...
      package-b/
        package.json
  package-a/
    package.json
Per the docs, I placed the following in package-a/package.json. (I'm using npm 5+)
"dependencies": {
  "package-b": "file:../shared/javascript/package-b"
}
When I navigate to /package-a and run npm install, it installs all of package-a's dependencies like normal and also copies the package-b directory to package-a/node_modules. This is presumably what lets me type require('package-b') instead of require('../shared/javascript/package-b').
However, as I stated before, package-b's dependencies are not installed, so if I try to use package-a, I get an error when package-b is required because it is trying to use dependencies that do not exist locally.
Once again, to solve this, I can navigate to package-b and run npm install, but I'm hoping there's a better solution, as I may have many such sub-packages and I'd like to avoid having to write a shell script to go install all my dependencies if I can do this with an npm command. (Perhaps I just did something wrong and npm install should be working?)
Follow-up question: when I run npm install from package-b's directory, the packages are installed there, but not in the version of package-b that got copied to /package-a/node_modules during the first npm install, yet everything still works. So now it seems like when I require('package-b') it is actually resolving to /shared/javascript/package-b and NOT /package-a/node_modules/package-b. So what's the point of copying the files in the first place?
Update
It turns out this is a bug in npm 5. It only occurs when installing from a package-lock.json file. (GitHub Issue)
The files are (probably) not being copied; they're being symbolically linked (symlinked). This essentially creates an alias/shortcut that looks like a real directory but points to another path.
This is how the older npm link feature worked. The reason is that the code stays "live": changes in the linked module are reflected whenever you run the module that's referencing them, meaning you don't have to npm update all the time.
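Until the npm bug is fixed, one possible workaround, sketched against the directory layout from the question and run from the repository root, is to install the local package's own dependencies first:
$ (cd shared/javascript/package-b && npm install)   # install package-b's own dependencies
$ (cd package-a && npm install)                     # then install package-a, which links in package-b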

Best way to test global npm modules

Let's say I develop a global npm module called mytool that registers an executable named mytool through "bin" in package.json.
So after I install it globally by typing
npm install mytool -g
then I can type mytool --someOption in terminal and handle the CLI input in javascript. Now lets assume that mytool works a lot with the current working directory of the CLI, so to just
node index.js --someOption
is a bad idea.
However, to test for bugs I don't want to push a new version of mytool to npm and then install it globally from npm. Rather, I want to be able to test this all locally.
Question: What is the best way to do test global npm modules without publishing to npm?
npm link
Contrary to what it might seem, npm link can be used in your case too.
Just run it on the root folder of your project. No additional arguments are necessary.
Using it in your package folder will create a symlink in the global folder {prefix}/lib/node_modules/<package> that links to the package where the npm link command was executed.
It will also link any bins in the package to {prefix}/bin/{name}.
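A minimal sketch of the local-testing workflow, assuming your package is named mytool and its source lives in ~/dev/mytool (both names are placeholders):
$ cd ~/dev/mytool
$ npm link                  # symlink this folder into the global node_modules and expose its bins
$ cd /some/other/project
$ mytool --someOption       # runs the linked local copy against this working directory
$ npm uninstall -g mytool   # remove the global symlink when you're done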
