Why the need for browserify `paths` definition? - javascript

The link https://github.com/jhades/angularjs-gulp-example/blob/master/gulpfile.js shows a gulp build-js task that uses browserify's paths option. I don't understand the need for it. Wouldn't it have been possible to just specify the entries as entries: './js/**/*.js', which would have made it search all the sub-directories as well, instead of explicitly specifying paths: ['./js/controllers', './js/services', './js/directives'], which are all sub-directories of the same parent?
Any hints appreciated.

The author is using the paths configuration to enable non-relative require calls like these:
require('todoCtrl');
require('todoStorage');
require('todoFocus');
require('todoEscape');
require('footer');
Browserify emulates Node's module resolution mechanism (which is explained here) and when Node resolves a non-relative require, it looks in node_modules. The paths option gives Browserify a list of paths that are not in node_modules that it should check (before checking node_modules) when attempting to resolve non-relative require calls.
If all of your require calls for modules in your own project use relative paths (e.g. require('./js/controllers/todoCtrl')), you won't need the paths configuration option.
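For illustration, the relevant part of such a browserify setup looks roughly like this (the entry file name here is an assumption, not copied from the linked gulpfile):

var browserify = require('browserify');

browserify({
    entries: './js/app.js',   // hypothetical single entry point
    paths: ['./js/controllers', './js/services', './js/directives']
});
// With paths set, require('todoCtrl') is resolved by checking those folders
// before node_modules, so ./js/controllers/todoCtrl.js is found without a relative path.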

Well, one simple answer seems to be that **/* globs are not recognised by browserify! You would have to require("glob") to do that (see the sketch below), but it's probably simpler just to use paths to specify the extra folders.
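If you really did want to pick up every file under ./js, a minimal sketch using the glob package (an extra dependency, not part of browserify) would be:

var glob = require('glob');
var browserify = require('browserify');

// Expand the pattern yourself, then pass the resulting file list as entries.
var files = glob.sync('./js/**/*.js');
browserify({ entries: files });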

Related

Webpack can't resolve non-js `require`s from within node_modules

I've got a Next.js project configured to resolve imports that end in .web.js. This works outside of my node_modules directory. I did this by setting resolve.extensions = ['.web.js', '.js', '.jsx'] in my webpack config. I understand that this setting is responsible for resolving imports that don't have an extension, e.g. import _ from './component', when ./component.web.js exists.
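For context, that setting looks something like this in a webpack config:

// webpack.config.js (as described above)
module.exports = {
  resolve: {
    extensions: ['.web.js', '.js', '.jsx']
  }
};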
I also have some node_modules that make use of this .web.js extension. They're private modules, but the idea stands. Let's say our node_modules looks like this. It may be worth noting that these modules have already been transpiled and as such use require rather than import.
- node_modules
  - #foo
    - bar.js
    - baz.web.js
    - baz.native.js
Now let's say that we have the following:
// bar.js
require("./baz");
If I try to import #foo/bar, the app will throw a module not found error on the line require("./baz"); saying that it can't be found. If I change it to require("./baz.web.js") or remove the line altogether then the app runs fine.
Why can webpack make this kind of resolution outside of node_modules, but not within that directory? And how can I tell webpack to resolve those imports, too?
Depending on your module resolution strategy, you'll either find some files or not. Node.js resolves modules as outlined here. This means that require('./baz') is resolved to a request for /path/to/module/baz.js. Since your file is not actually named baz.js, it is not found. You can use require('./baz.web') instead.
As to whether Webpack can "automatically" handle which import to use, it probably comes down to using a plugin or having some sort of logic in bar.js to choose between baz.web and baz.native.
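As a sketch of that second suggestion (purely illustrative; a real package would more likely branch on a platform flag than on a failed require):

// bar.js - choose an implementation explicitly instead of relying on extension resolution
let baz;
try {
  baz = require("./baz.web.js");
} catch (err) {
  baz = require("./baz.native.js");
}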

Jest: automock modules, but only those defined in __mocks__, rather than all

TL;DR
I would like to have some kind of automock feature enabled, but only (and only!) for the modules that I have explicitly defined in a corresponding __mocks__ folder. Is there a way for this in Jest?
General advice and suggestions are also welcome.
A bit of context: (optional)
Turns out I was totally misunderstanding Jest's automock feature. Btw, looking back now, I don't understand why, because the docs are pretty clear on what it actually does:
This option tells Jest that all imported modules in your tests should be mocked automatically.
It's as if I just never noticed the ALL keyword. Maybe I was thinking: it doesn't make sense to automock even the imported function that I'm actually going to test here, does it? Obviously I would like to automock third-party stuff from node_modules, but not my own code. And it turns out that:
Note: Node modules are automatically mocked when you have a manual mock in place (e.g.: __mocks__/lodash.js).
Note: Core modules, like fs, are not mocked by default. They can be mocked explicitly, like jest.mock('fs')
So it's kind of doing the opposite of what I thought it was doing.
If I understand correctly, you have certain mock files in your __mocks__ folder which you would like to be globally replaced whenever they're needed as part of a test file imports.
Jest has a means of configuring that. If you go through the Jest Configuration Documentation, you'll find a property named moduleNameMapper which takes an object with the keys being regular expressions and the values being the paths for the mock for the corresponding modules which match the regex.
In your jest.config.js, you would need to add a separate parameter and specify the mock pattern and path.
moduleNameMapper: {
  "^bootstrap$": "<rootDir>/tests/__mocks__/bootstrapMock.js",
  "^axios$": "<rootDir>/tests/__mocks__/axiosMock.js"
}
You can find more info about the usage in the Jest docs.
However, if you do not want to go through all this, it is also achievable by placing a __mocks__ folder next to your node_modules folder. Note that, as stated in the documentation for Manual Mocks:
If we want to mock Node's core modules (e.g.: fs or path), then explicitly calling e.g. jest.mock('path') is required, because core Node modules are not mocked by default.
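For completeness, explicitly mocking a core module looks like this (assuming a manual mock exists at __mocks__/fs.js; otherwise Jest falls back to an automatic mock):

// example.test.js
jest.mock('fs'); // required for core modules like fs

const fs = require('fs');
// fs now refers to the mock rather than the real module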

How to find an ES6 import module without a relative path?

I have an ES6 import.
import MyAwesomeComponent from 'packageNameOnlyWithoutPath';
I want to inspect the file packageNameOnlyWithoutPath. But I can't find it. I looked in node_modules but I don't see it there. So it might be hiding out elsewhere in the app.
Is there a canonical way to find the path that leads to packageNameOnlyWithoutPath?
You might want to take a look at the index.js file in the packageNameOnlyWithoutPath folder inside node_modules.
Otherwise, use a text editor that supports a go-to-definition plugin.
TL;DR: Check resolve aliases in Webpack (or similar bundler) config or .babelrc
There are two places you can check first.
If you are using a bundler like Webpack, resolve aliases can be declared in the Webpack config file (usually webpack.config.js).
But I have also recently started using pure Babel and Node. The resolve aliases can also be declared in the .babelrc file (a cleaner approach, IMHO).
You should find what you're looking for in one of the above.
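For reference, a webpack resolve alias that would make such an import work typically looks something like this (the alias target path here is hypothetical):

// webpack.config.js (sketch)
const path = require('path');

module.exports = {
  resolve: {
    alias: {
      // hypothetical mapping: the bare import name points at a local folder, not node_modules
      packageNameOnlyWithoutPath: path.resolve(__dirname, 'src/packageNameOnlyWithoutPath')
    }
  }
};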

Using Babel's `sourceRoot` Doesn't Affect Imports

Currently I can do:
require('./frontend/src/components/SomeComponent');
But if I set the following in my webpack.config.js:
resolve: {
root: path.resolve('frontend', 'src')
}
I can instead do:
require('components/SomeComponent');
The problem is, when I don't use Webpack (eg. in a test environment) all of my imports break. According to the Babel docs, the sourceRoot property sets the "root from which all sources are relative." This made me think I could add the following to my .babelrc to fix my imports:
"sourceRoot": "frontend/src"
... but no such luck. When I do require('components/SomeComponent'); in babel-node it fails. When I just use Babel to transpile the file, the require line is the same whether or not I set a sourceRoot.
So, my question is, is there any way (with or without sourceRoot) to simulate webpack's resolve.root in Babel?
P.S. I know there are several Babel plug-ins which address this problem, but all of the ones I've seen require you to add a ~ to the require path (which of course breaks imports in Webpack).
Many projects have webpack + Babel, and in many projects you sometimes bypass webpack (as in your case, for tests).
In such cases, all the resolve aliases should live in Babel (see the sketch below).
There are plugins out there that allow one tool to read the configuration of the other (and similar plugins for ESLint, etc.).
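For example, with babel-plugin-module-resolver (one common choice; your setup may differ), a root option mirrors webpack's resolve.root without needing a ~ prefix. In .babelrc:

{
  "plugins": [
    ["module-resolver", {
      "root": ["./frontend/src"]
    }]
  ]
}

With that in place, require('components/SomeComponent') should also resolve when running under babel-node or in tests.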

Getting jQuery load/get to use the current require.js path

require.js allows you to smartly redefine the "include path" so that you can install your dependencies into arbitrary sub-directories, transparently to them.
Well, this is true for the recursive require() calls these might make, but unfortunately it doesn't seem to work for the jQuery load/get calls they might make: those still refer to the HTML page's path, so data files cannot be moved together with their lib.js.
How do I make jQuery load/get refer to the current RequireJS path?
Or am I misusing require.js features?
Thanks!
You can use toUrl for this. (Documented: go here in the documentation and search for toUrl.)
You need to have require as a dependency of your module (define(["require", ...], function (require, ...) {...) and then you call require.toUrl(path) where path is a path relative to your module. This call will return a path which is the same as if the path argument were a module's path defined in your RequireJS configuration. In other words, it is a path which will be relative if your baseUrl in your configuration was relative, or absolute if baseUrl is absolute. At any rate, you should just be able to feed it to $().load or $.get.
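A minimal sketch (the element selector and template path are made up for illustration):

define(["require", "jquery"], function (require, $) {
    // Resolve a path relative to this module, then hand it to jQuery
    var templateUrl = require.toUrl("./templates/widget.html");
    $("#container").load(templateUrl);
});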
