I am trying to import a module from hyphen like this: import { hyphenateHTMLSync } from "hyphen/fr"; in the script tag of a Svelte component, but I get the following error from Rollup: Error: 'hyphenateHTMLSync' is not exported by node_modules/hyphen/fr/index.js.
The module file in question looks like this:
node_modules/hyphen/fr/index.js
module.exports = require("../export-contract.js")(
require("../patterns/fr.js")
);
node_modules/hyphen/export-contract.js
var createHyphenator = require("./hyphen.js");
module.exports = function (patterns) {
return {
hyphenate: createHyphenator(patterns, { async: true }),
hyphenateHTML: createHyphenator(patterns, { async: true, html: true }),
hyphenateHTMLSync: createHyphenator(patterns, { html: true }),
hyphenateSync: createHyphenator(patterns),
patterns: patterns
};
};
And hyphen.js contains the function that creates the hyphenators.
I don't know enough about Rollup, Svelte, or even Node to know how to fix this.
Rollup requires extra plugins (@rollup/plugin-node-resolve and @rollup/plugin-commonjs) to deal with CommonJS modules such as hyphen, as explained here.
A very basic Rollup example config using both plugins is given here.
In your particular use case, if you still have issues using the basic config, you'll probably want to dig into the dynamicRequireTargets option of the commonjs plugin.
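For reference, a minimal sketch of what that can look like in a typical Svelte rollup.config.js (the plugin options and file paths below are assumptions based on the standard Svelte template, adjust them to yours):
// rollup.config.js (sketch)
import svelte from 'rollup-plugin-svelte';
import { nodeResolve } from '@rollup/plugin-node-resolve';
import commonjs from '@rollup/plugin-commonjs';

export default {
  input: 'src/main.js',
  output: { file: 'public/build/bundle.js', format: 'iife', name: 'app' },
  plugins: [
    svelte(),
    // resolve bare imports such as "hyphen/fr" from node_modules
    nodeResolve({ browser: true }),
    // convert hyphen's CommonJS exports into named ES module exports,
    // so that `import { hyphenateHTMLSync } from "hyphen/fr"` works
    commonjs()
  ]
};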
Related
I'm working on an extension system for my web app. Third-party developers should be able to extend the app by providing named AMD modules that export constants and functions following a predefined spec, bundled into a single JavaScript (.js) file.
Example JavaScript bundle:
define('module1', ['exports', 'module3'], (function (exports, module3) {
exports.spec = 'http://example.com/spec/extension/v1'
exports.onRequest = function (request) { return module3.respond('Hello, World.') }
}));
define('module2', ['exports', 'module3'], (function (exports, module3) {
exports.spec = 'http://example.com/spec/extension/v1'
exports.onRequest = function (request) { return module3.respond('Foo. Bar.') }
}));
define('module3', ['exports'], (function (exports) {
exports.respond = function (message) { return { type: 'message', message: message } }
}));
In the above example module1 and module2 are extension modules (identified by the spec export) and module3 is a shared dependency (e.g. coming from an NPM package). Extension bundles will be loaded in a worker within a sandboxed iframe to seal off the untrusted code in the browser.
Example TypeScript source:
// module1.ts
import { respond } from 'module3'
export const spec = 'http://example.com/spec/extension/v1'
export const onRequest = (request: Request): Response => respond('Hello, World.')
// module2.ts
import { respond } from 'module3'
export const spec = 'http://example.com/spec/extension/v1'
export const onRequest = (request: Request): Response => respond('Foo. Bar.')
// module3.ts
import dep from 'some-npm-package'
export const respond = (message: string) => dep.createMessageObject(message)
Here is my list of requirements to bundling:
All necessary dependencies (e.g. shared module, NPM package logic) must be included in the bundle
The source code needs to be transpiled to browser compatible code if necessary
The AMD format is required by the custom extension loader implementation
The AMD modules must not be anonymous as the module file names are lost while bundling
Dependencies must not be referenced by relative paths (i.e. module3, not ./path/to/module3)
The result should be one JavaScript bundle, i.e. ONE JavaScript file and ONE source map file
What's the easiest way to do this?
This is the closest solution I found using rollup and the following rollup.config.js:
import { nodeResolve } from '@rollup/plugin-node-resolve'
import { terser } from 'rollup-plugin-terser'
import typescript from '@rollup/plugin-typescript'
export default {
input: [
'src/module1.ts',
'src/module2.ts'
],
output: {
dir: 'dist',
format: 'amd',
sourcemap: true,
amd: {
autoId: true
}
},
plugins: [
typescript(),
nodeResolve(),
terser()
]
}
From this I get the desired named AMD modules (one for each entry point and chunk) in separate .js files. Problems:
Some dependencies are referenced by ./module3 while being named module3.
The modules appear in separate JavaScript and Sourcemap files instead of being concatenated into a single bundle.
Questions:
Is there an easy fix to the above rollup.config.js config to solve the problem?
I tried to write a small Rollup plugin, but I failed to get hold of the final AMD module code within it in order to concatenate it into a bundle; only the transpiled code is available to me. In addition, I don't know how to handle source maps during concatenation.
Is there an alternative to rollup better suited to this bundling scenario?
The big picture: Am I completely on the wrong track when it comes to building an extension system? Is AMD the wrong choice?
I found a way to extend the rollup.config.js mentioned in the question with a custom concatChunks Rollup plugin that bundles multiple AMD chunks within a single file and has the source maps rendered, too. The only issue I didn't find an answer to was the relative module names that kept popping up; however, this may be resolved in the AMD loader.
Here's the full rollup.config.js that worked for me:
import Concat from 'concat-with-sourcemaps'
import glob from 'glob'
import typescript from '@rollup/plugin-typescript'
import { nodeResolve } from '@rollup/plugin-node-resolve'
import { terser } from 'rollup-plugin-terser'
const concatChunks = (
fileName = 'bundle.js',
sourceMapFileName = 'bundle.js.map'
) => {
return {
name: 'rollup-plugin-concat-chunks',
generateBundle: function (options, bundle, isWrite) {
const concat = new Concat(true, fileName, '\n')
// Go through each chunk in the bundle
let hasSourceMaps = false
Object.keys(bundle).forEach(fileId => {
const fileInfo = bundle[fileId]
if (fileInfo.type === 'chunk') {
let hasSourceMap = fileInfo.map !== null
hasSourceMaps = hasSourceMaps || hasSourceMap
// Concat file content and source maps with bundle
concat.add(
fileInfo.fileName,
fileInfo.code,
hasSourceMap ? JSON.stringify(fileInfo.map) : null
)
// Prevent single chunks from being emitted
delete bundle[fileId]
}
})
// Emit concatenated chunks
this.emitFile({
type: 'asset',
name: fileName,
fileName: fileName,
source: concat.content
})
// Emit concatenated source maps, if any
if (hasSourceMaps) {
this.emitFile({
type: 'asset',
name: sourceMapFileName,
fileName: sourceMapFileName,
source: concat.sourceMap
})
}
}
}
}
export default {
input: glob.sync('./src/*.{ts,js}'),
output: {
dir: 'dist',
format: 'amd',
sourcemap: true,
amd: {
autoId: true
}
},
plugins: [
typescript(),
nodeResolve(),
terser(),
concatChunks()
]
}
Please make sure you npm install the dependencies referenced in the import statements to make this work.
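For this config, that means something along the lines of the following (package names taken from the imports above; depending on your Rollup and TypeScript versions you may also need extra peer dependencies such as tslib):
npm install --save-dev rollup typescript concat-with-sourcemaps glob @rollup/plugin-typescript @rollup/plugin-node-resolve rollup-plugin-terser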
Considering the big picture, i.e. the extension system itself, I am moving away from a "one AMD module equals one extension/contribution" approach, as current developer tools and JavaScript bundlers are not ready for that (as this question shows). I'll go with an approach similar to the Visual Studio Code Extension API and will use a single "default" module with an activate export to register contributions a bundle has to offer. I hope that this will make extension bundling an easy task no matter what tools or languages are being used.
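For illustration only, such an entry module could look roughly like this; the host-side API (the context object and its registerRequestHandler method) is hypothetical and just shows the shape I have in mind:
// extension entry point (sketch, hypothetical host API)
export const spec = 'http://example.com/spec/extension/v1'

export function activate (context) {
  // register every contribution this bundle has to offer with the host app
  context.registerRequestHandler('hello', (request) => ({
    type: 'message',
    message: 'Hello, World.'
  }))
}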
A user tries to use my package for nuxt.js, but gets the error: document is not defined.
I found the first issue. When I build the bundle with "build-bundle": "vue-cli-service build --target lib --name index ./src/index.js",
vue-style-loader is being used. This, however, results in the error in Nuxt projects. This part is failing:
function addStyle (obj /* StyleObjectPart */) {
var update, remove
var styleElement = document.querySelector('style[' + ssrIdKey + '~="' + obj.id + '"]')
document is not defined since we are using server-side rendering. But the question is: how can I build my package so that it can be used with Nuxt?
I need:
index.common.js
index.umd.js
index.umd.min.js
This is due to the server-side rendering. If you need to specify that you want to import a resource only on the client-side, you need to use the process.client variable.
For example, in your .vue file:
if (process.client) {
require('external_library')
// do something
}
The above is the fundamental fix for document is not defined.
I did some digging and found that this problem is not caused by your package. In fact, the problem lies in the cache-loader package in the user's Nuxt project.
For some reason cache-loader incorrectly determines the current environment as browser rather than Node, so vue-style-loader gets confused and uses the client implementation instead.
So ask users to add the following configuration to their nuxt.config.js file to disable stylesheet caching on the server side:
build: {
...
cache: true,
extend(config, { isServer, isDev, isClient }) {
...
if (isServer) {
for (const rules of config.module.rules.filter(({ test }) =>
/\.((c|le|sa|sc)ss|styl.*)/.test(test.toString())
)) {
for (const rule of rules.oneOf || []) {
rule.use = rule.use.filter(
({ loader }) => loader !== 'cache-loader'
)
}
}
}
...
}
...
}
I found a solution, but it does not use the vue-cli service; instead, the files are compiled by Rollup. I found using the CLI service much easier. The only problem with the CLI service is that it will adjust the "flow" of your repo; however, you can modify the rollup.config.js to amend the folder structure.
The problem with Rollup is that it isn't webpack, so all components relying on a webpack configuration need to be adjusted, or rollup.config.js needs to be amended to include the additional functionality.
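As a rough sketch of that direction (the file names, the library name MyLibrary and the plugin choice below are assumptions; depending on your components you may also need a CSS/PostCSS plugin), a rollup.config.js mirroring the three vue-cli-service outputs could look like:
// rollup.config.js (sketch)
import vue from 'rollup-plugin-vue';
import { terser } from 'rollup-plugin-terser';

export default {
  input: 'src/index.js',
  // keep Vue itself out of the library bundle
  external: ['vue'],
  plugins: [vue()],
  output: [
    { file: 'dist/index.common.js', format: 'cjs', exports: 'named' },
    { file: 'dist/index.umd.js', format: 'umd', name: 'MyLibrary', globals: { vue: 'Vue' }, exports: 'named' },
    // same UMD build, minified per-output via rollup-plugin-terser
    { file: 'dist/index.umd.min.js', format: 'umd', name: 'MyLibrary', globals: { vue: 'Vue' }, exports: 'named', plugins: [terser()] }
  ]
};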
So I'm trying to set up TypeScript and Chutzpah for testing purposes. TypeScript is set up to output in this format:
define(['require', 'exports', './someModule'], function(require, exports, someModule) {
//examplecode
});
This works fine; the problem occurs when someModule is actually a directory with an index.js:
/app
app.js
/someModule
index.js
require.js is unable to resolve someModule in this way and the test fails.
Is there any way to tell require.js that this is a module?
RequireJS won't automatically check for the presence of index.js and load that as your module. You need to tell RequireJS that when you want to load someModule, it should load someModule/index. I'd set a map in my call to require.config:
require.config({
[ ... ]
map: {
'*': {
someModule: 'someModule/index',
}
},
});
You have to adjust the name you give there so that it is a path relative to your baseUrl. It's not clear from the information you give in your question what it should be.
(For the record, there's also a packages setting that you could probably tweak to do what you want, but putting something in packages says "this is a package", which is not what you appear to have here. So I would not use it for what you are trying to do.)
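For completeness, that packages variant would look roughly like this (main defaults to 'main', so you have to point it at index explicitly; the location shown here is an assumption and is resolved relative to your baseUrl):
require.config({
  packages: [
    // treat someModule as a package whose entry file is index.js
    { name: 'someModule', location: 'someModule', main: 'index' }
  ]
});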
I didn't like the map configuration either. The simplest way I accomplished this was by writing a plugin for require.
Let's name the plugin mod, so it is used as mod!module/someModule; you could also call it index, as in index!module/someModule, whatever suits you best.
define(function(require, exports, module) {
// loading module/someModule/index.js with `mod!`
var someModule = require('mod!module/someModule');
// whatever this is about ..
module.exports = { .. };
});
So let's assume you have paths set in require's configuration and some sort of project structure like:
- app
- modules
- someModule/index.js // the index we want to load
- someModule/..
- someModule/..
- etc
- plugins
- mod.js // plugin to load a module with index.js
Require's config:
require.config({
paths: {
'module': 'app/modules',
// the plugin we're going to use so
// require knows what mod! stands for
'mod': 'app/plugins/mod.js'
}
});
To read all the aspects of how to write a plugin, read the docs at requirejs.org. The simplest version would be to just rewrite the name of the requested "module" you are attempting to access and pass it back to load.
app/plugins/mod.js
(function() {
define(function () {
function parse(name, req) {
return req.toUrl(name + '/index.js');
}
return {
normalize: function(name, normalize) {
return normalize(name);
},
load:function (name, req, load) {
req([parse(name, req)], function(o) {
load(o);
});
}
};
});
})();
This is not production code; it's just a simple way to demonstrate that require's config wasn't meant to solve problems like this.
Suppose I have the following module:
var modulesReq = require.context('.', false, /\.js$/);
modulesReq.keys().forEach(function(module) {
modulesReq(module);
});
Jest complains because it doesn't know about require.context:
FAIL /foo/bar.spec.js (0s)
● Runtime Error
- TypeError: require.context is not a function
How can I mock it? I tried using the setupTestFrameworkScriptFile Jest configuration, but the tests can't see any changes that I've made to require.
I had the same problem, so I put together a 'solution'.
I'm pretty sure this is not the best choice, and I ended up not using it, for the reasons discussed here:
https://github.com/facebookincubator/create-react-app/issues/517
https://github.com/facebook/jest/issues/2298
But if you really need it, you should include the polyfill below in every file that calls it (not in the test file itself, because require is not globally overridden in a Node environment).
// This condition should detect whether we are running in a Node environment
if (typeof require.context === 'undefined') {
const fs = require('fs');
const path = require('path');
require.context = (base = '.', scanSubDirectories = false, regularExpression = /\.js$/) => {
const files = {};
function readDirectory(directory) {
fs.readdirSync(directory).forEach((file) => {
const fullPath = path.resolve(directory, file);
if (fs.statSync(fullPath).isDirectory()) {
if (scanSubDirectories) readDirectory(fullPath);
return;
}
if (!regularExpression.test(fullPath)) return;
files[fullPath] = true;
});
}
readDirectory(path.resolve(__dirname, base));
function Module(file) {
return require(file);
}
Module.keys = () => Object.keys(files);
return Module;
};
}
With this function, you don't need to change any require.context call; it will execute with the same behavior as before (under webpack it will just use the original implementation, and inside Jest it will use the polyfill).
After spending some hours trying each of the answers above, I would like to contribute.
Adding the babel-plugin-transform-require-context plugin to .babelrc for the test env fixed all the issues.
Install babel-plugin-transform-require-context from https://www.npmjs.com/package/babel-plugin-transform-require-context (available with yarn too).
Now add the plugin to .babelrc:
{
"env": {
"test": {
"plugins": ["transform-require-context"]
}
}
}
It will simply transform require.context calls in the test env into dummy function calls so that the code can run safely.
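Conceptually (this is an illustration of what a "dummy" replacement amounts to, not the plugin's literal output), the transformed call behaves like an empty context:
// what a dummy require.context roughly amounts to under test
var modulesReq = (function () {
  var dummy = function () { return {}; };   // requiring any key yields nothing
  dummy.keys = function () { return []; };  // no files are matched in tests
  dummy.resolve = function () { return ''; };
  return dummy;
})();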
If you are using Babel, look at babel-plugin-require-context-hook. Configuration instructions for Storybook are available at Storyshots | Configure Jest to work with Webpack's require.context(), but they are not Storyshots/Storybook specific.
To summarise:
Install the plugin.
yarn add babel-plugin-require-context-hook --dev
Create a file .jest/register-context.js with the following contents:
import registerRequireContextHook from 'babel-plugin-require-context-hook/register';
registerRequireContextHook();
Configure Jest (the file depends on where you are storing your Jest configuration, e.g. package.json):
setupFiles: ['<rootDir>/.jest/register-context.js']
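For example, if your Jest configuration lives in package.json, that becomes:
{
  "jest": {
    "setupFiles": ["<rootDir>/.jest/register-context.js"]
  }
}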
Add the plugin to .babelrc
{
"presets": ["..."],
"plugins": ["..."],
"env": {
"test": {
"plugins": ["require-context-hook"]
}
}
}
Alternatively, add it to babel.config.js:
module.exports = function(api) {
api.cache(true)
const presets = [...]
const plugins = [...]
if (process.env.NODE_ENV === "test") {
plugins.push("require-context-hook")
}
return {
presets,
plugins
}
}
It may be worth noting that using babel.config.js rather than .babelrc may cause issues. For example, I found that when I defined the require-context-hook plugin in babel.config.js:
Jest 22 didn't pick it up;
Jest 23 picked it up; but
jest --coverage didn't pick it up (perhaps Istanbul isn't up to speed with Babel 7?).
In all cases, a .babelrc configuration was fine.
Remarks on Edmundo Rodrigues's answer
This babel-plugin-require-context-hook plugin uses code that is similar to Edmundo Rodrigues's answer here. Props to Edmundo! Because the plugin is implemented as a Babel plugin, it avoids static analysis issues. e.g. With Edmundo's solution, Webpack warns:
Critical dependency: require function is used in a way in which dependencies cannot be statically extracted
Despite the warnings, Edmundo's solution is the most robust because it doesn't depend on Babel.
Extract the call to a separate module:
// src/js/lib/bundle-loader.js
/* istanbul ignore next */
module.exports = require.context('bundle-loader?lazy!../components/', false, /.*\.vue$/)
Use the new module in the module where you extracted it from:
// src/js/lib/loader.js
const loadModule = require('lib/bundle-loader')
Create a mock for the newly created bundle-loader module:
// test/unit/specs/__mocks__/lib/bundle-loader.js
export default () => () => 'foobar'
Use the mock in your test:
// test/unit/specs/lib/loader.spec.js
jest.mock('lib/bundle-loader')
import Loader from 'lib/loader'
describe('lib/loader', () => {
describe('Loader', () => {
it('should load', () => {
const loader = new Loader('[data-module]')
expect(loader).toBeInstanceOf(Loader)
})
})
})
Alrighty! I had major issues with this and managed to come to a solution that worked for me by using a combination of other answers and the Docs. (Took me a good day though)
For anyone else who is struggling:
Create a file called bundle-loader.js and add something like:
module.exports = {
importFiles: () => {
const r = require.context(<your_path_to_your_files>)
<your_processing>
return <your_processed_files>
}
}
In your code, import it like:
import bundleLoader from '<your_relative_Path>/bundle-loader'
Use it like:
let <your_var_name> = bundleLoader.importFiles()
In your test file, right underneath the other imports:
jest.mock('../../utils/bundle-loader', () => ({
importFiles: () => {
return <this_will_be_what_you_receive_in_the_test_from_import_files>
}
}))
Installing the babel-plugin-transform-require-context package and adding the plugin to .babelrc resolved the issue for me.
Refer to the documentation here:
https://www.npmjs.com/package/babel-plugin-transform-require-context
The easiest and fastest way to solve this problem is to install require-context.macro:
npm install --save-dev require-context.macro
then just replace:
var modulesReq = require.context('.', false, /\.js$/);
with:
var modulesReq = requireContext('.', false, /\.js$/);
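Note that the macro itself has to be imported in that file (it works through babel-plugin-macros), so the top of the file would look something like:
// the macro import that makes requireContext available
import requireContext from 'require-context.macro';

var modulesReq = requireContext('.', false, /\.js$/);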
That's it, you should be good to go!
Cheers and good luck!
Implementation problems not mentioned so far:
1. Jest prevents out-of-scope variables in mocks, like __dirname.
2. Create React App limits Babel and Jest customization. You need to use src/setupTests.js, which runs before every test.
3. fs is not supported in the browser. You will need something like BrowserFS. Now your app has file system support, just for dev.
4. Potential race condition: you export after this import, and one of your require.context imports includes that export. I'm sure require takes care of this, but now we are adding a lot of fs work on top of it.
5. Type checking.
Either #4 or #5 created undefined errors. Typing out the imports removed the errors, and there are no more concerns about what can or can't be imported and where.
Motivation for all this? Extensibility: keeping future modifications limited to one new file. Publishing separate modules is a better approach.
If there were an easier way to import, Node would do it. Also, this smacks of premature optimization: you end up scrapping everything anyway because you're now using an industry-leading platform or utility.
If you're using Jest with test-utils in Vue, install these packages:
@vue/cli-plugin-babel
and
babel-plugin-transform-require-context
Then define babel.config.js at the root of the project with this configuration:
module.exports = function(api) {
api.cache(true);
const presets = [
'@vue/cli-plugin-babel/preset'
];
const plugins = [];
if (process.env.NODE_ENV === 'test') {
plugins.push('transform-require-context');
}
return {
presets,
plugins
};
};
This checks whether the current process was initiated by Jest and, if so, mocks all the require.context calls.
I faced the same issue with an ejected create-react-app project, and none of the answers above helped me.
My solution was to change config/babelTransform.js to the following:
module.exports = babelJest.createTransformer({
presets: [
[
require.resolve('babel-preset-react-app'),
{
runtime: hasJsxRuntime ? 'automatic' : 'classic',
},
],
],
plugins:["transform-require-context"],
babelrc: false,
configFile: false,
});
Simplest solution for this:
Just do
var modulesReq = require.context && require.context('.', false, /\.js$/);
if(modulesReq) {
modulesReq.keys().forEach(function(module) {
modulesReq(module);
});
}
So here I have added an extra check: execute only if require.context is defined. By doing this, Jest will no longer complain.
I'm writing some frontend code with ECMAScript 6 (transpiled with BabelJS, and then browserified with Browserify) so that I can have a class in one file, export it and import it in another file.
The way I'm doing this is:
export class Game {
constructor(settings) {
...
}
}
And then on the file that imports the class I do:
import {Game} from "../../lib/pentagine_browserified.js";
var myGame = new Game(settings);
I then compile it with Grunt; this is my Gruntfile:
module.exports = function(grunt) {
"use strict";
grunt.loadNpmTasks('grunt-babel');
grunt.loadNpmTasks('grunt-browserify');
grunt.initConfig({
"babel": {
options: {
sourceMap: false
},
dist: {
files: {
"lib/pentagine_babel.js": "lib/pentagine.js",
"demos/helicopter_game/PlayState_babel.js": "demos/helicopter_game/PlayState.js"
}
}
},
"browserify": {
dist: {
files: {
"lib/pentagine_browserified.js": "lib/pentagine_babel.js",
"demos/helicopter_game/PlayState_browserified.js": "demos/helicopter_game/PlayState_babel.js"
}
}
}
});
grunt.registerTask("default", ["babel", "browserify"]);
};
However, on the new Game() call, I get the following error:
Uncaught TypeError: undefined is not a function
So what I did was analyse the code generated by Babel and Browserify, and I found this line in PlayState_browserified.js:
var Game = require("../../lib/pentagine_browserified.js").Game;
I decided to print the require output:
console.log(require("../../lib/pentagine_browserified.js"));
And it is nothing but an empty object. I decided to check out the pentagine_browserified.js file:
var Game = exports.Game = (function () {
It seems like it is correctly exporting the class, but for some reason it is not being picked up when required in the other file.
Also, I'm sure the file is being required properly, because changing the string "../../lib/pentagine_browserified.js" spits out a Not Found error, so it is resolving the right file; that I'm sure about.
Browserify is meant to be fed a single "entry point" file, through which it recursively traverses all of your require statements, importing the code from other modules. So you should be require'ing the _babel.js versions of modules, not _browserified.js ones.
From the looks of it, you intend for your app's "entry point" to be demos/helicopter_game/PlayState_browserified.js, yeah? If that's the case:
In PlayState.js, change it to import {Game} from "../../lib/pentagine_babel.js";.
In Gruntfile.js, remove "lib/pentagine_browserified.js": "lib/pentagine_babel.js".
Works for me. Let me know if that suffices or I am misunderstanding your requirements here.
P.S. You can use babelify to avoid having separate Grunt tasks for Babel and Browserify. See my answer here for an example.
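Roughly, the grunt-browserify task would then carry the transform itself; the preset name below is an assumption and depends on your Babel version:
"browserify": {
  dist: {
    options: {
      // run Babel as a Browserify transform instead of a separate Grunt task
      transform: [["babelify", { "presets": ["@babel/preset-env"] }]]
    },
    files: {
      "demos/helicopter_game/PlayState_browserified.js": "demos/helicopter_game/PlayState.js"
    }
  }
}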
I had a slightly different file configuration that gave me some difficulty getting the "require" syntax to work in Node, but this post gave me the hint on how to use the Babel-ified version of the file name.
I am using WebStorm with the FileWatcher option set to Babel, and I have the FileWatcher configured to watch all files with suffix .jsx, and rename the compiled output file from {my_file}.jsx to {my_file}-compiled.js.
So in my test case, I have 2 files:
Person.jsx:
class Person { ... }
export { Person as default}
and another file that wants to import it:
Test.jsx:
var Person = require('./Person-compiled.js');
I couldn't get the "require" statement to find the module until I started the file path with './' and also added '-compiled.js' to properly specify the file name, so that Node (ES5) could find the module.
I was also able to use the "import" syntax:
import Person from './Person-compiled.js';
Since I have set up my WebStorm project as a Node ES5 project, I have to run 'Test-compiled.js' (not 'Test.jsx').