Webpack config for publishing a Vue component into an npm package - javascript

Here's my main.js code for the component:
import './sass/main.scss'
import Vlider from './Vlider.vue'

function install(Vue) {
  if (install.installed) return;
  install.installed = true;
  Vue.component('vlider', Vlider);
}

const plugin = {
  install,
};

let GlobalVue = null;
if (typeof window !== 'undefined') {
  GlobalVue = window.Vue;
} else if (typeof global !== 'undefined') {
  GlobalVue = global.Vue;
}
if (GlobalVue) {
  GlobalVue.use(plugin);
}

Vlider.install = install;

export default Vlider
Can anyone help me with the webpack config? I need to output four files from this index.js:
dist/vlider.umd.js
dist/vlider.esm.js
dist/vlider.min.js
vlider.css
so that it can support multiple entry points in package.json
"main": "dist/vlider.umd.js",
"module": "dist/vlider.esm.js",
"unpkg": "dist/vlider.min.js",
"browser": "src/vlider.vue"
This is my first time dealing with webpack, so your help will be greatly appreciated.

Webpack does not support ES modules as an output target; you can only emit CommonJS or UMD/IIFE bundles.
If you need an ES module build, consider using Rollup.js. As a general rule, use Rollup when you need to bundle a library and webpack for application bundling. (Note: Rollup is great, but the combination of TypeScript + .vue files + class-based Vue component syntax did not work for me.)
Also, regardless of whether you use webpack or Rollup, you can export multiple build targets from a single config. Refer to the following links for more details:
Webpack: https://webpack.js.org/concepts/targets/#multiple-targets
Rollup: https://rollupjs.org/guide/en#configuration-files
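For example, a rough Rollup sketch for the four requested outputs could look like the following. The plugin choices (rollup-plugin-vue, rollup-plugin-css-only, rollup-plugin-terser) and their options are assumptions on my part, so check each plugin's docs; the point is the array export with one config per target:
// rollup.config.js – sketch only, not a verified config
import vue from 'rollup-plugin-vue'
import css from 'rollup-plugin-css-only'
import { terser } from 'rollup-plugin-terser'

// Shared settings. The .scss import in src/index.js would additionally need a
// sass-aware plugin (e.g. rollup-plugin-scss) – that part is an assumption.
const base = {
  input: 'src/index.js',
  external: ['vue'],
  plugins: [
    css({ output: 'vlider.css' }), // extract styles to a separate file
    vue({ css: false }),           // option name is an assumption; check the plugin docs
  ],
}

export default [
  { ...base, output: { file: 'dist/vlider.esm.js', format: 'esm' } },
  { ...base, output: { file: 'dist/vlider.umd.js', format: 'umd', name: 'Vlider', globals: { vue: 'Vue' } } },
  {
    ...base,
    plugins: [...base.plugins, terser()],
    output: { file: 'dist/vlider.min.js', format: 'iife', name: 'Vlider', globals: { vue: 'Vue' } },
  },
]
Running rollup -c with this array export emits all three JS bundles plus the extracted CSS in one go.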

Related

Bundle multiple named AMD modules with dependencies into one JS file (building a web app extension system)

I'm working on an extension system for my web app. Third-party developers should be able to extend the app by providing named AMD modules exporting constants and functions following a predefined spec and bundled into a single .js JavaScript file.
Example JavaScript bundle:
define('module1', ['exports', 'module3'], (function (exports, module3) {
  exports.spec = 'http://example.com/spec/extension/v1'
  exports.onRequest = function (request) { return module3.respond('Hello, World.') }
}));
define('module2', ['exports', 'module3'], (function (exports, module3) {
  exports.spec = 'http://example.com/spec/extension/v1'
  exports.onRequest = function (request) { return module3.respond('Foo. Bar.') }
}));
define('module3', ['exports'], (function (exports) {
  exports.respond = function (message) { return { type: 'message', message: message } }
}));
In the above example module1 and module2 are extension modules (identified by the spec export) and module3 is a shared dependency (e.g. coming from an NPM package). Extension bundles will be loaded in a worker within a sandboxed iframe to seal of the untrusted code in the browser.
Example TypeScript source:
// module1.ts
import { respond } from 'module3'
export const spec = 'http://example.com/spec/extension/v1'
export const onRequest = (request: Request): Response => respond('Hello, World.')

// module2.ts
import { respond } from 'module3'
export const spec = 'http://example.com/spec/extension/v1'
export const onRequest = (request: Request): Response => respond('Foo. Bar.')

// module3.ts
import dep from 'some-npm-package'
export const respond = (message: string) => dep.createMessageObject(message)
Here is my list of requirements for bundling:
All necessary dependencies (e.g. shared module, NPM package logic) must be included in the bundle
The source code needs to be transpiled to browser compatible code if necessary
The AMD format is required by the custom extension loader implementation
The AMD modules must not be anonymous as the module file names are lost while bundling
Dependencies must not be referenced by relative paths (e.g. module3 instead of ./path/to/module3)
The result should be one JavaScript bundle, thus ONE JavaScript file and ONE source map file
What's the easiest way to do this?
This is the closest solution I found using rollup and the following rollup.config.js:
import { nodeResolve } from '@rollup/plugin-node-resolve'
import { terser } from 'rollup-plugin-terser'
import typescript from '@rollup/plugin-typescript'
export default {
  input: [
    'src/module1.ts',
    'src/module2.ts'
  ],
  output: {
    dir: 'dist',
    format: 'amd',
    sourcemap: true,
    amd: {
      autoId: true
    }
  },
  plugins: [
    typescript(),
    nodeResolve(),
    terser()
  ]
}
From this I get the desired named AMD modules (one for each entry point and chunk) in separate .js files. Problems:
Some dependencies are referenced by ./module3 while being named module3.
The modules appear in separate JavaScript and Sourcemap files instead of being concatenated into a single bundle.
Questions:
Is there an easy fix to the above rollup.config.js config to solve the problem?
I tried to write a small rollup plugin but I failed to get the final AMD module code within it to concatenate it to a bundle. Only the transpiled code is available to me. In addition I don't know how to handle sourcemaps during concatenation.
Is there an alternative to rollup better suited to this bundling scenario?
The big picture: Am I completely on the wrong track when it comes to building an extension system? Is AMD the wrong choice?
I found a way to extend the rollup.config.js mentioned in the question with a custom concatChunks rollup plugin to bundle multiple AMD chunks within a single file and having the source maps rendered, too. The only issue I didn't find an answer to was the relative module names that kept popping up. However, this may be resolved in the AMD loader.
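For example, with RequireJS the leftover relative ids could presumably be aliased back onto the named modules via a map config; this is an untested sketch, not something I verified:
// loader-side workaround sketch: alias the relative id the bundle emits onto the named module
requirejs.config({
  map: {
    '*': { './module3': 'module3' }
  }
})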
Here's the full rollup.config.js that worked for me:
import Concat from 'concat-with-sourcemaps'
import glob from 'glob'
import typescript from '@rollup/plugin-typescript'
import { nodeResolve } from '@rollup/plugin-node-resolve'
import { terser } from 'rollup-plugin-terser'
const concatChunks = (
  fileName = 'bundle.js',
  sourceMapFileName = 'bundle.js.map'
) => {
  return {
    name: 'rollup-plugin-concat-chunks',
    generateBundle: function (options, bundle, isWrite) {
      const concat = new Concat(true, fileName, '\n')
      // Go through each chunk in the bundle
      let hasSourceMaps = false
      Object.keys(bundle).forEach(fileId => {
        const fileInfo = bundle[fileId]
        if (fileInfo.type === 'chunk') {
          let hasSourceMap = fileInfo.map !== null
          hasSourceMaps = hasSourceMaps || hasSourceMap
          // Concat file content and source maps with bundle
          concat.add(
            fileInfo.fileName,
            fileInfo.code,
            hasSourceMap ? JSON.stringify(fileInfo.map) : null
          )
          // Prevent single chunks from being emitted
          delete bundle[fileId]
        }
      })
      // Emit concatenated chunks
      this.emitFile({
        type: 'asset',
        name: fileName,
        fileName: fileName,
        source: concat.content
      })
      // Emit concatenated source maps, if any
      if (hasSourceMaps) {
        this.emitFile({
          type: 'asset',
          name: sourceMapFileName,
          fileName: sourceMapFileName,
          source: concat.sourceMap
        })
      }
    }
  }
}

export default {
  input: glob.sync('./src/*.{ts,js}'),
  output: {
    dir: 'dist',
    format: 'amd',
    sourcemap: true,
    amd: {
      autoId: true
    }
  },
  plugins: [
    typescript(),
    nodeResolve(),
    terser(),
    concatChunks()
  ]
}
Please make sure you npm install the dependencies referenced in the import statements to make this work.
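For reference, that would be something along the lines of the command below (package names taken from the imports above; typescript and tslib are assumed to be required by the TypeScript plugin):
npm install --save-dev rollup concat-with-sourcemaps glob @rollup/plugin-typescript @rollup/plugin-node-resolve rollup-plugin-terser typescript tslib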
Considering the big picture, i.e. the extension system itself, I am moving away from a "one AMD module equals one extension/contribution" approach, as current developer tools and JavaScript bundlers are not ready for that (as this question shows). I'll go with an approach similar to the Visual Studio Code Extension API and will use a single "default" module with an activate export to register contributions a bundle has to offer. I hope that this will make extension bundling an easy task no matter what tools or languages are being used.
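As a rough illustration of that direction (the activate/registerRequestHandler names below are invented for the example, they are not an existing API):
// extension.ts – hypothetical single entry module per extension bundle
export const spec = 'http://example.com/spec/extension/v1'

export function activate(host: any) {
  // "host" and registerRequestHandler are made-up placeholders for the app's extension API
  host.registerRequestHandler((request: Request) => ({ type: 'message', message: 'Hello, World.' }))
}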

How to test React Native Module?

I developed a React Native module (wrapping an SDK) and I'm interested in creating some unit tests using mocha. I'm not very familiar with mocha, and I can't quite figure out how to proceed.
I have my react native module, call it react-native-mymodule which I can use in an app by doing:
npm install react-native-mymodule
react-native link react-native-mymodule
Then I can import my module with:
import MySDK from "react-native-mymodule";
I’m trying to do a similar thing with unit tests. In my root directory I have a test/ directory which is where I want to hold all my unit tests.
My simple test file, test/sdk.tests.js:
import MySDK from "react-native-mymodule";
var assert = require('assert');

describe('MySDK', function() {
  describe('#indexOf()', function() {
    it('should return -1 when the value is not present', function() {
      assert.equal([1, 2, 3].indexOf(4), -1);
    });
  });
});
I’ve tried modifying a tutorial I found online on compiling modules, but haven’t had any luck. This is a file test/setup.js:
import fs from 'fs';
import path from 'path';
import register from 'babel-core/register';

const modulesToCompile = [
  'react-native-mymodule'
].map((moduleName) => new RegExp(`${moduleName}`));

const rcPath = path.join(__dirname, '..', '.babelrc');
const source = fs.readFileSync(rcPath).toString();
const config = JSON.parse(source);

config.ignore = function(filename) {
  if (!(/\/node_modules\//).test(filename)) {
    return false;
  } else {
    return false;
  }
}

register(config);
My .babelrc at the root level of my module:
{
  "presets": ["flow", "react-native"],
  "plugins": [
    ["module-resolver", {
      "root": [ "./js/" ]
    }]
  ]
}
I have a test/mocha.opts file:
--require babel-core/register
--require test/setup.js
I’m invoking mocha with: ./node_modules/mocha/bin/mocha and I get an error:
Error: Cannot find module 'react-native-mymodule'
Can anyone advise me on the best way to test react native modules?
If you want to test native modules, I suggest the following:
1. E2E Tests
Node.js standalone cannot interpret native modules. If you want to test native modules in the context of your app, you should create e2e tests using Appium/WebdriverIO instead of writing unit tests with mocha.
With this, you actually start an emulator with your app installed.
Resources:
http://appium.io/docs/en/about-appium/intro/?lang=de
https://medium.com/jetclosing-engineering/react-native-device-testing-w-appium-node-and-aws-device-farm-295081129790
https://medium.com/swlh/automation-testing-using-react-native-and-appium-on-ubuntu-ddfddc0c29fe
https://webdriver.io/docs/api/appium.html
2. Unit Tests
If you want to write unit tests for your native module, write them in the language the native module is written in.
Resources:
https://www.swiftbysundell.com/basics/unit-testing/
https://junit.org/junit5/
https://developer.apple.com/documentation/xctest
https://cmake.org/cmake/help/latest/manual/ctest.1.html
Other than that, you have to mock the modules.
https://jestjs.io/docs/en/mock-functions
https://sinonjs.org/releases/latest/mocks/
https://www.npmjs.com/package/mock-require
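For the mocking route, a minimal Jest sketch could look like this (MyModule and doSomething are placeholder names, not taken from the question):
// __tests__/mymodule.test.js – mock the native side so the JS wrapper can be unit tested
import { NativeModules } from 'react-native';

jest.mock('react-native', () => ({
  NativeModules: {
    MyModule: {
      doSomething: jest.fn(() => Promise.resolve('ok')),
    },
  },
}));

test('doSomething resolves with the mocked value', async () => {
  await expect(NativeModules.MyModule.doSomething()).resolves.toBe('ok');
});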

How can I get PurifyCSSPlugin to remove my unused css in Angular6?

I'm trying to remove a lot of unused css within my sass files in my Angular 6 project.
I've learned that there is a webpack plugin called PurifyCSS.
Currently I'm unable to eject the webpack config in my Angular project, so I'm using ngw to add the plugins (to Angular's webpack config) needed to strip the unused CSS from my sass files.
ngw.config.ts
import * as webpack from 'webpack';
import { Path } from '@angular-devkit/core';
import { NormalizedBrowserBuilderSchema } from '@angular-devkit/build-angular';
import * as PurifyCSSPlugin from 'purifycss-webpack';
import * as path from 'path';
import * as glob from 'glob';

export type WebpackOptions<T = NormalizedBrowserBuilderSchema> = {
  root: Path,
  projectRoot: Path,
  options: T;
};

const command = process.argv[2].toLowerCase();

export default function (config: webpack.Configuration, options: WebpackOptions) {
  if (command === 'test') {
    console.log('Test configuration is running');
  }
  console.log('To modify webpack build, you can use ngw.config.ts');
  console.log('check path:', glob.sync(path.join(__dirname, 'src/**/*.html')));
  config.plugins.push(
    new PurifyCSSPlugin({
      // This was suggested to help it actually remove the css from: https://github.com/webpack-contrib/purifycss-webpack/issues/54
      // Although there is an error of: Error: data['resolveExtensions'] should NOT have additional properties
      resolveExtensions: ['.html', '.js'],
      // This causes the build to run but does not remove the unused css
      paths: glob.sync(path.join(__dirname, 'src/**/*.html'))
    }),
  );
  return config;
}
Using the paths property alone doesn't work, and adding resolveExtensions was suggested in the GitHub issue linked in the code above.
Although this leads to the error below when doing a ngw prod build:
Error: data['resolveExtensions'] should NOT have additional properties
How can I configure the PurifyCSSPlugin to remove unused css within sass files in an Angular 6 CLI environment?
Note: I don't have much experience with webpack, so I'm not sure whether this config only works with css files rather than scss files. (If so, please correct me.)
Thanks

How to trick Node.js to load .js files as ES6 modules?

Node.js 10 added experimental support for loading ES6 modules, which already work in browsers. That would mean we could finally use exactly the same files for Node.js and browsers without any transpiling or polyfills.
Except we can't. Node.js requires the .mjs extension for files to be loaded as modules. I tried tricking Node by using a symlink, but Node saw through it:
D:\web\lines>node --experimental-modules ES6test.mjs
(node:7464) ExperimentalWarning: The ESM module loader is experimental.
D:\web\lines\ES6test.js:6
import myLibrary from "./MyFile.mjs";
^^^^^^^^^^^^^^^
I can't think of any other workaround to make this work - which really renders the whole ES6 module support useless.
Can anybody else think of some trick to make Node.js ignore the extension?
You can now import a .js file in Node v12.x in two steps:
Add the following line in your package.json file:
// package.json
{
"type": "module"
}
Add the --experimental-modules flag before the script:
node --experimental-modules index.js
Reference: https://nodejs.org/api/esm.html
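For example, with "type": "module" set, a plain .js file accepts import/export syntax directly (trivial sketch):
// index.js – loads as an ES module because of "type": "module"
import { readFileSync } from 'fs';

const pkg = JSON.parse(readFileSync('./package.json', 'utf8'));
console.log(pkg.type); // "module"
On Node 12 this still needs the --experimental-modules flag shown above; later Node versions load it without the flag.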
Node.js requires all ES modules to have the .mjs extension. Since Node.js support for ES modules is experimental, this is subject to change; a proposal and an open pull request are expected to address the problem with a package.json esm flag and a --mode option.
Currently this can be solved with a custom ES module loader that hooks into the default module resolver and changes the module type for specific modules:
custom-loader.mjs
import path from 'path';

const ESM_WITH_JS_EXT = './MyFile.js'; // relative to loader path
const ESM_WITH_JS_EXT_URL = new URL(path.dirname(import.meta.url) + `/${ESM_WITH_JS_EXT}`).href;

export function resolve(specifier, parentModuleURL, defaultResolver) {
  const resolvedModule = defaultResolver(specifier, parentModuleURL);
  if (resolvedModule.url === ESM_WITH_JS_EXT_URL)
    resolvedModule.format = 'esm';
  return resolvedModule;
}
It is used as:
node --experimental-modules --loader ./custom-loader.mjs ./index.mjs
Since there are fundamental differences in how ES and CommonJS modules are evaluated, the changes should be limited to modules that need them.
I solved exactly this problem with the fabulous esm package. You can enable dynamic (smart) esm module loading package wide, or per run with a flag like this:
node -r esm your/es6/module.js
It also has options to treat every file as an ES6 module, or only those ending in '.mjs'. There are other packages out there, but this one just worked.
Import and export modules using ES6 that work with Node.js
Name files with .mjs extension instead of .js
Create files
touch main.mjs lib.mjs
main.mjs
import { add } from './lib.mjs';
console.log(add(40, 2));
lib.mjs
export let add = (x, y) => {
  return x + y
}
Run
node --experimental-modules main.mjs
Here is a module that does what you need, esmjs.mjs:
import { readFileSync } from 'fs'
import { fileURLToPath, pathToFileURL } from 'url'
import { dirname, join } from 'path'

export const jsmodule = (test_url_or_path, module_path) => {
  const __filename = test_url_or_path.toLowerCase().startsWith('file:')
    ? fileURLToPath(test_url_or_path)
    : test_url_or_path
  const __dirname = dirname(__filename)
  const abs_path = join(__dirname, module_path)
  const file_url = pathToFileURL(abs_path)
  const file_buf = readFileSync(file_url)
  const b64 = file_buf.toString('base64')
  const moduleData = "data:text/javascript;base64," + b64
  return import(moduleData)
}
Usage from .mjs module:
const { hey } = await jsmodule(import.meta.url, '../../test-data/mjs.js')
Usage, from .js file:
const { hey } = await jsmodule(__filename, '../../test-data/mjs.js')
Reference & tests on Github
You can do it in this way:
Create a file module2.mjs (the name is up to you)
'use strict';
export function foo() {
  return 'foo';
}
Create index.mjs file:
'use strict';
import { foo } from './module2.mjs';
console.log(foo());
Using node 8.11.1 (or 10) you can run it as (command and output provided):
node --experimental-modules index.mjs
(node:60428) ExperimentalWarning: The ESM module loader is experimental.
foo

How can I mock Webpack's require.context in Jest?

Suppose I have the following module:
var modulesReq = require.context('.', false, /\.js$/);
modulesReq.keys().forEach(function(module) {
  modulesReq(module);
});
Jest complains because it doesn't know about require.context:
FAIL /foo/bar.spec.js (0s)
● Runtime Error
- TypeError: require.context is not a function
How can I mock it? I tried using the setupTestFrameworkScriptFile Jest configuration, but the tests can't see any changes that I've made to require.
I had the same problem, and then I made a 'solution'.
I'm pretty sure this is not the best choice. I ended up abandoning it, for the reasons discussed here:
https://github.com/facebookincubator/create-react-app/issues/517
https://github.com/facebook/jest/issues/2298
But if you really need it, you should include the polyfill below in every file that calls it (not in the test file itself, because require will not be globally overridden in a Node environment).
// This condition should actually detect whether we're in a Node environment
if (typeof require.context === 'undefined') {
  const fs = require('fs');
  const path = require('path');

  require.context = (base = '.', scanSubDirectories = false, regularExpression = /\.js$/) => {
    const files = {};

    function readDirectory(directory) {
      fs.readdirSync(directory).forEach((file) => {
        const fullPath = path.resolve(directory, file);

        if (fs.statSync(fullPath).isDirectory()) {
          if (scanSubDirectories) readDirectory(fullPath);
          return;
        }

        if (!regularExpression.test(fullPath)) return;

        files[fullPath] = true;
      });
    }

    readDirectory(path.resolve(__dirname, base));

    function Module(file) {
      return require(file);
    }

    Module.keys = () => Object.keys(files);

    return Module;
  };
}
With this function you don't need to change any require.context call; it behaves the same either way (under webpack it just uses the original implementation, and inside a Jest run it uses the polyfill).
After spending some hours trying each of the answers above, I would like to contribute.
Adding the babel-plugin-transform-require-context plugin to .babelrc for the test env fixed all the issues.
Install babel-plugin-transform-require-context (https://www.npmjs.com/package/babel-plugin-transform-require-context, available with yarn too).
Now add the plugin to .babelrc:
{
  "env": {
    "test": {
      "plugins": ["transform-require-context"]
    }
  }
}
It simply transforms require.context calls for the test env into dummy function calls so that the code can run safely.
If you are using Babel, look at babel-plugin-require-context-hook. Configuration instructions for Storybook are available at Storyshots | Configure Jest to work with Webpack's require.context(), but they are not Storyshots/Storybook specific.
To summarise:
Install the plugin.
yarn add babel-plugin-require-context-hook --dev
Create a file .jest/register-context.js with the following contents:
import registerRequireContextHook from 'babel-plugin-require-context-hook/register';
registerRequireContextHook();
Configure Jest (the file depends on where you are storing your Jest configuration, e.g. package.json):
setupFiles: ['<rootDir>/.jest/register-context.js']
Add the plugin to .babelrc
{
  "presets": ["..."],
  "plugins": ["..."],
  "env": {
    "test": {
      "plugins": ["require-context-hook"]
    }
  }
}
Alternatively, add it to babel.config.js:
module.exports = function(api) {
  api.cache(true)
  const presets = [...]
  const plugins = [...]
  if (process.env.NODE_ENV === "test") {
    plugins.push("require-context-hook")
  }
  return {
    presets,
    plugins
  }
}
It may be worth noting that using babel.config.js rather than .babelrc may cause issues. For example, I found that when I defined the require-context-hook plugin in babel.config.js:
Jest 22 didn't pick it up;
Jest 23 picked it up; but
jest --coverage didn't pick it up (perhaps Istanbul isn't up to speed with Babel 7?).
In all cases, a .babelrc configuration was fine.
Remarks on Edmundo Rodrigues's answer
This babel-plugin-require-context-hook plugin uses code that is similar to Edmundo Rodrigues's answer here. Props to Edmundo! Because the plugin is implemented as a Babel plugin, it avoids static analysis issues. For example, with Edmundo's solution, webpack warns:
Critical dependency: require function is used in a way in which dependencies cannot be statically extracted
Despite the warnings, Edmundo's solution is the most robust because it doesn't depend on Babel.
Extract the call to a separate module:
// src/js/lib/bundle-loader.js
/* istanbul ignore next */
module.exports = require.context('bundle-loader?lazy!../components/', false, /.*\.vue$/)
Use the new module in the module where you extracted it from:
// src/js/lib/loader.js
const loadModule = require('lib/bundle-loader')
Create a mock for the newly created bundle-loader module:
// test/unit/specs/__mocks__/lib/bundle-loader.js
export default () => () => 'foobar'
Use the mock in your test:
// test/unit/specs/lib/loader.spec.js
jest.mock('lib/bundle-loader')
import Loader from 'lib/loader'
describe('lib/loader', () => {
  describe('Loader', () => {
    it('should load', () => {
      const loader = new Loader('[data-module]')
      expect(loader).toBeInstanceOf(Loader)
    })
  })
})
Alrighty! I had major issues with this and managed to come to a solution that worked for me by using a combination of other answers and the Docs. (Took me a good day though)
For anyone else who is struggling:
Create a file called bundle-loader.js and add something like:
module.exports = {
  importFiles: () => {
    const r = require.context(<your_path_to_your_files>)
    <your_processing>
    return <your_processed_files>
  }
}
In your code import like:
import bundleLoader from '<your_relative_Path>/bundle-loader'
Use like
let <your_var_name> = bundleLoader.importFiles()
In your test file right underneath other imports:
jest.mock('../../utils/bundle-loader', () => ({
  importFiles: () => {
    return <this_will_be_what_you_receive_in_the_test_from_import_files>
  }
}))
Installing the babel-plugin-transform-require-context package and adding the plugin to .babelrc resolved the issue for me.
Refer to the documentation here:
https://www.npmjs.com/package/babel-plugin-transform-require-context
The easiest and fastest way to solve this problem is to install require-context.macro:
npm install --save-dev require-context.macro
then just replace:
var modulesReq = require.context('.', false, /\.js$/);
with:
var modulesReq = requireContext('.', false, /\.js$/);
That's it, you should be good to go!
Cheers and good luck!
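Putting it together (assuming the macro's default export is named requireContext, as in its README):
import requireContext from 'require-context.macro';

var modulesReq = requireContext('.', false, /\.js$/);
modulesReq.keys().forEach(function(module) {
  modulesReq(module);
});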
Implementation problems not mentioned:
1. Jest prevents out-of-scope variables in mocks, like __dirname.
2. Create React App limits Babel and Jest customization. You need to use src/setupTests.js, which is run before every test.
3. fs is not supported in the browser. You will need something like BrowserFS. Now your app has file system support, just for dev.
4. Potential race condition: you export after this import, and one of your require.context imports includes that export. I'm sure require takes care of this, but now we are adding a lot of fs work on top of it.
5. Type checking.
Either #4 or #5 created undefined errors. Type out the imports, no more errors. No more concerns about what can or can't be imported and where.
Motivation for all this? Extensibility. Keeping future modifications limited to one new file. Publishing separate modules is a better approach.
If there were an easier way to import, Node would do it. Also, this smacks of premature optimization. You end up scrapping everything anyway because you're now using an industry-leading platform or utility.
If you're using Jest with test-utils in Vue:
Install these packages:
@vue/cli-plugin-babel
and
babel-plugin-transform-require-context
Then define babel.config.js at the root of the project with this configuration:
module.exports = function(api) {
  api.cache(true);
  const presets = [
    '@vue/cli-plugin-babel/preset'
  ];
  const plugins = [];
  if (process.env.NODE_ENV === 'test') {
    plugins.push('transform-require-context');
  }
  return {
    presets,
    plugins
  };
};
This checks whether the current process was started by Jest (test env) and, if so, mocks all the require.context calls.
I faced the same issue with an ejected create-react-app project, and none of the answers above helped me.
My solution was to put the following into config/babelTransform.js:
module.exports = babelJest.createTransformer({
  presets: [
    [
      require.resolve('babel-preset-react-app'),
      {
        runtime: hasJsxRuntime ? 'automatic' : 'classic',
      },
    ],
  ],
  plugins: ["transform-require-context"],
  babelrc: false,
  configFile: false,
});
Simplest solution for this: just do
var modulesReq = require.context && require.context('.', false, /\.js$/);
if (modulesReq) {
  modulesReq.keys().forEach(function(module) {
    modulesReq(module);
  });
}
Here I have added an extra check so the code only executes when require.context is defined; with this guard, Jest will no longer complain.
