How can I require an entry point in Webpack? - javascript

I'm trying to export these two pieces of code, cli.js and program.js, where cli depends on program and program has a bunch of other dependencies...
Webpack is doing a great job of bundling all the dependencies of program.js (./a, ./b, ./c ...) and correctly ignoring the ones that are externals, like 'jquery', 'bluebird' ...
However, when it comes to bundling cli.js, it is not referencing the program.dist.js entry point but bundling a copy of the entire program once again...
How could I fix this issue? Is it a limitation of webpack, or is there a way around it? I'm currently using webpack 2.1.0-beta.27.
This is my webpack.config.js:
const path = require('path');

module.exports = {
  entry: {
    cli: './bin/cli.js',
    program: './program.js',
  },
  target: 'node',
  output: {
    libraryTarget: 'umd',
    filename: '[name].dist.js',
    umdNamedDefine: true,
    path: path.resolve(__dirname, 'distribution'),
  },
  externals: [
    /^[a-z\-0-9]+$/
  ]
}
program.js
let a = require('./a'),
    b = require('./b'),
    c = require('./c');
bin/cli.js
const program = require('../program');
program.doSomething();
Just a side note...
I can't split it into chunks with CommonsChunkPlugin, because that would make my cli.dist.js unable to be executed by Node.js with node cli.dist.js
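One possible way around it (a sketch only, not verified against webpack 2.1.0-beta.27): when bundling cli.js, also declare the relative request for program.js as an external that maps to the already-built bundle, so cli.dist.js emits a plain require('./program.dist.js') instead of inlining a second copy. The request key has to match the exact string used in bin/cli.js ('../program').
// Sketch: keep the package-name regex and additionally map the relative
// request for program.js onto the built program.dist.js at runtime.
externals: [
  /^[a-z\-0-9]+$/,
  { '../program': 'commonjs ./program.dist.js' }
]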

Related

resolving all files from set public directories with webpack

I have built a custom HTML framework that has a pretty simple project structure. I really need to grab files from 3 separate directories: views, js, and components. I am very new to webpack, but I figured that with its configurability there would be a way for me to write all of these imports as something like /components/random_component_name.js and have webpack resolve them to their actual paths.
I have tried many different things; this is what I have most recently tested:
const path = require('path');

module.exports = {
  entry: path.resolve(__dirname + '/public/js/main.js'),
  module: {
    generator: {
      js: {
        // Generator options for asset modules
        // Customize publicPath for asset modules, available since webpack 5.28.0
        publicPath: '/js',
        // Emit the asset in the specified folder relative to 'output.path', available since webpack 5.67.0
        outputPath: path.resolve(__dirname + 'public/js'),
      },
    },
  },
}
How can I get this functionality out of webpack? Surely it shouldn't be too hard. I am new to bundlers like this, so sorry if this is horribly wrong.
I finally figured it out. I had tried a method close to this before but neglected the '/' in the alias key names, so now this works:
const path = require('path');

module.exports = {
  //...
  entry: {
    main: './src/js/main.js',
  },
  resolve: {
    alias: {
      '/components': path.resolve(__dirname, 'src/components/'),
      '/js': path.resolve(__dirname, 'src/js/'),
      '/views': path.resolve(__dirname, 'src/views/'),
    },
  }
};
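With those aliases in place, imports written against the "public" paths resolve to the matching files under src/. A minimal usage sketch (the file name is the placeholder from the question):
// '/components/random_component_name.js' now resolves to
// src/components/random_component_name.js when the bundle is built.
import randomComponent from '/components/random_component_name.js';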

Firebase cloud functions: choosing extra file to upload

For my firebase application I need some backend functions that would load i18n files and send them to the client. I am able to use webpack to bundle the files for deploying; that works fine. But what I would also like to do is change the content of the files (because phrases in the app may be added or deleted), which is not possible in this case. Is there a way to upload these files along with the scripts?
There is this section in the firebase documentation: https://firebase.google.com/docs/functions/handle-dependencies. However, I wouldn't really like to write "language_namespace": "file:locales/language/namespace.json" for each file I have.
I didn't find any easy workaround and I will probably use some other system for the i18n, because this one is overcomplicated and it will not be easy to work with. Anyway, here is the solution in case someone faces a similar problem:
I used webpack-cli instead of tsc to bundle my files. Here is what my folder structure looked like:
root
- functions
  - webpack.config.js
  - lib
  - src
    - index.ts // File with the function
    - ...
- public
  - locales // Locales folder I needed to be uploaded
    - en
      - ...
    - ru
      - ...
    - ...
  - ...
- ...
It seems like firebase will upload anything that it finds in the lib folder, so when bundling the functions, I just used copy-webpack-plugin to copy locales there. Here is what my webpack config looked like:
const path = require('path')
const nodeExternals = require('webpack-node-externals')
const CopyWebpackPlugin = require('copy-webpack-plugin')

module.exports = {
  entry: './src/index.ts',
  output: {
    path: path.resolve(__dirname, 'lib'),
    libraryTarget: 'this',
    filename: 'index.js',
  },
  mode: 'development',
  resolve: {
    extensions: ['.ts', '.tsx', '.js'],
  },
  target: 'node',
  externals: [nodeExternals()],
  module: {
    rules: [
      { test: /\.tsx?/, loader: 'ts-loader', options: { transpileOnly: true } },
    ]
  },
  plugins: [
    new CopyWebpackPlugin([{
      from: path.resolve(__dirname, '../public/locales'),
      to: './locales'
    }])
  ]
}
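Note that the array passed to CopyWebpackPlugin above matches older copy-webpack-plugin releases; newer major versions (v6 and later) expect an options object with a patterns array instead. A rough equivalent if you are on a newer version:
// copy-webpack-plugin v6+ takes { patterns: [...] } rather than a bare array.
new CopyWebpackPlugin({
  patterns: [
    { from: path.resolve(__dirname, '../public/locales'), to: './locales' }
  ]
})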
Then, to prevent webpack from rewriting a normal require into __webpack_require__, I had to use the __non_webpack_require__ function, which is emitted as a plain require in the output and let me read the copied files from lib/locales:
// In case you are using typescript and want to prevent the compiler
// from arguing that `__non_webpack_require__` is not defined
declare function __non_webpack_require__(module: string): any

export const getLocale = functions.https.onRequest((req, res) => {
  const { language, namespace } = req.query
  try {
    const translation = __non_webpack_require__(`./locales/${language}/${namespace}.json`)
    // do stuff
    res.send(translation)
  } catch (e) {
    if (e.code === 'MODULE_NOT_FOUND') {
      res.status(400).end(`Couldn't find translation for ${language}/${namespace}`)
    } else { /* ... */ }
  }
})
After doing this, both firebase emulators:start --only functions and firebase deploy --only functions:getLocale worked correctly.

How to avoid webpack bundle all files in one? [duplicate]

So right now I'm working with a prototype where we're using a combination of webpack (for building .tsx files and copying .html files) and webpack-dev-server for development serving. As you can assume, we are also using React and ReactDOM as library dependencies. Our current build output has the following structure:
dist
-favicon.ico
-index.html
-main.js
-main.js.map // for source-mapping between tsx / js files
This places ALL of the modules (including library dependencies) into one big bundled file. I want the end result to look like this:
dist
-favicon.ico
-index.html
-appName.js
-appName.min.js
-react.js
-react.min.js
-reactDOM.js
-reactDOM.min.js
I have references to each of the libraries in index.html and in import statements in the .tsx files. So my question is this...
How do I go from webpack producing this gigantic bundled .js file to individual .js files (libraries included, without having to specify each individually)? Bonus: I know how to do prod/dev environment flags, so how do I just minify those individual files (again, without bundling them)?
Current webpack.config:
var webpack = require("webpack"); // Assigning node package of webpack dependency to var for later utilization
var path = require("path"); // Assigning node package of path dependency to var for later utilization

module.exports = {
  entry: [
    "./wwwroot/app/appName.tsx", // Starting point of linking/compiling Typescript and dependencies, will need to add separate entry points in case of not deving SPA
    "./wwwroot/index.html", // Starting point of including HTML and dependencies, will need to add separate entry points in case of not deving SPA
    "./wwwroot/favicon.ico" // Input location for favicon
  ],
  output: {
    path: "./dist/", // Where we want to host files in local file directory structure
    publicPath: "/", // Where we want files to appear in hosting (eventual resolution to: https://localhost:4444/)
    filename: "appName.js" // What we want end compiled app JS file to be called
  },
  // Enable sourcemaps for debugging webpack's output.
  devtool: "source-map",
  devServer: {
    contentBase: './dist', // Copy and serve files from dist folder
    port: 4444, // Host on localhost port 4444
    // https: true, // Enable self-signed https/ssl cert debugging
    colors: true // Enable color-coding for debugging (VS Code does not currently emit colors, so none will be present there)
  },
  resolve: {
    // Add '.ts' and '.tsx' as resolvable extensions.
    extensions: [
      "",
      ".ico",
      ".js",
      ".ts",
      ".tsx",
      ".web.js",
      ".webpack.js"
    ]
  },
  module: {
    loaders: [
      // This loader copies the index.html file & favicon.ico to the output directory.
      {
        test: /\.(html|ico)$/,
        loader: 'file?name=[name].[ext]'
      },
      // All files with a '.ts' or '.tsx' extension will be handled by 'ts-loader'.
      {
        test: /\.tsx?$/,
        loaders: ["ts-loader"]
      }
    ],
    preLoaders: [
      // All output '.js' files will have any sourcemaps re-processed by 'source-map-loader'.
      {
        test: /\.js$/,
        loader: "source-map-loader"
      }
    ]
  },
  // When importing a module whose path matches one of the following, just
  // assume a corresponding global variable exists and use that instead.
  // This is important because it allows us to avoid bundling all of our
  // dependencies, which allows browsers to cache those libraries between builds.
  // externals: {
  //   "react": "React",
  //   "react-dom": "ReactDOM",
  //   "redux": "Redux"
  // }
};
Change the output setting to be name-driven, e.g.
entry: {
  dash: 'app/dash.ts',
  home: 'app/home.ts',
},
output: {
  path: './public',
  filename: 'build/[name].js',
  sourceMapFilename: 'build/[name].js.map'
},
To expand upon #basarat's answer, you can use the glob package from npm to build the "entry" config:
const glob = require("glob");

module.exports = [
  {
    target: "node",
    entry: glob.sync("./src/**/*.test.{ts,tsx}").reduce((acc, file) => {
      acc[file.replace(/^\.\/src\//, "")] = file;
      return acc;
    }, {}),
    output: {
      filename: "[name].js",
      chunkFilename: "[name]-[id].js",
      path: __dirname + "/dist"
    },
    //...
  }
];
This builds files with the same name as their source, replacing .ts and .tsx with .js.
OP's answer, copied out of the question:
I ended up finding a solution that fit my needs, although, again, in that webpack-y way, it requires some additional configuration. I would still like to make it a little more dynamic, but will perfect this at a later point in time. What I was looking for was the ability to "chunk" common modules; I had originally phrased it as one output filename per "entry" point provided to webpack. I didn't mind some files being combined where it made sense, but I wanted the overall files to be at a component level, given the project wasn't a SPA (single-page application).
The additional code ended up being:
plugins: [
  new webpack.optimize.CommonsChunkPlugin({ // This plugin is for extracting and creating "chunks" (extracted parts of the code that are common and aren't page specific)
    // One of these instances of plugins needs to be specified for EACH chunk file output desired
    filename: "common.js", // Filename for this particular set of chunks to be stored
    name: "common", // Entry point name given for where to pull all of the chunks
    minChunks: 3 // Minimum number of chunks to be created
  })
]
I also had to parameterize the entry points (see the example below) by variable name so that I could assign the react, react-dom, and redux modules to the common.js file.
entry: {
  main: "./wwwroot/app/appName.tsx", // Starting point of linking/compiling Typescript and dependencies, will need to add separate entry points in case of not deving SPA
  index: "./wwwroot/index.html", // Starting point of including HTML and dependencies, will need to add separate entry points in case of not deving SPA
  favicon: "./wwwroot/favicon.ico", // Input location for favicon
  common: [ "react", "react-dom", "redux" ] // All of the "chunks" to extract and place in common file for faster loading of common libraries between pages
},
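For reference, CommonsChunkPlugin was removed in webpack 4; on webpack 4+ the rough equivalent of the setup above is optimization.splitChunks. A hedged sketch (the cacheGroup name and test regex are illustrative):
// webpack 4+ equivalent idea: pull react, react-dom and redux out of the
// page bundles into a shared common.js chunk.
optimization: {
  splitChunks: {
    cacheGroups: {
      common: {
        test: /[\\/]node_modules[\\/](react|react-dom|redux)[\\/]/,
        name: 'common',
        chunks: 'all',
      },
    },
  },
},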

Requiring array of modules in Webpack

I am currently trying to load a variable array of modules with Webpack.
I've found this link: webpack can not require variable, the request of a dependency is an expression, but it doesn't seem to work for me.
In my project I've got a module inside my node_modules folder. The entry point of the module is called index.js.
Here's the basic structure:
| app
+---app.js
| js
+---gen.js
| node_modules
+---sample_module_1
| +---index.js
+-sample-module_2
+---index.js
| webpack.config.js
In the future I'd like to add new modules. Therefore I tried the following approach:
//app.js
var modules = [
  "sample_module_1",
  "sample_module_2"
]

for (var i = 0; i < modules.length; i++) {
  require(modules[i] + "/index.js");
}
But Webpack doesn't seem to find the module. I've tried adding a resolveLoader to the webpack.config.js file:
//webpack.config.js
var path = require('path');

module.exports = {
  entry: './app/app.js',
  output: {
    filename: 'gen.js',
    path: path.resolve(__dirname, 'js')
  },
  resolveLoader: {
    modules: ["node_modules"],
    extensions: ['*', '.js']
  }
};
Still, Webpack is not able to find the module.
I've also tried the suggestions on https://webpack.github.io/docs/context.html but still no results.
Any suggestions?
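One approach worth sketching, based on the context documentation linked above: when the request contains a variable, webpack needs a context it can resolve at build time, and require.context lets you declare that context explicitly. A rough sketch (the directory and filter regex are illustrative and assume each module sits directly under node_modules with an index.js):
// Build a context limited to the sample modules' index.js files, then
// require each module through the context by its relative key.
var ctx = require.context('../node_modules', true, /^\.\/sample_module_\d+\/index\.js$/);

var modules = [
  "sample_module_1",
  "sample_module_2"
];

modules.forEach(function (name) {
  ctx('./' + name + '/index.js');
});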

How to get node package consumer directory from node_modules?

I am trying to create a simple node module that creates a set of folders in the app that consumes it. I exported a simple createLayout function that creates the folders. I pushed my changes to git and did an npm i from another folder. Let's call the modules creator and consumer for the sake of explanation. When I try to call createLayout in consumer I am running into several issues. I am working on the E:\ drive.
Below is the index.js in creator:
import {sync} from 'mkdirp';

export function createLayout(config) {
  sync('./folder1');
}
And index.js in consumer:
var createLayout = require('creator').createLayout;
createLayout();
// with config createLayout({path: __dirname})
This results in creating a folder in E:\, not relative to consumer. So I tried including __dirname:
sync(__dirname + '/folder1');
Once again, this also creates a folder in E:\, not relative to consumer. I searched a bit through various modules to see how they handle this when reading a config file; for instance, webpack uses process.cwd. So I tried that too.
sync(process.cwd() + '/folder1');
Same result: it creates a folder in E:\, not relative to consumer. Then I tried to pass the __dirname or cwd through a config object.
// get __dirname from the `consumer` in config.path
sync(config.path + '/folder1');
But it ends up with the following error:
Error: EPERM: operation not permitted, mkdir 'E:\'
I tried logging all the values in both creator and consumer:
console.log(__dirname, process.cwd(), config.path)
// creator: / / E:\projects\consumer
// consumer: E:\projects\consumer E:\projects\consumer E:\projects\consumer
I am using webpack with babel to pack the creator, and plain JS in the consumer. I do not know what I am doing wrong. I am pretty new to the Node.js way of working.
Update
I am noticing that this is occurring only when I use webpack to build the creator. A simple module.exports works normally as anyone would expect. So I am including my webpack config file:
module.exports = {
  entry: [
    './index.js'
  ],
  output: {
    filename: 'creator.js',
    path: __dirname + '/dist',
    library: 'creator',
    libraryTarget: 'umd'
  },
  module: {
    loaders: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        loader: 'babel'
      }
    ]
  },
  externals: {
    fs: 'fs'
  }
};
The correct solution is adding this line to the config:
target: 'node'
This will make webpack ignore modules like fs, mkdirp and some others.
Now there is no longer any need to specify externals.
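Put together, the creator's webpack config with that fix applied would look roughly like this (a sketch of the same config shown above, with only the target line added and the externals entry dropped):
module.exports = {
  entry: [
    './index.js'
  ],
  // Build for Node.js so built-ins such as fs are not mocked or bundled.
  target: 'node',
  output: {
    filename: 'creator.js',
    path: __dirname + '/dist',
    library: 'creator',
    libraryTarget: 'umd'
  },
  module: {
    loaders: [
      { test: /\.js$/, exclude: /node_modules/, loader: 'babel' }
    ]
  }
};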
Incorrect solution given before:
Just add mkdirp to externals and it will resolve your problem:
externals: {
  fs: 'fs',
  mkdirp: 'mkdirp'
}
