Webpack - exporting directory and importing it via path - javascript

So I was trying to export a directory with webpack and came up with this script, which recursively imports all the files I need:
// lib1
const req = require.context('./', true, /^(?!.*test\.js)^(?!.*Spec\.js)^(?!.*__mocks__.*\.js)((.*\.(js\.*))[^.]*$)/)
const modules = {}
req.keys().forEach(key => {
  modules[key] = req(key)
})
export default modules
That works fine: all the modules are available to me in subsequent webpack bundles, but it's not exactly what I wanted.
What I really need is a way to export those modules so that I can import them by path, instead of grabbing this module object and accessing services through it.
Notes:
Apps have ^lib as an external module.
webpack#4.1.1
// ex1 what I need:
import theme from 'lib1/services/theme'
// ex2 what I have:
import lib1 from 'lib1'
const theme = lib1.services.theme
Is there a way to achieve this?

Related

How can I import & export modules in Node.js where I have the module names and directories as strings?

My folder structure looks like this:
modules
  module-and
    index.js
  module-not
    index.js
  module-or
    index.js
  module-xor
    index.js
  moduleBundler.js
The file I'm working in, moduleBundler.js, imports the modules from each module folder / file, then exports them all as one bundle:
import ModuleAnd from "./module-and";
import ModuleNot from "./module-not";
import ModuleOr from "./module-or";
import ModuleXor from "./module-xor";
export { ModuleAnd, ModuleNot, ModuleOr, ModuleXor };
How can I make this code automatically import and export each of these modules, without needing to hardcode their names and directories within moduleBundler.js?
I'm able to get the names and directories of each of the modules with this code:
const moduleDirectories = getDirectories(__dirname);
const moduleNames = moduleDirectories.map(x => x.slice(0, 1).toUpperCase() + camelise(x).slice(1));
console.log(moduleDirectories);
>>> [ 'module-and', 'module-not', 'module-or', 'module-xor' ]
console.log(moduleNames);
>>> [ 'ModuleAnd', 'ModuleNot', 'ModuleOr', 'ModuleXor' ]
But there doesn't seem to be an obvious way of importing or exporting modules using these values.
I tried looping over each folder and importing them like this:
for (const i in moduleNames) {
  import moduleNames[i] from ("./" + moduleDirectories[i]);
}
>>> ReferenceError: from is not defined
I also tried using eval(), knowing its security risks, just to see if it would work:
for (const [key, value] of Object.entries(moduleNames)) {
  const directory = "./" + moduleDirectories[parseInt(key)];
  eval(`import ${value} from "${directory}"`);
}
>>> SyntaxError: Cannot use import statement outside a module
I know for eval() I could maybe get it working by adding "type": "module" to my package.json, but I'd rather avoid doing that if possible. I'd rather avoid eval() entirely, too, if possible.
Also, once I have got them imported, how can I then export them as a bundle?
Solved it! Turns out I needed to use require instead of import, d'oh!
Using the arrays moduleDirectories and moduleNames which I created earlier:
for (const i in moduleDirectories) {
  const directory = "./" + moduleDirectories[i];
  module.exports[moduleNames[i]] = require(directory).default;
}

Webpack not allowing es2020 import from variable in next.js

I'm developing a plugin feature in my next.js react app in which I have to dynamically import components from a bunch of given modules.
component.tsx :
const MyComponent = (props) => {
  useEffect(() => {
    const pluginNames = ['test-component'];
    pluginNames.forEach(async (name) => {
      try {
        const plugin = await import(name);
      } catch (err) {
        // I don't want an invalid plugin to crash my app
        console.warn(err);
      }
    });
  }, []);
  // returns any HTML template
}
but when I run my code, I get an error that seems to clearly indicate the plugin is not found, even though it is installed.
From what I understood, this happens because webpack doesn't pack the dynamically imported plugins into my bundle. Is there a way to tell webpack to include the specified modules (considering they are fixed from the startup of the app)?
Possible solutions:
create a js file importing all modules and tell webpack to inject it into the bundle page
configure webpack so it adds the required modules
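The first suggestion can be sketched as a static plugin registry module (the plugin names and the file name plugins.js here are hypothetical). Because each import() receives a literal string, webpack can discover every plugin at build time, while the app still loads them lazily by name:

```javascript
// plugins.js — hypothetical static registry.
// Each entry wraps a literal import() so webpack can see the dependency
// at build time; the app still loads plugins on demand by name.
const pluginLoaders = {
  'test-component': () => import('test-component'),
};

export async function loadPlugin(name) {
  const loader = pluginLoaders[name];
  if (!loader) {
    throw new Error(`Unknown plugin: ${name}`);
  }
  return loader();
}
```

The component would then call loadPlugin(name) instead of import(name), keeping the try/catch for plugins that fail to load.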

Typescript import with sending variable to imported files

I couldn't find this in the documentation. The question is: is there anything like the following in TypeScript?
file1.js
module.exports = (service) => {
  service.doSomeCoolStuff();
}
file2.js
const file1 = require('./file1')(service)
I need to send a service object to another file and work with it. So is there something like this in TypeScript? I need to use exactly this construction. I know it is possible with a class, but the requirement is to use it this way.
TypeScript has modules which support ES imports and exports.
file1.ts
export default (service) => service.doSomeCoolStuff();
file2.ts
import doCoolStuffWithService from './file1';
const file2 = doCoolStuffWithService(service);
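A typed version of the same pattern might look like this; the Service interface and the name withService are assumptions for illustration:

```typescript
// Hypothetical shape of the service being passed around.
interface Service {
  doSomeCoolStuff(): void;
}

// file1.ts — the default export is a function taking the service,
// mirroring the CommonJS `module.exports = (service) => ...` construction.
const withService = (service: Service): void => {
  service.doSomeCoolStuff();
};

export default withService;
```

The consumer imports it and invokes it immediately with the service object, just as `require('./file1')(service)` would in CommonJS.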

Import types from .graphql into a .js file

I've searched for a way to import types from .graphql files. I found graphql-import, which lets you write # import something from 'something-else'. This works fine between .graphql files.
But what I'm trying to do is import some types from Prisma's generated.graphql into a .js file.
For example:
I have this generated.graphql file from Prisma
"""generated.graphql file"""
type ItemWhereInput { ... }
type ItemConnection { ... }
...
I would like to import those two types ItemWhereInput and ItemConnection from generated.graphql file into items-types.js file
// items-types.js file
import gql from 'graphql-tag';
// I would like to make some kind of import ItemWhereInput and ItemConnection here
// Something like `import { ItemWhereInput, ItemConnection } from 'generated.graphql'`
...
const ItemWhereUniqueInput = gql`
  input ItemWhereUniqueInput {
    id: String!
  }
`;
...
// And export ItemWhereInput and ItemConnection here
export default [Item, ItemInput, ItemWhereUniqueInput, ItemUpdateInput];
That way I could call makeExecutableSchema from graphql-tools and use those types in some place else
// items-query.js file
import { forwardTo } from 'prisma-binding';
const schema = `
  items: [Item]!
  item(where: ItemWhereUniqueInput!): Item
  # And use it here
  itemsConnection(where: ItemWhereInput): ItemConnection!
`;
const resolvers = {
  items: forwardTo('db'),
  item: forwardTo('db'),
  itemsConnection: forwardTo('db'),
};
export default {
schema,
resolvers,
};
If this is documented somewhere else, or there is anything that could help, please point me to it.
Thanks.
You should be able to do the following:
During your build step, first transform your generated.graphql file into a js file by:
adding export default ` to the beginning of the file,
adding `; to the end of the file, and
renaming it to generated.js.
This way, you can import the file just as a js file in your development code:
// some other js file
/*
* notice the lack of .js, this should make it easier for your
* IDE to understand you're referencing the 'generated.graphql' file.
* If this is not possible in your code, you actually have to say
* .js here, not .graphql, because the file will be called .js after
* build.
*/
import generated from './generated';
console.log(generated);
You will see that generated is a string containing the contents of the pre-build file.
It can now be used as typeDefs for makeExecutableSchema:
import { makeExecutableSchema } from 'graphql-tools';
import typeDefs from './generated';
import resolvers from './resolvers';
const schema = makeExecutableSchema({
typeDefs,
resolvers,
});
If you use a bundler and/or transpiler some additional work has to be done, to make sure the file is also run through these tools.
The project I've used this approach on only uses babel, with which it is a matter of:
using npm-watch instead of babel's --watch option to run the build script (can be done in parallel)
running babel on all source .js files
running a custom script on all .graphql files, which:
adds the relevant code to the file, to make it valid js (in-memory)
programmatically runs babel on the result
saves it to the build destination with .js extension
Careful with large files though, as they are loaded to memory with this method!
Beware though: this approach does not work with a bundler. For that, you will have to either transform the file before running the bundler (and somehow still retain the old version, probably by naming the transformed version differently and deleting it after running the bundler), or find/create a plugin that does this work for you.
Here are some options I found with a quick Google search: loaders exist for both webpack and Parcel.
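For webpack specifically, graphql-tag ships a loader that handles .graphql files at bundle time; a minimal rule might look like the excerpt below. Note that unlike the string-wrapping approach above, graphql-tag/loader produces an already-parsed document rather than a raw string:

```javascript
// webpack.config.js — hypothetical excerpt
module.exports = {
  module: {
    rules: [
      {
        test: /\.graphql$/,
        exclude: /node_modules/,
        use: 'graphql-tag/loader',
      },
    ],
  },
};
```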

Global Import In ES6

I have a large third party library that I need to share between two projects. The project has multiple folders with multiple files that contain multiple exports. Instead of importing these modules like this
import {BaseContainer} from '#company/customproject/src/containers/BaseContainer.js'
I would like to do this
import { BaseContainer } from '#company/customproject'
I know I can manually import all the modules into a single index.js file in the base directory, but I am wondering if there is an easier way that doesn't require importing them all explicitly.
You should really just create an index.js file and import into it whatever you want to export, so that you can control which APIs get exported and keep private APIs unexported.
That said there is an automated tool that generates an index.js automatically for you:
> npm install -g create-index
> create-index ./src
Which will generate an index.js with all the exports.
As the other answer suggests, you should create an index.js within each directory and explicitly export contents
#company/customproject/index.js
import {BaseContainer, SomeOtherContainer} from './src/containers'
export {
BaseContainer,
SomeOtherContainer
}
#company/customproject/src/containers/index.js
import BaseContainer from './BaseContainer'
import SomeOtherContainer from './SomeOtherContainer'
export {
BaseContainer,
SomeOtherContainer
}
Another option to autoload an entire directory is using require and module.exports to export every scanned file. However, you would likely run into conflicts using both ES6 import/export and module.exports with default export statements.
#company/customproject/index.js
const fs = require('fs')
const modules = {}
fs.readdirSync(__dirname + '/src/containers').forEach(file => {
  file = file.replace('.js', '')
  modules[file] = require('./src/containers/' + file)
  // map default export statement
  if (modules[file].default) {
    modules[file] = modules[file].default
  }
})
module.exports = modules
Then simply use it in any ES5 or ES6 module
const {BaseContainer} = require('#company/customproject')
or
import {BaseContainer} from '#company/customproject'