How do I add a new file extension to Node.js dynamic import?
I want to add my own file type, let's call it .jszip. (No, this is just an example and what I actually want has nothing to do with zip.)
Say I have
package.json:
{
  "name": "test",
  "scripts": {
    "zip": "node --experimental-modules test.js"
  }
}
test.js:
const fs = require('fs');
const Module = require('module');

function loadJsZip(module, filename) {
  console.log('In loadJsZip');
  const content = fs.readFileSync(filename, 'utf8');
  // Do something to content
  module._compile(content, filename);
}

require.extensions['.jszip'] = loadJsZip;
Module._extensions['.jszip'] = loadJsZip;

function loadJs(relativePath) {
  import(relativePath).then((module) => {
    console.log(`imported from ${relativePath}: ${module}`);
  }).catch((err) => {
    console.log(`While importing:${err}`);
  });
}

loadJs('./testfile.jszip');
I am getting:
(node:20412) ExperimentalWarning: The ESM module loader is experimental.
While importing:TypeError [ERR_UNKNOWN_FILE_EXTENSION]: Unknown file extension: c:\...\testfile.jszip
It seems other file types are not supported: https://nodejs.org/api/esm.html#esm_import_statements
What worked in my case was getting hold of the normal require and using that. So I'm importing .graphql files using:
import {createRequire} from 'module';
const require = createRequire(import.meta.url);
require('graphql-import-node/register');
const myQuery = require('./myquery.graphql');
The graphql-import-node package does a require.extensions[...] = assignment behind the scenes:
require.extensions['.graphql'] = loadPlainFile
But this is starting to become madness. You are probably better off using Webpack or something.
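Applied to the original .jszip example, the same workaround would look roughly like this. This is only a sketch: loader.cjs is a hypothetical registration file in the spirit of graphql-import-node/register, and the transform applied to the file content is up to you.

// loader.cjs: register the custom extension on the CommonJS loader (sketch)
const fs = require('fs');

require.extensions['.jszip'] = function loadJsZip(module, filename) {
  const content = fs.readFileSync(filename, 'utf8');
  // Do something to content before handing it to the CJS compiler
  module._compile(content, filename);
};

// main.mjs: an ES module that gets a CommonJS require via createRequire
import { createRequire } from 'module';
const require = createRequire(import.meta.url);
require('./loader.cjs');
const loaded = require('./testfile.jszip'); // note: require, not import()

The key point is that the extension hook only affects the CommonJS loader, so the custom file has to come in through require rather than through dynamic import().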
Related
I am a noob with JS and I can't figure out how to instantiate one of my objects in this Jest unit test (backend / Node.js project).
project structure:
appi/
  src/
    configFactory.js
    ...
  test/
    configFactory.test.js
    ...
  package.json
Using require
configFactory.js
class ConfigFactory {
  constructor(index_mapping) {
    this.mapping = index_mapping
  }
}
configFactory.test.js
const ConfigFactory = require('../src/configFactory.js')
var fs = require('fs');

test('some test', () => {
  fs.readFile(__dirname + '/__mock-data__/Mappings/mappings_ac.json', 'utf8', function(err, data) {
    if (err) throw err;
    const factory = new ConfigFactory(data);
  });
});
This ends up with a TypeError: ConfigFactory is not a constructor
Using import
class ConfigFactory {
  constructor(index_mapping) {
    this.mapping = index_mapping
  }
}

export default ConfigFactory
import ConfigFactory from "../src/configFactory"
ends up with SyntaxError: Cannot use import statement outside a module. I tried to add "type": "module" to package.json but I feel that I am missing an important point
Looks like I was not properly exporting my class for CJS:
class ConfigFactory {
  constructor(index_mapping) {
    this.mapping = index_mapping
  }
}

module.exports = ConfigFactory
I already asked the question on the Jest repository here, and also pushed a sample application here to reproduce the behavior. But for the sake of completeness, here's the full story:
Essentially it's like this (./parsers.ts):
import yargs from "yargs";

export const parser = yargs
  .strict(true)
  .help()
  .commandDir("cmds")
  .demandCommand(1)
  .recommendCommands();
And in the cmds folder, there's a remote.ts:
import { Argv } from "yargs";

export const command = "remote <command>";
export const describe = "Manage set of tracked repos";
export const handler = (yargs: Argv<any>) => {};
export const builder = (yargs: Argv<any>) => {
  return yargs
    .commandDir("remote_cmds")
    .demandCommand(1, 1)
    .recommendCommands();
};
And then there's add.ts:
import { Argv } from "yargs";

export const command = "add <name> <url>";
export const handler = (yargs: Argv<any>): void => {};
export const describe = "Add remote named <name> for repo at url <url>";
export const builder = (yargs: Argv<any>): Argv => {
  return yargs.demandCommand(0, 0);
};
Now I've got two more files:
// index.ts
import { parser } from "./parsers";
import { Arguments } from "yargs";

parser.parse("remote add foo", (err, argv, output) => {
  console.log("parsed argv: %s", JSON.stringify(argv));
  if (err) console.log("ERROR\n" + err);
  if (output) console.log("OUTPUT\n" + output);
});
When I run this, it fails, and rightly so, because the remote add command expects two arguments. And if I pass correct input, it gives correct output, meaning everything works just fine.
// parsers.test.ts
import { Arguments } from "yargs";
import { parser } from "./parsers";

describe("remote", () => {
  test("add", async () => {
    const argv = parser.parse("remote add foo", (err, argv, output) => {
      console.log(JSON.stringify(argv));
      if (err) console.log("ERROR\n" + err);
      if (output) console.log("OUTPUT\n" + output);
    });
    expect(argv.name).toEqual("foo");
  });
});
Also the Jest configuration is:
module.exports = {
  transform: {
    "^.+\\.ts?$": "ts-jest",
  },
  testEnvironment: "node",
  testRegex: "./src/.*\\.(test|spec)?\\.(ts|ts)$",
  moduleFileExtensions: ["ts", "tsx", "js", "jsx", "json", "node"],
  roots: ["<rootDir>/src"],
};
But when I run the above test, it doesn't fail at all, as if the parser has no configuration. (Interestingly, the assertion does fail, because foo is not extracted as a property into argv, which again shows the parser didn't pick up the configuration inside the cmds folder.)
Not sure if it's a bug or a feature; while testing yargs parsers, something is messing with the parser configuration so that nothing from the command directories gets loaded into the parser.
How can I test my parser using Jest? Thanks.
I want to import all pictures from a folder to a component with dynamic import like this:
index.js
const testFolder = './2020/';
const fs = require('fs');
const path = require('path');

const files = fs.readdirSync(testFolder);
const allowedExts = ['.png', '.jpg', '.svg']; // add any extensions you need
const modules = {};

if (files.length) {
  let filterThruFiles = files.filter(file => allowedExts.indexOf(path.extname(file)) > -1);
  filterThruFiles.forEach(file => modules[file] = `./${file}`);
}

module.exports = modules;

let modulesStringify = JSON.stringify(modules);
fs.writeFileSync('pics-url-list.json', modulesStringify);
I get an object with the paths and write it to a file, which works fine:
//pics-url-list.json
{
  "1-1.jpg": "./1-1.jpg",
  "1-2.jpg": "./1-2.jpg",
  "1-3.jpg": "./1-3.jpg"
  //and so on
}
But when I run npm start, I get this error. The main question is: how do I access this .json file in a React component? Now no console.log works in App.js; it only shows this error about fs.readdirSync.
//App.js component
import picList from '../images/pics-url-list.json';
console.log(picList);
It is not possible to use Node modules like fs in a client-side app.
There is a way to dynamically import all images by using webpack's require.context.
For example,
function importAll(r) {
  return r.keys().map(r);
}

const images = importAll(require.context('./', false, /\.(png|jpe?g|svg)$/));
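A rough sketch of how that could be wired into a component (the folder path and component name are placeholders; depending on the webpack loader, each resolved module may expose the image URL directly or on .default):

// Gallery.js: render every image collected by require.context (sketch)
import React from 'react';

function importAll(r) {
  return r.keys().map(r);
}

// the path is relative to this file; adjust it to wherever the pictures live
const images = importAll(require.context('../images/2020', false, /\.(png|jpe?g|svg)$/));

export default function Gallery() {
  return (
    <div>
      {images.map((img, i) => (
        <img key={i} src={img.default || img} alt="" />
      ))}
    </div>
  );
}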
I've tried a few implementations, none of which have been successful.
First Attempt
Using eval in package.json script "fetch:data": "eval $(cat .env) ts-node -O '{\"module\":\"commonjs\"}' ./bin/build-api-data.ts".
This results in a JSON parsing error because eval is removing my quotes for some reason.
undefined:1
{module:commonjs}
^
SyntaxError: Unexpected token m in JSON at position 1
Second Attempt
Using dotenv, the problem I encountered here was a race condition resulting in errors like this:
$ CANDID_ENV=local ts-node -O '{"module":"commonjs"}' ./bin/build-api-data.ts
/Users/lassiter.gregg/code/candidco-web/node_modules/contentful/dist/webpack:/contentful/contentful.js:49
throw new TypeError('Expected parameter accessToken')
^
TypeError: Expected parameter accessToken
Code Sample
import fs from 'fs';
import path from 'path';
import fetchApiData from '../lib/apiData';
import dotEnv from 'dotenv-safe';

const { CANDID_ENV } = process.env;
const isLocalBuild = CANDID_ENV === 'local';
console.log(dotEnv);

const API_DATA_FILENAME = 'api_data.json';

const ensureDirectoryExistence = filePath => {
  var dirname = path.dirname(filePath);
  if (fs.existsSync(dirname)) {
    return true;
  }
  ensureDirectoryExistence(dirname);
  fs.mkdirSync(dirname);
};

const writeData = (filename, data) => {
  const filePath = path.join(__dirname, '..', '.data', filename);
  ensureDirectoryExistence(filePath);
  fs.writeFileSync(filePath, JSON.stringify(data));
  console.log('API data stored', filePath);
};

const fetchAndStoreApiData = async () => {
  console.log('Fetching all API data');
  await dotEnv.config({
    path: isLocalBuild ? './.env' : `./.env.${CANDID_ENV}`,
  });
  const newData = await fetchApiData();
  writeData(API_DATA_FILENAME, newData);
};

const init = async () => {
  fetchAndStoreApiData();
};

if (require.main === module) {
  init();
}
In the case above, I've tried doing dotenv.config at the top of the file, in init, and in the function as you see. It always throws the same error about contentful not getting the env variable it needs. That said, if I log process.env and comment out the code relevant to fetchApiData, then I see all my environment variables. That's why I think it's a race condition, but I haven't been able to find anything similar to my own issue.
Additionally, what makes this even more thorny is that this is a custom script that has to work in both a Node and an esnext environment. So I've had my fair share of thorny import/export issues using syntax I don't really prefer but haven't found a way around (e.g. export = someFunction).
Do I see it correctly that you are trying to configure dotenv with a variable that you initialize with an env variable? I don't think that's going to work out.
Dotenv's job is to load the env variables into process.env. You have to configure it as early as possible in your app.
More about it here: https://www.npmjs.com/package/dotenv
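Applied to the script from the question, that would mean running dotenv's config before anything that reads process.env gets evaluated. A rough sketch, under the assumption that ../lib/apiData (or something it imports, such as the contentful client) reads the variables at module load time, so its import is deferred until after config has run:

// bin/build-api-data.ts: load the env file before the API module is evaluated (sketch)
import dotEnv from 'dotenv-safe';

const { CANDID_ENV } = process.env;
const isLocalBuild = CANDID_ENV === 'local';

// config() runs synchronously, so everything below can rely on process.env
dotEnv.config({
  path: isLocalBuild ? './.env' : `./.env.${CANDID_ENV}`,
});

// deferring the import keeps the API module from initializing before the env is loaded;
// whether .default is needed depends on how ../lib/apiData exports its function
const fetchApiData = require('../lib/apiData').default;

const run = async () => {
  const newData = await fetchApiData();
  console.log('Fetched API data', Object.keys(newData || {}));
};

if (require.main === module) {
  run();
}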
I want to test scripts in an environment where we cannot export modules. I have installed Jest version 23.1.0 and there aren't any other packages in my package.json file.
Using jsdom's 'old' API, I have come up with a solution that works as expected:
script.js
var exVar = "test";
script.test.js
const jsdom = require('jsdom/lib/old-api.js');

test('old jsdom api config', function(done) {
  jsdom.env({
    html: "<html><body></body></html>",
    scripts: [__dirname + "/script.js"],
    done: function (err, window) {
      expect(window.exVar).toBe("test");
      done();
    }
  });
});
However, with this implementation I have to rewrite the config for every test, because it looks like the jsdom config gets rewritten every time.
What I have tried
So far I have tried running this configuration:
const jsdom = require('jsdom/lib/old-api.js');

jsdom.env({
  html: "<html><body></body></html>",
  scripts: [__dirname + "/script.js"],
  done: function (err, window) {
    console.log('end');
  }
});
with this test:
test('old jsdom api config', function(done) {
  expect(window.exVar).toBe("test");
  done();
});
in different ways: inside beforeAll, inside a script linked through setupFiles, or through setupTestFrameworkScriptFile in the Jest configuration object, but still nothing works.
Maybe I could extend jest-environment as suggested in the docs, but I have no idea of the syntax I should be using, nor of how to link this file to the tests.
Thanks to my co-worker Andrea Talon I have found a way of using the same setup for different tests (at least inside the same file) using the 'Standard API' (not the 'old API').
Here is the complete test file.
const {JSDOM} = require("jsdom")
const fs = require("fs")

// file to test
const srcFile = fs.readFileSync("script.js", { encoding: "utf-8" })

// the window
let window

describe('script.js test', () => {
  beforeAll((done) => {
    window = new JSDOM(``, {
      runScripts: "dangerously"
    }).window
    const scriptEl = window.document.createElement("script")
    scriptEl.textContent = srcFile
    window.document.body.appendChild(scriptEl)
    done()
  })

  test('variable is correctly working', (done) => {
    expect(window.exVar).toBe("test");
    done()
  })
})
Additional setup
In order to load multiple scripts I have created this function which accepts an array of scripts:
function loadExternalScripts (window, srcArray) {
  srcArray.forEach(src => {
    const scriptEl = window.document.createElement("script")
    scriptEl.textContent = src
    window.document.body.appendChild(scriptEl)
  });
}
So instead of appending every single script to the window variable I can load them by declaring them at the top of the file like this:
// files to test
const jQueryFile = fs.readFileSync("jquery.js", { encoding: "utf-8" })
const srcFile = fs.readFileSync("lib.js", { encoding: "utf-8" })
and then inside the beforeAll function I can load them all together like this:
loadExternalScripts(window, [jQueryFile, srcFile])
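Put together, the beforeAll from the test file above would then look roughly like this (same assumptions as before; jQueryFile and srcFile are the files read at the top of the test file):

beforeAll((done) => {
  // create the DOM first, then inject every script into it
  window = new JSDOM(``, { runScripts: "dangerously" }).window
  loadExternalScripts(window, [jQueryFile, srcFile])
  done()
})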