Mocking node modules with Jest and @std/esm - javascript

I'm currently having an issue writing tests for a Node application that uses @std/esm. I've set up a manual mock of a node module inside a __mocks__ directory, and the following code shows a test for the file that uses this mocked node module (it's used in db.mjs):
const loader = require('@std/esm')(module, { cjs: true, esm: 'js' })
const Db = loader('../src/db').default

const db = new Db()

describe('getNotes', () => {
  it('gets mocked note', () => {
    db.getNote()
  })
})
However, when I run Jest, my manual mock is not being used; the real node module is used instead.
Does anyone have any thoughts on why this could be happening?

Jest is particular about the location of your mocks. From their documentation on mocking node modules:
For example, to mock a scoped module called @scope/project-name,
create a file at __mocks__/@scope/project-name.js, creating the
@scope/ directory accordingly.
In your case it would be __mocks__/@std/esm.js
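For illustration, a manual mock for a scoped node module is just a file that exports whatever shape your code expects, placed in the matching subdirectory of the root __mocks__ folder. A minimal, hypothetical sketch (the module name and exported function are made up):

// __mocks__/@scope/project-name.js (hypothetical scoped module)
// Jest picks up mocks for node modules placed in the root __mocks__ folder
// automatically; no jest.mock() call is needed in the test file.
module.exports = {
  getNote: jest.fn(() => ({ id: 1, text: 'mocked note' })),
};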

How to set/define environment variables and api_Server in cypress?

Currently, we are using Cypress to test our application. We have two environments with two different API servers. I want to define these inside the environment files, but I am not sure how to define both URLs in the same file.
For example:
Environment-1:
baseUrl - https://environment-1.me/
api_server - https://api-environment-1.me/v1
Environment-2:
baseUrl - https://environment-2.me/
api_server - https://api-environment-2.me/v1
A few test cases depend on the baseUrl, and one test case that checks the API depends on api_server.
To resolve this I tried to set the baseUrl and api_server inside the config file via a plugin, following https://docs.cypress.io/api/plugins/configuration-api.html#Usage.
I created two config files for the two environments. The first one (environment-1) looks like this:
{
  "baseUrl": "https://environment-1.me/",
  "env": {
    "envname": "environment-1",
    "api_server": "https://api-environment-1.me/v1"
  }
}
The other file is similar, with the respective endpoints changed, as sketched below.
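A sketch of the second config file (assuming it is named environment-2.json, to match the configFile value used below), based on the environment-2 values above:

{
  "baseUrl": "https://environment-2.me/",
  "env": {
    "envname": "environment-2",
    "api_server": "https://api-environment-2.me/v1"
  }
}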
The plugins file has been modified as follows:
// promisified fs module
const fs = require('fs-extra')
const path = require('path')

function getConfigurationByFile (file) {
  const pathToConfigFile = path.resolve('..', 'cypress', 'config', `${file}.json`)
  return fs.readJson(pathToConfigFile)
}

module.exports = (on, config) => {
  // `on` is used to hook into various events Cypress emits
  // `config` is the resolved Cypress config

  // accept a configFile value or use development by default
  const file = config.env.configFile || 'environment-2'
  return getConfigurationByFile(file)
}
Inside the test cases, wherever the baseUrl is needed we use cy.visit('/').
This works fine when we run a specific file from the command line with cypress run --env configFile=environment-2: all the test cases pass, as cy.visit('/') automatically resolves against the respective environment's baseUrl, except for the API test case.
I am not sure how the API test should be modified to call the API endpoint instead of the base URL.
Can somebody help, please?
Thanks,
indhu.
If I understand your question correctly, you need to run tests against different URLs, with the URLs set in cypress.json or in an env file.
Can you configure the URLs in the cypress.json file as below? I haven't tried it myself, but give it a go.
{
  "baseUrl": "https://environment-2.me/",
  "api_server1": "https://api1_url_here",
  "api_server2": "https://api2_url_here"
}
Inside the tests, use the URLs as below:
describe('Test for various Urls', () => {
  it('Should test the base url', () => {
    cy.visit('/') // this points to the baseUrl configured in the cypress.json file
    // some tests that depend on the baseUrl...
  })
  it('Should test the api 1 url', () => {
    cy.visit(api_server1) // this points to api server 1 configured in the cypress.json file
    // some tests that depend on api server 1...
  })
  it('Should test the api 2 url', () => {
    cy.visit(api_server2) // this points to api server 2 configured in the cypress.json file
    // some tests that depend on api server 2...
  })
})
This issue has been resolved.
The best way is to do it with a plugin, as suggested by the docs (https://docs.cypress.io/api/plugins/configuration-api.html#Usage).
I kept the same structure as in my question, and in the API test case I called the endpoint using cy.request(Cypress.env('api_server')).
This solved my issue :)
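For reference, a minimal sketch of such an API test case (the status assertion is only illustrative):

describe('API', () => {
  it('responds on the configured api_server', () => {
    // api_server is populated from the per-environment config file
    // returned by the plugins file above
    cy.request(Cypress.env('api_server'))
      .its('status')
      .should('eq', 200)
  })
})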

"fetch is not found globally and no fetcher passed" when using spacejam in meteor

I'm writing unit tests to check my API. Before I merged my git test branch into my dev branch everything was fine, but then I started to get this error:
App running at: http://localhost:4096/
spacejam: meteor is ready
spacejam: spawning phantomjs
phantomjs: Running tests at http://localhost:4096/local using test-in-console
phantomjs: Error: fetch is not found globally and no fetcher passed, to fix pass a fetch for
your environment like https://www.npmjs.com/package/unfetch.
For example:
import fetch from 'unfetch';
import { createHttpLink } from 'apollo-link-http';
const link = createHttpLink({ uri: '/graphql', fetch: fetch });
Here's a part of my api.test.js file:
describe('GraphQL API for users', () => {
  before(() => {
    StubCollections.add([Meteor.users]);
    StubCollections.stub();
  });
  after(() => {
    StubCollections.restore();
  });
  it('should do the work', () => {
    const x = 'hello';
    expect(x).to.be.a('string');
  });
});
The funny thing is that I don't even use GraphQL in my tests (although I do use it in my Meteor package).
Unfortunately, I couldn't find enough information (apart from the apollo-link-http docs, which have examples but still puzzle me). I did try to use that example, but it didn't help and I still get the same error.
I got the same error when importing an npm module that performs GraphQL queries into my React application. The app compiled, but the tests failed since window.fetch is not available in the Node.js runtime.
I solved the problem by installing node-fetch https://www.npmjs.com/package/node-fetch and adding the following declarations to jest.config.js:
const fetch = require('node-fetch')
global.fetch = fetch
global.window = global
global.Headers = fetch.Headers
global.Request = fetch.Request
global.Response = fetch.Response
global.location = { hostname: '' }
Doing so, we instruct Jest on how to handle window.fetch when it executes front-end code in the Node.js runtime.
If you're using Node.js, do the following:
Install node-fetch
npm install --save node-fetch
Add the line below to index.js:
global.fetch = require('node-fetch');
The problem is this: fetch is defined when you are in the browser, and is available as fetch, or even window.fetch.
On the server it is not defined and either needs to be imported explicitly, or a polyfill like https://www.npmjs.com/package/unfetch (as suggested in the error message) needs to be imported by your test code to make the problem go away.
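For instance, a minimal polyfill at the top of the test entry point might look like this (assuming node-fetch is installed; where exactly to place it depends on your setup):

// Make fetch available in the server/test runtime before any code that calls it runs
if (typeof fetch === 'undefined') {
  global.fetch = require('node-fetch');
}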

Running .mjs / ESM tests on Node using Jasmine or any other alternative

My Node-based project is implemented using native ES module support on Node thanks to the --experimental-modules CLI switch (i.e. node --experimental-modules).
Obviously, when I run a spec using Jasmine with node --experimental-modules ./node_modules/jasmine/bin/jasmine, I get the following error:
Error [ERR_REQUIRE_ESM]: Must use import to load ES Module
Is it even possible to use Jasmine with ES modules in Node?
If not, is there any alternative that doesn't require a framework (e.g. running tests with npm scripts)?
It was easier than I thought.
It's just a matter of calling a file, which you might name run.mjs, as follows:
node --experimental-modules ./run.mjs
The Jasmine setup file, jasmine.mjs, would look like this:
import Jasmine from "jasmine"
import JasmineConsoleReporter from "jasmine-console-reporter"

const jasmine = new Jasmine()
jasmine.loadConfigFile( "./support/jasmine.json" )
jasmine.env.clearReporters()
jasmine.addReporter( new JasmineConsoleReporter( {
  colors: true,
  cleanStack: true,
  verbosity: 4,
  listStyle: 'indent',
  activity: false
} ) )

export default jasmine
And you would add specs as follows in separate files (each spec file exports a function so it can be invoked from run.mjs below):
import jasmine from './my-project/spec/jasmine.mjs'

export default () => {
  jasmine.env.describe('Foo', () => {
    jasmine.env.it('Bar', () => {
      // Expects, assertions...
    })
  })
}
Finally, you would run Jasmine from run.mjs, importing both the configured Jasmine instance and the specs:
import jasmine from './my-project/spec/jasmine.mjs'
import someSpec1 from './my-project/spec/someSpec1.mjs'
import someSpecN from './my-project/spec/someSpecN.mjs'

someSpec1()
someSpecN()

jasmine.execute()
Simplifying the solution of @Matias_Fidemraizer, keeping only the important bits in one file:
import glob from 'glob';
import Jasmine from 'jasmine';

const jasmine = new Jasmine();
jasmine.loadConfigFile('tests/jasmine.json');

// Load your mjs specs
glob('**/*-test.mjs', function (er, files) {
  Promise.all(
    files
      // Use relative paths
      .map(f => f.replace('tests/specs/', './'))
      .map(f => import(f)
        .catch(e => {
          console.error('** Error loading ' + f + ': ');
          console.error(e);
          process.exit(1);
        }))
  )
    .then(() => jasmine.execute());
});
And execute it with
node --experimental-modules jasmine-run.mjs
You may see some problems in the logs, with a message like
[...] .split('\n') [...]
That message means that there is an exception in the .mjs code.
You can follow the issue here: https://github.com/jasmine/jasmine-npm/issues/150

require.context not a function when trying to run Mocha tests

I'm trying to run tests using the Mocha.js + JSDOM frameworks, but I'm having trouble getting Mocha to start up. This is in the process of testing a React app using the Vue.js library. I keep getting the following error:
var req = require.context('./', false, /\.vue$/);
TypeError: require.context is not a function
The code in question is:
let req = require.context('./', false, /\.vue$/);

components.forEach(function (component) {
  try {
    let filePath = './' + component + '.vue';
    let injected = inject(req(filePath));
    Vue.component(getComponentName(component), injected);
    let appComponent = {
      name: injected.name,
      props: {
        autocompletion: {
          metadata: getComponentName('template'),
          score: xTemplatesScore,
          attributes: injected.props || []
        }
      }
    };
    appComponents.push(appComponent);
  } catch (err) {
    console.log(err);
    console.error('Vue file was not found for component: ' + component + '. Please rename your files accordingly (component-name.vue)');
  }
});
Is there a way to get around this and actually get Mocha to start up? Or is there a suitable replacement for require.context? I've tried to redo it with just plain string concatenations and a vanilla require, but that keeps telling me that none of the Vue modules can be found.
require.context is a method of webpack. Your tests must be bundled before they can be run.
Normally, you'd create a separate webpack config file for your tests. You'll then create a test bundle using webpack and then run Mocha on this bundle. Alternatively, you can use mocha-loader inside the webpack test config file and let the tests run as part of the bundling process.
Further information can be found in the webpack documentation on testing.
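As an illustration, a minimal test-bundling setup might look like the following sketch (the file names, entry point, and output path are assumptions; you would also add the same loaders, e.g. for .vue files, that your application's webpack config uses):

// webpack.test.config.js (hypothetical name)
const path = require('path');

module.exports = {
  // an entry file that requires your spec files, e.g. via require.context
  entry: './test/index.js',
  output: {
    path: path.resolve(__dirname, 'build'),
    filename: 'test-bundle.js'
  },
  // the bundle is executed by Mocha in Node, not in a browser
  target: 'node'
};

You could then build and run it with something like webpack --config webpack.test.config.js && mocha build/test-bundle.js.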

How to properly require modules from mocha.opts file

I'm using the expect.js library with my mocha unit tests. Currently, I'm requiring the library on the first line of each file, like this:
var expect = require('expect.js');

describe('something', function () {
  it('should pass', function () {
    expect(true).to.be(true); // works
  });
});
If possible, I'd like to remove the boilerplate require code from the first line of each file, and have my unit tests magically know about expect. I thought I might be able to do this using the mocha.opts file:
--require ./node_modules/expect.js/index.js
But now I get the following error when running my test:
ReferenceError: expect is not defined
This seems to make sense - how can it know that the reference to expect in my tests refers to what is exported by the expect.js library?
The expect library is definitely getting loaded, as if I change the path to something non-existent then mocha says:
"Error: Cannot find module './does-not-exist.js'"
Is there any way to accomplish what I want? I'm running my tests from a gulp task if perhaps that could help.
You are requiring the module properly, but as you figured out, the symbols that the module exports won't automatically find their way into the global space. You can remedy this with your own helper module.
Create test/helper.js:
var expect = require("expect.js")
global.expect = expect;
and set your test/mocha.opts to:
--require test/helper
While Louis's answer is spot on, in the end I solved this with a different approach by using karma and the karma-chai plugin:
Install:
npm install karma-chai --save-dev
Configure:
karma.set({
  frameworks: ['mocha', 'chai']
  // ...
});
Use:
describe('something', function () {
  it('should pass', function () {
    expect(true).to.be(true); // works
  });
});
Thanks to Louis's answer and a bit of fiddling around, I sorted out my test environment references using mocha.opts. Here is the complete setup.
My project is a legacy JavaScript application with a lot of "plain" JS files which I wish to reference both in an HTML file using script tags and via require for unit testing with Mocha.
I am not certain that this is good practice, but I am used to Mocha for unit testing in Node projects and was eager to use the same tool with minimal adaptation.
I found that exporting is easy:
class Foo{...}
class Bar{...}
if (typeof module !== 'undefined') module.exports = { Foo, Bar };
or
class Buzz{...}
if (typeof module !== 'undefined') module.exports = Buzz;
However, trying to use require in all the files was an issue as the browser would complain about variables being already declared even when enclosed in an if block such as:
if (typeof require !== 'undefined') {
  var { Foo, Bar } = require('./foobar.js');
}
So I got rid of the require part in those files and set up a mocha.opts file in my test folder with this content (the path is relative to the root folder):
--require test/mocha.opts.js
The content of mocha.opts.js (these paths are relative to the location of the file):
global.assert = require('assert');
global.Foo = require("../foobar.js").Foo;
global.Bar = require("../foobar.js").Bar;
global.Buzz = require("../buzz.js");
