I am developing an application using Node.js. I have been using callbacks, but I recently started migrating to promises. My problem is that on the latest Electron (12.0.0) and Node 14.15, promises sometimes just don't work. There is no error or anything; it just doesn't work, or takes several seconds to complete. Check the following snippet:
const fs = require('fs/promises')

test()

function test() {
  fs.readFile('views/database_connection_status.html', 'utf-8').then((data) => {
    console.log(data)
  }).catch((error) => {
    console.log(error)
  })
}
The file does exist. Sometimes it loads and displays correctly, and sometimes nothing happens at all.
const co = require('co');
const fs = require('fs').promises;

const test = () => {
  return co(function* () {
    // yield the promise returned by fs.promises.readFile so co resolves it
    return yield fs.readFile('views/database_connection_status.html', 'utf-8');
  });
};

test().then(res => {
  callback(null, res)
}).catch(error => callback(error));
Hey, Joseph, you can check out the "co" package for writing promise-based functions with generators; your code will look like the above.
You can read more about it at https://www.npmjs.com/package/co
Related
When running API tests in Node.js I often find myself repeating whole blocks of it() statements with slightly different assertions. This seems wasteful and I'd like the code to respect DRY principles.
Let's take the following as an example:
const { expect } = require('chai');
const got = require('got');
const core = require('../Libraries/CoreFunctions')
describe('Some description', function() {
  beforeEach(function() {
  })

  it('should do something', function(done) {
    got.get('https://someurl.com',
      {headers: core.headers()})
      .then(res => {
        core.respDateCode(res);
        console.log(core.prettyJSON(res));
        expect(core.something).to.be.lessThan(2000);
        done()
      })
      .catch(err => {
        console.log('Error: ', err.message);
      });
  }).timeout(5000);

  it('should do something else', function(done) {
    got.get('https://someurl.com',
      {headers: core.headers()})
      .then(res => {
        core.respDateCode(res);
        console.log(core.prettyJSON(res));
        expect(core.somethingElse).to.be.greaterThan(1000);
        done()
      })
      .catch(err => {
        console.log('Error: ', err.message);
      });
  }).timeout(5000);
});
I'm looking for suggestions on how best to refactor the above to reduce repetition.
Move the logic for fetching into a separate file. You can then encapsulate every request into a function (with no parameters, so if the API URL changes, your tests won't have to change). Every test should call the function under test explicitly in the "it" block, so it is quickly apparent what is being tested. If you have a lot of repeated setup code, you can move that into a function.
A nice side effect of isolating the API calls is that you end up with a client for the API, which is tested at the same time as your API.
Don't be afraid of your test code being duplicated at this high level. Basically: "given this setup, when the function under test is called, then this happens". You can put test setup into other functions, but don't overdo it, as you risk not being able to tell what actually happened when looking at the test. Also, never abstract away the function under test.
const { expect } = require('chai');
const gotClient = require('../somewhere/client/uses/got');
const core = require('../Libraries/CoreFunctions')

describe('Some description', function() {
  it('should do something', async function() {
    // given
    const res = await gotClient.resource.getOne()
    // when
    core.functionThatIsBeingTested(res);
    // then
    expect(core.something).to.be.lessThan(2000);
  }).timeout(5000);

  it('should do something else', async function() {
    // given
    const res = await gotClient.resource.getOne()
    // when
    core.functionThatIsBeingTested(res);
    // then
    console.log(core.prettyJSON(res));
    expect(core.somethingElse).to.be.greaterThan(1000);
  }).timeout(5000);
});
Notice that the only real difference between this version and yours is that here you don't need to concern yourself with the URL and the headers, which makes the code more readable and easier to understand. It would be even better if the client were named after the API it calls and the resource after the actual resource being fetched.
Just as an example:
const res = await twilio.phoneNumbers.availableForPurchase()
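For completeness, here is a minimal sketch of what such a got-based client module could look like. The module path, resource name, and base URL are hypothetical placeholders (not from the original post), so adjust them to your project:

// somewhere/client/uses/got.js  (hypothetical path matching the require above)
const got = require('got');
const core = require('../Libraries/CoreFunctions'); // assumed to expose headers()

const BASE_URL = 'https://someurl.com'; // assumption: same URL as in the original tests

module.exports = {
  resource: {
    // No parameters: if the URL or headers change, only this client changes, not the tests.
    getOne: () => got.get(BASE_URL, { headers: core.headers() }),
  },
};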
I'm trying to use async/await functionality to build a Node.js script. I currently have a helper file called repo.js that gets data from GitHub's API and returns it to a variable I can access elsewhere in different JS files of my Node application. repo.js is as follows:
const axios = require('axios')

const repo = async () => {
  const data = await axios.get('https://api.github.com/repos/OWNER/REPO/releases', {
    headers: {
      'Authorization': 'token MYTOKEN'
    }
  })
  return data
}

exports.repo = repo
And then in my main.js file I'm trying to do...
const repo = require('./src/utils/repo')

program
  .option('-d, --debug', 'output extra debugging')
  .option('-s, --small', 'small pizza size')
  .option('-p, --pizza-type <type>', 'flavour of pizza')

const repoData = repo.repo
console.log(repoData)
Unfortunately, this just returns [AsyncFunction: repo] to the console, which isn't the intended behaviour. Why can't I access the contents here?
UPDATE
Based on some responses I've been given, I'm aware that I need my code inside an async function or to use .then(). The issue is, I don't want to put all of my application's code inside a .then() just to rely on one thing from an API.
Example:
var version = ''

repo.getRepoDetails().then((res) => {
  version = res.data[0].body.tag_name
})
Now I have access to version everywhere.
Every async function returns a promise, meaning that you need to wait for it to finish in order to read its result.
repo.repo().then(res => console.log(res))
If your application is a simple Node.js script (or a single file), then you can wrap your code inside an IIFE like this:
const repo = require('./src/utils/repo'); // note the semicolon, otherwise the IIFE below is parsed as a call on this line

(async () => {
  program
    .option('-d, --debug', 'output extra debugging')
    .option('-s, --small', 'small pizza size')
    .option('-p, --pizza-type <type>', 'flavour of pizza')

  const repoData = await repo.repo() // <--- you can use await now instead of then()
  console.log(repoData)
})()
An async function always returns a promise object, so you can access the result using promise.then(), like:
repo.repo().then(result => result)
In the following code, I'm reading some files and getting their filename and text. After that, I'm storing data in an option variable to generate an epub file:
const Epub = require("epub-gen")
const folder = './files/'
const fs = require('fs')

let content = []

fs.readdir(folder, (err, files) => {
  files.forEach(filename => {
    const title = filename.split('.').slice(0, -1).join('.')
    const data = fs.readFileSync(`${folder}${filename}`).toString('utf-8')
    content.push({ title, data })
  })
})

const option = {
  title: "Alice's Adventures in Wonderland", // *Required, title of the book.
  content
}

new Epub(option, "./text.epub")
The problem is, new Epub runs before the files are read, i.e. before content is ready. I think Promise.all is the right candidate here. I checked the Mozilla docs, but the examples there start from several existing promises, and I have none. So I'm not very sure how to use Promise.all here.
Any advice?
Your problem is with readdir, which is asynchronous, so new Epub, as you already figured out, is called before its callback runs.
Switch to using readdirSync, or move const option ... new Epub ... inside the callback of readdir, after files.forEach.
At the moment you can do everything synchronously, since you already use readFileSync.
So you can place the Epub creation right after the forEach loop.
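A minimal sketch of that synchronous approach, reusing the code from the question (readdir becomes readdirSync, so content is filled before the Epub is created); untested, so treat it as a starting point:

const Epub = require("epub-gen")
const fs = require('fs')

const folder = './files/'

// readdirSync blocks until the file list is available, so content is
// fully populated before the Epub constructor runs.
const content = fs.readdirSync(folder).map(filename => ({
  title: filename.split('.').slice(0, -1).join('.'),
  data: fs.readFileSync(`${folder}${filename}`, 'utf-8'),
}))

const option = {
  title: "Alice's Adventures in Wonderland", // *Required, title of the book.
  content
}

new Epub(option, "./text.epub")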
If you want to go async, my first question would be:
Does your Node.js version support util.promisify (Node 8.x or higher, iirc)?
If so, it can be used to turn callback-style functions like readFile into promises. If not, you can use the same logic, but with nested callbacks like the other solutions show.
const FS = require( 'fs' );
const { promisify } = require( 'util' );

const readFile = promisify( FS.readFile );
const readFolder = promisify( FS.readdir );

readFolder( folder )
  // extract the file paths. Maybe using `/${filename}` suffices here.
  .then( files => files.map( filename => `${folder}${filename}` ))
  // map the paths with readFile so we get an array of promises.
  .then( file_paths => file_paths.map( path => readFile( path )))
  // fetch all the promises using Promise.all().
  .then( file_promises => Promise.all( file_promises ))
  .then( data => {
    // do something with the data array that is returned, like extracting the title.
    // create the Epub objects by mapping the data values with their titles.
  })
  // error handling.
  .catch( err => console.error( err ));
Add promises to an array. Each promise should resolve with the value you were pushing into content.
When all promises resolve, the returned value will be the array previously known as content.
Also, you can, and should, use the async fs calls everywhere, so readFileSync can be replaced with readFile (async). I did not make that replacement here, though, so you can clearly see what was required to answer your original question.
Not sure if I got the nesting right in the snippet.
const Epub = require("epub-gen")
const folder = './files/'
const fs = require('fs')

let promises = []

fs.readdir(folder, (err, files) => {
  files.forEach(filename => {
    promises.push(new Promise((resolve, reject) => {
      const title = filename.split('.').slice(0, -1).join('.')
      const data = fs.readFileSync(`${folder}${filename}`).toString('utf-8')
      resolve({
        title,
        data
      })
    }))
  })

  // wait for every file promise, then build the book
  Promise.all(promises).then((content) => {
    const option = {
      title: "Alice's Adventures in Wonderland", // *Required, title of the book.
      content
    }
    new Epub(option, "./text.epub")
  })
})
I have an API script in a file:
const ApiCall = {
  fetchData: async (url) => {
    const result = await fetch(url);
    if (!result.ok) {
      const body = await result.text(); // uncovered line
      throw new Error(`Error fetching ${url}: ${result.status} ${result.statusText} - ${body}`); // uncovered line
    }
    return result.json();
  },
};

export default ApiCall;
When I mock the call, I have two uncovered lines in code coverage.
Any idea how I can get them covered as well?
Here is what I have tried so far, which is not working:
it('test', async () => {
  ApiCall.fetchData = jest.fn();
  ApiCall.fetchData.result = { ok: false };
});
I am kind of new to Jest, so any help would be great.
You need to provide a stub response in your test spec so that the if statement is triggered. https://www.npmjs.com/package/jest-fetch-mock will allow you to do just that. The example on their npm page should give you what you need: https://www.npmjs.com/package/jest-fetch-mock#example-1---mocking-all-fetches
Basically, the result is stored in state (redux) and read from there; jest-fetch-mock overrides your API call/route and returns the stored result from redux, all within the framework.
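For example, a rough sketch of how jest-fetch-mock could be used to hit the !result.ok branch (untested; the status, body, and file paths here are made up, so adapt them to your project and test setup):

// replaces global.fetch with a jest mock (see the jest-fetch-mock README)
require('jest-fetch-mock').enableMocks();

const ApiCall = require('./ApiCall').default; // hypothetical path to the file above

beforeEach(() => {
  fetch.resetMocks();
});

it('covers the error branch when the response is not ok', async () => {
  // mockResponseOnce lets us control status and body, so result.ok is false
  fetch.mockResponseOnce('boom', { status: 500 });

  await expect(ApiCall.fetchData('https://example.com/api'))
    .rejects.toThrow('Error fetching https://example.com/api: 500');
});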
Assuming that what you want to test is ApiCall itself, you would need to mock fetch. You are mocking the entire ApiCall, so those lines will never execute.
Also, you have an issue: if you hit an error or a promise rejection, json() won't be available, so that line will trigger an error.
Try this (haven't tested it):
it('test error', (done) => {
  let promise = Promise.reject(new Error("test"));
  global.fetch = jest.fn(() => promise); // You might need to store the original fetch before swapping this
  ApiCall.fetchData()
    .catch(err => {
      expect(err.message).toEqual("test");
      done();
    });
});

it('test OK', (done) => {
  let promise = Promise.resolve({
    ok: true,
    json: jest.fn(() => ({ data: "data" }))
  });
  global.fetch = jest.fn(() => promise);
  ApiCall.fetchData()
    .then(response => {
      expect(response.data).toEqual("data");
      done();
    });
});
That probably won't work right away, but hopefully you get the idea. In this case you are already working with a promise, which is why I added the done() callback to the test, so you can tell Jest when you have finished processing. There is another way to make Jest wait for the promise, which is something like "return promise.then()".
Please post back.
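For reference, a quick sketch of that promise-returning style, using the same hypothetical mock as the "test OK" case above (no done callback needed, since Jest waits for the returned promise):

it('test OK (returning the promise)', () => {
  global.fetch = jest.fn(() => Promise.resolve({
    ok: true,
    json: jest.fn(() => ({ data: "data" }))
  }));
  // Returning the chain tells Jest to wait for it before ending the test.
  return ApiCall.fetchData().then(response => {
    expect(response.data).toEqual("data");
  });
});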
I'm new to unit testing, and I'm aware my tests may not be valuable or follow a specific best practice, but I'm focused on getting this working, which will allow me to test my frontend code using JSDOM.
const { JSDOM } = require('jsdom');
const { describe, it, beforeEach } = require('mocha');
const { expect } = require('chai');

let checkboxes;

const options = {
  contentType: 'text/html',
};

describe('component.js', () => {
  beforeEach(() => {
    JSDOM.fromFile('/Users/johnsoct/Dropbox/Development/andybeverlyschool/dist/individual.html', options).then((dom) => {
      checkboxes = dom.window.document.querySelectorAll('.checkbox');
    });
  });

  describe('checkboxes', () => {
    it('Checkboxes should be an array', () => {
      expect(checkboxes).to.be.a('array');
    });
  });
});
I'm getting the error "AssertionError: expected undefined to be an array". I'm simply using the array test as a test to ensure I have JSDOM functioning correctly. There are no other errors occurring. Any help would be much appreciated!
fromFile is an async function, meaning that by the time your beforeEach() has finished and the tests start running, it is (probably) still loading the file.
Mocha handles async code in two ways: either return a promise or pass in a callback. So either return the promise from fromFile or do this:
beforeEach(function(done) {
  JSDOM.fromFile(myFile)
    .then((dom) => {
      checkboxes = dom.window.document.querySelectorAll('.checkbox');
    })
    .then(done, done);
});
The promise version looks like this:
beforeEach(function() {
  return JSDOM.fromFile(myFile)
    .then((dom) => {
      checkboxes = dom.window.document.querySelectorAll('.checkbox');
    });
});
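If your Node and Mocha versions support it, the same thing can also be written with async/await; an async function returns a promise, so Mocha waits for it just like in the version above (a sketch reusing the same names):

beforeEach(async function() {
  // awaiting here is equivalent to returning the promise
  const dom = await JSDOM.fromFile(myFile);
  checkboxes = dom.window.document.querySelectorAll('.checkbox');
});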