I have an entry point in my app that is executed via npm start. I'd like to run some tests on this script with Jest, but cannot figure out how I should do it. The script automatically runs, so if I import it into a Jest file, I can't call it individually in my test blocks like:
const entryPoint = require('./entry-point')
test('something', () => {
entryPoint()
})
The code will already execute before it reaches any of the test blocks.
The code for the entry point is here:
const fs = require("fs");
const summarizeData = require("./summarize-data");
try {
const fileName = process.argv[2];
if (!fileName) {
throw Error("Please enter a file name. ex: npm start <filename>");
}
if (!fs.existsSync(`${fileName}.json`)) {
throw Error(`The file ${fileName}.json could not be found.`);
}
const jsonParsed = JSON.parse(fs.readFileSync(`${fileName}.json`, "utf8"));
const data = summarizeData(jsonParsed);
console.log(data);
} catch (error) {
throw Error(error);
}
I think it will be enough to unit test the summarizeData function.
Something like this (using should.js for assertions):
const fs = require('fs');
const should = require('should');
const summarizeData = require('./summarize-data');
const fileName = 'testData';
test('it summarizes data properly', () => {
  const jsonParsed = JSON.parse(fs.readFileSync(`${fileName}.json`, 'utf8'));
  const dataSummarized = summarizeData(jsonParsed);
  dataSummarized.something.should.be.equal(1); // TODO - add other assertions to cover the whole result
})
I'm trying to use fs to build a command handler for ESM (ECMAScript modules). Because you can't use fs in ESM, I have two JS environments, one with ESM and one with Node.js. The Node environment is only there to read all the file names in a folder and use those names to generate a file that imports and exports them, which the other environment then uses. I already have the names stored in a const in the Node environment, but when I try to write them with fs it gives me an error, and when I try to log the string it says undefined.
const fs = require("fs")
const commands = []
try {
fs.unlinkSync("./scripts/modules/commands.js")
console.log('\x1b[32m File Deleted Succesfully \x1b[0m')
} catch (e) {
console.error(e)
}
try {
fs.openSync("./scripts/modules/commands.js", 'w')
console.log('\x1b[32m Created File Successfully')
} catch (e) {
console.error(e)
}
try {
const cCommands = fs.readdirSync("./scripts/modules/commands/").filter(file => file.endsWith('.js'));
for (const file of cCommands) {
const name = file.replace(".js", "")
commands.push(name)
}
console.log('\x1b[32m Pushed all Files Successfully \x1b[0m\n')
} catch (e) {
console.error(e)
}
// This outputs => 'ping' as a string
console.log(`${commands}`)
// This outputs => undefinedexport const commands = {undefined}; but should output => import {ping} from './commands/ping'; export const commands = {ping:ping};
console.log(`${commands.forEach(command => `import {${command}} from './commands/${command}';`)}export const commands = {${commands.forEach(command => `${command}:${command},`)}};`)
try {
const cmdString = `${commands.forEach(command => `import {${command}} from './commands/${command}';`)}export const commands = {${commands.forEach(command => `${command}:${command},`)}};`
const jsonString = JSON.stringify(cmdString);
fs.writeFile("./scripts/modules/commands.js", jsonString)
console.log(jsonString)
console.log('\x1b[32m Send all Commands Successfully \x1b[0m')
} catch (e) {
console.error(e)
}
Edit: I changed all the .forEach() calls to .map(); now this error occurs: TypeError [ERR_INVALID_ARG_TYPE]: The "cb" argument must be of type function. Received undefined
To fix this error, use fs.writeFileSync instead of fs.writeFile. The asynchronous fs.writeFile expects a callback as its last argument, which is why Node complains that the "cb" argument is undefined.
I have managed to use fleek to update IPFS via plain JavaScript. I am now trying to add this functionality to a clean install of a SvelteKit app. I think I am having trouble with the syntax around the imports, but am not sure what I am doing wrong. When I click the button in index.svelte I get the following error:
Uncaught ReferenceError: require is not defined
uploadIPFS upload.js:3
listen index.mjs:412..........(I truncated the error here)
A few thoughts
I am wondering if it works in plain JavaScript because there it is called in Node (running on the server), whereas in Svelte it runs on the client?
More Details
The index.svelte file looks like this
<script>
import {uploadIPFS} from '../IPFS/upload'
</script>
<button on:click={uploadIPFS}>
upload to ipfs
</button>
the upload.js file looks like this
export const uploadIPFS = () => {
const fleek = require('@fleekhq/fleek-storage-js');
const apiKey = 'cZsQh9XV5+6Nd1+Bou4OuA==';
const apiSecret = '';
const data = 'pauls test load';
const testFunctionUpload = async (data) => {
const date = new Date();
const timestamp = date.getTime();
const input = {
apiKey,
apiSecret,
key: `file-${timestamp}`,
data
};
try {
const result = await fleek.upload(input);
console.log(result);
} catch (e) {
console.log('error', e);
}
};
testFunctionUpload(data);
};
I have also tried using the other import syntax and when I do I get the following error
500
global is not defined....
import with the other syntax is
import fleekStorage from '@fleekhq/fleek-storage-js';
function uploadIPFS() {
console.log('fleekStorage',fleekStorage)
};
export default uploadIPFS;
*I erased the API secret in the code above. In the future I will store these in a .env file.
Even more details (if you need them)
The file below updates IPFS and runs via the command
npm run upload
For the version I used in Svelte, I simplified the file by removing all the file management and just loading a variable instead of a file (as in the example above).
const fs = require('fs');
const path = require('path');
const fleek = require('@fleekhq/fleek-storage-js');
require('dotenv').config()
const apiKey = process.env.FLEEK_API_KEY;
const apiSecret = process.env.FLEEK_API_SECRET;
const testFunctionUpload = async (data) => {
const date = new Date();
const timestamp = date.getTime();
const input = {
apiKey,
apiSecret,
key: `file-${timestamp}`,
data,
};
try {
const result = await fleek.upload(input);
console.log(result);
} catch(e) {
console.log('error', e);
}
}
// File management, not used in my svelte version to keep it simple
const filePath = path.join(__dirname, 'README.md');
fs.readFile(filePath, (err, data) => {
if(!err) {
testFunctionUpload(data);
}
})
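One browser-safe pattern worth sketching here: require doesn't exist in client-side code, but dynamic import() does, so the dependency can be loaded lazily inside the handler. In this self-contained sketch the fleek upload is stubbed with a hash over the payload (node:crypto stands in for the real @fleekhq/fleek-storage-js import, whose package name is assumed):

```javascript
// upload.js (sketch) -- dynamic import() instead of require()
async function uploadIPFS(data) {
  // In the real app this would be:
  //   const fleekStorage = (await import('@fleekhq/fleek-storage-js')).default;
  // node:crypto stands in so the sketch is runnable on its own.
  const { createHash } = await import("node:crypto");

  const key = `file-${Date.now()}`;
  // Pretend "upload": derive a content hash the way a store might key the data
  const digest = createHash("sha256").update(data).digest("hex");
  return { key, digest };
}

// usage (e.g. from a Svelte on:click handler):
// uploadIPFS("pauls test load").then(console.log);
```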
I'm trying to mock a function using Frisby and Jest.
Here are some details about my code:
dependencies
axios: "^0.26.0",
dotenv: "^16.0.0",
express: "^4.17.2"
devDependencies
frisby: "^2.1.3",
jest: "^27.5.1"
When I mock using Jest, the correct response from the API is returned, but that's not what I want. I want to return a fake result like this: { a: 'b' }.
How can I solve this?
I have the following code:
// (API Fetch file) backend/api/fetchBtcCurrency.js
const axios = require('axios');
const URL = 'https://api.coindesk.com/v1/bpi/currentprice/BTC.json';
const getCurrency = async () => {
const response = await axios.get(URL);
return response.data;
};
module.exports = {
getCurrency,
};
// (Model using fetch file) backend/model/cryptoModel.js
const fetchBtcCurrency = require('../api/fetchBtcCurrency');
const getBtcCurrency = async () => {
const responseFromApi = await fetchBtcCurrency.getCurrency();
return responseFromApi;
};
module.exports = {
getBtcCurrency,
};
// (My test file) /backend/__tests__/cryptoBtc.test.js
require("dotenv").config();
const frisby = require("frisby");
const URL = "http://localhost:4000/";
describe("Testing GET /api/crypto/btc", () => {
beforeEach(() => {
jest.mock('../api/fetchBtcCurrency');
});
it('Verify if returns correct response with status code 200', async () => {
const fetchBtcCurrency = require('../api/fetchBtcCurrency').getCurrency;
fetchBtcCurrency.mockImplementation(() => (JSON.stringify({ a: 'b'})));
const defaultExport = await fetchBtcCurrency();
expect(defaultExport).toBe(JSON.stringify({ a: 'b'})); // This assert works
await frisby
.get(`${URL}api/crypto/btc`)
.expect('status', 200)
.expect('json', { a: 'b'}); // Integration test with Frisby does not work correctly.
});
});
Response [
  {
    (I hid the lines to save screen space.)
  }
  ->>>>>>> does not contain provided JSON [ {"a":"b"} ]
];
This is a classic lost reference problem.
Since you're using Frisby, and judging by your test, it seems you're starting the server in parallel, correct? You first start your server with, say, npm start, then you run your tests with npm test.
The problem with that is: by the time your test starts, your server is already running. Since you started the server with the real fetchBtcCurrency.getCurrency, Jest can't do anything from that point on; your server will keep pointing at the real module, not the mocked one.
Check this illustration: https://gist.githubusercontent.com/heyset/a554f9fe4f34101430e1ec0d53f52fa3/raw/9556a9dbd767def0ac9dc2b54662b455cc4bd01d/illustration.svg
The reason the assertion on the import inside the test works is because that import is made after the mock replaces the real file.
You didn't share your app or server file, but if you create the server and start listening in the same module, with those calls "hanging on global" (i.e. made from the body of the script rather than inside a function), you'll have to split them: one file that creates the server (attaching any routes/middleware/etc. to it), and a separate file that just imports the first one and starts listening.
For example:
app.js
const express = require('express');
const { getCurrency } = require('./fetchBtcCurrency');
const app = express()
app.get('/api/crypto/btc', async (req, res) => {
const currency = await getCurrency();
res.status(200).json(currency);
});
module.exports = { app }
server.js
const { app } = require('./app');
app.listen(4000, () => {
console.log('server is up on port 4000');
});
Then your start script runs the server file, but your test imports the app file. You don't start the server in parallel; you start and stop it as part of the test setup/teardown.
This gives Jest the chance to replace the real module with the mocked one before the server starts listening (the point at which Jest loses control over it).
With that, your test could be:
cryptoBtc.test.js
require("dotenv").config();
const frisby = require("frisby");
const URL = "http://localhost:4000/";
const fetchBtcCurrency = require('./fetchBtcCurrency');
const { app } = require('./app');
jest.mock('./fetchBtcCurrency')
describe("Testing GET /api/crypto/btc", () => {
let server;
beforeAll((done) => {
server = app.listen(4000, () => {
done();
});
});
afterAll(() => {
server.close();
});
it('Verify if returns correct response with status code 200', async () => {
fetchBtcCurrency.getCurrency.mockImplementation(() => ({ a: 'b' }));
await frisby
.get(`${URL}api/crypto/btc`)
.expect('status', 200)
.expect('json', { a: 'b'});
});
});
Note that the order of imports doesn't matter: you can put the jest.mock call below the real import. Jest hoists mock declarations so they run first.
When I try executing this code I get the error "Error: EISDIR: illegal operation on a directory, read".
Line 18 : Column 19
const { Client, Intents, Collection } = require('discord.js')
const config = require('./config.json')
const fs = require('fs')
const bot = new Client({ intents: [ Intents.FLAGS.GUILDS, Intents.FLAGS.GUILD_MESSAGES ] })
bot.commands = new Collection()
var cmdFiles = fs.readFileSync('./cmd').filter(f => f.endsWith(".js"))
for(const f in cmdFiles) {
const cmd = require(`./commands/${f}`)
bot.commands.set(cmd.help.name, cmd)
}
bot.once("ready", () => {
console.log('Bot is ready!')
})
bot.on("messageCreate", async message => {
if(message.author.bot) return;
var prefix = config.prefix
if(!message.content.startsWith(prefix)) return;
var array = message.content.split(" ");
var command = array[0];
const data = bot.commands.get(command.slice(prefix.length))
if(!data) return;
var args = array.slice(1)
try {
await data.run(bot, message, args)
} catch(e) {
await message.channel.send(e)
await console.log(e)
}
})
bot.login(config.token)
Yes, all the config values are defined.
I've tried searching for this error but found nothing that helps.
What I want to do is load every file from the 'cmd' directory into an array and run a command when it is called.
Change this:
var cmdFiles = fs.readFileSync('./cmd').filter(f => f.endsWith(".js"));
to this:
var cmdFiles = fs.readdirSync('./cmd').filter(f => f.endsWith(".js"));
As your question states, ./cmd is a directory and you can't list the files in a directory with fs.readFileSync(). You would use fs.readdirSync() to do that.
fs.readFileSync() tries to open the directory as a file and read its contents. Since it's not a file, you get the EISDIR error.
I am trying to output the details of an audio file with ffmpeg using the ffprobe option, but at the moment it just returns null. I have added the ffmpeg layer in Lambda. Can anyone spot why this is not working?
const { spawnSync } = require("child_process");
const { readFileSync, writeFileSync, unlinkSync } = require("fs");
const util = require('util');
var fs = require('fs');
let path = require("path");
exports.handler = (event, context, callback) => {
spawnSync(
"/opt/bin/ffprobe",
[
`var/task/myaudio.flac`
],
{ stdio: "inherit" }
);
};
This is the official AWS Lambda layer I am using; it is a great project but a little lacking in documentation.
https://github.com/serverlesspub/ffmpeg-aws-lambda-layer
First of all, I would recommend using Node.js 8.10 over Node.js 6.10 (which will soon be EOL, although AWS is unclear on how long it will be supported).
Also, I would not use the old style handler with a callback.
A working example is below. Since it downloads a file from the internet (I couldn't be bothered to create a package to deploy on Lambda with the file uploaded), give it a bit more time to run.
const { spawnSync } = require('child_process');
const util = require('util');
var fs = require('fs');
let path = require('path');
const https = require('https');
exports.handler = async (event) => {
const source_url = 'https://upload.wikimedia.org/wikipedia/commons/b/b2/Bell-ring.flac'
const target_path = '/tmp/test.flac'
async function downloadFile() {
return new Promise((resolve, reject) => {
const file = fs.createWriteStream(target_path);
const request = https.get(source_url, function(response) {
const stream = response.pipe(file)
stream.on('finish', () => {resolve()})
});
});
}
await downloadFile()
const test = spawnSync('/opt/bin/ffprobe',[
target_path
]);
console.log(test.output.toString('utf8'))
const response = {
statusCode: 200,
body: JSON.stringify([test.output.toString('utf8')]),
};
return response;
}
NB! In production, be sure to generate a unique temporary file name, as the instances a Lambda function runs on are often shared from invocation to invocation; you don't want multiple invocations stepping on each other's files! When done, delete the temporary file, otherwise you might run out of free space on the instance executing your functions. The /tmp folder can hold 512 MB, so it can fill up fast if you work with many large flac files.
I'm not fully familiar with this layer, but from looking at the git repo of the thumbnail-builder it looks like the child_process call returns a promise, so you should wait for its result using .then(); otherwise it returns null because it doesn't wait for the result.
So try something like:
return spawnSync(
"/opt/bin/ffprobe",
[
`var/task/myaudio.flac`
],
{ stdio: "inherit" }
).then(result => {
return result;
})
.catch(error => {
//handle error
});