I created a plugin in plugins/index.js
const os = require('os');

module.exports = (on, config) => {
  config.env.testing_computer = os.hostname();
  return config;
};
Basically, I want to store the hostname in the environment variable testing_computer.
However, when I later try to access it in a custom command via Cypress.env("testing_computer"), I get an empty string. How can I get the actual value?
Fixed. I cleared the rest of the file and left only the new plugin. I think the <references... line at the top may have been causing trouble, though I'm not sure.
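For reference, a minimal sketch of reading the value back in a custom command (the command name logHost is hypothetical):

// cypress/support/commands.js
Cypress.Commands.add('logHost', () => {
  const host = Cypress.env('testing_computer'); // set by the plugin above
  cy.log(`Running on: ${host}`);
});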
I am trying to call an API, loop through an array of images, assign unique names to each image in the array, and then write them to a local directory. If I simplify the code, I can write them to the root folder. I had already created the subfolder manually, so it existed prior to running the function.
Here is my basic function:
const fs = require('fs');
// `client` is not shown in the original snippet; assumed to be an
// http/https client, e.g. const client = require('https');
const imageFolder = './img';

function downloadImage(url, filepath) {
  client.get(url, res => {
    res.pipe(fs.createWriteStream(`${imageFolder}/${filepath}`));
  });
}
...make api call
const imagesArray = generations.data.map(item => item.generation.image_path);

imagesArray.forEach(item => {
  // const fileName = uuid.v4() + '.webp'; // trying to assign a unique filename with uuid
  const fileName = new Date().getTime().toString() + '.webp'; // trying to assign a unique filename with a Date object
  downloadImage(item, fileName);
});
If I change
res.pipe(fs.createWriteStream(`${imageFolder}/${filepath}`));
to
res.pipe(fs.createWriteStream(filepath));
then it works, but just dumps the images in the root. I wondered whether the problem was concatenating a variable with a string (fileName + '.webp'), but that works when writing to the root, as mentioned.
I also tried adding the path into the actual function call inside the forEach loop like so
downloadImage(item, `${imageFolder}/${fileName}`);
I did wonder about needing the __dirname variable, or whether it could be a permissions issue, but I don't see any errors.
I am assuming this is pretty straightforward.
OK, this was fairly simple, and I guess I sort of knew it once I got it working. Changing to
downloadImage(item, path.join('src', 'img', fileName));
path.join concatenates path segments and avoids separator issues when working across platforms (macOS, Windows, etc.), which applies in this case since I am testing from both Windows and Mac.
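As a variation (an assumption beyond the original fix, not something the answer requires), resolving from __dirname also makes the path independent of the directory the process is launched from:

const path = require('path');

// Build a platform-correct path relative to this module's directory
// rather than the process's current working directory:
const target = path.join(__dirname, 'img', fileName);
downloadImage(item, target);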
I have a local JSON file which I intend to read/write from a Node.js Electron app. I am not sure, but I believe that instead of using readFile() and writeFile(), I should get a FileHandle to avoid multiple open and close actions.
So I've tried to grab a FileHandle from fs.promises.open(), but the problem seems to be that I am unable to get a FileHandle for an existing file without truncating it to zero length.
const { resolve } = require('path');
const fsPromises = require('fs').promises;

function init() {
  // Save table name
  this.path = resolve(__dirname, '..', 'data', 'test.json');

  // Create/Open the json file
  fsPromises
    .open(this.path, 'wx+')
    .then(fileHandle => {
      // Grab the file handle if the file doesn't exist yet,
      // because of the 'wx+' flag
      this.fh = fileHandle;
    })
    .catch(err => {
      if (err.code === 'EEXIST') {
        // File exists
      }
    });
}
Am I doing something wrong? Are there better ways to do it?
Links:
https://nodejs.org/api/fs.html#fs_fspromises_open_path_flags_mode
https://nodejs.org/api/fs.html#fs_file_system_flags
Because JSON is a text format that has to be read or written as a whole and can't easily be modified or appended to in place, you're going to have to read the whole file or write the whole file at once.
So, your simplest option is to just use fs.promises.readFile() and fs.promises.writeFile() and let the library open the file, read/write it, and close it. Opening and closing a file in a modern OS takes advantage of disk caching, so reopening a file you opened not long ago is not a slow operation. Further, since Node.js performs these operations in secondary threads in libuv, they don't block the main thread either, so this is generally not a performance issue for your server.
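For illustration, a minimal sketch of that approach (the helper names load and save are hypothetical):

const fsPromises = require('fs').promises;

async function load(path) {
  // readFile opens, reads and closes the file in one call
  const text = await fsPromises.readFile(path, 'utf8');
  return JSON.parse(text);
}

async function save(path, obj) {
  // writeFile replaces the whole file contents in one call
  await fsPromises.writeFile(path, JSON.stringify(obj, null, 2));
}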
If you really wanted to open the file once and hold it open, you would open it for reading and writing using the r+ flag as in:
const fileHandle = await fsPromises.open(this.path, 'r+');
Reading the whole file is simple, as the fileHandle object has a .readFile() method.
const text = await fileHandle.readFile({encoding: 'utf8'});
For writing the whole file from an open filehandle, you would have to truncate the file, then write your bytes, then flush the write buffer to ensure the last bit of the data got to the disk and isn't sitting in a buffer.
const assert = require('assert');

await fileHandle.truncate(0); // clear previous contents
let { bytesWritten } = await fileHandle.write(mybuffer, 0, someLength, 0); // write new data
assert(bytesWritten === someLength);
await fileHandle.sync(); // flush buffering to disk
I have a (Twilio) API call which requires credentials accountSid and authToken.
twilio.js
const twilio = require('twilio');
const accountSid = require('./auth/twilio_credentials');
const authToken = require('./auth/twilio_credentials');
console.log('accountSid: ' + accountSid);
console.log('authToken: ' + authToken);
module.exports = new twilio.Twilio(accountSid, authToken);
For security, instead of pasting the values directly into the code, I have them in a separate file. The credentials are used in a file at the same level as the auth folder.
auth/twilio_credentials.js
module.exports = accountSid = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx';
module.exports = authToken = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx';
From the console logs (and using typeof), I was able to verify that the strings are imported properly up to that point, but when I run the code, I get throw new Error('accountSid is required');. However, it works when I paste the values directly into the file.
I feel like this is a wonky JavaScript thing that I'm missing. What's the difference between importing the string value from a different file, versus directly using a hard-coded value?
The problem is that each module.exports = ... assignment replaces the entire exports object, so your second line overwrites the first, and both require calls return the same string (the authToken). Try exporting them like this:
exports.accountSid = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx';
exports.authToken = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx';
And then, import them like this:
const {accountSid, authToken} = require("./auth/twilio_credentials");
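Equivalently, you could export a single object, which the destructuring import above unpacks:

// auth/twilio_credentials.js
module.exports = {
  accountSid: 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx',
  authToken: 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
};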
I suggest you check how module.exports and exports work in Node.js here:
What is the purpose of Node.js module.exports and how do you use it?
I am using lockfile.lockSync to lock a file in Node.js, but I'd like to know the complete mechanism of this utility. So far every website says that it's a "very polite lock file utility", but none explains its internal mechanism.
Any suggestions?
NODE FS
To understand file options in Node.js, take a look at the file system flags documentation (#fs_file_system_flags) and the Linux open() syscall, which cover options like
O_TRUNC - truncate existing file
O_CREAT - create if not exists
O_WRONLY - access mode, write-only
O_EXCL - ensure that this call creates the file. In combination with O_CREAT, triggers an EEXIST error if the file exists.
But the fs module doesn't offer a file-locking alternative similar to "flock".
NPM LOCKFILE INVESTIGATION
I tried to use the npm library called "lockfile", as there are too many copy/pasted examples with it.
Verdict: the "lockfile" npm library, in its current implementation (v1.0.4), is incomplete/useless for this!
From the source:
exports.lockSync = function (path, opts) {
  opts = opts || {}
  opts.req = opts.req || req++
  debug('lockSync', path, opts)
  if (opts.wait || opts.retryWait) {
    throw new Error('opts.wait not supported sync for obvious reasons')
  }
  try {
    var fd = fs.openSync(path, wx)
    locks[path] = fd
    try { fs.closeSync(fd) } catch (er) {}
    debug('locked sync!', path, fd)
    return
So those five lines of actual code in the try {} block just openSync in "wx" mode, save the descriptor to "locks[path]", and closeSync.
If the file exists, it fails, due to "wx"!
YOU CAN'T LOCK AN EXISTING FILE with the "lockfile" library!
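A quick demonstration of that behaviour (the file name is illustrative):

const fs = require('fs');
try {
  // 'wx': open for writing, but fail if the path already exists (O_CREAT | O_EXCL)
  const fd = fs.openSync('existing.txt', 'wx');
  fs.closeSync(fd); // file was created fresh; no error
} catch (er) {
  console.log(er.code); // 'EEXIST' when existing.txt is already there
}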
Finally, it tries to call "unlockSync" on process exit, with this code:
exports.unlockSync = function (path) {
  debug('unlockSync', path)
  // best-effort. unlocking an already-unlocked lock is a noop
  try { fs.unlinkSync(path) } catch (er) {}
  delete locks[path]
}
And yes, it will DELETE your file after the process exits!
This is not file locking mechanics!
ANY RELIABLE SOLUTION FOR NODEJS FILE LOCK?
I've found that fs-ext works just perfectly! Open two tabs with a node process in each and check what locking a file means:
tab1
const fs = require("fs");
const {flockSync} = require('fs-ext');
const fd = fs.openSync('1.txt', 'w');
flockSync(fd, 'ex');
tab2
const fs = require("fs");
const {flockSync} = require('fs-ext');
const fd = fs.openSync('1.txt', 'r');
flockSync(fd, 'sh'); // PENDING!
In tab2, flockSync(fd, 'sh'); will block until flockSync(fd, 'un'); is called in tab1!
It really works!
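For completeness, releasing the lock in tab1 looks like this (note that closing the descriptor also releases a flock-style lock):

// tab1 (continued)
flockSync(fd, 'un'); // tab2's pending flockSync(fd, 'sh') now succeeds
fs.closeSync(fd);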
I need to parse a JSON object when node server.js (which is my entry point to the program) is started; the parsing of the JSON file is done in a different module in my project.
I have two questions:
Is it recommended to invoke the parse function with an event in the server.js file?
I read about the EventEmitter, but I'm not sure how to invoke a function
from a different module... an example would be very helpful.
I have multiple JSON files.
UPDATE to make it clearer
If I read 3 JSON files (50 lines each) when the server/app is loaded (the server.js file), this will be fast, I guess. My scenario is that the list of valid paths for the Express calls is in these JSON files:
app.get('/run1', function (req, res) {
  res.send('Hello World!');
});
So run1 should be defined in the JSON file (like a whitelist of paths). If a user requests run2, which I haven't defined, I need to return an error. So I think the better approach is to parse the JSON when the server starts, keep the object with all the configured valid paths, and when a user makes a call just check the already-parsed object to verify the path is OK, instead of doing the parsing on every call.
UPDATE 2
I'll try to explain more simply.
Let's assume that you have a whitelist of paths which you should listen to,
like run1:
app.get('/run1', function
Those paths are defined in JSON files inside your project, under a specific folder. Before every call to your application via Express, you should verify that the requested path is in the path list from the JSON files. That part is a given; now, how to do it?
Currently I've developed a module which scans the JSON files and checks whether a specific path exists there.
Now I think the right solution is, when the node application starts, to invoke this functionality and keep the list of valid paths in some object which I can access easily during a user call, to check whether the path is there.
My question is how to send some event to the validator module when the node app (server.js) is up, to provide this object.
If it's a part of your application initialization, then you could read and parse this JSON file synchronously, using either fs.readFileSync and JSON.parse, or require:
var config = require('path/to/my/config.json');
Just make sure that the module handling this JSON loading is required in your application root before the app.listen call.
In this case, the JSON data will be loaded and parsed by the time your server starts, and there will be no need to trouble yourself with callbacks or event emitters.
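For example, the fs.readFileSync variant looks like this (using the same illustrative path as the require example above):

const fs = require('fs');

// Read and parse synchronously at startup, before app.listen is called
const config = JSON.parse(fs.readFileSync('path/to/my/config.json', 'utf8'));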
I can't see any benefits of loading your initial config asynchronously for two reasons:
The bottleneck of JSON parsing is the parser itself, but since it's synchronous, you won't gain anything here. So, the only part you'll be able to optimize is interactions with your file system (i.e. reading data from disk).
Your application won't be able to work properly until this data is loaded.
Update
If for some reason you can't make your initialization synchronous, you could delay starting your application until initialization is done.
The easiest solution here is to move app.listen part inside of initialization callback:
// initialization.js
var glob = require('glob')
var path = require('path')

module.exports = function initialization (done) {
  var data = {}
  glob('./config/*.json', function (err, files) {
    if (err) throw err
    files.forEach(function (file) {
      var filename = path.basename(file)
      data[filename] = require(file)
    })
    done(data)
  })
}
// server.js
var initialization = require('./initialization')
var app = require('express')()

initialization(function (data) {
  app.use(require('./my-middleware')(data))
  app.listen(8000)
})
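For completeness, a hypothetical sketch of what './my-middleware' could look like for the whitelist check described in the question (the JSON structure, a paths array per file, is an assumption):

// my-middleware.js
module.exports = function (data) {
  // Flatten every config file's assumed `paths` array into one whitelist
  var validPaths = Object.keys(data).reduce(function (acc, file) {
    return acc.concat(data[file].paths || [])
  }, [])
  return function (req, res, next) {
    if (validPaths.indexOf(req.path) === -1) {
      return res.status(404).send('Unknown path')
    }
    next()
  }
}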
An alternative solution is to use a simple event emitter to signal that your data is ready:
// config.js
var glob = require('glob')
var path = require('path')
var events = require('events')

var obj = new events.EventEmitter()
obj.data = {}

glob('./config/*.json', function (err, files) {
  if (err) throw err
  files.forEach(function (file) {
    var filename = path.basename(file)
    obj.data[filename] = require(file)
  })
  obj.emit('ready')
})

module.exports = obj
// server.js
var config = require('./config')
var app = require('express')()

app.use(require('./my-middleware'))

config.on('ready', function () {
  app.listen(8000)
})