I want to read a symlink, and get the details of the link itself, not the contents of the linked file. How do I do that in Node, in a cross-platform way?
I can detect symlinks easily using lstat, no problem. Once I know the path of the file and that it is a symlink, though, how can I read it? fs.readFile always reads the target file, or throws an error about reading a directory when the link points to a directory.
There is a fs.constants.O_SYMLINK constant, which in theory solves this on OSX, but it seems to be undefined on both Ubuntu & Windows 10.
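For reference, the lstat-based detection mentioned above looks roughly like this (a minimal sketch; the path is hypothetical and matches the example used in the answer below):

const fs = require('fs');

// lstat reports on the link itself rather than following it to its target
fs.lstat('./mysimlink', (err, stats) => {
    if (err) throw err;
    console.log(stats.isSymbolicLink()); // true when the path is a symlink
});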
If you have determined that the file is a symlink, try this:
fs.readlink("./mysimlink", function (err, linkString) {
// .. do some error handling here ..
console.log(linkString)
});
Confirmed as working on Linux.
You could then use fs.realpath() to turn it into a full path. Be aware, though, that linkString can be just a filename or a relative path as well as a fully qualified path, so you may have to take the symlink's directory part and prefix it to linkString before calling fs.realpath() on the result.
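Putting that together, the resolution step might look roughly like this (a sketch, reusing the same hypothetical './mysimlink' path):

const fs = require('fs');
const path = require('path');

fs.readlink('./mysimlink', (err, linkString) => {
    if (err) throw err;
    // linkString may be relative, so resolve it against the symlink's directory
    const resolved = path.resolve(path.dirname('./mysimlink'), linkString);
    // realpath then canonicalises it into a full path
    fs.realpath(resolved, (err, fullPath) => {
        if (err) throw err;
        console.log(fullPath);
    });
});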
I've just faced the same issue: sometimes fs.readlink returns a relative path, sometimes it returns an absolute path.
(proper error handling not implemented to keep things simple)
const fs = require('fs');
const pathPckg = require('path');

async function getTarLinkOfSymLink(path){
    return new Promise((resolve, reject)=>{
        fs.readlink(path, (err, tarPath)=>{
            if(err){
                console.log(err.message);
                return resolve('');
            }
            const baseSrcPath = pathPckg.dirname(path);
            return resolve( pathPckg.resolve(baseSrcPath, tarPath) );
        });
    });
}
// usage:
const path = '/example/symbolic/link/path';
const tarPath = await getTarLinkOfSymLink(path);
The code works whether the symbolic link points to a file or a directory - tested on Linux.
Related
So let's say I have some code in JS:
const myApiKey = 'id_0001'
But instead of hardcoding it, I want to put it in some bash script with other env vars, read from it, and then replace it in the JS.
So let's say for prod I would read from prod-env.sh, or for dev I would read from dev-env.sh, and then gulp or some other tool does the magic and replaces MY_API_KEY based on whatever is established inside prod-env.sh or dev-env.sh:
const myApiKey = MY_API_KEY
Update: I want to add that I only care about unix OS, I'm not concerned about Windows. In golang there is a way to read, for example, envVars.get('MY_API_KEY'); I'm looking for something similar but for JS on the client side.
If you're using gulp, it sounds like you could use any gulp string replacer, like gulp-replace.
As for writing the gulp task(s): if you are willing to import the environment into your shell first, before running node, you can access it via process.env:
const gulp = require('gulp');
const replace = require('gulp-replace');

gulp.task('build', function(){
    gulp.src(['example.js'])
        .pipe(replace('MY_API_KEY', process.env.MY_API_KEY))
        .pipe(gulp.dest('build/'));
});
If you don't want to import the environment files before running node, you can use a library like env2 to read shell environment files.
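For example, env2's basic usage is roughly the following (a sketch; the './prod-env.sh' path is assumed, and the file is assumed to contain simple KEY=value style lines that env2 understands):

// Load the environment file into process.env before the gulp tasks run
// (sketch; the path and file format are assumptions)
require('env2')('./prod-env.sh');

console.log(process.env.MY_API_KEY);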
Another option would be to use js/json to define those environment files, and load them with require.
prod-env.js
module.exports = {
    MY_API_KEY: "api_key"
};
gulpfile.js
const myEnv = require('./prod-env');

gulp.task('build', function(){
    gulp.src(['example.js'])
        .pipe(replace('MY_API_KEY', myEnv.MY_API_KEY))
        .pipe(gulp.dest('build/'));
});
Also, for a more generic, loop-based version of the replace you can do:
gulp.task('build', function () {
    let stream = gulp.src(['example.js']);
    for (const key in process.env) {
        stream = stream.pipe(replace('${' + key + '}', process.env[key]));
    }
    stream.pipe(gulp.dest('build/'));
});
In that last example I added ${} around the environment variable name to make it less prone to accidents. So the source file becomes:
const myApiKey = ${MY_API_KEY}
This answer is an easy way to do this for someone who doesn't want to touch the code they are managing; for example, you are on the ops team but not the dev team and need to do what you are describing.
The environment variable NODE_OPTIONS can control many things about the node.js runtime - see https://nodejs.org/api/cli.html#cli_node_options_options
One such option we can set is --require, which allows us to run code before anything else is even loaded.
So using this you can create an overwrite.js file to perform this replacement on any non-node_modules script files:
const fs = require('fs');
const original = fs.readFileSync;
// set some custom env variables
// API_KEY_ENV_VAR - the value to set
// API_KEY_TEMPLATE_TOKEN - the token to replace with the value
if (!process.env.API_KEY_TEMPLATE_TOKEN) {
    console.error('Please set API_KEY_TEMPLATE_TOKEN');
    process.exit(1);
}
if (!process.env.API_KEY_ENV_VAR) {
    console.error('Please set API_KEY_ENV_VAR');
    process.exit(1);
}

fs.readFileSync = (file, ...args) => {
    if (file.includes('node_modules')) {
        return original(file, ...args);
    }
    const fileContents = original(file, ...args).toString(
        /* set encoding here, or let it default to utf-8 */
    );
    return fileContents
        .split(process.env.API_KEY_TEMPLATE_TOKEN)
        .join(process.env.API_KEY_ENV_VAR);
};
Then use it with a command like this:
export API_KEY_ENV_VAR=123;
export API_KEY_TEMPLATE_TOKEN=TOKEN;
NODE_OPTIONS="--require ./overwrite.js" node target.js
Supposing you had a script target.js
console.log('TOKEN');
It would log 123. You can use this pretty much universally with node, so it should work fine with gulp, grunt, or any others.
I am aware this isn't the first post about fs.unlink not working, but I'm very new to both Visual Studio and Node.js.
I want to delete a file in the working folder, but I get an error and the file is not deleted.
Here is what I tried:
var fs = require('fs');
fs.unlink('test1.txt');
PS: I installed the necessary Node.js components in VS.
As far as the code goes, you're not invoking fs.unlink properly. For starters, it's asynchronous. You will need to provide it a callback. See example here:
https://nodejs.org/api/fs.html#fs_fs_unlink_path_callback
Secondly, you need to provide it the full file path (note the escaped backslashes), not just the name of the file... i.e.:
var fs = require('fs');
fs.unlink('C:\\path\\to\\my\\file\\test1.txt', (err) => {});
You can also use the variable __dirname to build the path from the directory the script itself lives in, regardless of where you invoke node from. Thus, that would look something like:
let fs = require('fs');
let path = require('path');

fs.unlink(path.join(__dirname, 'test1.txt'), (err) => {
    if (err) throw err;
    console.log('test1.txt was deleted');
});
Currently, you can also invoke it synchronously using its single-parameter signature... thus you'd provide only the file path:
fs.unlinkSync('C:\\path\\to\\my\\file\\test1.txt');
But this is ill-advised, as it will be blocking. I'd only use the "sync" variant during some application bootstrapping process, where it'd be invoked only once or so, at startup. Try to resist the urge to use it just because it seems "easier" to use and understand, and instead get yourself comfortable with asynchronous logic.
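For reference, a promise-based sketch (assuming a Node version where fs.promises is available, and the same hypothetical test1.txt next to the script) achieves the same thing asynchronously:

const fs = require('fs');
const path = require('path');

// Delete the file next to this script and report the outcome
fs.promises.unlink(path.join(__dirname, 'test1.txt'))
    .then(() => console.log('test1.txt was deleted'))
    .catch((err) => console.error(err));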
I'm currently trying to search for a few files in a specific folder on Windows using node and grunt.
I have a grunt task with a function that reads a dir of JSON files, but the problem is that when I run the task, the code to read the files doesn't do anything; everything else in that grunt task runs fine, except that part. I'm not sure if the reference for the path is correct, but I'm also using path.normalize() and it does not throw any error.
This is a snippet of the code:
..// Some other code
var fs = require('fs'),
    path = require("path");

grunt.registerTask('separate', function() {
    var filePath = path.normalize("C:\Users\jbernhardt\Desktop\testkeeper\jenkinsReports");
    fs.readdir(filePath, function(err, filenames) {
        // This log doesn't show, as the function is not running
        grunt.log.writeln("Testing");
        if (err) {
            grunt.log.writeln("Error");
            return;
        }
        filenames.forEach(function(filename){
            grunt.log.writeln("Testing");
        });
    });
    // ...Some more code below for the same task
});
Does anyone have an idea why this snippet of the code is being skipped when I run the task? I could be missing something basic. Thanks!
Try readdirSync and check if your function is still not working. I guess your process finishes before the callback runs.
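A minimal sketch of that synchronous variant, reusing the filePath from the question (and assuming it is correctly escaped), could look like this:

grunt.registerTask('separate', function() {
    // readdirSync blocks until the listing is available, so the task
    // cannot finish before the file names have been read
    var filenames = fs.readdirSync(filePath);
    filenames.forEach(function(filename) {
        grunt.log.writeln(filename);
    });
});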
You can simply use the __dirname variable to get the path of the directory where the current script is located:
..// Some other code
var fs = require('fs'),
    path = require("path");

grunt.registerTask('separate', function() {
    fs.readdir(__dirname, function(err, filenames) {
        grunt.log.writeln("Testing");
        if (err) {
            grunt.log.writeln("Error");
            return;
        }
        filenames.forEach(function(filename){
            grunt.log.writeln("Testing");
        });
    });
    // ...Some more code below for the same task
});
You can find more info here.
You need to change your path:
var filePath = path.normalize("C:\\Users\\jbernhardt\\Desktop\\testkeeper\\jenkinsReports");
Also, to achieve consistent results when working with Windows file paths on any operating system, use path.win32:
path.win32.basename('C:\\Users\\jbernhardt\\Desktop\\testkeeper\\jenkinsReports');
You can read more at https://nodejs.org/api/path.html#path_windows_vs_posix
The backslashes in the path need to be escaped.
"C:\\Users\\jbernhardt\\Desktop\\testkeeper\\jenkinsReports"
should solve your issue.
I am using the following code to take screenshots (in an afterEach) when a test fails in Protractor:
function failScreenshot() {
    var fs = require('fs');
    var spec = jasmine.getEnv().currentSpec;
    var specName = spec.description.split(' ').join('_');
    if (spec.results().passed()) {
        return;
    } else {
        browser.takeScreenshot().then(
            function(png) {
                var stream = fs.createWriteStream('screenshots/' + specName + '.png');
                stream.write(new Buffer(png, 'base64'));
                stream.end();
            });
    }
}
When I am running the tests locally, the screenshot works just as expected. When running the tests via Jenkins, the tests stop at the first failure and the screenshot is not created. Also, the folders and paths are correct, I have checked them over and over again. My Jenkins version is 1532.1.
Any ideas on how I could solve this issue?
After reading further documentation I have found the answer. It was a problem with the path. It seems Node.js does not resolve the path the way I thought.
./ refers to the current working directory, except in the require() function. When using require(), ./ is resolved relative to the directory of the file in which it is called (obviously, the mistake was here). __dirname is always the directory of the file in which it is used.
The code to be used for my path is the following:
__dirname + '/screenshots/' + specName + '.png'
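Applied to the original snippet, the write-stream lines become something like this (a sketch, keeping the rest of the callback unchanged):

// Build the path from the script's own directory instead of the working directory
var stream = fs.createWriteStream(__dirname + '/screenshots/' + specName + '.png');
stream.write(new Buffer(png, 'base64'));
stream.end();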
You can also take the screenshots in Jenkins by using the mocha-proshot reporter.
It is an npm package which can be downloaded easily and is very easy to set up.
Whenever I run vm.runInThisContext(code, filename), the code I ran reports __filename and __dirname as undefined.
This also means that fs.readFile and similar calls will not work with relative paths. Actually, to be exact, the file system functions do not work at all, even if I feed them a hard-coded absolute path to an existing file.
For example, this will do nothing:
var fs = require('fs');
fs.readFile('/home/test/file.txt', function(e, data) {
    if (e) { throw e; }
    console.log('here i am');
});
Nothing happens at all. If I run the code as normal Node.js code, it outputs "here i am", but if I run that code through the vm module, nothing happens. The callback is simply never called, because for some reason it can't locate the file, and there does not seem to be any timeout either.
How can I make Node understand that the executed code is some "file", and also make the fs module functions work? I tried specifying the second parameter to vm.runInThisContext(code, filename), but I see no difference. It almost looks like Node doesn't care about the second parameter.
I'm not exactly sure how I even got my code examples to work before, because right now they do not work at all.
I found out that you can use vm.runInNewContext(code, sandbox, filename) and then specify require, __filename and whatever you need in the sandbox:
// Place here some path to test it. I'm using Cygwin...
var filename = '/cygdrive/z/www/project/src/bootstrap.js';
var code = "var fs = require('fs'); fs.readFile('./path/to/some/file.js', function(e, data) {if (e) {throw e;} console.log(data.toString());});";
var vm = require('vm');
vm.runInNewContext(code, {
    require: require,
    console: console,
    __filename: filename
}, filename);
Then if I run node bootstrap.js --debug it works fine!