Is it possible to return the contents of a static path to a directory instead of using an …?
I want to write a script that reads the contents of a directory on the file system at a given time daily. This is integrated in a webapp I can't edit.
Short answer: you can't.
You need to do this server-side. Here is an answer from a similar question, using Node.js.
You can use the fs.readdir or fs.readdirSync methods.
fs.readdir
const testFolder = './tests/';
const fs = require('fs');

fs.readdir(testFolder, (err, files) => {
  files.forEach(file => {
    console.log(file);
  });
});
fs.readdirSync
const testFolder = './tests/';
const fs = require('fs');

fs.readdirSync(testFolder).forEach(file => {
  console.log(file);
});
The difference between the two methods is that the first one is asynchronous, so you have to provide a callback function that is executed when the read process ends.
The second is synchronous: it returns the file-name array directly, but it stops any further execution of your code until the read process ends.
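If you prefer promises, here is a minimal sketch using the promise-based fs.promises.readdir (available since Node 10), which avoids both the callback and the blocking; the listFiles helper name is just for illustration:

const fs = require('fs').promises;

// List every entry in a directory without blocking the event loop
async function listFiles(dir) {
  const files = await fs.readdir(dir);
  for (const file of files) {
    console.log(file);
  }
}

listFiles('./tests/').catch(console.error);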
I am aware this isn't the first post about fs.unlink not working, but I'm very new to both Visual Studio and Node.js.
I want to delete a file in the working folder, but I get an error and the file is not deleted.
Here is what I tried:
var fs = require('fs');
fs.unlink('test1.txt');
PS: I installed the necessary Node.js components in VS.
As far as the code goes, you're not invoking fs.unlink properly. For starters, it's asynchronous, so you need to provide it a callback. See the example here:
https://nodejs.org/api/fs.html#fs_fs_unlink_path_callback
Secondly, you need to provide it the full file path, not just the name of the file, i.e.:
var fs = require('fs');

// Backslashes in a string literal must be escaped with another backslash
fs.unlink('C:\\path\\to\\my\\file\\test1.txt', (err) => {});
You can also use the variable __dirname, which resolves to the directory of the currently executing script, regardless of where you invoke node from. That would look something like:
let fs = require('fs');
let path = require('path');

// Note the closing parenthesis after 'test1.txt': the callback is
// an argument to fs.unlink, not to path.join.
fs.unlink(path.join(__dirname, 'test1.txt'), (err) => {
  if (err) throw err;
  console.log('test1.txt was deleted');
});
You can also invoke it synchronously using its single-parameter signature, in which case you provide only the file path:
fs.unlinkSync('C:\\path\\to\\my\\file\\test1.txt');
But this is ill-advised, as it blocks the event loop. I'd only use the "sync" variant during some application bootstrapping process, where it is invoked only once, at startup. Try to fight the urge to use it because it seems "easier", and instead get yourself comfortable with asynchronous logic.
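For completeness, a minimal sketch of the same deletion using the promise-based API (fs.promises, Node 10+); the removeFile wrapper is illustrative, not part of the original answer:

const fs = require('fs').promises;
const path = require('path');

// Deletes test1.txt from the script's own directory
async function removeFile() {
  await fs.unlink(path.join(__dirname, 'test1.txt'));
  console.log('test1.txt was deleted');
}

removeFile().catch(console.error);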
I have an express server.
server.js
const express = require('express');
const app = express();
var json = require("./sample.js");

app.use("/", (req, res) => {
  console.log("----------->", JSON.stringify(json));
  res.status(200).send(JSON.stringify(json));
});

app.listen(2222, () => {
  console.log(`Listening on port localhost:2222/ !`);
});
sample.js
var offer = {
  "sample": "Text",
  "ting": "Toing"
};

module.exports = offer;
Once I execute the server.js file, it fetches the JSON data from the sample.js file. If I update the data in sample.js while server.js is still running, I don't get the updated data. Is there any way to do this without stopping the execution of server.js?
Yes, there is a way: you have to read the file every time a request occurs (or cache it for a while, for somewhat better performance).
The reason why require does not work is that Node.js automatically caches modules for you. So even if you required it inside the request handler (inside the use call), it wouldn't work.
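A quick way to see that module cache in action (demo.js is a hypothetical file name):

// demo.js -- run with: node demo.js
const a = require('./sample.js');
const b = require('./sample.js');
// Both requires return the very same cached object, so changes
// on disk are never picked up while the process keeps running.
console.log(a === b); // true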
Because you cannot use require, it won't be convenient (or performant) to use a module. Your file should therefore be in JSON format instead:
{
  "sample": "Text",
  "ting": "Toing"
}
To read it, you have to use the fs (file system) module. This allows you to read the file from disk every time:
const fs = require('fs');

app.get("/", (req, res) => {
  // To read as a text file, you have to specify the correct encoding.
  fs.readFile('./sample.json', 'utf8', (err, data) => {
    if (err) return res.status(500).send(err.message);
    // You should always specify the content type header
    // when you don't use 'res.json' for sending JSON.
    res.set('Content-Type', 'application/json');
    res.send(data);
  });
});
It is important to know that data is now a string, not an object; you would need JSON.parse() to get an object.
Also, use is not recommended in this case: it is for middleware. Consider using get (as in my example), or all if you want to handle any HTTP verb.
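As a follow-up to the caching idea mentioned at the start of this answer, here is a minimal sketch of a time-based cache; the TTL_MS value and the readSample helper are illustrative assumptions, not part of the original answer:

const fs = require('fs');

const TTL_MS = 5000; // assumption: re-read from disk at most every 5 seconds
let cached = null;
let cachedAt = 0;

function readSample(callback) {
  // Serve the in-memory copy while it is still fresh
  if (cached !== null && Date.now() - cachedAt < TTL_MS) {
    return callback(null, cached);
  }
  fs.readFile('./sample.json', 'utf8', (err, data) => {
    if (err) return callback(err);
    cached = data;
    cachedAt = Date.now();
    callback(null, data);
  });
}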
You need to read the file at runtime:
const fs = require('fs');

function getJson(callback) {
  fs.readFile('sample.json', 'utf8', (err, jsonData) => {
    if (err) {
      console.log('error reading sample.json ', err);
      return callback(err);
    }
    // You cannot 'return' a value out of an asynchronous read;
    // hand the data back through the callback instead.
    callback(null, jsonData);
  });
}
Make sure your sample.js is replaced by a file containing just a JSON object (e.g. sample.json).
I have an app with some products, and each product has a gallery with a different number of images. Each of the images has a completely random name with no correlation to the other image names.
Each product's images are in /src/assets/images/products/:id/.
I need to add the paths to a gallery component, but I can't loop through them because the names are random. Is there any way to just loop through each file from a folder using only Angular? If not, can I do it on the back-end without renaming the files? I'm also running the app on a Node.js back-end, if that matters.
You can't do that from the frontend.
What you need to do is use your back-end and return the files from it.
Since you are using Node.js as the back-end, you can use the fs.readdir or fs.readdirSync methods.
fs.readdir
const testFolder = './images/';
const fs = require('fs');

fs.readdir(testFolder, (err, files) => {
  files.forEach(file => {
    console.log(file); // use these files and return them as a REST API
  });
});
fs.readdirSync
const testFolder = './images/';
const fs = require('fs');

fs.readdirSync(testFolder).forEach(file => {
  console.log(file);
});
Read the full documentation; it may help you figure out how to proceed.
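To make the REST API idea concrete, here is a minimal sketch of an Express route that returns a product's image file names as JSON; the route path, port, and folder location are assumptions for illustration:

const express = require('express');
const fs = require('fs');
const path = require('path');

const app = express();
// assumption: images live under this folder, one subfolder per product id
const imagesFolder = path.join(__dirname, 'src/assets/images/products');

// GET /products/:id/images -> JSON array of file names for that product
app.get('/products/:id/images', (req, res) => {
  fs.readdir(path.join(imagesFolder, req.params.id), (err, files) => {
    if (err) return res.status(500).json({ error: err.message });
    res.json(files);
  });
});

app.listen(3000);

The Angular gallery component can then request that endpoint and build the image URLs from the returned names.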
In the following path:
app/test/vehicle/
there are several .js files. Is there any way in JavaScript to get the number of .js files contained in the above-mentioned path?
I am using the WebStorm environment.
Yes, using Node:
const fs = require('fs');
const dir = './directory';

fs.readdir(dir, (err, files) => {
  console.log(files.length); // counts every entry, not just .js files
});
https://stackoverflow.com/a/43747896/3650835
Since JavaScript doesn't have local filesystem access when run in the browser, there is no way to access local files from JavaScript. If you are asking about accessing all files hosted on a CDN or something similar in that directory, there is still no way to tell which requests would return a 20x.
If you are using Node.js, use the fs.readdir function:
const fs = require('fs');

fs.readdir('app/test/vehicle/', (err, files) => {
  files.forEach(file => { console.log(file); });
});
There is no way to access the file system from a web page.
If you are using Node.js, then you can use fs.readdir or fs.readdirSync:
const fs = require('fs');
const dir = './directory';

fs.readdir(dir, (err, files) => {
  console.log(files.filter(file => file.endsWith(".js")).length);
});
Where are you trying to get the list of files? On the client side it is impossible: JavaScript does not have access to the client's file system. If you are trying to do it on the server side using, for example, Node.js, then you can do something like:
var filesys = require('fs');
var files = filesys.readdirSync('/app/test/vehicle/');
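Combining the pieces above, a minimal sketch that counts only the .js files synchronously; the countJsFiles helper name is illustrative:

const fs = require('fs');
const path = require('path');

// Count directory entries whose extension is exactly '.js'
function countJsFiles(dir) {
  return fs.readdirSync(dir)
    .filter(file => path.extname(file) === '.js')
    .length;
}

console.log(countJsFiles('app/test/vehicle/'));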
I'm trying to list the files in the S:/test folder, which is on my network (it's not a local directory). I was wondering how to do this? The code so far looks like this:
const testFolder = 's:/test';
const fs = require('fs');

fs.readdir(testFolder, (err, files) => {
  files.forEach(file => {
    console.log(file);
  });
});
I've tried changing the path to S:test and s:\test to no avail; the error is always "Cannot read 'forEach' of undefined".
If this is Windows (which I assume it is), then you need to do a couple of things:
1. Use the full path (drive letters are OK, but you can also use UNC paths in Windows) to the desired directory.
2. Escape any backslashes in a string literal with an extra backslash.
3. Always use error handling in your fs.readdir() callback so that, if there is an error, you can see exactly what it is.
Working code:
const fs = require('fs');
const testFolder = 's:\\test';

fs.readdir(testFolder, (err, files) => {
  if (err) return console.log(err);
  files.forEach(file => {
    console.log(file);
  });
});
I just tried this code on my own hard drive and it works just fine.
And, FYI, I pretty much always use the ES6 for/of loop in modern Node.js now rather than .forEach(), because it's more efficient for the interpreter and gives you more loop control (for example, you can use break to exit the loop).
const testFolder = 's:\\test';
const fs = require('fs');

fs.readdir(testFolder, (err, files) => {
  if (err) return console.log(err);
  for (let file of files) {
    console.log(file);
  }
});