Express.js: Serving path provided by user

So everyone is familiar with serving static files from a hard-coded path such as /public. How do I do that but serve files from a path provided as input from the user?
Use case: This is a local server and so after running the server, the user selects which folder to serve so that it can be available across the local network for access. I'm looking to create a video streaming API which will play videos served on the path selected by the user.
Currently, I list the items in the path like so:
const Promise = require('bluebird');
const path = require('path');
const fs = Promise.promisifyAll(require('fs'));

async function listDir(dir) {
  const list = [];
  const dirItems = await fs.readdirAsync(dir); // get all items in the directory
  for (const item of dirItems) {
    const absPath = path.join(dir, item);
    const stat = await fs.lstatAsync(absPath);
    const isFile = stat.isFile(); // check if item is a file
    list.push({ name: item, isFile }); // return item name and isFile boolean
  }
  return list;
}
Do I then just do pattern-matching in another route to return whichever file is asked for inside the dir?
Am I going about this the right way? Any suggestion/answer is appreciated.
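One way to approach it (a sketch under assumptions, not a full answer): keep the user's selection in a variable and resolve every requested path against it, refusing anything that escapes the chosen folder. The /serve endpoint and servedRoot variable below are hypothetical names, not from the question:
const express = require('express');
const path = require('path');
const app = express();

let servedRoot = null; // chosen by the user at runtime

// Hypothetical endpoint where the user selects the folder to share
app.post('/serve', express.json(), (req, res) => {
  servedRoot = path.resolve(req.body.dir);
  res.sendStatus(200);
});

app.get('/files/*', (req, res) => {
  if (!servedRoot) return res.status(400).send('No folder selected');
  const requested = path.resolve(servedRoot, req.params[0]);
  // reject paths that escape the selected folder (e.g. ../../etc/passwd)
  if (!requested.startsWith(servedRoot + path.sep)) return res.sendStatus(403);
  // sendFile streams the file and honours HTTP Range requests,
  // which is what a video player needs for seeking
  res.sendFile(requested);
});

app.listen(3000);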

Related

How to loop through the folder with files?

How can I loop through a folder that has subfolders and retrieve all files with the extension '.element.ts'?
const fs = require('fs');
const files = fs.readdirSync('packages/web-components/src');
// the json result that will be generated
let content = [];
files.forEach(file => {
  if (file === '???') // how do I match files ending in '.element.ts' here?
    content.push(file);
});
const fs = require('fs');

function getAllFiles(dir, allFilesList = []) {
  const files = fs.readdirSync(dir);
  files.forEach(file => {
    const name = dir + '/' + file;
    if (fs.statSync(name).isDirectory()) { // check if subdirectory is present
      getAllFiles(name, allFilesList); // do recursive execution for subdirectory
    } else {
      allFilesList.push(name); // push filename into the array
    }
  });
  return allFilesList;
}

const allFiles = getAllFiles('./testfolder');
const fileEndsWith = allFiles.filter(file => file.endsWith('.element.ts'));
console.log(fileEndsWith);
Easy way: use the glob package.
const glob = require("glob");
const pattern = "packages/web-components/src/*.element.ts"
const elementTsFilenames = glob.sync(pattern);
Manual way:
const fs = require("fs");
const dir = "packages/web-components/src";
const extension = ".element.ts";
const elementTsFilenames = fs.readdirSync(dir).filter(fn => fn.endsWith(extension));
For a flat directory like this, the manual way is as easy or easier; if you have more complex requirements (e.g. recursively searching subdirectories), a library approach is nice.
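For instance, if the files were nested in subdirectories, glob's ** pattern would handle the recursion (same glob package as above):
// '**' matches any depth of subdirectories
const nested = glob.sync("packages/web-components/src/**/*.element.ts");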
I'll show you how to recursively get all the files in a directory (even those located in a subdirectory).
To do this, we need to create a recursive function that can call itself when dealing with sub-directories. We also need the function to go through each of the sub-directories and add any new files it encounters. Then we also need to check whether the filename ends with a specific extension. When the function is finished, it should return an array with all the files it encountered.
Here's what the recursive function looks like:
const fs = require("fs")
const path = require("path")
const getAllFiles = function(dirPath, extension, arrayOfFiles) {
files = fs.readdirSync(dirPath);
arrayOfFiles = arrayOfFiles || [];
files.forEach(function(file) {
if (fs.statSync(dirPath + "/" + file).isDirectory()) {
arrayOfFiles = getAllFiles(dirPath + "/" + file, arrayOfFiles);
} else if (file.endsWith(extension)){
arrayOfFiles.push(path.join(__dirname, dirPath, "/", file));
}
});
return arrayOfFiles;
}
First, we require() the Node.js path module. Since this is
included with Node.js, you don't need to install anything for it to
work. This module will help us easily create full file paths for our
files.
The getAllFiles variable holds the recursive function that will go
through each subdirectory and return an array of filenames. It takes a
directory file path, a file extension, and an optional arrayOfFiles as arguments.
Inside the getAllFiles function, we first use the readdirSync()
function to get all of the files and directories inside the given
dirPath supplied to the function.
Then, we create an arrayOfFiles that will hold all the filenames
that will be returned when the function is done running.
Next, we loop over each item (file or directory) found by the
readdirSync() function. If the item is a directory, we have the
function recursively call itself to get all of the files and
sub-directories inside the given directory.
And if the item is a file, we simply append the file path to the
arrayOfFiles array (but only when the file name ends with the given
extension).
When the forEach loop has finished, we return the arrayOfFiles
array.
Here is how you use the function in your code:
const result = getAllFiles("packages/web-components/src", ".element.ts");
I don't think you need an npm package for this: It's not too hard to walk the file system using an async iterator and filter the results based on something like a regular expression.
Another bonus of an async technique is that it doesn't block your thread while it iterates the files (other work can be done in between each result while it's searching), especially if you have a lot of sub-directories/files to search through.
If you want to reduce your project's dependencies, you can do something like this:
example.mjs:
import * as path from 'node:path';
import {readdir} from 'node:fs/promises';
/** Search all subdirectories, yielding matching file entries */
export async function* findFiles (dir, regexpFilter) {
  for (const entry of await readdir(dir, {withFileTypes: true})) {
    const fPath = path.resolve(dir, entry.name);
    if (entry.isDirectory()) {
      yield* findFiles(fPath, regexpFilter);
      continue;
    }
    if (regexpFilter && !regexpFilter.test(entry.name)) continue;
    yield Object.assign(entry, {path: fPath});
  }
}

async function main () {
  const dir = 'packages/web-components/src';
  // Regular expression which means: ends with '.element.ts'
  const filter = /\.element\.ts$/;

  for await (const entry of findFiles(dir, filter)) {
    //                                     ^^^^^^
    // If you don't include a filter argument, then all files will be iterated
    console.log(entry.name); // just the file name
    console.log(entry.path); // the full file path
  }
}

main();

Discord.JS - List all files within a directory as one message

I am having an issue where I cannot seem to find a solution.
I have written a Discord bot with Discord.JS that needs to send a list of file names from a directory as one message. So far I have tried using fs.readdir with path.join, and fs.readFileSync(). Below is one example.
const server = message.guild.id;
const serverpath = `./sounds/${server}`;
const fs = require('fs');
const path = require('path');
const directoryPath = path.join('/home/pi/Pablo', serverpath);
fs.readdir(directoryPath, function(err, files) {
  if (err) {
    return console.log('Unable to scan directory: ' + err);
  }
  files.forEach(function(file) {
    message.channel.send(file);
  });
});
Whilst this does send a message of every file within the directory, it sends each file as a separate message. This causes it to take a while due to Discord API rate limits. I want them to all be within the same message, separated with a line break, with a max of 2000 characters (max limit for Discord messages).
Can someone assist with this?
Thanks in advance.
Jake
I recommend using fs.readdirSync(), it will return an array of the file names in the given directory. Use Array#filter() to filter the files down to the ones that are JavaScript files (extensions ending in ".js"). To remove ".js" from the file names use Array#map() to replace each ".js" with "" (effectively removing it entirely) and use Array#join() to join them into a string and send.
const server = message.guild.id;
const serverpath = `./sounds/${server}`;
const { readdirSync } = require('fs');
const path = require('path');
const directoryPath = path.join('/home/pi/Pablo', serverpath);
const files = readdirSync(directoryPath)
  .filter(fileName => fileName.endsWith('.js'))
  .map(fileName => fileName.replace('.js', ''))
  .join('\n');

message.channel.send(files);
Regarding handling the sending of a message greater than 2000 characters:
You can use the Util.splitMessage() method from Discord.JS and provide a maxLength option of 2000. As long as the number of chunks to send is no more than a few, you should be fine with the API rate limits.
const { Util } = require('discord.js');
// Defining "files"
const textChunks = Util.splitMessage(files, {
  maxLength: 2000
});

textChunks.forEach(async chunk => {
  await message.channel.send(chunk);
});
Build an array of strings (names of files), then join with "\n".
let names = [];
fs.readdir(directoryPath, function(err, files) {
  if (err) {
    return console.log('Unable to scan directory: ' + err);
  }
  files.forEach(function(file) {
    names.push(file);
  });
  // send inside the callback, after the directory listing has arrived
  message.channel.send(names.join("\n"));
});

How to find if Azure File exists on NodeJS

I'm using Azure File Storage, and writing an Express JS backend to render the contents stored there.
I am writing the code based on https://learn.microsoft.com/en-us/javascript/api/@azure/storage-file-share/shareserviceclient?view=azure-node-latest
const { ShareServiceClient, StorageSharedKeyCredential } = require("@azure/storage-file-share");
const account = "<account>";
const accountKey = "<accountkey>";
const credential = new StorageSharedKeyCredential(account, accountKey);
const serviceClient = new ShareServiceClient(
`https://${account}.file.core.windows.net`,
credential
);
const shareName = "<share name>";
const fileName = "<file name>";
// [Node.js only] A helper method used to read a Node.js readable stream into a Buffer
async function streamToBuffer(readableStream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    readableStream.on("data", (data) => {
      chunks.push(data instanceof Buffer ? data : Buffer.from(data));
    });
    readableStream.on("end", () => {
      resolve(Buffer.concat(chunks));
    });
    readableStream.on("error", reject);
  });
}
And you can view the contents through
const downloadFileResponse = await fileClient.download();
const output = (await streamToBuffer(downloadFileResponse.readableStreamBody)).toString();
Thing is, I only want to find if the file exists and not spend time downloading the entire file, how could I do this?
I looked at https://learn.microsoft.com/en-us/javascript/api/@azure/storage-file-share/shareserviceclient?view=azure-node-latest
to see if the file client class has what I want, but it doesn't seem to have methods useful for this.
If you are using the @azure/storage-file-share (version 12.x) Node package, there's an exists method in ShareFileClient. You can use that to find whether a file exists or not. Something like:
const fileExists = await fileClient.exists(); // returns true or false.
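For context, a minimal sketch (assuming the file sits in the share's root directory) of how to get that fileClient from the serviceClient defined in the question:
// inside an async function
const fileClient = serviceClient
  .getShareClient(shareName)
  .rootDirectoryClient
  .getFileClient(fileName);
console.log(await fileClient.exists()); // true or false, no download involved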

Folders and files are not visible after uploading a file through multer

I am working on a small project. Discussing it step by step:
At first I am uploading zip files through multer
Extracting those files (How can I call the extract function after the upload completes with multer?)
After extracting those files I am trying to filter them
After filtering those files I want to move some of them to another directory
In my main index.js I have:
A simple route to upload files, which is working
// MAIN API ENDPOINT
app.post("/api/zip-upload", upload, async (req, res, next) => {
console.log("FIles - ", req.files);
});
Continuously checking whether there is any zip file that needs to be unzipped, but the problem is that after uploading it's not showing any files or directories
// UNZIP FILES
const dir = `${__dirname}/uploads`;
const files = fs.readdirSync("./uploads");

const filesUnzip = async () => {
  try {
    if (fs.existsSync(dir)) {
      console.log("files - ", files);
      for (const file of files) {
        console.log("file - ", file);
        try {
          const extracted = await extract("./uploads/" + file, { dir: __dirname + "/uploads/" });
          console.log("Extracted - ", extracted);
          // const directories = await fs.statSync(dir + '/' + file).isDirectory();
        } catch (bufErr) {
          // console.log("------------");
          console.log(bufErr.syscall);
        }
      }
      // const directories = await files.filter(function (file) { return fs.statSync(dir + '/' + file).isDirectory(); });
      // console.log(directories);
    }
  } catch (err) {
    console.log(err);
  }
  return;
}

setInterval(() => {
  filesUnzip();
}, 2000);
Moving files to the static directory, but here is the same problem: no directory found
const getAllDirs = async () => {
  // console.log(fs.existsSync(dir));
  // FIND ALL DIRECTORIES
  if (fs.existsSync(dir)) {
    const directories = await files.filter(function (file) { return fs.statSync(dir + '/' + file).isDirectory(); });
    console.log("Directories - ", directories);
    if (directories.length > 0) {
      for (let d of directories) {
        const subdirFiles = fs.readdirSync("./uploads/" + d);
        for (let s of subdirFiles) {
          if (s.toString().match(/\.xml$/gm) || s.toString().match(/\.xml$/gm) !== null) {
            console.log("-- ", d + "/" + s);
            const move = await fs.rename("uploads/" + d + "/" + s, __dirname + "/static/" + s, (err) => { console.log(err) });
            console.log("Move - ", move);
          }
        }
      }
    }
  }
}

setInterval(getAllDirs, 3000);
There are so many issues with your code, I don't know where to begin:
Why are you using fs.xxxSync() methods if all your functions are async? Using xxxSync() methods is highly discouraged because it blocks the server (i.e. parallel requests can't/won't be accepted while a sync read is in progress). The fs module supports a promise API ...
Your "Continuous checking" for new files is always checking the same (probably empty) files array because it seems you are executing files = fs.readdirSync("./uploads"); only once (probably at server start, but I can't tell for sure because there isn't any context for that snippet)
You shouldn't be polling that "uploads" directory. Because as writing a file (if done properly) is an asynchronous process, you may end up reading incomplete files. Instead you should trigger the unzipping from your endpoint handler. Once it is hit, body.files contains the files that have been uploaded. So you can simply use this array to start any further processing instead of frequently polling a directory.
At some points you are using the callback version of the fs API (for instance fs.rename()). You cannot await a function that expects a callback. Again, use the promise API of fs.
EDIT
So I'm trying to address your issues. Maybe I can't solve all of them because of missing information, but you should get the general idea.
First of all, you should use the promise API of the fs module. And for path manipulation, you should use the built-in path module, which will take care of some OS-specific issues.
const fs = require('fs').promises;
const path = require('path');
Your API endpoint isn't currently returning anything. I suppose you stripped away some code, but still. Furthermore, you should trigger your file handling from here, so you don't have to do directory polling, which is
error prone,
wasting resources, and
blocking the server if done synchronously (as you do)
app.post("/api/zip-upload", upload, async (req, res, next) => {
console.log("FIles - ", req.files);
//if you want to return the result only after the files have been
//processed use await
await handleFiles(req.files);
//if you want to return to the client immediately and process files
//skip the await
//handleFiles(req.files);
res.sendStatus(200);
});
Handling the files seems to consist of two different steps:
unzipping the uploaded zip files
copying some of the extracted files into another directory
const source = path.join(".", "uploads");
const target = path.join(__dirname, "uploads");
const statics = path.join(__dirname, "statics");

const handleFiles = async (files) => {
  // a random folder, which will be unique for this upload request
  const tmpfolder = path.join(target, `tmp_${new Date().getTime()}`);
  // create this folder
  await fs.mkdir(tmpfolder, { recursive: true });
  // extract all uploaded files to the folder;
  // this waits for a list of promises and resolves once all of them have resolved
  await Promise.all(files.map(f => extract(path.join(source, f), { dir: tmpfolder })));
  await copyFiles(tmpfolder);
  // you probably should delete the uploaded zipfiles and the tempfolder
  // after they have been handled
  await Promise.all(files.map(f => fs.unlink(path.join(source, f))));
  await fs.rmdir(tmpfolder, { recursive: true });
}
const copyFiles = async (tmpfolder) => {
  // get all files/directory names in the tmpfolder
  const allfiles = await fs.readdir(tmpfolder);
  // get their stats
  const stats = await Promise.all(allfiles.map(f => fs.stat(path.join(tmpfolder, f))));
  // filter directories only
  const dirs = allfiles.filter((_, i) => stats[i].isDirectory());
  for (let d of dirs) {
    // read all filenames in the subdirectory
    const files = await fs.readdir(path.join(tmpfolder, d));
    // filter by extension .xml
    const xml = files.filter(x => path.extname(x) === ".xml");
    // move all xml files
    await Promise.all(xml.map(f => fs.rename(path.join(tmpfolder, d, f), path.join(statics, f))));
  }
}
That should do the trick. Of course you may notice there is no error handling with this code. You should add that.
And I'm not 100% sure about your paths. You should consider the following:
./uploads refers to a directory uploads in the current working directory (wherever that may be)
${__dirname}/uploads refers to a directory uploads which is in the same directory as the script file currently executing. Not sure if that is the directory you want ...
./uploads and ${__dirname}/uploads may point to the same folder or to completely different folders. There's no way of knowing that without additional context.
Furthermore, in your code you extract the ZIP files from ./uploads to ${__dirname}/uploads and then later try to copy XML files from ./uploads/xxx to ${__dirname}/statics, but there won't be any directory xxx in ./uploads because you extracted the ZIP file to a (probably) completely different folder.
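A quick way to see where the two notations actually point on a given machine:
const path = require('path');
// './uploads' resolves against the process's working directory (process.cwd()),
// which depends on where node was started from:
console.log(path.resolve('./uploads'));
// `${__dirname}/uploads` is anchored to the directory of the current script file:
console.log(path.join(__dirname, 'uploads'));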

The file I converted with FFMPEG in a Firebase Cloud Function is not accessible

The following is my cloud function code.
exports.increaseVolume = functions.storage.object().onFinalize(async (object) => {
  const fileBucket = object.bucket; // The Storage bucket that contains the file.
  const filePath = object.name; // File path in the bucket.
  const contentType = object.contentType; // File content type.

  // Exit if this is triggered on a file that is not an mp4 video.
  if (!contentType.startsWith('video/mp4')) {
    console.log('This is not an mp4 video.');
    return null;
  }

  // Get the file name.
  const fileName = path.basename(filePath);
  // Exit if the file is already converted.
  if (fileName.endsWith('_output.mp4')) {
    console.log('Already a converted file.');
    return null;
  }

  // Download file from bucket.
  const bucket = gcs.bucket(fileBucket);
  const tempFilePath = path.join(os.tmpdir(), fileName);
  // We add an '_output.mp4' suffix to the target file name. That's where we'll upload the converted file.
  const targetTempFileName = fileName.replace(/\.[^/.]+$/, '') + '_output.mp4';
  const targetTempFilePath = path.join(os.tmpdir(), targetTempFileName);
  const targetStorageFilePath = path.join(path.dirname(filePath), targetTempFileName);

  await bucket.file(filePath).download({ destination: tempFilePath });
  console.log('Audio downloaded locally to', tempFilePath);

  // Increase the volume and denoise the audio track using FFMPEG.
  let command = ffmpeg(tempFilePath)
    .audioFilters([
      {
        filter: 'volume',
        options: '5dB'
      },
      {
        filter: 'afftdn'
      }
    ])
    .format('mp4')
    .output(targetTempFilePath);

  await promisifyCommand(command);
  console.log('Output audio created at', targetTempFilePath);

  // Uploading the audio.
  await bucket.upload(targetTempFilePath, { destination: targetStorageFilePath });
  console.log('Output audio uploaded to', targetStorageFilePath);

  // Once the audio has been uploaded delete the local files to free up disk space.
  fs.unlinkSync(tempFilePath);
  fs.unlinkSync(targetTempFilePath);
  return console.log('Temporary files removed.', targetTempFilePath);
});
This is how the file shows in my storage bucket. Where do I get the download link, or how can I access the file? When I typed the link in the browser it returned a JSON response saying 403 - unauthorized access.
You need to use the getSignedUrl() method as follows:
const uploadResp = await bucket.upload(targetTempFilePath, { destination: targetStorageFilePath });
const file = uploadResp[0];

const options = {
  action: 'read',
  expires: '03-17-2025'
};

const getSignedUrlResponse = await file.getSignedUrl(options);
const url = getSignedUrlResponse[0];
// Do whatever you want with this url: save it in Firestore for example
The message you are facing is about permissions. Who should have access to what?
First of all you have to think about your business logic. Who should have access to that file? Should the file be public? Is there a time-frame in which the file should be accessible? Should the data remain in the Bucket, or can it be deleted?
I think there are two options for you:
Create a Bucket with public data; in such a case the data would be accessible to everyone who has access to the specific filename in the Bucket.
If the above is not allowed, then you can create a SignedURL, as shown in @Renaud Tarnec's example. You should keep in mind that the SignedURL has time-limited access and everyone who has its URL will be able to get the object. Once the time has expired, the object will no longer be accessible.
Once you have defined this, you can either delete the object in your Bucket programmatically, or you can set up Lifecycle Management. There, you can set configurations which contain a set of rules, for example, delete objects by their age (in days).
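As a sketch (not part of the original answer), with the @google-cloud/storage Node client a rule that deletes objects older than 30 days could look like this; the 30-day age is just an example:
// bucket is a @google-cloud/storage Bucket instance, as used above
await bucket.addLifecycleRule({
  action: { type: 'Delete' },
  condition: { age: 30 } // days since the object was created
});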
