Zip archive with nested folder inside does not unzip with yauzl - javascript

I am writing software which, among other things, downloads a zip archive using the Dropbox API and then unzips that archive using yauzl.
The way the files are stored in and downloaded from Dropbox often results in nested folders, and I need to keep that structure.
However, my yauzl implementation cannot unzip while keeping that nested folder structure; if the archive contains a nested folder, it does not unzip at all.
Here is my unzip function, which is the default yauzl example with the addition of a local file write at the end.
const yauzl = require("yauzl");
const fs = require("fs");
const path = require("path");

const unzip = () => {
  let zipPath = "pathToFile.zip";
  let extractPath = "pathToExtractLocation";
  yauzl.open(zipPath, { lazyEntries: true }, function (err, zipfile) {
    if (err) throw err;
    zipfile.readEntry();
    zipfile.on("entry", function (entry) {
      if (/\/$/.test(entry.fileName)) {
        // Directory file names end with '/'.
        // Note that entries for directories themselves are optional.
        // An entry's fileName implicitly requires its parent directories to exist.
        zipfile.readEntry();
      } else {
        // file entry
        zipfile.openReadStream(entry, function (err, readStream) {
          if (err) throw err;
          readStream.on("end", function () {
            zipfile.readEntry();
          });
          const writer = fs.createWriteStream(path.join(extractPath, entry.fileName));
          readStream.pipe(writer);
        });
      }
    });
  });
};
Removing the if (/\/$/.test(entry.fileName)) check treats the top-level folder as a file, extracting it with no file extension and a size of 0 KB. What I want is to extract the archive including subfolders (to at least a depth of 2, being mindful of the risk of zip bombing).
Is that possible using yauzl?

The code needs to create the directory tree at the extract path. You may use fs.mkdir with the recursive option to ensure that a directory exists before extracting to it.
if (/\/$/.test(entry.fileName)) {
  // Directory file names end with '/'.
  // Note that entries for directories themselves are optional.
  // An entry's fileName implicitly requires its parent directories to exist.
  zipfile.readEntry();
} else {
  // file entry
  fs.mkdir(
    path.join(extractPath, path.dirname(entry.fileName)),
    { recursive: true },
    (err) => {
      if (err) throw err;
      zipfile.openReadStream(entry, function (err, readStream) {
        if (err) throw err;
        readStream.on("end", function () {
          zipfile.readEntry();
        });
        const writer = fs.createWriteStream(
          path.join(extractPath, entry.fileName)
        );
        readStream.pipe(writer);
      });
    }
  );
}
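Note that fs.mkdir with { recursive: true } does not raise an error when the directory already exists, so there is no need to check for the directory first.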

How to get the contents of a file as a String

I am new to TypeScript and Node.
I have this function
sftp.connect(config) // CONNECT TO SFTP
  .then(() => {
    sftp.list(remoteFilePath) // LIST THE FILES IN THE FILEPATH
      .then((list) => {
        list.forEach((index) => { // FOR EVERY FILE IN THE FOLDER, DOWNLOAD IT
          const fileName = remoteFilePath + index.name;
          console.log(fileName);
          sftp.fastGet(fileName, "/Users/Bob/" + index.name)
            .then((value) => {
              console.log(value);
              sftp.end();
            })
        })
      })
  })
  // .then(() => {
  //   sftp.end();
  // })
  .catch(err => {
    console.error(err.message);
  });
and I am using the ssh2-sftp-client library. My question is: is it possible for this library to get the contents of the file, as opposed to downloading it? I plan on making this function into a Lambda function.
At the moment, the variable value contains text telling me that the file has been downloaded to my designated path.
If you want to get the contents of the file, you can read it using the fs module after downloading it:
// using the ES6 module syntax
import { readFileSync } from "fs"

// pass an encoding to get a string instead of a Buffer
const data = readFileSync("./file.txt", "utf8")
If you want to get the contents of the file without downloading it to disk, you have to pass a different destination. Use the ssh2-sftp-client get method instead; it accepts a Stream or a Buffer as the destination. If you use a Stream, you have to pipe it somewhere. Here's an example using process.stdout, which is a writable stream:
// ...
sftp.get(fileName, process.stdout)
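If you want the contents as a string rather than piped to another stream, the ssh2-sftp-client get method also appears to support omitting the destination, in which case it resolves to a Buffer. A minimal sketch of that approach (worth verifying against the library version you use):
// Sketch: with no destination argument, get() should resolve to a Buffer
sftp.get(fileName)
  .then((buffer) => {
    const contents = buffer.toString("utf8"); // file contents as a string
    console.log(contents);
  });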

Node.js: unable to send multiple CSV files in an archive

I currently have a problem creating and sending an archive from a Node.js server.
In fact, I succeeded in creating an archive with 5 CSV files in it, but when I send the archive to my React application and unzip it, I get just one file.
I have checked the archive created by the server locally, and it contains 5 files, as opposed to one when I unzip the archive sent to the React application.
My HTTP response:
import { Parser } from "json2csv";
import archiver from "archiver";
import fs from "fs";
import path from "path";
const output = fs.createWriteStream(`csv/${campaign.title}.zip`, 'utf-8');
const archive = archiver("zip", {
  zlib: { level: 9 }, // Sets the compression level.
});

output.on("close", function () {
  console.log(archive.pointer() + " total bytes");
  console.log("archiver has been finalized and the output file descriptor has closed.");
});
output.on("end", function () {
  console.log("Data has been drained");
});
archive.on("warning", function (err) {
  if (err.code === "ENOENT") {
    // log warning
  } else {
    // throw error
    throw err;
  }
});
archive.on("error", function (err) {
  throw err;
});

archive.pipe(output);
d.map(items => {
  let file = items;
  archive.append(file, { name: file });
});
archive.pipe(res);
archive.finalize();
res.setHeader('application/zip');
res.setHeader("Content-Disposition", `attachment; filename=${campaign.title}.zip`);
res.status(200).send(archive);
What is the content of the mysterious d? I think you are using archiver incorrectly.
Your code:
d.map(items => {
  let file = items;
  archive.append(file, { name: file });
});
But the documentation says:
https://www.archiverjs.com/docs/archiver#append
append(source, data) → {this}
Appends an input source (text string, buffer, or stream) to the instance.
When the instance has received, processed, and emitted the input, the entry event is fired.
Parameters
source - Buffer | Stream | String - The input source.
data - Object - The entry data.
Check the content and use of d: you need the data and a file name to pass to append.
EDIT:
Don't use Array.map like Array.forEach; it's an anti-pattern: Is performing a mapping operation without using returned value an antipattern?
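For illustration only, a minimal sketch of what the append loop could look like, assuming d is an array of row objects still to be converted to CSV (an assumption suggested by the json2csv Parser import; the entry names are made up):
// Assumption: each element of d is the data for one CSV file
const parser = new Parser();
d.forEach((rows, i) => {
  const csv = parser.parse(rows);                   // source: a CSV string
  archive.append(csv, { name: `report-${i}.csv` }); // data: a distinct entry name
});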

How to copy an image and save it in a new folder in Electron

I am trying to make an image organizer app which searches images using tags.
I want the user to select the image they want; so far I have done this with the following code:
// renderer process
$("#uploadImage").on("click", (e) => {
  ipcRenderer.send('dialoguploadImage');
});
This is the main process:
ipcMain.on('dialoguploadImage', (e) => {
  dialog.showOpenDialog({
    properties: ['openFile']
  }).then(result => {
    sendBackimagePathFromMain(result.filePaths[0])
  }).catch(err => {
    console.log(err)
  })
});

function sendBackimagePathFromMain(result) {
  mainWindow.webContents.send('imagePathFromMain', result)
}
So I have the image path, and the only thing I want to know is
how I can duplicate this image, rename it, create a new folder, and save the image in that folder,
for example to this folder:
('./currentDirectory/imageBackup/dognothapppy.jpg')
You can use fs.mkdirSync() to make the folder and fs.copyFileSync() to 'duplicate and rename' the file (in a file system, you don't need to duplicate and rename a file in two separate steps; you do both at once by copying the file), or their async counterparts.
const { mkdirSync, copyFileSync } = require('fs')
const { join } = require('path')

const folderToCreate = 'folder'
const fileToCopy = 'selectedFile.txt'
const newFileName = 'newFile.txt'

const dest = join(folderToCreate, newFileName)

// recursive: true also prevents an error if the folder already exists
mkdirSync(folderToCreate, { recursive: true })
copyFileSync(fileToCopy, dest)
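Applied to the question, a sketch of how the main process might back up the selected image (the imageBackup location and keeping the original file name are assumptions taken from the question's example path):
const { mkdirSync, copyFileSync } = require('fs')
const { join, basename } = require('path')

// Sketch: copy the image chosen in the open dialog into ./imageBackup
function backupImage(imagePath) {
  const backupDir = join('.', 'imageBackup')
  mkdirSync(backupDir, { recursive: true })          // create the folder if missing
  const dest = join(backupDir, basename(imagePath))  // change basename() to rename the copy
  copyFileSync(imagePath, dest)
  return dest
}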

Save file in a sibling directory after creating a directory

I'm having issues writing a file to a specific directory when calling fs.writeFile from Node.js.
It ends up creating the file in the same parent directory from which it is called.
function ensureDirectoryExists(filePath) {
  const dirname = path.dirname(filePath);
  if (fs.existsSync(dirname)) {
    return true;
  }
  ensureDirectoryExists(dirname);
  fs.mkdirSync(dirname);
}

function getFilePath(fileName, resultPath) {
  return path.join(resultPath, fileName);
}

export function writeDataToFile(fileName, resultPath) {
  fs.writeFile(getFilePath(fileName, resultPath), data, function (err) {
    if (err) {
      console.log('Error: ', err);
    }
    console.log('Saved successfully');
  });
}
My method writeDataToFile is called from my file 'test.js', and I'm passing resultPath to writeDataToFile from 'test.js' as 'parent/childDir/data/resultsDir'.
However, the file is being created at 'parent/childDir/tests/testDir1'.
How could I fix this and create the directory at run time, before calling getFilePath(), using the ensureDirectoryExists() method?
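One possible fix, as a sketch rather than a confirmed solution: resolve resultPath to an absolute path so the output location no longer depends on the working directory, and create the directory up front with the recursive option (data is taken as an explicit parameter here, since the original snippet leaves it undefined):
export function writeDataToFile(fileName, resultPath, data) {
  // Resolve to an absolute path so the location does not depend on process.cwd()
  const absolutePath = path.resolve(resultPath);
  // Create the target directory and any missing parents before writing
  fs.mkdirSync(absolutePath, { recursive: true });
  fs.writeFile(path.join(absolutePath, fileName), data, function (err) {
    if (err) {
      console.log('Error: ', err);
      return;
    }
    console.log('Saved successfully');
  });
}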

How to check if each file in a path is accessible for reading or not

I am trying to read the contents of a specific path. For that purpose, I used the following code:
code1:
const contentsOfPersonalFolder = fs.readdirSync(rootPathToPersonal);
But I know in advance that I do not have access permission to read some of the contents that will be returned by the previous line of code.
To check whether or not I have access permission to read some files, I would use the following code:
code2:
try {
  fs.accessSync(path, fs.constants.R_OK);
  logger.info('The directory: ', path, 'can be read');
} catch (err) {
  logger.error('The directory: ', path, 'cannot be read due to inaccessibility');
}
The problem now is that the code in code1 will return an array of all available files in the specified path, and if one of these files is not accessible due to read protection, it will throw and the program will crash.
What I want to achieve is to iterate through all the available files in the specified path in code1, check each item using the code in code2, do some logic if the file is accessible for reading, and do something else if it is not.
Please let me know how to achieve that.
You could use fs.access to check the user's permissions:
https://nodejs.org/api/fs.html#fs_fs_access_path_mode_callback
const fs = require('fs');
const path = require('path');

const testFolder = './tests/';

fs.readdir(testFolder, (err, files) => {
  files.forEach(file => {
    console.log(file);
    // join the folder and file name; readdir returns bare names
    fs.access(path.join(testFolder, file), fs.constants.R_OK, (err) => {
      if (err) {
        console.error("file is not readable");
        return;
      }
      // do your reading operations
    });
  });
});
Alternatively, a synchronous version:
const fs = require('fs');
const path = require('path');

const isAvailableToRead = file => {
  try {
    fs.accessSync(file, fs.constants.R_OK);
    return true;
  } catch (err) {
    return false;
  }
};

const readDirectory = dirPath => {
  const files = fs.readdirSync(dirPath);
  files.forEach(file => {
    // build the full path; readdirSync returns bare file names
    if (isAvailableToRead(path.join(dirPath, file))) {
      console.log(`Do some logic ${file}`);
    }
  });
};

readDirectory(__dirname);
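One caveat from the Node.js fs.access documentation: checking accessibility before opening a file introduces a race condition, since permissions can change between the check and the read. Where practical, it is more robust to attempt the read and handle the resulting error.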
