Node.js: unable to send multiple CSVs in an archive - javascript

I currently have a problem creating and sending an archive from a Node.js server.
I succeeded in creating an archive with 5 CSV files in it, but when I send the archive to my React application and unzip it, I get just one file.
I have checked the archive created locally by the server and it contains 5 files, instead of the single file I get when I unzip the archive sent to the React application.
My HTTP response:
import { Parser } from "json2csv";
import archiver from "archiver";
import fs from "fs";
import path from "path";
const output = fs.createWriteStream(`csv/${campaign.title}.zip`, 'utf-8');
const archive = archiver("zip", {
  zlib: { level: 9 }, // Sets the compression level.
});
output.on("close", function () {
  console.log(archive.pointer() + " total bytes");
  console.log("archiver has been finalized and the output file descriptor has closed.");
});
output.on("end", function () {
  console.log("Data has been drained");
});
archive.on("warning", function (err) {
  if (err.code === "ENOENT") {
    // log warning
  } else {
    // throw error
    throw err;
  }
});
archive.on("error", function (err) {
  throw err;
});
archive.pipe(output);
d.map(items => {
  let file = items;
  archive.append(file, { name: file });
});
archive.pipe(res);
archive.finalize();
res.setHeader('application/zip');
res.setHeader("Content-Disposition", `attachment; filename=${campaign.title}.zip`);
res.status(200).send(archive);

What is the content of the mysterious d? I think you are using archiver incorrectly.
Your code:
d.map(items => {
let file = items;
archive.append(file, { name: file });
});
but the documentation says:
https://www.archiverjs.com/docs/archiver#append
append(source, data) → {this}
Appends an input source (text string, buffer, or stream) to the instance.
When the instance has received, processed, and emitted the input, the entry event is fired.
Parameters
source - Buffer | Stream | String - The input source.
data - Object - The entry data.
Check the content and use of d: you need both the data and a file name to pass to append.
EDIT:
Don't use Array.map like Array.forEach; it's an anti-pattern: Is performing a mapping operation without using returned value an antipattern?
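To make the fix concrete, here is a minimal sketch of a correct append loop, assuming d is an array of CSV strings produced by json2csv (the entry names are made up, since the question never shows what d contains):
// Sketch: give each CSV its own content and its own entry name.
// Assumes `d` is an array of CSV strings; the names below are hypothetical.
d.forEach((csvString, i) => {
  archive.append(csvString, { name: `export-${i}.csv` });
});
archive.finalize();
Note the distinction: the first argument to append is the entry's content, while data.name is the entry's path inside the zip. The original code passes the same value for both.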

Related

Unable to read file from S3 url

I have an Excel file stored in an S3 bucket, and I am storing the file URL inside MongoDB. I want to read data from this Excel file and transform it into JSON. For that I am using the XLSX package, but it is not reading the file when I pass the file URL into the method.
Below is my code:
try {
  const data = await productSchema.findOne({ report_id: reportId });
  if (data) {
    console.log(data.me_url); // I am getting the file url
    const file = xlsx.readFile(data.me_url);
    const sheetNames = file.SheetNames;
    const totalSheets = sheetNames.length;
    console.log(sheetNames);
    console.log(totalSheets);
  }
} catch (err) {
  return err;
}
Any idea why it is not reading the file, or anything I have missed?
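For what it's worth, xlsx.readFile reads from the local filesystem, so passing an S3 URL to it will not work; the file has to be fetched first. A minimal sketch of that idea, assuming the URL is directly downloadable (readRemoteWorkbook is a made-up helper name):
const https = require("https");
const xlsx = require("xlsx");
// Hypothetical sketch: download the file into a Buffer, then parse the
// Buffer with xlsx.read instead of passing a URL to xlsx.readFile.
function readRemoteWorkbook(url) {
  return new Promise((resolve, reject) => {
    https.get(url, (res) => {
      const chunks = [];
      res.on("data", (chunk) => chunks.push(chunk));
      res.on("end", () => {
        const workbook = xlsx.read(Buffer.concat(chunks), { type: "buffer" });
        resolve(workbook.SheetNames);
      });
    }).on("error", reject);
  });
}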

How to get the contents of a file as a String

I am new to TypeScript and Node.
I have this function:
sftp.connect(config) // CONNECT TO SFTP
  .then(() => {
    sftp.list(remoteFilePath) // LIST THE FILES IN THE FILEPATH
      .then((list) => {
        list.forEach((index) => { // FOR EVERY FILE IN THE FOLDER, DOWNLOAD IT
          const fileName = remoteFilePath + index.name;
          console.log(fileName);
          sftp.fastGet(fileName, "/Users/Bob/" + index.name)
            .then((value) => {
              console.log(value);
              sftp.end();
            });
        });
      });
  })
  // .then(() => {
  //   sftp.end();
  // })
  .catch(err => {
    console.error(err.message);
  });
I am using the ssh2-sftp-client library. My question is: is it possible for this library to get the contents of the file as opposed to downloading it? I plan on making this function into a lambda function.
At the moment, the variable value contains text telling me that the file has been downloaded to my designated path.
If you want to get the contents of the file, you can read it using the fs module after downloading it:
// using the ES6 module syntax
import { readFileSync } from "fs"
const data = readFileSync("./file.txt")
If you want to get the contents of the file without writing it to disk, you have to pass a different destination. Use the ssh2-sftp-client get method instead; it accepts a Stream or a Buffer as the destination. You can use a Stream, but you have to pipe it somewhere. Here's an example using process.stdout, which is a writable stream:
// ...
sftp.get(fileName, process.stdout);
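If memory serves, ssh2-sftp-client's get also resolves with a Buffer when you omit the destination argument entirely, which maps directly onto the goal of getting the contents as a string (a sketch; verify against your library version):
// Sketch: with no destination argument, get() should resolve with a Buffer
// (per the ssh2-sftp-client docs; double-check against your version).
const buffer = await sftp.get(fileName);
const contents = buffer.toString("utf-8");
console.log(contents);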

amazon s3.upload is taking time

I am trying to upload files to S3; before uploading I am altering the file names. I accept 2 files from the request's form-data object, rename them, and upload them to S3. At the end of the task I need to return the list of renamed files that were uploaded successfully.
I am using the S3.upload() function. The problem is that the variable initialized as an empty array, which should collect the renamed file names, comes back empty because s3.upload() takes time and the array is returned before the uploads finish. Is there a solution where I can store the file name when its upload succeeds and return those names in the response?
Please help me fix this. The code looks like this:
if (formObject.files.document && formObject.files.document.length > 0) {
  const circleCode = formObject.fields.circleCode[0];
  let collectedKeysFromAwsResponse = [];
  formObject.files.document.forEach(e => {
    const extractFileExtension = ".pdf";
    if (_.has(FILE_EXTENSIONS_INCLUDED, _.lowerCase(extractFileExtension))) {
      console.log(e);
      // change the filename
      const originalFileNameCleaned = "cleaning name logic";
      const _id = mongoose.Types.ObjectId();
      const s3FileName = "s3-filename-convention";
      console.log(e.path, "", s3FileName);
      const awsResponse = new File().uploadFileOnS3(e.path, s3FileName);
      if (e.hasOwnProperty('ETag')) {
        collectedKeysFromAwsResponse.push(awsResponse.key.split("/")[1]);
      }
    }
  });
}
Using await s3.upload(params).promise(); is the solution.
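Concretely, that means replacing the fire-and-forget forEach with awaited uploads inside an async handler, for example by collecting the promises and waiting for all of them. A sketch adapted to the code above, on the assumption that uploadFileOnS3 returns s3.upload(params).promise():
// Sketch: await every upload before reading the collected keys.
// Assumes uploadFileOnS3 returns the promise from s3.upload(params).promise().
const uploads = formObject.files.document.map(async (e) => {
  const s3FileName = "s3-filename-convention"; // same naming logic as above
  const awsResponse = await new File().uploadFileOnS3(e.path, s3FileName);
  return awsResponse.Key.split("/")[1]; // upload() resolves with { Location, ETag, Bucket, Key }
});
const collectedKeysFromAwsResponse = await Promise.all(uploads);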
Use the latest code, which is the AWS SDK for JavaScript v3. Here is the code you should be using:
// Import required AWS SDK clients and commands for Node.js.
import { PutObjectCommand } from "@aws-sdk/client-s3";
import { s3Client } from "./libs/s3Client.js"; // Helper function that creates an Amazon S3 service client module.
import path from "path";
import fs from "fs";
const file = "OBJECT_PATH_AND_NAME"; // Path to and name of object. For example '../myFiles/index.js'.
const fileStream = fs.createReadStream(file);
// Set the parameters
export const uploadParams = {
  Bucket: "BUCKET_NAME",
  // Add the required 'Key' parameter using the 'path' module.
  Key: path.basename(file),
  // Add the required 'Body' parameter
  Body: fileStream,
};
// Upload file to specified bucket.
export const run = async () => {
  try {
    const data = await s3Client.send(new PutObjectCommand(uploadParams));
    console.log("Success", data);
    return data; // For unit tests.
  } catch (err) {
    console.log("Error", err);
  }
};
run();
More details can be found in the AWS JavaScript V3 DEV Guide.

Zip archive with nested folder inside does not unzip with yauzl

I am writing software which, among other things, downloads a zip archive using the Dropbox API and then unzips that archive using yauzl.
The way the files are stored in and downloaded from Dropbox often ends up with nested folders, and I need to keep it that way.
However, my implementation of yauzl is not capable of unzipping while keeping that nested folder structure; if there is a nested folder in the archive, it does not unzip at all.
Here is my unzip function, which is the default yauzl example with the addition of a local file write at the end.
const yauzl = require("yauzl");
const fs = require("fs");
const path = require("path");
const unzip = () => {
  let zipPath = "pathToFile.zip";
  let extractPath = "pathToExtractLocation";
  yauzl.open(zipPath, { lazyEntries: true }, function (err, zipfile) {
    if (err) throw err;
    zipfile.readEntry();
    zipfile.on("entry", function (entry) {
      if (/\/$/.test(entry.fileName)) {
        // Directory file names end with '/'.
        // Note that entries for directories themselves are optional.
        // An entry's fileName implicitly requires its parent directories to exist.
        zipfile.readEntry();
      } else {
        // file entry
        zipfile.openReadStream(entry, function (err, readStream) {
          if (err) throw err;
          readStream.on("end", function () {
            zipfile.readEntry();
          });
          const writer = fs.createWriteStream(path.join(extractPath, entry.fileName));
          readStream.pipe(writer);
        });
      }
    });
  });
};
Removing the if (/\/$/.test(entry.fileName)) check treats the top-level folder as a file, extracting it with no file extension and 0 kB size. What I want it to do is extract the archive including subfolders (to at least a depth of 2, being aware of the risk of zip bombing).
Is that possible using yauzl?
The code needs to create the directory tree at the extract path. You can use fs.mkdir with the recursive option to ensure that a directory exists before extracting into it:
if (/\/$/.test(entry.fileName)) {
  // Directory file names end with '/'.
  // Note that entries for directories themselves are optional.
  // An entry's fileName implicitly requires its parent directories to exist.
  zipfile.readEntry();
} else {
  // file entry
  fs.mkdir(
    path.join(extractPath, path.dirname(entry.fileName)),
    { recursive: true },
    (err) => {
      if (err) throw err;
      zipfile.openReadStream(entry, function (err, readStream) {
        if (err) throw err;
        readStream.on("end", function () {
          zipfile.readEntry();
        });
        const writer = fs.createWriteStream(
          path.join(extractPath, entry.fileName)
        );
        readStream.pipe(writer);
      });
    }
  );
}

How to zip a directory with node.js [duplicate]

I need to zip an entire directory using Node.js. I'm currently using node-zip and each time the process runs it generates an invalid ZIP file (as you can see from this Github issue).
Is there another, better, Node.js option that will allow me to ZIP up a directory?
EDIT: I ended up using archiver
writeZip = function (dir, name) {
  var zip = new JSZip(),
      code = zip.folder(dir),
      output = zip.generate(),
      filename = ['jsd-', name, '.zip'].join('');
  fs.writeFileSync(baseDir + filename, output);
  console.log('creating ' + filename);
};
Sample values for the parameters:
dir = /tmp/jsd-<randomstring>/
name = <randomstring>
UPDATE: For those asking about the implementation I used, here's a link to my downloader:
I ended up using archiver lib. Works great.
Example
var file_system = require('fs');
var archiver = require('archiver');
var output = file_system.createWriteStream('target.zip');
var archive = archiver('zip');
output.on('close', function () {
  console.log(archive.pointer() + ' total bytes');
  console.log('archiver has been finalized and the output file descriptor has closed.');
});
archive.on('error', function (err) {
  throw err;
});
archive.pipe(output);
// append files from a sub-directory, putting its contents at the root of archive
archive.directory(source_dir, false);
// append files from a sub-directory and naming it `new-subdir` within the archive
archive.directory('subdir/', 'new-subdir');
archive.finalize();
I'm not going to show something new, just wanted to summarise the solutions above for those who like Promises as much as I do 😉.
const fs = require('fs');
const archiver = require('archiver');
/**
 * @param {String} sourceDir: /some/folder/to/compress
 * @param {String} outPath: /path/to/created.zip
 * @returns {Promise}
 */
function zipDirectory(sourceDir, outPath) {
  const archive = archiver('zip', { zlib: { level: 9 }});
  const stream = fs.createWriteStream(outPath);
  return new Promise((resolve, reject) => {
    archive
      .directory(sourceDir, false)
      .on('error', err => reject(err))
      .pipe(stream)
    ;
    stream.on('close', () => resolve());
    archive.finalize();
  });
}
Hope it will help someone 🤞
Use Node's native child_process API to accomplish this.
No need for third-party libs. Two lines of code.
const child_process = require("child_process");
child_process.execSync(`zip -r <DESIRED_NAME_OF_ZIP_FILE_HERE> *`, {
  cwd: <PATH_TO_FOLDER_YOU_WANT_ZIPPED_HERE>
});
The example above showcases the synchronous API. You can also use child_process.exec(command, options, callback) if you want async behavior. There are a lot more options you can specify other than cwd to further fine-tune your request.
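A sketch of that async variant, using util.promisify so the command can be awaited (the paths are placeholders, as above):
const { exec } = require("child_process");
const { promisify } = require("util");
const execAsync = promisify(exec);
// Sketch: same zip invocation as the synchronous example, but non-blocking.
async function zipFolder(folderPath, zipName) {
  await execAsync(`zip -r ${zipName} *`, { cwd: folderPath });
}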
If you don't have the ZIP utility:
This question specifically asks about the zip utility for archiving/compression purposes. Therefore, this example assumes you have the zip utility installed on your system. For completeness' sake, some operating systems may not have the utility installed by default. In that case you have at least three options:
Work with the archiving/compression utility that is native to your platform
Replace the shell command in the above Node.js code with the equivalent for your system. For example, Linux distros usually come with tar/gzip utilities:
tar -czf <DESIRED_NAME_OF_ZIP_FILE_HERE> <PATH_TO_FOLDER_YOU_WANT_ZIPPED_HERE>.
This is a nice option as you don't need to install anything new onto your operating system or manage another dependency (kind of the whole point for this answer).
Obtain the zip binary for your OS/distribution.
For example on Ubuntu: apt install zip.
The ZIP utility has been tried and tested for decades; it's fairly ubiquitous and a safe choice. Do a quick Google search or go to the creator Info-ZIP's website for downloadable binaries.
Use a third party library/module (of which there are plenty on NPM).
I don't prefer this option. However, if you don't really care to understand the native methods and introducing a new dependency is a non-issue, this is also a valid option.
This is another library which zips the folder in one line:
zip-local
var zipper = require('zip-local');
zipper.sync.zip("./hello/world/").compress().save("pack.zip");
archive.bulk is now deprecated; the new method to use for this is glob:
var fs = require('fs');
var archiver = require('archiver');
var fileName = 'zipOutput.zip';
var fileOutput = fs.createWriteStream(fileName);
var archive = archiver('zip');
fileOutput.on('close', function () {
  console.log(archive.pointer() + ' total bytes');
  console.log('archiver has been finalized and the output file descriptor has closed.');
});
archive.pipe(fileOutput);
archive.glob("../dist/**/*"); // some glob pattern here
archive.glob("../dist/.htaccess"); // another glob pattern
// add as many as you like
archive.on('error', function (err) {
  throw err;
});
archive.finalize();
To include all files and directories:
archive.bulk([
  {
    expand: true,
    cwd: "temp/freewheel-bvi-120",
    src: ["**/*"],
    dot: true
  }
]);
It uses node-glob (https://github.com/isaacs/node-glob) underneath, so any matching expression compatible with that will work.
To pipe the result to the response object (scenarios where there is a need to download the zip rather than store locally)
archive.pipe(res);
Sam's hints for accessing the content of the directory worked for me.
src: ["**/*"]
I have found this small library that encapsulates what you need.
npm install zip-a-folder
const zipAFolder = require('zip-a-folder');
await zipAFolder.zip('/path/to/the/folder', '/path/to/archive.zip');
https://www.npmjs.com/package/zip-a-folder
Adm-zip has problems just compressing an existing archive (https://github.com/cthackers/adm-zip/issues/64), as well as corruption when compressing binary files.
I've also run into compression corruption issues with node-zip (https://github.com/daraosn/node-zip/issues/4).
node-archiver is the only one that seems to work well for compression, but it doesn't have any uncompress functionality.
Since archiver has been incompatible with new versions of webpack for a long time, I recommend using zip-lib.
var zl = require("zip-lib");
zl.archiveFolder("path/to/folder", "path/to/target.zip").then(function () {
  console.log("done");
}, function (err) {
  console.log(err);
});
As of today, I'm using AdmZip and it works great:
import AdmZip = require('adm-zip');
export async function archiveFile() {
  try {
    const zip = new AdmZip();
    const outputDir = "/output_file_dir.zip";
    zip.addLocalFolder("./yourFolder");
    zip.writeZip(outputDir);
  } catch (e) {
    console.log(`Something went wrong ${e}`);
  }
}
An import ... from version of the answer based on https://stackoverflow.com/a/51518100.
To zip a single directory:
import archiver from 'archiver';
import fs from 'fs';
export default zipDirectory;
/**
 * From: https://stackoverflow.com/a/51518100
 * @param {String} sourceDir: /some/folder/to/compress
 * @param {String} outPath: /path/to/created.zip
 * @returns {Promise}
 */
function zipDirectory(sourceDir, outPath) {
  const archive = archiver('zip', { zlib: { level: 9 }});
  const stream = fs.createWriteStream(outPath);
  return new Promise((resolve, reject) => {
    archive
      .directory(sourceDir, false)
      .on('error', err => reject(err))
      .pipe(stream)
    ;
    stream.on('close', () => resolve());
    archive.finalize();
  });
}
To zip multiple directories:
import archiver from 'archiver';
import fs from 'fs';
export default zipDirectories;
/**
 * Adapted from: https://stackoverflow.com/a/51518100
 * @param {String[]} sourceDirs: ['/some/folder/to/compress', ...]
 * @param {String} outPath: /path/to/created.zip
 * @returns {Promise}
 */
function zipDirectories(sourceDirs, outPath) {
  const archive = archiver('zip', { zlib: { level: 9 }});
  const stream = fs.createWriteStream(outPath);
  return new Promise((resolve, reject) => {
    var result = archive;
    sourceDirs.forEach(sourceDir => {
      result = result.directory(sourceDir, false);
    });
    result
      .on('error', err => reject(err))
      .pipe(stream)
    ;
    stream.on('close', () => resolve());
    archive.finalize();
  });
}
You can try it in a simple way:
Install zip-dir:
npm install zip-dir
and use it:
var zipdir = require('zip-dir');
let foldername = src_path.split('/').pop();
zipdir(<<src_path>>, { saveTo: 'demo.zip' }, function (err, buffer) {
  // handle err and use the zipped buffer here
});
I ended up wrapping archiver to emulate JSZip, as refactoring through my project would take too much effort. I understand archiver might not be the best choice, but here you go.
// USAGE:
const zip = JSZipStream.to(myFileLocation)
  .onDone(() => {})
  .onError(() => {});
zip.file('something.txt', 'My content');
zip.folder('myfolder').file('something-inFolder.txt', 'My content');
zip.finalize();
// NodeJS file content:
var fs = require('fs');
var path = require('path');
var archiver = require('archiver');
function zipper(archive, settings) {
  return {
    output: null,
    streamToFile(dir) {
      const output = fs.createWriteStream(dir);
      this.output = output;
      archive.pipe(output);
      return this;
    },
    file(location, content) {
      if (settings.location) {
        location = path.join(settings.location, location);
      }
      archive.append(content, { name: location });
      return this;
    },
    folder(location) {
      if (settings.location) {
        location = path.join(settings.location, location);
      }
      return zipper(archive, { location: location });
    },
    finalize() {
      archive.finalize();
      return this;
    },
    onDone(method) {
      this.output.on('close', method);
      return this;
    },
    onError(method) {
      this.output.on('error', method);
      return this;
    }
  };
}
exports.JSZipStream = {
  to(destination) {
    console.log('stream to', destination);
    const archive = archiver('zip', {
      zlib: { level: 9 } // Sets the compression level.
    });
    return zipper(archive, {}).streamToFile(destination);
  }
};
