ssh2-sftp-client Get request node.js - javascript

I am currently attempting to download a file from an SFTP server using ssh2-sftp-client.
I can list the files with the code below; my problem comes when downloading them.
As you can see from my code below, I am attempting to use sftp.get to read the contents of each file and the Node file system module to write a local copy.
When the file saves, it doesn't contain the contents of the file on the server; it only contains [object Object].
var Client = require('../../node_modules/ssh2-sftp-client');
var fs = require('fs');

var sftp = new Client();
var root = '/files';

sftp.connect({
  host: '192.168.0.1',
  port: '22',
  username: 'user',
  password: 'password'
}).then(() => {
  return sftp.list(root);
}).then((data) => {
  for (var i = 0, len = data.length; i < len; i++) {
    console.log(data[i].name);
    var name = data[i].name;
    sftp.get(root + '/' + name).then((file) => {
      console.log(file);
      fs.writeFile('downloads/' + name, file, function (err) {
        if (err) {
          return console.log(err);
        }
        console.log("The file was saved!");
      });
    });
  }
}).catch((err) => {
  console.log(err, 'catch error');
});
How can I get this to save the contents of the file?
Any help or a push in the right direction would be very appreciated.

I resolved my problem by changing the sftp.get call to the following:
sftp.get(root + "/" + name).then((stream) => {
  stream.pipe(fs.createWriteStream(local + name));
});
I hope this helps anybody else who might have this issue.
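One caveat with that fix: stream.pipe() returns before the download has finished writing to disk, so if you need to know when each file is actually saved (or catch write errors), listen on the write stream. A minimal sketch, assuming the same root, name, and local variables as above:

sftp.get(root + '/' + name).then((stream) => {
  var writeStream = fs.createWriteStream(local + name);
  stream.pipe(writeStream);
  writeStream.on('finish', () => {
    console.log('Saved ' + name); // the local copy is complete at this point
  });
  writeStream.on('error', (err) => {
    console.log(err, 'write error');
  });
});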

Upload files through cloud functions Admin SDK - Broken Files

I have been trying to upload files (mostly images) to Firebase Storage through a Firebase Cloud Function (onRequest method). I had to upload the files from their base64 form. With the code below I was able to achieve it, yet the files appear to be broken after upload.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

const bucket = admin.storage().bucket();
const database = admin.database();
const express = require('express');
const cors = require('cors');

const safetyLogsAPI = express();
safetyLogsAPI.use(cors({ origin: true }));

safetyLogsAPI.post('/', async (req, res) => {
  try {
    const {
      attachments,
      projectId
    } = req.body;

    const safetyKey = database.ref('safetyLogs')
      .child(projectId)
      .push().key;

    if (attachments && Array.isArray(attachments)) {
      if (attachments.length > 0) {
        for (let index = 0; index < attachments.length; index++) {
          const base64Obj = attachments[index];
          const { content, fileName, contentType } = base64Obj;

          const stream = require('stream');
          const bufferStream = new stream.PassThrough();
          bufferStream.end(Buffer.from(content, 'base64'));

          const fullPath = `SafetyIncidentLog/Images/${projectId}/${safetyKey}/${fileName}`;
          const file = bucket.file(fullPath);

          const metadata = {
            projectId,
            safetyLogId: safetyKey,
            createdTimestamp: Date.now(),
            systemGenerated: 'false',
            fileType: 'Image',
            fileName,
            path: fullPath
          };

          bufferStream.pipe(file.createWriteStream({
            metadata: {
              contentType,
              metadata
            },
            public: true,
            validation: "md5"
          }))
            .on('error', (err) => {
              console.log('Error Occured');
              console.log(err);
            })
            .on('finish', () => {
              console.log('File Upload Successfull');
            });
        }
      }
    }

    return res.status(200).send({
      code: 200,
      message: 'Success'
    });
  } catch (error) {
    console.log(error);
    return res.status(500).send({
      code: 500,
      message: 'Internal Server Error',
      error: error.message
    });
  }
});

module.exports = functions.https.onRequest(safetyLogsAPI);
I have tried this approach with the data:image/png;base64 prefix both present and removed. Either way I see a broken image. So where have I gone wrong? Is there a better way to do this?
Thanks in advance.
Also, is the approach I am trying recommended at all? For use cases like profile picture uploads and conversation image attachments, is this way recommended, or is a direct upload from the client preferred?
With Cloud Functions HTTP triggers, the request is terminated and the function is shut down as soon as you send a response to the client. Any asynchronous work that isn't finished might never finish.
What I'm seeing in your code is that you send a response before the uploads are complete: your call to res.status(200).send() happens immediately after you start the uploads. Instead, your code should wait to send the response until the uploads have finished, perhaps by using on('finish') and on('error').
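A rough sketch of that shape, assuming the same attachments, projectId, safetyKey, stream, and bucket variables from the question (not a drop-in replacement):

// Build one Promise per attachment and respond only after all uploads settle.
const uploads = attachments.map(({ content, fileName, contentType }) => {
  const bufferStream = new stream.PassThrough();
  bufferStream.end(Buffer.from(content, 'base64'));
  const file = bucket.file(`SafetyIncidentLog/Images/${projectId}/${safetyKey}/${fileName}`);

  return new Promise((resolve, reject) => {
    bufferStream.pipe(file.createWriteStream({
      metadata: { contentType },
      public: true,
      validation: 'md5'
    }))
      .on('error', reject)
      .on('finish', resolve);
  });
});

await Promise.all(uploads); // nothing is sent until every upload has finished
return res.status(200).send({ code: 200, message: 'Success' });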

Do you need to save a file locally before sending it to a mongo db?

I am learning how to upload images from my React website to my Mongo database through an Express server. In every tutorial I have read, the author saves the file locally on the Express server before sending it to the Mongo database. Is there a way to avoid storing the file locally by keeping it in a local variable which is then uploaded to the database?
Here are the tutorials I am referring to:
https://www.positronx.io/react-file-upload-tutorial-with-node-express-and-multer/
https://medium.com/ecmastack/uploading-files-with-react-js-and-node-js-e7e6b707f4ef
Thank you for your help.
I guess the GridFS API would be helpful to you. It says:
you can .pipe() directly from file streams to MongoDB
Here is the sample example from the documentation:
const assert = require('assert');
const fs = require('fs');
const mongodb = require('mongodb');

const uri = 'mongodb://localhost:27017';
const dbName = 'test';

mongodb.MongoClient.connect(uri, function(error, client) {
  assert.ifError(error);

  const db = client.db(dbName);
  var bucket = new mongodb.GridFSBucket(db);

  fs.createReadStream('./meistersinger.mp3').
    pipe(bucket.openUploadStream('meistersinger.mp3')).
    on('error', function(error) {
      assert.ifError(error);
    }).
    on('finish', function() {
      console.log('done!');
      process.exit(0);
    });
});
Documentation link: https://mongodb.github.io/node-mongodb-native/3.0/tutorials/gridfs/streaming/
Hope this helps!
Yes, you want to store your files locally. I used an NFS server (FreeNAS) and mounted it to that local folder.
So when I saved a file to that location, it was stored on the NFS server. Then I sent that image location as a response back to React, which then saved that location in MongoDB.
Example uploads.js
router.post('/', auth, async (req, res) => {
  let CurrentDate = moment().unix();

  if (req.files.file === null) {
    return res.status(400).json({ msg: 'no file uploaded' });
  }

  let user = await User.findById(req.user.id).select('-password');
  let file = req.files.file;
  file.name = CurrentDate + user._id + '.jpg';

  file.mv(`./client/public/uploads/${file.name}`, (err) => {
    if (err) {
      console.error(err);
      return res.status(500).send(err);
    }
    res.json({ fileName: file.name, filePath: `/uploads/${file.name}` });
  });
});
This is what the MongoDB entry looks like:
image:"/uploads/15951066675f1365239d46882312332d20.jpg"

How to parse an object sent from react frontend in express.js?

So in my React front end, I am using the 'react-drop-to-upload' module to allow the user to drag and drop a file to upload. I followed the example on the npm module page and created a handler called handleDrop. The code looks like:
handleDrop(files) {
  var data = new FormData();
  alert((files[0]) instanceof File);

  files.forEach((file, index) => {
    data.append('file' + index, file);
  });

  fetch('/file_upload', {
    method: 'POST',
    body: data
  });
}
On my Express backend, I have the following code:
app.post('/file_upload', function(req, res) {
  var body = '';
  req.on('data', function (data) {
    body += data;
  });

  var post = "";
  req.on('end', function () {
    //post = qs.parse(body);
    console.log(body);

    // this won't create a buffer for me
    //var fileBuffer = new Buffer(body);
    //console.log(fileBuffer.toString('ascii'));

    //pdfText(body, function(err, chunks) {
    //  console.log(chunks);
    //});
  });
  //console.log(typeof post);
});
If I drop a .txt file and log the body, it gives me:
------WebKitFormBoundaryqlp9eomS0BxhFJkQ
Content-Disposition: form-data; name="file0"; filename="lec16.txt"
Content-Type: text/plain
The content of my data!
------WebKitFormBoundaryqlp9eomS0BxhFJkQ--
I am trying to use the pdfText module, which takes a buffer or a pathname to a PDF file and extracts its text into an array of text 'chunks'. I want to convert the body into a buffer using var fileBuffer = new Buffer(body); but that won't work. Can someone help me with this? Thanks!
You need a parser for multipart form data; you can look into multer for that.
Example code for you:
var multer = require('multer');
var fs = require('fs');

app.post('/file_upload', function(req, res) {
  var storage = multer.diskStorage({
    destination: tmpUploadsPath // directory where uploaded files are written temporarily
  });

  var upload = multer({
    storage: storage
  }).any();

  upload(req, res, function(err) {
    if (err) {
      console.log(err);
      return res.end('Error');
    } else {
      console.log(req.body);
      req.files.forEach(function(item) {
        // console.log(item);
        // do something with the item,
        const data = fs.readFileSync(item.path);
        console.log(data);
      });
      res.end('File uploaded');
    }
  });
});
To understand the example code in depth, see the multer documentation. Remember, you will get the file data as a Buffer and not as the parsed contents.
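If you would rather avoid touching the disk at all, multer also ships a memory storage engine that hands you each file as a Buffer, which is exactly what pdfText wants. A minimal sketch, assuming pdfText is the module from the question, already required elsewhere:

var multer = require('multer');

// keep uploads in memory so each file arrives as a Buffer on file.buffer
var upload = multer({ storage: multer.memoryStorage() }).any();

app.post('/file_upload', function(req, res) {
  upload(req, res, function(err) {
    if (err) {
      console.log(err);
      return res.end('Error');
    }
    req.files.forEach(function(file) {
      // file.buffer holds the raw file contents; hand it straight to pdfText
      pdfText(file.buffer, function(err, chunks) {
        console.log(chunks);
      });
    });
    res.end('File uploaded');
  });
});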

Write file to directory then zip directory

I am trying to write a file to a templates directory and then stream a zip containing the content that was written. However, when the zip file is returned it says Failed - Network Error, which is due to the fs.writeFile in the controller. If I remove the writeFile call, the zipping works fine. My question is: how do I first write the file and then run the zip? There seems to be a timing issue between the archiving and the asynchronous writing of typeArrayString.
Controller:
exports.download_one_feed = function(req, res, next) {
  Feed.findOne({ 'name': req.params.id })
    .exec(function(err, dbfeeds) {
      if (err) {
        res.send('get error has occured in routes/feeds.js');
      } else {
        const feedArray = dbfeeds.feed.data;

        // write file
        // get from db collection & write file to download
        const typeArrayString = JSON.stringify(feedArray);
        let type = 'b'; // test only
        fs.writeFile(path.join(appDir, 'templates/' + type + '/template.json'), typeArrayString, (err) => {
          if (err) throw err;
          console.log('Saved!');
        });

        archiverService.FileArchiver(feedArray, res);
      }
    });
};
Archive Service
const archiver = require('archiver');
const zip = archiver('zip');
const path = require('path');
const fs = require('fs');

const appDir = path.dirname(require.main.filename);

exports.FileArchiver = function (feedArray, res) {
  // const app = this.app;
  const uploadsDir = path.join(appDir, '/uploads/');
  const templatesDir = path.join(appDir, '/templates/');
  const extensions = [".jpg", ".png", ".svg"];
  let imageArray = [];

  const feedArrayObject = JSON.parse(feedArrayString);
  feedArrayObject.forEach(function(x) { iterate(x); }); // grab image names from object
  imageArray = uniq_fast(imageArray); // remove duplicates

  // zip images
  for (let i = 0; i < imageArray.length; i++) {
    console.log(imageArray[i]);
    const filePath = path.join(uploadsDir, imageArray[i]);
    zip.append(fs.createReadStream(filePath), { name: 'images/' + imageArray[i] });
  }

  res.attachment('download.zip');
  zip.pipe(res);

  // zip template directory
  console.log(templatesDir);
  zip.directory(templatesDir, false);

  zip.on('error', (err) => { throw err; });
  zip.finalize();

  return this;
};
Instead of writing the file and then zipping the directory, I used zip.append to add the new content directly to the archive, overriding the old file in the directory.
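For reference, a rough sketch of that approach inside the archive service, assuming the JSON string (typeArrayString) and the template type are passed in instead of being written to the templates directory first:

exports.FileArchiver = function (feedArray, typeArrayString, type, res) {
  res.attachment('download.zip');
  zip.pipe(res);

  // append the freshly generated JSON straight into the archive, so there is
  // no race with fs.writeFile and no stale template.json on disk
  zip.append(typeArrayString, { name: 'templates/' + type + '/template.json' });

  // ...append images and other template assets as before...

  zip.finalize();
};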

How can we upload multiple blobs to Azure using Node.js?

I am trying to upload 6 images, which I get from a registration form, to Azure Blob Storage from a single endpoint. The code shows how to upload a single blob, but I need to upload multiple blobs at the same time. How can I do it?
Here is my code:
app.post('/upload', function (req, res) {
  //var dirname = require('path').dirname(__dirname);
  //var dirname1 = require('path').dirname(dirname);
  var filename = req.files[0].filename;
  var path = req.files[0].path;
  var type = req.files[0].mimetype;

  var options = {
    contentType: type,
    metadata: { fileName: filename }
  };

  blobSvc.createBlockBlobFromLocalFile(containerName, filename, path, options, function (error, result, response) {
    if (error != null) {
      console.log('Azure Full Error: ', error);
    } else {
      console.log(result);
      console.log(response);

      var user = new User();
      user.name = req.body.name;
      user.picture = 'https://yourblob.blob.core.windows.net/profile/' + result.name;
      user.save(function (err) {
        if (err) {
          return res.json(err.message);
        } else {
          return res.json({ User: user });
        }
      });
    }
  });
});
The Azure Storage SDK for Node.js is based on the RESTful APIs, and upload functionality is implemented via Put Blob.
There is no RESTful API or SDK function that uploads multiple independent blobs to Azure in a single call.
You can implement this functionality yourself by uploading the files in a loop, as sketched below.
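A rough sketch of that loop, reusing blobSvc and containerName from the question and wrapping each upload in a Promise so the response is only sent once every blob is done:

app.post('/upload', function (req, res) {
  // upload every file on the request, not just req.files[0]
  var uploads = req.files.map(function (f) {
    return new Promise(function (resolve, reject) {
      var options = {
        contentType: f.mimetype,
        metadata: { fileName: f.filename }
      };
      blobSvc.createBlockBlobFromLocalFile(containerName, f.filename, f.path, options,
        function (error, result) {
          if (error) {
            reject(error);
          } else {
            resolve(result.name);
          }
        });
    });
  });

  Promise.all(uploads)
    .then(function (names) {
      return res.json({ uploaded: names });
    })
    .catch(function (error) {
      console.log('Azure Full Error: ', error);
      return res.status(500).json(error.message);
    });
});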
