System architecture
The system consists of three components:
1. FTP server: Stores the files. It can only be accessed by the Node.js app; there is no direct access.
2. Node.js app: Provides an API to interact with the FTP server and is the only way to reach it. The app keeps no collection of the files stored on the FTP server; it only provides methods to work with the server, and it should not persist any file that is uploaded through it or downloaded from it.
3. Client: A user who uploads or downloads files to or from the FTP server via the Node.js app.
What I have done:
I am able to download a file stored on the FTP server using the basic-ftp package. Here is the code for the download function:
async function downloadFile(folderPath, fileName, writeStream) {
    console.log(folderPath);
    console.log(fileName);
    const client = new ftp.Client();
    // client.ftp.verbose = true;
    try {
        await client.access({
            host: process.env.FTP_HOST,
            user: process.env.FTP_USER,
            password: process.env.FTP_PASSWORD,
        });
        await client.ensureDir(folderPath);
        await client.downloadTo(writeStream, fileName);
    } catch (err) {
        console.log(err);
    }
    client.close();
}
The file is downloaded to a directory named /downloads on the Node.js server. What I actually want is to download the file directly to the client's computer. To do that, I tried streaming from the writeStream object passed to the download method. Here is the code for that:
app.post("/download/file", urlencodedParser, (req, res, next) => {
    var writeStream = fs.createWriteStream('./downloads/' + req.body.fileName);
    writeStream.on("data", (data) => {
        res.write(data);
    });
    writeStream.on("close", () => {
        res.end();
    });
    res.setHeader('Transfer-Encoding', 'chunked');
    downloadFile(req.body.folderName, req.body.fileName, writeStream);
});
This does not work; it always ends in an error without downloading the file completely.
Another approach I tried was to generate a URL for the file, which the client clicks to download it. The problem with this approach is that the file on the server is not complete by the time the client starts downloading, which results in an incomplete download. For example, if the file is 10 MB and only 2 MB has arrived from the FTP server when the client clicks the link, the client gets a 2 MB file, not 10 MB.
Goal:
Download the file to the client (browser) from an FTP server through Node.js.
Requirement:
Download the file stored on the FTP server directly to the client through Node.js.
Constraints:
The client does not have access to the FTP server.
The only way to access the FTP server is through the Node.js app.
You can try passing res as the output stream directly. That way you simply redirect the stream from the FTP server to the client:
async function downloadFile(fileName, writeStream) {
    console.log(fileName);
    const client = new ftp.Client();
    // client.ftp.verbose = true;
    try {
        await client.access({
            host: process.env.FTP_HOST,
            user: process.env.FTP_USER,
            password: process.env.FTP_PASSWORD,
        });
        await client.downloadTo(writeStream, fileName);
    } catch (err) {
        console.log(err);
    }
    client.close();
}
app.post("/download/file", urlencodedParser, (req, res, next) => {
    downloadFile(req.body.fileName, res);
});
I've read this article about implementing Google Drive uploads in Node.js. I want to give the users of the app the ability to upload the processed files from the app to their Google Drive account. The article shows how to implement a Node.js solution, but since the server will run on localhost, how can I authorize the user on the client side using Vue.js?
I've found this question, but it's very old and I'm not sure it can really help me.
At the moment my Node.js script saves the processed files on the user's machine using fs.writeFile.
// express endpoint
this.app.post('/processFiles', async (req, res) => {
    for (let file in req.files) {
        // console.log(req.files[file]);
        await this.composeData(req.files[file]);
    }
    res.setHeader('Content-Type', 'application/json');
    res.send({ processStatus: '', outputPath: this.tmpDir });
});

// file-processing method
async composeData(file) {
    // some file compression stuff
    this.output = path.format({ dir: this.tmpDir, base: file.name });
    await fs.writeFile(this.output, this.processedData);
}
Since I want to implement a client-side solution, I'm thinking of adding an endpoint to my express server that sends the processed files back, so the client code can do the Google Drive upload.
// vue app code
processFiles() {
    const formData = new FormData();
    this.selectedFiles.forEach((file, i) => {
        formData.append(`file${i}`, file);
    });
    axios({
        method: 'POST',
        url: 'http://localhost:9000/processFiles',
        data: formData
    }).then((res) => {
        // here I want to implement the gdrive upload
        console.log(res);
    });
}
Can anyone help me with this?
I'm building an AWS Lambda API, using the serverless stack, that will receive an image as a binary string, then store that image on GitHub and send it over FTP to a static app server. I am not storing the image in S3 at all, as I am required not to.
I have already figured out how to save the image to GitHub (by converting binary to base64). The issue is sending the binary image data over FTP to another server to hold it statically. The static image server is pre-existing and I cannot use S3. I am using the ftp npm package to send the image. The image server does receive the image, but not in the correct format: the result is just non-displayable junk data. I saw an example of how to do this on the client side by encoding the image in a Uint16Array and passing it to a new Blob() object, but sadly, Node.js does not support Blobs!
Using ftp npm module:
async sendImage(image) {
    try {
        const credentials = await getFtpCredentials(this.platform);
        console.log('Sending files via FTP client');
        return new Promise((resolve, reject) => {
            let ftpClient = new ftp();
            ftpClient.on('ready', () => {
                console.log(`Putting ${image.path} onto server`);
                // Set transfer type to binary
                ftpClient.binary(err => {
                    if (err)
                        reject(err);
                });
                // image.content is binary string data
                ftpClient.put(image.content, image.path, (err) => {
                    if (err) {
                        reject(err);
                    }
                    console.log('Closing FTP connection');
                    ftpClient.end();
                    resolve(`FTP PUT for ${image.path} successful!`);
                });
            });
            console.log('Connecting to FTP server');
            ftpClient.connect(credentials);
        });
    } catch (err) {
        console.log('Failed to load image onto FTP server');
        throw err;
    }
}
This procedure sends the binary data to the server, but the data is unreadable by any browser or image viewer. Do I need to use another FTP package? Or am I just not encoding this right? I've spent days googling the answer to this seemingly common task and it's driving me up the wall. Can anyone instruct me on how to send binary image data from a Node.js lambda function over FTP to another server so that the image is encoded properly and works when viewed? Any help is very much appreciated!
So, it turns out that the ftp node module I was using was the issue all along. It corrupts any binary image transferred over FTP. I submitted a ticket to their GitHub repo, but they haven't made a commit in 4 years, so I don't expect a timely fix.
To solve the problem, I used the Basic FTP package instead:
const ftp = require('basic-ftp');
const { Readable } = require('stream');

async sendImage(image) {
    try {
        const { host, user, password } = await getFtpCredentials(this.platform);
        console.log('Received FTP credentials');
        console.log('Creating FTP client');
        const client = new ftp.Client();
        client.ftp.verbose = true;
        console.log('Logging in to remote server via FTP');
        await client.access({
            host, user, password, secure: false
        });
        console.log('Creating readable stream from binary data');
        // Buffer.from replaces the deprecated new Buffer() constructor
        const buffer = Buffer.from(image.content, 'binary');
        const imgStream = new Readable();
        imgStream._read = () => {};
        imgStream.push(buffer);
        imgStream.push(null);
        console.log('Uploading readable stream to server via FTP');
        const result = await client.upload(imgStream, image.path);
        client.close();
        return result;
    } catch (err) {
        console.log('Failed to load image onto FTP server');
        throw err;
    }
}
Note that this package requires data to be transferred via a Readable stream. Thanks to this SO answer, I learned how to convert a binary string to a Readable stream. Everything works fine now.
Happy coding!
I'm using Angular and multer-s3 to upload files from an Angular app to a Node server. Everything works well on desktop, but for some reason, when I try to upload the photo via my iPhone 7, the uploaded file is corrupt. I'm using the same image and running through the same flow on both devices but getting different results, so I'm assuming it's because of mobile?
Here's the alert I get when trying to open the S3 file on the mobile
The file “1519398514215-test.png” could not be opened because it is empty.
Here's my code
var aws = require('aws-sdk');
var path = require('path');
var path3 = path.join(__dirname, "../config/config-aws.json");
var multer = require('multer');
var multerS3 = require('multer-s3');
var request = require('request');

aws.config.loadFromPath(path3);
var s3 = new aws.S3();
var fileName = '';

var uploadM = multer({
    storage: multerS3({
        s3: s3,
        bucket: 'XXXX',
        acl: 'public-read',
        metadata: function (req, file, cb) {
            cb(null, { fieldName: file.fieldname + '.png' });
        },
        key: function (req, file, cb) {
            fileName = Date.now().toString() + "-" + file.originalname + '.png';
            cb(null, fileName);
        }
    })
});

router.post('/', uploadM.array('photos', 3), function (req, res) {
    if (res.error) {
        console.log(error.stack);
        return res.status(400).json({
            message: "Error",
            error: res.error
        });
    }
    const url = 'https://s3-us-west-2.amazonaws.com/XXXX/' + fileName;
    return res.status(200).json({
        fileName: url
    });
});
And here's my client-side
sendImage() {
    const formData: FormData = new FormData();
    this.removeObjectFromCanvas('polygon');
    if (!fabric.Canvas.supports('toDataURL')) {
        alert('This browser doesn\'t provide means to serialize canvas to an image');
    } else {
        // window.open(this.canvas.toDataURL('png'));
        const image = new Image();
        image.src = this.canvas.toDataURL('png');
        const blob = this.dataURItoBlob(image.src);
        const file = new File([blob], 'test.png');
        formData.append('photos', file, 'test');
        this.postFile(formData);
    }
}

postFile(file) {
    this.fileService.post(file)
        .subscribe(data => {
        }, error => {
            console.log(error);
        });
}
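For reference, dataURItoBlob is not shown above; a typical implementation of that helper (an assumption about its shape, not necessarily the exact code in my app) looks like this:

```javascript
// Typical dataURI -> Blob conversion: decode the base64 payload
// and wrap the raw bytes in a Blob with the declared MIME type.
function dataURItoBlob(dataURI) {
  const [header, base64] = dataURI.split(',');
  const mime = header.match(/data:(.*?);base64/)[1];
  const binary = atob(base64);
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);
  }
  return new Blob([bytes], { type: mime });
}
```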
UPDATE **********
So I found out you can debug on mobile. It looks like the buffer I am sending has data in it. My first thought was that the buffer was not being sent.
**** UPDATE
Still can't figure this out. I've done some research and it's possible it has something to do with formData and append? But as you can see from the image above, both seem to be fine. Will continue to research...
***** UPDATE
Definitely uploading empty files. But it's only on mobile?
Also, I checked the formData prior to sending it to the node server; it seems to have the correct data in it.
*** UPDATE
Ok, even weirder: it seems multer-s3 is uploading empty files, but when I take the file on the server side and return it to the client side, then read that file and display it, the image displays perfectly. So the formData is not the issue; it's something with multer-s3, I'm assuming?
****UPDATE
I forgot to mention I am using fabricjs and getting the image from the canvas. I read in some places there may be an issue there, but as I said above, when I send the file to the server and send it back to the client, after reading the file it displays the image perfectly.
****Update
I tried adding contentType to the multer method and now I'm receiving a 503 Service Unavailable error, but only when running on mobile. On desktop it is fine.
aws.config.loadFromPath(path3);
var file1;
var s3 = new aws.S3();
var fileName = '';

var uploadM = multer({
    storage: multerS3({
        s3: s3,
        bucket: 'rent-z',
        acl: 'public-read',
        contentType: function (req, file, cb) {
            cb(null, 'image/png');
        },
        metadata: function (req, file, cb) {
            console.log(file);
            cb(null, { fieldName: file.fieldname });
        },
        key: function (req, file, cb) {
            fileName = Date.now().toString() + "-" + file.originalname;
            file1 = file;
            cb(null, fileName);
        }
    })
}).array('photos', 1);

router.post('/', function (req, res) {
    uploadM(req, res, function (err) {
        if (err) {
            console.log(err);
            return res.status(400).json({
                message: "Error uploading to multer",
                error: err
            });
        }
        console.log('worked');
        if (res.error) {
            console.log(error.stack);
            return res.status(400).json({
                message: "Error",
                error: res.error
            });
        }
        // fs.readFile(req.body, function (err, data) {
        const url = 'https://s3-us-west-2.amazonaws.com/rent-z/' + fileName;
        return res.status(200).json({
            fileName: url
        });
        // });
    });
});
I even tried multer-s3's automatic mime-type detection function, and that gave the same result.
**** Day 4
It's been 96 hours since I started debugging this problem and no progress has been made. I'm still trying to figure out why it works on desktop and not on mobile. For anyone looking for a quick summary of the behavior:
1. User uploads an image on the desktop
2. User places the image on the canvas
3. User scales the image
4. User presses sendImage
5. This converts the image to a data URI, then to a blob
6. The blob is added to a File, which is appended to formData
7. The formData is sent to a Node.js server, where the multer-s3 middleware uploads the file to S3 successfully
When the user tries the same flow on mobile, it fails at step 7: the file is uploaded but is empty.
Let me know if anyone has any ideas on how to continue.
I'll make this an "official" answer since this may work for your needs. Anytime I have an intricate issue like this, my first thought is often "I wonder if there is an API/SaaS/service out there that can abstract this for me." As you've found, file uploads are tricky, particularly when you start throwing in the myriad devices we have to deal with these days.
I won't mention any particular services, but googling "file upload saas" will generally get you the top industry players. For $25 - $50/month you can abstract file uploads to a very simple api call. Not only do you get time savings now, but (assuming you choose a solid provider) you get no more headaches regarding file uploads in the future. It's the SaaS's job to make sure file uploads work on a million different devices; it's the SaaS's job to make sure S3 integration works, even when S3's api changes; it's the SaaS's job to make sure the user sees a nice friendly message if their upload fails for some reason, etc. I get to spend my time building features for our app instead of worrying about whether or not file uploads work on the iPhone 47.
"But then I'm tied to a SaaS, and live at the whim of their prices and feature set" Ah, but you can minimize that problem. For many services we use, I like to make a wrapper/interface/whatever you'd like to call it. In the case of file uploads, I made an ES6 module: fileUploads.js
In this module, I have a method upload. What does this method do? It simply implements and abstracts the API of [fileupload SaaS X]. If in the future we want, or need, to change from SaaS X to SaaS Y, I only have to change one thing in our entire app: my fileUploads.js module.
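As a sketch, such a wrapper might look like the following; the provider object and its store method are hypothetical stand-ins for whichever SaaS SDK you pick:

```javascript
// fileUploads.js -- thin wrapper so the rest of the app never touches
// the SaaS SDK directly. Swapping providers means changing only this module.
function createUploader(provider) {
  return {
    // Normalize whatever the provider returns into one stable shape
    // that the rest of the app depends on.
    async upload(file, options = {}) {
      const result = await provider.store(file, options);
      return { url: result.url, id: result.id };
    },
  };
}
```

The app only ever imports createUploader, so a provider change never ripples beyond this file.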
Sorry, I tend to be a bad writer when I have not fully woken up, so let me revise.
I am using expressjs with passportjs (local strategy) to manage my server and connect-busboy to manage file uploading. I do not think passport plays a role in this.
Here is the server code for managing file uploads:
app.post('/upload', isLoggedIn, (req, res) => {
    if (req.busboy) {
        req.pipe(req.busboy);
        req.busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
            if (mimetype.match(/^image\//)) {
                var root = path.join(__dirname, "../public/images/");
                if (fs.existsSync(path.join(root, filename))) {
                    var name = getUnique(path.join(root, filename));
                } else {
                    var name = filename;
                }
                var ws = fs.createWriteStream(path.join(root, name), { flags: "a" });
                file.pipe(ws);
            }
        });
    }
});
As for my client page, it is used to modify a JSON object that gets re-uploaded to the server as a configuration tool. When I upload a new image asynchronously, I need to get the filename back so I can update this JSON object while working on it. For uploading from the client's end I am using dropzonejs, which did not require any configuration on my part to work.
So, in summary: I upload a number of images asynchronously via dropzone; busboy and fs on my server save the file; and I would like the filename returned to my JavaScript so I can modify the existing JSON object.
Edit (solution):
Thanks to Elliot Blackburn for pointing me in the right direction. I added

ws.on('close', () => {
    res.send({ filename: name });
});

after file.pipe(ws); to send the response back to the client. On the client side, I modified dropzone to handle the response like so:

dropzone.on('success', (file, res) => {
    console.log(res);
});
Just send it in the normal HTTP response. It depends on the library you're using, but most will let you hook into a normal (req, res, next) Express handler. From that you can access the file object and return anything you want. Something like:

res.send({ filename: name }); // name is the filename variable set earlier in the code

Once you've finished editing the file, put the name into that returned object; your client will receive it as the response and can act on it.
I have an application that uses axios to make requests to a node server which in turn makes requests to another java server.
Call to node server from client:
// here payload is FormData()
axios.post(url, payload).then((response) => {
    return callback(null, response);
}).catch((err) => {
    return callback(err, null);
});
In the node server, I listen to the request using busboy:
let rawData = '';
const busboy = new Busboy({ headers: req.headers });
busboy.on('file', function (fieldname, file, filename, encoding, mimetype) {
    file.on('data', function (chunk) {
        rawData += chunk;
    });
});
Now the Java server also expects FormData (just like the way I sent it to the Node server). How do I build that FormData in Node now? I have been googling hard and trying a lot of things in vain. Any solution not involving busboy would help too.
I finally used the middleware busboy-body-parser, which adds support for getting files from the request object as req.files. Once the file is there, I send it as form-data to the Java web server using the form-data npm package. The req.files support used to be there by default in Express.js, but from 4.x it has been deprecated.
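On Node 18+, the forwarding step can also be sketched with the built-in FormData and fetch rather than the form-data package; the field name and downstream URL here are placeholders:

```javascript
// Re-wrap a file received from the client (e.g. req.files.myFile as
// exposed by busboy-body-parser: { data: Buffer, name: string }) and
// build the multipart body to forward to the downstream server.
function buildForwardForm(file) {
  const form = new FormData();
  // Wrapping the Buffer in a Blob lets FormData carry the raw bytes
  // and attach the original filename to the part.
  form.append('file', new Blob([file.data]), file.name);
  return form;
}

// Hypothetical usage, forwarding to the Java server:
// await fetch('http://java-server/upload', {
//   method: 'POST',
//   body: buildForwardForm(req.files.myFile),
// });
```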
Multer is another really good middleware for handling multipart/form-data.