I'm using angular and multer-s3 to upload files from an angular app to a node server. Everything works well on desktop, but for some reason, when uploading the photo from my iPhone 7, the uploaded file is corrupt. I'm using the same image and running through the same flow on both devices but getting different results, so I'm assuming it's because of mobile?
Here's the alert I get when trying to open the S3 file on mobile:
The file “1519398514215-test.png” could not be opened because it is empty.
Here's my code:
var aws = require('aws-sdk');
var path = require('path');
var path3 = path.join(__dirname, "../config/config-aws.json");
var express = require('express');
var router = express.Router(); // needed for the route below
var multer = require('multer');
var multerS3 = require('multer-s3');
var request = require('request');

aws.config.loadFromPath(path3);
var s3 = new aws.S3();
var fileName = '';

var uploadM = multer({
    storage: multerS3({
        s3: s3,
        bucket: 'XXXX',
        acl: 'public-read',
        metadata: function (req, file, cb) {
            cb(null, {fieldName: file.fieldname + '.png'});
        },
        key: function (req, file, cb) {
            fileName = Date.now().toString() + "-" + file.originalname + '.png';
            cb(null, fileName);
        }
    })
});
router.post('/', uploadM.array('photos', 3), function (req, res) {
    if (res.error) {
        console.log(res.error.stack);
        return res.status(400).json({
            message: "Error",
            error: res.error
        });
    }
    const url = 'https://s3-us-west-2.amazonaws.com/XXXX/' + fileName;
    return res.status(200).json({
        fileName: url
    });
});
And here's my client-side:
sendImage() {
    const formData: FormData = new FormData();
    this.removeObjectFromCanvas('polygon');
    if (!fabric.Canvas.supports('toDataURL')) {
        alert('This browser doesn\'t provide means to serialize canvas to an image');
    } else {
        // window.open(this.canvas.toDataURL('png'));
        const image = new Image();
        image.src = this.canvas.toDataURL('png');
        const blob = this.dataURItoBlob(image.src);
        const file = new File([blob], 'test.png');
        formData.append('photos', file, 'test');
        this.postFile(formData);
    }
}
postFile(file) {
    this.fileService.post(file)
        .subscribe(data => {
        }, error => {
            console.log(error);
        });
}
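For context, the dataURItoBlob helper isn't shown here; a typical implementation (just a sketch, assuming the data URI is base64-encoded) looks something like this:

dataURItoBlob(dataURI) {
    // Split 'data:image/png;base64,....' into the header and the payload
    const parts = dataURI.split(',');
    const mime = parts[0].match(/:(.*?);/)[1];
    // atob decodes the base64 payload into a binary string
    const byteString = atob(parts[1]);
    // Copy each char code into a typed array so the Blob gets raw bytes
    const bytes = new Uint8Array(byteString.length);
    for (let i = 0; i < byteString.length; i++) {
        bytes[i] = byteString.charCodeAt(i);
    }
    return new Blob([bytes], { type: mime });
}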
UPDATE **********
So I found out you can debug on mobile. It looks like the buffer I am sending has data in it. My first thought was that the buffer was not sending.
**** Update
Still can't figure this out. I've done some research and it's possible it has something to do with formData and append? But as you can see by the image above, both seem to be fine. Will continue to research...
***** UPDATE
Definitely uploading empty files. But it's only on mobile?
Also, I checked the formData prior to sending it to the node server; it seems to have the correct data in it.
*** UPDATE
Ok, even weirder experience. It seems multer-s3 is uploading empty files, but when I take the file on the server side and return it to the client side, then read that file and display it, the image is displayed perfectly. So the formData is not the issue; it's something with multer-s3, I'm assuming?
****UPDATE
I forgot to mention I am using fabricjs and getting the image from the canvas. I read in some places there may be an issue there, but as I said above, when I send the file to the server and send it back to the client, after reading the file it displays the image perfectly.
****Update
I tried adding contentType to the multer method and now I'm receiving a 503 Service Unavailable error when running on mobile only. On desktop it is fine.
aws.config.loadFromPath(path3);
var file1;
var s3 = new aws.S3();
var fileName = '';

var uploadM = multer({
    storage: multerS3({
        s3: s3,
        bucket: 'rent-z',
        acl: 'public-read',
        contentType: function (req, file, cb) {
            cb(null, 'image/png');
        },
        metadata: function (req, file, cb) {
            console.log(file);
            cb(null, {fieldName: file.fieldname});
        },
        key: function (req, file, cb) {
            fileName = Date.now().toString() + "-" + file.originalname;
            file1 = file;
            cb(null, fileName);
        }
    })
}).array('photos', 1);
router.post('/', function (req, res) {
    uploadM(req, res, function (err) {
        if (err) {
            console.log(err);
            return res.status(400).json({
                message: "Error uploading to multer",
                error: err
            });
        }
        console.log('worked');
        if (res.error) {
            console.log(res.error.stack);
            return res.status(400).json({
                message: "Error",
                error: res.error
            });
        }
        // fs.readFile(req.body, function (err, data) {
        const url = 'https://s3-us-west-2.amazonaws.com/rent-z/' + fileName;
        return res.status(200).json({
            fileName: url
        });
        // });
    });
});
I even tried multer-s3's automatic MIME-type detection and got the same result.
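For reference, that option is the multerS3.AUTO_CONTENT_TYPE constant, which sniffs the content type from the file stream instead of hard-coding one; a sketch of the same setup with it:

var uploadM = multer({
    storage: multerS3({
        s3: s3,
        bucket: 'rent-z',
        acl: 'public-read',
        // Detects the content type per file instead of forcing 'image/png'
        contentType: multerS3.AUTO_CONTENT_TYPE,
        key: function (req, file, cb) {
            cb(null, Date.now().toString() + "-" + file.originalname);
        }
    })
}).array('photos', 1);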
**** Day 4
It's been 96 hours since I started debugging this problem. No progress has been made. Still trying to figure out why it's working on desktop and not mobile. For anyone looking for a quick summary of the behavior:

1. User uploads image on the desktop
2. User places image on canvas
3. User scales image
4. User presses sendImage
5. This converts the image to a dataUri, then a blob
6. This blob is added to a file which is appended to formData
7. This formData is sent to a nodejs server, where the multer-s3 middleware uploads the file to s3 successfully

User tries on mobile: fails at step 7. The file is uploaded but is empty.
Let me know if anyone has any ideas on how to continue.
I'll make this an "official" answer since this may work for your needs. Anytime I have an intricate issue like this, my first thought is often "I wonder if there is an API/SaaS/service out there that can abstract this for me." As you've found, file uploads are tricky, particularly when you start throwing in the myriad devices we have to deal with these days.
I won't mention any particular services, but googling "file upload saas" will generally get you the top industry players. For $25 - $50/month you can abstract file uploads to a very simple api call. Not only do you get time savings now, but (assuming you choose a solid provider) you get no more headaches regarding file uploads in the future. It's the SaaS's job to make sure file uploads work on a million different devices; it's the SaaS's job to make sure S3 integration works, even when S3's api changes; it's the SaaS's job to make sure the user sees a nice friendly message if their upload fails for some reason, etc. I get to spend my time building features for our app instead of worrying about whether or not file uploads work on the iPhone 47.
"But then I'm tied to a SaaS, and live at the whim of their prices and feature set" Ah, but you can minimize that problem. For many services we use, I like to make a wrapper/interface/whatever you'd like to call it. In the case of file uploads, I made an ES6 module: fileUploads.js
In this module, I have a method upload. What does this method do? It simply implements and abstracts the API of [fileupload SaaS X]. If in the future we want, or need, to change from SaaS X to SaaS Y, I only have to change one thing in our entire app: my fileUpload.js module.
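A minimal sketch of that kind of wrapper; the saas-x-sdk import and its option names are placeholders for whichever provider you pick:

// fileUploads.js: thin wrapper so the rest of the app never touches
// the SaaS API directly. 'saas-x-sdk' is a placeholder, not a real package.
import saasClient from 'saas-x-sdk';

export async function upload(file, options = {}) {
    // If we ever switch providers, only this function changes.
    const result = await saasClient.upload(file, {
        folder: options.folder || 'uploads'
    });
    return { url: result.url, id: result.id };
}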
I'm building an AWS Lambda API, using the serverless stack, that will receive an image as a binary string, then store that image on GitHub and send it over FTP to a static app server. I am NOT storing the image in S3 at all, as I am required not to.
I have already figured out how to save the image to GitHub (by converting binary to base64); the issue is sending the binary image data over FTP to another server to hold statically. The static image server is pre-existing and I cannot use S3. I am using the ftp npm package to send the image. The image server indeed receives the image, BUT NOT in the correct format, and the image is just non-displayable junk data. I saw an example of how to do this on the client side by encoding the image in a Uint16Array and passing it to a new Blob() object, but sadly, NODE.JS DOES NOT SUPPORT BLOBS!
Using ftp npm module:
async sendImage(image) {
    try {
        const credentials = await getFtpCredentials(this.platform);
        console.log('Sending files via FTP client');
        return new Promise((resolve, reject) => {
            let ftpClient = new ftp();
            ftpClient.on('ready', () => {
                console.log(`Putting ${image.path} onto server`);
                // Set transfer type to binary
                ftpClient.binary(err => {
                    if (err)
                        reject(err);
                });
                // image.content is binary string data
                ftpClient.put(image.content, image.path, (err) => {
                    if (err) {
                        reject(err);
                    }
                    console.log('Closing FTP connection');
                    ftpClient.end();
                    resolve(`FTP PUT for ${image.path} successful!`);
                });
            });
            console.log('Connecting to FTP server');
            ftpClient.connect(credentials);
        });
    } catch (err) {
        console.log('Failed to load image onto FTP server');
        throw err;
    }
}
This procedure sends the binary data to the server, but the data is unreadable by any browser or image viewer. Do I need to use another FTP package? Or am I just not encoding this right? I've spent days googling the answer to this seemingly common task and it's driving me up the wall! Can anyone instruct me on how to send binary image data from a node.js lambda function over FTP to another server so that the image is encoded properly and works when viewed? ANY help is very much appreciated!
So, it turns out that the ftp node module I was using was the issue all along. It will corrupt any binary image transferred over FTP. I submitted a ticket to their GitHub repo, but they haven't made a commit in 4 years, so I don't expect a timely fix.
To solve the problem, I used the Basic FTP package instead:
const ftp = require('basic-ftp');
const { Readable } = require('stream');

async sendImage(image) {
    try {
        const { host, user, password } = await getFtpCredentials(this.platform);
        console.log('Received FTP credentials');

        console.log('Creating FTP client');
        const client = new ftp.Client();
        client.ftp.verbose = true;

        console.log('Logging in to remote server via FTP');
        await client.access({
            host, user, password, secure: false
        });

        console.log('Creating readable stream from binary data');
        const buffer = Buffer.from(image.content, 'binary');
        const imgStream = new Readable();
        imgStream._read = () => {};
        imgStream.push(buffer);
        imgStream.push(null);

        console.log('Uploading readable stream to server via FTP');
        const result = await client.upload(imgStream, image.path);
        client.close();
        return result;
    } catch (err) {
        console.log('Failed to load image onto FTP server');
        throw err;
    }
}
Note that this package requires data to be transferred via a Readable stream. Thanks to this SO answer, I learned how to convert a binary string to a Readable stream. All works fine.
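As a side note, on Node 12.3+ the manual _read/push approach can be replaced with stream.Readable.from; wrapping the buffer in an array yields it as a single chunk:

const { Readable } = require('stream');

// One chunk: the whole image buffer
const imgStream = Readable.from([Buffer.from(image.content, 'binary')]);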
Happy coding!
Sorry, I tend to be a bad writer when I have not fully woken up, let me revise.
I am using expressjs with passportjs (local strategy) to manage my server and connect-busboy to manage file uploading. I do not think passport plays a role in this.
Here is the server code for managing file uploads:
app.post('/upload', isLoggedIn, (req, res) => {
    if (req.busboy) {
        req.pipe(req.busboy);
        req.busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
            if (mimetype.match(/^image\//)) {
                var root = path.join(__dirname, "../public/images/");
                if (fs.existsSync(path.join(root, filename))) {
                    var name = getUnique(path.join(root, filename));
                } else {
                    var name = filename;
                }
                var ws = fs.createWriteStream(path.join(root, name), { flags: "a" });
                file.pipe(ws);
            }
        });
    }
});
As for my client page, it is used to change a JSON object which gets re-uploaded to the server as a configuration tool. When I upload a new image asynchronously, I need to get the filename back to update this JSON object while working on it. For uploading from the client's end I am using dropzonejs, which did not require any configuration on my part to work.
So, in summary: I upload a number of images via dropzone asynchronously; busboy and fs on my server save the file; and I would like to get the filename returned to my javascript to modify the existing JSON object.
Edit solution:
Thanks to Elliot Blackburn for pointing me in the right direction.
By calling:
ws.on('close', () => {
    res.send({filename: name});
});
after file.pipe(ws); I send the response back to the client. On the client side, modify dropzone to handle the response like so:
dropzone.on('success', (file, res) => {
    console.log(res);
});
Just send it in the normal HTTP response. It'll depend on what library you're using, but most will allow you to trigger a normal (req, res, next) express call. From that you can access the file object and return anything you want.
Something like:
res.send({filename: name}); // name is the filename var set earlier in the code.
Once you've finished editing the file and such, you can get the name, put it into that returned object, and your client will receive it as the response, which it can act upon.
I'm having some trouble wrapping my head around a couple of concepts when it comes to file management and whatnot within NodeJS, and wondered if anyone can point me in the right direction.
I have a queue of uploads that will happen simultaneously as the user needs (allowing them to pause or delete an upload at will), and I need the process to hit the Node server to resize the image or manipulate the video as needed. From there I need Node to upload the video or image to S3 via AWS-SDK (I set up a fakeS3 server for general sandbox testing), and as it is being uploaded I need to track the progress of that upload within the browser.
I am using React and server-side rendering with Node, so I figured there has to be a very straightforward way of handling this situation, but I cannot find anyone who has done this previously. Here are a couple of concepts I was messing with:
Server.js (core input)
server.use(express.static(path.join(__dirname, 'public')));
server.use(multer({
    dest: './public/temp',
    limits: {
        fieldNameSize: 50,
        files: 1,
        fields: 5,
        fileSize: 1024 * 1024
    },
    rename: (fieldname, filename) => {
        return filename;
    },
    onFileUploadStart: (file) => {
        console.log('Starting file upload process.');
    },
    inMemory: true
}));
server.use(cookieParser());
server.use(bodyParser.urlencoded({ extended: true }));
server.use(bodyParser.json());
Intermediate route (/upload or something)
export function uploadContent(req, res) {
    const config = {
        s3ForcePathStyle: true,
        accessKeyId: 'ACCESS_KEY_ID',
        secretAccessKey: 'SECRET_ACCESS_KEY',
        endpoint: new AWS.Endpoint('http://localhost:4567'), // TODO make live
    };
    const client = new AWS.S3(config);
    const params = {
        Key: 'Key',
        Bucket: 'Bucket',
        Body: fs.createReadStream(req.body.files[0])
    };
    client.upload(params, function uploadCallback(err, data) {
        console.log(err, data);
    });
}
This does not work, due to body-parser conflicting with the route, and multer is having a hard time as well (I'm open to other suggestions). Any information on how this would be accomplished would be awesome. I'm not looking for full code, just another idea to get me on the right path. I appreciate any help!
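A common way to avoid that body-parser conflict, for what it's worth, is to scope multer to the upload route instead of registering it globally; a sketch using the newer multer API, with the 'file' field name as an assumption:

const multer = require('multer');
const upload = multer({ dest: './public/temp' });

// multer only parses multipart bodies on this route,
// so bodyParser.json()/urlencoded() never touch them.
server.post('/upload', upload.single('file'), uploadContent);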
You can use a socket channel between your browser and your Node.js route, emit start, end, and progress events, and use the httpUploadProgress event:
var s3obj = new AWS.S3({params: {Bucket: 'myBucket', Key: 'myKey'}});
s3obj.upload({Body: body})
    .on('httpUploadProgress', function(evt) {
        console.log(evt);
        // Emit your events here
    })
    .send(function(err, data) { console.log(err, data); });
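For example, with Socket.IO the wiring might look like this; the io instance, the 'start-upload' message, and the event names are all assumptions:

var AWS = require('aws-sdk');
var io = require('socket.io')(httpServer); // httpServer: your existing HTTP server

io.on('connection', function (socket) {
    socket.on('start-upload', function (msg) {
        var s3obj = new AWS.S3({ params: { Bucket: 'myBucket', Key: msg.key } });
        s3obj.upload({ Body: msg.body })
            .on('httpUploadProgress', function (evt) {
                // evt.loaded / evt.total are byte counts reported by the SDK
                socket.emit('upload-progress', { loaded: evt.loaded, total: evt.total });
            })
            .send(function (err, data) {
                socket.emit(err ? 'upload-error' : 'upload-done', err || data);
            });
    });
});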
I'm using the fileFilter option of the multer plugin to decide whether the image should be saved onto the server or not.
In that fileFilter function, I want to check the magic bytes of these images to ensure they are real images and in the right format. Multer only exposes file, a JSON description of the uploaded image file, as follows. But I need the actual image file to check the magic bytes.
{ fieldname: 'file',
  originalname: 'arsenal-home-kit.jpg',
  encoding: '7bit',
  mimetype: 'image/jpeg' }
I've commented my detailed problem in the following code. My attempt so far is below:
var storage = multer.diskStorage({
    destination: __dirname + '/../public/images/',
    filename: function (req, file, cb) {
        console.log(file.originalname);
        crypto.pseudoRandomBytes(16, function (err, raw) {
            if (err) return cb(err);
            cb(null, raw.toString('hex') + path.extname(file.originalname));
        });
    }
});

var upload = multer({
    storage: storage,
    fileFilter: function (req, theFile, cb) {
        // Using the image-type plugin to check magic bytes.
        // I need the actual image file right here.
        // theFile is a JSON description, not the image file.
        // How do I get the actual image file to check its magic bytes?
        if (imageType(theFile).ext === "jpg") {
            // To accept the file pass `true`, like so:
            cb(null, true);
        } else {
            // To reject this file pass `false`, like so:
            cb(null, false);
        }
    }
});
P.S. I decided to use the image-type plugin to check these magic bytes.
fileFilter is called before the file upload starts, so it has no access to the file data.
There is a similar request: https://github.com/expressjs/multer/issues/155
As you can see, it is on the roadmap.
For now, you can upload the file to a temporary directory, then validate it and either move it to the destination directory or delete it.
In my applications I use two types of validators: one before upload (very limited) and one after upload (the MIME/magic-byte check can happen there).
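A sketch of that after-upload check, using the synchronous imageType(buffer) call style from the question; the paths, field name, and route shape are assumptions:

var fs = require('fs');
var path = require('path');
var multer = require('multer');
var imageType = require('image-type');

// Land uploads in a temp dir first
var upload = multer({ dest: path.join(__dirname, 'tmp') });

app.post('/upload', upload.single('file'), function (req, res) {
    // The magic numbers live in the first few bytes; 12 is enough for image-type
    var buffer = Buffer.alloc(12);
    var fd = fs.openSync(req.file.path, 'r');
    fs.readSync(fd, buffer, 0, 12, 0);
    fs.closeSync(fd);

    var type = imageType(buffer); // null when the bytes match no known image format
    if (type && type.ext === 'jpg') {
        // Valid: move it from the temp dir into the public images dir
        var dest = path.join(__dirname, 'public/images', req.file.filename + '.jpg');
        fs.renameSync(req.file.path, dest);
        return res.json({ ok: true, file: path.basename(dest) });
    }
    // Invalid: remove the temp file and reject
    fs.unlinkSync(req.file.path);
    return res.status(400).json({ ok: false, message: 'Not a real JPEG' });
});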
I've got a simple node.js + Restify backend with standard CORS settings and this endpoint:
var file = '1,5,8,11,12,13,176,567,9483';

server.get('/download', function(req, res, next) {
    res.set({"Content-Disposition": "attachment; filename='numbers.csv'"});
    res.setHeader("Content-type", "text/csv");
    res.send(file);
    return next();
}, function(err) {
    res.send(err);
});
What it's supposed to do is create a CSV file and return it.
It works great when I simply type the endpoint address into the web browser and hit enter. The file gets downloaded properly.
But when I try to do the same thing using Restangular instead of the browser's address bar, like this:
Restangular.one('download').get().then(function (res) {
    console.log(res);
});
it just writes the response to the console, but no file is downloaded.
Is there a way to do this using Restangular? Or maybe I need to use something else for this?
I am not sure if Restangular can do that, but I am using the FileSaver script for stuff like that. Add FileSaver to your HTML head and then:
Restangular.one('download').get().then(function (res) {
    var file = new Blob([res], { type: 'text/csv' });
    saveAs(file, 'something.csv');
});