The file I get from the backend says it's corrupted (pdf) or shows nothing (image/png).
This is how I upload to AWS:
const s3 = new AWS.S3();
try {
  await s3.upload({
    Bucket: bucket,
    Key: filename,
    Body: file,
    ContentType: mimetype,
    ContentDisposition: contentDisposition,
  }).promise();
  return { success: true, data: null };
} catch (e) {
  return { success: false, data: null, message: e.code };
}
This is how I get the object from AWS:
try {
  const data = await s3.getObject({ Bucket: bucket, Key: idAWSFile }).promise();
  return { success: true, data };
} catch (e) {
  return { success: false, data: null, message: e.code };
}
This is the object I get on the client side:
{
  AcceptRanges: "bytes",
  Body: <Buffer>,
  ContentDisposition: 'attachment; filename = "blablabla"',
  ContentLength: 23361,
  ContentType: "application/pdf",
  ETag: <randomnumbers>,
  LastModified: <DATE>,
  Metadata: {},
}
This is how I'm trying to download it on the client side:
const blob = new Blob([data.Body], { type: data.ContentType });
const url = URL.createObjectURL(blob);
window.open(url);
What I have tried:
I tried file-saver, with the same result: a corrupted file.
I tried calling toString('utf-8') on the Body in the s3.getObject function, as I saw in other SO answers, but then I get gibberish in the Body on the client side and haven't found what to do with it.
Changing the way I build the blob on the client side solved it:
const blob = new Blob([Buffer.from(data.Body, 'binary')], { type: data.ContentType });
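For anyone hitting the same issue, the fix works because the response body has to reach the Blob as raw bytes rather than being re-encoded as text. A minimal sketch of the conversion (the helper name is my own; it assumes `data` has the `Body`/`ContentType` shape shown above, that `Blob` is available as in browsers and Node 18+, and that `Buffer` in the browser comes from a bundler polyfill):

```javascript
// Normalize whatever getObject returned (Buffer, Uint8Array, or plain
// byte array) into a Blob with the right MIME type before download.
function bodyToBlob(body, contentType) {
  // Buffer.from copies raw bytes; converting the Body to a string first
  // is what corrupts binary files like PDFs and PNGs.
  const bytes = Buffer.isBuffer(body) ? body : Buffer.from(body);
  return new Blob([bytes], { type: contentType });
}

// Usage: window.open(URL.createObjectURL(bodyToBlob(data.Body, data.ContentType)));
```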
I am running into an issue where, when I try to upload an image to an S3 bucket, nothing goes through.
Basically the only message I get is
API resolved without sending a response for /api/upload/uploadPhoto, this may result in stalled requests.
In the front end, I have an input which can take multiple files (mainly images), which are then stored in event.target.files.
I have a function that stores each file in a state array, and with the submit button it sends a POST request to my Next.js API.
Here's the logic on the front end:
This function handles the photos: whenever I add a photo, it is automatically added to the listingPhotos state:
const handleListingPhotos = async (e: any) => {
  setMessage(null);
  let file = e.target.files;
  console.log("hello", file);
  for (let i = 0; i < file.length; i++) {
    const fileType = file[i]["type"];
    const validImageTypes = ["image/jpeg", "image/png"];
    if (validImageTypes.includes(fileType)) {
      setListingPhotos((prev: any) => {
        return [...prev, file[i]];
      });
    } else {
      setMessage("Only images are accepted");
    }
  }
};
Once the photos are stored in the state, I can see the file data in the browser's console.log. I run the onSubmit to call the POST API:
const handleSubmit = async (e: any) => {
  e.preventDefault();
  const formData = new FormData();
  formData.append("files[]", listingPhotos);
  await fetch(`/api/upload/uploadPhoto`, {
    method: "POST",
    headers: { "Content-Type": "multipart/form-data" },
    body: formData,
  }).then((res) => res.json());
};
console.log("listingphotos:", listingPhotos);
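Two details in this handler likely explain the "[object File]" bodies in the server log: formData.append("files[]", listingPhotos) appends the whole array, which stringifies each File, and setting Content-Type: multipart/form-data by hand omits the boundary the browser would generate. A hedged sketch of a corrected submit (same state and endpoint assumed):

```javascript
// Append each File individually; appending the array coerces it to the
// string "[object File][object File]...".
function buildFormData(files) {
  const formData = new FormData();
  for (const file of files) {
    formData.append("files[]", file);
  }
  return formData;
}

// const res = await fetch("/api/upload/uploadPhoto", {
//   method: "POST",
//   body: buildFormData(listingPhotos), // no manual Content-Type header:
//   // the browser adds multipart/form-data with the correct boundary
// }).then((r) => r.json());
```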
This then uses the logic below to upload to the S3 bucket, but when I log req.body I get this kind of data:
req.body ------WebKitFormBoundarydsKofVokaJRIbco1
Content-Disposition: form-data; name="files[]"
[object File][object File][object File][object File]
------WebKitFormBoundarydsKofVokaJRIbco1--
api/upload/UploadPhoto logic:
import { NextApiRequest, NextApiResponse } from "next";
const AWS = require("aws-sdk");
const access = {
  accessKeyId: process.env.AWS_ACCESS_KEY_ID as string,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY as string,
};
// creates an S3 client
const s3 = new AWS.S3({ region: "region", credentials: access });

export default async function uploadPhoto(
  req: NextApiRequest,
  res: NextApiResponse
) {
  // take info from parent page
  // console.log("req.body: ", req.body);
  if (req.method === "POST") {
    console.log("req.body", req.body);
    let body = req.body;
    let headers = req.headers;
    let contentType = headers["Content-Type"] || headers["content-type"];
    // check for correct content-type
    if (!contentType.startsWith("multipart/form-data")) {
      return { statusCode: 400, body: "Invalid content type" };
    }
    let boundary = contentType.replace("multipart/form-data; boundary=", "");
    let parts = body.split(boundary);
    for (let part of parts) {
      if (part.startsWith("Content-Disposition")) {
        let [fileData] = part.split("\r\n\r\n");
        fileData = fileData.slice(0, -2);
        let [fileName] = part.split("filename=");
        fileName = fileName.slice(1, -1);
        let params = {
          Bucket: "RANDOM BUCKET NAME",
          Key: fileName,
          Body: fileData,
          ContentType: { "image/png": "image/jpg" },
        };
        // Need to set the PARAMS for the upload
        await s3.putObject(params);
        console.log(
          "Successfully uploaded object: " + params.Bucket + "/" + params.Key
        );
      }
    }
    return {
      statusCode: 200,
      body: "File uploaded",
    };
    // Uploads the files to S3
  }
}
I was able to find a way to check whether the files were being received correctly.
req.body {
  fileName: 'b699417375e46286e5a30fc252b9b5eb.png',
  fileType: 'image/png'
}
The POST request code was changed to the following:
const s3Promises = Array.from(listingPhotos).map(async (file) => {
  const signedUrlRes = await fetch(`/api/upload/uploadPhoto`, {
    method: "POST",
    body: JSON.stringify({
      fileName: file.name,
      fileType: file.type,
    }),
    headers: { "Content-Type": "application/json" },
  });
});
Obviously, this is not the whole solution, but it's part of it. The only problem I am running into right now is handling CORS so I can confirm the files reach the bucket.
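The { fileName, fileType } body above points at a presigned-URL flow: the API route signs a putObject URL, the browser PUTs the file straight to S3, and the bucket's CORS configuration must allow that PUT. A sketch under those assumptions (the bucket name, expiry, and helper name are placeholders of mine, not the asker's code):

```javascript
// Build the params the API route would pass to
// s3.getSignedUrl("putObject", ...) for the { fileName, fileType }
// JSON the client sends.
function buildPutParams(bucket, fileName, fileType, expiresSeconds = 60) {
  return {
    Bucket: bucket,
    Key: fileName,
    ContentType: fileType,
    Expires: expiresSeconds,
  };
}

// Server side (inside the API route):
//   const url = s3.getSignedUrl("putObject", buildPutParams("my-bucket", fileName, fileType));
//   res.status(200).json({ url });
//
// Client side, after reading { url } from the signedUrlRes above:
//   await fetch(url, {
//     method: "PUT",
//     headers: { "Content-Type": file.type },
//     body: file, // the File object itself, not FormData
//   });
```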
I am using S3 for storing/accessing my ".glb" files. When a file is uploaded, it is stored with the ".glb" file type, but when I access it through the signed URL it comes back with a generic "file" type.
exports.uploadGLB = async (req, res) => {
  const file = req.file;
  //console.log("skslsk", file);
  let { _id } = req.params;
  //console.log("ID", _id);
  const imageName = generateFileName();
  const result = await uploadFile(file, imageName, file.mimetype);
  await unlinkFile(file.path);
  console.log(">>>>>>>>>>>>>>>", result);
  User.findOneAndUpdate(
    { _id: _id },
    { $push: { metaSpaces: result } },
    function (error, success) {
      if (error) {
        console.log(error);
      } else {
        console.log(success);
      }
    }
  );
  res.status(201).json({
    message: "Uploaded",
  });
};
The snippet above uploads the file and generates the signed URL via the s3-config.js file.
The code below is the controller of the API responsible for uploading the file:
exports.uploadFile = async (file, name, mimetype) => {
  //console.log("Uploadingg,,,,,,")
  //console.log("paramsss", fileBuffer, fileName, bucketName)
  const uploadParams = {
    Bucket: bucketName,
    Body: fs.createReadStream(file.path),
    Key: name,
    ContentType: mimetype,
  };
  const signedUrlExpireSeconds = 60 * 5;
  let url = s3Client.getSignedUrl("getObject", {
    Bucket: bucketName,
    Key: name,
    Expires: signedUrlExpireSeconds,
  });
  // console.log("url", url);
  return new Promise((resolve, reject) => {
    s3Client.upload(uploadParams, (err, data) => {
      if (data) {
        let res = { url: url };
        resolve(Object.assign(res, data));
      } else {
        reject(err);
      }
    });
  });
};
This is the log output after the file is successfully uploaded to S3, where you can see the mimetype is correct:
{
  fieldname: 'glb',
  originalname: 'Tiger.glb',
  encoding: '7bit',
  mimetype: 'model/gltf-binary',
  destination: 'files/',
  filename: '534e864015afbad9724e6686e19c481b',
  path: 'files\534e864015afbad9724e6686e19c481b',
  size: 9201964
} f998b0a3efe7b9f0387580c9bc3dea0fc36e754839f029522cc46631c4fedc8f model/gltf-binary
But when I try to access the same file using the generated signed URL, the file type changes to just "file", as shown when I check its properties.
[Screenshot of the file's properties, showing the file type]
But when I follow the exact same steps for an image file (png, jpeg, etc.), it works without any issue: I can access exactly the file that was uploaded.
Any solution to the above problem would be appreciated. Thank you in advance.
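One possible explanation, offered as an assumption rather than a confirmed fix: operating systems and browsers generally don't recognize model/gltf-binary, and the generated Key has no .glb extension, so the download falls back to a generic "file" type. Presigned GET URLs can override the response headers via S3's ResponseContentType and ResponseContentDisposition parameters; a sketch of params that could be passed to s3Client.getSignedUrl("getObject", ...):

```javascript
// Ask S3 to override the response headers when the signed URL is used,
// so the browser saves a properly named .glb with the right MIME type.
function buildSignedGetParams(bucketName, key, expiresSeconds) {
  return {
    Bucket: bucketName,
    Key: key,
    Expires: expiresSeconds,
    // These map to the response-content-* query parameters that S3
    // honors on presigned GET requests.
    ResponseContentType: "model/gltf-binary",
    ResponseContentDisposition: `attachment; filename="${key}.glb"`,
  };
}

// let url = s3Client.getSignedUrl("getObject",
//   buildSignedGetParams(bucketName, name, signedUrlExpireSeconds));
```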
I need to upload a video to Bunny Stream (see the Bunny docs).
I convert the file to base64, as they do in the "upload file" BODY PARAMS section: if you select js axios as the LANGUAGE, you'll see the data value set in base64.
This is my code:
function UploadVideo(e) {
  const data = new FormData();
  let file = e.target.files[0];
  let video;
  const toBase64 = file => new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.readAsDataURL(file);
    reader.onload = () => resolve(reader.result);
    reader.onerror = error => reject(error);
  });
  async function Main() {
    video = await toBase64(file);
  }
  Main();
  const c_options = {
    method: 'POST',
    url: 'https://video.bunnycdn.com/library/49034/videos',
    headers: {
      Accept: 'application/json',
      'Content-Type': 'application/*+json',
      AccessKey: ''
    },
    data: '{"title":"test"}'
  };
  axios.request(c_options).then(function (c_response) {
    // upload start
    const u_options = {
      method: 'PUT',
      url: `https://video.bunnycdn.com/library/49034/videos/${c_response.data.guid}`,
      headers: {
        Accept: 'application/json',
        AccessKey: ''
      },
      data: video,
    };
    axios.request(u_options).then(function (u_response) {
      // post url to php
      console.log(u_response.data);
    }).catch(function (error) {
      console.error(error);
    });
    // upload end
    console.log(c_response.data);
  }).catch(function (error) {
    console.error(error);
  });
}
But it returns status code 400.
The 400 error text: "Failed to read the request form. Form key length limit 2048 exceeded."
How can I do this?
The error is telling you that a form key is too long.
Your data property is being treated as form data, because axios's default Content-Type (which you have omitted) is application/x-www-form-urlencoded.
If you want to send the file, you need to set the Content-Type header to application/octet-stream; i.e., your headers object should be:
headers: {
  Accept: 'application/json',
  AccessKey: '',
  'Content-Type': 'application/octet-stream'
},
This is shown in the JavaScript example on the bunny.net page you link to.
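Two more details worth flagging as observations about the original snippet, not Bunny-specific requirements: the hyphenated Content-Type key must be quoted to be valid JavaScript, and Main() is never awaited, so video may still be undefined when the PUT fires. A sketch of the header builder, with a possible corrected flow in comments (library ID and AccessKey are placeholders):

```javascript
// Headers for the raw-bytes PUT; note the quoted "Content-Type" key --
// an unquoted hyphenated key is a syntax error in an object literal.
function buildUploadHeaders(accessKey) {
  return {
    Accept: "application/json",
    AccessKey: accessKey,
    "Content-Type": "application/octet-stream",
  };
}

// async function uploadVideo(file) {
//   const create = await axios.post(
//     "https://video.bunnycdn.com/library/49034/videos",
//     { title: "test" },
//     { headers: { Accept: "application/json", AccessKey: "" } }
//   );
//   // Send the File/Blob itself so axios transmits raw bytes,
//   // instead of a base64 string parsed as form data.
//   await axios.put(
//     `https://video.bunnycdn.com/library/49034/videos/${create.data.guid}`,
//     file,
//     { headers: buildUploadHeaders("") }
//   );
// }
```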
I'm using Node.js and would like to put the images I receive in base64 onto AWS. Everything appears to go well and the upload succeeds, but when I open an image it gives me an error saying the format is wrong. Also, how do I get the image link so I can store it in the database?
The base64 form:
data:image/jpeg;base64,/9j/4QAYRXhpZgAASUkqAAgAAAAAAAAAAAAAAP/......
The function that uploads the image:
const s3 = new aws.S3({ params: { Bucket: process.env.S3_BUCKET } });
let data = this.createData(req.body.image);
s3.putObject(data, (err, response) => {
  if (err) {
    console.log(err);
  } else {
    console.log(response);
    /*tmp = task
      .update(req.body)
      .then(() => res.status(200).send(JSON.stringify(task.id_creator)))
      .catch(error => res.status(400).send(error));*/
  }
});
createData(image) {
  //TODO NOME CARTELLA
  let data = {
    Key: 'test1',
    Body: image,
    ContentEncoding: 'base64',
    ContentType: 'image/jpeg'
  };
  return data;
}
Everything apparently goes well; the response is:
{ ETag: '"20eaa681c71825d8f57472eb378be651"',
  VersionId: 'kjQCDdfoq5H0Clhbs79SU4JiIUq8BgOn' }
But when I go to the S3 console in my bucket and download the image, it gives me an error ("format is wrong").
I figured out a solution:
I just added
let buf = Buffer.from(req.body.image.replace(/^data:image\/\w+;base64,/, ""), 'base64');
and sent buf as the data.
And I added an ACL parameter:
createData(image) {
  //TODO NOME CARTELLA
  let data = {
    Key: 'test1',
    Body: image,
    ContentEncoding: 'base64',
    ContentType: 'image/jpeg',
    ACL: 'public-read'
  };
  return data;
}
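The decode step of that fix can be isolated into a small helper: strip the data-URL prefix, then decode the remainder as base64, so S3 stores raw image bytes (with a raw Buffer as the Body, the ContentEncoding: 'base64' parameter is arguably no longer needed; Buffer.from also replaces the deprecated new Buffer constructor):

```javascript
// Turn "data:image/jpeg;base64,..." into a Buffer of raw image bytes.
function dataUrlToBuffer(dataUrl) {
  const base64 = dataUrl.replace(/^data:image\/\w+;base64,/, "");
  return Buffer.from(base64, "base64");
}

// s3.putObject({ Key: 'test1', Body: dataUrlToBuffer(req.body.image),
//   ContentType: 'image/jpeg', ACL: 'public-read' }, callback);
```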
I have been struggling with this one for a while now and need your help! I am trying to streamline the process of downloading a report from my website. With Node this is fairly straightforward with a read stream, like so:
router.post('/report', (req, res) => {
  const filename = 'test.png';
  const filePath = path.join(__dirname, filename);
  fs.exists(filePath, exists => {
    if (exists) {
      const stat = fs.statSync(filePath);
      res.writeHead(200, {
        'Content-Type': 'image/png',
        'Content-Length': stat.size,
        'Content-Disposition': 'attachment; filename=' + filename,
      });
      fs.createReadStream(filePath).pipe(res);
    } else {
      res.writeHead(400, { 'Content-Type': 'text/plain' });
      res.end('ERROR File does NOT Exists');
    }
  });
});
Now if I try this with Postman or some other API tester, it works perfectly: the file is downloaded and saved correctly. But I am struggling to get this to work on my front end. I am currently running AngularJS and have tried to use FileSaver.js to take this data and save it, but it never works. The file is saved, but the data is unreadable; the image previewer says the image is broken. I think I am creating the Blob incorrectly?
function exportReport(_id) {
  this.$http
    .post(
      '/api/report',
      { _id },
      {
        headers: {
          'Content-type': 'application/json',
          Accept: 'image/png',
        },
      }
    )
    .then(data => {
      console.log(data);
      const blob = new Blob([data], {
        type: 'image/png',
      });
      this.FileSaver.saveAs(blob, 'testing.png');
    });
}
The console log result is as follows:
Object {data: "�PNG
↵↵
IHDRRX��iCCPICC Profi…g�x #� #������Z��IEND�B`�", status: 200, config: Object, statusText: "OK", headers: function}
Am I supposed to decode the object.data?
Try adding responseType: 'blob' to the request and omit creating a new blob:
function exportReport(_id) {
  this.$http
    .post(
      '/api/report',
      { _id },
      {
        headers: {
          'Content-type': 'application/json',
          Accept: 'image/png',
        },
        responseType: 'blob',
      }
    )
    .then(data => {
      console.log(data);
      this.FileSaver.saveAs(data.data, 'testing.png');
    });
}