How to save binary image from json response to S3 bucket - javascript

I am trying to save a jpeg that I get back from Microsoft's Graph API to an S3 bucket (a Digital Ocean Space actually, but they work the same). I am able to get the image binary from the Graph API and can upload it successfully, but the image is gibberish at the endpoint. Here is the code I am using for reference.
// Prep to fetch user photo from Microsoft Office 365
// using the recently acquired access token
const photoEndpoint = `${MS_GRAPH_URL}/v1.0/me/photo/$value`;
const config = { headers: { Authorization: `Bearer ${access_token}` } };

// Fetch user photo from Microsoft Office 365
const { data: photo } = await axios.get(photoEndpoint, config);

await storeImage(photo, id);
where storeImage is defined as follows:
// Import AWS SDK
const AWS = require("aws-sdk");

// Instantiate an S3 instance to interface with Digital Ocean Spaces
const spacesEndpoint = new AWS.Endpoint("sfo2.digitaloceanspaces.com");
const s3 = new AWS.S3({
  endpoint: spacesEndpoint,
  accessKeyId: process.env.DO_SPACE_KEY,
  secretAccessKey: process.env.DO_SPACE_SECRET
});

const storeImage = (image, key) => {
  const params = {
    ACL: "public-read",
    Bucket: process.env.DO_SPACE_BUCKET,
    Key: `${key}`,
    Body: image,
    ContentType: "image/jpeg"
  };
  return new Promise((resolve, reject) => {
    s3.upload(params, (err, data) => {
      if (err) return reject(err);
      resolve(data);
    });
  });
};
Any thoughts on what I am doing wrong?

Where do you indicate the Content-Type of data you're requesting via your axios call?
The default appears to be application/json rather than binary/octet-stream, so your image data will undergo unwanted transformation. ;)
Adding responseType: 'stream' to your axios request should fix your issue.
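To make that concrete, here is a minimal sketch of the adjusted request, reusing the config object from the question; only the responseType line is new ("arraybuffer" yields a Buffer, and the "stream" suggested above works just as well since s3.upload accepts either):
// Sketch: ask axios for the raw bytes instead of letting it apply its JSON transform.
const { data: photo } = await axios.get(photoEndpoint, {
  ...config,
  responseType: "arraybuffer" // or "stream", as suggested above
});

// photo is now a Buffer (or a readable stream) that s3.upload stores verbatim
await storeImage(photo, id);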

Related

vue3 send image to api gateway rest endpoint and use lambda to store image in s3 bucket

Here's the scenario:
- vue3 app sends an image as a multipart/form-data request to an endpoint
- the endpoint is an API Gateway REST endpoint
- the endpoint triggers a Node.js Lambda function which should take the passed image and store it in an S3 bucket
Expected Result
The S3 bucket should house an image that is publicly accessible.
Actual Result
I believe it's a binary file that gets stored in the bucket; the name and extension are fine, but it isn't an actual image. Here's a link to an example: bad image.
Lambda Code
const AWS = require("aws-sdk");
const parser = require("lambda-multipart-parser");

module.exports = async (event) => {
  const { accountNum } = event.pathParameters;
  const keyName = "logo";
  const BUCKET = "foo-bar";
  const s3 = new AWS.S3();

  const result = await parser.parse(event);
  const { content, filename, contentType } = result.files[0];

  const params = {
    Bucket: BUCKET,
    Key: `${accountNum}/${filename}.png`,
    Body: content,
    // ContentDisposition: `attachment; filename="${filename}";`,
    ContentType: contentType,
    ACL: "public-read"
  };

  let res;
  try {
    res = await s3.upload(params).promise();
    // res = await s3.putObject(params).promise()
    return { status: 200, message: "nailed it" };
  } catch (e) {
    console.log("errrr", e);
    return { status: 200, message: "you did not nail it" };
  }
};
Closing Thoughts
My API Gateway has been updated to allow binary types
My resource has been set up to add the Accept and Content-Type headers on the method request
I cannot figure out why it won't store an actual image in the s3 bucket.
Thanks a lot.
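Not part of the original post, but one hedged diagnostic that may narrow this down: lambda-multipart-parser decodes the body based on event.isBase64Encoded, so if API Gateway's binary media types do not actually match the incoming multipart/form-data request, the parser receives a mangled body and every upload lands as a corrupted object. A quick logging sketch at the top of the handler shows which case applies:
// Hypothetical logging added at the top of the handler above.
module.exports = async (event) => {
  console.log("isBase64Encoded:", event.isBase64Encoded);
  console.log("content-type:", event.headers["content-type"] || event.headers["Content-Type"]);
  // ...rest of the handler unchanged
};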

Upload and pin image with Pinata api on client side, no nodejs

I am trying to use the Pinata API. Here it is:
https://docs.pinata.cloud/
The idea is to upload and pin an image using the API, into my account on Pinata.
I got this sample to upload a file in base64, using Node.js on the server side.
The sample uses this API call:
"https://api.pinata.cloud/pinning/pinFileToIPFS"
I am supposed to be able to do this on the client side as well.
However, there is no sample for the client side without Node.js, and I can't seem to find documentation of exactly what the API call expects.
Here is the sample I got from Pinata support:
const { Readable } = require("stream");
const FormData = require("form-data");
const axios = require("axios");
(async () => {
  try {
    const base64 = "BASE64 FILE STRING";
    const imgBuffer = Buffer.from(base64, "base64");
    const stream = Readable.from(imgBuffer);

    const data = new FormData();
    data.append('file', stream, {
      filepath: 'FILENAME.png'
    });

    const res = await axios.post("https://api.pinata.cloud/pinning/pinFileToIPFS", data, {
      maxBodyLength: "Infinity",
      headers: {
        'Content-Type': `multipart/form-data; boundary=${data._boundary}`,
        pinata_api_key: pinataApiKey,
        pinata_secret_api_key: pinataSecretApiKey
      }
    });
    console.log(res.data);
  } catch (error) {
    console.log(error);
  }
})();
Here is my attempt to perform an upload from client side without Node.js
async function uploadFile(base64Data) {
  const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`;
  var status = 0;
  try {
    let data = new FormData();
    var fileName = "FILENAME.png";
    var file = new File([base64Data], fileName, { type: "image/png+base64" });
    data.append(`data`, file, file.name);
    data.append(`maxBodyLength`, "Infinity");

    const response = await postData('POST', url, {
        'Content-Type': `multipart/form-data; boundary=${data._boundary}`,
        "Authorization": "Bearer Redacted"
      },
      data
    );
  } catch (error) {
    console.log('error');
    console.log(error);
  }
}
What I get as a response from the server is a 400, with the error:
{"error":"Invalid request format."}
What am I doing wrong?
Also, it seems like when I try to use FormData.append with a stream as in the sample, it doesn't work, as if it only expects a Blob.
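For what it is worth, here is a browser-side sketch under a few assumptions that go beyond the post: the multipart field is named file (not data), the Authorization header carries a Pinata JWT, and the base64 string is decoded into bytes before the Blob is built, since appending the raw base64 text stores text rather than image data. The browser also sets the multipart boundary itself, so no Content-Type header is supplied:
// Minimal browser sketch (assumptions noted above) for pinning a base64 image.
async function pinBase64Image(base64Data, jwt) {
  // Decode the base64 payload into raw bytes.
  const bytes = Uint8Array.from(atob(base64Data), (c) => c.charCodeAt(0));
  const file = new File([bytes], "FILENAME.png", { type: "image/png" });

  const form = new FormData();
  form.append("file", file, file.name); // Pinata expects the field to be named "file"

  const res = await fetch("https://api.pinata.cloud/pinning/pinFileToIPFS", {
    method: "POST",
    headers: { Authorization: `Bearer ${jwt}` }, // no manual Content-Type / boundary
    body: form
  });
  if (!res.ok) throw new Error(`Pinata responded with ${res.status}`);
  return res.json();
}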

Download file from Google Drive and upload to S3 using NodeJS

I download my PDF file from Google Drive according to the documentation:
const file = await this.driveClient.files.get(
  {
    fileId: id,
    alt: 'media',
  },
  {
    responseType: 'stream'
  },
);
Then I construct a form data:
const formData = new FormData();
formData.append('file', file.data, 'file.pdf');
And send it to S3 via the presigned upload URL:
const uploadedDocument = await axios({
  method: 'put',
  url: presignedS3Url,
  data: formData,
  headers: formData.getHeaders(),
});
The flow works, but the file uploaded to S3 appears corrupted.
I also tried different response types from the Google API, such as blob. Any idea what I am missing? Thanks in advance!
I managed to solve the issue by piping the file stream from Google Drive directly to S3 using Node.js built-in packages.
import * as https from 'https';
import { promisify } from 'util';
import { pipeline } from 'stream';

// ...

const file = await this.driveClient.files.get(
  {
    fileId: id,
    alt: 'media',
  },
  {
    responseType: 'stream'
  },
);

await promisify(pipeline)(
  file.data,
  https.request(s3SignedUrl, {
    method: 'PUT',
    headers: {
      'content-type': fileMetadata.mimeType,
      'content-length': fileMetadata.size,
    },
    timeout: 60000,
  }),
);
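The likely reason the FormData attempt produced a corrupted file is that a presigned S3 PUT stores the request body verbatim, so the multipart envelope (boundaries and part headers) ends up inside the object. A hedged sketch of the simpler fix with axios, reusing the variable names from the question (their values are assumed):
// Sketch: PUT the stream itself to the presigned URL, with no FormData wrapper.
const uploadedDocument = await axios({
  method: 'put',
  url: presignedS3Url,
  data: file.data, // the readable stream returned by driveClient.files.get
  headers: {
    'Content-Type': 'application/pdf',
    'Content-Length': fileMetadata.size, // required when the body is a stream
  },
  maxBodyLength: Infinity,
});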
You need to export the file as PDF from Google Drive. Use the following function, passing the file ID:
/**
 * Download a Document file in PDF format
 * @param {string} fileId file ID
 * @return {object} file status
 */
async function exportPdf(fileId) {
  const {GoogleAuth} = require('google-auth-library');
  const {google} = require('googleapis');

  // Get credentials and build service
  // TODO (developer) - Use appropriate auth mechanism for your app
  const auth = new GoogleAuth({scopes: 'https://www.googleapis.com/auth/drive'});
  const service = google.drive({version: 'v3', auth});
  try {
    const result = await service.files.export({
      fileId: fileId,
      mimeType: 'application/pdf',
    });
    console.log(result.status);
    return result;
  } catch (err) {
    console.log(err);
    throw err;
  }
}
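A hedged aside: files.export accepts the same second options argument as files.get, so inside exportPdf the call can request a stream when the exported PDF is meant to be piped onward (for example into the presigned S3 upload above) rather than buffered in memory:
// Sketch: same export call, but returning result.data as a readable stream.
const result = await service.files.export(
  { fileId: fileId, mimeType: 'application/pdf' },
  { responseType: 'stream' },
);
// result.data can then be piped with pipeline(), just like file.data earlier.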

Pinata IPFS's pinFileToIPFS method not accepting a user uploaded file

I am working on a project (using React.js, Express.js and Node.js) to convert a user-uploaded image into an NFT on the Ethereum blockchain, and for that I need to upload the image to an IPFS service (I am using Pinata) and then use the Pinata URI in the metadata to mint a new NFT. (Do let me know if I am wrong here, I am still a newbie to web3.)
For uploading my image onto the Pinata IPFS, I am sending the base64 string of the image from the client side to the server side and then calling the pinFileToIPFS method. This is the code of my server-side file:
const axios = require('axios');
const fs = require('fs');
const FormData = require('form-data');
const router = require('express').Router();
const { Readable } = require('stream');
const pinFileToIPFS = (image) => {
  const url = `https://api.pinata.cloud/pinning/pinJSONToIPFS`;
  const buffer = Buffer.from(image);
  const stream = Readable.from(buffer);
  const filename = `an-awesome-nft_${Date.now()}.png`;
  stream.path = filename;

  const formData = new FormData();
  formData.append("file", stream);

  return axios.post(url,
    formData,
    {
      headers: {
        'Content-Type': `multipart/form-data; boundary= ${formData._boundary}`,
        'pinata_api_key': "*******************",
        'pinata_secret_api_key': "**********************************",
      }
    }
  ).then(function (response) {
    console.log("Success: ", response);
  }).catch(function (error) {
    console.log("Fail! ", error.response.data);
  });
};

router.route('/').post((req, res) => {
  const image = req.body.image;
  pinFileToIPFS(image);
});

module.exports = router;
Here req.body.image contains the base64 string of the user-uploaded file.
I have tried to convert the base64 string into a buffer and then convert the buffer into a readable stream (as done in the official Pinata documentation, but for a local file) and then wrap it up in FormData(), but I keep getting the following error:
data: {
error: 'This API endpoint requires valid JSON, and a JSON content-type'
}
I know the problem is with the format my image/file is being sent to the API in, but I can't figure it out. I am still a newbie to web3 and blockchains, so please help!
The recommended way of interacting with Pinata is by using their Node.js SDK. This SDK has a pinFileToIPFS function that allows you to upload an image to their IPFS nodes in the form of a readable stream.
A sample of this would look like:
const fs = require('fs');
const pinataSDK = require('@pinata/sdk');

// Initialize the SDK client with your API credentials
const pinata = pinataSDK(pinataApiKey, pinataSecretApiKey);

const readableStreamForFile = fs.createReadStream('./yourfile.png');
const options = {
  pinataMetadata: {
    name: 'MyCustomName',
    keyvalues: {
      customKey: 'customValue',
      customKey2: 'customValue2'
    }
  },
  pinataOptions: {
    cidVersion: 0
  }
};

pinata.pinFileToIPFS(readableStreamForFile, options).then((result) => {
  // handle results here
  console.log(result);
}).catch((err) => {
  // handle error here
  console.log(err);
});
However, if you are dead set on using their API endpoints and simply posting to them via axios, there is a separate API endpoint, /pinning/pinFileToIPFS. Examples of this method can be found in their API Docs.
You may want to consider changing the following two lines and using the https://api.pinata.cloud/pinning/pinFileToIPFS endpoint instead:
const buffer = Buffer.from(image); -> const buffer = Buffer.from(image, "base64");
and
formData.append("file", stream); -> formData.append("file", stream, "fileNameOfChoice.png");
When you are uploading an image or file to Pinata IPFS with Node.js, these are the steps; they don't even need the Pinata Node.js SDK.
1- You can upload an image from the front end with React or Next.js. Code is given below.
const uploadAttachment = async (data, token) => {
  try {
    return await Api.post(`${ApiRoutes.upload_attachment}`, data, {
      headers: {
        Authorization: "Bearer " + token, // the token is a variable which holds the token
      },
    });
  } catch (error) {
    return {
      status: 404,
    };
  }
};

export default uploadAttachment;
2- You need to install multer to upload an image.
const multer = require("multer");
global.uploadSingleFile = multer({ dest: "uploads/" });
3- Set up your route with multer middleware and action which you are going to call.
.post(
  "/attachments/upload",
  uploadSingleFile.single("file"),
  actions.attachments.upload.pinFileToIPFSLocal
);
4- The last step, where you hit the Pinata endpoint with the Pinata API & secret key.
pinFileToIPFSLocal: async (req, res, next) => {
  try {
    const url = "https://api.pinata.cloud/pinning/pinFileToIPFS";
    let formData = new FormData();
    // Stream the file multer saved to disk; appending JSON.stringify(req.file)
    // would upload the metadata object instead of the image itself.
    // (Requires const fs = require("fs") and const FormData = require("form-data") in this file.)
    formData.append("file", fs.createReadStream(req.file.path), req.file.originalname);
    axios
      .post(url, formData, {
        maxContentLength: -1,
        headers: {
          "Content-Type": `multipart/form-data; boundary=${formData._boundary}`,
          pinata_api_key: process.env.PINATA_API_KEY,
          pinata_secret_api_key: process.env.PINATA_API_SECRET,
          path: "somename",
        },
      })
      .then((data) => {
        console.log("Result...", data);
        return utils.response.response(
          res,
          "Upload image to ipfs.",
          true,
          200,
          data.data
        );
      })
      .catch((err) => {
        return utils.response.response(
          res,
          "Image not upload to ipfs",
          false,
          200,
          err
        );
      });
  } catch (error) {
    next(error);
  }
},
The error message is clear. You are using the URL that is for JSON uploads; this is the URL you should use to upload an image:
const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`;
You don't have to convert the buffer to a readable stream.
I am not sure ${formData._boundary} is correct; it should be
"Content-Type": `multipart/form-data; boundary=${formData.getBoundary()}`,
There must be an error on the image parameter. A simple buffer representation of the image should work. The readable stream is not necessary.
Instead of creating the buffer, you could use middleware like express-fileupload to access the buffer representation of the file uploaded on the client-side directly.
const file = req.files;
const url = "https://api.pinata.cloud/pinning/pinFileToIPFS";

const data = new FormData();
data.append("file", file.file.data, { filepath: "anyname" });

const result = await axios.post(url, data, {
  maxContentLength: -1,
  headers: {
    "Content-Type": `multipart/form-data; boundary=${data._boundary}`,
    pinata_api_key: process.env.PINATA_API_KEY,
    pinata_secret_api_key: process.env.PINATA_API_SECRET,
    path: "somename",
  },
});
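For completeness, a small setup sketch for the express-fileupload middleware assumed above (the route name is illustrative); req.files is only populated once the middleware is registered:
// Sketch: register express-fileupload so req.files carries the uploaded buffer.
const express = require("express");
const fileUpload = require("express-fileupload");

const app = express();
app.use(fileUpload());

app.post("/pin", async (req, res) => {
  // req.files.file.data is the Buffer appended to the form above
  // (assumes the client sent the file under the field name "file")
  res.json({ received: Boolean(req.files && req.files.file) });
});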

Using lambda to serve bucket image resources through API Gateway results in a broken image

I am using AWS Api Gateway, Lambda, and S3.
My goal is to hit the gateway and serve and display an image from my S3 bucket in the browser. Currently I am able to successfully fetch the image from the appropriate bucket using the AWS SDK in my lambda, no problem. I then send it over the wire, the response is 200, and I can see all the appropriate headers. However, the image is broken.
Here is the code in my lambda:
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

exports.handler = async (event) => {
  const bucketParams = {
    Bucket: 'my-bucket',
    Key: 'build/cat.jpg',
  };
  const bucket = await s3.getObject(bucketParams).promise();

  const response = {
    statusCode: 200,
    body: Buffer.from(bucket.Body).toString(),
    headers: {
      'Accept-Ranges': 'bytes',
      'Content-Length': bucket.ContentLength,
      'Content-Type': bucket.ContentType,
      'Last-Modified': bucket.LastModified,
      'ETag': bucket.ETag,
    }
  };
  return response;
};
The response is a 200 and everything works, but my image appears to be broken in the browser. Here is a URI of the example: https://3bn2t9npbd.execute-api.us-east-1.amazonaws.com/dev/cat.jpg
As a side note, I serve more than just images through this code; the example above, though, is just a single image.
I was able to figure out the solution to my problem. Thanks to jarmod for pointing me in the right direction. I first had to enable Binary Media Types for all content in my API Gateway settings.
After that I changed the lambda response to be:
const bucket = await s3.getObject(bucketParams).promise();

const response = {
  statusCode: 200,
  body: Buffer.from(bucket.Body).toString("base64"),
  headers: {
    'Accept-Ranges': 'bytes',
    'Content-Length': bucket.ContentLength,
    'Content-Type': bucket.ContentType,
    'Last-Modified': bucket.LastModified,
  },
  isBase64Encoded: true,
};
return response;
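For reference, registering */* as a binary media type can also be scripted rather than clicked through the console; a hedged sketch with aws-sdk v2 (the REST API ID is a placeholder), after which the API still has to be redeployed to its stage for the setting to take effect:
// Sketch: add */* to the API's binary media types so API Gateway converts the
// base64 response body back into raw bytes when isBase64Encoded is true.
const AWS = require('aws-sdk');
const apigateway = new AWS.APIGateway();

await apigateway.updateRestApi({
  restApiId: 'your-rest-api-id', // placeholder
  patchOperations: [
    // "~1" is the JSON-Pointer escape for "/", so this path means */*
    { op: 'add', path: '/binaryMediaTypes/*~1*' },
  ],
}).promise();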
