I download my PDF file from Google Drive according to the documentation:
const file = await this.driveClient.files.get(
  {
    fileId: id,
    alt: 'media',
  },
  {
    responseType: 'stream'
  },
);
Then I construct a FormData object:
const formData = new FormData();
formData.append('file', file.data, 'file.pdf');
And send it to S3 via the presigned upload URL:
const uploadedDocument = await axios({
  method: 'put',
  url: presignedS3Url,
  data: formData,
  headers: formData.getHeaders(),
});
The flow works, but the file uploaded to S3 appears corrupted.
I also tried different response types from the Google API, such as blob. Any idea what I am missing? Thanks in advance!
I managed to solve the issue by piping the file stream from Google Drive directly to S3 using Node.js built-in modules.
import * as https from 'https';
import { promisify } from 'util';
import { pipeline } from 'stream';
//...
const file = await this.driveClient.files.get(
  {
    fileId: id,
    alt: 'media',
  },
  {
    responseType: 'stream'
  },
);
await promisify(pipeline)(
  file.data,
  https.request(s3SignedUrl, {
    method: 'PUT',
    headers: {
      'content-type': fileMetadata.mimeType,
      'content-length': fileMetadata.size,
    },
    timeout: 60000,
  }),
);
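For reference, the corruption in the original attempt came from wrapping the stream in FormData: a presigned S3 PUT expects the raw object bytes as the request body, so the multipart envelope ends up inside the stored file. A minimal sketch of an axios variant, assuming fileMetadata comes from a prior files.get metadata call, could look like this:

const file = await this.driveClient.files.get(
  { fileId: id, alt: 'media' },
  { responseType: 'stream' },
);

// Pass the Drive stream directly as the PUT body; no FormData involved.
await axios.put(presignedS3Url, file.data, {
  headers: {
    'Content-Type': fileMetadata.mimeType,   // assumed known from a files.get metadata request
    'Content-Length': fileMetadata.size,     // S3 generally needs a length for a streamed PUT body
  },
  maxBodyLength: Infinity,                   // lift axios's default request body size limit
});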
You need to export the file as PDF from Google Drive. Use the following function, passing the file ID:
/**
 * Download a Document file in PDF format
 * @param {string} fileId file ID
 * @return {object} file status
 */
async function exportPdf(fileId) {
  const {GoogleAuth} = require('google-auth-library');
  const {google} = require('googleapis');
  // Get credentials and build service
  // TODO (developer) - Use appropriate auth mechanism for your app
  const auth = new GoogleAuth({scopes: 'https://www.googleapis.com/auth/drive'});
  const service = google.drive({version: 'v3', auth});
  try {
    const result = await service.files.export({
      fileId: fileId,
      mimeType: 'application/pdf',
    });
    console.log(result.status);
    return result;
  } catch (err) {
    console.log(err);
    throw err;
  }
}
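If the exported PDF then needs to go to S3 as in the original flow, the same streaming approach works here too. A sketch, assuming files.export accepts the usual request options as its second argument (as files.get does):

// Ask for the exported PDF as a stream instead of buffering it in memory.
const result = await service.files.export(
  { fileId: fileId, mimeType: 'application/pdf' },
  { responseType: 'stream' },
);
// result.data is now a readable stream that can be piped to the S3 request,
// exactly like file.data in the pipeline() example above.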
I have this collection type in Strapi:
I need to set the banner via the Strapi endpoint in Next.js.
I tried using this API http://localhost:1337/api/upload.
The image uploads successfully, but it is not set in my user's banner field; it only goes to the Strapi media library.
async function handalBannerUpload(e){
e.preventDefault();
const formdata = new FormData();
formdata.append("files",file[0]);
formdata.append("field","banner");
formdata.append("refid",29);
formdata.append("ref","plugin::users-permissions.user");
// for (const value of formdata.values()) {
// console.log({value});
// }
const res = await fetch("http://localhost:1337/api/upload",{
method:"POST",
headers: {
// 'Content-Type': 'multipart/form-data',
Authorization: apiKey,
},
body: {
files: JSON.stringify(formdata.get('files')),
field: "banner",
refid: data?.id,
ref: "plugin::users-permissions.user",
}
// body: formdata
})
const dataRes = await res.json();
console.log(dataRes);
}
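For reference, the Strapi upload plugin only links a file to an entry when files, ref, refId, and field are sent as multipart form fields (note the key is refId, not refid), and the FormData itself must be the request body rather than a JSON object. A sketch of the corrected call, assuming Strapi v4 and the users-permissions user as the target:

async function handalBannerUpload(e) {
  e.preventDefault();
  const formdata = new FormData();
  formdata.append("files", file[0]);
  formdata.append("ref", "plugin::users-permissions.user"); // the model the file is linked to
  formdata.append("refId", data?.id);                       // the entry id (29 in the example)
  formdata.append("field", "banner");                       // the media field on that model
  const res = await fetch("http://localhost:1337/api/upload", {
    method: "POST",
    headers: { Authorization: apiKey },  // no Content-Type: the browser sets the multipart boundary
    body: formdata,                      // send the FormData itself, not a JSON object
  });
  console.log(await res.json());
}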
I am working on a project (using React.js, Express.js, and Node.js) to convert a user-uploaded image into an NFT on the Ethereum blockchain. For that, I need to upload the image to IPFS (I am using Pinata) and then use the Pinata URI in the metadata to mint a new NFT. (Do let me know if I am wrong here, I am still a newbie to web3.)
To upload my image to Pinata IPFS, I am sending the base64 string of the image from the client side to the server side and then calling the pinFileToIPFS method. This is the code of my server-side file:
const axios = require('axios');
const fs = require('fs');
const FormData = require('form-data');
const router = require('express').Router();
const { Readable } = require('stream');
const pinFileToIPFS = (image) => {
const url = `https://api.pinata.cloud/pinning/pinJSONToIPFS`;
const buffer = Buffer.from(image);
const stream = Readable.from(buffer);
const filename = `an-awesome-nft_${Date.now()}.png`;
stream.path = filename;
const formData = new FormData();
formData.append("file", stream);
return axios.post(url,
formData,
{
headers: {
'Content-Type': `multipart/form-data; boundary= ${formData._boundary}`,
'pinata_api_key': "*******************",
'pinata_secret_api_key': "**********************************",
}
}
).then(function (response) {
console.log("Success: ", response);
}).catch(function (error) {
console.log("Fail! ", error.response.data);
});
};
router.route('/').post((req, res) => {
const image = req.body.image;
pinFileToIPFS(image);
});
module.exports = router;
Here, req.body.image contains the base64 string of the user-uploaded file.
I have tried converting the base64 string into a buffer, then converting the buffer into a readable stream (as done in the official Pinata documentation, but for a local file), and then wrapping it up in FormData(), but I keep getting the following error:
data: {
error: 'This API endpoint requires valid JSON, and a JSON content-type'
}
I know the problem is with the format in which my image/file is being sent to the API, but I can't figure it out. I am still a newbie to web3 and blockchains, so please help!
The recommended way of interacting with Pinata is to use their Node.js SDK. This SDK has a pinFileToIPFS function, which allows you to upload an image to their IPFS nodes in the form of a readable stream.
A sample of this would look like
const fs = require('fs');
const pinataSDK = require('@pinata/sdk');
// Instantiate the SDK with your Pinata API key and secret (placeholders here).
const pinata = pinataSDK('yourPinataApiKey', 'yourPinataSecretApiKey');

const readableStreamForFile = fs.createReadStream('./yourfile.png');
const options = {
  pinataMetadata: {
    name: 'MyCustomName',
    keyvalues: {
      customKey: 'customValue',
      customKey2: 'customValue2'
    }
  },
  pinataOptions: {
    cidVersion: 0
  }
};
pinata.pinFileToIPFS(readableStreamForFile, options).then((result) => {
  // handle results here
  console.log(result);
}).catch((err) => {
  // handle error here
  console.log(err);
});
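Since the image in the question arrives as a base64 string rather than a local file, a hedged adaptation is to decode it into a buffer, wrap that in a readable stream with a filename hint, and hand it to the same SDK call:

const { Readable } = require('stream');

// `image` is the base64 string received from the client, as in the question.
const buffer = Buffer.from(image, 'base64');
const stream = Readable.from(buffer);
// The multipart layer needs a filename; `path` is used as that hint here.
stream.path = `an-awesome-nft_${Date.now()}.png`;

pinata.pinFileToIPFS(stream, options)
  .then((result) => console.log(result.IpfsHash))
  .catch((err) => console.error(err));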
However, if you are dead set on using their API endpoints and simply posting to them via axios, there is a separate API endpoint, /pinning/pinFileToIPFS. Examples of this method can be found in their API docs.
You may want to consider changing the following two lines and using the https://api.pinata.cloud/pinning/pinFileToIPFS endpoint instead:
const buffer = Buffer.from(image); -> const buffer = Buffer.from(image, "base64");
and
formData.append("file", stream); -> formData.append("file", stream, "fileNameOfChoiche.png);
When you are uploading an image or file to Pinata IPFS with Node.js, these are the steps; they don't even need the Pinata Node.js SDK.
1- You can upload an image from the front end with React or Next.js. Code is given below.
const uploadAttachment = async (data, token) => {
try {
return await Api.post(`${ApiRoutes.upload_attachment}`, data, {
headers: {
Authorization: "Bearer " + token, //the token is a variable which holds the token
},
});
} catch (error) {
return {
status: 404,
};
}
};
export default uploadAttachment;
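For context, the data argument here is a FormData built from the selected file. A minimal hypothetical usage (fileInput, Api, ApiRoutes, and the token are assumed from the surrounding app) could be:

// Build the multipart body from an <input type="file"> and send it.
const formData = new FormData();
formData.append("file", fileInput.files[0]);   // "file" must match the multer field name below
const res = await uploadAttachment(formData, token);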
2- You need to install multer to upload an image.
const multer = require("multer");
global.uploadSingleFile = multer({ dest: "uploads/" });
3- Set up your route with the multer middleware and the action you are going to call.
.post(
"/attachments/upload",
uploadSingleFile.single("file"),
actions.attachments.upload.pinFileToIPFSLocal
);
4- In the last step, you hit the Pinata endpoint with your Pinata API key and secret.
// assumes at the top of the module: const fs = require("fs"); const axios = require("axios"); const FormData = require("form-data");
pinFileToIPFSLocal: async (req, res, next) => {
  try {
    const url = "https://api.pinata.cloud/pinning/pinFileToIPFS";
    let formData = new FormData();
    // multer stored the upload on disk (dest: "uploads/"), so stream it from req.file.path
    formData.append("file", fs.createReadStream(req.file.path), req.file.originalname);
    axios
      .post(url, formData, {
        maxContentLength: -1,
        headers: {
          "Content-Type": `multipart/form-data; boundary=${formData._boundary}`,
          pinata_api_key: process.env.PINATA_API_KEY,
          pinata_secret_api_key: process.env.PINATA_API_SECRET,
          path: "somename",
        },
      })
      .then((data) => {
        console.log("Result...", data);
        return utils.response.response(
          res,
          "Upload image to ipfs.",
          true,
          200,
          data.data
        );
      })
      .catch((err) => {
        return utils.response.response(
          res,
          "Image not uploaded to ipfs",
          false,
          200,
          err
        );
      });
  } catch (error) {
    next(error);
  }
},
The error message is clear: you are using the URL that is meant for JSON uploads. This is the URL you should use to upload an image:
const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`;
You don't have to convert the buffer to a readable stream.
I am not sure if ${formData._boundary} is correct. It should be:
"Content-Type": `multipart/form-data; boundary=${formData.getBoundary()}`,
There must be an error in the image parameter. A simple buffer representation of the image should work; the readable stream is not necessary.
Instead of creating the buffer, you could use middleware like express-fileupload to access the buffer representation of the file uploaded on the client side directly.
const file = req.files;
const url = "https://api.pinata.cloud/pinning/pinFileToIPFS";
const data = new FormData();
data.append("file", file.file.data, { filepath: "anyname" });
const result = await axios.post(url, data, {
maxContentLength: -1,
headers: {
"Content-Type": `multipart/form-data; boundary=${data._boundary}`,
pinata_api_key: process.env.PINATA_API_KEY,
pinata_secret_api_key: process.env.PINATA_API_SECRET,
path: "somename",
},
});
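For completeness, req.files is only populated like this if the express-fileupload middleware is registered on the app; a minimal sketch of that setup:

const express = require("express");
const fileUpload = require("express-fileupload");

const app = express();
app.use(fileUpload());   // exposes uploaded files on req.files, each with a .data buffer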
I have some issues parsing a request made from the front end using FormData. This is the example request generated by Postman for Node.js axios. If I use the Postman app with the request, it works as expected.
Frontend example generated by Postman's code feature:
var axios = require('axios');
var FormData = require('form-data');
var fs = require('fs');
var data = new FormData();
data.append('file', fs.createReadStream('/some_file.jpg')); // I am using Electron and have access to the file system from the client.
data.append('resizeLargeImage[width]', '1920');
data.append('resizeLargeImage[height]', '1080');
data.append('resizeLargeImage[type]', 'cover');
var config = {
method: 'post',
url: 'localhost:3030/api/v1/optimize-single',
headers: {
'x-api-key': '123',
...data.getHeaders()
},
data : data
};
axios(config)
.then(function (response) {
console.log(JSON.stringify(response.data));
})
.catch(function (error) {
console.log(error);
});
Backend
@Post('/optimize-single')
@UseInterceptors(FileInterceptor('file'))
async uploadFile(
  @UploadedFile() file: FileDto,
  @Body() body: UploadFileParametersDto,
  @Res() response: Response,
): Promise<any> {
  console.log('file', file, 'body', body);
  // File is undefined, body is a null Object
  return await this.appService.uploadFile(file, body, response);
}
Any ideas as to why Nest doesn't recognize this type of request?
Thanks!
I managed to figure it out.
If you are in the Renderer process, it works by attaching the file as a blob.
const fileBuffer = fs.readFileSync(filePath);
const fileName = path.basename(filePath);
const blob = new Blob([fileBuffer], {
type: mime.lookup(filePath),
});
formData.append('file', blob, fileName);
If you move the same functionality to the Main process, it will work as expected with data.append('file', fs.createReadStream('/some_file.jpg'));
I am trying to save a JPEG that I get back from Microsoft's Graph API to an S3 bucket (a DigitalOcean Space, actually, but they work the same). I am able to get the image binary from the Graph API and can upload it successfully, but the image is gibberish at the endpoint. Here is the code I am using, for reference.
// Prep to fetch user photo from microsoft office 365
// using recently acquired access token
const photoEndpoint = `${MS_GRAPH_URL}/v1.0/me/photo/$value`;
const config = { headers: { Authorization: `Bearer ${access_token}` } };
// Fetch user photo from microsoft office 365
const { data: photo } = await axios.get(photoEndpoint, {
headers: { Authorization: `Bearer ${access_token}` }
});
await storeImage(photo, id);
where storeImage is defined as follows:
// Import AWS SDK
const AWS = require("aws-sdk");
// Instantiate S3 instance to interface with DigitalOcean Spaces
const spacesEndpoint = new AWS.Endpoint("sfo2.digitaloceanspaces.com");
const s3 = new AWS.S3({
endpoint: spacesEndpoint,
accessKeyId: process.env.DO_SPACE_KEY,
secretAccessKey: process.env.DO_SPACE_SECRET
});
const storeImage = (image, key) => {
const params = {
ACL: "public-read",
Bucket: process.env.DO_SPACE_BUCKET,
Key: `${key}`,
Body: image,
ContentType: "image/jpeg"
};
const promise = new Promise((resolve, reject) => {
s3.upload(params, (err, data) => {
if (err) reject(err);
resolve(data);
});
});
return promise;
};
Any thoughts on what I am doing wrong?
Where do you indicate the Content-Type of data you're requesting via your axios call?
The default appears to be application/json rather than a binary format, so your image data will undergo an unwanted transformation. ;)
Adding responseType: 'stream' to your axios request should fix your issue.
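Applied to the request in the question, that would look roughly like this (keeping s3.upload as-is, since its Body parameter accepts a readable stream):

// Request the photo as a stream so axios doesn't try to decode the binary body.
const { data: photo } = await axios.get(photoEndpoint, {
  headers: { Authorization: `Bearer ${access_token}` },
  responseType: 'stream',   // 'arraybuffer' would also work and yields a Buffer
});

await storeImage(photo, id);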
I'm trying to upload a file with the Google Drive API. I have the metadata correct, and I want to ensure that the actual file contents make it there. I have a simple page set up that looks like this:
<div id="upload">
<h6>File Upload Operations</h6>
<input type="file" placeholder='file' name='fileToUpload'>
<button id='uploadFile'>Upload File</button>
</div>
and I have the JavaScript set up so that the user is prompted to sign in first, and then they can upload a file. Here's the code (currently it only uploads the file metadata):
let uploadButton = document.getElementById('uploadFile');
uploadButton.onclick = uploadFile;
const uploadFile = () => {
let ftu = document.getElementsByName('fileToUpload')[0].files[0];
console.dir(ftu);
gapi.client.drive.files.create({
'content-type': 'application/json;charset=utf-8',
uploadType: 'multipart',
name: ftu.name,
mimeType: ftu.type,
fields: 'id, name, kind'
}).then(response => {
console.dir(response);
console.log(`File: ${ftu.name} with MimeType of: ${ftu.type}`);
//Need code to upload the file contents......
});
};
First, I'm more familiar with the back end, so getting the file in bits from the <input type='file'> tag is a bit nebulous for me. On the bright side, the metadata is there. How can I get the file contents up to the API?
According to some resources I've found in my three-day search to get this going, the file contents simply cannot be uploaded via the gapi client; they must be uploaded through a true REST HTTP call. So let's use fetch!
const uploadFile = () => {
  // initialize file data from the DOM
  let ftu = document.getElementsByName('fileToUpload')[0].files[0];
  let file = new Blob([ftu]);
  // this is to ensure the file is in a format that can be understood by the API
  gapi.client.drive.files.create({
    'content-type': 'application/json',
    uploadType: 'multipart',
    name: ftu.name,
    mimeType: ftu.type,
    fields: 'id, name, kind, size'
  }).then(apiResponse => {
    fetch(`https://www.googleapis.com/upload/drive/v3/files/${apiResponse.result.id}`, {
      method: 'PATCH',
      headers: new Headers({
        'Authorization': `Bearer ${gapi.client.getToken().access_token}`,
        'Content-Type': ftu.type
      }),
      body: file
    }).then(res => console.log(res));
  });
};
The Authorization header is taken from gapi.client.getToken().access_token; basically, this takes the (still empty) file created by the gapi call and then uses the Fetch API to upload the actual bits of the file into it.
In your situation, when you upload a file using gapi.client.drive.files.create(), an empty file that has only the uploaded metadata is created. If my understanding is correct, how about this workaround? I have experienced the same situation as you, and at that time I used this workaround.
Modification points:
Retrieve access token using gapi.
File is uploaded using XMLHttpRequest.
Modified script:
Please modify the script in uploadFile().
let ftu = document.getElementsByName('fileToUpload')[0].files[0];
var metadata = {
'name': ftu.name,
'mimeType': ftu.type,
};
var accessToken = gapi.auth.getToken().access_token; // Here gapi is used for retrieving the access token.
var form = new FormData();
form.append('metadata', new Blob([JSON.stringify(metadata)], {type: 'application/json'}));
form.append('file', ftu);
var xhr = new XMLHttpRequest();
xhr.open('post', 'https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart&fields=id,name,kind');
xhr.setRequestHeader('Authorization', 'Bearer ' + accessToken);
xhr.responseType = 'json';
xhr.onload = () => {
console.log(xhr.response);
};
xhr.send(form);
Note:
In this modified script, it supposes that Drive API is enabled at API console and the access token can be used for uploading file.
About fields, you are using id,name,kind. So this sample also uses them.
Reference:
gapi
If I misunderstand your question or this workaround was not useful for your situation, I'm sorry.
Edit:
When you want to use fetch, how about this sample script?
let ftu = document.getElementsByName('fileToUpload')[0].files[0];
var metadata = {
'name': ftu.name,
'mimeType': ftu.type,
};
var accessToken = gapi.auth.getToken().access_token; // Here gapi is used for retrieving the access token.
var form = new FormData();
form.append('metadata', new Blob([JSON.stringify(metadata)], {type: 'application/json'}));
form.append('file', ftu);
fetch('https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart&fields=id,name,kind', {
method: 'POST',
headers: new Headers({'Authorization': 'Bearer ' + accessToken}),
body: form
}).then((res) => {
return res.json();
}).then(function(val) {
console.log(val);
});
With https://www.npmjs.com/package/@types/gapi.client.drive:
const makeUploadUrl = (fileId: string, params: Record<string, boolean>) => {
const uploadUrl = new URL(
`https://www.googleapis.com/upload/drive/v3/files/${fileId}`
)
Object.entries({
...params,
uploadType: 'media',
}).map(([key, value]) => uploadUrl.searchParams.append(key, `${value}`))
return uploadUrl
}
const uploadDriveFile = async ({ file }: { file: File }) => {
const params = {
enforceSingleParent: true,
supportsAllDrives: true,
}
// create file handle
const { result } = await gapi.client.drive.files.create(params, {
// CAN'T have the upload type here!
name: file.name,
mimeType: file.type,
// any resource params you need...
driveId: process.env.DRIVE_ID,
parents: [process.env.FOLDER_ID],
})
// post the file data
await fetch(makeUploadUrl(result.id!, params), {
method: 'PATCH',
headers: new Headers({
Authorization: `Bearer ${gapi.client.getToken().access_token}`,
'Content-Type': file.type,
}),
body: file,
})
return result
}
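A hypothetical usage, wiring this up to the same file input as in the question (the input name and sign-in handling are assumptions):

// Grab the selected File from the <input name="fileToUpload"> and upload it.
const input = document.getElementsByName('fileToUpload')[0];
const file = input.files[0];

uploadDriveFile({ file })
  .then((result) => console.log('Created Drive file:', result.id, result.name))
  .catch((err) => console.error('Upload failed:', err));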