I'm working with PouchDB and React. File uploading works normally, but when I try to convert the blob into a URL I get this error in the console:
"Failed to execute 'createObjectURL' on 'URL': Overload resolution failed."
I checked the return value and I can retrieve all the data/images in the console and see the file types and so on. I've heard that createObjectURL is deprecated, but I followed the tutorial provided in the PouchDB docs, so I'm not sure what's going on. Can someone give me any insight or help on this? Thanks.
Snippet below:
// uploading files
const uploadF = e => {
  // saving chosen file
  const file = e.target.files[0];
  // generating random number and converting it to the string
  const random_id = Math.floor(Math.random() * 10000);
  const random_id_to_string = String(random_id);
  console.log(file);
  // insert data into local DB
  db.post({
    _id: random_id_to_string,
    _attachments: {
      fileName: {
        content_type: file.type,
        data: file
      }
    }
  });
  // insert data into remote db
  redb.post({
    _id: random_id_to_string,
    _attachments: {
      fileName: {
        content_type: file.type,
        data: file
      }
    }
  });
  // upload file to s3 bucket
  S3FileUpload.uploadFile(e.target.files[0], config)
    .then(data => {
      console.log(data);
    })
    .catch(err => console.log(err));
};

// retrieve all data from db
const files = [];
db.allDocs({
  include_docs: true,
  attachments: true
}).then(function (result) {
  return result;
}).then(function (blob) {
  const url = URL.createObjectURL(blob);
  console.log(url);
}).catch(function (err) {
  console.log(err);
});

return (
  <section className="hero">
    <nav>
      <h2>Welcome</h2>
      <button onClick={handleLogout}>Logout</button>
      <input type="file" onChange={e => uploadF(e)} />
    </nav>
  </section>
);
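For reference, a minimal sketch of reading one attachment back as a Blob before calling URL.createObjectURL, assuming PouchDB's getAttachment API behaves as documented; the doc id below is a placeholder and "fileName" matches the attachment key used in the upload code:
// Hedged sketch (placeholder doc id): in the browser, getAttachment resolves with a Blob,
// which is the type URL.createObjectURL expects.
db.getAttachment('some_doc_id', 'fileName')
  .then(blob => {
    const url = URL.createObjectURL(blob);
    console.log(url);
  })
  .catch(err => console.log(err));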
I am using the Node.js AWS SDK for S3-related methods. I have a method to download a file from an S3 bucket.
I am downloading the file using the code below.
const downloadFileBase64 = async (payload) => {
  let params = { Bucket: s3BucketName, Key: `${payload.folderName}/${payload.fileName}` };
  try {
    const response = await s3
      .getObject(params, (err) => {
        if (err) {
          return err;
        }
      })
      .promise();
    return {
      data: response.Body.toString('base64'),
      fileName: payload.fileName
    };
  } catch (error) {
    return Boom.badRequest(error.message);
  }
};
Once I get the base64 content, I send it over email using SendGrid.
Issue: When I download small files, everything works fine. But when I download large files, parts of the file are missing across multiple pages. I copy-pasted the base64 into a few online decoders and downloaded the file from there, and the same issue appears there as well. From this I concluded that something goes wrong in the response returned from S3 itself. When I go to S3 and check the file in its folder, it shows the proper file.
In the screenshot (not included here), the PDF had a random grey background on a few pages and some of the text was missing.
I tried another method that just downloads the buffer, skipping the base64 conversion, as shown below.
const downloadFileBuffer = async (payload) => {
  let params = { Bucket: s3BucketName, Key: `${payload.folderName}/${payload.fileName}` };
  try {
    const response = await s3
      .getObject(params, (err) => {
        if (err) {
          return err;
        }
      })
      .promise();
    return {
      data: response.Body,
      fileName: payload.fileName
    };
  } catch (error) {
    return Boom.badRequest(error.message);
  }
};
Once I get the file content in the response above, I store it temporarily in a folder on the server, then read it again and send it over email. But I still have the same issue.
const fileContent = await docs.downloadFileBuffer({ payload: req.payload.action.dire });
await fs.writeFileSync(`${temp}testinggg.pdf`, fileContent?.data);
const fileData = await fs.readFileSync(`${temp}testinggg.pdf`, { encoding: 'base64' });
Any help on this issue is really appreciated.
After days of research and trying different approaches, I found the issue. It was the .promise() used together with the callback in s3.getObject(params, (err) => {}).promise();. Instead of that, I wrapped the callback in a Promise as shown below. Now the file shows the full content without any missing data.
const downloadFileBuffer = async (payload) => {
  let params = { Bucket: s3BucketName, Key: `${payload.folderName}/${payload.fileName}` };
  try {
    return new Promise((resolve, reject) => {
      s3.getObject(params, (err, response) => {
        if (err) {
          reject(err);
          return; // don't fall through to resolve when the request failed
        }
        resolve({
          data: response.Body,
          fileName: payload.fileName
        });
      });
    });
  } catch (error) {
    return Boom.badRequest(error.message);
  }
};
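For completeness, a usage sketch along the lines of the temp-file flow described above; the folder name, file name, and temp path are placeholders, and this is assumed to run inside an async handler:
// Hypothetical usage (placeholder values), mirroring the write-then-read flow from the question.
const fileContent = await downloadFileBuffer({
  folderName: 'documents',   // placeholder folder
  fileName: 'example.pdf'    // placeholder file name
});
// Write the raw Buffer to disk, then read it back as base64 for the email attachment.
fs.writeFileSync(`${temp}example.pdf`, fileContent.data);
const base64Data = fs.readFileSync(`${temp}example.pdf`, { encoding: 'base64' });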
I have an input where the user can upload an image. I want to get this image, pass it to the server side, and have the server store it in a local folder. For example:
I use Linux for the server, so server.js runs from /home/user/project/server/server.js. When the server gets the image, I want it stored as /home/user/project/images/img.jpg.
This is my code:
HTML:
<input type="file" id="imageFile" accept=".jpg, .jpeg, .png" />
Front-End:
const signup = async () => {
  const name = document.getElementById("signup_name").value;
  const passwd = document.getElementById("signup_passwd").value;
  const image = document.getElementById("imageFile").files[0];
  let formData = new FormData();
  formData.append("fileToUpload", image);
  const response = await fetch("http://localhost:3000/signup", {
    method: "post",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      name: name,
      passwd: passwd,
      image: formData
    }),
  });
  const result = await response.json();
  document.getElementById("signup_name").value = "";
  document.getElementById("signup_passwd").value = "";
  alert(result);
};
Back-End:
app.post("/signup", async (req, res) => {
const { name, passwd, image } = req.body;
if (!name || !passwd) {
return res.status(400).json("Dados incorretos!");
}
knex
.transaction((trx) => {
trx
.insert({
login: name,
senha: passwd,
divida: 0,
})
.into("usuarios")
.then(trx.commit)
.catch(trx.rollback)
.then(res.json("Cadastrado com sucesso!"));
})
.catch((err) => {
console.log(err);
return res.json("Login existente, tente novamente!");
});
//PUT SOMETHING HERE TO SAVE IMAGE LOCALLY, MAYBE??
});
Yes, you can first store the uploaded image as a Base64 string using FileReader. Data URLs are already base64, so when you call reader.readAsDataURL, the e.target.result passed to the reader.onload handler is all you need; you may also want to save it to disk on the server, or handle it asynchronously and respond with res.json. Check the official MDN documentation about FileReader.
(For example, getting the user's uploaded image:)
const imgPath = document.querySelector('input[type=file]').files[0];
const reader = new FileReader();
reader.addEventListener("load", function () {
// Convert file to base64 string and save to localStorage
localStorage.setItem("image", reader.result);
}, false);
if (imgPath) {
reader.readAsDataURL(imgPath);
}
And to read the image back from localStorage, just use querySelector or getElementById:
const img = document.getElementById('image');
img.src = localStorage.getItem('image');
About the "fd" argument must be of type number, in my case, sometimes I was using:
fs.readSync() when I should have been using fs.readFileSync()
fs.writeSync() usage but should be fs.writeFileSync()
fr.write() could be in your case fs.writeFile()
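A short sketch of the difference, with a placeholder file name; the fd-based functions take a numeric file descriptor, while the *File* variants take a path:
const fs = require("fs");
// fd-based API: needs a numeric file descriptor from fs.openSync
const fd = fs.openSync("photo.jpg", "r");      // "photo.jpg" is a placeholder path
const buffer = Buffer.alloc(1024);
fs.readSync(fd, buffer, 0, buffer.length, 0);  // reads into the buffer
fs.closeSync(fd);
// path-based API: takes the file path directly
const data = fs.readFileSync("photo.jpg");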
The comment from @Dimava on your question can work too; I flagged it up.
For more help, consult this post related to your similar question! ;)
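As a rough sketch of the save-to-disk part on the Node side, assuming the client sends the FileReader data URL as a base64 string in the JSON body; the route and field names here are placeholders, not from the original question:
// Hedged sketch: assumes req.body.image holds a data URL produced by FileReader.
const fs = require("fs");
const path = require("path");
app.post("/upload-image", (req, res) => {
  // A data URL looks like "data:image/jpeg;base64,<data>", so strip the prefix first.
  const base64Data = req.body.image.replace(/^data:image\/\w+;base64,/, "");
  const filePath = path.join(__dirname, "..", "images", "img.jpg");
  fs.writeFile(filePath, Buffer.from(base64Data, "base64"), (err) => {
    if (err) return res.status(500).json("Could not save the image");
    return res.json("Image saved!");
  });
});
// Note: express.json() limits the request body to ~100kb by default, so larger
// images may need something like express.json({ limit: "5mb" }).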
Noob question: I'm just getting started with the Google Drive API v3. How can I download an arbitrary file from Google Drive when I only have the fileId? The file can be an image, a PDF, or a Docs file.
I tried searching but couldn't find any reference or example related to this.
This is what I have so far, but it only downloads a specific file extension.
downloadFile(req, res) {
  const auth = new google.auth.JWT(
    client_email,
    null,
    private_key,
    SCOPES,
  );
  const { fileId } = req.params;
  const drive = google.drive({ version: 'v3', auth });
  var dest = fs.createWriteStream('./tmp/downloads/dummy.pdf')
  drive.files.get({
    fileId,
    alt: 'media',
  }, {
    responseType: 'stream'
  }).then((driveResponse) => {
    driveResponse.data.on('end', () => {
      console.log(`downloading fileID ${fileId}`);
    })
    .on('error', (err) => {
      console.log(err);
    })
    .on('data', (d) => {
      console.log(d);
    })
    .pipe(dest)
  })
  .catch((err) => {
    console.log(err);
  })
}
Is there a way to download arbitrary file types from Google Drive?
I believe your goal is as follows.
You want to download files from Google Drive using the service account and the file ID.
The files include both Google Docs files and non-Google Docs files.
You want to achieve this using googleapis for Node.js.
Modification points:
Unfortunately, from it only downloads a specific file extension, I cannot fully understand your situation. But I guess the reason for your issue might be that you are downloading both Google Docs files and non-Google Docs files.
Google Docs files must be downloaded with the "Files: export" method of the Drive API.
Files other than Google Docs files must be downloaded with the "Files: get" method of the Drive API.
I thought this might be the reason for your issue.
In order to download both Google Docs files and other files, I propose the following flow:
Check the mimeType of the file.
Download the file with the method that matches the mimeType.
When the above points are reflected in your script, it becomes as follows.
Modified script:
From:
var dest = fs.createWriteStream('./tmp/downloads/dummy.pdf')
drive.files.get({
  fileId,
  alt: 'media',
}, {
  responseType: 'stream'
}).then((driveResponse) => {
  driveResponse.data.on('end', () => {
    console.log(`downloading fileID ${fileId}`);
  })
  .on('error', (err) => {
    console.log(err);
  })
  .on('data', (d) => {
    console.log(d);
  })
  .pipe(dest)
})
.catch((err) => {
  console.log(err);
})
To:
drive.files.get({ fileId, fields: "*" }, async (err, { data }) => {
  if (err) {
    console.log(err);
    return;
  }
  let filename = data.name;
  const mimeType = data.mimeType;
  let res;
  if (mimeType.includes("application/vnd.google-apps")) {
    const convertMimeTypes = {
      "application/vnd.google-apps.document": {
        type: "application/vnd.openxmlformats-officedocument.wordprocessingml.document",
        ext: ".docx",
      },
      "application/vnd.google-apps.spreadsheet": {
        type: "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
        ext: ".xlsx",
      },
      "application/vnd.google-apps.presentation": {
        type: "application/vnd.openxmlformats-officedocument.presentationml.presentation",
        ext: ".pptx",
      },
    };
    filename += convertMimeTypes[mimeType].ext;
    res = await drive.files.export(
      {
        fileId,
        mimeType: convertMimeTypes[mimeType].type,
      },
      { responseType: "stream" }
    );
  } else {
    res = await drive.files.get(
      {
        fileId,
        alt: "media",
      },
      { responseType: "stream" }
    );
  }
  const dest = fs.createWriteStream(filename);
  res.data
    .on("end", () => console.log("Done."))
    .on("error", (err) => {
      console.log(err);
      return process.exit();
    })
    .pipe(dest);
});
Note:
In this modification, I prepared 3 types of Google Docs formats in convertMimeTypes. When you want to download other mimeTypes, please modify convertMimeTypes. In this case, for example, Google Docs files are downloaded as Microsoft Word files.
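As an illustration (not part of the original answer), a hypothetical extra mapping that exports Google Drawings as PNG images could be added to convertMimeTypes like this:
// Hypothetical additional entry, assuming Drawings are exported via files.export as PNG.
const convertMimeTypes = {
  // ...the three entries shown in the script above...
  "application/vnd.google-apps.drawing": {
    type: "image/png",
    ext: ".png",
  },
};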
References:
Download files
Files: get
Files: export
I am trying to get my Node.js backend to upload a file to AWS S3, which it received in a POST request from my front-end. This is what my function looks like:
async function uploadFile(file) {
  var uploadParams = { Bucket: '<bucket-name>', Key: file.name, Body: file };
  s3.upload(uploadParams, function (err, data) {
    if (err) {
      console.log("Error", err);
    }
    if (data) {
      console.log("Upload Success", data.Location);
    }
  });
}
When I try uploading the file this way, I get an Unsupported Body Payload Error...
I used fileStream.createReadStream() in the past to upload files saved in a directory on the server, but creating a file stream did not work for me here, since there is no path parameter to pass.
EDIT:
The file object is created in the Angular frontend of my web application. This is the relevant HTML code where the file is uploaded by a user:
<div class="form-group">
<label for="file">Choose File</label>
<input type="file" id="file"(change)="handleFileInput($event.target.files)">
</div>
If the event occurs, the handleFileInput(files: FileList) method in the corresponding component is called:
handleFileInput(files: FileList) {
  // should result in array in case multiple files are uploaded
  this.fileToUpload = files.item(0);
  // actually upload the file
  this.uploadFileToActivity();
  // used to check whether we really received the file
  console.log(this.fileToUpload);
  console.log(typeof this.fileToUpload);
}

uploadFileToActivity() {
  this.fileUploadService.postFile(this.fileToUpload).subscribe(data => {
    // do something, if upload success
  }, error => {
    console.log(error);
  });
}
The postFile(fileToUpload: File) method of the file-upload service is used to make the POST request:
postFile(fileToUpload: File): Observable<Boolean> {
  console.log(fileToUpload.name);
  const endpoint = '/api/fileupload/single';
  const formData: FormData = new FormData();
  formData.append('fileKey', fileToUpload, fileToUpload.name);
  return this.httpClient
    .post(endpoint, formData/*, { headers: yourHeadersConfig }*/)
    .pipe(
      map(() => { return true; }),
      catchError((e) => this.handleError(e)),
    );
}
Here is the server-side code that receives the file and then calls the uploadFile(file) function:
app.post('/api/fileupload/single', async (req, res) => {
  try {
    if (!req.files) {
      res.send({
        status: false,
        message: 'No file uploaded'
      });
    } else {
      let file = req.files.fileKey;
      uploadFile(file);
      // send response
      res.send({
        status: true,
        message: 'File is uploaded',
        data: {
          name: file.name,
          mimetype: file.mimetype,
          size: file.size
        }
      });
    }
  } catch (err) {
    res.status(500).send(err);
  }
});
Thank you very much for your help in solving this!
Best regards, Samuel
The best way is to stream the file. Assuming you are reading it from disk, you could do this:
const fs = require("fs");
const aws = require("aws-sdk");
const s3Client = new aws.S3();
const Bucket = 'somebucket';
const stream = fs.createReadStream("file.pdf");
const Key = stream.path;
const response = await s3Client.upload({Bucket, Key, Body: stream}).promise();
console.log(response);
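In this question, though, the file comes from the POST request rather than from disk. As a hedged aside (an assumption, not part of the original answer): if you are using express-fileupload with its default in-memory handling, the uploaded file exposes a Buffer on file.data, and a Buffer is an accepted Body type for s3.upload, so a sketch like this might work:
// Hedged sketch: assumes express-fileupload keeps the upload in memory,
// so req.files.fileKey.data is a Buffer (not guaranteed in every configuration).
async function uploadFile(file) {
  const uploadParams = {
    Bucket: '<bucket-name>',   // placeholder from the question
    Key: file.name,
    Body: file.data,           // pass the Buffer, not the whole file object
    ContentType: file.mimetype
  };
  const data = await s3.upload(uploadParams).promise();
  console.log("Upload Success", data.Location);
  return data;
}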
I am trying to upload a file from mobile to a Google bucket using Ionic 4. Although the file uploads to the cloud, I am struggling to get the file properties out of the file object.
Here is my method:
async selectAFile() {
  const uploadFileDetails = {
    name: '',
    contentLength: '',
    size: '',
    type: '',
    path: '',
  };
  this.fileChooser.open().then(uri => {
    this.file.resolveLocalFilesystemUrl(uri).then(newUrl => {
      let dirPath = newUrl.nativeURL;
      const dirPathSegments = dirPath.split('/');
      dirPathSegments.pop();
      dirPath = dirPathSegments.join('/');
      (<any>window).resolveLocalFileSystemURL(
        newUrl.nativeURL,
        function(fileEntry) {
          uploadFileDetails.path = newUrl.nativeURL;
          const file: any = getFileFromFileEntry(fileEntry);
          // log 01
          console.log({ file });
          uploadFileDetails.size = file.size;
          uploadFileDetails.name = `${newUrl.name
            .split(':')
            .pop()}.${file.type.split('/').pop()}`;
          uploadFileDetails.type = file.type;
          async function getFileFromFileEntry(fileEntry) {
            try {
              return await new Promise((resolve, reject) =>
                fileEntry.file(resolve, reject)
              );
            } catch (err) {
              console.log(err);
            }
          }
        },
        function(e) {
          console.error(e);
        }
      );
    });
  });
  // here uploadFileDetails is similar to what I declared at the top ;)
  // I want this to be populated with the file properties
  // console.log(uploadFileDetails.name) --> //''
  const uploadUrl = await this.getUploadUrl(uploadFileDetails);
  const response: any = this.uploadFile(
    uploadFileDetails,
    uploadUrl
  );
  response
    .then(function(success) {
      console.log({ success });
      this.presentToast('File uploaded successfully.');
      this.loadFiles();
    })
    .catch(function(error) {
      console.log({ error });
    });
}
Even though I can console.log the file at log 01, I am unable to get file properties like size, name, and type out of the resolveLocalFileSystemURL function. Basically, I am unable to populate the uploadFileDetails object. What am I doing wrong? Thank you in advance.
You actually need four Ionic Cordova plugins to upload a file after getting all of its metadata.
FileChooser
Opens the file picker on Android for the user to select a file, returns a file URI.
FilePath
This plugin allows you to resolve the native filesystem path for Android content URIs and is based on code in the aFileChooser library.
File
This plugin implements a File API allowing read/write access to files residing on the device.
File Transfer
This plugin allows you to upload and download files.
Getting the file's metadata:
file.resolveLocalFilesystemUrl with fileEntry.file gives you all the metadata you need, except the file name. There is a property called name in the metadata, but it always contains the value content.
To get the human-readable file name you need filePath. But remember, you can't use the returned file path to retrieve the metadata; for that you need the original URI from fileChooser.
filePathUrl.substring(filePathUrl.lastIndexOf('/') + 1) is used to get only the file name from filePath.
You need the nativeURL of the file in order to upload it. Using the file path returned from filePath is not going to work.
getFileInfo(): Promise<any> {
  return this.fileChooser.open().then(fileURI => {
    return this.filePath.resolveNativePath(fileURI).then(filePathUrl => {
      return this.file
        .resolveLocalFilesystemUrl(fileURI)
        .then((fileEntry: any) => {
          return new Promise((resolve, reject) => {
            fileEntry.file(
              meta =>
                resolve({
                  nativeURL: fileEntry.nativeURL,
                  fileNameFromPath: filePathUrl.substring(filePathUrl.lastIndexOf('/') + 1),
                  ...meta,
                }),
              error => reject(error)
            );
          });
        });
    });
  });
}
Selecting a file from the file system of the mobile device:
async selectAFile() {
  this.getFileInfo()
    .then(async fileMeta => {
      // get the upload URL
      const uploadUrl = await this.getUploadUrl(fileMeta);
      const response: Promise<any> = this.uploadFile(
        fileMeta,
        uploadUrl
      );
      response
        .then(function(success) {
          // upload success message
        })
        .catch(function(error) {
          // upload error message
        });
    })
    .catch(error => {
      // something went wrong while getting the file information
    });
}
Uploading the selected file:
This depends on your backend implementation. This is how to use File Transfer to upload a file:
uploadFile(fileMeta, uploadUrl) {
  const options: FileUploadOptions = {
    fileKey: 'file',
    fileName: fileMeta.fileNameFromPath,
    headers: {
      'Content-Length': fileMeta.size,
      'Content-Type': fileMeta.type,
    },
    httpMethod: 'PUT',
    mimeType: fileMeta.type,
  };
  const fileTransfer: FileTransferObject = this.transfer.create();
  // upload using the file's nativeURL (the path returned from filePath will not work here)
  return fileTransfer.upload(fileMeta.nativeURL, uploadUrl, options);
}
hope it helps. :)