How to upload an image to Firebase Storage from Ionic? - javascript

I need help sending an image to Firebase Storage from Ionic. First, I take a picture with this function:
async takePicture() {
  const options: CameraOptions = {
    quality: 100,
    destinationType: this.camera.DestinationType.FILE_URI,
    encodingType: this.camera.EncodingType.JPEG,
    mediaType: this.camera.MediaType.PICTURE
  }
  try {
    let imageURI = await this.camera.getPicture(options);
    let newImageURI = await this.cropImage(imageURI);
    let imageSanitized = await this.encodeFile(newImageURI);
    this.imgSrc = imageSanitized;
  } catch (e) {
    console.error(JSON.stringify(e));
  }
}
And I crop with this function:
cropImage(imgURI): Promise<string> {
  return new Promise((resolve, reject) => {
    this.cropService.crop(imgURI, { quality: 100 }).then((newImageURI: string) => {
      resolve(newImageURI);
    }, (err) => {
      reject(err);
    });
  });
}
Finally, I encode it with this function:
encodeFile(ImageURI: string): Promise<any> {
  return new Promise((resolve, reject) => {
    this.base64.encodeFile(ImageURI).then((base64File: string) => {
      this.imgUri = base64File;
      resolve(this.domSanitizer.bypassSecurityTrustResourceUrl(base64File));
    }, (err) => {
      reject(err);
    });
  });
}
this.imgSrc is my sanitized image, and it displays correctly in my HTML file. However, I also need to send this image to Firebase Storage. For that, I created this function:
uploadToStorage(imgString) {
  console.log(imgString);
  this.storage.child('exemplo.JPEG').putString(imgString, 'data_url').then((res) => {
    console.log(res);
  }, (error) => {
    console.error(error);
  });
}
imgString receives either the value of this.domSanitizer.bypassSecurityTrustResourceUrl(base64File) or the raw base64File from the encodeFile function.
The upload function throws no error, but it never succeeds either; nothing shows up in storage.
How can I correctly send the image to the server?

I think you can capture the image directly as base64 by using:
destinationType: this.camera.DestinationType.DATA_URL
instead of FILE_URI.
In any case, to store an image I used the following code with AngularFire2, and it works. this.data.image is where I save the base64 of the image. It may get tricky if your encoder adds "data:image/jpeg;base64," to the beginning of your base64 string; try with and without that prefix if this code doesn't work as expected.
import { AngularFireStorage, AngularFireStorageReference } from 'angularfire2/storage/public_api';
// ...
const storageRef: AngularFireStorageReference = this.afStorage.ref(`images/${this.userId}/profilePic/`);
storageRef.putString(this.data.image, 'data_url', {
  contentType: 'image/jpeg'
}).then(() => {
  storageRef.getDownloadURL().subscribe(url => {
    console.log('download url: ' + url);
    // function to save download URL of saved image in database
  });
});
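Regarding the prefix caveat above, a small helper along these lines (my own sketch, not part of AngularFire) can normalize the string before handing it to putString with the 'data_url' format:

```javascript
// Sketch of a helper (hypothetical, not part of AngularFire) that makes sure
// a base64 string carries the "data:<mime>;base64," prefix that the
// 'data_url' string format expects.
function ensureDataUrl(base64, mime = 'image/jpeg') {
  // Already a data URL? Leave it untouched.
  if (base64.startsWith('data:')) {
    return base64;
  }
  return `data:${mime};base64,${base64}`;
}
```

So a raw payload like '/9j/4AAQ...' becomes 'data:image/jpeg;base64,/9j/4AAQ...', while an already-prefixed string passes through unchanged.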

I did this:
encodeFile(ImageURI: string): Promise<any> {
  return new Promise((resolve, reject) => {
    this.base64.encodeFile(ImageURI).then((base64File: string) => {
      this.imgUri = base64File;
      // ...
    }, (err) => {
      // ...
    });
  });
}
this.imgUri stores my base64-encoded image, but its prefix needs to be adjusted. So first split off the encoder's header:
let messageSplit = this.imgUri.split('data:image/*;charset=utf-8;base64,')[1];
then prepend the standard prefix in its place:
let message64 = 'data:image/jpg;base64,' + messageSplit;
Then upload to Firebase Storage:
const filePath = `my-pet-crocodile_${ new Date().getTime() }.jpg`;
let task = this.afStorage.ref(filePath).putString(message64, 'data_url');
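The two steps above (split off the encoder's header, then prepend the standard prefix) can be sketched as one helper. The function name is mine, and splitting at the first comma works for any data-URL header, not only the charset variant shown above:

```javascript
// Sketch combining the split-and-prepend steps: drop whatever data-URL
// header the encoder produced and substitute a plain JPEG prefix.
// Base64 payloads never contain commas, so the first comma always
// marks the end of the header.
function toJpegDataUrl(encoded) {
  const payload = encoded.includes(',') ? encoded.split(',')[1] : encoded;
  return 'data:image/jpg;base64,' + payload;
}
```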


Google Vision API is not working after uploading an image to Firebase

I built an image-detection mobile app (e.g. Plastic Bottle, Aluminum Can, Milk Jug, etc.) with React Native, using the Google Vision API.
It worked well before and returned responses successfully.
But after I added a Firebase image-upload function to store the image, the Google Vision API stopped working.
My guess is that the Firebase image upload and the Google Vision API conflict and are not compatible with each other, or that there is an error in my image-upload function, but I am still not sure what the issue is. Following is my code.
const takePicture = async () => {
  if (this.camera) {
    const options = { quality: 0.5, base64: true };
    const data = await this.camera.takePictureAsync(options);
    setScannedURI(data.uri)
    imageUploadToFirebase(data)
    // callGoogleVisionApi(data.base64) //============> After commenting out the image upload function (above line), if I call the vision api here, it works well.
    setIsLoading(true)
  }
};
const imageUploadToFirebase = (imageData) => {
  const Blob = RNFetchBlob.polyfill.Blob; // firebase image upload
  const fs = RNFetchBlob.fs;
  window.XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;
  window.Blob = Blob;
  const Fetch = RNFetchBlob.polyfill.Fetch
  window.fetch = new Fetch({
    auto: true,
    binaryContentTypes: [
      'image/',
      'video/',
      'audio/',
      'foo/',
    ]
  }).build()
  let uploadBlob = null;
  var path = Platform.OS === "ios" ? imageData.uri.replace("file://", "") : imageData.uri
  var newItemKey = Firebase.database().ref().child('usersummary').push().key;
  var _name = newItemKey + 'img.jpg';
  setIsLoading(true)
  fs.readFile(path, "base64")
    .then(data => {
      let mime = "image/jpg";
      return Blob.build(data, { type: `${mime};BASE64` });
    })
    .then(blob => {
      uploadBlob = blob;
      Firebase.storage()
        .ref("scannedItems/" + _name)
        .put(blob)
        .then(() => {
          uploadBlob.close();
          return Firebase.storage()
            .ref("scannedItems/" + _name)
            .getDownloadURL();
        })
        .then(async uploadedFile => {
          setFirebaseImageURL(uploadedFile)
          // callGoogleVisionApi(imageData.base64) //============> If I call it here, it doesn't work.
        })
        .catch(error => {
          console.log({ error });
        });
    });
}
This is my callGoogleVisionApi function.
const callGoogleVIsionApi = async (base64) => {
  let googleVisionRes = await fetch(config.googleCloud.api + config.googleCloud.apiKey, {
    method: 'POST',
    body: JSON.stringify({
      "requests": [{
        "image": { "content": base64 },
        features: [
          { type: "LABEL_DETECTION", maxResults: 30 },
          { type: "WEB_DETECTION", maxResults: 30 }
        ],
      }]
    })
  })
    .catch(err => { console.log('Network error=>: ', err) })
  await googleVisionRes.json()
    .then(googleResp => {
      if (googleResp) {
        let responseArray = googleResp.responses[0].labelAnnotations
        responseArray.map((item, index) => {
          if (item.description != "" && item.description != undefined && item.description != null) {
            newArr.push(item.description)
          }
        })
      }
    }).catch((error) => { console.log(error) })
}
Note: if I upload an image to Firebase after getting the result from the Google Vision API, a second call to the Vision API does not work either.
I added my callGoogleVIsionApi function above; it works well without the Firebase image-upload function.
What could the solution to this issue be?
I found the reason, though I am still curious why: the RNFetchBlob fetch polyfill and the Google Vision call seem to conflict with each other.
I changed my Firebase image-upload function to use the standard fetch API instead, and it worked well.
Following is my modified Firebase image-upload function.
const imageUploadToFirebase = async () => { // must be async: the function awaits fetch below
  var path = Platform.OS === 'ios' ? scannedURI.replace('file://', '') : scannedURI;
  const response = await fetch(path)
  const blob = await response.blob();
  var newItemKey = Firebase.database()
    .ref()
    .child('usersummary')
    .push().key;
  var _name = newItemKey + 'img.jpg';
  Firebase.storage()
    .ref(_name)
    .put(blob)
    .then(() => {
      return Firebase.storage()
        .ref(_name)
        .getDownloadURL();
    })
    .then(async uploadedFile => {
      let image = selectImage(sendItem.name?.toLowerCase());
      sendItem.image = image;
      sendItem.scannedURI = uploadedFile;
      AsyncStorage.getItem('#scanedItemList')
        .then(res => {
          if (res != null && res != undefined && res != '') {
            let result = `${res}#${JSON.stringify(sendItem)}`;
            AsyncStorage.setItem('#scanedItemList', result);
          } else {
            AsyncStorage.setItem(
              '#scanedItemList',
              JSON.stringify(sendItem),
            );
          }
        })
        .catch(err => console.log(err));
    })
    .catch(error => {
      console.log({ error });
    });
}
I'm not sure whether you are using the #google-cloud/vision package (in the callGoogleVisionApi() function), but as far as I know it is meant to be used server-side and authenticated with a service account. As an alternative, you can use Cloud Storage triggers for Cloud Functions, which fire whenever a new file is uploaded, and then call the Cloud Vision API from the function.
The Google Vision API can take a base64-encoded image, a publicly accessible HTTP URI, or a blob in Google Cloud Storage.
In order to use an HTTP URI, change the JSON payload in your callGoogleVisionApi function from this:
{
  "requests": [{
    "image": { "content": base64 },
    features: [
      { type: "LABEL_DETECTION", maxResults: 30 },
      { type: "WEB_DETECTION", maxResults: 30 }
    ],
  }]
}
to this:
{
  "requests": [{
    "image": { "source": { "imageUri": 'https://PUBLIC_URI_FOR_THE_IMAGE' } },
    features: [
      { type: "LABEL_DETECTION", maxResults: 30 },
      { type: "WEB_DETECTION", maxResults: 30 }
    ],
  }]
}
You've got a better explanation here: Make a Vision API request.
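The two payload shapes can be produced by one small builder. This is a sketch of mine (the option names are made up, not part of any official client):

```javascript
// Sketch of a request-body builder covering both input modes the Vision API
// accepts: inline base64 content, or a publicly reachable image URI.
// `isUri` and `maxResults` are my own option names, not Vision API fields.
function buildVisionRequest(image, { isUri = false, maxResults = 30 } = {}) {
  return {
    requests: [{
      image: isUri ? { source: { imageUri: image } } : { content: image },
      features: [
        { type: 'LABEL_DETECTION', maxResults },
        { type: 'WEB_DETECTION', maxResults },
      ],
    }],
  };
}
```

The result is the object to pass through JSON.stringify as the fetch body.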

Moodle corrupted uploaded files

I am calling the Moodle API function core_files_upload from Node.js using this script:
const { CallService } = require("../MoodleWS")

var Upload = async function(userid, file) {
  var base64 = await toBase64(file)
    .then(r => {
      return r;
    })
    .catch((e) => {
      console.log(e)
    });
  console.log(base64);
  var param = {
    itemid: 0,
    instanceid: userid,
    filearea: 'draft',
    filecontent: base64,
    component: 'user',
    filepath: '/',
    filename: file.name,
    contextlevel: 'user'
  }
  // return promise calling web service, basically returned axios
  return CallService('POST', 'core_files_upload', false, null, param);
}

const toBase64 = file => new Promise((resolve, reject) => {
  const reader = new FileReader();
  reader.readAsDataURL(file);
  reader.onload = () => resolve(reader.result);
  reader.onerror = () => reject(reader.error);
})

module.exports = { Upload }
The function returns success and the files are uploaded; I have checked that the uploaded file size is the same as the original. Unfortunately, I cannot open the files; Moodle keeps showing an error (screenshot omitted), and images can't be displayed either.
The uploaded base64 also includes the MIME-type header, like data:application/pdf;base64,JVBERi0xLjQNJeL... I don't really know what went wrong. When I upload files using the native web version of Moodle, the files upload correctly. Can anyone help? Thanks
It turned out that I don't need to include the MIME-type header in the base64. So I just removed the data:application/pdf;base64, prefix and modified my param a little, so it became:
var base64 = await toBase64(file)
  .then(r => {
    return r;
  })
  .catch((e) => {
    console.log(e)
  });
var param = {
  itemid: 0,
  instanceid: userid,
  filearea: 'draft',
  filecontent: base64.split(',')[1],
  component: 'user',
  filepath: '/',
  filename: file.name,
  contextlevel: 'user'
}
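The general version of that fix is to split the data URL into its MIME type and payload, and send only the payload as filecontent. A sketch with a hypothetical helper name:

```javascript
// Sketch: separate a data URL into its MIME type and raw base64 payload.
// Moodle's core_files_upload wants only the payload in `filecontent`.
function parseDataUrl(dataUrl) {
  const match = /^data:([^;,]+)?(;base64)?,(.*)$/.exec(dataUrl);
  if (!match) {
    // Not a data URL; treat the whole string as the payload.
    return { mime: null, payload: dataUrl };
  }
  return { mime: match[1] || null, payload: match[3] };
}
```

Using parseDataUrl(base64).payload instead of base64.split(',')[1] also keeps the MIME type around if you need it elsewhere.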

Upload Image from form-data to S3 using a Lambda

I am writing a Lambda that takes in form data via a straight POST through API Gateway (testing with Postman for now) and sends the image to S3 for storage. Every time I run it, the image uploaded to S3 is corrupted and won't open properly. I have seen people having to decode/encode the incoming data, but I feel like I have tried everything using Buffer.from. I am only looking to store .png or .jpg. The code below does not reflect my attempts at Base64 encoding/decoding, since they all failed. Here is what I have so far.
Sample Request in postman
{
  image: (uploaded .jpg/.png),
  metadata: { tag: 'iPhone' }
}
Lambda
const AWS = require('aws-sdk')
const multipart = require('aws-lambda-multipart-parser')

const s3 = new AWS.S3();

exports.handler = async (event) => {
  const form = multipart.parse(event, false)
  const s3_response = await upload_s3(form)
  return {
    statusCode: '200',
    body: JSON.stringify({ data: s3_response })
  }
};

const upload_s3 = async (form) => {
  const uniqueId = Math.random().toString(36).substr(2, 9);
  const key = `${uniqueId}_${form.image.filename}`
  const request = {
    Bucket: 'bucket-name',
    Key: key,
    Body: form.image.content,
    ContentType: form.image.contentType,
  }
  try {
    const data = await s3.putObject(request).promise()
    return data
  } catch (e) {
    console.log('Error uploading to S3: ', e)
    return e
  }
}
EDIT:
I am now attempting to save the image into the /tmp directory and then use a read stream to upload it to S3. Here is the code for that.
s3 upload function
const AWS = require('aws-sdk')
const fs = require('fs')

const s3 = new AWS.S3()

module.exports = {
  upload: (file) => {
    return new Promise((resolve, reject) => {
      const key = `${Date.now()}.${file.extension}`
      const bodyStream = fs.createReadStream(file.path)
      const params = {
        Bucket: process.env.S3_BucketName,
        Key: key,
        Body: bodyStream,
        ContentType: file.type
      }
      s3.upload(params, (err, data) => {
        if (err) {
          return reject(err)
        }
        return resolve(data)
      })
    })
  }
}
form parser function
const busboy = require('busboy')

module.exports = {
  parse: (req, temp) => {
    const ctype = req.headers['Content-Type'] || req.headers['content-type']
    let parsed_file = {}
    return new Promise((resolve) => {
      try {
        const bb = new busboy({
          headers: { 'content-type': ctype },
          limits: {
            fileSize: 31457280,
            files: 1,
          }
        })
        bb.on('file', function (fieldname, file, filename, encoding, mimetype) {
          const stream = temp.createWriteStream()
          const ext = filename.split('.')[1]
          console.log('parser -- ext ', ext)
          parsed_file = { name: filename, path: stream.path, f: file, type: mimetype, extension: ext }
          file.pipe(stream)
        }).on('finish', () => {
          resolve(parsed_file)
        }).on('error', err => {
          console.error(err)
          resolve({ err: 'Form data is invalid: parsing error' })
        })
        if (req.end) {
          req.pipe(bb)
        } else {
          bb.write(req.body, req.isBase64Encoded ? 'base64' : 'binary')
        }
        return bb.end()
      } catch (e) {
        console.error(e)
        return resolve({ err: 'Form data is invalid: parsing error' })
      }
    })
  }
}
handler
const form_parser = require('./form-parser').parse
const s3_upload = require('./s3-upload').upload
const temp = require('temp')

exports.handler = async (event, context) => {
  temp.track()
  const parsed_file = await form_parser(event, temp)
  console.log('index -- parsed form', parsed_file)
  const result = await s3_upload(parsed_file)
  console.log('index -- s3 result', result)
  temp.cleanup()
  return {
    statusCode: '200',
    body: JSON.stringify(result)
  }
}
The edited code above is a combination of my own code and a GitHub repo I found that tries to achieve the same result. Even with this solution the file is still corrupted.
Figured out this issue. The code works perfectly fine; it was an issue with API Gateway. You need to go into the API Gateway settings, set the Binary Media Type to multipart/form-data, and then re-deploy the API. Hope this helps someone else who is banging their head against the wall trying to send images via form data to a Lambda.
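For reference, once multipart/form-data is registered as a binary media type, API Gateway delivers the request body base64-encoded with isBase64Encoded set to true. A minimal sketch of handling both cases in the handler (the helper name is mine):

```javascript
// Sketch: recover the raw request body from an API Gateway proxy event.
// With a binary media type configured, event.body arrives base64-encoded
// and event.isBase64Encoded is true; otherwise the body is plain text.
function getRawBody(event) {
  return Buffer.from(event.body, event.isBase64Encoded ? 'base64' : 'utf8');
}
```

The resulting Buffer is what a multipart parser such as busboy should be fed.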

Does anyone know why res.download is giving my download file a random name each time?

I'm using Express 4.17.1. When I use res.download to send a CSV file to the browser, the file downloads but its name is something like this: 3d6a8bc1-696c-40f2-bae8-29ca69658534.csv
Then, when I attempt to download the same file again, it arrives under this name: c1cd40ff-ea9d-4327-9389-9768fb53384a.csv
Each time it is a different random string of characters.
My code is simply this:
res.download(filePath, 'list.csv');
The filePath is this: ./downloadables/mail-list-14da.csv
I've tried using sendFile but got the same result. I recently updated from a previous version of Express to see if it would automagically resolve this issue, but it still happens.
EDIT: More Code Below as Requested
Here is the entirety of the request endpoint:
/*
 * Download the list specified by the list_id with the appropriate fields as specified by the
 * list_type parameter.
 */
router.get('/download/:list_type/:list_id', authCheck('list'), function(
  req,
  res,
  next
) {
  let listData = {};
  Voter.aggregate(aggrPipelineList(req.params.list_type, req.params.list_id))
    .allowDiskUse(true)
    .exec()
    .then(voterDocs => {
      if (voterDocs && voterDocs.length === 0) {
        res.status(404).json({
          message: `list_id ${req.params.list_id} not found`
        });
      } else {
        listData.voter_docs = voterDocs;
        return req.params.list_type;
      }
    })
    .then(listType => {
      if (listType === 'mail') {
        return generateMailExportFile(req.params.list_id, listData);
      } else if (listType == 'phone') {
        return generateCallExportFile(req.params.list_id, listData);
      } else {
        return generateFacebookExportFile(req.params.list_id, listData);
      }
    })
    .then(filePath => {
      console.log('FP: ' + filePath);
      res.download(filePath, 'list.csv');
    })
    .catch(err => {
      res.status(500).json({ message: err.message }); // #note: added message
    });
});
Also including the generateMailExportFile function for completeness. Yes, I know I can refactor the three generate export file functions. It's on my list... I originally wrote this before I knew what the hell I was doing.
generateMailExportFile = function(listID, listData) {
  let fields = [
    'First Name',
    'Last Name',
    'Suffix',
    'Street Address 1',
    'Street Address 2',
    'City',
    'State',
    'Zip'
  ];
  let fileName = 'mail-list-' + listID.substr(listID.length - 4) + '.csv';
  let voterRows = buildVoterRowsForMailList(listData.voter_docs);
  let csv = json2csv({ data: voterRows, fields: fields });
  let tempServerFilePath = './downloadables/' + fileName;
  return new Promise((resolve, reject) => {
    fs.writeFile(tempServerFilePath, csv, function(err) {
      if (err) {
        reject(err);
      } else {
        resolve(tempServerFilePath);
      }
    });
  });
};
Here is the redux/thunk function that requests the file download:
export const downloadList = (listId, type) => {
  return (dispatch, getState) => {
    const rshttp = new RSHttp(getState);
    rshttp
      .get('/list/download/' + type + '/' + listId)
      .then(response => {
        let file = new Blob([response.data], { type: 'text/csv' }),
          url = window.URL.createObjectURL(file);
        window.open(url);
      })
      .catch(error => {
        console.log('Error Downloading File: ' + JSON.stringify(error));
      });
  };
};
I hadn't thought before about the problem being on the React side. If I find an answer, I'll update this question. Any thoughts are still greatly appreciated!
The problem is that you are recreating the file on your front end, so the blob URL gets a randomly generated name. You can simply change your React code to:
export const downloadList = (listId, type) => {
  return (dispatch, getState) => {
    // Replace with your backend URL :)
    window.open('http://localhost:8080/list/download/' + type + '/' + listId, '_blank')
  };
};
and it will download the file with the correct filename.
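If the request must still go through the HTTP client (e.g. to send auth headers), an alternative sketch is to read the server-chosen name from the Content-Disposition response header and set it on a temporary anchor's download attribute. The parsing helper below is my own and assumes a conventional attachment; filename="..." header:

```javascript
// Sketch: extract the filename that res.download places in the
// Content-Disposition header, so a front-end blob download can reuse it
// (e.g. anchor.download = filenameFromDisposition(header)).
function filenameFromDisposition(header, fallback = 'download') {
  if (!header) return fallback;
  // Accept filename="list.csv" as well as the RFC 5987 filename*= form.
  const match = /filename\*?=(?:UTF-8''|")?([^";]+)"?/i.exec(header);
  return match ? decodeURIComponent(match[1]) : fallback;
}
```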

Angular - how to get S3 bucket contents as Observable

How do I structure a service method in Angular 4 to make an s3.listObjects call that returns the contents of an S3 bucket as an Observable?
Here's what I'm trying at present, failing miserably:
public loadFilesFromS3(): Observable<any[]> {
  const s3 = new AWS.S3();
  const params = {
    Bucket: 'bucket-name',
    Prefix: 'prefix-name'
  };
  return (
    s3.listObjects(params, (err, data) => {
      if (err) {
        throw err;
      } else {
        return (data);
      }
    })
  )
}
Just completely stuck on this for the moment! :-|
Well, after some homework here's what I came up with. Seems to work well:
private tracksListSubject = new BehaviorSubject([]);
public tracksList$: Observable<Track[]> = this.tracksListSubject.asObservable();

public loadTracksFromS3() {
  console.log('loading tracks from S3...');
  this.authenticate();
  const s3 = new AWS.S3();
  const params = {
    Bucket: config.AWS_BUCKET_NAME
  };
  s3.listObjects(params, (err, data) => {
    if (err) {
      console.log('error!', err);
    }
    const raw = data.Contents;
    const tracks: Track[] = [];
    raw.forEach((item) => {
      tracks.push({
        id: item.ETag,
        title: item.Key,
        url: config.AWS_BASE_URL + item.Key,
        size: Math.round(item.Size / 1024 / 1024 * 10) / 10
      });
    });
    console.log(tracks.length, 'tracks loaded');
    this.tracksListSubject.next(tracks);
  });
}
Then in other components I can just inject this service and subscribe to its tracksList$ property. Every time I modify the list of tracks inside the service, I call this.tracksListSubject.next(newTracksList).
You should be able to use the Observable.bindNodeCallback function to convert the s3.listObjects function from a node style callback to a function that returns an Observable.
const listObjectsAsObservable = Observable.bindNodeCallback(s3.listObjects.bind(s3));
Note that as shown above you need to bind the s3 object to the function, to tell it that s3 is this. Otherwise, you will get an error.
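The `this` loss the answer warns about can be shown with a plain object (a toy stand-in for the s3 client, nothing AWS-specific):

```javascript
// Sketch of why .bind(s3) is needed: detaching a method from its object
// severs `this`, so the detached call can no longer reach instance state.
const service = {
  prefix: 'item-',
  list(cb) {
    // Works only while `this` is still the service object.
    cb(null, this.prefix + '1');
  },
};

const unbound = service.list;              // loses its receiver
const bound = service.list.bind(service);  // receiver restored
```

Calling bound(cb) yields 'item-1' as expected, while the unbound reference either throws or reads the wrong `this`, which is exactly the failure bindNodeCallback would hit without the bind.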
Then you can use it as follows:
const params = {
  Bucket: config.AWS_BUCKET_NAME
};

listObjectsAsObservable(params)
  .subscribe({
    next: (response) => console.log(response),
    error: (err) => console.log(err)
  });
