Firebase React Native Expo file upload and download issue resolving promise

I'm having a problem where the array is not filled; I think it's something to do with the promises resolving.
const UploadFile = async ({
  imageName = `${Date.now()}`,
  imageUris,
  imageFolder = '',
  metadata,
}: IFile) => {
  if (imageUris) {
    const promises: any[] = [];
    const imageURLs: string[] = [];
    imageUris.forEach(async (uri) => {
      const randomNumber = Randomize('0', 10);
      const finalImageName = `${Slugify(imageName)}`.toLowerCase();
      const imagePath = `${imageFolder}/${finalImageName}-${randomNumber}`;
      const imageRef = storageRef.child(imagePath);
      const blob = (await fetch(uri)).blob();
      const uploadTask = imageRef.put(await blob, metadata);
      uploadTask.on(
        firebase.storage.TaskEvent.STATE_CHANGED,
        (snapshot) => {
          const progress =
            (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
          console.log('Upload is ' + progress + '% done');
        },
        (error) => console.log('Error:', error),
        () => {
          uploadTask.snapshot.ref.getDownloadURL().then((downloadURL) => {
            console.log('File available at', downloadURL);
            imageURLs.push(downloadURL);
          });
        },
      );
      promises.push(uploadTask);
    });
    // Not sure promise is resolving
    Promise.all(promises).then((i) => {
      console.log('All files uploaded', i);
    });
    Promise.all(imageURLs).then((i) => {
      console.log('All imageURLs', i);
    });
  }
}
Output:
Retrieved listings
All files uploaded Array []
All imageURLs Array []
imageURLs Contents undefined
Upload is 0% done
Upload is 0% done
Upload is 100% done
File available at https://firebasestorage.googleapis.com/v0/b/wrecknet-ab69d.appspot.com/o/listings%2Fcar-5701393331?alt=media&token=ccfda911-36fb-4305-b6d7-0ee06fc824e1
Listing was successfully created
Upload is 100% done
File available at https://firebasestorage.googleapis.com/v0/b/wrecknet-ab69d.appspot.com/o/listings%2Fcar-4491812919?alt=media&token=03f72706-4201-4652-9172-8bcefaeb3e1f
As you can see, the "All files uploaded Array []" and "All imageURLs Array []" arrays are empty; I suspect the promises are not resolving.

As far as I know you can either listen to the UploadTask's on() events or await its then(), but not both. Aside from progress logging, the only meaningful work in your on() handlers is fetching the download URL once an upload completes, and that can just as well happen after awaiting the task. The entire code then simplifies to:
const UploadFile = async ({
  imageName = `${Date.now()}`,
  imageUris,
  imageFolder = '',
  metadata,
}: IFile) => {
  if (imageUris) {
    // Use map() rather than an async forEach callback, so every promise
    // is collected synchronously before Promise.all runs on the array.
    const promises = imageUris.map(async (uri) => {
      const randomNumber = Randomize('0', 10);
      const finalImageName = `${Slugify(imageName)}`.toLowerCase();
      const imagePath = `${imageFolder}/${finalImageName}-${randomNumber}`;
      const imageRef = storageRef.child(imagePath);
      const blob = await (await fetch(uri)).blob();
      const snapshot = await imageRef.put(blob, metadata);
      return snapshot.ref.getDownloadURL();
    });
    return Promise.all(promises).then((imageURLs) => {
      console.log('All imageURLs', imageURLs);
      return imageURLs;
    });
  }
}
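Since this version returns the Promise.all chain, the caller can await the download URLs directly. A minimal usage sketch, with hypothetical file URIs and folder name:

const imageURLs = await UploadFile({
  imageUris: ['file:///path/to/first.jpg', 'file:///path/to/second.jpg'],
  imageFolder: 'listings',
});
console.log('imageURLs Contents', imageURLs); // array of download URL strings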

Related

Firebase Storage Image upload - Function to return the Image URL after uploading it

I need to implement this async function,
const uploadImage = async () => {
  const filename = new Date().getTime() + photo!.name
  const storage = getStorage(app)
  const storageRef = ref(storage, filename)
  const uploadTask = uploadBytesResumable(storageRef, photo!);
  uploadTask.on('state_changed',
    (snapshot) => {},
    (error) => {
      console.log("error while uploading photo", error)
    },
    async () => {
      photoUrl = await getDownloadURL(uploadTask.snapshot.ref);
      console.log("getDownloadURL", photoUrl)
      return photoUrl
    }
  );
}
It is the function to upload images to Firebase Storage. Here I need to return the photoUrl. I need to call the function like,
const res = await uploadImage(photo)
How do I do this? The function should return the uploaded image's URL.
The object returned by uploadBytesResumable is also a promise, so you can just await that and then call getDownloadURL:
const uploadImage = async () => {
  const filename = new Date().getTime() + photo!.name
  const storage = getStorage(app)
  const storageRef = ref(storage, filename)
  const uploadTask = uploadBytesResumable(storageRef, photo!);
  await uploadTask;
  photoUrl = await getDownloadURL(uploadTask.snapshot.ref);
  return photoUrl
}
You actually don't even need a reference to the task; since you already have the storageRef, the above can be shortened to:
const uploadImage = async () => {
  const filename = new Date().getTime() + photo!.name
  const storage = getStorage(app)
  const storageRef = ref(storage, filename)
  await uploadBytesResumable(storageRef, photo!);
  return await getDownloadURL(storageRef);
}
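If you want to call it exactly as in the question, const res = await uploadImage(photo), pass the file in as a parameter rather than relying on a closure. A small sketch, assuming photo is a File-like object with a name property:

const uploadImage = async (photo) => {
  const filename = new Date().getTime() + photo.name;
  const storageRef = ref(getStorage(app), filename);
  await uploadBytesResumable(storageRef, photo);
  return getDownloadURL(storageRef); // resolves to the image's download URL
};

const res = await uploadImage(photo);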
Here is the same approach for uploading multiple files to Firebase and returning their URLs:
async function uploadMultipleFilesToFirebase(imagesArray) {
  try {
    const requests = imagesArray.map(async (imageFile) => {
      // Give each file a unique name based on the upload time
      const filename = new Date().getTime() + imageFile.name;
      const storageRef = ref(storage, filename);
      await uploadBytesResumable(storageRef, imageFile);
      return await getDownloadURL(storageRef);
    });
    // Await here so upload failures are actually caught below
    return await Promise.all(requests);
  } catch (error) {
    throw error;
  }
}
And then use it with:
urlsOfUploadedImages.value = await uploadMultipleFilesToFirebase(imagesArray)

Javascript promises + useState + firebase onSnapshot

I have a database listener in my code, and I am trying to get each of the user's new posts and then (when I have all of them in an array) update the posts state.
My code looks like this, but it is not working well, because setPosts is asynchronous and sometimes it might be called again before the previous state update has finished. I think I need to wrap the listener in a Promise, but I have no idea how to do that while still detaching the listener when the component unmounts.
useEffect(() => {
  const { firebase } = props;

  // Realtime database listener
  const unsubscribe = firebase
    .getDatabase()
    .collection("posts")
    .doc(firebase.getCurrentUser().uid)
    .collection("userPosts")
    .onSnapshot((snapshot) => {
      let changes = snapshot.docChanges();
      changes.forEach(async (change) => {
        if (change.type === "added") {
          // Get the new post
          const newPost = change.doc.data();
          // TODO - Move to flatlist On end reached
          const uri = await firebase
            .getStorage()
            .ref(`photos/${newPost.id}`)
            .getDownloadURL();
          // TODO - Add the new post *(sorted by time)* to the posts list
          setPosts([{ ...newPost, uri }, ...posts]);
        }
      });
    });
  /* PS: the first time it runs, this listener will get all the user's posts */

  return () => {
    // Detach the listening agent
    unsubscribe();
  };
}, []);
Any ideas?
I have also thought of doing this:
useEffect(() => {
  const { firebase } = props;
  let postsArray = [];

  // Realtime database listener
  const unsubscribe = firebase
    .getDatabase()
    .collection("posts")
    .doc(firebase.getCurrentUser().uid)
    .collection("userPosts")
    .orderBy("time") // Sorted by date
    .onSnapshot((snapshot) => {
      let changes = snapshot.docChanges();
      changes.forEach(async (change) => {
        if (change.type === "added") {
          // Get the new post
          const newPost = change.doc.data();
          // Add the new post to the posts list
          postsArray.push(newPost);
        }
      });
      setPosts(postsArray.reverse());
    });
But in this case the post's uri is also saved in the Firestore document (something I can do because I write to Firestore from a cloud function that gets the post from Storage), and I don't know if that is good practice.
Thanks.
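For what it's worth, one common way to avoid the race on setPosts without wrapping the listener in a Promise is React's functional state update, which always receives the latest state. A minimal sketch of just the listener callback under that approach, everything else unchanged:

.onSnapshot((snapshot) => {
  snapshot.docChanges().forEach(async (change) => {
    if (change.type === "added") {
      const newPost = change.doc.data();
      const uri = await firebase
        .getStorage()
        .ref(`photos/${newPost.id}`)
        .getDownloadURL();
      // Functional update: React supplies the current posts array,
      // so concurrent updates no longer overwrite each other.
      setPosts((prevPosts) => [{ ...newPost, uri }, ...prevPosts]);
    }
  });
});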
Update
Cloud Function code:
exports.validateImageDimensions = functions
  .region("us-central1")
  .runWith({ memory: "2GB", timeoutSeconds: 120 })
  .https.onCall(async (data, context) => {
    // Libraries
    const admin = require("firebase-admin");
    const sizeOf = require("image-size");
    const url = require("url");
    const https = require("https");
    const sharp = require("sharp");
    const path = require("path");
    const os = require("os");
    const fs = require("fs");

    // Lazy initialization of the Admin SDK
    if (!is_validateImageDimensions_initialized) {
      const serviceAccount = require("./serviceAccountKey.json");
      admin.initializeApp({
        // ...
      });
      is_validateImageDimensions_initialized = true;
    }

    // Create Storage
    const storage = admin.storage();
    // Create Firestore
    const firestore = admin.firestore();

    // Get the image's owner
    const owner = context.auth.token.uid;
    // Get the image's info
    const { id, description, location, tags } = data;

    // Photos' bucket
    const bucket = storage.bucket("bucket-name");
    // File path
    const filePath = `photos/${id}`;
    // Get the file
    const file = getFile(filePath);

    // Check if the file is a jpeg image
    const metadata = await file.getMetadata();
    const isJpgImage = metadata[0].contentType === "image/jpeg";

    // Get the file's url
    const fileUrl = await getUrl(file);

    // Get the photo dimensions using the `image-size` library
    getImageFromUrl(fileUrl)
      .then(async (image) => {
        // Check if the image has valid dimensions
        let dimensions = sizeOf(image);

        // Create the associated Firestore document for the valid images
        if (isJpgImage && hasValidDimensions(dimensions)) {
          // Create a thumbnail for the uploaded image
          const thumbnailPath = await generateThumbnail(filePath);
          // Get the thumbnail
          const thumbnail = getFile(thumbnailPath);
          // Get the thumbnail's url
          const thumbnailUrl = await getUrl(thumbnail);

          try {
            await firestore
              .collection("posts")
              .doc(owner)
              .collection("userPosts")
              .add({
                id,
                uri: fileUrl,
                thumbnailUri: thumbnailUrl, // Useful for progress images
                description,
                location,
                tags,
                date: admin.firestore.FieldValue.serverTimestamp(),
                likes: [], // When a post is first created, no users have liked it
                comments: [], // Also, there aren't any comments
                width: dimensions.width,
                height: dimensions.height,
              });
            // TODO: Analytics posts counter
          } catch (err) {
            console.error(
              `Error creating the document in 'posts/{owner}/userPosts/' where 'id === ${id}': ${err}`
            );
          }
        } else {
          // Remove the files that are not jpeg images, or whose dimensions are not valid
          try {
            await file.delete();
            console.log(
              `The image '${id}' has been deleted because it has invalid dimensions.
              This may be an attempt to break the security of the app made by the user '${owner}'`
            );
          } catch (err) {
            console.error(`Error deleting invalid file '${id}': ${err}`);
          }
        }
      })
      .catch((e) => {
        console.log(e);
      });

    /* ---------------- AUXILIARY FUNCTIONS ---------------- */

    function getFile(filePath) {
      /* Get a file from the storage bucket */
      return bucket.file(filePath);
    }

    async function getUrl(file) {
      /* Get the public url of a file */
      const signedUrls = await file.getSignedUrl({
        action: "read",
        expires: "01-01-2100",
      });
      // signedUrls[0] contains the file's public URL
      return signedUrls[0];
    }

    function getImageFromUrl(uri) {
      return new Promise((resolve, reject) => {
        const options = url.parse(uri); // Automatically converted to an ordinary options object.
        const request = https.request(options, (response) => {
          if (response.statusCode < 200 || response.statusCode >= 300) {
            return reject(new Error("statusCode=" + response.statusCode));
          }
          let chunks = [];
          response.on("data", (chunk) => {
            chunks.push(chunk);
          });
          response.on("end", () => {
            try {
              chunks = Buffer.concat(chunks);
            } catch (e) {
              reject(e);
            }
            resolve(chunks);
          });
        });
        request.on("error", (e) => {
          reject(e.message);
        });
        // Send the request
        request.end();
      });
    }

    function hasValidDimensions(dimensions) {
      // Posts' valid dimensions
      const validDimensions = [
        { width: 1080, height: 1080 },
        { width: 1080, height: 1350 },
        { width: 1080, height: 750 },
      ];
      return (
        validDimensions.find(
          ({ width, height }) =>
            width === dimensions.width && height === dimensions.height
        ) !== undefined
      );
    }

    async function generateThumbnail(filePath) {
      /* Generate thumbnail for the progressive images */
      // Download file from bucket
      const fileName = filePath.split("/").pop();
      const tempFilePath = path.join(os.tmpdir(), fileName);
      const thumbnailPath = await bucket
        .file(filePath)
        .download({
          destination: tempFilePath,
        })
        .then(() => {
          // Generate a thumbnail using Sharp
          const size = 50;
          const newFileName = `${fileName}_${size}_thumb.jpg`;
          const newFilePath = `thumbnails/${newFileName}`;
          const newFileTemp = path.join(os.tmpdir(), newFileName);
          sharp(tempFilePath)
            .resize(size, null)
            .toFile(newFileTemp, async (_err, info) => {
              // Uploading the thumbnail.
              await bucket.upload(newFileTemp, {
                destination: newFilePath,
              });
              // Once the thumbnail has been uploaded, delete the temp file to free up disk space.
              fs.unlinkSync(tempFilePath);
            });
          // Return the thumbnail's path
          return newFilePath;
        });
      return thumbnailPath;
    }
  });

Use of Promise.all to download images using rn-fetch-blob?

downloadImagesInParallel = async (url) => {
  const dirs = RNFetchBlob.fs.dirs
  reactotron.log('downloadImagesInParallel', url)
  await RNFetchBlob.config({
    appendExt: 'png',
    path: dirs.DocumentDir + `/${url}`
  }).fetch('GET', `${url}`, {
    // some headers ..
  })
}

let newsImageUrl = []
newsData.forEach(element => {
  newsImageUrl.push(this.downloadImagesInParallel(element.urlToImage).then((data) => {
    reactotron.log('data', data)
  }))
});
// const newsImagesURL = newsData.map((item) => this.downloadImagesInParallel(item.urlToImage))
reactotron.log('setHomeNewsList ***************** ', newsImageUrl)
const allData = await Promise.all(newsImageUrl)
This is how I tried to download all the images together, but I am unable to make it work. Please help me out with this.
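One likely culprit, offered as a hedged sketch rather than a verified fix: downloadImagesInParallel never returns the fetch result, so every promise resolves to undefined. Returning the response and collecting the promises with map (as in the commented-out line) lets Promise.all deliver the downloaded data:

downloadImagesInParallel = async (url) => {
  const dirs = RNFetchBlob.fs.dirs;
  // Return the response so the awaited promise carries the downloaded data
  return RNFetchBlob.config({
    appendExt: 'png',
    path: dirs.DocumentDir + `/${url}`,
  }).fetch('GET', `${url}`);
};

const allData = await Promise.all(
  newsData.map((item) => this.downloadImagesInParallel(item.urlToImage))
);
reactotron.log('all images downloaded', allData);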

Firebase cloud function storage trigger: first thumbnail URLs are fine, then the next ones are all the same thumbnail URLs as the first

I am trying to upload an image to Firebase and then produce two thumbnails. I am able to do this with no problems. My current roadblock is that when I write the URLs to the Realtime Database, I always get the same URL as the initial upload.
For example:
1st upload I get my uploaded image with the two proper thumbnails for the image
2nd upload I get my uploaded image with the two previous thumbnails (first image)
3rd upload I get my uploaded image with the first images thumbnails...
...this continues to reproduce the urls for the first upload
In my Storage bucket the correct thumbnails are being generated, but the URLs are always for the first upload.
I don't know if this is a problem with getSignedUrl() or not; I'm really not sure what's going on here.
Here is my cloud function:
export const generateThumbs = functions.storage
  .object()
  .onFinalize(async object => {
    const bucket = gcs.bucket(object.bucket); // The Storage object.
    // console.log(object);
    console.log(object.name);
    const filePath = object.name; // File path in the bucket.
    const fileName = filePath.split('/').pop();
    const bucketDir = dirname(filePath);

    const workingDir = join(tmpdir(), 'thumbs');
    const tmpFilePath = join(workingDir, 'source.png');

    if (fileName.includes('thumb#') || !object.contentType.includes('image')) {
      console.log('exiting function');
      return false;
    }

    // 1. Ensure thumbnail dir exists
    await fs.ensureDir(workingDir);

    // 2. Download source file
    await bucket.file(filePath).download({
      destination: tmpFilePath
    });

    // 3. Resize the images and define an array of upload promises
    const sizes = [64, 256];
    const uploadPromises = sizes.map(async size => {
      const thumbName = `thumb#${size}_${fileName}`;
      const thumbPath = join(workingDir, thumbName);

      // Resize source image
      await sharp(tmpFilePath)
        .resize(size, size)
        .toFile(thumbPath);

      // Upload to GCS
      return bucket.upload(thumbPath, {
        destination: join(bucketDir, thumbName),
        metadata: {
          contentType: 'image/jpeg'
        }
      }).then((data) => {
        const file = data[0]
        // console.log(data)
        file.getSignedUrl({
          action: 'read',
          expires: '03-17-2100'
        }).then((response) => {
          const url = response[0];
          if (size === 64) {
            // console.log('generated 64');
            return admin.database().ref('profileThumbs').child(fileName).set({ thumb: url });
          } else {
            // console.log('generated 128');
            return admin.database().ref('categories').child(fileName).child('thumb').set(url);
          }
        })
        .catch(function (error) {
          console.error(error);
          return;
        });
      })
    });

    // 4. Run the upload operations
    await Promise.all(uploadPromises);

    // 5. Cleanup: remove the tmp/thumbs dir from the filesystem
    return fs.remove(workingDir);
  })
I cleaned up my code and solved my problem. Here is how I generated the URLs and wrote them to the proper database paths, by accessing the user's UID and postId in the file path:
export const generateThumbs = functions.storage
  .object()
  .onFinalize(async object => {
    const fileBucket = object.bucket; // The Storage bucket that contains the file.
    const filePath = object.name; // File path in the bucket.
    const fileName = filePath.split('/').pop();
    const userUid = filePath.split('/')[2];
    const sizes = [64, 256];
    const bucketDir = dirname(filePath);
    console.log(userUid);

    if (fileName.includes('thumb#') || !object.contentType.includes('image')) {
      console.log('exiting function');
      return false;
    }

    const bucket = gcs.bucket(fileBucket);
    const tempFilePath = path.join(tmpdir(), fileName);

    return bucket.file(filePath).download({
      destination: tempFilePath
    }).then(() => {
      sizes.map(size => {
        const newFileName = `thumb#${size}_${fileName}.png`
        const newFileTemp = path.join(tmpdir(), newFileName);
        const newFilePath = `thumbs/${newFileName}`
        return sharp(tempFilePath)
          .resize(size, size)
          .toFile(newFileTemp, () => {
            return bucket.upload(newFileTemp, {
              destination: join(bucketDir, newFilePath),
              metadata: {
                contentType: 'image/jpeg'
              }
            }).then((data) => {
              const file = data[0]
              console.log(data)
              file.getSignedUrl({
                action: 'read',
                expires: '03-17-2100'
              }, function(err, url) {
                console.log(url);
                if (err) {
                  console.error(err);
                  return;
                }
                if (size === 64) {
                  return admin.database().ref('profileThumbs').child(userUid).child(fileName).set({ thumb: url });
                } else {
                  return admin.database().ref('categories').child(fileName).child('thumb').set(url);
                }
              })
            })
          })
      })
    }).catch(error => {
      console.log(error);
    });
  })
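One caveat worth noting about this version: the promises created inside sizes.map(...) are never collected, so the function can resolve before every thumbnail has been uploaded and written to the database. A hedged sketch of the same flow with the promises awaited, using sharp's promise-returning toFile (i.e. no callback):

return bucket.file(filePath).download({ destination: tempFilePath }).then(() =>
  Promise.all(sizes.map(async (size) => {
    const newFileName = `thumb#${size}_${fileName}.png`;
    const newFileTemp = path.join(tmpdir(), newFileName);
    // Without a callback, toFile returns a promise we can await
    await sharp(tempFilePath).resize(size, size).toFile(newFileTemp);
    const [file] = await bucket.upload(newFileTemp, {
      destination: join(bucketDir, `thumbs/${newFileName}`),
      metadata: { contentType: 'image/jpeg' },
    });
    const [url] = await file.getSignedUrl({ action: 'read', expires: '03-17-2100' });
    if (size === 64) {
      return admin.database().ref('profileThumbs').child(userUid).child(fileName).set({ thumb: url });
    }
    return admin.database().ref('categories').child(fileName).child('thumb').set(url);
  }))
);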

JavaScript FileReader - how to parse a long file in chunks?

Initially, I implemented the file loading like this:
export function convertFilesToByteArray(e) {
  const MAX_FILE_SIZE = 1024 * 1024 * 50; // 50MB
  const files = Object.keys(e.target.files);
  const asyncReadFile = eachFile =>
    new Promise((resolve, reject) => {
      if (e.target.files[eachFile].size > MAX_FILE_SIZE) {
        return reject([{ message: `File ${e.target.files[eachFile].name} too large` }]);
      }
      const reader = new FileReader();
      const targetFileInfo = {
        contentType: e.target.files[eachFile].type,
        filename: e.target.files[eachFile].name,
      };
      reader.readAsArrayBuffer(e.target.files[eachFile]);
      reader.onload = () => {
        resolve({ ...targetFileInfo, body: Array.from(new Uint8Array(reader.result)) });
      };
      reader.onerror = error => reject(error);
    });
  return Promise.all(files.map(asyncReadFile));
}
Here the constant files holds the keys of the selected files, and I apply the read function to each of them.
And then I receive my file(s) in the component:
handleFileUpload = (e) => {
  convertFilesToByteArray(e)
    .then((result) => {
      runInAction(() => {
        this.files = [
          ...this.files,
          ...result,
        ];
      });
    })
    .catch(err => runInAction(() => {
      this.errors = [...this.errors, err[0].message];
    }));
}
The results are put into this.files, and finally my this.files looks like [{ contentType: 'plain/text', filename: 'blabla', body: [123, 456, 23, ...] }],
where [123, 456, 23, ...] is the content of my ArrayBuffer.
But with this approach, despite using Promise.all, the page freezes when loading a file (or files) heavier than ~2MB, and it becomes impossible to interact with it in any way (though I can still scroll). Nothing has come to mind to fix this except splitting each file into chunks.
OK, so I tried to rewrite the code to use chunks:
export function convertFilesToByteArray(e) {
  const MAX_FILE_SIZE = 1024 * 1024 * 50; // 50MB
  const files = Object.keys(e.target.files);
  const asyncReadFile = eachFile =>
    new Promise((resolve, reject) => {
      if (e.target.files[eachFile].size > MAX_FILE_SIZE) {
        return reject([{ message: `File ${e.target.files[eachFile].name} too large` }]);
      }
      const file = e.target.files[eachFile];
      let offset = 0;
      console.log(offset, 'offset', file.size, 'size');
      const defaultChunkSize = 64 * 1024; // bytes
      const fileReader = new FileReader();
      const blob = file.slice(offset, offset + defaultChunkSize);
      const isEndOfFile = () => offset >= file.size;
      const testEndOfFile = () => {
        if (isEndOfFile()) {
          console.log('Done reading file');
        }
      };
      fileReader.readAsArrayBuffer(blob);
      fileReader.onloadend = (event) => {
        const target = (event.target);
        if (target.error == null) {
          const result = target.result;
          offset += result.length;
          testEndOfFile();
          console.log(result, 'result');
          resolve(result);
        } else {
          reject(target.error);
        }
      };
    });
  return Promise.all(files.map(asyncReadFile));
}
Here I receive the file and slice it. But the problem is that if the file is bigger than one chunk, I have to read chunk after chunk and reassemble them, and I can't figure out how to do that in my case.
Please help me :) What do I need to do to read the file in chunks and receive the result as an ArrayBuffer?
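For reference, a minimal sketch of one way to do this (not the only one): kick off the next slice from inside the onload handler, collect the chunks, and concatenate them into a single Uint8Array once the offset reaches the file size:

function readFileInChunks(file, chunkSize = 64 * 1024) {
  return new Promise((resolve, reject) => {
    const fileReader = new FileReader();
    const chunks = [];
    let offset = 0;

    const readNextChunk = () => {
      // Slice the next chunk and read it as an ArrayBuffer
      fileReader.readAsArrayBuffer(file.slice(offset, offset + chunkSize));
    };

    fileReader.onload = () => {
      const chunk = new Uint8Array(fileReader.result);
      chunks.push(chunk);
      offset += chunk.byteLength;
      if (offset < file.size) {
        readNextChunk(); // keep reading until the whole file is consumed
      } else {
        // Concatenate all chunks into a single Uint8Array
        const whole = new Uint8Array(offset);
        let position = 0;
        for (const c of chunks) {
          whole.set(c, position);
          position += c.byteLength;
        }
        resolve(whole); // whole.buffer is the full ArrayBuffer
      }
    };
    fileReader.onerror = () => reject(fileReader.error);

    readNextChunk();
  });
}

Each file from e.target.files could then go through readFileInChunks instead of a single readAsArrayBuffer call. Note that chunking alone may not cure the freeze: converting a multi-megabyte buffer into a plain array with Array.from is itself heavy work on the main thread.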
