React: How to avoid performance problems when uploading images to Cloudinary

I wrote code that works, but it's no masterpiece and it has serious performance problems. How should I deal with them?
Problems
I have problems with the performance of my component:
Sending multiple images is very slow: when I send 5 pictures, it takes about 1-2 minutes.
How could I make sure that the photos are only uploaded to Cloudinary during the submission?
Right now, for example, when a user chooses photo A and then looks around and chooses photo B instead, both photo A and photo B are uploaded to the cloud.
Questions
Would it be a good idea to disable the add-album button while files are being transferred to Cloudinary?
Please help me solve these two problems; I want to make sure the application works smoothly. Thank you so much.
const AddAlbum = () => {
  const [fileUrl, setFileUrl] = useState();
  // Multiple files
  const [multipleFileUrl, setMultipleFileUrl] = useState([]);
  // Cloudinary upload API endpoint
  const api_cloudinary =
    "https://api.cloudinary.com/v1_1/cloudinary_name/image/upload";

  // Single file upload
  const handleSingleFileUpload = async (files) => {
    const formData = new FormData();
    formData.append("file", files[0]);
    formData.append("upload_preset", "preset_number");
    // Send Cloudinary the image and preset info
    const res = await fetch(api_cloudinary, {
      method: "POST",
      body: formData,
    });
    const file = await res.json();
    console.log(file);
    const fileUrl = file.eager[0].url;
    setFileUrl(fileUrl);
  };
  console.log(fileUrl);

  // Upload many files to Cloudinary
  // For now the max amount of files is 20
  const handleMultipleFileUpload = async (files) => {
    for (let i = 0; i < files.length; i++) {
      // A fresh FormData per file, so each request carries one image
      const formData = new FormData();
      formData.append("file", files[i]);
      formData.append("upload_preset", "preset_number");
      const res = await fetch(api_cloudinary, {
        method: "POST",
        body: formData,
      });
      const cloudinaryFile = await res.json();
      console.log(cloudinaryFile);
      setMultipleFileUrl((prev) => [...prev, cloudinaryFile]);
    }
  };

  // albumName and color come from state omitted in the question
  const handleSubmit = async (e) => {
    e.preventDefault();
    console.log(
      `
      Album name: ${albumName}
      color: ${color}
      Background Image: ${fileUrl}
      files: ${multipleFileUrl}
      `
    );
  };
And the form:
  return (
    <Wrapper>
      <form onSubmit={handleSubmit}>
        <div>
          <StyledLabel>Upload background image</StyledLabel>
          <DefaultInput
            type="file"
            onChange={(e) => handleSingleFileUpload(e.target.files)}
          />
        </div>
        <div>
          <StyledLabel>Upload album images</StyledLabel>
          <DefaultInput
            type="file"
            onChange={(e) => handleMultipleFileUpload(e.target.files)}
            multiple
          />
        </div>
        <StyledButton type="submit">Submit</StyledButton>
      </form>
    </Wrapper>
  );
};

You only need one function to process one or more image files. You need to get the contents of each file and load them into form data. The sample code uses the browser FileReader API to read the contents of local image files. Look at this code on repl.it, which uses vanilla JS; if it makes sense, you can fit it into your framework. Let me know if there are questions. https://repl.it/#rpeltz/fe-upload#script.js
reader.addEventListener("load", function () {
  const fileContents = reader.result;
  // ...append fileContents to the FormData and POST it
});
See https://developer.mozilla.org/en-US/docs/Web/API/FileReader
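To make that concrete for the component above, here is a minimal sketch of one function that handles one or more files and is only called from handleSubmit, so nothing uploads while the user is still browsing. readFile and uploadImages are illustrative names, the preset is a placeholder, and I'm reading Cloudinary's secure_url rather than the eager URL:

function readFile(file) {
  // Wrap FileReader in a promise so the contents can be awaited
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.addEventListener("load", () => resolve(reader.result));
    reader.addEventListener("error", reject);
    reader.readAsArrayBuffer(file);
  });
}

// One function for one or more files; call it from handleSubmit so
// photos A and B are only uploaded when the user actually submits.
async function uploadImages(files) {
  const uploads = [...files].map(async (file) => {
    const contents = await readFile(file);
    const formData = new FormData();
    formData.append("file", new Blob([contents], { type: file.type }));
    formData.append("upload_preset", "preset_number");
    const res = await fetch(api_cloudinary, { method: "POST", body: formData });
    return (await res.json()).secure_url;
  });
  // Starting all requests at once instead of awaiting inside a loop
  // is what removes most of the multi-minute wait.
  return Promise.all(uploads);
}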

I recommend using a library that handles this for you, since there are many edge cases to take care of when dealing with file uploads.
react-uploady, for example, does this for you, so it's as easy as this:
import React from "react";
import Uploady from "@rpldy/uploady";
import UploadButton from "@rpldy/upload-button";

const CLOUD_NAME = "<your-cloud-name>";
const UPLOAD_PRESET = "<your-upload-preset>";

const App = () => (
  <Uploady
    destination={{
      url: `https://api.cloudinary.com/v1_1/${CLOUD_NAME}/upload`,
      params: {
        upload_preset: UPLOAD_PRESET,
      },
    }}>
    <UploadButton/>
  </Uploady>
);
For production code, where you don't want to allow unsigned uploads, I recommend looking at this guide.
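The rough shape of a signed upload, independent of any library, looks like this; the /api/sign-upload endpoint is hypothetical and would compute the signature server-side with your API secret:

async function signedUpload(file) {
  // Hypothetical backend route returning { signature, timestamp, apiKey }
  const { signature, timestamp, apiKey } = await (
    await fetch("/api/sign-upload")
  ).json();
  const formData = new FormData();
  formData.append("file", file);
  formData.append("api_key", apiKey);
  formData.append("timestamp", timestamp);
  formData.append("signature", signature);
  // The signature (not an unsigned preset) authorizes this upload
  return fetch(`https://api.cloudinary.com/v1_1/${CLOUD_NAME}/image/upload`, {
    method: "POST",
    body: formData,
  });
}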

Related

How to correctly display images on React frontend that are returned by Laravel as StreamedResponse objects?

The idea is as follows:
Images/documents are stored privately on the server
A logged-in user on the frontend clicks a button, which sends an axios request to the backend to get an aggregated result of ModelA from TableA and its associated attachment file list from TableB
For each ModelA, numerous requests are made to the endpoint to fetch images, which are returned as \Symfony\Component\HttpFoundation\StreamedResponse via Storage::download($request->file_name)
This works in the sense that files are returned.
Note: I tried attaching all the files to the response in step 2, but this didn't work, so I added the extra step of getting the file list first and fetching individual files based on it. This might kill the webserver if the number of requests becomes too high, so I would appreciate any advice on a different approach.
The problem
How do I display the files in React, and is this the right approach at all, considering the potential performance issues noted above?
I've tried the following:
Creating an octet-stream URL with FileReader, but the previews wouldn't display and all had the same URL, despite await being used on the reader.readAsDataURL(blob) call:
const { email, name, message, files } = props
const [previews, setPreviews] = useState<string[]>([])
const { attachments } = useAttachment(files)

useEffect(() => {
  const p = previews
  files && attachments?.forEach(async filename => {
    const reader = new FileReader()
    reader.onloadend = () => {
      p.push(reader.result as string)
      setPreviews(p)
    }
    const blob = new Blob([filename])
    await reader.readAsDataURL(blob)
  })
}, [files, attachments, previews])
Creating src attributes with URL.createObjectURL(), but these, although generated and unique, wouldn't display when used in an <img /> tag:
useEffect(() => {
  const p = previews
  files && attachments?.forEach(filename => {
    const blob = new Blob([filename])
    const src = URL.createObjectURL(blob)
    p.push(src)
    setPreviews(p)
  })
}, [files, attachments, previews])
Results example:
<img src="blob:http://127.0.0.1:8000/791f5efb-1b4e-4474-a4b6-d7b14b881c28" class="chakra-image css-0">
<img src="blob:http://127.0.0.1:8000/3d93449e-175d-49af-9a7e-61de3669817c" class="chakra-image css-0">
Here's the useAttachment hook:
import { useEffect, useState } from 'react'
import { api } from '@utils/useAxios'

const useAttachment = (files: any[] | undefined) => {
  const [attachments, setAttachments] = useState<any[]>([])

  const handleRequest = async (data: FormData) => {
    await api().post('api/attachment', data).then(resp => {
      const attach = attachments
      attach.push(resp)
      setAttachments(attach)
    })
  }

  useEffect(() => {
    if (files) {
      files.forEach(async att => {
        const formData = new FormData()
        formData.append('file_name', att.file_name)
        await handleRequest(formData)
      })
    }
  }, [files, attachments])

  return { attachments }
}

export default useAttachment
Try Storage::response(). This is the same as Storage::download(), except that it sets the Content-Disposition header to inline instead of attachment.
This tells the browser to display the file instead of downloading it. See the MDN docs here.
Then you can use it as the src for an <img/>.
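A sketch of what that looks like from React, assuming a hypothetical GET route like /api/attachment/{file_name} that returns Storage::response(...), and cookie-based auth so the browser can load it directly:

const Attachment = ({ fileName }) => (
  // The browser fetches the image itself; the inline Content-Disposition
  // makes it render in place instead of triggering a download.
  <img src={`/api/attachment/${encodeURIComponent(fileName)}`} alt={fileName} />
);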
Solved it by sending the files in a single response, encoded with base64_encode(Storage::get('filename')). Then, on the frontend, it was as simple as:
const base64string = 'stringReturned'
<img src={`data:image/png;base64,${base64string}`} />
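If several files come back in one response, the same idea extends to a list. The field names here are made up, and in practice you would also return each file's real MIME type rather than assuming PNG:

const Previews = ({ attachments }) => (
  <>
    {attachments.map(({ file_name, base64 }) => (
      <img
        key={file_name}
        src={`data:image/png;base64,${base64}`}
        alt={file_name}
      />
    ))}
  </>
);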

Uploading an mp3 to Firebase Storage with React Native Expo

I am attempting to upload an mp3 to Firebase Storage using Expo and React Native. So far I've got the file into Firebase Storage, but it's only 9 bytes large, so I'm doing something wrong. I've attempted this with a Blob as shown below, with no success.
A screenshot of the Firebase Storage folder shows the file uploaded, but without the file's data.
Any help is greatly appreciated; I feel like I'm missing a step to actually upload the data along with the file.
export default function SongPicker() {
  const [song, setSong] = useState(null);
  // Get current user through authentication
  const user = auth.currentUser;

  const pickDocument = async () => {
    let result = await DocumentPicker.getDocumentAsync({});
    // Fetch the photo with its local URI
    const response = fetch(result.uri);
    alert(result.uri);
    console.log(result);
    const file = new Blob(
      [response.value], {
        type: 'audio/mpeg'
      });
    console.log('do we see this?');
    try {
      // Create the file reference
      const storage = getStorage();
      const storageRef = ref(storage, `songs/${user.uid}/${result.name}`);
      // Upload Blob file to Firebase
      const snapshot = uploadBytes(storageRef, file, 'blob').then((snapshot) => {
        console.log('Uploaded a song to firebase storage!');
      });
      setSong(result.uri);
    } catch (error) {
      console.log(error);
    }
  }
The fetch() returns a Promise so you should add an await for that as well.
const response = await fetch(result.uri);
Then try using the blob() method on the Response:
const file = await response.blob()
The third param in uploadBytes should be an upload metadata object, but you can skip it here:
const snapshot = await uploadBytes(storageRef, file);
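Putting those three fixes together, pickDocument would look roughly like this (a sketch based on the code in the question, not tested against Expo):

const pickDocument = async () => {
  const result = await DocumentPicker.getDocumentAsync({});
  // Await the fetch of the local URI, then convert the Response to a Blob
  const response = await fetch(result.uri);
  const file = await response.blob();
  try {
    const storage = getStorage();
    const storageRef = ref(storage, `songs/${user.uid}/${result.name}`);
    // Metadata is optional, so no third argument is needed
    await uploadBytes(storageRef, file);
    console.log('Uploaded a song to firebase storage!');
    setSong(result.uri);
  } catch (error) {
    console.log(error);
  }
};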

How to display image downloaded from Azure Blob Storage in React

First of all, I've checked this question and it's not a duplicate: How to display an image saved as blob in React. I'm not sure what they're doing, but our code is entirely different.
I have been following Azure's documentation on downloading files from blob storage via React: https://learn.microsoft.com/en-us/javascript/api/overview/azure/storage-blob-readme?view=azure-node-latest#download-a-blob-and-convert-it-to-a-string-browsers
Here is my component's did-mount effect, where I'm downloading the images from my blob storage container.
async function blobToString(blob) { // copy-pasted from Azure, who provided this as a helper function
  const fileReader = new FileReader();
  return new Promise((resolve, reject) => {
    fileReader.onloadend = (ev) => {
      resolve(ev.target.result);
    };
    fileReader.onerror = reject;
    fileReader.readAsText(blob);
  });
}
useEffect(() => {
  async function fetchData() {
    if (!mounted) {
      console.log(images)
      setImages(images.map(async (image) => {
        // console.log(image)
        const blobClient = containerClient.getBlobClient(image.src);
        let downloadBlockBlobResponse = await blobClient.download()
        let downloaded = await blobToString(await downloadBlockBlobResponse.blobBody)
        // console.log(downloaded)
        image.src = URL.createObjectURL(downloadBlockBlobResponse);
        return image;
      }));
      console.log(images)
      setMounted(true);
    }
  }
  fetchData();
}, []);

return (
  <>
    {mounted ? <>
      <img src={images[0].src} />
    </> : null}
  </>
)
When I console.log downloaded, which is the string version of the image, Chrome says it's a 1.1 MB string, so I'm sure it's the entire image.
I know that this is not a valid image src because the screen is entirely blank, and the image being pulled from blob storage is not a white picture. (There is no doubt that mounted is being set correctly. I'm actually running this image through a custom component which also reports that the image source is invalid, and you'll just have to believe that I'm passing the source correctly to this component.)
Any idea how I can create a valid image source from this convoluted string Azure has produced?
Edit: unfortunately, whatever this blob is, it still doesn't register as a valid image source. My code is a little different because I was getting the error "Failed to execute 'createObjectURL' on 'URL'", so I followed the instructions there.
This also does not work.
The Azure SDK documentation example just shows how you would consume the downloaded blob as a string using blobToString() (assuming the blob is text). In your case you don't need to convert it to a string. Could you please try:
const downloaded = await downloadBlockBlobResponse.blobBody;
image.src = URL.createObjectURL(downloaded);
Edited: not sure what you are missing, but I just tried, and this simple page shows the downloaded image correctly.
// imageUrl, client, and downloaded are assumed to be declared at module scope
export default function Page(): JSX.Element {
  const [imgSrc, setImgSrc] = useState<string>("");

  useEffect(() => {
    async function fetchImage() {
      if (!downloaded) {
        if (imageUrl) {
          client = new BlockBlobClient(imageUrl);
          const response = await client.download();
          const blob = await response.blobBody;
          if (blob) {
            setImgSrc(URL.createObjectURL(blob));
            console.log("##### ", imgSrc);
          }
        }
        downloaded = true;
      }
    }
    fetchImage();
  }, [imgSrc]);

  return (
    <>
      <div className="jumbotron">
        <h1>Storage Blob Download Sample!</h1>
        <div className="alert alert-light">
          <h4 className="alert-heading">Image</h4>
          <div className="container">
            <img src={imgSrc} />
          </div>
        </div>
      </div>
    </>
  );
}
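One caveat worth adding to both snippets (it is not in the original answer): object URLs keep their blobs alive until revoked, so a long-lived page should release them, for example:

useEffect(() => {
  // Revoke the previous object URL when imgSrc changes or on unmount
  return () => {
    if (imgSrc) URL.revokeObjectURL(imgSrc);
  };
}, [imgSrc]);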

File content changes when uploading to api

I'm in the process of creating a small Next.js application which includes a file upload form. The upload almost works: uploading simple text files works just fine, but binary files like pictures change slightly and I can't figure out why.
The content of the file is read using the javascript FileReader and the output of that is used as body for the post request.
The byteLength on the client side matches the byte size of the file from ls -l, so I'm assuming that value is correct.
But when the size of the body is logged on the API side, it is a few bytes less for binary files. For text files the sizes are the same.
In the real code, the file content is then sent to another API which stores it in a database and makes it available for download. The content is not the same: it looks like the "pattern" of where the bytes are has remained for pictures, but the bytes themselves are different.
For example, a small png file with a size of 1764 bytes is still 1764 bytes on the client side but becomes 1731 bytes on the server side.
Here is the client side code:
import { useState } from "react";

const TestPage = () => {
  const [file, setFile] = useState();

  function readFile(file) {
    return new Promise((resolve, reject) => {
      const reader = new FileReader();
      reader.onload = (res) => {
        resolve(res.target.result);
      };
      reader.onerror = (err) => reject(err);
      reader.readAsArrayBuffer(file);
    });
  }

  function onFilesChosen({ target }) {
    setFile(target.files[0]);
  }

  async function onClick(event) {
    event.preventDefault();
    const fileContent = await readFile(file);
    console.log(fileContent.byteLength);
    fetch("/api/upload", {
      method: "POST",
      body: fileContent,
      headers: {
        "Content-Type": "application/octet-stream",
      },
    }).then((r) => alert("Uploaded!"));
  }

  return (
    <div>
      <form>
        <div className="form-group">
          <label htmlFor="file-upload">Choose file</label>
          <input className="form-control-file" onChange={onFilesChosen} type="file" name="file-upload"/>
        </div>
        <button type="submit" onClick={onClick}>Upload</button>
      </form>
    </div>
  );
};

export default TestPage;
And this is the simple server side code (just to show the received file size):
export default function handler(req, res) {
  const data = req.body;
  console.log("length", data.length);
  return res.status(200).end();
}
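One likely culprit, assuming the default Next.js API route configuration: Next.js runs request bodies through a parser before the handler sees them, and decoding binary data as text is exactly the kind of step that loses a few bytes. The parser can be disabled per route and the raw stream read instead; a sketch:

// pages/api/upload.js
export const config = {
  api: {
    bodyParser: false, // hand the raw request stream to the handler
  },
};

export default async function handler(req, res) {
  // Collect the raw bytes instead of relying on req.body
  const chunks = [];
  for await (const chunk of req) {
    chunks.push(chunk);
  }
  const data = Buffer.concat(chunks);
  console.log("length", data.length); // should now match the client's byteLength
  return res.status(200).end();
}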
I've tried using axios but couldn't get it to work.
Any suggestions about what I'm doing wrong?

How to upload a file into Firebase Storage from a callable https cloud function

I have been trying to upload a file to Firebase Storage from a callable Firebase Cloud Function.
All I am doing is fetching an image from a URL using axios and trying to upload it to Storage.
The problem I am facing is that I don't know how to save the response from axios and upload it to Storage.
First, how do I save the received file in the temp directory that os.tmpdir() creates?
Then, how do I upload it to Storage?
Here I am receiving the data as an arraybuffer, converting it to a Blob, and trying to upload that.
Here is my code. I think I am missing a major part.
If there is a better way, please recommend it. I've been looking through a lot of documentation and ended up with no clear solution. Please guide me. Thanks in advance.
const bucket = admin.storage().bucket();
const path = require('path');
const os = require('os');
const fs = require('fs');

module.exports = functions.https.onCall((data, context) => {
  try {
    return new Promise((resolve, reject) => {
      const {
        imageFiles,
        companyPIN,
        projectId
      } = data;
      const filename = imageFiles[0].replace(/^.*[\\\/]/, '');
      const filePath = `ProjectPlans/${companyPIN}/${projectId}/images/${filename}`; // path I am trying to upload to in Firebase Storage
      const tempFilePath = path.join(os.tmpdir(), filename);
      const metadata = {
        contentType: 'application/image'
      };
      axios
        .get(imageFiles[0], { // URL for the image
          responseType: 'arraybuffer',
          headers: {
            accept: 'application/image'
          }
        })
        .then(response => {
          console.log(response);
          const blobObj = new Blob([response.data], {
            type: 'application/image'
          });
          return blobObj;
        })
        .then(async blobObj => {
          return bucket.upload(blobObj, {
            destination: tempFilePath // here I am wrong: how to set the path of the downloaded blob file?
          });
        }).then(buffer => {
          resolve({ result: 'success' });
        })
        .catch(ex => {
          console.error(ex);
        });
    });
  } catch (error) {
    // unknown: 500 Internal Server Error
    throw new functions.https.HttpsError('unknown', 'Unknown error occurred. Contact the administrator.');
  }
});
I'd take a slightly different approach and avoid using the local filesystem at all, since it's just tmpfs and will cost you memory that your function is using anyway to hold the buffer/blob. It's simpler to skip it and write directly from that buffer to GCS using the save method on the GCS file object.
Here's an example. I've simplified out a lot of your setup, and I am using an HTTP function instead of a callable. Likewise, I'm using a public Stack Overflow image and not your original URLs. In any case, you should be able to use this template and modify it back to what you need (e.g. change the prototype, remove the HTTP response, and replace it with the return value you need):
const functions = require('firebase-functions');
const axios = require('axios');
const admin = require('firebase-admin');
admin.initializeApp();

exports.doIt = functions.https.onRequest((request, response) => {
  const bucket = admin.storage().bucket();
  const IMAGE_URL = 'https://cdn.sstatic.net/Sites/stackoverflow/company/img/logos/so/so-logo.svg';
  const MIME_TYPE = 'image/svg+xml';
  return axios.get(IMAGE_URL, { // URL for the image
    responseType: 'arraybuffer',
    headers: {
      accept: MIME_TYPE
    }
  }).then(response => {
    console.log(response); // only to show we got the data, for debugging
    const destinationFile = bucket.file('my-stackoverflow-logo.svg');
    return destinationFile.save(response.data).then(() => { // note: defaults to resumable upload
      return destinationFile.setMetadata({ contentType: MIME_TYPE });
    });
  }).then(() => { response.send('ok'); })
    .catch((err) => { console.log(err); })
});
As a commenter noted, the axios request in the above example makes an external network access, and you will need to be on the Blaze or Flame plan for that. However, that alone doesn't appear to be your current problem.
Likewise, this also defaults to using a resumable upload, which the documentation does not recommend when you are doing large numbers of small (<10MB) files, as there is some overhead.
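For that small-file case, save() accepts an options object, so switching off resumable uploads is one flag (otherwise the same code as above):

return destinationFile.save(response.data, { resumable: false }).then(() => {
  return destinationFile.setMetadata({ contentType: MIME_TYPE });
});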
You asked how this might be used to download multiple files. Here is one approach. First, let's assume you have a function that returns a promise that downloads a single file given its URL (I've abridged this from the above, but it's basically identical except that IMAGE_URL becomes the filename parameter; note that it does not return a final result such as response.send(), and there's an implicit assumption that all the files share the same MIME_TYPE):
function downloadOneFile(filename) {
  const bucket = admin.storage().bucket();
  const MIME_TYPE = 'image/svg+xml';
  return axios.get(filename, ...)
    .then(response => {
      const destinationFile = ...
    });
}
Then, you just need to iteratively build a promise chain from the list of files. Let's say they are in imageUrls. Once built, return the entire chain:
let finalPromise = Promise.resolve();
imageUrls.forEach((item) => { finalPromise = finalPromise.then(() => downloadOneFile(item)); });
// if needed, add a final .then() section for the actual function result
return finalPromise.catch((err) => { console.log(err) });
Note that you could also build an array of the promises and pass them to Promise.all() -- that would likely be faster as you would get some parallelism, but I wouldn't recommend that unless you are very sure all of the data will fit inside the memory of your function at once. Even with this approach, you need to make sure the downloads can all complete within your function's timeout.
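For completeness, that parallel variant would look like this; it only makes sense when all of the downloads fit in memory at once:

// Starts every download concurrently and resolves when all finish
return Promise.all(imageUrls.map(downloadOneFile))
  .catch((err) => { console.log(err); });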
