First of all, I've checked this question and it's not a duplicate of How to display an image saved as blob in React.
I'm not sure what they're doing, but our code is entirely different.
I have been following Azure's documentation on downloading files from blob storage via React: https://learn.microsoft.com/en-us/javascript/api/overview/azure/storage-blob-readme?view=azure-node-latest#download-a-blob-and-convert-it-to-a-string-browsers
Here is my component's mount effect, where I'm downloading the images from my blob storage container.
async function blobToString(blob) { // this is copy-pasted from Azure who provided this as a helper function
  const fileReader = new FileReader();
  return new Promise((resolve, reject) => {
    fileReader.onloadend = (ev) => {
      resolve(ev.target.result);
    };
    fileReader.onerror = reject;
    fileReader.readAsText(blob);
  });
}

useEffect(() => {
  async function fetchData() {
    if (!mounted) {
      console.log(images)
      setImages(images.map(async (image) => {
        // console.log(image)
        const blobClient = containerClient.getBlobClient(image.src);
        let downloadBlockBlobResponse = await blobClient.download()
        let downloaded = await blobToString(await downloadBlockBlobResponse.blobBody)
        // console.log(downloaded)
        image.src = URL.createObjectURL(downloadBlockBlobResponse);
        return image;
      }));
      console.log(images)
      setMounted(true);
    }
  }
  fetchData();
}, []);

return (
  <>
    {mounted ? <>
      <img src={images[0].src} />
    </> : null}
  </>
)
When I console.log downloaded, which is the string version of the image, this is what I see:
Chrome says it's a 1.1 MB string, so I'm sure it's the entire image.
I know that this is not a valid image src because the screen is entirely blank, and the image being pulled from blob storage is not a white picture. (There is no doubt that mounted is being set correctly. I'm actually running this image through a custom component, which also reports that the image source is invalid, and you'll just have to believe that I'm passing the source correctly to that component.)
Any idea how I can create a valid image source from this convoluted string Azure has produced?
edit:
Unfortunately, whatever this blob is, it still isn't registering as a valid image source. My code is a little different because I was getting this error (Failed to execute 'createObjectURL' on 'URL':), so I followed the instructions there.
This also does not work:
The Azure SDK documentation example just shows how you would consume the downloaded Blob as a string using blobToString() (assuming the blob is text). In your case you don't need to convert it to a string. Could you please try:
const downloaded = await downloadBlockBlobResponse.blobBody;
image.src = URL.createObjectURL(downloaded);
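One more thing to watch in the question's fetchData: Array.prototype.map with an async callback returns an array of promises, so setImages ends up storing promises rather than image objects. A minimal sketch that addresses both points (it reuses the question's containerClient, images, setImages and setMounted; that surrounding state is my assumption, and this is not part of the Azure sample):
const updated = await Promise.all(
  images.map(async (image) => {
    const blobClient = containerClient.getBlobClient(image.src);
    const response = await blobClient.download();
    const blob = await response.blobBody;                 // a Blob in the browser
    return { ...image, src: URL.createObjectURL(blob) };  // usable directly as <img src>
  })
);
setImages(updated);
setMounted(true);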
Edited: Not sure what you are missing, but I just tried, and this simple page shows the downloaded image correctly.
export default function Page(): JSX.Element {
  const [imgSrc, setImgSrc] = useState<string>("");

  useEffect(() => {
    async function fetchImage() {
      if (!downloaded) {
        if (imageUrl) {
          client = new BlockBlobClient(imageUrl);
          const response = await client.download();
          const blob = await response.blobBody;
          if (blob) {
            setImgSrc(URL.createObjectURL(blob));
            console.log("##### ", imgSrc);
          }
        }
        downloaded = true;
      }
    }
    fetchImage();
  }, [imgSrc]);

  return (
    <>
      <div className="jumbotron">
        <h1>Storage Blob Download Sample!</h1>
        <div className="alert alert-light">
          <h4 className="alert-heading">Image</h4>
          <div className="container">
            <img src={imgSrc} />
          </div>
        </div>
      </div>
    </>
  );
}
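One optional addition to either version (my suggestion, not something the Azure sample itself does): object URLs created with URL.createObjectURL stay alive until they are revoked or the page unloads, so if you download many images it can help to revoke them on cleanup, e.g.:
// Sketch: revoke the object URL when imgSrc changes or the component unmounts.
useEffect(() => {
  return () => {
    if (imgSrc) URL.revokeObjectURL(imgSrc);
  };
}, [imgSrc]);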
The idea is as follows:
Images/documents are stored privately on the server
A logged-in user on the frontend clicks a button which sends an axios request to the backend to get an aggregated result of ModelA from TableA and its associated attachment file list from TableB
For each ModelA, numerous requests are made to the endpoint to fetch the images, which are returned as \Symfony\Component\HttpFoundation\StreamedResponse via Storage::download($request->file_name)
This works in the sense that files are returned.
Note - I tried attaching all the files to the response in step 2, but this didn't work, so I added the extra step of getting the file list and then fetching the individual files based on that list. This might kill the webserver if the number of requests becomes too high, so I would appreciate any advice on a different approach.
The problem
How do I display the files in React, and is this the right approach at all, considering the potential performance issues noted above?
I've tried the following:
Create an octet-stream URL with FileReader, but these wouldn't display and all had the same URL despite await being used on the reader.readAsDataURL(blob) call:
const { email, name, message, files } = props
const [previews, setPreviews] = useState<string[]>([])
const { attachments } = useAttachment(files)

useEffect(() => {
  const p = previews
  files && attachments?.forEach(async filename => {
    const reader = new FileReader()
    reader.onloadend = () => {
      p.push(reader.result as string)
      setPreviews(p)
    }
    const blob = new Blob([filename])
    await reader.readAsDataURL(blob)
  })
}, [files, attachments, previews])
Create src attributes with URL.createObjectURL() but these, although generated and unique, wouldn't display when used in an <img /> tag:
useEffect(() => {
  const p = previews
  files && attachments?.forEach(filename => {
    const blob = new Blob([filename])
    const src = URL.createObjectURL(blob)
    p.push(src)
    setPreviews(p)
  })
}, [files, attachments, previews])
Results example:
<img src="blob:http://127.0.0.1:8000/791f5efb-1b4e-4474-a4b6-d7b14b881c28" class="chakra-image css-0">
<img src="blob:http://127.0.0.1:8000/3d93449e-175d-49af-9a7e-61de3669817c" class="chakra-image css-0">
Here's the useAttachment hook:
import { useEffect, useState } from 'react'
import { api } from '#utils/useAxios'

const useAttachment = (files: any[] | undefined) => {
  const [attachments, setAttachments] = useState<any[]>([])

  const handleRequest = async (data: FormData) => {
    await api().post('api/attachment', data).then(resp => {
      const attach = attachments
      attach.push(resp)
      setAttachments(attach)
    })
  }

  useEffect(() => {
    if (files) {
      files.forEach(async att => {
        const formData = new FormData()
        formData.append('file_name', att.file_name)
        await handleRequest(formData)
      })
    }
  }, [files, attachments])

  return { attachments }
}

export default useAttachment
Try Storage::response(). This is the same as Storage::download(), except that it sets the Content-Disposition header to inline instead of attachment.
This tells the browser to display it instead of downloading it. See MDN Docs Here
Then you can use it as the src for an <img/>.
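On the React side that can be as simple as pointing the img at the route that returns Storage::response() (a sketch; the /api/attachment/{file_name} route shown here is hypothetical, so substitute your own endpoint):
// Hypothetical endpoint that returns Storage::response($request->file_name),
// so Content-Disposition: inline lets the browser render the image directly.
const AttachmentImage = ({ fileName }) => (
  <img src={`/api/attachment/${encodeURIComponent(fileName)}`} alt={fileName} />
);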
Solved it by sending the files in a single response, but encoded with base64_encode(Storage::get('filename')). Then, on the frontend, it was as simple as:
const base64string = 'stringReturned'
<img src={`data:image/png;base64,${base64string}`} />
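For multiple attachments the same idea extends naturally (a sketch; the attachments/base64 field names below are made up, so adjust them to whatever your single response actually returns):
// Assumed response shape: { attachments: [{ file_name, base64 }] }
const previews = resp.data.attachments.map(
  (a) => `data:image/png;base64,${a.base64}`
);
// ...later, in JSX:
{previews.map((src, i) => <img key={i} src={src} alt="" />)}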
I'm trying to build a PDF viewer in React. I've already built the backend to upload the files and fetch the buffer data, but I'm having some problems with react-pdf, as I can't seem to serve the right type of data.
This is the code I've written:
const [data, setData] = useState();

useEffect(() => {
  fetch("http://localhost:5000/files/pdf-test.pdf")
    .then((res) => res.text())
    .then((res) => setData(new Buffer(res, "binary")));
});

return (
  <>
    <Document file={{ data: data }} />
  </>
);
This is one of the few tries I've made. In this one, the backend serves the binary data of the file, and if we console.log the last .then, we can see that I'm passing the Document a Uint8Array, which is the recommended data format:
Apart from the code in the image, I've also tried it with binary and an ArrayBuffer, but still no results, changing both the backend and frontend code.
The error I get is:
TL;DR: I'm trying to display PDF files in React with react-pdf using their buffer but I can't manage to do it. I have already made a backend to upload files (express-fileupload) and store them so that their data can be fetched.
Thank you for helping, and I'm open to other approaches to the problem.
For CRA (create-react-app) you need to configure the worker in order to view your PDF file.
import { Document, Page, pdfjs } from 'react-pdf';
Then configure it like this:
pdfjs.GlobalWorkerOptions.workerSrc = `//cdnjs.cloudflare.com/ajax/libs/pdf.js/${pdfjs.version}/pdf.worker.js`;
Usage:
const [uint8Arr, setUint8Arr] = useState();

function getUint8Array() {
  let reader = new FileReader();
  // reader.readAsDataURL(selectedFile); base64
  reader.readAsArrayBuffer(selectedFile);
  reader.onloadend = async (e) => {
    if (e.target?.result instanceof ArrayBuffer) {
      const uint8Array = new Uint8Array(e.target.result);
      setUint8Arr(uint8Array); // <--
      // more callbacks(file.name, Buffer.from(new Uint8Array(target.result)));
    }
  };
}

<Document file={{
  data: uint8Arr // <--
}} onLoadSuccess={() => console.log('SUCCESS LOAD')}>
  <Page pageNumber={1} />
</Document>
Hope that this will fix your issue.
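Since the question fetches the PDF over HTTP rather than reading a local File, the same Uint8Array idea can be applied with res.arrayBuffer() instead of FileReader (a sketch reusing the question's endpoint and setData; res.text() decodes the bytes as text and corrupts binary data, which is likely why the Buffer attempt failed):
useEffect(() => {
  fetch("http://localhost:5000/files/pdf-test.pdf")
    .then((res) => res.arrayBuffer())              // binary-safe, unlike res.text()
    .then((buf) => setData(new Uint8Array(buf)));
}, []);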
I am using a React component as my Markdown editor. This component provides a dropzone to paste files into, and it accepts a callback function which is passed the ArrayBuffer of the file (mainly image files). The backend expects an image file as if uploaded by a form, but having an ArrayBuffer returned instead of the file has proven to be a bit of an issue.
I have attempted to convert the ArrayBuffer to a Blob; still, the backend needs other info from the uploaded file, such as the filename and size, which exist on the original image file.
I'd appreciate any help! Thank you.
Example below:
function converArrayBufferToImage(ab) {
  // Obtain a blob: URL for the image data.
  return new Blob([ab])
}

export default function ReactMdEditor(props) {
  const { value, setValue } = props;

  const save = async function* (file) {
    let fileData = new FormData()
    let convertedFile = converArrayBufferToImage(file)
    fileData.append('file', convertedFile);
    try {
      const response = yield uploadFile(fileData);
      const data = response.data;
      yield data.url;
      return true;
    } catch (error) {
      console.log(error);
      return false;
    }
  };

  return (
    <div>
      <ReactMde
        value={value}
        onChange={setValue}
        childProps={{
          writeButton: {
            tabIndex: -1
          }
        }}
        paste={{
          saveImage: save
        }}
      />
    </div>
  );
}
Edit: if needed for context, my backend is built with Python and Flask.
Alright, it seems I have found a solution.
Here is the save function now:
const save = async function* (file) {
  try {
    const blobFile = new Blob([file], { "type": "image/png" });
    let formData = new FormData()
    formData.append('file', blobFile, 'file.png');
    try {
      const response = await uploadFile(formData);
      const data = response.data;
      yield data.url;
      return true;
    } catch (error) {
      console.log(error);
      return false;
    }
  } catch (e) {
    console.warn(e.message)
    return false;
  }
};
You'll need to generate a unique filename on the client or server, if you aren't doing so already, so that existing files with the same name don't get overwritten.
Explanation of what this function does, if anyone needs it: it simply gets passed an ArrayBuffer, converts it to an image Blob, and uploads it to the server, the latter returning the URL where the file was saved.
I hope this helps someone and saves them some time.
I wrote code that works, but it's not a masterpiece. It has huge performance problems. How would I deal with them?
Problems
I have problems with the performance of my component:
Sending multiple images is very slow; when I send 5 pictures, it takes about 1-2 minutes.
How could I make sure that the photos are only uploaded to Cloudinary during the submission?
Because, for example, when a user chooses photo A and then takes a look around and chooses photo B, photos A and B are uploaded to the cloud.
Questions
Would it be a good idea to disable the button for adding an album while transferring files to Cloudinary?
Please help me to solve these two problems and I want to make sure that the application works smoothly. Thank you so much.
const AddAlbum = () => {
  const [fileUrl, setFileUrl] = useState();
  // MultipleFiles
  const [multipleFileUrl, setMultipleFileUrl] = useState([]);
  // cloudinary upload api endpoint
  const api_cloudinary =
    "https://api.cloudinary.com/v1_1/cloudinary_name/image/upload";

  // file upload
  const handleSingleFileUpload = async (files) => {
    const formData = new FormData();
    formData.append("file", files[0]);
    formData.append("upload_preset", "preset_number");
    // send cloudinary image and presets info
    const res = await fetch(api_cloudinary, {
      method: "POST",
      body: formData,
    });
    const file = await res.json();
    console.log(file);
    const fileUrl = await file.eager[0].url;
    setFileUrl(fileUrl);
  };
  console.log(fileUrl);

  // upload many files to cloudinary
  // For now max amount of files is 20
  const handleMultipleFileUpload = async (files, amount) => {
    const formData = new FormData();
    for (let i = 0; i < files.length; i++) {
      let file = files[i];
      formData.append("file", file);
      formData.append("upload_preset", "preset_number");
      const res = await fetch(api_cloudinary, {
        method: "POST",
        body: formData,
      });
      const cloudinaryFiles = await res.json();
      console.log(cloudinaryFiles);
      setMultipleFileUrl(cloudinaryFiles);
    }
  };

  const handleSubmit = async (e) => {
    e.preventDefault();
    console.log(
      `
Album name:${albumName}
color:${color}
Background Image: ${fileUrl}
files:${multipleFileUrl}
`
    );
  };
And the form:
  return (
    <Wrapper>
      <form onSubmit={handleSubmit}>
        <div>
          <StyledLabel>Upload background image</StyledLabel>
          <DefaultInput
            type="file"
            onChange={(e) => handleSingleFileUpload(e.target.files)}
          />
        </div>
        <div>
          <StyledLabel>Upload background image</StyledLabel>
          <DefaultInput
            type="file"
            onChange={(e) => handleMultipleFileUpload(e.target.files)}
            multiple
          />
        </div>
        <StyledButton type="submit">Submit</StyledButton>
      </form>
    </Wrapper>
  );
};
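On the question of uploading only during submission (a sketch reusing the question's api_cloudinary endpoint and preset placeholder; this is my suggestion, not part of either answer below): keep the selected File objects in state in the onChange handlers and move the Cloudinary requests into handleSubmit, running them in parallel with Promise.all. Disabling the submit button while this runs, as you suggest, is a reasonable complement.
// Sketch: defer all uploads until the form is submitted.
const [pendingFiles, setPendingFiles] = useState([]);

const handleSubmit = async (e) => {
  e.preventDefault();
  const uploaded = await Promise.all(
    pendingFiles.map(async (file) => {
      const formData = new FormData();
      formData.append("file", file);
      formData.append("upload_preset", "preset_number"); // same placeholder as above
      const res = await fetch(api_cloudinary, { method: "POST", body: formData });
      return (await res.json()).url;
    })
  );
  setMultipleFileUrl(uploaded);
};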
You only need one function to process one or more image files. You need to get the contents of each file and load them into form data. The sample code uses the browser FileReader API to read the contents of local image files. Look at this code on repl.it, which uses vanilla JS. If it makes sense, you can fit it into your framework. Let me know if there are questions. https://repl.it/#rpeltz/fe-upload#script.js
const reader = new FileReader();
reader.addEventListener("load", function () {
  const fileContents = reader.result; // the file's contents (a data URL here)
});
reader.readAsDataURL(file); // "file" is assumed to be a File from an <input type="file">
https://developer.mozilla.org/en-US/docs/Web/API/FileReader
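A fuller sketch of that idea, adapted to the question's Cloudinary endpoint (it assumes files comes from an <input type="file" multiple>; Cloudinary's upload API also accepts base64 data URIs as the file parameter):
Array.from(files).forEach((file) => {
  const reader = new FileReader();
  reader.addEventListener("load", () => {
    const formData = new FormData();
    formData.append("file", reader.result);             // base64 data URL of the file
    formData.append("upload_preset", "preset_number");  // same placeholder preset as the question
    fetch(api_cloudinary, { method: "POST", body: formData });
  });
  reader.readAsDataURL(file);
});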
I recommend using a library that handles this for you. There are many edge cases that you need to take care of when dealing with file uploads.
react-uploady, for example, does this for you, so it's as easy as this:
import React from "react";
import Uploady from "@rpldy/uploady";
import UploadButton from "@rpldy/upload-button";

const CLOUD_NAME = "<your-cloud-name>";
const UPLOAD_PRESET = "<your-upload-preset>";

const App = () => (
  <Uploady
    destination={{
      url: `https://api.cloudinary.com/v1_1/${CLOUD_NAME}/upload`,
      params: {
        upload_preset: UPLOAD_PRESET,
      },
    }}
  >
    <UploadButton />
  </Uploady>
);
For production code, where you don't want to allow unsigned uploads, I recommend looking at this guide.
I am using React Dropzone for the file upload. I then generate the S3 putObject signedURL and send the image to S3 using axios.
It looks something like this:
const { getRootProps, getInputProps } = useDropzone({
  onDrop: (acceptedFiles) => {
    const image = acceptedFiles[0]
    getS3SignedUrl(...)
      .then(path => {
        const options = {...}
        //????
        return axios.put(path, image, options)
      })
  }
})
Everything is working fine but the images are very big. I would like to reduce the width/height of the image, scale it down and maybe reduce the quality before sending it to S3.
I looked at some similar questions, but I can't figure out the best library/way of doing it.
Can someone help me with an example?
You can use the react-imgpro library. Below is how to use it.
import React from 'react';
import ProcessImage from 'react-imgpro';

class App extends React.Component {
  state = {
    src: '',
    err: null
  }

  render() {
    return (
      <ProcessImage
        image='http://365.unsplash.com/assets/paul-jarvis-9530891001e7f4ccfcef9f3d7a2afecd.jpg'
        resize={{ width: 500, height: 500 }}
        colors={{
          mix: {
            color: 'mistyrose',
            amount: 20
          }
        }}
        processedImage={(src, err) => this.setState({ src, err })}
      />
    )
  }
}
The processed image will be stored in the state.
Cloudinary has great features, including what you want. All you have to do is use the SDK and send the image along with the parameters; it resizes the image there, stores it in the cloud of your Cloudinary account, and returns the processed image URL. I really rely on Cloudinary, specifically the NodeJS SDK. You can see the SDK documentation for ReactJS.
https://cloudinary.com/documentation/react_integration
You can use react-image-file-resizer package.
How to use:
First, wrap this resizer:
import Resizer from 'react-image-file-resizer';

const resizeFile = (file) => new Promise(resolve => {
  Resizer.imageFileResizer(file, 300, 300, 'JPEG', 100, 0,
    uri => {
      resolve(uri);
    },
    'base64'
  );
});
And then use it in your async function:
const onChange = async (event) => {
  try {
    const file = event.target.files[0];
    const image = await resizeFile(file);
    console.log(image);
  } catch (err) {
    console.log(err);
  }
}
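To plug this into the question's S3 upload, one option (a sketch; it reuses the question's getS3SignedUrl helper and assumes the 'base64' output type shown above) is to turn the resized data URL back into a Blob before the axios.put:
onDrop: async (acceptedFiles) => {
  const dataUrl = await resizeFile(acceptedFiles[0]);        // base64 data URL from the resizer
  const resizedBlob = await (await fetch(dataUrl)).blob();   // convert the data URL back to a Blob
  const path = await getS3SignedUrl(/* ... */);              // from the question
  await axios.put(path, resizedBlob, { headers: { "Content-Type": resizedBlob.type } });
}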