I'm in the process of creating a small Next.js application which includes a file upload form. The upload almost works: uploading simple text files works just fine, but binary files like pictures change slightly and I can't figure out why.
The content of the file is read using the JavaScript FileReader, and the output of that is used as the body of the POST request.
The byteLength on the client side matches the byte size of the file reported by ls -l, so I'm assuming that value is correct.
But when the size of the body is logged on the API side, it is a few bytes less for binary files. For text files the sizes are the same.
In the real code the file content is then sent to another API which stores it in a database and makes it available for download. The content is not the same: for pictures, the "pattern" of where the bytes sit seems to have remained, but the bytes themselves are different.
For example, a small PNG file of 1764 bytes is still 1764 bytes on the client side but becomes 1731 bytes on the server side.
Here is the client side code:
import { useState } from "react";

const TestPage = () => {
  const [file, setFile] = useState();

  function readFile(file) {
    return new Promise((resolve, reject) => {
      const reader = new FileReader();
      reader.onload = (res) => {
        resolve(res.target.result);
      };
      reader.onerror = (err) => reject(err);
      reader.readAsArrayBuffer(file);
    });
  }

  function onFilesChosen({ target }) {
    setFile(target.files[0]);
  }

  async function onClick(event) {
    event.preventDefault();
    const fileContent = await readFile(file);
    console.log(fileContent.byteLength);
    fetch("/api/upload", {
      method: "POST",
      body: fileContent,
      headers: {
        "Content-Type": "application/octet-stream",
      },
    }).then((r) => alert("Uploaded!"));
  }

  return (
    <div>
      <form>
        <div className="form-group">
          <label htmlFor="file-upload">Choose file</label>
          <input className="form-control-file" onChange={onFilesChosen} type="file" name="file-upload" />
        </div>
        <button type="submit" onClick={onClick}>Upload</button>
      </form>
    </div>
  );
};

export default TestPage;
And this is the simple server side code (just to show the received file size):
export default function handler(req, res) {
  const data = req.body;
  console.log("length", data.length);
  return res.status(200).end();
}
I've also tried using axios but couldn't get it to work.
Any suggestions about what I'm doing wrong?
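A likely cause (an assumption, since the question doesn't show the API route config): Next.js's default body parser decodes the request body as UTF-8 text, which silently corrupts binary payloads while leaving plain text intact. A minimal sketch of disabling it and reading the raw bytes instead:

```javascript
// pages/api/upload.js -- sketch assuming Next.js API routes.
// Disabling the built-in body parser stops Next.js from decoding the
// binary body as UTF-8 text (the likely source of the changed bytes).
export const config = {
  api: { bodyParser: false },
};

export default async function handler(req, res) {
  // Collect the raw request stream into a Buffer, byte for byte.
  const chunks = [];
  for await (const chunk of req) chunks.push(chunk);
  const data = Buffer.concat(chunks);
  console.log("length", data.length); // should now match the client-side byteLength
  res.status(200).end();
}
```

With the parser disabled, `req` is consumed as a plain Node stream, so the PNG arrives unmodified.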
Related
I'm having issues uploading video files to Cloudinary in my React app, deployed on Netlify. In my app I have a React page with a form that sends the data to my API. My API handles the HTTP requests to my Netlify functions (using Axios), and then from the serverless function I call the Cloudinary Node API to store the video file. The problem happens when I pass the data from my API to the serverless function: I get "Error: Stream body too big" because the video exceeds the payload limit of Netlify functions (6 MB). Do I have to compress the file? Is it okay to do it like this (frontend page -> api.js -> serverless function)? Thanks for all the help you guys provide every day, it has helped me a lot!
Files that I have and the error:
page.jsx
const formHandler = async (formValues) => {
  try {
    ...
    const res = await addExercise(formValues);
    ...
  } catch (error) {
    console.log("error ", error);
  }
};
api.js
import { instance } from "./instance";
...
export const addExercise = async (data) => {
  try {
    const reader = new FileReader();
    reader.readAsDataURL(data.exerciseVideo[0]);
    reader.onloadend = async () => {
      const cloudVideo = reader.result;
      const cloudinaryResponse = await instance.post("upload-exerciseVideo", { cloudVideo });
      ...
    };
  } catch (error) {
    console.log("error", error);
  }
};
serverless function (upload-exerciseVideo.js)
import cloudinary from '../src/config/cloudinary';
export const handler = async (event, context) => {
  try {
    const data = JSON.parse(event.body);
    const { cloudVideo } = data;
    const cloudinaryRequest = await cloudinary.uploader
      .upload(cloudVideo, {
        resource_type: "video",
        upload_preset: "z9qxfc6q"
      });
    ...
  } catch (error) {
    console.log('error', error);
    return {
      statusCode: 400,
      body: JSON.stringify(error)
    };
  }
};
Error: "Stream body too big" (screenshot not included)
Netlify serverless functions are built on top of AWS Lambda functions, so there are hard limits on the size of the payload and the amount of time the function can run. You didn't mention the size of your video, but video does take longer to upload, and even if you are within the size limit you may be exceeding the 10-second processing limit. Your video has most likely already been compressed, so compression is not a viable option, and decompressing it in the serverless function would probably exceed the time limit. https://www.netlify.com/blog/intro-to-serverless-function
If you're uploading a large file, like a video, from front-end code, consider using the Upload Widget with an unsigned preset. Here's a link to a code sandbox showing how to create and use the upload widget in React: https://codesandbox.io/s/cld-uw-uymkb. You will need to add your Cloudinary cloudname and an unsigned preset to make this work. You'll find instructions for creating unsigned presets here: https://cloudinary.com/documentation/upload_presets
I have an input where the user can upload an image. I want to get this image, pass it to the server side, and have the server store it in a local folder. For example:
I use Linux for the server, so server.js runs from the folder /home/user/project/server/server.js. When the server gets the image I want it to be stored at /home/user/project/images/img.jpg.
This my code:
HTML:
<input type="file" id="imageFile" accept=".jpg, .jpeg, .png" />
Front-End:
const signup = async () => {
  const name = document.getElementById("signup_name").value;
  const passwd = document.getElementById("signup_passwd").value;
  const image = document.getElementById("imageFile").files[0];
  let formData = new FormData();
  formData.append("fileToUpload", image);
  const response = await fetch("http://localhost:3000/signup", {
    method: "post",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      name: name,
      passwd: passwd,
      image: formData,
    }),
  });
  const result = await response.json();
  document.getElementById("signup_name").value = "";
  document.getElementById("signup_passwd").value = "";
  alert(result);
};
Back-End:
app.post("/signup", async (req, res) => {
  const { name, passwd, image } = req.body;
  if (!name || !passwd) {
    return res.status(400).json("Dados incorretos!");
  }
  knex
    .transaction((trx) => {
      trx
        .insert({
          login: name,
          senha: passwd,
          divida: 0,
        })
        .into("usuarios")
        .then(trx.commit)
        .catch(trx.rollback)
        .then(() => res.json("Cadastrado com sucesso!"));
    })
    .catch((err) => {
      console.log(err);
      return res.json("Login existente, tente novamente!");
    });
  // PUT SOMETHING HERE TO SAVE IMAGE LOCALLY, MAYBE??
});
Yes, you can store the uploaded image as a Base64 string using the FileReader. Data URLs are already Base64-encoded, so when you call reader.readAsDataURL, the e.target.result passed to the reader.onload handler is all you need. You can then write it to disk on the server, or handle it asynchronously and return it with res.json. Check the MDN documentation on FileReader.
(Get user's uploaded image for example)
const imgPath = document.querySelector('input[type=file]').files[0];
const reader = new FileReader();

reader.addEventListener("load", function () {
  // Convert file to base64 string and save to localStorage
  localStorage.setItem("image", reader.result);
}, false);

if (imgPath) {
  reader.readAsDataURL(imgPath);
}
And to read the image back from localStorage, just use querySelector or getElementById:
const img = document.getElementById('image');
img.src = localStorage.getItem('image');
About the "fd" argument must be of type number error: in my case, I was sometimes using
fs.readSync() when I should have been using fs.readFileSync(),
or fs.writeSync() when it should have been fs.writeFileSync().
In your case, fr.write() should probably be fs.writeFile().
The comment by @Dimava on your question can work too; I flagged it up.
For more help, consult this post related to your similar question! ;)
I'm trying to build a PDF viewer in React. I've already built the backend to upload the files and fetch the buffer data, but I'm having some problems with react-pdf, as I can't seem to serve the right type of data.
This is the code I've written:
const [data, setData] = useState();

useEffect(() => {
  fetch("http://localhost:5000/files/pdf-test.pdf")
    .then((res) => res.text())
    .then((res) => setData(new Buffer(res, "binary")));
});

return (
  <>
    <Document file={{ data: data }} />
  </>
);
This is one of the few tries I've made. In this one the backend serves the binary data of the file, and if we console.log in the last .then we can see that I'm passing the Document a Uint8Array, which is the recommended data format:
Apart from the code in the image, I've also tried it with binary data and an ArrayBuffer, but still no results, changing both the backend and frontend code.
The error I get is: (screenshot not included)
TL;DR: I'm trying to display PDF files in React with react-pdf using their buffer, but I can't manage to do it. I have already made a backend to upload files (express-fileupload) and store them so that their data can be fetched.
Thank you for helping, and I'm open to other approaches to the problem.
For CRA (create-react-app) you need to configure the worker in order to view your PDF file.
import { Document, Page, pdfjs } from 'react-pdf';
Then configure it like this:
pdfjs.GlobalWorkerOptions.workerSrc = `//cdnjs.cloudflare.com/ajax/libs/pdf.js/${pdfjs.version}/pdf.worker.js`;
Usage:
const [uint8Arr, setUint8Arr] = useState();

function getUint8Array() {
  let reader = new FileReader();
  // reader.readAsDataURL(selectedFile); // base64
  reader.readAsArrayBuffer(selectedFile);
  reader.onloadend = async (e) => {
    if (e.target?.result instanceof ArrayBuffer) {
      const uint8Array = new Uint8Array(e.target.result);
      setUint8Arr(uint8Array);
      // more callbacks(file.name, Buffer.from(new Uint8Array(target.result)));
    }
  };
}

<Document file={{ data: uint8Arr }} onLoadSuccess={() => console.log('SUCCESS LOAD')}>
  <Page pageNumber={1} />
</Document>
Hope that this will fix your issue.
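A note on the original snippet: res.text() decodes the PDF bytes as text, which corrupts them. A sketch (the helper name is mine) that reads the response as an ArrayBuffer instead, preserving the bytes react-pdf expects:

```javascript
// Fetch a PDF and return it as the Uint8Array react-pdf accepts.
// res.arrayBuffer() keeps the bytes intact, unlike res.text().
async function loadPdfData(url) {
  const res = await fetch(url);
  const buf = await res.arrayBuffer();
  return new Uint8Array(buf);
}

// e.g. loadPdfData("http://localhost:5000/files/pdf-test.pdf").then(setData);
```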
I wrote code that works, but it's not a masterpiece and it has huge performance problems. How should I deal with them?
Problems
I have problems with the performance of my component:
Sending multiple images is very slow: when I send 5 pictures, it takes about 1-2 minutes.
How can I make sure that the photos are only uploaded to Cloudinary on submission?
Because, for example, when a user chooses photo A, then takes a look around and chooses photo B, both photos A and B are uploaded to the cloud.
Questions
Would it be a good idea to disable the button for adding an album while transferring files to Cloudinary?
Please help me to solve these two problems and I want to make sure that the application works smoothly. Thank you so much.
const AddAlbum = () => {
  const [fileUrl, setFileUrl] = useState();
  // multiple files
  const [multipleFileUrl, setMultipleFileUrl] = useState([]);
  // Cloudinary upload API endpoint
  const api_cloudinary =
    "https://api.cloudinary.com/v1_1/cloudinary_name/image/upload";

  // single file upload
  const handleSingleFileUpload = async (files) => {
    const formData = new FormData();
    formData.append("file", files[0]);
    formData.append("upload_preset", "preset_number");
    // send Cloudinary the image and preset info
    const res = await fetch(api_cloudinary, {
      method: "POST",
      body: formData,
    });
    const file = await res.json();
    console.log(file);
    const fileUrl = file.eager[0].url;
    setFileUrl(fileUrl);
  };
  console.log(fileUrl);

  // upload many files to Cloudinary
  // For now the max amount of files is 20
  const handleMultipleFileUpload = async (files, amount) => {
    const formData = new FormData();
    for (let i = 0; i < files.length; i++) {
      let file = files[i];
      formData.append("file", file);
      formData.append("upload_preset", "preset_number");
      const res = await fetch(api_cloudinary, {
        method: "POST",
        body: formData,
      });
      const cloudinaryFiles = await res.json();
      console.log(cloudinaryFiles);
      setMultipleFileUrl(cloudinaryFiles);
    }
  };

  const handleSubmit = async (e) => {
    e.preventDefault();
    console.log(
      `
      Album name: ${albumName}
      color: ${color}
      Background image: ${fileUrl}
      files: ${multipleFileUrl}
      `
    );
  };
And the form:
return (
  <Wrapper>
    <form onSubmit={handleSubmit}>
      <div>
        <StyledLabel>Upload background image</StyledLabel>
        <DefaultInput
          type="file"
          onChange={(e) => handleSingleFileUpload(e.target.files)}
        />
      </div>
      <div>
        <StyledLabel>Upload background image</StyledLabel>
        <DefaultInput
          type="file"
          onChange={(e) => handleMultipleFileUpload(e.target.files)}
          multiple
        />
      </div>
      <StyledButton type="submit">Submit</StyledButton>
    </form>
  </Wrapper>
);
};
You only need one function to process one or more image files: get the contents of each file and load them into form data. The sample code uses the browser FileReader API to read the contents of local image files. Look at this code on repl.it, which uses vanilla JS; if it makes sense, you can fit it into your framework. Let me know if there are questions. https://repl.it/#rpeltz/fe-upload#script.js
reader.addEventListener("load", function () {
  const fileContents = reader.result;
  // ...
});

https://developer.mozilla.org/en-US/docs/Web/API/FileReader
I recommend using a library that handles this for you; there are many edge cases you need to take care of when dealing with file uploads.
react-uploady, for example, does this for you, so it's as easy as this:
import React from "react";
import Uploady from "@rpldy/uploady";
import UploadButton from "@rpldy/upload-button";

const CLOUD_NAME = "<your-cloud-name>";
const UPLOAD_PRESET = "<your-upload-preset>";

const App = () => (
  <Uploady
    destination={{
      url: `https://api.cloudinary.com/v1_1/${CLOUD_NAME}/upload`,
      params: {
        upload_preset: UPLOAD_PRESET,
      },
    }}
  >
    <UploadButton />
  </Uploady>
);
For production code, where you don't want to allow unsigned uploads, I recommend looking at this guide.
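To the deferred-upload question specifically: one option is to keep the chosen File objects in component state during onChange (no network traffic yet) and only send them on submit, in parallel. A framework-free sketch; the helper name and the injectable fetchFn are mine, and the preset is a placeholder:

```javascript
// Upload all files concurrently and return the parsed JSON responses.
// fetchFn is injectable so the helper can be tested without a network.
async function uploadAll(files, endpoint, preset, fetchFn = fetch) {
  const uploads = files.map((file) => {
    const formData = new FormData();
    formData.append("file", file);
    formData.append("upload_preset", preset);
    return fetchFn(endpoint, { method: "POST", body: formData }).then((res) =>
      res.json()
    );
  });
  return Promise.all(uploads); // runs the requests in parallel
}

// In the component: store e.target.files in state in onChange, then call
// uploadAll(files, api_cloudinary, "preset_number") inside handleSubmit.
// Disabling the submit button while this promise is pending is a reasonable
// way to prevent double submissions.
```

Promise.all also answers the speed complaint: the requests overlap instead of running one after another.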
Alright, so I'm fairly new to Node/Express. I've got an Angular chart, and a React-based Node exporting service.
When you click on a chart's export button, it should create an SVG, pass it along to the export service in an axios.post request, convert it to a React component, and export the HTML -> PNG via an image stream, in memory, which should be downloaded on the client when we get a successful response back.
I've tried a Blob, a Buffer, hacking an image.src, etc. I can't seem to get it.
The response passes back the raw image string, so all of that works (if that's what I should be returning)... but I can't get it to download on the client. After digging, I found out that res.download only works with GET, but I specifically need a POST here to pass the SVG and other random data up to the export server (too large for query params).
I'm not sure if it's just that I'm converting it incorrectly (stream) and I can't parse it correctly on the client, or if I'm missing something key.
Here are some snippets:
Angular Service
// ...stuff...
exportPNG(chart) {
  const exportServiceURL = 'someUrl.com';
  const svg = chart.toSvg();
  const params = { svg, ...someOtherData };
  const exportUrl = `${exportServiceURL}/exports/image/chart`;
  return axios.post(exportUrl, params).then(res => {
    // ... I need to download the response's image somehow, here?
    // res.data looks like "�PNG IHDR�9]�{pHYs��IDATx^�}T[Ǚ�w6�\(-��c[؍�� ')�...
  });
}
React/Export service:
// ...stuff...
async getChart(req, res) {
  const deferred = async.defer();
  const filename = 'reportingExport.png';
  // Returns HTML, wrapped around the SVG string.
  const html = this.getChartHtml(req.body);
  const stream = await this.htmlToImageStream(html, filename);
  stream.on('open', () => this.pipeStream(req, res, stream, filename));
}

async htmlToImageStream(html, tempFilename) {
  const deferred = async.defer();
  webshot(html, tempFilename, { siteType: 'html' }, err => {
    if (err) deferred.error(err);
    const stream = fs.createReadStream(tempFilename);
    // Does cleanup, fs.unlink, not important
    stream.on('end', () => this.imageExportStreamEnd(stream, tempFilename));
    deferred.resolve(stream);
  });
  return deferred.promise;
}

pipeStream(req, res, stream, filename) {
  res.set('Content-Type', 'application/png');
  // res.attachment(filename); // this didn't work
  res.download(`${path.join(__dirname, '../../../')}$(unknown)`, filename);
  stream.pipe(res); // Seems to return an actual PNG
}