How to send a jpg image to the server with react-camera? - javascript

My goal is to upload an image taken from a webcam to a Lambda function, which then uploads it to AWS S3.
The Lambda function seems to work when I test it; however, I can't work out what exactly needs to be sent through from the React camera, or whether I am sending the right format for the upload.
import Camera from 'react-camera';
..
This is the JSX
<Camera
ref={(cam) => {
this.camera = cam;
}}
>
<Button onClick={this.takePicture}>
<i className="fas fa-camera"></i> Take photo
</Button>
</Camera>
This is the React code that is called when the photo is taken:
takePicture = () => {
this.camera.capture()
.then(blob => {
console.log(blob);
this.props.dispatch(uploadImage(blob))
})
}
The uploadImage function in my action is:
export const uploadImage = (fileObj) => dispatch => {
return fetch(url, {
method: 'POST',
headers: {
'Accept': 'image/jpeg'
},
body: fileObj
})
.then((response) => response.json())
.then(function (response) {
if (response.status === 'success') {
console.log(response);
// ... Show feedback
return response
} else {
// ... Show feedback
}
})
.catch((error) => {
console.error(error)
});
}
I figure I need to upload a base64 image...?
I don't understand how to get that from the blob.
Here is the Lambda Function code for reference:
var params = {
Bucket: 'bucketName',
Key: Date.now() + '.jpg',
ContentType: 'image/jpeg',
Body: event.body,
ACL: "public-read"
};
return uploading = new Promise(function (resolve, reject) {
return s3.upload(params, function (err, data) {
if(err) {
state.uploadError = err
return reject({
error: err,
status: 'error',
message: 'something went wrong'
})
}
state.uploadData = data
state.fileLocation = data.Location
state.status = "success"
state.message = "File has been uploaded to the fileLocation"
return resolve(data)
});
})
Question:
How do I make the format of the blob correct so that when it's POSTed through as the body it will be in the correct image format?

Very well organized question and code... thank you!
Updated:
Use the filesystem module to read the file and specify the encoding.
const fs = require('fs');
takePicture = () => {
this.camera.capture()
.then(blob => {
console.log(blob);
const image = fs.readFileSync(blob.path);
const b64Image = Buffer.from(image).toString('base64');
this.props.dispatch(uploadImage(b64Image));
})
}
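Note that fs is a Node.js module, so reading blob.path with it only works where Node APIs (and a file path on the blob) are available. If the capture runs in the browser, a FileReader-based conversion is another option; here is a minimal sketch of my own (not from the original answer):

// Hypothetical browser-side helper: convert the captured blob to a base64 string.
const blobToBase64 = (blob) =>
  new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onerror = reject;
    // reader.result is a data URL like "data:image/jpeg;base64,/9j/4AAQ...";
    // strip the prefix to keep only the base64 payload.
    reader.onloadend = () => resolve(reader.result.split(',')[1]);
    reader.readAsDataURL(blob);
  });

takePicture = () => {
  this.camera.capture()
    .then(blob => blobToBase64(blob))
    .then(b64Image => this.props.dispatch(uploadImage(b64Image)));
}

If a base64 string is what gets POSTed, the Lambda would then likely need to decode it before uploading, e.g. Body: Buffer.from(event.body, 'base64'); otherwise S3 ends up storing the base64 text rather than a JPEG.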

Related

jest script finishes before axios.post completes

I have a component which calls changeFile() and then uploadFile() when the file input is changed.
const uploadFile = () =>{
console.log("uploadfile is called");
console.log(filePath);
if (filePath == undefined){return;}
var formData = new FormData();
formData.append("drawing",filePath);
formData.append("detail",JSON.stringify({}));
axios.defaults.xsrfHeaderName = "X-CSRFTOKEN";
axios.defaults.xsrfCookieName = "csrftoken";
axios.defaults.withCredentials = true;
console.log("axios post is called")
axios.post(
`http://localhost:8000/api/drawings/`,formData,{
headers: {'Content-Type': 'multipart/form-data'}
}
).then(function (response) {
console.log("File is created");
console.log(response.data);
GlobalParam.obj = response.data;
props.onSetFileObj(response.data);
})
.catch(function (response) {
console.log("Catch is called");
console.log(response);
});
};
const changeFile = (event) =>{
console.log("changefile is called:" + event.target.files[0]);
setFilePath(event.target.files[0]);
uploadFile();
}
return (
<div>
<div ref={innerRef} id="drop-zone" style={{border: '1px solid', padding: 30}}>
<p>file drag and drop</p>
<input data-testid="fileInput" ref={fileRef} onChange={changeFile} type="file" id="file_input" />
</div>
</div>);
I am testing this script with Jest and React Testing Library. The test script is below.
it('input file one', async() => {
var file = new File(["foo"], "300x300.png", { type: "image/png" });
const { getByTestId} = render(<FileUploader/>) ;
let uploader = getByTestId("fileInput");
await waitFor(() =>
fireEvent.change(uploader, {
target: { files: [file] },
})
);
});
I can see the log just before axios.post is executed:
console.log
axios post is called
at log (src/components/FileUploader.js:68:17)
However, neither the then nor the catch callback is called (either "File is created" or "Catch is called" should appear).
I have two guesses:
await waitFor() only waits for the change event, not for axios.post? If so, how can I wait for the axios call?
Or the file 300x300.png that I put in the current directory isn't picked up correctly?
This is where the file object is created:
var file = new File(["foo"], "300x300.png", { type: "image/png" });
When I remove the file, the test behaves the same, so I can't tell whether this line actually picks up the real file.
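One way to check the first guess (a sketch of my own, assuming the component lives in ./FileUploader and does import axios from 'axios'): mock axios manually so axios.defaults stays usable, and let waitFor wait until the mocked POST has actually been called.

import { render, fireEvent, waitFor } from '@testing-library/react';
import axios from 'axios';
import FileUploader from './FileUploader'; // hypothetical path

// Manual mock: keeps axios.defaults assignable and makes post() observable.
jest.mock('axios', () => ({
  __esModule: true,
  default: { defaults: {}, post: jest.fn() },
}));

it('input file one', async () => {
  axios.post.mockResolvedValue({ data: {} });
  const file = new File(['foo'], '300x300.png', { type: 'image/png' });
  const { getByTestId } = render(<FileUploader />);
  fireEvent.change(getByTestId('fileInput'), { target: { files: [file] } });
  // waitFor retries until the mocked POST has been called and its then() has run.
  await waitFor(() => expect(axios.post).toHaveBeenCalled());
});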

Unable to send large payload using fetch in javascript

I am a beginner in the JavaScript and Node.js space. I was trying to create a simple file upload project with Node.js. I created the routes in Node.js, and I capture the image from the webpage and send it to the Node.js routes using fetch.
This is the HTML file:
<div class="file-upload">
<div class="image-upload-wrap">
<input class="file-upload-input" type='file' onchange="readURL(this);" accept="image/*" />
<div class="drag-text">
<h3>Drag and drop a file or select add Image</h3>
</div>
</div>
<div class="file-upload-content">
<div class="file-upload-display">
<img id="file-upload-image" class="file-upload-image" src="#" alt="your image" />
<div class="image-title-wrap">
<button type="button" onclick="removeUpload()" class="remove-image"><i class="fas fa-trash-alt"></i> Remove <span class="image-title">Uploaded Image</span></button>
</div>
</div>
</div>
</div>
<br>
<div class="file-upload-server d-flex justify-content-center">
<button class="btn btn-expand-lg btn-primary" onclick="uploadFile()"><i class="fas fa-cloud-upload-alt"></i> Upload</button>
</div>
This is my JavaScript file:
async function uploadFile() {
let img = document.getElementById('file-upload-image').src;
console.log('Image String Length: ' + img.length);
const payload = {
method: 'POST',
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify({
file: img,
})
}
console.log(`Payload: ${JSON.stringify(payload)}`);
await fetch('http://localhost:3030/image/uploadimage', payload)
.then(response => response.json())
.then(data => {
console.log('Success:', data);
})
.catch((error) => {
console.error('Error:', error);
});
}
The image here is a base64 encoded string that I am passing to my Node.js route.
When I click my upload button I get the error Unexpected token < in JSON at position 0.
I have tested this code with an image of size 5.72 KB and it works, but when I try to upload an image of size 81.7 KB it fails with that error.
This is the nodejs route:
router.use(imageupload({
limits: { fileSize: 50 * 1024 * 1024 },
}));
router.use(express.urlencoded({limit: '50mb', extended: true}));
router.use(express.json());
const decodeBase64Image = (dataString) => {
let matches = dataString.match(/^data:([A-Za-z-+\/]+);base64,(.+)$/),
response = {};
if (matches.length !== 3) {
return new Error('Invalid input string');
}
response.type = matches[1];
response.data = Buffer.from(matches[2], 'base64');
return response;
}
router.post('/uploadimage', cors(corsOptions), async (req, res) => {
let decodedImage = decodeBase64Image(req.body.file);
let imageBuffer = decodedImage.data;
let type = decodedImage.type;
let extension = mime.getExtension(type);
let NewImageName = Math.random().toString(36).substring(7);
let fileName = 'image-' + NewImageName + '.' + extension;
try {
fs.writeFile(`${path.join(__dirname, '../')}/public/uploads/${fileName}`,
imageBuffer, function(err) {
if(err) throw err;
console.log('The file was uploaded successfully');
return res.status(200).json({status: 'success', message: 'File uploaded successfully'});
});
} catch(err) {
console.log(err);
return res.status(500).send(err);
}
});
Any help or guidance around this would be great.
You need to set an upload limit for your express.json() middleware, since you are using the 'Content-Type': 'application/json' header in the POST request.
Try replacing router.use(express.json()) with router.use(express.json({ limit: '50mb' }))
Documentation
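For reference, a short sketch of the middleware setup with the limit applied (the 50mb value mirrors the one already used for urlencoded):

// Raise the JSON body limit so large base64 payloads are accepted.
router.use(express.json({ limit: '50mb' }));
router.use(express.urlencoded({ limit: '50mb', extended: true }));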
Edit
The error Unexpected token < in JSON at position 0 happens if you try to parse a string with response.json() that is not formatted as JSON. You can reproduce it as a one-liner in Node.js with JSON.parse('I am no json...').
If you call response.json() on the client, the response body has to be properly formatted JSON or it results in an error. To ensure that, use res.json() instead of res.send() on the server.
You should also make sure to always respond to the client, so change the line if(err) throw err; to
if(err) {
console.error(err)
// always send back a status, 500 means server error..
res.status(500).json('Internal server error')
return
}
I would suggest that you use multer; here's a link.
Backend Code Sample:
var multer = require('multer')
var signature = multer.diskStorage({
destination: function (req, file, cb) {
cb(null, './public/')
},
filename: function (req, file, cb) {
if (file.mimetype == 'image/jpeg' || file.mimetype === 'image/jpg' || file.mimetype === 'image/png' || file.mimetype === 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet') {
cb(null, Date.now() + file.originalname)
} else {
cb({ error: 'Image type not supported' })
}
}
})
var uploadSignature = multer({ storage: signature })
router.put('/updateWorkOrderForSign',uploadSignature.array('signature', 1), function (req, res) {
try {
if (req.body.signature) {
var base64Data = req.body.signature.replace(/^data:image\/png;base64,/, "");
let rString = Math.random().toString(36).substring(7);
let date = Date.now();
let imageName = '/' + date + rString + '.png';
let imagePath = '/public' + imageName;
require("fs").writeFile('./' + imagePath, base64Data, 'base64', function (err) {
if (err) {
log.error('SAVE IMAGE FAILED WITH ERROR: ', err);
} else {
log.info('Signature image saved successfully')
}
});
} else {
//nothing to do
log.error("No signature image present")
}
} catch (err) {
log.error('ERROR WHILE CONVERTING BASE64 TO IMAGE: ', err);
}
})
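Note that the handler above still reads a base64 string from req.body.signature; if the client posts the file itself as multipart form data (as the front-end sample below does), multer places the upload on req.files instead. A rough sketch of that variant, under the same storage config (my own illustration, not part of the original answer):

router.put('/updateWorkOrderForSign', uploadSignature.array('signature', 1), function (req, res) {
  // With diskStorage, multer has already written the file to ./public/ by now.
  if (req.files && req.files.length > 0) {
    const saved = req.files[0];
    log.info('Signature image saved to ' + saved.path);
    return res.json({ filename: saved.filename });
  }
  log.error('No signature image present');
  return res.status(400).json('No file uploaded');
});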
Front-end code sample (I have implemented this in React Native):
async function uploadFile() {
let imageObject = {
uri: image.path,
type: 'image/png',
name: 'signatureImage'
}
var form = new FormData();
form.append('signature', imageObject)
form.append('taskId', taskId);
await fetch(urlStr, {
method: 'PUT',
headers: {
'Authorization': bearerToken,
Accept: 'application/json',
'Content-Type': 'multipart/form-data'},
body: form,
})
.then(response => response.json())
.then(responseData => {
var result = JSON.stringify(responseData);
return result;
})
.catch(error => {
console.log('ERROR WHILE UPLOADING IMAGE: ',error)
});
}

Node.js Google Drive Client - Error downloading file: response.data.on is not a function

I am using the Node.js Google Drive client to download certain files from a Google Drive. When using the example provided in their GitHub I get an Uncaught (in promise) TypeError: res.data.on is not a function error. The file is still created locally, but it's just an empty file from createWriteStream().
When I log the res variable I get: ReadableStream {locked: false}.
I'm pretty new to streams so this is quite a bit over my head.
Here is my code. You'll notice it's almost exactly what the example they provide looks like.
syncFileFromDrive(fileId, filePath) {
filePath.replace(userDataPath, '');
filePath = `${userDataPath}/${filePath}`;
filePath.replaceAll('//', '/');
logger.info(`Sync file from drive: Syncing file to path: ${filePath}`);
logger.info(`Sync file from drive: File id: ${fileId}`)
const dest = fs.createWriteStream(filePath);
let progress = 0;
this.drive.files.get({fileId, alt: 'media'}, {responseType: 'stream'}).then(res => {
console.log(res)
console.log(res.data)
res.data
.on('end', () => {
console.log('Done downloading file.');
folderStructure.buildFileMenu()
resolve(dest)
})
.on('error', err => {
console.error('Error downloading file.');
reject(err);
})
.on('data', d => {
progress += d.length;
if (process.stdout.isTTY) {
process.stdout.clearLine();
process.stdout.cursorTo(0);
process.stdout.write(`Downloaded ${progress} bytes`);
}
})
.pipe(dest);
});
}
Edit: I should add that this is for an Electron application. So while Node is supported, I'm not sure if that may affect the way I can use streams.
This feels like a bit of a workaround, and I am open to any suggestions, but it was able to solve the issue I was having.
syncFileFromDrive(fileId, filePath) {
filePath.replace(userDataPath, '');
filePath = `${userDataPath}/${filePath}`;
filePath.replaceAll('//', '/');
logger.info(`Sync file from drive: Syncing file to path: ${filePath}`);
logger.info(`Sync file from drive: File id: ${fileId}`)
this.drive.files
.get({ fileId, alt: "media"}, {responseType: 'stream'})
.then((res) => {
const dest = fs.createWriteStream(filePath);
const decoder = new TextDecoder("utf-8");
const reader = res.data.getReader()
reader.read().then(function processText({ done, value }) {
if (done) {
console.log("Stream complete");
return;
}
dest.write(decoder.decode(value))
// Read some more, and call this function again
return reader.read().then(processText);
});
})
}
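One caveat with this workaround (my own note, not from the post): decoding each chunk with TextDecoder treats the download as UTF-8 text, which can corrupt binary files such as images. Writing the raw bytes is safer, mirroring the original callback:

reader.read().then(function processText({ done, value }) {
  if (done) {
    dest.end(); // close the write stream when the reader is finished
    return;
  }
  // Write the raw chunk; decoding it as text would mangle binary content.
  dest.write(Buffer.from(value));
  return reader.read().then(processText);
});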
Please take a look at my implementation, which I used for downloading the file:
import { google } from 'googleapis';
const getOauth2Client = () => new google.auth.OAuth2(
process.env.GOOGLE_DRIVE_CLIENT_ID,
process.env.GOOGLE_DRIVE_CLIENT_SECRET,
process.env.GOOGLE_DRIVE_REDIRECT_URL
);
const downloadFile = ({ id, access_token, path }) => {
return new Promise((resolve, reject) => {
const dest = fs.createWriteStream(path);
const oauth2Client = getOauth2Client();
oauth2Client.setCredentials({ access_token });
const drive = google.drive({
version: 'v3',
auth: oauth2Client
});
drive.files.get(
{ fileId: id, alt: 'media' }, { responseType: 'stream' },
(err, res) => {
if (err) reject(err);
res.data
.on('end', () => {
console.log('Done');
})
.on('error', _e => {
console.log('Error', _e);
if (_e) reject(_e);
})
.pipe(dest);
dest.on('finish', () => {
console.log('Download finished');
resolve(true);
});
}
);
});
};
This is because in the renderer process, Google's gaxios module uses the Fetch API instead of Node's http. The Fetch API returns a ReadableStream, unlike http, which returns a Node.js Readable. Currently there's no way to change the default adapter. You can use this quick workaround to convert it.
// Transforms a web ReadableStream to a Node.js Readable
const { Readable } = require('stream'); // needed for the conversion below
function toNodeReadable(webStream) {
const reader = webStream.getReader();
const rs = new Readable();
rs._read = async () => {
const result = await reader.read();
if (!result.done) {
rs.push(Buffer.from(result.value));
} else {
rs.push(null);
}
};
return rs;
}
Usage with your code:
syncFileFromDrive(fileId, filePath) {
filePath.replace(userDataPath, '');
filePath = `${userDataPath}/${filePath}`;
filePath.replaceAll('//', '/');
logger.info(`Sync file from drive: Syncing file to path: ${filePath}`);
logger.info(`Sync file from drive: File id: ${fileId}`)
const dest = fs.createWriteStream(filePath);
let progress = 0;
this.drive.files.get({fileId, alt: 'media'}, {responseType: 'stream'}).then(res => {
console.log(res)
console.log(res.data)
toNodeReadable(res.data)
.on('end', () => {
console.log('Done downloading file.');
folderStructure.buildFileMenu()
resolve(dest)
})
.on('error', err => {
console.error('Error downloading file.');
reject(err);
})
.on('data', d => {
progress += d.length;
if (process.stdout.isTTY) {
process.stdout.clearLine();
process.stdout.cursorTo(0);
process.stdout.write(`Downloaded ${progress} bytes`);
}
})
.pipe(dest);
});
}

Images and Videos are not uploading to server using axios while in debugging mode

I'm using react-native-image-crop-picker to get an image from the gallery and trying to upload it to the server using Axios. But it's not uploading: when I hit the API to upload, it starts sending and never finishes, and I get no response from the server. However, when I make a release build and then try to upload, it uploads successfully and I get a response from the server.
Here is my code:
const handleProfilePic = () => {
const date = new Date();
const formData = new FormData();
formData.append('files', {
uri: image.path,
type: image.mime,
name: 'image_' + Math.floor(date.getTime() + date.getSeconds() / 2),
});
console.log(formData);
new Promise((rsl, rej) => {
setLoading(true);
updatePic(formData, user.auth, rsl, rej);
})
.then((res) => {
Snackbar.show({
text: res,
duration: Snackbar.LENGTH_SHORT,
});
setLoading(false);
})
.catch((errorData) => {
setLoading(false);
Snackbar.show({
text: errorData,
duration: Snackbar.LENGTH_SHORT,
});
});
};
//add pic code
export const updatePic = (data, token, rsl, rej) => {
return (dispatch) => {
axios(`${BASE_URL}/Authentication/addpicture`, {
method: 'post',
data,
headers: {
auth: token,
},
})
.then((res) => {
console.log(res);
if (res.data.status == true) {
rsl(res.data.message);
} else {
rej(res.data.message);
}
})
.catch((err) => {
console.log(err);
rej(err.message);
});
};
};
I've solved it by commenting out the interceptor line shown below.
Open this file: 'android/app/src/debug/java/com/flatApp/ReactNativeFlipper.java'
NetworkingModule.setCustomClientBuilder(
new NetworkingModule.CustomClientBuilder() {
@Override
public void apply(OkHttpClient.Builder builder) {
// builder.addNetworkInterceptor(new FlipperOkhttpInterceptor(networkFlipperPlugin));
}
});

How to save files into AWS using signedURLs and ReactJS?

I'm trying to attach images with regular text inputs into my form in order to submit to my MongoDB.
This is what my function to create a post looks like:
const [postData, setPostData] = useState({
text: '',
images: null,
postedto: auth && auth.user.data._id === userId ? null : userId
});
const { text, images, postedto } = postData;
const handleChange = name => e => {
setPostData({ ...postData, [name]: e.target.value, images: e.target.files });
};
const createPost = async e => {
e.preventDefault();
await addPost(postData, setUploadPercentage);
};
From there I move into my action addPost; in this function I call two API routes:
// #route POST api/v1/posts
// #description Add post
// #access Private
// #task DONE
export const addPost = (formData, setUploadPercentage) => async dispatch => {
try {
// ATTACH FILES
let fileKeys = [];
for(let file of formData.images) {
const uploadConfig = await axios.get(`${API}/api/v1/uploads/getS3url?type=${file.type}`);
await axios.put(uploadConfig.data.url, file, {
headers: {
'Content-Type': file.type
}
});
fileKeys.push(uploadConfig.data.key);
}
console.log(fileKeys);
// INSERT NEW BLOG
const config = {
headers: {
'Content-Type': 'multipart/form-data; application/json'
},
onUploadProgress: ProgressEvent => {
setUploadPercentage(
parseInt(Math.round(ProgressEvent.loaded * 100) / ProgressEvent.total)
);
// Clear percentage
setTimeout(() => setUploadPercentage(0), 10000);
}
};
formData.images = fileKeys;
const res = await axios.post(`${API}/api/v1/posts`, formData, config);
dispatch({
type: ADD_POST,
payload: res.data
});
dispatch(setAlert('Post Created', 'success'));
} catch (err) {
const errors = err.response && err.response.data.errors;
if (errors) {
errors.forEach(error => dispatch(setAlert(error.msg, 'danger')));
}
dispatch({
type: POST_ERROR,
payload: { msg: err.response && err.response.statusText, status: err.response && err.response.status }
});
}
};
My getS3url function looks exactly like this:
exports.uploadFile = asyncHandler(async (req, res, next) => {
const { type } = req.query;
const fileExtension = type.substring(type.indexOf('/') + 1);
const key = `${process.env.WEBSITE_NAME}-${req.user._id}-${
req.user.email
}-${Date.now().toString()}.${fileExtension}`;
const params = {
Bucket: process.env.AWS_BUCKET_NAME,
Key: key,
ContentType: type
};
s3.getSignedUrl(`putObject`, params, (err, url) => {
if (err) {
return next(
new ErrorResponse(
`There was an error with the files being uploaded`,
500
)
);
}
return res.status(201).json({ success: true, key: url });
});
});
I would like to point out that every post might have more than one image file, and the function should return a signed URL for each file; say I upload two files, I should then get two URLs back in order to attach them to my post.
I'm sure there's nothing wrong with the way I'm managing state to submit the data, because console.log(postData) always returns what I expect; even the files are shown.
Now I'm assuming the problem resides in my action, especially the code before the // INSERT NEW BLOG comment, because when I console.log(fileKeys) nothing is returned, not even an error/undefined/null. I mean just nothing!
My uploadFile works fine when used with a single file... well, not really: it does return a URL for the 'supposed' uploaded file, but when I look in my AWS console/bucket there's nothing there. But that's a question for its own post.
What do I need help with?
I'm trying to upload one or more files to AWS using signed URLs, get them back as strings, and attach them to my post. Is there any problem with my action file?
Thanks!
In my case, I have been looping through the images, generating a signed URL for each, and returning them, since S3 doesn't support generating a signed URL for multiple files at once.
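For illustration, a rough server-side sketch of that loop with the AWS SDK v2 (bucket, key pattern, and expiry here are placeholders, not taken from the post):

const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// Generate one presigned PUT URL per file; S3 has no batch option for this.
const getUploadUrls = (files) =>
  Promise.all(
    files.map((file) =>
      s3.getSignedUrlPromise('putObject', {
        Bucket: process.env.AWS_BUCKET_NAME, // placeholder bucket
        Key: `${Date.now()}-${file.name}`,   // placeholder key pattern
        ContentType: file.type,
        Expires: 60, // URL validity in seconds
      })
    )
  );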
In the end I found my own solution, here it is:
export const addPost = (formData, images, setUploadPercentage) => async dispatch => {
try {
let fileKeys = [];
for(let i = 0; i < images.length; i++) {
/// STEP 3
const token = localStorage.getItem("xAuthToken");
api.defaults.headers.common["Authorization"] = `Bearer ${token}`
const uploadConfig = await api.get(`/uploads/getS3url?name=${images[i].name}&type=${images[i].type}&size=${images[i].size}`);
// STEP 1
delete api.defaults.headers.common['Authorization'];
await api.put(uploadConfig.data.postURL, images[i], {
headers: {
'Content-Type': images[i].type
}
});
fileKeys.push(uploadConfig.data.getURL);
}
// INSERT NEW BLOG
const config = {
onUploadProgress: ProgressEvent => {
setUploadPercentage(
parseInt(Math.round(ProgressEvent.loaded * 100) / ProgressEvent.total)
);
setTimeout(() => setUploadPercentage(0), 10000);
}
};
// STEP 2
const token = localStorage.getItem("xAuthToken");
api.defaults.headers.common["Authorization"] = `Bearer ${token}`
const res = await api.post(`/posts`, {...formData, images: fileKeys}, config);
dispatch({
type: ADD_POST,
payload: res.data
});
dispatch(setAlert('Post Created', 'success'));
} catch (err) {
const errors = err.response && err.response.data.errors;
if (errors) {
errors.forEach(error => dispatch(setAlert(error.msg, 'danger')));
}
dispatch({
type: POST_ERROR,
payload: { msg: err.response && err.response.statusText, status: err.response && err.response.status }
});
}
};
