React Native save base64 image to Album - javascript

A third-party API returns a "QR code image" as a base64-encoded string,
and I need to save that image to the user's album.
CameraRoll - does not support saving a base64 image to the album
React-Native-Fetch-Blob -
https://github.com/wkh237/react-native-fetch-blob
still looking into it
React-Native-fs -
https://github.com/itinance/react-native-fs
I am trying this now
There are a few npm modules for this, but they have very few GitHub stars (<10).
The React-Native-Fetch-Blob maintainer has gone missing, so no one is answering GitHub issues,
and createFile from the React-Native-Fetch-Blob documentation is not working as expected (it does not save the image into the album).
import fetch_blob from 'react-native-fetch-blob';

// json.qr is returned from the API
const fs = fetch_blob.fs
const base64 = fetch_blob.base64
const dirs = fetch_blob.fs.dirs
const file_path = dirs.DCIMDir + "/some.jpg"
const base64_img = base64.encode(json.qr)

fs.createFile(file_path, base64_img, 'base64')
  .then((rep) => {
    alert(JSON.stringify(rep));
  })
  .catch((error) => {
    alert(JSON.stringify(error));
  });
Has anyone dealt with this problem before?
How do I save a base64-encoded image string to the user's album (as a jpg or png file)?
Because I fetch an API with no CORS header,
I can't debug it with Debug JS Remotely;
Chrome would stop that request from happening,
so I have to run it on my Android phone to make it work
(there is no CORS control on a real phone).
I am planning to use the Clipboard to save the base64 string
and hardcode it in my code,
to debug what's wrong with the react-native-fetch-blob createFile API.
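For that debugging step, React Native's built-in Clipboard module can copy the string off the device; a minimal sketch, assuming json.qr holds the base64 string from the API:
import { Clipboard } from 'react-native';

// Copy the raw base64 string so it can be pasted out and hardcoded for offline debugging.
Clipboard.setString(json.qr);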

Remove data:image/png;base64, from your base64 string.
var Base64Code = base64Image.split("data:image/png;base64,"); // base64Image is my image base64 string
const dirs = RNFetchBlob.fs.dirs;
var path = dirs.DCIMDir + "/image.png";

RNFetchBlob.fs.writeFile(path, Base64Code[1], 'base64')
  .then((res) => { console.log("File : ", res) });
That solved my problem.
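If the MIME type isn't always PNG, a regex can strip whatever data URI prefix is present; a small sketch (the stripDataUriPrefix helper name is just for illustration):
// Strip any "data:image/<type>;base64," prefix, if one is present.
function stripDataUriPrefix(dataUri) {
  return dataUri.replace(/^data:image\/\w+;base64,/, '');
}

const path = RNFetchBlob.fs.dirs.DCIMDir + '/image.png';
RNFetchBlob.fs.writeFile(path, stripDataUriPrefix(base64Image), 'base64')
  .then((res) => console.log('File : ', res))
  .catch((err) => console.log(err));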

I solved the problem;
it turns out I forgot about the data:image/png;base64, at the beginning of the string.
I removed it with the following code
// json.qr is the base64 string
var image_data = json.qr.split('data:image/png;base64,');
image_data = image_data[1];
and then saved the image file
import fetch_blob from 'react-native-fetch-blob';
import RNFS from 'react-native-fs';

const fs = fetch_blob.fs
const dirs = fetch_blob.fs.dirs
const file_path = dirs.DCIMDir + "/bigjpg.png"

// json.qr is a base64 string "data:image/png;base64,..."
var image_data = json.qr.split('data:image/png;base64,');
image_data = image_data[1];

RNFS.writeFile(file_path, image_data, 'base64')
  .catch((error) => {
    alert(JSON.stringify(error));
  });
I wrote a blog post about this:
http://1c7.me/react-native-save-base64-image-to-album/

You can now use react-native-fetch-blob alone to achieve this.
Simply replace RNFS.writeFile with
RNFetchBlob.fs.writeFile(file_path, image_data, 'base64')
If you wish to view the file in the native OS viewer, you can simply add
if (isAndroid) {
  RNFetchBlob.android.actionViewIntent(file_path, 'application/pdf');
} else {
  RNFetchBlob.ios.previewDocument(file_path);
}
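The isAndroid flag isn't defined in the snippet; presumably it comes from React Native's Platform module, something like:
import { Platform } from 'react-native';

// Assumed definition of the isAndroid flag used above.
const isAndroid = Platform.OS === 'android';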

// uri points at a local image file; the album folder is created under the Pictures directory.
const path = `${RNFS.PicturesDirectoryPath}/My Album`;
await RNFS.mkdir(path);

return await fetch(uri)
  .then(res => res.blob())
  .then(image => {
    // Read the file as base64 and write it into the album folder.
    RNFetchBlob.fs.readFile(uri, "base64").then(data => {
      RNFS.appendFile(`${path}/${image.data.name}`, data, "base64").catch(
        err => {
          console.log("error writing to android storage :", err);
        }
      );
    });
  });

I got this working with the following example:
import RNFetchBlob from 'rn-fetch-blob';
import Permissions from 'react-native-permissions';

takeSnapshot = async () => {
  const currentStatus = await Permissions.check('storage');
  if (currentStatus !== 'authorized') {
    const status = await Permissions.request('storage');
    if (status !== 'authorized') {
      return false;
    }
  }

  // put your base64 here
  const base64 = '';
  const path = `${RNFetchBlob.fs.dirs.DCIMDir}/test11.png`;
  try {
    const data = await RNFetchBlob.fs.writeFile(path, base64, 'base64');
    console.log(data, 'data');
  } catch (error) {
    console.log(error.message);
  }
};

This works for me.
I wanted to download a base64 string as an image in React Native;
this.state.base64img is my base64 string without the 'data:image/png;base64,' prefix.
checkPermision = async () => {
  if (Platform.OS === 'ios') {
    this.downloadImage();
  } else {
    try {
      const granted = await PermissionsAndroid.request(
        PermissionsAndroid.PERMISSIONS.WRITE_EXTERNAL_STORAGE,
        {
          title: 'Storage Permission Required',
          message: 'App needs access to your storage to download photos',
        },
      );
      if (granted === PermissionsAndroid.RESULTS.GRANTED) {
        console.log('Storage permission granted');
        this.downloadImage();
      } else {
        console.log('Storage permission not granted');
      }
    } catch (error) {
      console.log('error', error);
    }
  }
};

downloadImage() {
  let date = new Date();
  const { fs } = RNFetchBlob;
  const dirs = RNFetchBlob.fs.dirs;
  let PictureDir = fs.dirs.PictureDir;
  var path = PictureDir + '/image_' +
    Math.floor(date.getTime() + date.getSeconds() / 2) +
    '.png';
  console.log("path :-", path, "dirs :-", dirs);
  RNFetchBlob.fs.writeFile(path, this.state.base64img, 'base64').then(res => {
    console.log('File : ', res);
    alert('Image downloaded successfully.');
  }).catch((error) => {
    alert(JSON.stringify(error));
  });
}

Related

How do I get an uploaded image in Next.js and save it?

How do I get an uploaded image in a Next.js API route and save it in the public folder? I have the front end ready. I'm uploading images to an endpoint using plain JavaScript.
Here is the onSubmit function for uploading images. Let me know if I'm doing something wrong here. The main question is: how do I retrieve the file on the server?
const onSubmit = async (e) => {
  e.preventDefault();
  const fd = new FormData()
  fd.append('myfile', image.name)
  let res = await fetch(`http://localhost:3000/api/upload`, {
    method: 'POST',
    headers: {
      "Content-Type": "image/jpeg",
    },
    body: fd,
  })
  let response = await res.json();
}
One more bonus question: it's surely not a good idea to save the uploaded images in the public folder. Should I save them somewhere on the cloud?
This is the endpoint code I used for uploading an image in Next.js; it requires some additional packages, which I list below.
next-connect
multer
uuid
import nextConnect from "next-connect";
import multer from "multer";
import { v4 as uuidv4 } from "uuid";

let filename = uuidv4() + "-" + new Date().getTime();

const upload = multer({
  storage: multer.diskStorage({
    destination: "./public/uploads/profiles", // destination folder
    filename: (req, file, cb) => cb(null, getFileName(file)),
  }),
});

const getFileName = (file) => {
  filename +=
    "." +
    file.originalname.substring(
      file.originalname.lastIndexOf(".") + 1,
      file.originalname.length
    );
  return filename;
};

const apiRoute = nextConnect({
  onError(error, req, res) {
    res
      .status(501)
      .json({ error: `Sorry something Happened! ${error.message}` });
  },
  onNoMatch(req, res) {
    res.status(405).json({ error: `Method '${req.method}' Not Allowed` });
  },
});

apiRoute.use(upload.array("file")); // attribute name you are sending the file by

apiRoute.post((req, res) => {
  res.status(200).json({ data: `/uploads/profiles/${filename}` }); // response
});

export default apiRoute;

export const config = {
  api: {
    bodyParser: false, // Disallow body parsing, consume as stream
  },
};
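On the client side, the handler above expects the actual File object under the field name "file" (the name passed to upload.array), and the browser sets the multipart Content-Type itself; a minimal sketch of a matching onSubmit, assuming image holds a File from an <input type="file">:
const onSubmit = async (e) => {
  e.preventDefault();
  const fd = new FormData();
  // Append the File object itself (not just its name), under the same
  // field name the server reads with upload.array("file").
  fd.append('file', image);
  // Do not set Content-Type manually; the browser adds the multipart
  // boundary automatically for FormData bodies.
  const res = await fetch('/api/upload', { method: 'POST', body: fd });
  const response = await res.json();
  console.log(response.data); // e.g. "/uploads/profiles/<generated name>"
};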
There is no need to use any packages to handle the file upload: you can use base64 to convert the file to a string and turn it back into a file on the server using the "fs" module.
Why is this way better than using formData?
Because you are dealing with a normal POST request, where you can send any other kind of data along with it and use the body parser.
Converting:
const toBase64 = (file: File) => new Promise((resolve, reject) => {
  const reader = new FileReader();
  reader.readAsDataURL(file);
  reader.onload = () => resolve(reader.result);
  reader.onerror = error => reject(error);
});
Send a POST request to the server:
const base64: string = await toBase64(file) as string;
const fileData = { base64, fileName: file.name };
const result = await api.post("/foo", { ...fileData, name: "Salih", message: "Hello World" });
Converting the base64 back to a file on the server:
function base64ToFile(file: { base64: string, fileName: string }) {
  const fileContents = file.base64.replace(/^data:image\/png;base64,/, "");
  fs.mkdirSync("./public/uploads", { recursive: true });
  const fileName = `./public/uploads/${Date.now().toString() + file.fileName}`;
  fs.writeFile(fileName, fileContents, 'base64', function (err) { console.log(err); });
}
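For completeness, the server could receive that JSON body in a plain Next.js API route and hand it to base64ToFile; a minimal sketch under that assumption (the route name is illustrative, and base64ToFile is the helper above):
// pages/api/foo.js (illustrative route name)
export default function handler(req, res) {
  // req.body is parsed JSON by default, so the extra fields travel alongside the file.
  const { base64, fileName, name, message } = req.body;
  base64ToFile({ base64, fileName });
  res.status(200).json({ ok: true, name, message });
}
Note that Next.js limits JSON bodies to about 1 MB by default, so larger images may need the bodyParser sizeLimit raised in the route's config.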
I suggest the popular and lightweight formidable library:
# install
yarn add formidable@v3 @types/formidable
// pages/api/file-upload.ts
import fs from "fs";
import path from "path";
import type { NextApiRequest, NextApiResponse } from "next";
import { File } from "formidable";
// path assumes the helpers folder shown below sits at the project root
import { parseFormAsync } from "../../helpers/formidable";

// Important for NextJS!
export const config = {
  api: {
    bodyParser: false,
  },
};

export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse<string>
) {
  try {
    // Parse request with formidable
    const { fields, files } = await parseFormAsync(req);

    // Files are always arrays (formidable v3+)
    const myfile = (files["myfile"] as any as File[])[0];

    // Save file in the public folder
    saveFile(myfile, "./public/uploads");

    // Return success
    res.status(200).json("success!");
  } catch (e) {
    return res.status(500).json(e);
  }
}

function saveFile(file: File, publicFolder: string): void {
  const fileExt = path.extname(file.originalFilename || "");
  fs.renameSync(file.filepath, `${publicFolder}/${file.newFilename}${fileExt}`);
}
// ./helpers/formidable.ts
import type { NextApiRequest } from "next";
import formidable from "formidable";

export type FormidableParseReturn = {
  fields: formidable.Fields;
  files: formidable.Files;
};

export async function parseFormAsync(
  req: NextApiRequest,
  formidableOptions?: formidable.Options
): Promise<FormidableParseReturn> {
  const form = formidable(formidableOptions);

  return await new Promise<FormidableParseReturn>((resolve, reject) => {
    form.parse(req, async (err, fields, files) => {
      if (err) {
        reject(err);
      }
      resolve({ fields, files });
    });
  });
}
Bonus question
"One more bonus question: it's surely not a good idea to save the uploaded images in the public folder. Should I save them somewhere on the cloud?"
S3 and other cloud services
You can save on cloud services with Formidable.
See the official examples: https://github.com/node-formidable/formidable/blob/master/examples/store-files-on-s3.js
But you don't need to use cloud storage to protect private uploads. You can store them locally.
Working with private uploads locally
Saving:
Store the uploads in a non-public folder;
Ex. /private-uploads/{logged_user_id}/;
Reading:
Create an API page to fetch the file (a sketch follows below);
Ex. https://.../uploads/{filename}
Fail if the file doesn't belong to the authenticated user;
Send the file as the response;
Security:
With the above folder scheme, hackers can use .. and similar tricks in the filename to gain unauthorized access;
Sanitize the filename with this in mind (ex. only allow alphanumeric characters);
Alternatively, use a database table to control ownership instead of a folder scheme.
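A minimal sketch of such an API route, assuming the /private-uploads/{logged_user_id}/ layout above and a getUserIdFromRequest auth helper (both the route path and the helper are illustrative, not part of the original answer):
// pages/api/uploads/[filename].js (illustrative)
import fs from "fs";
import path from "path";

export default function handler(req, res) {
  const userId = getUserIdFromRequest(req); // hypothetical auth helper
  if (!userId) return res.status(401).end();

  const { filename } = req.query;
  // Reject anything that isn't alphanumeric/dot/dash/underscore,
  // and anything containing "..", so path traversal is impossible.
  if (typeof filename !== "string" || filename.includes("..") || !/^[\w.-]+$/.test(filename)) {
    return res.status(400).end();
  }

  const filePath = path.join(process.cwd(), "private-uploads", String(userId), filename);
  if (!fs.existsSync(filePath)) return res.status(404).end();

  res.send(fs.readFileSync(filePath));
}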

Why is URL.createObjectURL(blob) giving a cross-origin frame error in a NodeJS/React application?

I have never had this happen before and am not sure why it's happening.
I have a component written to display PDF files in an iframe as part of a larger application. I am retrieving a BLOB stream from the server and attempting to create a URL for it to display in the iframe, but it keeps giving me a cross-origin error, which I thought would not be possible since the URL is created from data.
Here is my entire component:
import React, { useState, useEffect } from 'react'
import IFrameComponent from '../Elements/IFrameComponent';

const PDFPages = (props) => {
  let [file, setFile] = useState(null)
  let [notFound, show404] = useState(false)

  useEffect(() => {
    let id = props.site?.componentFile;
    fetch(`${process.env.REACT_APP_HOST}/documents/GetPDF`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json'
      },
      credentials: 'include',
      body: JSON.stringify({ file: id })
    })
      .then(async response => {
        let blob;
        try {
          blob = await response.blob(); // <--- this functions correctly
        }
        catch (ex) {
          let b64 = await response.json()
          blob = Buffer.from(b64.fileData, 'base64')
        }
        // Create a Blob from the PDF Stream
        // Build a URL from the file
        const str = `data:application/pdf;base64,${b64.fileData}`
        const url = URL.createObjectURL(blob) // <--- ERROR IS THROWN HERE
        setFile(url);
      })
      .catch(error => {
        show404(true)
      });
  }, []);

  if (!notFound) {
    return <IFrameComponent src={file} title=''>
      Please enable iFrames in your browser for this page to function correctly
    </IFrameComponent>
  }
  else {
    return (
      <>
        <h3> File {file} could not be found on server</h3>
      </>
    )
  }
}
export default PDFPages;
For completeness, here is the GetPDF route from the server that sends the file.
router.post('/GetPDF', async (req, res, next) => {
  const props = req.body;
  let fileName = props.file;
  try {
    fileName = fileName.replace(/%20/g, " ");
    let options = {};
    if (props.base64) options.encoding = 'base64'
    let data = await dataQuery.loadFile(`./data/documentation/${fileName}`, options);
    if (!props.base64) {
      res.attachment = "filename=" + fileName
      res.contentType = 'application/pdf'
      res.send(data);
    }
    else {
      res.send({ fileData: data, fileName: fileName });
    }
  }
  catch (ex) {
    res.send({ error: true })
  }
});
I have done very little work in node sending files but am positive my client code is good. Where am I going wrong here?
The problem was that I was trying to be too fancy by sending a BLOB or base64 data. After investigating, I rewrote
router.post('/GetPDF', async (req, res, next) => {
  const props = req.body;
  let fileName = props.file;
  try {
    fileName = fileName.replace(/%20/g, " ");
    let options = {};
    if (props.base64) options.encoding = 'base64'
    let data = await dataQuery.loadFile(`./data/documentation/${fileName}`, options);
    if (!props.base64) {
      res.attachment = "filename=" + fileName
      res.contentType = 'application/pdf'
      res.send(data);
    }
    else {
      res.send({ fileData: data, fileName: fileName });
    }
  }
  catch (ex) {
    res.send({ error: true })
  }
});
on the server to
router.get('/GetPDF/:fileName', async (req, res, next) => {
  let fileName = req.params.fileName
  fileName = `./data/documentation/${fileName.replace(/%20/g, " ")}`;
  try {
    let data = await dataQuery.loadFile(fileName);
    res.contentType("application/pdf");
    res.send(data);
  }
  catch (ex) {
    res.send({ error: true })
  }
});
Then calling it from the client using
const url = `${process.env.REACT_APP_HOST}/documents/GetPDF/${props.site.componentFile}`
as the iFrame src sends the PDF properly as expected.
This same method also solved another problem with HTML pages sent from the server not functioning correctly.
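With that change the client no longer needs fetch, blobs, or object URLs at all; a minimal sketch of the simplified component, under the assumptions of the answer above:
const PDFPages = (props) => {
  // The browser fetches the PDF itself; the server responds with Content-Type: application/pdf.
  const url = `${process.env.REACT_APP_HOST}/documents/GetPDF/${props.site.componentFile}`;
  return (
    <IFrameComponent src={url} title=''>
      Please enable iFrames in your browser for this page to function correctly
    </IFrameComponent>
  );
};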

How can I send an image from Node.js to React.js?

I have the following setup, by which I send the image, from its URL, to be edited and sent back to be uploaded to S3. The problem I currently have is that the image ends up corrupted on S3, and I am wondering if there's something in my code causing the issue.
Server side:
function convertImage(inputStream) {
  return gm(inputStream)
    .contrast(-2)
    .stream();
}

app.get('/resize/:imgDetails', (req, res, next) => {
  let params = req.params.imgDetails.split('&');
  let fileName = params[0]; console.log(fileName);
  let tileType = params[1]; console.log(tileType);
  res.set('Content-Type', 'image/jpeg');
  let url = `https://${process.env.Bucket}.s3.amazonaws.com/images/${tileType}/${fileName}`;
  convertImage(request.get(url)).pipe(res);
})
Client side:
axios.get('/resize/' + fileName + '&' + tileType)
  .then(res => {
    /** PUT FILE ON AWS **/
    var img = res;
    axios.post("/sign_s3_sized", {
      fileName: fileName,
      tileType: tileType,
      ContentType: 'image/jpeg'
    })
      .then(response => {
        var returnData = response.data.data.returnData;
        var signedRequest = returnData.signedRequest;
        var url = returnData.url;
        this.setState({ url: url })
        // Put the fileType in the headers for the upload
        var options = {
          headers: {
            'Content-Type': 'image/jpeg'
          }
        };
        axios.put(signedRequest, img, options)
          .then(result => {
            this.setState({ success: true });
          }).bind(this)
          .catch(error => {
            console.log("ERROR: " + JSON.stringify(error));
          })
      })
      .catch(error => {
        console.log(JSON.stringify(error));
      })
  })
  .catch(error => console.log(error))
Before going any further, I can assure you that uploading images via this setup, minus the convertImage() step, works; otherwise the image gets put on S3 corrupted.
Any pointers as to what the issue behind the corrupted image might be?
Is my understanding of streams here lacking, perhaps? If so, what should I change?
Thank you!
EDIT 1:
I tried not running the image through the graphicsmagick API at all (request.get(url).pipe(res);) and the image is still corrupted.
EDIT 2:
I gave up in the end and just uploaded the file from Node.js straight to S3; it turned out to be better practice anyway.
If your end goal is to upload the image to an S3 bucket from Node.js, there is a simple way using the multer-s3 node module.
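A minimal sketch of that approach, based on multer-s3's documented usage with an aws-sdk v2 S3 client (the bucket name and route are illustrative):
const express = require('express');
const AWS = require('aws-sdk');
const multer = require('multer');
const multerS3 = require('multer-s3');

const app = express();
const s3 = new AWS.S3();

const upload = multer({
  storage: multerS3({
    s3: s3,
    bucket: 'my-upload-bucket', // illustrative bucket name
    contentType: multerS3.AUTO_CONTENT_TYPE,
    key: (req, file, cb) => cb(null, `images/${Date.now()}-${file.originalname}`),
  }),
});

// The client sends the file as multipart/form-data under the "image" field.
app.post('/upload', upload.single('image'), (req, res) => {
  res.json({ location: req.file.location }); // S3 URL of the stored object
});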

AWS S3 Upload after GET Request to Image, Not Uploading Correctly

I'm trying to upload an image to my AWS S3 bucket after downloading the image from another URL using Node (using request-promise-native & aws-sdk):
'use strict';

const config = require('../../../configs');
const AWS = require('aws-sdk');
const request = require('request-promise-native');

AWS.config.update(config.aws);
let s3 = new AWS.S3();

function uploadFile(req, res) {
  function getContentTypeByFile(fileName) {
    var rc = 'application/octet-stream';
    var fn = fileName.toLowerCase();
    if (fn.indexOf('.png') >= 0) rc = 'image/png';
    else if (fn.indexOf('.jpg') >= 0) rc = 'image/jpg';
    return rc;
  }

  let body = req.body,
    params = {
      "ACL": "bucket-owner-full-control",
      "Bucket": 'testing-bucket',
      "Content-Type": null,
      "Key": null, // Name of the file
      "Body": null // File body
    };

  // Grabs the filename from a URL
  params.Key = body.url.substring(body.url.lastIndexOf('/') + 1);
  // Setting the content type
  params.ContentType = getContentTypeByFile(params.Key);

  request.get(body.url)
    .then(response => {
      params.Body = response;
      s3.putObject(params, (err, data) => {
        if (err) { console.log(`Error uploading to S3 - ${err}`); }
        if (data) { console.log("Success - Uploaded to S3: " + data.toString()); }
      });
    })
    .catch(err => { console.log(`Error encountered: ${err}`); });
}
The upload succeeds when I test it out; however, after trying to redownload it from my bucket, the image fails to display. Additionally, I notice that after uploading the file with my function, the file listed in the bucket is much larger than the originally uploaded image. I'm trying to figure out where I've gone wrong but cannot find it. Any help is appreciated.
Try opening the faulty file with a text editor; you will see some errors written in it.
You can try using s3.upload instead of putObject; it works better with streams.
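A minimal sketch of that suggestion, streaming the downloaded image straight into s3.upload (the plain request module is assumed here so the body stays a binary stream rather than a decoded string):
const AWS = require('aws-sdk');
const request = require('request');

const s3 = new AWS.S3();

function uploadFromUrl(url, key, contentType) {
  // request(url) returns a readable stream; s3.upload accepts streams of
  // unknown length, whereas putObject expects a complete body.
  return s3.upload({
    Bucket: 'testing-bucket', // bucket name taken from the question
    Key: key,
    Body: request(url),
    ContentType: contentType,
  }).promise();
}

// Usage: uploadFromUrl(body.url, params.Key, params.ContentType).then(...)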

How to load an image from a URL into a buffer in Node.js

I am new to Node.js and am trying to set up a server where I get the EXIF information from an image. My images are on S3, so I want to be able to just pass in the S3 URL as a parameter and grab the image from it.
I am using the ExifImage project below to get the EXIF info, and according to their documentation:
"Instead of providing a filename of an image in your filesystem you can also pass a Buffer to ExifImage."
How can I load an image into a buffer in Node from a URL so I can pass it to the ExifImage function?
ExifImage Project:
https://github.com/gomfunkel/node-exif
Thanks for your help!
Try setting up request like this:
var request = require('request').defaults({ encoding: null });
request.get(s3Url, function (err, res, body) {
//process exif here
});
Setting encoding to null will cause request to output a buffer instead of a string.
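Putting that together with node-exif (the documentation quoted in the question says ExifImage also accepts a Buffer, passed via the image option), a sketch might look like:
var ExifImage = require('exif').ExifImage;
var request = require('request').defaults({ encoding: null });

request.get(s3Url, function (err, res, body) {
  if (err) return console.error(err);
  // body is a Buffer because encoding is null
  new ExifImage({ image: body }, function (error, exifData) {
    if (error) return console.error(error.message);
    console.log(exifData);
  });
});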
Use axios:
const response = await axios.get(url, { responseType: 'arraybuffer' })
const buffer = Buffer.from(response.data, "utf-8")
import fetch from "node-fetch";
let fimg = await fetch(image.src)
let fimgb = Buffer.from(await fimg.arrayBuffer())
I was able to solve this only after reading that encoding: null is required and providing it as a parameter to request.
This will download the image from url and produce a buffer with the image data.
Using the request library -
const request = require('request');
let url = 'http://website.com/image.png';
request({ url, encoding: null }, (err, resp, buffer) => {
// Use the buffer
// buffer contains the image data
// typeof buffer === 'object'
});
Note: omitting encoding: null will result in an unusable string and not in a buffer. Buffer.from won't work correctly either.
This was tested with Node 8
Use the request library.
request('<s3imageurl>', function(err, response, buffer) {
// Do something
});
Also, node-image-headers might be of interest to you. It sounds like it takes a stream, so it might not even have to download the full image from S3 in order to process the headers.
Updated with correct callback signature.
Here's a solution that uses the native https library.
import { get } from "https";
function urlToBuffer(url: string): Promise<Buffer> {
return new Promise((resolve, reject) => {
const data: Uint8Array[] = [];
get(url, (res) => {
res
.on("data", (chunk: Uint8Array) => {
data.push(chunk);
})
.on("end", () => {
resolve(Buffer.concat(data));
})
.on("error", (err) => {
reject(err);
});
});
});
}
const imageUrl = "https://i.imgur.com/8k7e1Hm.png";
const imageBuffer = await urlToBuffer(imageUrl);
Feel free to delete the types if you're looking for javascript.
I prefer this approach because it doesn't rely on 3rd party libraries or the deprecated request library.
request is deprecated and should be avoided if possible.
Good alternatives include got (only for node.js) and axios (which also support browsers).
Example of got:
npm install got
Using the async/await syntax:
const got = require('got');
const url = 'https://www.google.com/images/branding/googlelogo/2x/googlelogo_color_272x92dp.png';
(async () => {
try {
const response = await got(url, { responseType: 'buffer' });
const buffer = response.body;
} catch (error) {
console.log(error.body);
}
})();
You can do it this way:
import axios from "axios";

async function getFileContentById(
  download_url: string
): Promise<Buffer> {
  const response = await axios.get(download_url, {
    responseType: "arraybuffer",
  });
  return Buffer.from(response.data, "base64");
}
