knowing whether a fetch is requesting or responding [duplicate] - javascript

I'm struggling to find documentation or examples of implementing an upload progress indicator using fetch.
This is the only reference I've found so far, which states:
Progress events are a high level feature that won't arrive in fetch for now. You can create your own by looking at the Content-Length header and using a pass-through stream to monitor the bytes received.
This means you can explicitly handle responses without a Content-Length differently. And of course, even if Content-Length is there it can be a lie. With streams you can handle these lies however you want.
How would I write "a pass-through stream to monitor the bytes" sent? If it makes any sort of difference, I'm trying to do this to power image uploads from the browser to Cloudinary.
NOTE: I am not interested in the Cloudinary JS library, as it depends on jQuery and my app does not. I'm only interested in the stream processing necessary to do this with native JavaScript and GitHub's fetch polyfill.
https://fetch.spec.whatwg.org/#fetch-api

Streams are starting to land in the web platform (https://jakearchibald.com/2016/streams-ftw/) but it's still early days.
Soon you'll be able to provide a stream as the body of a request, but the open question is whether the consumption of that stream relates to bytes uploaded.
Certain redirects can result in data being retransmitted to the new location, but streams cannot "restart". We could fix this by turning the body into a callback which can be called multiple times, but we'd need to be sure that exposing the number of redirects isn't a security leak, since it'd be the first time JS on the platform could detect that.
Some are questioning whether it even makes sense to link stream consumption to bytes uploaded.
Long story short: this isn't possible yet, but in future this will be handled either by streams, or some kind of higher-level callback passed into fetch().

My solution is to use axios, which supports this pretty well:
axios.request({
  method: "post",
  url: "/aaa",
  data: myData,
  onUploadProgress: (p) => {
    console.log(p);
    // this.setState({
    //   fileprogress: p.loaded / p.total
    // })
  }
}).then(data => {
  // this.setState({
  //   fileprogress: 1.0,
  // })
})
I have an example of using this in React on GitHub.
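For instance, a minimal React wiring of onUploadProgress might look like this sketch (the "/aaa" endpoint and the fileprogress state field are carried over from the snippet above; the rest is illustrative):

// A minimal sketch, assuming React and axios are installed.
// "/aaa" and `fileprogress` mirror the snippet above.
import React, { useState } from "react";
import axios from "axios";

function UploadForm() {
  const [fileprogress, setFileprogress] = useState(0);

  const onChange = (event) => {
    const formData = new FormData();
    formData.append("file", event.target.files[0]);
    axios.request({
      method: "post",
      url: "/aaa",
      data: formData,
      // p.loaded / p.total gives a 0..1 fraction of bytes sent
      onUploadProgress: (p) => setFileprogress(p.loaded / p.total),
    }).then(() => setFileprogress(1.0));
  };

  return (
    <div>
      <input type="file" onChange={onChange} />
      <progress value={fileprogress} max="1" />
    </div>
  );
}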

fetch: not possible yet
It sounds like upload progress will eventually be possible with fetch once it supports a ReadableStream as the body. This is currently not implemented, but it's in progress. I think the code will look something like this:
warning: this code does not work yet, still waiting on browsers to support it
async function main() {
  const blob = new Blob([new Uint8Array(10 * 1024 * 1024)]); // any Blob, including a File
  const progressBar = document.getElementById("progress");
  const totalBytes = blob.size;
  let bytesUploaded = 0;

  // Wrap the blob's stream so every chunk pulled by fetch updates the progress bar.
  const blobReader = blob.stream().getReader();
  const progressTrackingStream = new ReadableStream({
    async pull(controller) {
      const result = await blobReader.read();
      if (result.done) {
        console.log("completed stream");
        controller.close();
        return;
      }
      controller.enqueue(result.value);
      bytesUploaded += result.value.byteLength;
      console.log("upload progress:", bytesUploaded / totalBytes);
      progressBar.value = bytesUploaded / totalBytes;
    },
  });

  const response = await fetch("https://httpbin.org/put", {
    method: "PUT",
    headers: {
      "Content-Type": "application/octet-stream"
    },
    // Browsers that have since shipped request streams (Chrome >= 105)
    // additionally require duplex: "half" for a streaming body.
    duplex: "half",
    body: progressTrackingStream,
  });
  console.log("success:", response.ok);
}

main().catch(console.error);
upload: <progress id="progress" />
workaround: good ol' XMLHttpRequest
Instead of fetch(), it's possible to use XMLHttpRequest to track upload progress — the xhr.upload object emits a progress event.
async function main() {
  const blob = new Blob([new Uint8Array(10 * 1024 * 1024)]); // any Blob, including a File
  const uploadProgress = document.getElementById("upload-progress");
  const downloadProgress = document.getElementById("download-progress");

  const xhr = new XMLHttpRequest();
  const success = await new Promise((resolve) => {
    xhr.upload.addEventListener("progress", (event) => {
      if (event.lengthComputable) {
        console.log("upload progress:", event.loaded / event.total);
        uploadProgress.value = event.loaded / event.total;
      }
    });
    xhr.addEventListener("progress", (event) => {
      if (event.lengthComputable) {
        console.log("download progress:", event.loaded / event.total);
        downloadProgress.value = event.loaded / event.total;
      }
    });
    xhr.addEventListener("loadend", () => {
      resolve(xhr.readyState === 4 && xhr.status === 200);
    });
    xhr.open("PUT", "https://httpbin.org/put", true);
    xhr.setRequestHeader("Content-Type", "application/octet-stream");
    xhr.send(blob);
  });
  console.log("success:", success);
}

main().catch(console.error);
upload: <progress id="upload-progress"></progress><br/>
download: <progress id="download-progress"></progress>

Update: as the accepted answer says, it's impossible for now, but the code below handled our problem for some time (note that it tracks download progress on the response body). I should add that we eventually had to switch to a library based on XMLHttpRequest.
const response = await fetch(url);
const total = Number(response.headers.get('content-length'));
const reader = response.body.getReader();
let bytesReceived = 0;

while (true) {
  const result = await reader.read();
  if (result.done) {
    console.log('Fetch complete');
    break;
  }
  bytesReceived += result.value.length;
  console.log('Received', bytesReceived, 'of', total, 'bytes so far');
}
thanks to this link: https://jakearchibald.com/2016/streams-ftw/

As already explained in the other answers, it is not possible with fetch, but with XHR. Here is my a-little-more-compact XHR solution:
const uploadFiles = (url, files, onProgress) =>
  new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.upload.addEventListener('progress', e => onProgress(e.loaded / e.total));
    xhr.addEventListener('load', () => resolve({ status: xhr.status, body: xhr.responseText }));
    xhr.addEventListener('error', () => reject(new Error('File upload failed')));
    xhr.addEventListener('abort', () => reject(new Error('File upload aborted')));
    xhr.open('POST', url, true);
    const formData = new FormData();
    Array.from(files).forEach((file, index) => formData.append(index.toString(), file));
    xhr.send(formData);
  });
Works with one or multiple files.
If you have a file input element like this:
<input type="file" multiple id="fileUpload" />
Call the function like this:
document.getElementById('fileUpload').addEventListener('change', async e => {
  const onProgress = progress => console.log('Progress:', `${Math.round(progress * 100)}%`);
  const response = await uploadFiles('/api/upload', e.currentTarget.files, onProgress);
  if (response.status >= 400) {
    throw new Error(`File upload failed - Status code: ${response.status}`);
  }
  console.log('Response:', response.body);
});
Also works with the e.dataTransfer.files you get from a drop event when building a file drop zone.
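For a drop zone, the wiring might look like this sketch (the #dropZone element and the /api/upload endpoint are placeholders; uploadFiles() from above is assumed to be in scope):

// A minimal sketch, assuming a <div id="dropZone"></div> exists.
const dropZone = document.getElementById('dropZone');

dropZone.addEventListener('dragover', e => e.preventDefault()); // allow dropping

dropZone.addEventListener('drop', async e => {
  e.preventDefault();
  const onProgress = progress => console.log('Progress:', `${Math.round(progress * 100)}%`);
  const response = await uploadFiles('/api/upload', e.dataTransfer.files, onProgress);
  console.log('Response:', response.body);
});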

I don't think it's possible. The draft states:
it is currently lacking [in comparison to XHR] when it comes to request progression
(old answer):
The first example in the Fetch API chapter gives some insight on how to do this:
If you want to receive the body data progressively:
function consume(reader) {
  var total = 0
  return new Promise((resolve, reject) => {
    function pump() {
      reader.read().then(({done, value}) => {
        if (done) {
          resolve()
          return
        }
        total += value.byteLength
        log(`received ${value.byteLength} bytes (${total} bytes in total)`)
        pump()
      }).catch(reject)
    }
    pump()
  })
}

fetch("/music/pk/altes-kamuffel.flac")
  .then(res => consume(res.body.getReader()))
  .then(() => log("consumed the entire body without keeping the whole thing in memory!"))
  .catch(e => log("something went wrong: " + e))
Apart from their use of the Promise constructor antipattern, you can see that response.body is a Stream from which you can read chunk by chunk using a Reader, and you can fire an event or do whatever you like (e.g. log the progress) for each chunk.
However, the Streams spec doesn't appear to be quite finished, and I have no idea whether this already works in any fetch implementation.

with fetch: now possible with Chrome >= 105 🎉
How to:
https://developer.chrome.com/articles/fetch-streaming-requests/
Currently not supported by other browsers (maybe that will be the case when you read this, please edit my answer accordingly)
Feature detection (source)
const supportsRequestStreams = (() => {
  let duplexAccessed = false;

  const hasContentType = new Request('', {
    body: new ReadableStream(),
    method: 'POST',
    get duplex() {
      duplexAccessed = true;
      return 'half';
    },
  }).headers.has('Content-Type');

  return duplexAccessed && !hasContentType;
})();
HTTP >= 2 required
The fetch will be rejected if the connection is HTTP/1.x.
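Building on the article above, a sketch of upload progress with request streams could look like this (assuming a supporting browser, an HTTP/2 server, and a placeholder /upload URL; whether stream consumption exactly tracks bytes on the wire is the open question mentioned in the accepted answer):

// A sketch, not a definitive implementation: assumes request-stream support
// (see the feature detection above) and an HTTP/2 endpoint.
async function uploadWithProgress(file) {
  const totalBytes = file.size;
  let bytesSent = 0;

  // Count each chunk as fetch pulls it from the file's stream.
  const progressStream = file.stream().pipeThrough(new TransformStream({
    transform(chunk, controller) {
      bytesSent += chunk.byteLength;
      console.log("upload progress:", bytesSent / totalBytes);
      controller.enqueue(chunk);
    },
  }));

  return fetch("/upload", {
    method: "POST",
    body: progressStream,
    duplex: "half", // required for streaming request bodies
  });
}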

Since none of the answers solve the problem directly, here is an idea for an approximation: you can measure the upload speed with a small initial chunk of known size, and then estimate the total upload time as content-length / upload-speed. You can use this estimate as a progress indicator.
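A rough sketch of that idea, assuming a hypothetical /upload endpoint and roughly constant network speed across requests:

// A sketch under the assumptions above: '/upload' is a placeholder endpoint.
async function estimateUploadSeconds(file) {
  const probe = new Blob([new Uint8Array(64 * 1024)]); // 64 KiB probe of known size

  const start = performance.now();
  await fetch('/upload', { method: 'POST', body: probe });
  const seconds = (performance.now() - start) / 1000;

  const bytesPerSecond = probe.size / seconds;
  return file.size / bytesPerSecond; // estimated total upload time in seconds
}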

A possible workaround would be to utilize the new Request() constructor, then check the Request.bodyUsed Boolean attribute
The bodyUsed attribute’s getter must return true if disturbed, and
false otherwise.
to determine if the stream is disturbed
An object implementing the Body mixin is said to be disturbed if
body is non-null and its stream is disturbed.
Return the fetch() promise from within .then() chained to a recursive .read() call of a ReadableStream once Request.bodyUsed is equal to true.
Note, the approach does not read the bytes of the Request.body as the bytes are streamed to the endpoint. Also, the upload could complete well before any response is returned in full to the browser.
const [input, progress, label] = [
  document.querySelector("input"),
  document.querySelector("progress"),
  document.querySelector("label")
];

const url = "/path/to/server/";

input.onmousedown = () => {
  label.innerHTML = "";
  progress.value = "0";
};

input.onchange = (event) => {
  const file = event.target.files[0];
  const filename = file.name;
  progress.max = file.size;

  const request = new Request(url, {
    method: "POST",
    body: file,
    cache: "no-store"
  });

  const upload = settings => fetch(settings);

  const uploadProgress = new ReadableStream({
    start(controller) {
      console.log("starting upload, request.bodyUsed:", request.bodyUsed);
      controller.enqueue(request.bodyUsed);
    },
    pull(controller) {
      if (request.bodyUsed) {
        controller.close();
        return; // enqueuing after close() would throw
      }
      controller.enqueue(request.bodyUsed);
      console.log("pull, request.bodyUsed:", request.bodyUsed);
    },
    cancel(reason) {
      console.log(reason);
    }
  });

  const [fileUpload, reader] = [
    upload(request)
      .catch(e => {
        reader.cancel();
        throw e;
      }),
    uploadProgress.getReader()
  ];

  const processUploadRequest = ({value, done}) => {
    if (value || done) {
      console.log("upload complete, request.bodyUsed:", request.bodyUsed);
      // set `progress.value` to `progress.max` here
      // if not awaiting server response
      // progress.value = progress.max;
      return reader.closed.then(() => fileUpload);
    }
    console.log("upload progress:", value);
    progress.value = +progress.value + 1;
    return reader.read().then(result => processUploadRequest(result));
  };

  reader.read().then(({value, done}) => processUploadRequest({value, done}))
    .then(response => response.text())
    .then(text => {
      console.log("response:", text);
      progress.value = progress.max;
      input.value = "";
    })
    .catch(err => console.log("upload error:", err));
};

I fished around for some time on this, and for everyone who may come across this issue too, here is my solution:
const form = document.querySelector('form');
const status = document.querySelector('#status');

// When the form gets submitted.
form.addEventListener('submit', async function (event) {
  // Cancel default behavior (form submit)
  event.preventDefault();
  // Inform the user that the upload has begun
  status.innerText = 'Uploading..';
  // Create FormData from the form
  const formData = new FormData(form);
  // Send the request; note that fetch resolves once the response starts arriving
  const response = await fetch('https://httpbin.org/post', { method: 'POST', body: formData });
  // Amount of bytes the server will send back (httpbin echoes the upload)
  const bytesToUpload = response.headers.get('content-length');
  // Create a reader from the response body
  const reader = response.body.getReader();
  // Track how much data we have received
  let bytesUploaded = 0;
  // Get the first chunk from the reader
  let chunk = await reader.read();
  // While we have more chunks to go
  while (!chunk.done) {
    // Increase the amount of bytes received.
    bytesUploaded += chunk.value.length;
    // Inform the user how far we are
    status.innerText = 'Uploading (' + (bytesUploaded / bytesToUpload * 100).toFixed(2) + ')...';
    // Read the next chunk
    chunk = await reader.read();
  }
});

const res = await fetch('./foo.json');
const total = Number(res.headers.get('content-length'));
let loaded = 0;

// Note: async iteration over a ReadableStream (res.body) is not supported in every browser yet.
for await (const chunk of res.body) {
  loaded += chunk.length;
  const progress = ((loaded / total) * 100).toFixed(2); // toFixed(2) keeps two digits after the decimal point
  console.log(`${progress}%`); // or yourDiv.textContent = `${progress}%`;
}

The key part is the ReadableStream (obj_response.body).
Sample:

// Read the response body chunk by chunk, passing each read result to parse().
let parse = (result) => {
  console.log(result);
  //...
  return result.value ? true : false; // continue while there is another chunk
};

fetch('').then((response) => {
  const reader = response.body.getReader();
  const pump = () =>
    reader.read().then(parse).then((more) => {
      if (more) return pump();
    });
  return pump();
});

You can test it by running it on a huge page, e.g. https://html.spec.whatwg.org/ or https://html.spec.whatwg.org/print.pdf . Press Ctrl+Shift+J and paste the code in.
(Tested on Chrome.)

Related

Not able to stop the spinner on page when lengthy response is coming using Angular and Node

I am facing an issue. In my application, my page requests a Node.js server to fetch 2000 records at a time. The records come back from Node, but in the dev-tools console the response does not expand, and I have a loader implementation that does not stop even after receiving the response. I am explaining the whole code below.
demo.component.ts:
onFileSelect($event) {
  const file = $event.target.files[0];
  const fileName = file.name;
  const fileExtension = fileName.replace(/^.*\./, '');
  if (fileExtension === 'ubot') {
    this.loginService.startSpinner(true);
    const formData = new FormData();
    formData.append('cec', this.cec);
    formData.append('screenName', this.intFlow);
    formData.append('fileCategory', 'payload');
    formData.append('file', file);
    this.intentService.reverseFile(formData).subscribe(async (res: any) => {
      console.log('response', res);
      console.log('succ', res.status);
      if (res && res.status === 'success') {
        this.loginService.startSpinner(false);
        this.intentService.intentData = '';
        this.resettoOriginalState();
        this.cdref.detach();
        await this.repopulateDataFromFile(res.body);
        (<HTMLInputElement>document.getElementById('fileuploader')).value = "";
      }
      else {
        this.loginService.startSpinner(false);
        this._notifications.create(res.msg, '', this.errorNotificationType);
        (<HTMLInputElement>document.getElementById('fileuploader')).value = "";
      }
    });
  } else {
    this.loginService.startSpinner(false);
    this._notifications.create('Please choose a file', '', this.errorNotificationType);
  }
}
Here I am requesting the server through one service, which is given below.
reverseFile(value) {
  // const token = localStorage.getItem('token')
  // let headers = new HttpHeaders({
  //   Authorization: 'Bearer ' + token
  // })
  return this.http.post(this.nodeAppUrl + 'reverseFile', value, { observe: 'response' })
    .pipe(
      tap((res: any) => this.loginService.validateToken(res)),
      map((res: any) => {
        return res.body
      })
    )
}
When Angular makes the request, the spinner starts, and after a few seconds the response comes back from Node.js; but even though the line this.loginService.startSpinner(false); runs after the success message, the spinner keeps running.
The response contains more than 2000 records in a nested array-of-objects format, and we populate them using the this.repopulateDataFromFile(res.body); method. I am attaching a screenshot of the console below.
Even though the status is success, I am not able to stop the spinner, and I also cannot expand the records in the console, which shows "the value was evaluated upon first expanding...".
Can anybody please help explain why this is happening and how to resolve it?

Progress for a fetch blob javascript

I'm trying to do a javascript fetch to grab a video file using fetch. I am able to get the file downloaded and get the blob URL, but I can't seem to get the progress while its downloading.
I tried this:
let response = await fetch('test.mp4');
const reader = response.body.getReader();
const contentLength = response.headers.get('Content-Length');
let receivedLength = 0;
d = document.getElementById('progress_bar');

while (true) {
  const {done, value} = await reader.read();
  if (done) {
    break;
  }
  receivedLength += value.length;
  d.innerHTML = "Bytes loaded:" + receivedLength;
}

const blob = await response.blob();
var vid = URL.createObjectURL(blob);
The problem is that I get "Response.blob: Body has already been consumed". I see that the reader.read() is probably doing that. How do I just get the amount of data received and then get a blob URL at the end of it?
Thanks.
Update:
My first attempt collected the chunks as they downloaded and then put them back together, with a large (2-3x the size of the video) memory footprint. Using a ReadableStream has a much lower memory footprint (memory usage hovers around 150MB for a 1.1GB mkv). The code is largely adapted from the snippet here, with only minimal modifications from me:
https://github.com/AnthumChris/fetch-progress-indicators/blob/master/fetch-basic/supported-browser.js
<div id="progress_bar"></div>
<video id="video_player"></video>
const elProgress = document.getElementById('progress_bar'),
  player = document.getElementById('video_player');

function getVideo2() {
  let contentType = 'video/mp4';
  fetch('$pathToVideo.mp4')
    .then(response => {
      const contentEncoding = response.headers.get('content-encoding');
      const contentLength = response.headers.get(contentEncoding ? 'x-file-size' : 'content-length');
      contentType = response.headers.get('content-type') || contentType;
      if (contentLength === null) {
        throw Error('Response size header unavailable');
      }
      const total = parseInt(contentLength, 10);
      let loaded = 0;
      return new Response(
        new ReadableStream({
          start(controller) {
            const reader = response.body.getReader();
            read();
            function read() {
              reader.read().then(({done, value}) => {
                if (done) {
                  controller.close();
                  return;
                }
                loaded += value.byteLength;
                progress({loaded, total});
                controller.enqueue(value);
                read();
              }).catch(error => {
                console.error(error);
                controller.error(error);
              });
            }
          }
        })
      );
    })
    .then(response => response.blob())
    .then(blob => {
      let vid = URL.createObjectURL(blob);
      player.style.display = 'block';
      player.type = contentType;
      player.src = vid;
      elProgress.innerHTML += "<br /> Press play!";
    })
    .catch(error => {
      console.error(error);
    });
}

function progress({loaded, total}) {
  elProgress.innerHTML = Math.round(loaded / total * 100) + '%';
}
First Attempt (worse, suitable for smaller files)
My original approach. For a 1.1GB mkv, the memory usage creeps up to 1.3GB while the file is downloading, then spikes to about 3.5Gb when the chunks are being combined. Once the video starts playing, the tab's memory usage goes back down to ~200MB but Chrome's overall usage stays over 1GB.
Instead of calling response.blob() to get the blob, you can construct the blob yourself by accumulating each chunk of the video (value). Adapted from the example here: https://javascript.info/fetch-progress#0d0g7tutne
//...
receivedLength += value.length;
chunks.push(value);
//...

// ==> put the chunks into a Uint8Array that the Blob constructor can use
let Uint8Chunks = new Uint8Array(receivedLength), position = 0;
for (let chunk of chunks) {
  Uint8Chunks.set(chunk, position);
  position += chunk.length;
}

// ==> you may want to get the mimetype from the content-type header
const blob = new Blob([Uint8Chunks], {type: 'video/mp4'})
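For reference, a self-contained sketch of this first approach, combining the question's read loop with the chunk accumulation above ('test.mp4' and the progress_bar element are taken from the question):

// A sketch: accumulate chunks instead of calling response.blob(),
// since the body has already been consumed by the reader.
async function downloadWithProgress() {
  const response = await fetch('test.mp4');
  const reader = response.body.getReader();
  const d = document.getElementById('progress_bar');

  const chunks = [];
  let receivedLength = 0;
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
    receivedLength += value.length;
    d.innerHTML = "Bytes loaded: " + receivedLength;
  }

  // Combine the chunks into one buffer for the Blob constructor.
  const combined = new Uint8Array(receivedLength);
  let position = 0;
  for (const chunk of chunks) {
    combined.set(chunk, position);
    position += chunk.length;
  }
  return URL.createObjectURL(new Blob([combined], { type: 'video/mp4' }));
}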

Get WebAssembly instantiateStreaming progress

WebAssembly.instantiateStreaming is the fastest way to download and instantiate a .wasm module however for large .wasm files it can still take a long time. Simply displaying a spinner does not provide enough user feedback in this case.
Is there a way to use the WebAssembly.instantiateStreaming api and get some form of progress event so that an eta can displayed to the user? Ideally I would like to be able to display a percentage progress bar / estimated time left indicator so user's know how long they will have to wait.
Building off the answer here.
To get the progress of WebAssembly.instantiateStreaming / WebAssembly.compileStreaming, create a new Fetch Response with a custom ReadableStream which implements its own controller.
Example:
// Get your normal fetch response
var response = await fetch('https://www.example.com/example.wasm');

// Note - If you are compressing your .wasm file the Content-Length will be incorrect
// One workaround is to use a custom http header to manually specify the uncompressed size
var contentLength = response.headers.get('Content-Length');
var total = parseInt(contentLength, 10);
var loaded = 0;

function progressHandler(bytesLoaded, totalBytes) {
  // Do what you want with this info...
}

var res = new Response(new ReadableStream({
  async start(controller) {
    var reader = response.body.getReader();
    for (;;) {
      var {done, value} = await reader.read();
      if (done) {
        progressHandler(total, total);
        break;
      }
      loaded += value.byteLength;
      progressHandler(loaded, total);
      controller.enqueue(value);
    }
    controller.close();
  },
}), {
  // Pass these to the Response constructor (not to ReadableStream),
  // so the wrapped response keeps the original status.
  "status": response.status,
  "statusText": response.statusText
});

// Make sure to copy the headers!
// Wasm is very picky with its headers and it will fail to compile if they are not
// specified correctly.
for (var pair of response.headers.entries()) {
  res.headers.set(pair[0], pair[1]);
}

// The response (res) can now be passed to any of the streaming methods as normal
var promise = WebAssembly.instantiateStreaming(res)
Building off of various other SO answers, here is what I ended up with.
My solution also has a decent fallback for Firefox, which doesn't yet have proper stream support. I opted for falling back to a good old XHR and WebAssembly.instantiate there, as I really do want to show a loading bar, even if it means slightly slower startup just on FF.
async function fetchWithProgress(path, progress) {
  const response = await fetch(path);

  // May be incorrect if compressed
  const contentLength = response.headers.get("Content-Length");
  const total = parseInt(contentLength, 10);

  let bytesLoaded = 0;
  const ts = new TransformStream({
    transform(chunk, ctrl) {
      bytesLoaded += chunk.byteLength;
      progress(bytesLoaded / total);
      ctrl.enqueue(chunk);
    }
  });

  return new Response(response.body.pipeThrough(ts), response);
}
async function initWasmWithProgress(wasmFile, importObject, progress) {
  if (typeof TransformStream === "function" && ReadableStream.prototype.pipeThrough) {
    let done = false;
    const response = await fetchWithProgress(wasmFile, function() {
      if (!done) {
        progress.apply(null, arguments);
      }
    });
    await WebAssembly.instantiateStreaming(response, importObject);
    done = true;
    progress(1);
  } else {
    // XHR fallback: slower and doesn't use WebAssembly.instantiateStreaming,
    // but it's only happening on Firefox, and we can probably live with the game
    // starting slightly slower there...
    const xhr = new XMLHttpRequest();
    await new Promise(function(resolve, reject) {
      xhr.open("GET", wasmFile);
      xhr.responseType = "arraybuffer";
      xhr.onload = resolve;
      xhr.onerror = reject;
      xhr.onprogress = e => progress(e.loaded / e.total);
      xhr.send();
    });
    await WebAssembly.instantiate(xhr.response, importObject);
    progress(1);
  }
}
const wasmFile = "./wasm.wasm";
await initWasmWithProgress(wasmFile, importObject, p => console.log(`progress: ${p*100}%`));
console.log("Initialized wasm");

Firebase Storage: string does not match format base64: invalid character found. Only when debug is off

I'm trying to upload an image file to Firebase Storage, save the download URL, and load it after the upload is completed. When I run the app with remote JS debugging on, it works fine. When I turn off debug mode, it stops working with the invalid-format exception. The same happens when I run on a real device (both iOS and Android).
The base64 response data from React Native Image Picker seems to be correct
Here's my code
...
import * as ImagePicker from 'react-native-image-picker'; //0.26.10
import firebase from 'firebase'; //4.9.1
...

handleImagePicker = () => {
  const { me } = this.props;
  const options = {
    title: 'Select pic',
    storageOptions: {
      skipBackup: true,
      path: 'images'
    },
    mediaType: 'photo',
    quality: 0.5,
  };
  ImagePicker.showImagePicker(options, async (response) => {
    const storageRef = firebase.storage().ref(`/profile-images/user_${me.id}.jpg`);
    const metadata = {
      contentType: 'image/jpeg',
    };
    const task = storageRef.putString(response.data, 'base64', metadata);
    return new Promise((resolve, reject) => {
      task.on(
        'state_changed',
        (snapshot) => {
          var progress = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
          console.log('Upload is ' + progress + '% done');
        },
        (error) =>
          console.log(error),
        () => {
          this.onChangeProfileImage();
        }
      );
    });
  });
}

onChangeProfileImage = async () => {
  const { me } = this.props;
  const storageRef = firebase.storage().ref(`/profile-images/user_${me.id}.jpg`);
  const profileImageUrl = await new Promise((resolve, reject) => {
    storageRef.getDownloadURL()
      .then((url) => {
        resolve(url);
      })
      .catch((error) => {
        console.log(error);
      });
  });
  // some more logic to store profileImageUrl in the database
}
Any idea how to solve this?
Thanks in advance.
After some research and debug I found the cause of the issue and a solution for it.
Why does it happen?
Firebase uses the atob method to decode the base64 string passed to the putString method.
However, since JavaScriptCore doesn't have default support for atob and btoa, the base64 string can't be converted, so this exception is triggered.
When we run the app in debug-JS-remotely mode, all JavaScript code runs in a Chrome environment, where atob and btoa are supported. That's why the code works when debug is on and fails when it's off.
How to solve?
To handle atob and btoa in React Native, we should either write our own encode/decode method, or install a lib to handle it for us.
In my case I preferred to install base-64 lib
But here's an example of a encode/decode script:
const chars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/=';

const Base64 = {
  btoa: (input: string = '') => {
    let str = input;
    let output = '';
    for (let block = 0, charCode, i = 0, map = chars;
         str.charAt(i | 0) || (map = '=', i % 1);
         output += map.charAt(63 & block >> 8 - i % 1 * 8)) {
      charCode = str.charCodeAt(i += 3 / 4);
      if (charCode > 0xFF) {
        throw new Error("'btoa' failed: The string to be encoded contains characters outside of the Latin1 range.");
      }
      block = block << 8 | charCode;
    }
    return output;
  },
  atob: (input: string = '') => {
    let str = input.replace(/=+$/, '');
    let output = '';
    if (str.length % 4 == 1) {
      throw new Error("'atob' failed: The string to be decoded is not correctly encoded.");
    }
    for (let bc = 0, bs = 0, buffer, i = 0;
         buffer = str.charAt(i++);
         ~buffer && (bs = bc % 4 ? bs * 64 + buffer : buffer,
           bc++ % 4) ? output += String.fromCharCode(255 & bs >> (-2 * bc & 6)) : 0
    ) {
      buffer = chars.indexOf(buffer);
    }
    return output;
  }
};

export default Base64;
Usage:
import Base64 from '[path to your script]';

const stringToEncode = 'xxxx';
Base64.btoa(stringToEncode);

const stringToDecode = 'xxxx';
Base64.atob(stringToDecode);
After choosing either the custom script or the lib, we must add the following code to the index.js file:
import { decode, encode } from 'base-64';

if (!global.btoa) {
  global.btoa = encode;
}

if (!global.atob) {
  global.atob = decode;
}

AppRegistry.registerComponent(appName, () => App);
This declares atob and btoa globally. Whenever those functions are called anywhere in the app, React Native resolves them from the global scope and triggers the encode and decode methods from the base-64 lib.
So this is the solution for Base64 issue.
However, after this was solved, I found another issue, "Firebase Storage: Max retry time for operation exceeded. Please try again", when trying to upload larger images. It seems that Firebase has some limitations in its support for React Native uploads, as this issue suggests.
I believe that react-native-firebase may not struggle with this, since it's already prepared to run natively instead of using the web environment as firebase does. I haven't tested it yet to confirm, but it looks like the best approach for handling this.
Hope this can be helpful for someone else.
The problem can also be solved using the fetch() API. The promise it returns can be converted to a blob, which you can then upload to firebase/storage.
Here is an example:
let storageRef = storage().ref();
let imageName = data.name + "image";
let imagesRef = storageRef.child(`images/${imageName}`);

const response = await fetch(image);
const blob = await response.blob(); // Here is the trick

imagesRef
  .put(blob)
  .then((snapshot) => {
    console.log("uploaded an image.");
  })
  .catch((err) => console.log(err));

Upload image taken with camera to firebase storage on React Native

All I want to do is upload a photo taken using react-native-camera to firebase storage with react-native-fetch-blob, but no matter what I do it doesn't happen.
I've gone through all of the documentations I can find and nothing seems to work.
If anyone has a working system for accomplishing this please post it as an answer. I can get the uri of the jpg that react-native-camera returns (it displays in the ImageView and everything), but my upload function seems to stop working when it's time to put the blob.
My current function:
uploadImage = (uri, imageName, mime = 'image/jpg') => {
  return new Promise((resolve, reject) => {
    const uploadUri = Platform.OS === 'ios' ? uri.replace('file://', '') : uri
    let uploadBlob = null
    const imageRef = firebase.storage().ref('selfies').child(imageName)
    console.log("uploadUri", uploadUri)

    fs.readFile(uploadUri, 'base64').then((data) => {
      console.log("MADE DATA")
      var blobEvent = new Blob(data, 'image/jpg;base64')
      var blob = null
      blobEvent.onCreated(genBlob => {
        console.log("CREATED BLOB EVENT")
        blob = genBlob
        firebase.storage().ref('selfies').child(imageName).put(blob).then(function(snapshot) {
          console.log('Uploaded a blob or file!');
          firebase.database().ref("selfies/" + firebase.auth().currentUser.uid).set(0)
          var updates = {};
          updates["/users/" + firebase.auth().currentUser.uid + "/signup/"] = 1;
          firebase.database().ref().update(updates);
        });
      }, (error) => {
        console.log('Upload Error: ' + error)
        alert(error)
      }, () => {
        console.log('Completed upload: ' + uploadTask.snapshot.downloadURL)
      })
    })
  }).catch((error) => {
    alert(error)
  })
}
I want to be as efficient as possible, so if it's faster and takes less memory to not change it to base64, then I prefer that. Right now I just have no clue how to make this work.
This has been a huge source of stress in my life and I hope someone has this figured out.
The fastest approach would be to use the native Android/iOS SDKs and avoid clogging the JS thread. There are a few libraries out there that provide a React Native module to do just this (they all have a small JS API that communicates over React Native's bridge to the native side, where all the Firebase logic runs).
react-native-firebase is one such library. It follows the Firebase Web SDK's API, so if you know how to use the Web SDK then you should be able to use the exact same logic with this module, as well as additional Firebase APIs that are only available in the native SDKs.
For example, it has a storage implementation included and a handy putFile function: you provide it with a path to a file on the device and it uploads it for you using the native Firebase SDKs. No file handling is done on the JS thread, so it is extremely fast.
Example:
// other built-in paths here: https://github.com/invertase/react-native-firebase/blob/master/lib/modules/storage/index.js#L146
const imagePath = firebase.storage.Native.DOCUMENT_DIRECTORY_PATH + '/myface.png';

const ref = firebase.storage().ref('selfies').child('/myface.png');
const uploadTask = ref.putFile(imagePath);

// The .on observer is completely optional (you can use .then/.catch instead), but allows you to
// do things like a progress bar, for example:
uploadTask.on(firebase.storage.TaskEvent.STATE_CHANGED, (snapshot) => {
  // observe state change events such as progress
  // get task progress, including the number of bytes uploaded and the total number of bytes to be uploaded
  const progress = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
  console.log(`Upload is ${progress}% done`);
  switch (snapshot.state) {
    case firebase.storage.TaskState.SUCCESS: // or 'success'
      console.log('Upload is complete');
      break;
    case firebase.storage.TaskState.RUNNING: // or 'running'
      console.log('Upload is running');
      break;
    default:
      console.log(snapshot.state);
  }
}, (error) => {
  console.error(error);
}, () => {
  const uploadTaskSnapshot = uploadTask.snapshot;
  // task finished
  // states: https://github.com/invertase/react-native-firebase/blob/master/lib/modules/storage/index.js#L139
  console.log(uploadTaskSnapshot.state === firebase.storage.TaskState.SUCCESS);
  console.log(uploadTaskSnapshot.bytesTransferred === uploadTaskSnapshot.totalBytes);
  console.log(uploadTaskSnapshot.metadata);
  console.log(uploadTaskSnapshot.downloadUrl);
});
Disclaimer: I am the author of react-native-firebase.
