Download file to Firebase Storage via Firebase Functions - javascript

I have a program that allows users to upload video files to Firebase Storage. If a file is not an mp4, I send it to a third-party video-conversion service to convert it to mp4. That service then hits a webhook (a Firebase Function) with the URL and other information about the converted file.
Right now I'm trying to download the file to the tmp dir in Firebase Functions and then send it to Firebase Storage. I have the following questions:
Can I bypass downloading the file to the function's tmp dir and just save it directly to Storage? If so, how?
I'm currently having trouble downloading to the function's tmp dir; my code is below. The function is returning Function execution took 6726 ms, finished with status: 'crash'
import { join } from 'path'
import { tmpdir } from 'os'

export async function downloadExternalFile(url: string, fileName: string) {
  try {
    const axios = await import('axios')
    const fs = await import('fs')
    const workingDir = join(tmpdir(), 'downloadedFiles')
    // The directory must exist before a write stream can open a file in it;
    // writing into a missing directory makes the stream emit ENOENT.
    await fs.promises.mkdir(workingDir, { recursive: true })
    const tmpFilePath = join(workingDir, fileName)
    const writer = fs.createWriteStream(tmpFilePath)
    const response = await axios.default.get(url, { responseType: 'stream' })
    response.data.pipe(writer)
    await new Promise<void>((resolve, reject) => {
      writer.on('error', err => {
        writer.close()
        reject(err)
      })
      writer.on('close', () => {
        resolve()
      })
    })
    return
  } catch (error) {
    throw error
  }
}

As mentioned above in the comments section, you can use the Cloud Storage Node.js SDK to upload your file to Cloud Storage.
Please take a look at the SDK client reference documentation, where you can find numerous samples and more information about this Cloud Storage client library.
Also, I'd like to remind you that you can bypass writing to /tmp by using a pipeline; see the sketch below. According to the documentation for Cloud Functions, "you can process a file on Cloud Storage by creating a read stream, passing it through a stream-based process, and writing the output stream directly to Cloud Storage."
Last but not least, always delete temporary files from the Cloud Function's local filesystem. The /tmp directory is an in-memory mount, so failing to clean it up can eventually result in out-of-memory errors and subsequent cold starts.
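To answer your first question: yes, you can skip /tmp entirely by piping the HTTP response straight into a Cloud Storage write stream. A minimal sketch, assuming the Admin SDK is initialized and axios is available (the function name, destination path, and content type are illustrative):

const admin = require('firebase-admin')
const axios = require('axios')

async function streamExternalFile(url, destinationPath) {
  // Open a write stream on the destination object in the default bucket.
  const file = admin.storage().bucket().file(destinationPath)
  const response = await axios.get(url, { responseType: 'stream' })
  await new Promise((resolve, reject) => {
    response.data
      .pipe(file.createWriteStream({ metadata: { contentType: 'video/mp4' } }))
      .on('error', reject)
      .on('finish', resolve)
  })
}

This never touches the local filesystem, so there is nothing to clean up afterwards.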

Related

Uploading a file from front-end to firebase storage via cloud functions

I am trying to achieve the following:
User selects a file on the website
User calls a Firebase Cloud Function and passes the file into the function
Cloud Function uploads the file to Storage.
So far I am able to do all of the above; however, when I try to access the above file in Storage, a file with no extension is downloaded. The original file was a PDF, but I am still unable to open it with PDF viewers. It appears I am storing something in Storage, although I am not exactly sure what.
Here is an example of how my front-end code works:
const getBase64 = file => new Promise((resolve, reject) => {
  const reader = new FileReader();
  reader.readAsDataURL(file);
  reader.onload = () => resolve(reader.result);
  reader.onerror = error => reject(error);
});

var document_send = document.getElementById('myFile');
var send_button = document.getElementById('send_button');

send_button.addEventListener('click', async () => {
  var sendDocument = firebase.functions().httpsCallable('sendDocument');
  try {
    await sendDocument({
      docu: await getBase64(document_send.files[0])
    });
  } catch (error) {
    console.log(error.message);
  }
});
Here is an example of how my cloud function works:
const functions = require("firebase-functions");
const admin = require("firebase-admin");

exports.sendDocument = functions.https
    .onCall((data, context) => {
      return admin.storage().bucket()
          .file("randomLocationName")
          // .file("randomLocationName" + ".pdf") - tried this also
          .save(data.docu)
          .catch((error) => {
            console.log(error.message);
            return error;
          });
    });
I do not receive an error message as the function runs without error.
The save() function takes either a string or a Buffer as its first parameter:
> save(data: string | Buffer, options?: SaveOptions)
The issue arises when you pass the base64 string directly instead of a Buffer. Try refactoring the code as shown below:
return admin.storage().bucket()
    .file("randomLocationName" + ".pdf") // <-- file extension required
    .save(Buffer.from(data.docu, "base64"));
Cloud Functions also has a 10 MB maximum request size, so you won't be able to upload large files that way. You can use the Firebase client SDKs to upload files directly (see the sketch below), restrict access using security rules, and use Cloud Storage triggers for Cloud Functions in case you want to process the file and update the database. Alternatively, use signed URLs for uploading files if you are using GCS without Firebase.
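A minimal sketch of that direct client-side upload, reusing the question's DOM elements and the same v8-style API (the uploads/ path is an assumption):

send_button.addEventListener('click', () => {
  const file = document_send.files[0];
  // Upload straight from the browser; no callable function or 10 MB limit involved.
  firebase.storage().ref('uploads/' + file.name)
      .put(file, { contentType: file.type })
      .then((snapshot) => snapshot.ref.getDownloadURL())
      .then((url) => console.log('Uploaded to', url))
      .catch((error) => console.log(error.message));
});

Setting contentType here also avoids the missing-extension problem, since the object's metadata identifies it as a PDF.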

Upload file to Firebase Admin SDK (GCS) with https stream

When trying to upload a stream into a Google bucket, I am getting Error: Not Found when using the get method, and Error: socket hang up after a few seconds' delay when using the request method.
Everything with Firebase seems to be initialized fine, and when I log the stream I see the data coming through. What would be the best way to write a file to GCS from a remote URL?
const storage = firebase.storage();
const bucket = storage.bucket("bucket/path");
const file = bucket.file("filename.pdf");
const url = "https://url/to/file/filename.pdf";

https.get(url, (res) => {
  console.log(res);
  res.pipe(file.createWriteStream());
});
The cause of the issue was passing the folder path as part of the bucket name instead of the file name.
The bucket name is available in the Storage console; do not include a folder path in it.
Bucket name example:
gs://bucket.appspot.com
(remove the gs:// prefix when passing it as a value)
const bucket = storage.bucket("bucket.appspot.com")
const file = bucket.file("folder/path/filename.pdf")
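Putting it together, a sketch of the corrected flow with stream error handling added (URL and paths reused from the question):

const storage = firebase.storage();
const bucket = storage.bucket("bucket.appspot.com");
const file = bucket.file("folder/path/filename.pdf");
const url = "https://url/to/file/filename.pdf";

https.get(url, (res) => {
  res.pipe(file.createWriteStream())
      .on('error', (err) => console.error(err)) // surface failures instead of a silent hang
      .on('finish', () => console.log('upload complete'));
});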

Firebase Storage - Get actual data instead of downloadURL

Is there any way to get the actual data instead of the download URL from Firebase Storage? In my case, I store a string (some amount of HTML) in Storage, and I want to get the actual data back when it's needed.
But I can't figure out how to do it. According to the Firebase documentation I can get the downloadable URL, but I can't fetch the actual data.
Here is the function to fetch data from Storage (in the test case I can get the URL properly, but I need the actual data):
// Create a reference to the file we want to download
var starsRef = storageRef.child('images/stars.jpg');

// Get the download URL
starsRef.getDownloadURL().then(function(url) {
  // Insert url into an <img> tag to "download"
});
Thanks
Update 17.02.2020
I solved my problem; my mistake! It's possible to download a file from Storage using an AJAX request, as mentioned in the docs. Here is the simple function I defined; it returns a promise, and after it resolves you get the actual file/data.
async function updateToStorage(pathArray, dataToUpload) {
  let address = pathArray.join("/");
  // Create a storage ref
  let storageRef = firebase.storage().ref(address);
  // Upload the data in string format, known as a firebase task
  let uploadPromise = await storageRef.putString(dataToUpload);
  // Fetch the uploaded content back through its download URL
  let url = await uploadPromise.ref.getDownloadURL();
  const res = await fetch(url);
  const content = await res.text();
  return content;
}

const avatar = await updateToStorage(['storage', 'uid', 'avatarUrl'], avatarHtml);
// avatar will be the actual data after download (avatarHtml is the string to upload).
The Cloud Storage for Firebase APIs for JavaScript running in web browsers actually don't provide a way to download the raw data of a file. This is different on Android and iOS: notice that StorageReference doesn't have any direct accessors for data, unlike its Android and iOS equivalents. I don't know why this is. Consider it a feature request that you can file with Firebase support.
You will probably need to set up some sort of API endpoint that your code can call, routed through the web server that serves your website, or through something else that supports CORS, so that you can make a request from the browser that crosses web domains without security issues. A sketch of such an endpoint follows.
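A minimal sketch of that endpoint as an HTTPS Cloud Function, assuming the Admin SDK is initialized; the function name and the path query parameter are illustrative:

const functions = require('firebase-functions');
const admin = require('firebase-admin');

exports.getFileContents = functions.https.onRequest(async (req, res) => {
  // Allow cross-origin requests from the browser.
  res.set('Access-Control-Allow-Origin', '*');
  try {
    // Download the file's raw bytes from the default bucket.
    const [contents] = await admin.storage().bucket().file(req.query.path).download();
    res.status(200).send(contents.toString('utf8'));
  } catch (error) {
    res.status(500).send(error.message);
  }
});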

How to upload an image on Hyperledger Composer Playground?

I am trying to build a blockchain application for distributed image sharing and copyright protection. I am using an image as an asset.
So now I want to upload an image on Hyperledger Composer Playground. How can I do that?
You can store your file data in IPFS. IPFS is a protocol and network designed to create a content-addressable, peer-to-peer method of storing and sharing hypermedia in a distributed file system.
For IPFS, I recommend you follow the link.
In your application, in the JS file where you need to store the image, you just have to write the IPFS connectivity code. When you run the application, make sure the IPFS daemon is started.
IPFS will give you a hash link after successfully uploading a file. You can store that hash in an asset or participant of Hyperledger Composer.
For example:
function toIPFS(file) {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onloadend = function() {
      const ipfs = window.IpfsApi('ipfs', 5001, { protocol: "https" }); // Connect to IPFS
      const buf = buffer.Buffer(reader.result); // Convert data into buffer
      ipfs.files.add(buf, (err, result) => { // Upload buffer to IPFS
        if (err) {
          reject(err); // Propagate the error instead of leaving the promise pending
          return;
        }
        const url = `https://ipfs.io/ipfs/${result[0].hash}`;
        resolve(url); // Resolve with the gateway URL itself, not a placeholder string
      });
    };
    reader.readAsArrayBuffer(file); // Read provided file
  });
}
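Hypothetical usage, wiring the helper to a file input (the input id is an assumption):

document.getElementById('imageInput').addEventListener('change', async (event) => {
  // Upload the chosen image and keep the returned IPFS URL for the Composer asset.
  const url = await toIPFS(event.target.files[0]);
  console.log('Stored on IPFS at', url);
});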
I hope it will help you. :)

NodeJs Microsoft Azure Storage SDK Download File to Stream

I just started working with the Microsoft Azure Storage SDK for Node.js (https://github.com/Azure/azure-storage-node) and have already successfully uploaded my first PDF files to the cloud storage.
However, now I've started looking at the documentation in order to download my files to a Node Buffer (so I don't have to use fs.createWriteStream), but the documentation gives no examples of how this works. The only thing it says is "There are also several ways to download files. For example, getFileToStream downloads the file to a stream:", and then it shows a single example, which uses fs.createWriteStream, which I don't want to use.
I was also not able to find anything on Google that really helped me, so I was wondering if anybody has experience with this and could share a code sample with me?
The getFileToStream function needs a writable stream as a parameter. If you want all the data written to a Buffer instead of a file, you just need to create a custom writable stream.
const { Writable } = require('stream');

const bufferArray = [];
const myWriteStream = new Writable({
  write(chunk, encoding, callback) {
    // Collect each chunk as a whole Buffer; spreading it byte-by-byte
    // can overflow the call stack on large chunks.
    bufferArray.push(chunk);
    callback();
  }
});

myWriteStream.on('finish', function () {
  // all the data is stored inside this dataBuffer
  const dataBuffer = Buffer.concat(bufferArray);
});
Then pass myWriteStream to the getFileToStream function:
fileService.getFileToStream('taskshare', 'taskdirectory', 'taskfile', myWriteStream, function (error, result, response) {
  if (!error) {
    // file retrieved
  }
});
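For convenience, the same pattern can be wrapped in a promise (a sketch reusing the question's share, directory, and file names):

function getFileAsBuffer(share, directory, file) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    const stream = new Writable({
      write(chunk, encoding, callback) {
        chunks.push(chunk); // accumulate each chunk until the download finishes
        callback();
      }
    });
    stream.on('finish', () => resolve(Buffer.concat(chunks)));
    fileService.getFileToStream(share, directory, file, stream, (error) => {
      if (error) reject(error);
    });
  });
}

getFileAsBuffer('taskshare', 'taskdirectory', 'taskfile').then((buf) => {
  console.log('downloaded', buf.length, 'bytes');
});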
