After uploading a file in Firebase Storage with Functions for Firebase, I'd like to get the download url of the file.
I have this :
...
return bucket
.upload(fromFilePath, {destination: toFilePath})
.then((err, file) => {
// Get the download url of file
});
The object file has a lot of parameters. Even one named mediaLink. However, if I try to access this link, I get this error :
Anonymous users does not have storage.objects.get access to object ...
Can somebody tell me how to get the public download Url?
Thank you
You'll need to generate a signed URL using getSignedUrl via the @google-cloud/storage NPM module.
Example:
const gcs = require('@google-cloud/storage')({keyFilename: 'service-account.json'});
// ...
const bucket = gcs.bucket(bucketName); // bucketName: your bucket's name
const file = bucket.file(fileName);
return file.getSignedUrl({
action: 'read',
expires: '03-09-2491'
}).then(signedUrls => {
// signedUrls[0] contains the file's public URL
});
You'll need to initialize @google-cloud/storage with your service account credentials, as the application default credentials will not be sufficient.
UPDATE: The Cloud Storage SDK can now be accessed via the Firebase Admin SDK, which acts as a wrapper around @google-cloud/storage. The only way it will work is if you either (see the sketch after this list):
Init the SDK with a special service account, typically through a second, non-default instance.
Or, without a service account, by giving the default App Engine service account the "signBlob" permission.
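A minimal sketch of the first option, assuming you initialize the Admin SDK with an explicit service account key (the key filename, bucket name, and file path below are placeholders):
const admin = require('firebase-admin');
const serviceAccount = require('./serviceAccountKey.json'); // placeholder filename

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  storageBucket: '<YOUR-PROJECT>.appspot.com', // placeholder bucket
});

// Generate a signed read URL for an existing object
admin.storage().bucket().file('path/to/file.png')
  .getSignedUrl({ action: 'read', expires: '03-09-2491' })
  .then(([signedUrl]) => {
    console.log(signedUrl);
  });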
This answer will summarize the options for getting a download URL when uploading a file to Google/Firebase Cloud Storage. There are three types of download URLs:
Token download URLs, which are persistent and have security features
Signed download URLs, which are temporary and have security features
Public download URLs, which are persistent and lack security
There are two ways to get a token download URL. Signed and public download URLs each have only one way to get them.
Token URL method #1: From the Firebase Storage Console
You can get the download URL from the Firebase Storage console:
The download URL looks like this:
https://firebasestorage.googleapis.com/v0/b/languagetwo-cd94d.appspot.com/o/Audio%2FEnglish%2FUnited_States-OED-0%2Fabout.mp3?alt=media&token=489c48b3-23fb-4270-bd85-0a328d2808e5
The first part is a standard path to your file. At the end is the token. This download URL is permanent, i.e., it won't expire, although you can revoke it.
Token URL method #2: From the Front End
The documentation tells us to use getDownloadURL():
let url = await firebase.storage().ref('Audio/English/United_States-OED-' + i +'/' + $scope.word.word + ".mp3").getDownloadURL();
This gets the same download URL that you can get from your Firebase Storage console. This method is easy but requires that you know the path to your file, which in my app is difficult. You could upload files from the front end, but this would expose your credentials to anyone who downloads your app. So for most projects you'll want to upload your files from your Cloud Functions, then get the download URL and save it to your database along with other data about your file.
I can't find a way to get the token download URL when I write a file to Storage from a Cloud Function (because I can't find a way to tell the front end that a file has been written to Storage). What works for me is to write the file to a publicly available URL and write that publicly available URL to Firebase. When my Angular front end gets the download URL from Firebase, it also runs getDownloadURL(), which has the token, compares the download URL in Firestore to the token download URL, and, if they don't match, replaces the publicly available URL in Firestore with the token download URL. This exposes your file to the public only once.
This is easier than it sounds. The following code iterates through an array of Storage download URLs and replaces publicly available download URLs with token download URLs.
const storage = getStorage();
var audioFiles: string[] = [];
if (this.pronunciationArray[0].pronunciation != undefined) {
for (const audioFile of this.pronunciationArray[0].audioFiles) { // for each audio file in array
let url = await getDownloadURL(ref(storage, audioFile)); // get the download url with token
if (audioFile !== url) { // download URLs don't match
audioFiles.push(url);
} // end inner if
}; // end for of loop
if (audioFiles.length > 0) { // update Firestore only if we have new download URLs
await updateDoc(doc(this.firestore, 'Dictionaries/' + this.l2Language.long + '/Words/' + word + '/Pronunciations/', this.pronunciationArray[0].pronunciation), {
audioFiles: audioFiles
});
}
} // end outer if
You're thinking, "I'll return the Storage location from my Cloud Function to my front end and then use the location with getDownloadURL() to write the token download URL to Firestore." That won't work because Cloud Functions can only return synchronous results. Async operations return null.
"No problem," you say. "I'll set up an Observer on Storage, get the location from the Observer, and then use the location with getDownloadURL() to write the token download URL to Firestore." No dice. Firestore has Observers. Storage doesn't have Observers.
"How about," you say, "calling listAll() from my front end, getting a list of all my Storage files, then calling the metadata for each file, and extracting the download URL and token for each file, and then writing these to Firestore?" Good try, but Storage metadata doesn't include the download URL or token.
Signed URL method #1: getSignedUrl() for Temporary Download URLs
getSignedUrl() is easy to use from a Cloud Function:
function oedPromise() {
return new Promise(function(resolve, reject) {
http.get(oedAudioURL, function(response) {
response.pipe(file.createWriteStream(options))
.on('error', function(error) {
console.error(error);
reject(error);
})
.on('finish', function() {
file.getSignedUrl(config, function(err, url) {
if (err) {
console.error(err);
return;
} else {
resolve(url);
}
});
});
});
});
}
A signed download URL looks like this:
https://storage.googleapis.com/languagetwo-cd94d.appspot.com/Audio%2FSpanish%2FLatin_America-Sofia-Female-IBM%2Faqu%C3%AD.mp3?GoogleAccessId=languagetwo-cd94d%40appspot.gserviceaccount.com&Expires=4711305600&Signature=WUmABCZIlUp6eg7dKaBFycuO%2Baz5vOGTl29Je%2BNpselq8JSl7%2BIGG1LnCl0AlrHpxVZLxhk0iiqIejj4Qa6pSMx%2FhuBfZLT2Z%2FQhIzEAoyiZFn8xy%2FrhtymjDcpbDKGZYjmWNONFezMgYekNYHi05EPMoHtiUDsP47xHm3XwW9BcbuW6DaWh2UKrCxERy6cJTJ01H9NK1wCUZSMT0%2BUeNpwTvbRwc4aIqSD3UbXSMQlFMxxWbPvf%2B8Q0nEcaAB1qMKwNhw1ofAxSSaJvUdXeLFNVxsjm2V9HX4Y7OIuWwAxtGedLhgSleOP4ErByvGQCZsoO4nljjF97veil62ilaQ%3D%3D
The signed URL has an expiration date and long signature. The documentation for the command line gsutil signurl -d says that signed URLs are temporary: the default expiration is one hour and the maximum expiration is seven days.
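To stay within that limit, the expiration has to be a point in time no more than seven days away. A minimal sketch, assuming a recent @google-cloud/storage File object named file and v4 signing:
// v4 signed URLs max out at 7 days
const [url] = await file.getSignedUrl({
  version: 'v4',
  action: 'read',
  expires: Date.now() + 7 * 24 * 60 * 60 * 1000, // 7 days from now
});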
I'm going to rant here that the getSignedUrl documentation never says that your signed URL will expire in a week. The documentation code has 3-17-2025 as the expiration date, suggesting that you can set the expiration years in the future. My app worked perfectly, and then crashed a week later. The error message said that the signatures didn't match, not that the download URL had expired. I made various changes to my code, and everything worked...until it all crashed a week later. This went on for more than a month of frustration. Is the 3-17-2025 date an inside joke? Like the gold coins of a leprechaun that vanish when the leprechaun is out of sight, a St. Patrick's Day expiry date years in the future vanishes in two weeks, just when you thought your code was bug-free.
Public URL #1: Make Your File Publicly Available
You can set the permissions on your file to public read, as explained in the documentation. This can be done from the Cloud Storage Browser or from your Node server. You can make a single file, a directory, or your entire Storage bucket public. Here's the Node code:
var webmPromise = new Promise(function(resolve, reject) {
var options = {
destination: ('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.mp3'),
predefinedAcl: 'publicRead',
contentType: 'audio/' + audioType,
};
synthesizeParams.accept = 'audio/webm';
var file = bucket.file('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm');
textToSpeech.synthesize(synthesizeParams)
.then(function(audio) {
audio.pipe(file.createWriteStream(options));
})
.then(function() {
console.log("webm audio file written.");
resolve();
})
.catch(error => console.error(error));
});
The result will look like this in your Cloud Storage Browser:
Anyone can then use the standard path to download your file:
https://storage.googleapis.com/languagetwo-cd94d.appspot.com/Audio/English/United_States-OED-0/system.mp3
Another way to make a file public is to use the method makePublic(). I haven't been able to get this to work; it's tricky to get the bucket and file paths right.
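For what it's worth, here is roughly how makePublic() is meant to be called (a sketch, assuming the Admin SDK's default bucket; the file path reuses the example above):
const bucket = admin.storage().bucket();
await bucket.file('Audio/English/United_States-OED-0/system.mp3').makePublic();
// The object should then be readable at
// https://storage.googleapis.com/<bucket-name>/Audio/English/United_States-OED-0/system.mp3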
An interesting alternative is to use Access Control Lists. You can make a file available only to users whom you put on a list, or use authenticatedRead to make the file available to anyone who is logged in from a Google account. If there were an option "anyone who logged into my app using Firebase Auth" I would use this, as it would limit access to only my users.
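As a sketch of the ACL route (assuming file is a @google-cloud/storage File), authenticatedRead corresponds to granting READER to allAuthenticatedUsers, i.e. any signed-in Google account rather than your app's Firebase Auth users:
await file.acl.add({
  entity: 'allAuthenticatedUsers', // any Google-authenticated user
  role: 'READER',
});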
Deprecated: Build Your Own Download URL with firebaseStorageDownloadTokens
Several answers describe an undocumented Google Storage object property firebaseStorageDownloadTokens. This was never an official Google Cloud Storage feature and it no longer works. Here's how it used to work.
You generated a token with the uuid Node module and then told Storage the token you wanted to use. With four lines of code you could build your own download URL, the same download URL you get from the console or getDownloadURL(). The four lines of code are:
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
metadata: { firebaseStorageDownloadTokens: uuid }
"https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm') + "?alt=media&token=" + uuid
Here's the code in context:
var webmPromise = new Promise(function(resolve, reject) {
var options = {
destination: ('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.mp3'),
contentType: 'audio/' + audioType,
metadata: {
metadata: {
firebaseStorageDownloadTokens: uuid,
}
}
};
synthesizeParams.accept = 'audio/webm';
var file = bucket.file('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm');
textToSpeech.synthesize(synthesizeParams)
.then(function(audio) {
audio.pipe(file.createWriteStream(options));
})
.then(function() {
resolve("https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm') + "?alt=media&token=" + uuid);
})
.catch(error => console.error(error));
});
That's not a typo--you have to nest firebaseStorageDownloadTokens in double layers of metadata!
Here's an example of how to specify the download token on upload:
const UUID = require("uuid-v4");
const fbId = "<YOUR APP ID>";
const fbKeyFile = "./YOUR_AUTH_FIlE.json";
const gcs = require('@google-cloud/storage')({keyFilename: fbKeyFile});
const bucket = gcs.bucket(`${fbId}.appspot.com`);
var upload = (localFile, remoteFile) => {
let uuid = UUID();
return bucket.upload(localFile, {
destination: remoteFile,
uploadType: "media",
metadata: {
contentType: 'image/png',
metadata: {
firebaseStorageDownloadTokens: uuid
}
}
})
.then((data) => {
let file = data[0];
return Promise.resolve("https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent(file.name) + "?alt=media&token=" + uuid);
});
}
then call with
upload(localPath, remotePath).then( downloadURL => {
console.log(downloadURL);
});
The key thing here is that there is a metadata object nested within the metadata option property. Setting firebaseStorageDownloadTokens to a uuid-v4 value will tell Cloud Storage to use that as its public auth token.
Many thanks to @martemorfosis
If you're working on a Firebase project, you can create signed URLs in a Cloud Function without including other libraries or downloading a credentials file. You just need to enable the IAM API and add a role to your existing service account (see below).
Initialize the admin library and get a file reference as you normally would:
import * as functions from 'firebase-functions'
import * as admin from 'firebase-admin'
admin.initializeApp(functions.config().firebase)
const myFile = admin.storage().bucket().file('path/to/my/file')
You then generate a signed URL with
myFile.getSignedUrl({action: 'read', expires: someDateObj}).then(urls => {
const signedUrl = urls[0]
})
Make sure your Firebase service account has sufficient permissions to run this
Go to the Google API console and enable the IAM API (https://console.developers.google.com/apis/api/iam.googleapis.com/overview)
Still in the API console, go to the main menu, "IAM & admin" -> "IAM"
Click edit for the "App Engine default service account" role
Click "Add another role", and add the one called "Service Account Token Creator"
Save and wait a minute for the changes to propagate
With a vanilla Firebase config, the first time you run the above code you'll get an error Identity and Access Management (IAM) API has not been used in project XXXXXX before or it is disabled. If you follow the link in the error message and enable the IAM API, you'll get another error: Permission iam.serviceAccounts.signBlob is required to perform this operation on service account my-service-account. Adding the Token Creator role fixes this second permission issue.
You should avoid hardcoding the URL prefix in your code, especially when there are alternatives. I suggest using the option predefinedAcl: 'publicRead' when uploading a file with the Cloud Storage Node.js library 1.6.x or later:
const options = {
destination: yourFileDestination,
predefinedAcl: 'publicRead'
};
bucket.upload(attachment, options);
Then, getting the public URL is as simple as:
bucket.upload(attachment, options).then(result => {
const file = result[0];
return file.getMetadata();
}).then(results => {
const metadata = results[0];
console.log('metadata=', metadata.mediaLink);
}).catch(error => {
console.error(error);
});
With the recent changes in the functions object response you can get everything you need to "stitch" together the download URL like so:
const img_url = 'https://firebasestorage.googleapis.com/v0/b/[YOUR BUCKET]/o/'
+ encodeURIComponent(object.name)
+ '?alt=media&token='
+ object.metadata.firebaseStorageDownloadTokens;
console.log('URL',img_url);
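For context, a sketch of where that object typically comes from: a Storage onFinalize trigger (the export name here is illustrative):
const functions = require('firebase-functions');

exports.logDownloadUrl = functions.storage.object().onFinalize((object) => {
  // assumes the object was uploaded with a firebaseStorageDownloadTokens value
  const img_url = 'https://firebasestorage.googleapis.com/v0/b/' + object.bucket + '/o/'
    + encodeURIComponent(object.name)
    + '?alt=media&token='
    + object.metadata.firebaseStorageDownloadTokens;
  console.log('URL', img_url);
  return null;
});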
This is what I currently use; it's simple and it works flawlessly.
You don't need to do anything with Google Cloud. It works out of the box with Firebase.
// Save the base64 to storage.
const file = admin.storage().bucket('url found on the storage part of firebase').file(`profile_photos/${uid}`);
await file.save(base64Image, {
metadata: {
contentType: 'image/jpeg',
},
predefinedAcl: 'publicRead'
});
const metaData = await file.getMetadata()
const url = metaData[0].mediaLink
EDIT:
Same example, but with upload:
await bucket.upload(fromFilePath, {destination: toFilePath});
file = bucket.file(toFilePath);
metaData = await file.getMetadata()
const trimUrl = metaData[0].mediaLink
Update:
No need to make two different calls in the upload method to get the metadata:
let file = await bucket.upload(fromFilePath, {destination: toFilePath});
const trimUrl = file[0].metadata.mediaLink
For those wondering where the Firebase Admin SDK serviceAccountKey.json file should go: just place it in the functions folder and deploy as usual.
It still baffles me why we can't just get the download url from the metadata like we do in the Javascript SDK. Generating a url that will eventually expire and saving it in the database is not desirable.
One method I'm using with success is to set a UUID v4 value to a key named firebaseStorageDownloadTokens in the metadata of the file after it finishes uploading and then assemble the download URL myself following the structure Firebase uses to generate these URLs, eg:
https://firebasestorage.googleapis.com/v0/b/[BUCKET_NAME]/o/[FILE_PATH]?alt=media&token=[THE_TOKEN_YOU_CREATED]
I don't know how safe it is to use this method (given that Firebase could change how it generates the download URLs in the future), but it is easy to implement.
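If you do go this route, the assembly can live in a small helper (a hypothetical function, shown only to make the URL structure explicit):
function buildDownloadUrl(bucketName, filePath, token) {
  return 'https://firebasestorage.googleapis.com/v0/b/' + bucketName +
    '/o/' + encodeURIComponent(filePath) +
    '?alt=media&token=' + token;
}

// e.g. buildDownloadUrl(bucket.name, 'images/photo.png', uuid)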
Sorry, but I can't post a comment on your question above because of missing reputation, so I will include it in this answer.
Do as stated above by generating a signed URL, but instead of using the service-account.json I think you have to use the serviceAccountKey.json, which you can generate at (replace YOURPROJECTID accordingly):
https://console.firebase.google.com/project/YOURPROJECTID/settings/serviceaccounts/adminsdk
Example:
const gcs = require('@google-cloud/storage')({keyFilename: 'serviceAccountKey.json'});
// ...
const bucket = gcs.bucket(bucketName); // bucketName: your bucket's name
// ...
return bucket.upload(tempLocalFile, {
destination: filePath,
metadata: {
contentType: 'image/jpeg'
}
})
.then((data) => {
let file = data[0]
file.getSignedUrl({
action: 'read',
expires: '03-17-2025'
}, function(err, url) {
if (err) {
console.error(err);
return;
}
// handle url
})
});
I can't comment on the answer James Daniels gave, but I think this is very important to read.
Giving out a signed URL like he did seems pretty bad for many cases and possibly dangerous.
According to the Firebase documentation, the signed URL expires after some time, so adding it to your database will lead to an empty URL after a certain timeframe.
It may be that I misunderstood the documentation there and the signed URL doesn't expire, which would have some security issues as a result:
the key seems to be the same for every uploaded file. This means that once you have the URL of one file, someone could easily access files that they are not supposed to access, just by knowing their names.
If I misunderstood that, then I would love to be corrected.
Otherwise, someone should probably update the above-named solution.
I may be wrong there.
If you use the predefined access control lists value of 'publicRead', you can upload the file and access it with a very simple url structure:
// Upload to GCS
const opts: UploadOptions = {
gzip: true,
destination: dest, // 'someFolder/image.jpg'
predefinedAcl: 'publicRead',
public: true
};
return bucket.upload(imagePath, opts);
You can then construct the url like so:
const storageRoot = 'https://storage.googleapis.com/';
const bucketName = 'myapp.appspot.com/'; // CHANGE TO YOUR BUCKET NAME
const downloadUrl = storageRoot + bucketName + encodeURIComponent(dest);
Use file.publicUrl()
Async/Await
const bucket = storage.bucket('bucket-name');
const uploadResponse = await bucket.upload('image-name.jpg');
const downloadUrl = uploadResponse[0].publicUrl();
Callback
const bucket = storage.bucket('bucket-name');
bucket.upload('image-name.jpg', (err, file) => {
if(!file) {
throw err;
}
const downloadUrl = file.publicUrl();
})
The downloadUrl will be "https://storage.googleapis.com/bucket-name/image-name.jpg".
Please note that in order for the above code to work, you have to make the bucket or file public. To do so, follow the instructions here: https://cloud.google.com/storage/docs/access-control/making-data-public. Also, I imported the @google-cloud/storage package directly, not through the Firebase SDK.
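For completeness, a sketch of the bucket-level version of that (making every object public through IAM), assuming bucket is the @google-cloud/storage Bucket used above:
const [policy] = await bucket.iam.getPolicy({ requestedPolicyVersion: 3 });
policy.bindings.push({
  role: 'roles/storage.objectViewer',
  members: ['allUsers'], // anyone on the internet can read objects
});
await bucket.iam.setPolicy(policy);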
I had the same issue; however, I was looking at the code of the Firebase function example instead of the README. The answers on this thread didn't help either...
You can avoid passing the config file by doing the following:
Go to your project's Cloud Console > IAM & admin > IAM, find the App Engine default service account and add the Service Account Token Creator role to that member. This will allow your app to create signed public URLs to the images.
source: Automatically Generate Thumbnails function README
Your role for app engine should look like this:
This works if you just need a public file with a simple URL. Note that this may overrule your Firebase storage rules.
bucket.upload(file, function(err, file) {
if (!err) {
//Make the file public
file.acl.add({
entity: 'allUsers',
role: gcs.acl.READER_ROLE
}, function(err, aclObject) {
if (!err) {
var URL = "https://storage.googleapis.com/[your bucket name]/" + file.id;
console.log(URL);
} else {
console.log("Failed to set permissions: " + err);
}
});
} else {
console.log("Upload failed: " + err);
}
});
Without getSignedUrl(), using makePublic()
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp()
var bucket = admin.storage().bucket();
// --- [Above] for admin related operations, [Below] for making a public url from a GCS uploaded object
const { Storage } = require('#google-cloud/storage');
const storage = new Storage();
exports.testDlUrl = functions.storage.object().onFinalize(async (objMetadata) => {
console.log('bucket, file', objMetadata.bucket + ' ' + objMetadata.name.split('/').pop()); // assuming file is in folder
return storage.bucket(objMetadata.bucket).file(objMetadata.name).makePublic().then(function (data) {
return admin.firestore().collection('publicUrl').doc().set({ publicUrl: 'https://storage.googleapis.com/' + objMetadata.bucket + '/' + objMetadata.name }).then(writeResult => {
return console.log('publicUrl', writeResult);
});
});
});
The answer by https://stackoverflow.com/users/269447/laurent works best.
const uploadOptions: UploadOptions = {
public: true
};
const bucket = admin.storage().bucket();
const [ffile] = await bucket.upload(oPath, uploadOptions);
ffile.metadata.mediaLink // this is what you need
I saw this in the admin Storage docs:
const options = {
version: 'v4',
action: 'read',
expires: Date.now() + 15 * 60 * 1000, // 15 minutes
};
// Get a v4 signed URL for reading the file
const [url] = await storage
.bucket(bucketName)
.file(filename)
.getSignedUrl(options);
console.log('Generated GET signed URL:');
console.log(url);
console.log('You can use this URL with any user agent, for example:');
console.log(`curl '${url}'`);
For those who are using the Firebase SDK and admin.initializeApp:
1 - Generate a private key and place it in the /functions folder.
2 - Configure your code as follows:
const serviceAccount = require('../../serviceAccountKey.json');
try { admin.initializeApp(Object.assign(functions.config().firebase, { credential: admin.credential.cert(serviceAccount) })); } catch (e) {}
Documentation
The try/catch is because I'm using an index.js that imports other files and creates one function for each file. If you're using a single index.js file with all functions, you should be OK with admin.initializeApp(Object.assign(functions.config().firebase, { credential: admin.credential.cert(serviceAccount) }));.
As of firebase 6.0.0 I was able to access Storage directly with the admin SDK like this:
const bucket = admin.storage().bucket();
So I didn't need to add a service account. Then setting the UUID as referenced above worked for getting the Firebase URL.
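For reference, a rough sketch of that combination (the local path and destination below are placeholders; assumes the uuid module):
const admin = require('firebase-admin');
const { v4: uuidv4 } = require('uuid');

const bucket = admin.storage().bucket();
const token = uuidv4();

await bucket.upload('/tmp/photo.png', {            // placeholder local path
  destination: 'images/photo.png',                 // placeholder destination
  metadata: { metadata: { firebaseStorageDownloadTokens: token } },
});
// Firebase-style URL: https://firebasestorage.googleapis.com/v0/b/<bucket>/o/images%2Fphoto.png?alt=media&token=<token>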
This is the best I came up with. It is redundant, but it's the only reasonable solution that worked for me.
await bucket.upload(localFilePath, {destination: uploadPath, public: true});
const f = await bucket.file(uploadPath)
const meta = await f.getMetadata()
console.log(meta[0].mediaLink)
I already posted my answer at the URL below, where you can get the full code with the solution:
How do I upload a base64 encoded image (string) directly to a Google Cloud Storage bucket using Node.js?
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
const os = require('os')
const path = require('path')
const cors = require('cors')({ origin: true })
const Busboy = require('busboy')
const fs = require('fs')
var admin = require("firebase-admin");
var serviceAccount = {
"type": "service_account",
"project_id": "xxxxxx",
"private_key_id": "xxxxxx",
"private_key": "-----BEGIN PRIVATE KEY-----\jr5x+4AvctKLonBafg\nElTg3Cj7pAEbUfIO9I44zZ8=\n-----END PRIVATE KEY-----\n",
"client_email": "xxxx#xxxx.iam.gserviceaccount.com",
"client_id": "xxxxxxxx",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/firebase-adminsdk-5rmdm%40xxxxx.iam.gserviceaccount.com"
}
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
storageBucket: "xxxxx-xxxx" // use your storage bucket name
});
const express = require('express');
const bodyParser = require('body-parser');
const functions = require('firebase-functions');
const app = express();
app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());
app.post('/uploadFile', (req, response) => {
response.set('Access-Control-Allow-Origin', '*');
const busboy = new Busboy({ headers: req.headers })
let uploadData = null
busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
const filepath = path.join(os.tmpdir(), filename)
uploadData = { file: filepath, type: mimetype }
console.log("-------------->>",filepath)
file.pipe(fs.createWriteStream(filepath))
})
busboy.on('finish', () => {
const bucket = admin.storage().bucket();
bucket.upload(uploadData.file, {
uploadType: 'media',
metadata: {
  contentType: uploadData.type,
  metadata: { firebaseStorageDownloadTokens: uuid },
},
})
.catch(err => {
  response.status(500).json({
    error: err,
  })
})
})
busboy.end(req.rawBody)
});
exports.widgets = functions.https.onRequest(app);
For those trying to use the token parameter to share the file and would like to use gsutil command, here is how I did it:
First you need to authenticate by running gcloud auth login.
Then run:
gsutil setmeta -h "x-goog-meta-firebaseStorageDownloadTokens:$FILE_TOKEN" gs://$FIREBASE_REPO/$FILE_NAME
Then you can download the file with the following link:
https://firebasestorage.googleapis.com/v0/b/$FIREBASE_REPO/o/$FILE_NAME?alt=media&token=$FILE_TOKEN
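Roughly the same thing can be done from Node.js with setMetadata; this is a sketch that assumes a @google-cloud/storage File object and a token ($FILE_TOKEN above) that you generated yourself:
await file.setMetadata({
  metadata: { firebaseStorageDownloadTokens: fileToken }, // fileToken: your own UUID
});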
From the Admin SDKs, you cannot retrieve the download token that Firebase generates for an uploaded file, but you can set that token when uploading by adding it to the metadata.
For those who are working with the Python SDK, this is the way to do it:
from firebase_admin import storage
from uuid import uuid4
bucket = storage.bucket()
blob = bucket.blob(path_to_file)
token = str(uuid4()) # Random ID
blob.metadata = {
"firebaseStorageDownloadTokens": token
}
blob.upload_from_file(file)
You have now uploaded the file and got the URL token. You could now save the token (or even the full download URL) into your database (e.g. Firestore), send it to the client when the file is requested, and then let the client retrieve the file itself.
The full download URL looks like this:
https://firebasestorage.googleapis.com/v0/b/{bucket_name}/o/{file_name}?alt=media&token={token}
If you are getting the error:
Google Cloud Functions: require(…) is not a function
try this:
const {Storage} = require('#google-cloud/storage');
const storage = new Storage({keyFilename: 'service-account-key.json'});
const bucket = storage.bucket(object.bucket);
const file = bucket.file(filePath);
.....
I have a problem:
I want a logged in user to be able to use my getRealtimeUsers function. But I get the following error: FirebaseError: Missing or insufficient permissions.
Below you will find the code. Some explanations:
On the React side, I created a Firebase instance, registered a getRealtimeUsers function there, and tried to use it.
firebase.js file
import firebase from "firebase/app";
import "firebase/firestore";
const firebaseConfig = {
apiKey: process.env.REACT_APP_FIREBASE_KEY,
authDomain: process.env.REACT_APP_FIREBASE_DOMAIN,
projectId: process.env.REACT_APP_FIREBASE_PROJECT_ID,
storageBucket: process.env.REACT_APP_FIREBASE_STORAGE_BUCKET,
messagingSenderId: process.env.REACT_APP_FIREBASE_SENDER_ID,
appId: process.env.REACT_APP_FIREBASE_APP_ID,
measurementId: process.env.REACT_APP_FIREBASE_MEASUREMENT_ID
};
firebase.initializeApp(firebaseConfig);
export default firebase;
A function that uses the Firebase instance I created:
import firebase from '../../firebase';
export const getRealtimeUsers = () => {
return async () => {
const db = firebase.firestore();
const unsubscribe = db.collection("users")
return unsubscribe;
}
}
Testing of the function
import React, { Component } from 'react';
import { connect } from 'react-redux';
import { getRealtimeUsers } from '../redux/actions/chatActions';
import firebase from '../firebase';
let unsubscribe;
class chat extends Component {
componentDidMount() {
unsubscribe = this.props.getRealtimeUsers()
.then(unsubscribe => {
return unsubscribe;
}).catch(error => {
console.log(error);
});
}
componentWillUnmount() {
return () => {
unsubscribe.then(f => f()).catch(error => console.log(error));
}
}
render() {
const ref = firebase.firestore().collection("users");
return (
<div>
check if works
</div>
);
}
}
const mapStateToProps = (state) => ({
});
export default connect(
mapStateToProps,
{
getRealtimeUsers
}
);
Here I let the user do the login
login.js action
export const loginUser = (userData) => (dispatch) => {
axios
.post('/login', userData)
.then((res) => {
console.log(res)
setAuthorizationHeader(res.data.token);
})
.catch((err) => {
console.log(err)
});
};
The login is done on the Node.js side, which is also where I connect to Firebase:
const config = {
apiKey: "WqcVU",
authDomain: "c468.firebaseapp.com",
projectId: "c468",
storageBucket: "c468.appspot.com",
messagingSenderId: "087",
appId: "1:087:web:c912",
measurementId: "G-SQX1"
};
const firebase = require("firebase");
firebase.initializeApp(config);
// Log user in
exports.login = (req, res) => {
const user = {
email: req.body.email,
password: req.body.password,
};
firebase
.auth()
.signInWithEmailAndPassword(user.email, user.password)
.then((data) => {
console.log(JSON.stringify(data));
return data.user.getIdToken();
})
.catch((err) => {
console.error(err);
});
};
app.post('/login', login);
The rules:
rules_version = '2';
service cloud.firestore {
match /databases/{database}/documents {
match /{document=**} {
allow read, write: if request.auth != null;
}
}
}
I want only a logged-in user to be able to use the getRealtimeUsers function, but even if the user is logged in it doesn't work. That's my problem.
I do not know how to solve it at all.
Your case is really problematic. The only way to do security in rules is to access your currentUser or auth. Because you connect through Node.js, you only return a JSON object, and there will be no access to these objects.
For you to have access to these objects, what you need to do is connect once more, in React itself, with the same Firebase; that's how you get everything Firebase has to offer.
When you log in again, the user does not need to know about it: since when you log in you will reset the password and choose a random one, it will solve all your problems.
I think this is the most likely solution, but it is not for production
Please update me.
Is there a reason why you're not authenticating directly from the client? The thing is, you have to pass the authenticated state from the server to the front end in some way...
You have signed in with Firebase on your server but you did not sign in with Firebase on the front end. Or maybe you did in setAuthorizationHeader(res.data.token); but I'd be curious to know how you did that. It would have been interesting to be able to authenticate with Node.js and then send the generated token ID, that is data.user.getIdToken(), to the client, which the client would then pass to firebase.auth().signInWithToken(token), but this is not possible. If you want to know more about it, I invite you to read this issue (a feature request).
The way to solve this problem is to use Admin SDK and create a custom token.
Firebase gives you complete control over authentication by allowing you to authenticate users or devices using secure JSON Web Tokens (JWTs). You generate these tokens on your server, pass them back to a client device, and then use them to authenticate via the signInWithCustomToken() method.
To achieve this, you must create a server endpoint that accepts sign-in credentials—such as a username and password—and, if the credentials are valid, returns a custom JWT. The custom JWT returned from your server can then be used by a client device to authenticate with Firebase (iOS, Android, web). Once authenticated, this identity will be used when accessing other Firebase services, such as the Firebase Realtime Database and Cloud Storage. Furthermore, the contents of the JWT will be available in the auth object in your Realtime Database Rules and the request.auth object in your Cloud Storage Security Rules.
Here are the steps:
First, you will add the SDK to your server:
yarn add firebase-admin --save
and import it
const admin = require("firebase-admin")
Go over to your Project settings, under the "Service accounts" tab, and click on "Generate a new private key". A file will be downloaded and saved to your machine. From there, you will either import the file and initialize admin like this:
var serviceAccount = require("/path/to/serviceAccountKey.json");
admin.initializeApp({
credential: admin.credential.cert(serviceAccount)
});
or better yet, in the console
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/serviceAccountKey.json"
and in your file
const credential = admin.credential.applicationDefault() // this will automatically find your env variable
admin.initializeApp({
credential
})
Note: don't forget to set the proper permissions to the private key file with chmod (I had to).
Node.js:
firebase
.auth()
.signInWithEmailAndPassword(user.email, user.password)
.then((data) => {
admin
.auth()
.createCustomToken(data.user.uid)
.then((customToken) => {
console.log('customToken', customToken)
// send token back to client here
res.send(customToken)
})
.catch((error) => {
console.log('Error creating custom token:', error);
});
})
.catch((err) => {
console.error(err);
});
and then on the client:
// get the token and then
firebase.auth().signInWithCustomToken(token)
.then((userCredential) => {
// Signed in
var user = userCredential.user;
})
.catch((error) => {
var errorCode = error.code;
var errorMessage = error.message;
});
Another solution would be not using firebase on the client at all and use the backend as some sort of proxy between firebase and the client. The client would make a request to the backend which in turn would interrogate firebase.
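For illustration, a rough sketch of that proxy idea (the /users route and collection name are just examples): the client sends its Firebase ID token, and the server verifies it with the Admin SDK before querying Firestore on its behalf.
const express = require('express');
const admin = require('firebase-admin');

admin.initializeApp();
const app = express();

app.get('/users', async (req, res) => {
  try {
    const idToken = (req.headers.authorization || '').replace('Bearer ', '');
    await admin.auth().verifyIdToken(idToken); // reject unauthenticated callers
    const snapshot = await admin.firestore().collection('users').get();
    res.json(snapshot.docs.map(doc => ({ id: doc.id, ...doc.data() })));
  } catch (err) {
    res.status(401).json({ error: 'Unauthorized' });
  }
});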
In my case, I faced the same problem with the rules.
Please give it a try: change your rule and check whether it's working or not.
After uploading a file in Firebase Storage with Functions for Firebase, I'd like to get the download url of the file.
I have this :
...
return bucket
.upload(fromFilePath, {destination: toFilePath})
.then((err, file) => {
// Get the download url of file
});
The object file has a lot of parameters. Even one named mediaLink. However, if I try to access this link, I get this error :
Anonymous users does not have storage.objects.get access to object ...
Can somebody tell me how to get the public download Url?
Thank you
You'll need to generate a signed URL using getSignedURL via the #google-cloud/storage NPM module.
Example:
const gcs = require('#google-cloud/storage')({keyFilename: 'service-account.json'});
// ...
const bucket = gcs.bucket(bucket);
const file = bucket.file(fileName);
return file.getSignedUrl({
action: 'read',
expires: '03-09-2491'
}).then(signedUrls => {
// signedUrls[0] contains the file's public URL
});
You'll need to initialize #google-cloud/storage with your service account credentials as the application default credentials will not be sufficient.
UPDATE: The Cloud Storage SDK can now be accessed via the Firebase Admin SDK, which acts as a wrapper around #google-cloud/storage. The only way it will is if you either:
Init the SDK with a special service account, typically through a second, non-default instance.
Or, without a service account, by giving the default App Engine service account the "signBlob" permission.
This answer will summarize the options for getting a download URL when uploading a file to Google/Firebase Cloud Storage. There are three types of download URLS:
Token download URLs, which are persistent and have security features
Signed download URLs, which are temporary and have security features
Public download URLs, which are persistent and lack security
There are two ways to get a token download URL. Signed and public download URLs each have only one way to get them.
Token URL method #1: From the Firebase Storage Console
You can get the download URL from Firebase Storage console:
The download URL looks like this:
https://firebasestorage.googleapis.com/v0/b/languagetwo-cd94d.appspot.com/o/Audio%2FEnglish%2FUnited_States-OED-0%2Fabout.mp3?alt=media&token=489c48b3-23fb-4270-bd85-0a328d2808e5
The first part is a standard path to your file. At the end is the token. This download URL is permanent, i.e., it won't expire, although you can revoke it.
Token URL method #2: From the Front End
The documentation tells us to use getDownloadURL():
let url = await firebase.storage().ref('Audio/English/United_States-OED-' + i +'/' + $scope.word.word + ".mp3").getDownloadURL();
This gets the same download URL that you can get from your Firebase Storage console. This method is easy but requires that you know the path to your file, which in my app is difficult. You could upload files from the front end, but this would expose your credentials to anyone who downloads your app. So for most projects you'll want to upload your files from your Cloud Functions, then get the download URL and save it to your database along with other data about your file.
I can't find a way to get the token download URL when I write a file to Storage from a Cloud Function (because I can't find a way to tell the front end that a file has written to Storage), but what works for me is to write a file to a publicly available URL, write the publicly available URL to Firebase, then when my Angular front end gets the download URL from Firebase it also runs getDownloadURL(), which has the token, then compares the download URL in Firestore to the token download URL, and if they don't match then it updates the token download URL in place of the publicly available URL in Firestore. This exposes your file to the public only once.
This is easier than it sounds. The following code iterates through an array of Storage download URLs and replaces publicly available download URLs with token download URLs.
const storage = getStorage();
var audioFiles: string[] = [];
if (this.pronunciationArray[0].pronunciation != undefined) {
for (const audioFile of this.pronunciationArray[0].audioFiles) { // for each audio file in array
let url = await getDownloadURL(ref(storage, audioFile)); // get the download url with token
if (audioFile !== url) { // download URLs don't match
audioFiles.push(url);
} // end inner if
}; // end for of loop
if (audioFiles.length > 0) { // update Firestore only if we have new download URLs
await updateDoc(doc(this.firestore, 'Dictionaries/' + this.l2Language.long + '/Words/' + word + '/Pronunciations/', this.pronunciationArray[0].pronunciation), {
audioFiles: audioFiles
});
}
} // end outer if
You're thinking, "I'll return the Storage location from my Cloud Function to my front end and then use the location with getDownloadURL() to write the token download URL to Firestore." That won't work because Cloud Functions can only return synchronous results. Async operations return null.
"No problem," you say. "I'll set up an Observer on Storage, get the location from the Observer, and then use the location with getDownloadURL() to write the token download URL to Firestore." No dice. Firestore has Observers. Storage doesn't have Observers.
"How about," you say, "calling listAll() from my front end, getting a list of all my Storage files, then calling the metadata for each file, and extracting the download URL and token for each file, and then writing these to Firestore?" Good try, but Storage metadata doesn't include the download URL or token.
Signed URL method #1: getSignedUrl() for Temporary Download URLs
getSignedUrl() is easy to use from a Cloud Function:
function oedPromise() {
return new Promise(function(resolve, reject) {
http.get(oedAudioURL, function(response) {
response.pipe(file.createWriteStream(options))
.on('error', function(error) {
console.error(error);
reject(error);
})
.on('finish', function() {
file.getSignedUrl(config, function(err, url) {
if (err) {
console.error(err);
return;
} else {
resolve(url);
}
});
});
});
});
}
A signed download URL looks like this:
https://storage.googleapis.com/languagetwo-cd94d.appspot.com/Audio%2FSpanish%2FLatin_America-Sofia-Female-IBM%2Faqu%C3%AD.mp3?GoogleAccessId=languagetwo-cd94d%40appspot.gserviceaccount.com&Expires=4711305600&Signature=WUmABCZIlUp6eg7dKaBFycuO%2Baz5vOGTl29Je%2BNpselq8JSl7%2BIGG1LnCl0AlrHpxVZLxhk0iiqIejj4Qa6pSMx%2FhuBfZLT2Z%2FQhIzEAoyiZFn8xy%2FrhtymjDcpbDKGZYjmWNONFezMgYekNYHi05EPMoHtiUDsP47xHm3XwW9BcbuW6DaWh2UKrCxERy6cJTJ01H9NK1wCUZSMT0%2BUeNpwTvbRwc4aIqSD3UbXSMQlFMxxWbPvf%2B8Q0nEcaAB1qMKwNhw1ofAxSSaJvUdXeLFNVxsjm2V9HX4Y7OIuWwAxtGedLhgSleOP4ErByvGQCZsoO4nljjF97veil62ilaQ%3D%3D
The signed URL has an expiration date and long signature. The documentation for the command line gsutil signurl -d says that signed URLs are temporary: the default expiration is one hour and the maximum expiration is seven days.
I'm going to rant here that the getSignedUrl documentation never says that your signed URL will expire in a week. The documentation code has 3-17-2025 as the expiration date, suggesting that you can set the expiration years in the future. My app worked perfectly, and then crashed a week later. The error message said that the signatures didn't match, not that the download URL had expired. I made various changes to my code, and everything worked...until it all crashed a week later. This went on for more than a month of frustration. Is the 3-17-2025 date an inside joke? Like the gold coins of a leprechaun that vanish when the leprechaun is out of sight, a St. Patrick's Day expiry date years in the future vanishes in two weeks, just when you thought your code was bug-free.
Public URL #1: Make Your File Publicly Available
You can set the permissions on your file to public read, as explained in the documentation. This can be done from the Cloud Storage Browser or from your Node server. You can make one file public or a directory or your entire Storage database. Here's the Node code:
var webmPromise = new Promise(function(resolve, reject) {
var options = {
destination: ('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.mp3'),
predefinedAcl: 'publicRead',
contentType: 'audio/' + audioType,
};
synthesizeParams.accept = 'audio/webm';
var file = bucket.file('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm');
textToSpeech.synthesize(synthesizeParams)
.then(function(audio) {
audio.pipe(file.createWriteStream(options));
})
.then(function() {
console.log("webm audio file written.");
resolve();
})
.catch(error => console.error(error));
});
The result will look like this in your Cloud Storage Browser:
Anyone can then use the standard path to download your file:
https://storage.googleapis.com/languagetwo-cd94d.appspot.com/Audio/English/United_States-OED-0/system.mp3
Another way to make a file public is to use the method makePublic(). I haven't been able to get this to work, it's tricky to get the bucket and file paths right.
An interesting alternative is to use Access Control Lists. You can make a file available only to users whom you put on a list, or use authenticatedRead to make the file available to anyone who is logged in from a Google account. If there were an option "anyone who logged into my app using Firebase Auth" I would use this, as it would limit access to only my users.
Deprecated: Build Your Own Download URL with firebaseStorageDownloadTokens
Several answers describe an undocumented Google Storage object property firebaseStorageDownloadTokens. This was never an official Google Cloud Storage feature and it no longer works. Here's how it used to work.
You told Storage the token you wanted to use. You then generated a token with the uuid Node module. Four lines of code and you could build your own download URL, the same download URL you get from the console or getDownloadURL(). The four lines of code are:
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
metadata: { firebaseStorageDownloadTokens: uuid }
https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm') + "?alt=media&token=" + uuid);
Here's the code in context:
var webmPromise = new Promise(function(resolve, reject) {
var options = {
destination: ('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.mp3'),
contentType: 'audio/' + audioType,
metadata: {
metadata: {
firebaseStorageDownloadTokens: uuid,
}
}
};
synthesizeParams.accept = 'audio/webm';
var file = bucket.file('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm');
textToSpeech.synthesize(synthesizeParams)
.then(function(audio) {
audio.pipe(file.createWriteStream(options));
})
.then(function() {
resolve("https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.webm') + "?alt=media&token=" + uuid);
})
.catch(error => console.error(error));
});
That's not a typo--you have to nest firebaseStorageDownloadTokens in double layers of metadata:!
Here's an example on how to specify the download token on upload:
const UUID = require("uuid-v4");
const fbId = "<YOUR APP ID>";
const fbKeyFile = "./YOUR_AUTH_FIlE.json";
const gcs = require('#google-cloud/storage')({keyFilename: fbKeyFile});
const bucket = gcs.bucket(`${fbId}.appspot.com`);
var upload = (localFile, remoteFile) => {
let uuid = UUID();
return bucket.upload(localFile, {
destination: remoteFile,
uploadType: "media",
metadata: {
contentType: 'image/png',
metadata: {
firebaseStorageDownloadTokens: uuid
}
}
})
.then((data) => {
let file = data[0];
return Promise.resolve("https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent(file.name) + "?alt=media&token=" + uuid);
});
}
then call with
upload(localPath, remotePath).then( downloadURL => {
console.log(downloadURL);
});
The key thing here is that there is a metadata object nested within the metadata option property. Setting firebaseStorageDownloadTokens to a uuid-v4 value will tell Cloud Storage to use that as its public auth token.
Many thanks to #martemorfosis
If you're working on a Firebase project, you can create signed URLs in a Cloud Function without including other libraries or downloading a credentials file. You just need to enable the IAM API and add a role to your existing service account (see below).
Initialize the admin library and get a file reference as your normally would:
import * as functions from 'firebase-functions'
import * as admin from 'firebase-admin'
admin.initializeApp(functions.config().firebase)
const myFile = admin.storage().bucket().file('path/to/my/file')
You then generate a signed URL with
myFile.getSignedUrl({action: 'read', expires: someDateObj}).then(urls => {
const signedUrl = urls[0]
})
Make sure your Firebase service account has sufficient permissions to run this
Go to the Google API console and enable the IAM API (https://console.developers.google.com/apis/api/iam.googleapis.com/overview)
Still in the API console, go to the main menu, "IAM & admin" -> "IAM"
Click edit for the "App Engine default service account" role
Click "Add another role", and add the one called "Service Account Token Creator"
Save and wait a minute for the changes to propagate
With a vanilla Firebase config, the first time you run the above code you'll get an error Identity and Access Management (IAM) API has not been used in project XXXXXX before or it is disabled.. If you follow the link in the error message and enable the IAM API, you'll get another error: Permission iam.serviceAccounts.signBlob is required to perform this operation on service account my-service-account. Adding the Token Creator role fixes this second permission issue.
You should avoid harcoding URL prefix in your code, especially when there are alternatives. I suggest using the option predefinedAcl: 'publicRead' when uploading a file with Cloud Storage NodeJS 1.6.x or +:
const options = {
destination: yourFileDestination,
predefinedAcl: 'publicRead'
};
bucket.upload(attachment, options);
Then, getting the public URL is as simple as:
bucket.upload(attachment, options).then(result => {
const file = result[0];
return file.getMetadata();
}).then(results => {
const metadata = results[0];
console.log('metadata=', metadata.mediaLink);
}).catch(error => {
console.error(error);
});
This is what I currently use, it's simple and it works flawlessly.
You don't need to do anything with Google Cloud. It works out of the box with Firebase..
// Save the base64 to storage.
const file = admin.storage().bucket('url found on the storage part of firebase').file(`profile_photos/${uid}`);
await file.save(base64Image, {
metadata: {
contentType: 'image/jpeg',
},
predefinedAcl: 'publicRead'
});
const metaData = await file.getMetadata()
const url = metaData[0].mediaLink
EDIT:
Same example, but with upload:
await bucket.upload(fromFilePath, {destination: toFilePath});
file = bucket.file(toFilePath);
metaData = await file.getMetadata()
const trimUrl = metaData[0].mediaLink
#update:
no need to make two different call in upload method to get the metadata:
let file = await bucket.upload(fromFilePath, {destination: toFilePath});
const trimUrl = file[0].metaData.mediaLink
With the recent changes in the functions object response you can get everything you need to "stitch" together the download URL like so:
const img_url = 'https://firebasestorage.googleapis.com/v0/b/[YOUR BUCKET]/o/'
+ encodeURIComponent(object.name)
+ '?alt=media&token='
+ object.metadata.firebaseStorageDownloadTokens;
console.log('URL',img_url);
For those wondering where the Firebase Admin SDK serviceAccountKey.json file should go. Just place it in the functions folder and deploy as usual.
It still baffles me why we can't just get the download url from the metadata like we do in the Javascript SDK. Generating a url that will eventually expire and saving it in the database is not desirable.
One method I'm using with success is to set a UUID v4 value to a key named firebaseStorageDownloadTokens in the metadata of the file after it finishes uploading and then assemble the download URL myself following the structure Firebase uses to generate these URLs, eg:
https://firebasestorage.googleapis.com/v0/b/[BUCKET_NAME]/o/[FILE_PATH]?alt=media&token=[THE_TOKEN_YOU_CREATED]
I don't know how much "safe" is to use this method (given that Firebase could change how it generates the download URLs in the future ) but it is easy to implement.
Sorry but i can't post a comment to your question above because of missing reputation, so I will include it in this answer.
Do as stated above by generating a signed Url, but instead of using the service-account.json I think you have to use the serviceAccountKey.json which you can generate at (replace YOURPROJECTID accordingly)
https://console.firebase.google.com/project/YOURPROJECTID/settings/serviceaccounts/adminsdk
Example:
const gcs = require('#google-cloud/storage')({keyFilename: 'serviceAccountKey.json'});
// ...
const bucket = gcs.bucket(bucket);
// ...
return bucket.upload(tempLocalFile, {
destination: filePath,
metadata: {
contentType: 'image/jpeg'
}
})
.then((data) => {
let file = data[0]
file.getSignedUrl({
action: 'read',
expires: '03-17-2025'
}, function(err, url) {
if (err) {
console.error(err);
return;
}
// handle url
})
I can't comment on the answer James Daniels gave, but I think this is very Important to read.
Giving out a signed URL Like he did seems for many cases pretty bad and possible Dangerous.
According to the documentation of Firebase the signed url expires after some time, so adding that to your databse will lead to a empty url after a certain timeframe
It may be that misunderstood the Documentation there and the signed url doesn't expire, which would have some security issues as a result.
The Key seems to be the same for every uploaded file. This means once you got the url of one file, someone could easily access files that he is not suposed to access, just by knowing their names.
If i missunderstood that then i would lvoe to be corrected.
Else someone should probably Update the above named solution.
If i may be wrong there
If you use the predefined access control lists value of 'publicRead', you can upload the file and access it with a very simple url structure:
// Upload to GCS
const opts: UploadOptions = {
gzip: true,
destination: dest, // 'someFolder/image.jpg'
predefinedAcl: 'publicRead',
public: true
};
return bucket.upload(imagePath, opts);
You can then construct the url like so:
const storageRoot = 'https://storage.googleapis.com/';
const bucketName = 'myapp.appspot.com/'; // CHANGE TO YOUR BUCKET NAME
const downloadUrl = storageRoot + bucketName + encodeURIComponent(dest);
Use file.publicUrl()
Async/Await
const bucket = storage.bucket('bucket-name');
const uploadResponse = await bucket.upload('image-name.jpg');
const downloadUrl = uploadResponse[0].publicUrl();
Callback
const bucket = storage.bucket('bucket-name');
bucket.upload('image-name.jpg', (err, file) => {
if(!file) {
throw err;
}
const downloadUrl = file.publicUrl();
})
The downloadUrl will be "https://storage.googleapis.com/bucket-name/image-name.jpg".
Please note that in order for the above code to work, you have to make the bucket or file public. To do so, follow the instructions here https://cloud.google.com/storage/docs/access-control/making-data-public. Also, I imported the #google-cloud/storage package directly not through the Firebase SDK.
I had the same issue, however, I was looking at the code of the firebase function example instead of the README. And Answers on this thread didn't help either...
You can avoid passing the config file by doing the following:
Go to your project's Cloud Console > IAM & admin > IAM, Find the App
Engine default service account and add the Service Account Token
Creator role to that member. This will allow your app to create signed
public URLs to the images.
source: Automatically Generate Thumbnails function README
Your role for app engine should look like this:
This works if you just need a public file with a simple URL. Note that this may overrule your Firebase storage rules.
bucket.upload(file, function(err, file) {
if (!err) {
//Make the file public
file.acl.add({
entity: 'allUsers',
role: gcs.acl.READER_ROLE
}, function(err, aclObject) {
if (!err) {
var URL = "https://storage.googleapis.com/[your bucket name]/" + file.id;
console.log(URL);
} else {
console.log("Failed to set permissions: " + err);
}
});
} else {
console.log("Upload failed: " + err);
}
});
Without signedURL() using makePublic()
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp()
var bucket = admin.storage().bucket();
// --- [Above] for admin related operations, [Below] for making a public url from a GCS uploaded object
const { Storage } = require('#google-cloud/storage');
const storage = new Storage();
exports.testDlUrl = functions.storage.object().onFinalize(async (objMetadata) => {
console.log('bucket, file', objMetadata.bucket + ' ' + objMetadata.name.split('/').pop()); // assuming file is in folder
return storage.bucket(objMetadata.bucket).file(objMetadata.name).makePublic().then(function (data) {
return admin.firestore().collection('publicUrl').doc().set({ publicUrl: 'https://storage.googleapis.com/' + objMetadata.bucket + '/' + objMetadata.name }).then(writeResult => {
return console.log('publicUrl', writeResult);
});
});
});
answer by https://stackoverflow.com/users/269447/laurent works best
const uploadOptions: UploadOptions = {
public: true
};
const bucket = admin.storage().bucket();
[ffile] = await bucket.upload(oPath, uploadOptions);
ffile.metadata.mediaLink // this is what you need
I saw this on the admin storage doc
const options = {
version: 'v4',
action: 'read',
expires: Date.now() + 15 * 60 * 1000, // 15 minutes
};
// Get a v4 signed URL for reading the file
const [url] = await storage
.bucket(bucketName)
.file(filename)
.getSignedUrl(options);
console.log('Generated GET signed URL:');
console.log(url);
console.log('You can use this URL with any user agent, for example:');
console.log(`curl '${url}'`);
For those who are using Firebase SDK andadmin.initializeApp:
1 - Generate a Private Key and place in /functions folder.
2 - Configure your code as follows:
const serviceAccount = require('../../serviceAccountKey.json');
try { admin.initializeApp(Object.assign(functions.config().firebase, { credential: admin.credential.cert(serviceAccount) })); } catch (e) {}
Documentation
The try/catch is there because I'm using an index.js that imports other files and creates one function per file. If you're using a single index.js file with all functions, you should be fine with admin.initializeApp(Object.assign(functions.config().firebase, { credential: admin.credential.cert(serviceAccount) }));.
As of firebase 6.0.0 I was able to access storage directly with the Admin SDK like this:
const bucket = admin.storage().bucket();
So I didn't need to add a service account. Then setting the UUID as referenced above worked for getting the Firebase download URL.
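For anyone wondering what "setting the UUID" looks like in code, here is a rough sketch under those assumptions (localFilePath and destination are placeholders; the URL format is the token download URL shown elsewhere in this thread):
// Sketch only: upload a local file with a download token in its metadata.
const { v4: uuidv4 } = require('uuid');
const admin = require('firebase-admin');

async function uploadWithDownloadToken(localFilePath, destination) {
  const token = uuidv4();
  const bucket = admin.storage().bucket();
  await bucket.upload(localFilePath, {
    destination,
    metadata: { metadata: { firebaseStorageDownloadTokens: token } },
  });
  // Build the token download URL by hand
  return 'https://firebasestorage.googleapis.com/v0/b/' + bucket.name +
    '/o/' + encodeURIComponent(destination) + '?alt=media&token=' + token;
}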
This is the best I came up with. It is redundant, but it is the only reasonable solution that worked for me.
await bucket.upload(localFilePath, {destination: uploadPath, public: true});
const f = bucket.file(uploadPath); // bucket.file() is synchronous, no await needed
const meta = await f.getMetadata();
console.log(meta[0].mediaLink);
I already posted my answer at the URL below, where you can get the full code with the solution:
How do I upload a base64 encoded image (string) directly to a Google Cloud Storage bucket using Node.js?
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
const express = require('express');               // used below for the app
const bodyParser = require('body-parser');        // used below for app.use(...)
const functions = require('firebase-functions');  // used below for the https trigger
const os = require('os');
const path = require('path');
const cors = require('cors')({ origin: true });
const Busboy = require('busboy');
const fs = require('fs');
var admin = require("firebase-admin");
var serviceAccount = {
"type": "service_account",
"project_id": "xxxxxx",
"private_key_id": "xxxxxx",
"private_key": "-----BEGIN PRIVATE KEY-----\jr5x+4AvctKLonBafg\nElTg3Cj7pAEbUfIO9I44zZ8=\n-----END PRIVATE KEY-----\n",
"client_email": "xxxx#xxxx.iam.gserviceaccount.com",
"client_id": "xxxxxxxx",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/firebase-adminsdk-5rmdm%40xxxxx.iam.gserviceaccount.com"
}
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
storageBucket: "xxxxx-xxxx" // use your storage bucket name
});
const app = express();
app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());
app.post('/uploadFile', (req, response) => {
response.set('Access-Control-Allow-Origin', '*');
const busboy = new Busboy({ headers: req.headers })
let uploadData = null
busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
const filepath = path.join(os.tmpdir(), filename)
uploadData = { file: filepath, type: mimetype }
console.log("-------------->>",filepath)
file.pipe(fs.createWriteStream(filepath))
})
busboy.on('finish', () => {
const bucket = admin.storage().bucket();
bucket.upload(uploadData.file, {
uploadType: 'media',
metadata: {
metadata: { firebaseStorageDownloadTokens: uuid,
contentType: uploadData.type,
},
},
})
.then(() => {
// Respond on success so the request does not hang
response.status(200).json({ token: uuid });
})
.catch(err => {
response.status(500).json({
error: err,
})
})
})
busboy.end(req.rawBody)
});
exports.widgets = functions.https.onRequest(app);
For those trying to use the token parameter to share the file and who would like to use the gsutil command, here is how I did it:
First you need to authenticate by running gcloud auth login.
Then run:
gsutil setmeta -h "x-goog-meta-firebaseStorageDownloadTokens:$FILE_TOKEN" gs://$FIREBASE_REPO/$FILE_NAME
Then you can download the file with the following link (if $FILE_NAME contains slashes, they must be URL-encoded as %2F):
https://firebasestorage.googleapis.com/v0/b/$FIREBASE_REPO/o/$FILE_NAME?alt=media&token=$FILE_TOKEN
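If you would rather set that metadata from Node instead of from the shell, the equivalent call is roughly the following sketch (the FIREBASE_REPO, FILE_NAME and FILE_TOKEN names mirror the shell variables above):
// Node equivalent of the gsutil setmeta command above (sketch only).
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function setDownloadToken(FIREBASE_REPO, FILE_NAME, FILE_TOKEN) {
  await storage
    .bucket(FIREBASE_REPO)
    .file(FILE_NAME)
    .setMetadata({ metadata: { firebaseStorageDownloadTokens: FILE_TOKEN } });
}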
From the Admin SDKs you cannot retrieve the download token that Firebase generates for an uploaded file, but you can set that token yourself when uploading by adding it to the metadata.
For those who are working with the Python SDK, this is the way to do it:
from firebase_admin import storage
from uuid import uuid4
bucket = storage.bucket()
blob = bucket.blob(path_to_file)
token = str(uuid4()) # Random ID
blob.metadata = {
"firebaseStorageDownloadTokens": token
}
blob.upload_from_file(file)
You have now uploaded the file and have its URL token. You could save the token (or even the full download URL) in your database (e.g. Firestore), send it to the client when the file is requested, and then let the client retrieve the file itself.
The full download URL looks like this:
https://firebasestorage.googleapis.com/v0/b/{bucket_name}/o/{file_name}?alt=media&token={token}
tl;dr: uploading a blob and getting the image URL:
const file = storage.bucket().file(`images/${imageName}.jpeg`)
await file.save(image)
const imgUrl = file.metadata.mediaLink
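One caveat: mediaLink points at an authenticated storage.googleapis.com endpoint, so anonymous users can only download it if the object itself is readable by them. A minimal way to do that, assuming the same file object as above:
// Optional: make just this object public so anonymous users can fetch mediaLink
await file.makePublic();
const imgUrl = file.metadata.mediaLink;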
If you are getting the error "Google Cloud Functions: require(…) is not a function", try this:
const {Storage} = require('#google-cloud/storage');
const storage = new Storage({keyFilename: 'service-account-key.json'});
const bucket = storage.bucket(object.bucket);
const file = bucket.file(filePath);
.....
I am trying to implement a Cloud Function, but I get an error when deploying if I require the library like this:
var storage =require('#google-cloud/storage')();
It deploys if I require it like this instead:
var storage = require('#google-cloud/storage');
So I resolved to using it that way, but when I try uploading a picture I get the error "TypeError: gcs.bucket is not a function".
const os = require('os');
const path = require('path');
///
exports.onFileChange = functions.storage.object().onFinalize((event) => {
const bucket = event.bucket;
const contentType = event.contentType;
const filePath = event.name;
console.log('Changes made to bucket');
///
if(path.basename(filePath).startsWith('renamed-')){
console.log("File was previously renamed");
return;
}
const gcs = storage({
projectId: 'clfapi'
});
///
const destBucket = gcs.bucket(bucket);
const tmFiilePath = path.join(os.tmpdir(), path.basename(filePath));
const metadata = {contentType: contentType};
///
return destBucket.file(filePath).download({
destination: tmFiilePath
}).then(() => {
return destBucket.upload(tmFiilePath, {
destination: 'renamed-' + path.basename(filePath),
metadata: metadata
})
});
});
The API changed in version 2.x of the Cloud Storage node SDK. According to the documentation, you import the SDK like this:
// Imports the Google Cloud client library
const {Storage} = require('#google-cloud/storage');
Then you can create a new Storage object:
// Creates a client
const storage = new Storage();
Then you can reach into a bucket (with @google-cloud/storage a bucket name is required, unlike admin.storage().bucket()):
const bucket = storage.bucket('your-bucket-name')
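Applied to the snippet from the question above, the require/instantiation part would become roughly this (a sketch; 'clfapi' is the project ID from the question):
// 2.x-style client creation, replacing storage({ projectId: 'clfapi' })
const { Storage } = require('@google-cloud/storage');
const gcs = new Storage({ projectId: 'clfapi' });

// ...inside the onFinalize handler:
const destBucket = gcs.bucket(event.bucket);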
I have some trouble with Cloud Functions and Firestore rules.
I would like to use a Cloud Function with limited privileges on Firestore, so that it only has the access defined in the Security Rules.
This works without problems on the RTDB but not on Firestore.
I have tried with these rules:
service cloud.firestore {
match /databases/{database}/documents {
match /init/{ID=**} {
allow read, write: if true;
}
match /test/{ID=**} {
allow read, write: if false;
}
}
}
And this
const admin = require('firebase-admin');
const functions = require('firebase-functions');
const FieldValue = require('firebase-admin').firestore.FieldValue;
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
databaseURL: 'https://******.firebaseio.com',
databaseAuthVariableOverride: {
uid: 'my-worker',
},
});
const db = admin.firestore();
exports.onTestRights = functions.firestore
.document('init/{initID}')
.onCreate((event) => {
const initID = event.params.initID;
return db.collection('test').doc(initID).set({'random key': 'random value'}).then(()=>{
console.log('working');
return;
}).catch((err) =>{
console.log('error: ', err);
return;
});
});
But the write still succeeds, whereas it should be "permission denied".
Does anyone know if this is normal (or not yet implemented) for Firestore, or have I misunderstood something?
Edit:
Of course my final goal is not these exact rules, but only to give read/write access on some documents/collections using (allow read, write: if request.auth.uid == 'my-worker';)
Edit2:
I would like to use the security rules to check, like a transaction, that nothing has changed during the process, using this model
As you've noticed databaseAuthVariableOverride only works for the Realtime Database. There is nothing right now that allows you to do the same for Firestore in the Admin SDK.
One workaround you could use, if you want to limit the access rights of your server code, is to use the Client JS SDK rather than the Firebase Admin SDK and sign the user in using a custom token. Here is some sample code to do this:
// Configure Firebase Client SDK.
const firebase = require('firebase/app');
require('firebase/auth');
require('firebase/firestore');
firebase.initializeApp({
// ... Initialization settings for web apps. You get this from your Firebase console > Add Firebase to your web app
});
// Configure Firebase Admin SDK.
const admin = require('firebase-admin');
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
});
// Create Custom Auth token for 'my-worker'.
const firebaseReady = admin.auth().createCustomToken('my-worker').then(token => {
// Sign in the Client SDK as 'my-worker'
return firebase.auth().signInWithCustomToken(token).then(user => {
console.log('User now signed-in! uid:', user.uid);
return firebase.firestore();
});
});
// Now firebaseReady gives you a Promise that completes with an authorized Firestore instance. Use it like this:
exports.onTestRights = functions.firestore
.document('init/{initID}')
.onCreate(event => {
const initID = event.params.initID;
return firebaseReady.then(db => db.collection('test').doc(initID).set({'random key': 'random value'}).then(() => {
console.log('working');
return;
}).catch((err) => {
console.log('error: ', err);
return;
})
);
});