Getting firebase storage image 403 - javascript

I'm using Next.js and am struggling to get Firebase Storage images to render on screen; they keep coming back as 403 errors.
I upload an image to Storage when adding a document to Firestore (with the URL nested in the document), then use that URL to fetch the image.
This is my Next Image tag, where recipe.main_image.url == https://storage.googleapis.com/chairy-cooks.appspot.com/recipes/0O3Oycow4lJvyhDbwN0j/main_image/190524-classic-american-cheeseburger-ew-207p.png and recipe.main_image.name == the name of the photo in Firebase Storage.
<Image
  alt={recipe.main_image.name}
  src={recipe.main_image.url}
  width={650}
  height={650}
/>
The alt tag comes back as the right thing (the image name in Firebase Storage), so I know the image is getting sent back, but for some reason I'm unauthorised. I've tried multiple fixes, including changing my Firebase rules to
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read, write: if request.auth != null;
    }
  }
}
AND
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read, write;
    }
  }
}
But neither changes the outcome. I've also set up my Next config so Firebase Storage is an allowed image domain:
images: {
  domains: ['images.ctfassets.net', 'storage.googleapis.com'],
},

If the images will be served to the public, then I would suggest granting public access to them. Cloud Storage for Firebase is backed by Google Cloud Storage, so you can make individual objects publicly readable or make the whole bucket publicly readable.
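For instance, with the Cloud Storage Node.js client you can flip a single object to public. A minimal sketch, reusing the bucket and object path from the question (the function name is illustrative):

// Minimal sketch: make one uploaded image publicly readable.
// The bucket and file path are taken from the question above.
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

async function makeImagePublic() {
  await storage
    .bucket('chairy-cooks.appspot.com')
    .file('recipes/0O3Oycow4lJvyhDbwN0j/main_image/190524-classic-american-cheeseburger-ew-207p.png')
    .makePublic(); // grants allUsers read access to this object only
}

makeImagePublic().catch(console.error);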
Or, better yet, generate signed URLs; this way you can grant limited-time access to an object. If you're working with JavaScript, you may take a look at this sample code on how to generate a signed URL.
/**
 * TODO(developer): Uncomment the following lines before running the sample.
 * Note: when creating a signed URL, unless running in a GCP environment,
 * a service account must be used for authorization.
 */
// The ID of your GCS bucket
// const bucketName = 'your-unique-bucket-name';
// The full path of your file inside the GCS bucket, e.g. 'yourFile.jpg' or 'folder1/folder2/yourFile.jpg'
// const fileName = 'your-file-name';

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function generateV4ReadSignedUrl() {
  // These options will allow temporary read access to the file
  const options = {
    version: 'v4',
    action: 'read',
    expires: Date.now() + 15 * 60 * 1000, // 15 minutes
  };

  // Get a v4 signed URL for reading the file
  const [url] = await storage
    .bucket(bucketName)
    .file(fileName)
    .getSignedUrl(options);

  console.log('Generated GET signed URL:');
  console.log(url);
  console.log('You can use this URL with any user agent, for example:');
  console.log(`curl '${url}'`);
}

generateV4ReadSignedUrl().catch(console.error);
You may visit this documentation for more information.
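One hedged usage note: a signed URL produced this way is still served from storage.googleapis.com, which is already whitelisted in the Next config shown in the question, so it can be passed straight to the Image tag (signedUrl below is assumed to be the value returned by getSignedUrl, e.g. fetched server-side and passed down as a prop):

<Image
  alt={recipe.main_image.name}
  src={signedUrl} // hypothetical prop holding the signed URL
  width={650}
  height={650}
/>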

Related

Azure Blob Storage 403 Authentication Failed Due To Authorization Header

Problem
I have uploaded a set of images (blobs) to a private Azure Blob Storage account, but when I try to access them, I am faced with the following error.
GET https://<account-name>.blob.core.windows.net/<container-name>/<blob-name> 403 (Server failed
to authenticate the request. Make sure the value of Authorization header is formed correctly
including the signature.)
I don't have any problems uploading this data as this is done through the server-side using a Django app. I wish to be able to successfully retrieve this uploaded blob data using client-side JavaScript.
Background
I have thoroughly read through and implemented the steps from the Microsoft Azure documentation for authorizing access to my private account via the use of Shared Keys. This includes everything from constructing my signature string to hashing this data using the HMAC SHA-256 algorithm, as detailed in the link above.
I am running everything on Docker containers except for the client-side Vue-based interface which is attempting to invoke the Get Blob API endpoint, as you will see below.
Minimum Reproducible Example
The code that raises this error is as follows:
// Add imports
const crypto = require('crypto');
const axios = require('axios');

// Set Azure blob storage data
const account = "<azure-blob-storage-private-account-name>";
const version = "2020-04-08";
const blob = "<blob-name>";
const container = "<container-name>";
const blob_uri = `https://${account}.blob.core.windows.net/${container}/${blob}`;
const today = new Date().toGMTString();

// Construct signature string
const CanonicalisedHeaders = `x-ms-date:${today}\nx-ms-version:${version}\n`;
const CanonicalisedResource = `/${account}/${container}/${blob}`;
const StringToSign = `GET\n\n\n\n\n\n\n\n\n\n\n\n` + CanonicalisedHeaders + CanonicalisedResource;

// Hash string using HMAC SHA-256 and encode to base64
const key = "<shared-access-key-in-base64>";
const utf8encoded = Buffer.from(key, 'base64').toString('utf8');
const signature = crypto.createHmac('sha256', utf8encoded).update(StringToSign).digest("base64");

// Construct the headers and invoke the API call
const blob_config = {
  headers: {
    "Authorization": `SharedKey ${account}:${signature}`,
    "x-ms-date": today,
    "x-ms-version": version
  }
};

await axios.get(blob_uri, blob_config)
  .then((data) => console.log(data))
  .catch((error) => console.log(error.message));
What I have tried
I have tried the following, but none of them have helped me resolve the issue at hand.
Updated CORS settings to avoid CORS-related 403 Forbidden Access issues.
Regenerated my key and connection strings.
Checked the DateTime settings on my local machine and on my Docker containers to ensure they are on the correct GMT time.
Checked that my signature string's components (canonicalized headers, resources, etc.) are constructed according to the rules defined here.
Read through similar StackOverflow and Azure forum posts in search of a solution.
Please try by changing the following lines of code:
const utf8encoded = Buffer.from(key, 'base64').toString('utf8');
const signature = crypto.createHmac('sha256', utf8encoded).update(StringToSign).digest("base64");
to
const keyBuffer = Buffer.from(key, 'base64');
const signature = crypto.createHmac('sha256', keyBuffer).update(StringToSign).digest("base64");
I don't think you need to convert the key buffer to a UTF8 encoded string.
A few other things:
Considering you're using it in the browser, there's a massive security risk as you're exposing your storage keys to your users.
Is there a reason you're using REST API directly instead of using Azure Storage Blob SDK?
In browser-based environments, you should be using Shared Access Signature (SAS) based authorization instead of Shared Key based authorization, as sketched below.
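To illustrate the SAS approach, here is a hedged sketch (not your exact setup): it assumes a SAS token minted server-side and the @azure/storage-blob SDK, with account, container, blob, and sasToken as placeholders:

// Hedged sketch: read a private blob in the browser with @azure/storage-blob
// and a server-issued SAS token, so no account key ever reaches the client.
import { BlobServiceClient } from '@azure/storage-blob';

async function downloadBlobWithSas(account, sasToken, container, blob) {
  // The SAS token (e.g. "?sv=...&sig=...") carries scoped, time-limited permissions.
  const service = new BlobServiceClient(
    `https://${account}.blob.core.windows.net${sasToken}`
  );
  const blobClient = service.getContainerClient(container).getBlobClient(blob);
  const response = await blobClient.download();
  return await response.blobBody; // in the browser this resolves to a Blob
}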

Firebase: Does the bucket owner have access to all data?

I am making a To-do Android Application using the Firebase backend. For that, I can use a simple access rule which allows read and write access only when uid == auth.uid. However, as the owner of the bucket, I am currently able to see all the to-dos that any user creates. Clearly, this is unexpected behavior. How do I restrict my access to certain data in my own bucket? Or is there any other alternative?
EDIT: More details. The current security rule is as follows
rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read, write: if uid == auth.uid;
    }
  }
}
I am working with React Native, following rnfirebase, and I used Google Sign-In via this SDK wrapper here. The code to upload an image to Firebase Storage follows the tutorial here. Here is the snippet in React Native.
const reference = storage().ref('black-t-shirt-sm.png');

return (
  <View>
    <Button
      onPress={async () => {
        // path to existing file on filesystem
        const pathToFile = `${utils.FilePath.PICTURES_DIRECTORY}/black-t-shirt-sm.png`;
        // uploads file
        await reference.putFile(pathToFile);
      }}
    />
  </View>
);
This makes no sense:
match /{allPaths=**} {
  allow read, write: if uid == auth.uid;
}
This allows access to the file if the built-in variable auth.uid matches the context variable uid. But you never define uid anywhere, so the rule always fails. In fact, this should not even save: using uid without defining it is a syntax error.
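A common fix (a sketch, assuming files are stored under a per-user folder such as /users/{uid}/...) is to capture the uid as a path wildcard and compare it against request.auth.uid:

match /users/{uid}/{allPaths=**} {
  // uid is now bound from the path, so only the owner can read or write
  allow read, write: if request.auth != null && request.auth.uid == uid;
}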
Are you saying that with these rules you have a user who is able to see all files? Can you show an example of that?

Get image url Firebase storage (admin)

I have to upload an image to Firebase Storage. I'm not using the web version of Storage (I shouldn't use it); I am using the Firebase Admin SDK.
No problem: I upload the file without difficulty and get the result in the variable "file",
and if I open the Firebase Storage console, the image is there. All right.
return admin.storage().bucket().upload(filePath, {
  destination: 'demo/images/restaurantCover.jpg',
  metadata: {contentType: 'image/jpeg'},
  public: true
}).then(file => {
  console.log(`file --> ${JSON.stringify(file, null, 2)}`);
  let url = file["0"].metadata.mediaLink; // image url
  return resolve(res.status(200).send({data: file})); // huge data
});
Now, I have some questions.
Why so much information and so many objects in the response to the upload() method? Reviewing the immense object, I found a property called mediaLink inside metadata, and it is the download URL of the image. But...
Why is the URL different from the one shown by Firebase? Why can't I find the downloadURL property?
How can I get the Firebase-style URL?
firebase: https://firebasestorage.googleapis.com/v0/b/myfirebaseapp.appspot.com/o/demo%2Fimages%2Fthumb_restaurant.jpg?alt=media&token=bee96b71-2094-4492-96aa-87469363dd2e
mediaLink: https://www.googleapis.com/download/storage/v1/b/myfirebaseapp.appspot.com/o/demo%2Fimages%2Frestaurant.jpg?generation=1530193601730593&alt=media
If I use the mediaLink URL, is there any problem with the URLs being different? (read, update from iOS and Web clients)
Looking at the Google Cloud Storage: Node.js Client documentation, they have a link to sample code which shows exactly how to do this. Also, see the File class documentation example (below).
// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const bucketName = 'Name of a bucket, e.g. my-bucket';
// const filename = 'File to access, e.g. file.txt';

// Gets the metadata for the file
storage
  .bucket(bucketName)
  .file(filename)
  .getMetadata()
  .then(results => {
    const metadata = results[0];
    console.log(`File: ${metadata.name}`);
    console.log(`Bucket: ${metadata.bucket}`);
    console.log(`Storage class: ${metadata.storageClass}`);
    console.log(`Self link: ${metadata.selfLink}`);
    console.log(`ID: ${metadata.id}`);
    console.log(`Size: ${metadata.size}`);
    console.log(`Updated: ${metadata.updated}`);
    console.log(`Generation: ${metadata.generation}`);
    console.log(`Metageneration: ${metadata.metageneration}`);
    console.log(`Etag: ${metadata.etag}`);
    console.log(`Owner: ${metadata.owner}`);
    console.log(`Component count: ${metadata.component_count}`);
    console.log(`Crc32c: ${metadata.crc32c}`);
    console.log(`md5Hash: ${metadata.md5Hash}`);
    console.log(`Cache-control: ${metadata.cacheControl}`);
    console.log(`Content-type: ${metadata.contentType}`);
    console.log(`Content-disposition: ${metadata.contentDisposition}`);
    console.log(`Content-encoding: ${metadata.contentEncoding}`);
    console.log(`Content-language: ${metadata.contentLanguage}`);
    console.log(`Metadata: ${metadata.metadata}`);
    console.log(`Media link: ${metadata.mediaLink}`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });
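As for the "How can I get the Firebase-style URL?" part: the token in the firebasestorage.googleapis.com URL lives in the object's custom metadata. This relies on undocumented Firebase internals, so treat the following as a hedged sketch that may break; the metadata key firebaseStorageDownloadTokens is the assumption here, and storage is the client from the sample above:

// Hedged sketch (undocumented internals): rebuild the Firebase-style
// download URL from the object's custom metadata.
async function firebaseStyleUrl(bucketName, filePath) {
  const [metadata] = await storage.bucket(bucketName).file(filePath).getMetadata();
  // Firebase stores one or more comma-separated download tokens here
  const token = metadata.metadata.firebaseStorageDownloadTokens.split(',')[0];
  return `https://firebasestorage.googleapis.com/v0/b/${bucketName}/o/` +
    `${encodeURIComponent(filePath)}?alt=media&token=${token}`;
}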

How to delete all of a user's files (and a user in general) in firebase.storage?

I have an app from which I want to be able to delete users. I can delete everything in the database, but in Firebase Storage I am unable to delete the user and the files they uploaded.
I am able to delete an individual file (file location/reference), for example with
const userStorageRef = this.storage.ref(`/${deletedUid}/full/TimeStamp/fileName.png`);
return userStorageRef.delete();
But I am unable to delete all files at once, or the user from Storage, with the API.
I am using Angularfire2 (v5), but to delete from Storage I'm really just using the normal web API for Firebase Storage.
I am using TypeScript, so the constructor looks like
constructor(private app: FirebaseApp) {
  // Regular firebase api
  this.fbDatabase = this.app.database();
  this.storage = this.app.storage();
}

deleteUsersStorage(deletedUid) {
  // Tried also .ref(`/${deletedUid}/`)
  // .ref(`/${deletedUid}`)
  // .ref(`${deletedUid}/`)
  // .ref().child(`${deletedUid}`) and more
  const userStorageRef = this.storage.ref(`${deletedUid}`);
  userStorageRef.delete()
    .then((r) => {
      console.log('Deletion success');
    })
    .catch((e) => {
      console.log('Error in storage deletion: ', e);
    });
}
I am getting the error
Firebase Storage Error Object reference does not exist
My Firebase data structure is like this (the root is not angular-app, but the structure is the same):
gs://angular-app.appspot.com/userId/fullImages/[list of fileKeys]/imageFile[.png, jpeg, gif]
The brackets are meant to describe what is there, i.e. "list of fileKeys" is a description of the location.
Update: my instinct now is that the answer may be in the storage rules. I will need to work on it later.
service firebase.storage {
  match /b/angular-app.appspot.com/o {
    match /{allPaths=**} {
      allow read, write: if request.auth != null;
    }
  }
}
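For what it's worth, the web SDK has no recursive delete, but server-side the Admin SDK exposes the underlying Google Cloud Storage bucket, which can delete everything under a prefix in one call. A minimal sketch, assuming Admin credentials and a configured default bucket (not the web API used above):

// Hedged sketch (server-side Firebase Admin SDK):
// bucket() returns a @google-cloud/storage Bucket, whose deleteFiles()
// removes every object whose name matches the prefix.
const admin = require('firebase-admin');
admin.initializeApp(); // assumes default credentials and storage bucket

async function deleteUserStorage(deletedUid) {
  await admin.storage().bucket().deleteFiles({
    prefix: `${deletedUid}/` // everything under the user's folder
  });
}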

How to persist Cognito identity across pages in browser

I am authenticating through Cognito in the browser on the client side, using a developer authenticated identity. When my page loads (or is refreshed) I would like my application to remember the identity for as long as the object is not expired (I think it lasts about an hour). However, I don't know how to retrieve the identity from Cognito without having to go through the developer authentication again.
Here is what the code does on page load:
var cognitoCredentials;
var cognitoParams;

$(document).ready(function() {
  "use strict";
  cognitoParams = {
    IdentityPoolId: 'us-east-1:xxxxxxx'
  };
  cognitoCredentials = new AWS.CognitoIdentityCredentials(cognitoParams);
  AWS.config.credentials = cognitoCredentials;
});
And after logging in through the developer authentication:
cognitoCredentials.params.IdentityId = output.identityId;
cognitoCredentials.params.Logins = {
  'cognito-identity.amazonaws.com': output.token
};
cognitoCredentials.expired = true;
If I have already logged in, then refresh the page and try to log in again, I get an error that I am trying to get an identity when I already have one:
Error: Missing credentials in config(…) NotAuthorizedException: Missing credentials in config
"Access to Identity 'us-east-1:xxxxxxx' is forbidden."
However, I don't know how to access it. How do I retrieve the credentials so that when the page is refreshed, I can detect the previous identity given by Cognito?
Save at least accessKeyId, secretAccessKey, and sessionToken in sessionStorage between pages. You can load these into AWS.config.credentials (after the AWS SDK has been loaded, of course). It is much faster than waiting for Cognito to respond. Keep in mind, you'll have to manually refresh them with a token from one of the providers, and this is only good until the temporary token expires (~1 hour).
var credKeys = [
  'accessKeyId',
  'secretAccessKey',
  'sessionToken'
];

// After Cognito login
credKeys.forEach(function(key) {
  sessionStorage.setItem(key, AWS.config.credentials[key]);
});

// After AWS SDK load
AWS.config.region = 'us-east-1'; // pick your region
credKeys.forEach(function(key) {
  AWS.config.credentials[key] = sessionStorage.getItem(key);
});

// Now make your AWS calls to S3, DynamoDB, etc
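One hedged addition to this approach: temporary credentials in the AWS SDK for JavaScript (v2) also carry an expireTime property, so you could persist that too and fall back to a fresh Cognito login once it has passed:

// After Cognito login: also remember when the temporary creds expire
sessionStorage.setItem('expireTime', AWS.config.credentials.expireTime);

// After AWS SDK load: only reuse cached creds while they are still valid
var expireTime = new Date(sessionStorage.getItem('expireTime'));
if (!(expireTime > new Date())) {
  // expired or missing: run the developer-authenticated login flow again
}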
The only way to get back to the same identity on page refresh would be to use the same token used to initialize that identity. You may want to refer to this question as the problems are similar (replacing the Facebook token with the OpenId Connect token from the developer authenticated identities flow).
To reiterate what that question says: the credentials in the SDK will not be persisted across pages, so you should cache the token to be reused.
I take a slightly different approach, that allows the SDK to refresh the credentials.
In short, I serialize the AssumeRoleWithWebIdentityRequest JSON object to session storage.
Here is an example using Angular, but the concept applies in any JS app:
const AWS = require('aws-sdk/global');
import { STS } from 'aws-sdk';
import { environment } from '../../environments/environment';

const WEB_IDENT_CREDS_SS_KEY = 'ic.tmpAwsCreds';

// Handle tmp aws creds across page refreshes
const tmpCreds = sessionStorage.getItem(WEB_IDENT_CREDS_SS_KEY);
if (!!tmpCreds) {
  AWS.config.credentials = new AWS.WebIdentityCredentials(JSON.parse(tmpCreds));
}

@Injectable({
  providedIn: 'root'
})
export class AuthService {
  ...

  async assumeAwsRoleFromWebIdent(fbUser: firebase.User) {
    const token = await fbUser.getIdToken(false);
    let p: STS.Types.AssumeRoleWithWebIdentityRequest = {
      ...environment.stsAssumeWebIdentConfig,
      // environment.stsAssumeWebIdentConfig contains:
      //   DurationSeconds: 3600,
      //   RoleArn: 'arn:aws:iam::xxx:role/investmentclub-fbase-trust',
      RoleSessionName: fbUser.uid + '#' + (+new Date()),
      WebIdentityToken: token
    };

    // Store creds across page refresh, since `new AWS.WebIdentityCredentials(p)`
    // has no option to persist them itself
    AWS.config.credentials = new AWS.WebIdentityCredentials(p);
    sessionStorage.setItem(WEB_IDENT_CREDS_SS_KEY, JSON.stringify(p));
  }

  removeAwsTempCreds() {
    AWS.config.credentials = {};
    sessionStorage.removeItem(WEB_IDENT_CREDS_SS_KEY);
  }

  ...
A few things to note:
Upon login, I store the WebIdentityCredentials parameters as a JSON string in session cache.
You'll notice I check the browser session cache in global scope, to handle page refreshes (this sets the creds before they can be used).
A tutorial with a complete example can be found on my blog.
