Get image URL from Firebase Storage (admin) - javascript

I have to upload an image to Firebase Storage. I'm not using the web version of Storage (I shouldn't use it); I'm using the Firebase Admin SDK.
No problem there: I upload the file without difficulty and get the result in the variable "file",
and if I open the Firebase Storage console, the image is there. All right.
return admin.storage().bucket().upload(filePath, {
    destination: 'demo/images/restaurantCover.jpg',
    metadata: { contentType: 'image/jpeg' },
    public: true
}).then(file => {
    console.log(`file --> ${JSON.stringify(file, null, 2)}`);
    let url = file[0].metadata.mediaLink; // image url
    return resolve(res.status(200).send({ data: file })); // huge data
});
Now, I have some questions.
Why so much information, and so many objects, in the response to the upload() method? Reviewing the immense object, I found a property called mediaLink inside metadata, and it is a download URL for the image. But...
Why is that URL different from the one shown by Firebase? Why can't I find the downloadURL property?
How can I get the Firebase URL?
firebase: https://firebasestorage.googleapis.com/v0/b/myfirebaseapp.appspot.com/o/demo%2Fimages%2Fthumb_restaurant.jpg?alt=media&token=bee96b71-2094-4492-96aa-87469363dd2e
mediaLink: https://www.googleapis.com/download/storage/v1/b/myfirebaseapp.appspot.com/o/demo%2Fimages%2Frestaurant.jpg?generation=1530193601730593&alt=media
If I use the mediaLink URL, is there any problem with the different URLs? (read, update from iOS and Web clients)

Looking at the Google Cloud Storage: Node.js Client documentation, they have a link to sample code which shows exactly how to do this. Also, see the File class documentation example (below):
// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const bucketName = 'Name of a bucket, e.g. my-bucket';
// const filename = 'File to access, e.g. file.txt';

// Gets the metadata for the file
storage
  .bucket(bucketName)
  .file(filename)
  .getMetadata()
  .then(results => {
    const metadata = results[0];

    console.log(`File: ${metadata.name}`);
    console.log(`Bucket: ${metadata.bucket}`);
    console.log(`Storage class: ${metadata.storageClass}`);
    console.log(`Self link: ${metadata.selfLink}`);
    console.log(`ID: ${metadata.id}`);
    console.log(`Size: ${metadata.size}`);
    console.log(`Updated: ${metadata.updated}`);
    console.log(`Generation: ${metadata.generation}`);
    console.log(`Metageneration: ${metadata.metageneration}`);
    console.log(`Etag: ${metadata.etag}`);
    console.log(`Owner: ${metadata.owner}`);
    console.log(`Component count: ${metadata.component_count}`);
    console.log(`Crc32c: ${metadata.crc32c}`);
    console.log(`md5Hash: ${metadata.md5Hash}`);
    console.log(`Cache-control: ${metadata.cacheControl}`);
    console.log(`Content-type: ${metadata.contentType}`);
    console.log(`Content-disposition: ${metadata.contentDisposition}`);
    console.log(`Content-encoding: ${metadata.contentEncoding}`);
    console.log(`Content-language: ${metadata.contentLanguage}`);
    console.log(`Metadata: ${metadata.metadata}`);
    console.log(`Media link: ${metadata.mediaLink}`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });


Getting firebase storage image 403

I'm using Next.js and am struggling to get Firebase Storage images to render on screen; they keep coming back as 403 errors.
I upload an image to Storage when adding a document to Firestore, with a URL nested in it, then use that URL to get the image.
This is my Next Image tag, with recipe.main_image.url == https://storage.googleapis.com/chairy-cooks.appspot.com/recipes/0O3Oycow4lJvyhDbwN0j/main_image/190524-classic-american-cheeseburger-ew-207p.png
and recipe.main_image.name == the name of the photo in Firebase Storage.
<Image
  alt={recipe.main_image.name}
  src={recipe.main_image.url}
  width={650}
  height={650}
/>
The alt tag comes back as the right thing (the image name in Firebase Storage), so I know the image is getting sent back, but for some reason I'm unauthorized. I've tried multiple fixes, including changing my Firebase rules to
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read, write: if request.auth != null;
    }
  }
}
AND
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read, write;
    }
  }
}
But neither changes the outcome. I've also set up my Next config so that Firebase Storage is an allowed image domain:
images: {
  domains: ['images.ctfassets.net', 'storage.googleapis.com'],
},
If the images will be served to the public, then I would suggest granting public access to the image. Cloud Storage for Firebase is backed by Google Cloud Storage, so you can make individual objects publicly readable or make the whole bucket publicly readable.
Or, better yet, generate signed URLs; this way you can grant time-limited access to an object. If you're working with JavaScript, you may take a look at this sample code on how to generate a signed URL:
/**
 * TODO(developer): Uncomment the following lines before running the sample.
 * Note: when creating a signed URL, unless running in a GCP environment,
 * a service account must be used for authorization.
 */
// The ID of your GCS bucket
// const bucketName = 'your-unique-bucket-name';

// The full path of your file inside the GCS bucket, e.g. 'yourFile.jpg' or 'folder1/folder2/yourFile.jpg'
// const fileName = 'your-file-name';

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function generateV4ReadSignedUrl() {
  // These options will allow temporary read access to the file
  const options = {
    version: 'v4',
    action: 'read',
    expires: Date.now() + 15 * 60 * 1000, // 15 minutes
  };

  // Get a v4 signed URL for reading the file
  const [url] = await storage
    .bucket(bucketName)
    .file(fileName)
    .getSignedUrl(options);

  console.log('Generated GET signed URL:');
  console.log(url);
  console.log('You can use this URL with any user agent, for example:');
  console.log(`curl '${url}'`);
}

generateV4ReadSignedUrl().catch(console.error);
You may visit this documentation for more information.
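For the simpler route mentioned first (making the individual object public), here is a minimal sketch; it assumes an initialized Storage client, and the helper name is my own. Public objects are served directly from storage.googleapis.com, which matches the URL format in the question:

```javascript
// Hypothetical helper: public objects in GCS are served from
// https://storage.googleapis.com/<bucket>/<object path>.
function publicUrl(bucketName, filePath) {
  return `https://storage.googleapis.com/${bucketName}/${filePath}`;
}

// Usage (assumes an initialized Storage client; makePublic() grants
// allUsers read access to that single object):
// await storage.bucket(bucketName).file(fileName).makePublic();
// console.log(publicUrl(bucketName, fileName));
```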

How to delete a Firebase Storage folder from a Firebase Cloud Function?

I couldn't find the deleteFiles() method in the Firebase API reference. My IDE tells me this method takes an optional DeleteFilesOptions argument, and I couldn't find any information on that type either. If someone could point me to this documentation, I would appreciate it.
That said, I've seen a number of posts that use this method, with this argument, to delete an entire Storage folder (and all of its files) from a Cloud Function. My question is: is this the correct way to do it (since the documentation here is missing)?
const functions = require("firebase-functions");
const admin = require("firebase-admin");

exports.deleteStorageFolder = functions.https.onCall(async (data, _context) => {
  const uid = data.userId;
  try {
    const bucket = admin.storage().bucket(); // returns the default bucket, which is good
    await bucket.deleteFiles({
      prefix: `images/users/${uid}`, // the path of the folder
    });
    return Promise.resolve(true);
  } catch (error) {
    throw new functions.https.HttpsError("unknown", "Failed to delete storage folder.", error);
  }
});
As @Doug already mentioned in the comment, "Firebase just provides wrappers around Cloud Storage. They are the same thing." Also, according to this documentation, "Cloud Storage for Firebase stores your files in a Google Cloud Storage bucket, making them accessible through both Firebase and Google Cloud. This allows you the flexibility to upload and download files from mobile clients via the Firebase SDKs for Cloud Storage."
That being said, I've tried replicating the code snippet you've provided using deleteFiles(), and it worked fine on my end:
// The Firebase Admin SDK, used here to access Cloud Storage.
const functions = require("firebase-functions");
const admin = require('firebase-admin');

const firebaseConfig = {
  // Your Firebase configuration...
};

admin.initializeApp(firebaseConfig);
const bucket = admin.storage().bucket();

async function deleteFolder(uid) {
  await bucket.deleteFiles({
    prefix: `images/users/${uid}` // the path of the folder
  });
}

deleteFolder(uid); // uid of the user whose folder should be removed
Another option is to use Google Cloud Storage directly, skipping Firebase Storage:
const {Storage} = require('@google-cloud/storage');

const storage = new Storage();
const bucket = storage.bucket("your-bucket-name");

bucket.deleteFiles({
  prefix: `images/users/${uid}`
}, function(err) {
  if (!err) {
    console.log("All files under the `images/users` prefix have been deleted.");
  }
});
Just a note: following Doug's suggestion, you can try it out first in your local or test environment. For further reference, see delete() and deleteFiles().

Create a file in a cloud storage bucket from within a trigger

I would like to be able to create a file in a project bucket as part of a Firestore Cloud trigger.
When there is a change to a document in a specific collection, I need to be able to take data from that document and write it to a file in a bucket in Cloud Storage.
Example
exports.myFunction = functions.firestore
  .document('documents/{docId}')
  .onUpdate((change, context) => {
    const after = change.after.data() as Document;
    // CREATE AND WRITE TO file IN BUCKET HERE
  });
I have found many examples of how to upload files. I have explored
admin.storage().bucket().file(path)
createWriteStream()
write()
but I can't seem to find documentation on how exactly to achieve the above.
Is this possible from within a trigger, and if so, where can I find documentation on how to do it?
Here is why I want to do this (just in case I am approaching this all wrong): we have an application where our users are able to generate purchase orders for work they have done. At the time they initiate a generate from the software, we need to create a timestamped document [pdf] (in a secure location, but one that is accessible to authenticated users) representing this purchase order. The data to create it will come from the document that triggers the change.
As @Doug Stevenson said, you can use node streams.
You can see how to do this in this sample from the GCP getting-started samples repo.
You need to provide a file name and the file buffer in order to stream it to GCS:
function sendUploadToGCS(req, res, next) {
  if (!req.file) {
    return next();
  }

  const gcsname = Date.now() + req.file.originalname;
  const file = bucket.file(gcsname);

  const stream = file.createWriteStream({
    metadata: {
      contentType: req.file.mimetype,
    },
    resumable: false,
  });

  stream.on('error', err => {
    req.file.cloudStorageError = err;
    next(err);
  });

  stream.on('finish', async () => {
    req.file.cloudStorageObject = gcsname;
    await file.makePublic();
    req.file.cloudStoragePublicUrl = getPublicUrl(gcsname);
    next();
  });

  stream.end(req.file.buffer);
}
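The sample above is Express middleware, but the same File API works inside a Firestore trigger, where there is no req/res. A minimal sketch follows; the path layout and content type are my assumptions, and File#save() is a convenience wrapper around the same write stream:

```javascript
// Hypothetical object path for the generated file; adjust to taste.
function objectPathFor(docId) {
  return `purchase-orders/${docId}.json`;
}

// Inside the trigger (assumes an initialized Admin SDK):
// exports.myFunction = functions.firestore
//   .document('documents/{docId}')
//   .onUpdate(async (change, context) => {
//     const after = change.after.data();
//     const file = admin.storage().bucket().file(objectPathFor(context.params.docId));
//     // save() buffers the whole payload; use createWriteStream() for large files.
//     await file.save(JSON.stringify(after), { contentType: 'application/json' });
//   });
```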

Cannot delete Google Cloud Storage files using Firebase Functions

I cannot seem to delete files from my Firebase Storage using Firebase Functions. I have been at it for almost a week now, and the "closest" I have gotten, I believe, is an error within the error itself: "Cannot parse JSON response at ApiError".
What I want is that once a Firebase user is deleted, my database and storage are cleared of that user's files and data.
const admin = require('firebase-admin');
const {Storage} = require('@google-cloud/storage');

exports.deleteUserHandler = (event, context) => {
  const userUID = event.uid;
  const bucketname = "gs://MY_PROJECT.appspot.com/user/" + userUID;
  const storage = new Storage({
    projectId: "MY_PROJECT_ID",
  });

  return admin.database().ref('/user/' + userUID).remove().then(() => {
    console.log("User " + userUID + " database removed");
    return storage.bucket(bucketname).file("profilepic.jpeg").delete();
  }).then(() => {
    return storage.bucket(bucketname).file("smallprofilepic.jpeg").delete();
  }).then(() => {
    console.log("User " + userUID + " firestore removed");
  });
}
The function triggers when it's supposed to, and removes the data from the Realtime Database. I cannot, however, remove images from Storage. I feel like I am close to what it should be, but the error I get from the function logs is as follows.
Error: Cannot parse JSON response
    at ApiError (/user_code/node_modules/@google-cloud/storage/node_modules/@google-cloud/common/build/src/util.js:43:9)
    at Util.parseHttpRespBody (/user_code/node_modules/@google-cloud/storage/node_modules/@google-cloud/common/build/src/util.js:167:42)
    at Util.handleResp (/user_code/node_modules/@google-cloud/storage/node_modules/@google-cloud/common/build/src/util.js:116:117)
    at retryRequest (/user_code/node_modules/@google-cloud/storage/node_modules/@google-cloud/common/build/src/util.js:403:22)
    at onResponse (/user_code/node_modules/@google-cloud/storage/node_modules/retry-request/index.js:200:7)
    at /user_code/node_modules/@google-cloud/storage/node_modules/teeny-request/build/src/index.js:222:13
    at process._tickDomainCallback (internal/process/next_tick.js:135:7)
I do not know what it means.
I have set my Google APIs Service Agent and App Engine default service account to be storage admins.
My dependencies at the moment of posting are:
  "@google-cloud/storage": "^2.3.4",
  "firebase-admin": "^6.4.0",
  "firebase-functions": "^2.1.0"
const bucketname = "gs://MY_PROJECT.appspot.com/user/"+userUID;
That string above is not the name of your bucket; it's a URL to a location in the bucket. The name of a bucket doesn't look like a URL, and it doesn't contain a path to a file. The name of your default bucket is probably just "MY_PROJECT.appspot.com", and you can check that in the Cloud console in the Storage section.
It looks like you might be under the misconception that a bucket is a folder. Buckets are just containers; you build references to files after you have a Bucket object, and those file names can contain path components, such as "user/UID/somefile.jpg".
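Applying that to the code in the question, a sketch of the fix: the bucket name stays bare and the path moves onto the File reference. "MY_PROJECT.appspot.com" is a placeholder, and the helper is my own naming:

```javascript
// Hypothetical path builder: the "folder" lives in the object name,
// not in the bucket name.
function profilePicPath(userUID, name) {
  return `user/${userUID}/${name}`;
}

// Usage inside the function:
// const bucket = storage.bucket('MY_PROJECT.appspot.com');
// await bucket.file(profilePicPath(userUID, 'profilepic.jpeg')).delete();
// await bucket.file(profilePicPath(userUID, 'smallprofilepic.jpeg')).delete();
```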

How to create disk from snapshot in google cloud function - node js

I have been struggling to find a solution to this particular problem. I've gone through almost all the documentation of the gcloud/compute node module, which is used in Google Cloud Functions.
Now my challenge is to create a new disk from an existing snapshot in a Google Cloud Function.
I have used the code below to create a disk, as they haven't provided any example of creating a disk from a snapshot. The following Cloud Function creates a new disk named disk1, which is an entirely fresh, new disk. I don't want that; I want to create a disk from an existing snapshot, which has some data and setup in it.
exports.tempFunction = (req, res) => {
  // Example input: {"message": "Hello!"}
  const Compute = require('@google-cloud/compute');

  const compute = new Compute();
  const zone = compute.zone('us-central1-a');
  const disk = zone.disk('disk1');

  const config = {
    // ...
    // os: 'ubuntu'
  };

  disk.create(config, function(err, disk, operation, apiResponse) {
    // `disk` is a Disk object.
    // `operation` is an Operation object that can be used to check the
    // status of the request.
    console.log(err);
    console.log(disk);
    console.log(operation);
    console.log(apiResponse);
    res.status(200).send("success");
  });
};
Any help in this regard is highly appreciated.
P.S. I also tried using the Cloud APIs directly, but as I want to use only Cloud Functions, I am unable to figure out how to get an access token for gcloud to use inside a Cloud Function.
The disk creation [1] can be customized by setting the disk resource fields [2] in the config object.
In this case, set the sourceSnapshot field in the config to the existing snapshot partial or full URL. The code should look like this:
exports.tempFunction = (req, res) => {
  // Example input: {"message": "Hello!"}
  const Compute = require('@google-cloud/compute');

  const compute = new Compute();
  const zone = compute.zone('us-central1-a');
  const disk = zone.disk('disk1');

  const config = {
    sourceSnapshot: "projects/{YOUR-PROJECT}/global/snapshots/{YOUR_SNAPSHOT}"
  };

  disk.create(config, function(err, disk, operation, apiResponse) {
    // `disk` is a Disk object.
    // `operation` is an Operation object that can be used to check the
    // status of the request.
    console.log(err);
    console.log(disk);
    console.log(operation);
    console.log(apiResponse);
    res.status(200).send("success");
  });
};
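A small note on the sourceSnapshot value: it can be built from the project and snapshot names. A sketch under those assumptions (the names are placeholders; create() also resolves with an Operation you can await if you need to block until the disk exists):

```javascript
// Hypothetical helper building the full resource path for a snapshot.
function snapshotUrl(project, snapshot) {
  return `projects/${project}/global/snapshots/${snapshot}`;
}

// Promise-style usage:
// const [disk, operation] = await zone.disk('disk1')
//   .create({ sourceSnapshot: snapshotUrl('my-project', 'my-snapshot') });
// await operation.promise(); // resolves when the disk is ready
```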
