how to properly init google cloud storage using cloud function? [duplicate] - javascript

I am trying to implement a Cloud Function, but I get an error when deploying if I require the library like this:
var storage = require('@google-cloud/storage')();
So I resolved to requiring it like this instead:
var storage = require('@google-cloud/storage');
That deploys, but when I try uploading a picture I get the error "TypeError: gcs.bucket is not a function":
const functions = require('firebase-functions');
const storage = require('@google-cloud/storage');
const os = require('os');
const path = require('path');

exports.onFileChange = functions.storage.object().onFinalize((event) => {
  const bucket = event.bucket;
  const contentType = event.contentType;
  const filePath = event.name;
  console.log('Changes made to bucket');

  if (path.basename(filePath).startsWith('renamed-')) {
    console.log('File was previously renamed');
    return;
  }

  const gcs = storage({
    projectId: 'clfapi'
  });

  const destBucket = gcs.bucket(bucket);
  const tmpFilePath = path.join(os.tmpdir(), path.basename(filePath));
  const metadata = { contentType: contentType };

  return destBucket.file(filePath).download({
    destination: tmpFilePath
  }).then(() => {
    return destBucket.upload(tmpFilePath, {
      destination: 'renamed-' + path.basename(filePath),
      metadata: metadata
    });
  });
});

The API changed in version 2.x of the Cloud Storage node SDK. According to the documentation, you import the SDK like this:
// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');
Then you can create a new Storage object:
// Creates a client
const storage = new Storage();
Then you can reach into a bucket:
const bucket = storage.bucket('my-bucket');
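Applied to the function in the question, the migration might look like this: a sketch, assuming the same 'clfapi' project and the same rename-on-finalize logic as above.
const functions = require('firebase-functions');
const os = require('os');
const path = require('path');
const { Storage } = require('@google-cloud/storage');

// 2.x style: construct a client instead of calling the module as a function
const storage = new Storage({ projectId: 'clfapi' });

exports.onFileChange = functions.storage.object().onFinalize((object) => {
  const filePath = object.name;
  if (path.basename(filePath).startsWith('renamed-')) {
    console.log('File was previously renamed');
    return null;
  }
  const destBucket = storage.bucket(object.bucket);
  const tmpFilePath = path.join(os.tmpdir(), path.basename(filePath));
  return destBucket.file(filePath)
    .download({ destination: tmpFilePath })
    .then(() => destBucket.upload(tmpFilePath, {
      destination: 'renamed-' + path.basename(filePath),
      metadata: { contentType: object.contentType }
    }));
});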

Related

File upload at Chrome Extension using Firebase Storage (XMLHttpRequest is not defined)

I can send and receive data with Firestore, but when I try to upload a file to Firebase Storage I get the following error. Have you ever faced this error? How can I handle it?
Uncaught ReferenceError: XMLHttpRequest is not defined
Context:
...gstatic.com/firebasejs/9.6.11/firebase-storage.js
var fileId = `${request.payload.getFileName}`; // value from HTML
const app = initializeApp(firebaseConfig);
const storage = getStorage(app);
const storageRef = ref(storage, fileId);
const metadata = {
  contentType: 'image/jpeg',
};
const uploadTask = uploadBytesResumable(storageRef, fileId, metadata);

CubeJS initial call to initialize granting of JWT

directory structure
|_ jwks.json
|_ cube.js
|_ package.json
The cube docs give config for cube.js:
const fs = require("fs");
const jwt = require("jsonwebtoken");
const jwkToPem = require("jwk-to-pem");
const jwks = JSON.parse(fs.readFileSync("jwks.json"));
const _ = require("lodash");

module.exports = {
  checkAuth: async (req, auth) => {
    const decoded = jwt.decode(auth, { complete: true });
    const jwk = _.find(jwks.keys, x => x.kid === decoded.header.kid);
    const pem = jwkToPem(jwk);
    req.authInfo = jwt.verify(auth, pem);
  },
  contextToAppId: ({ authInfo }) => `APP_${authInfo.userId}`,
  preAggregationsSchema: ({ authInfo }) => `pre_aggregations_${authInfo.userId}`
};
Question: if this is the model used (image below), how does one initialize the process to acquire a token? In other words, how does the client go about making the initial call in vanilla JavaScript to start the token process using /.well_known/jwks.json?
Cube.js does not support token creation, because Cube.js is a microservice for analytics.
You can generate the JWT:
In your application (see the sketch below)
Using auth services (Auth0, Keycloak, etc.)
P.S. There is support for token creation, but it's only for development mode.
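For the first option, here is a minimal sketch of issuing a token in your own application with the jsonwebtoken package already used above; the key file name, kid, and claims are assumptions for illustration:
const fs = require('fs');
const jwt = require('jsonwebtoken');

// Sign with the RSA private key whose public part is published in jwks.json
const privateKey = fs.readFileSync('private.pem');
const token = jwt.sign({ userId: 42 }, privateKey, {
  algorithm: 'RS256',
  keyid: 'my-key-id', // must match the kid of a key in jwks.json
  expiresIn: '30d'
});
// The client then sends this token in the Authorization header of each Cube.js request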

Firebase Storage Node.js: how can I show the URL of public PDF files in the console [duplicate]

I use firebase-admin and firebase-functions to upload a file to Firebase Storage.
I have these rules in Storage:
service firebase.storage {
  match /b/{bucket}/o {
    match /images {
      allow read;
      allow write: if false;
    }
  }
}
And I want to get a public URL with this code:
const config = functions.config().firebase;
const firebase = admin.initializeApp(config);
const bucketRef = firebase.storage();

server.post('/upload', async (req, res) => {
  // UPLOAD FILE
  await stream.on('finish', async () => {
    const fileUrl = bucketRef
      .child(`images/${fileName}`)
      .getDownloadUrl()
      .getResult();
    return res.status(200).send(fileUrl);
  });
});
But I get this error: .child is not a function.
How can I get the public url of a file with firebase-admin?
From the sample application code in the Cloud Storage documentation, you should be able to implement the following code to obtain the public download URL after the upload is successful:
// `format` comes from the built-in util module
const { format } = require('util');

// Create a new blob in the bucket and upload the file data.
const blob = bucket.file(req.file.originalname);
const blobStream = blob.createWriteStream();

blobStream.on('finish', () => {
  // The public URL can be used to directly access the file via HTTP.
  const publicUrl = format(`https://storage.googleapis.com/${bucket.name}/${blob.name}`);
  res.status(200).send(publicUrl);
});
Alternatively, if you need a publicly accessible download URL, see this answer which suggests using getSignedUrl() from the Cloud Storage NPM module because the Admin SDK doesn't support this directly:
You'll need to generate a signed URL using getSignedURL via the @google-cloud/storage NPM module.
Example:
const gcs = require('@google-cloud/storage')({ keyFilename: 'service-account.json' });
// ...
const bucket = gcs.bucket(bucketName); // bucketName defined elsewhere
const file = bucket.file(fileName);
return file.getSignedUrl({
  action: 'read',
  expires: '03-09-2491'
}).then(signedUrls => {
  // signedUrls[0] contains the file's public URL
});
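Note that this snippet predates the 2.x import change described in the first answer; with the current SDK, the same example would be initialized like this (keeping the service-account file name from the snippet above):
const { Storage } = require('@google-cloud/storage');
const gcs = new Storage({ keyFilename: 'service-account.json' });
// the rest (bucket, file, getSignedUrl) is unchanged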
What worked for me is to compose a URL like this:
https://storage.googleapis.com/<bucketName>/<pathToFile>
Example: https://storage.googleapis.com/mybucket.appspot.com/public/myFile.png
How did I find it?
I went to the GCP Console, Storage, located the uploaded file, and clicked "Copy URL".
You may want to make a file Public first. I did it like this:
const bucket = seFirebaseService.admin().storage().bucket()
await bucket.file(`public/myFile.png`).makePublic()
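Putting the two steps together, a minimal sketch (the bucket wrapper and file path are carried over from the snippet above):
const bucket = seFirebaseService.admin().storage().bucket();
await bucket.file('public/myFile.png').makePublic();
// Compose the public URL from the bucket name and the file path
const publicUrl = `https://storage.googleapis.com/${bucket.name}/public/myFile.png`;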
I've been tinkering with this for days and realized:
A) correct access rights on the bucket are key:
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read;
      allow write: if request.auth != null;
    }
  }
}
B) The functional public URL is right there in the metadata (tested and works). Notice the access rights.
const pdfDoc = printer.createPdfKitDocument(docDefinition);
const pdfFile = admin
  .storage()
  .bucket()
  .file(newId + '.pdf');

const writeStream = pdfFile.createWriteStream({
  contentType: 'application/pdf',
  public: true,
});

// Read the metadata only after the upload has finished;
// before that it may not be populated yet.
writeStream.on('finish', () => {
  console.log('Get public URL');
  const publicUrl = pdfFile.metadata.mediaLink;
});

pdfDoc.pipe(writeStream);
pdfDoc.end();

How do I transfer data from one method to another in Node.js?

I'm using the Telegram Bot API and AWS S3 to read data from a bucket. I need to use the data from the S3 method in the Telegraf method, but I don't know how:
'use strict'
const Telegraf = require('telegraf');
const bot = new Telegraf('TOKEN')
var AWS = require('aws-sdk')
var s3 = new AWS.S3({
  accessKeyId: 'key',
  secretAccessKey: 'secret'
})
var params = { Bucket: 'myBucket', Key: 'ipsum.txt' };
var s3Promise = s3.getObject(params, function (err, data) {
  if (err) console.log(err, err.stack);
  else {
    var words = data.Body.toString(); // WHAT I WANT IN THE COMMAND METHOD
    console.log('\n' + words + '\n') // Returns ipsum.txt as string on console
  }
})
bot.command('s', (ctx) => { // Bot command
  s3Promise; // Returns ipsum.txt as string on console
  ctx.reply('Check console') // Message in Telegram
  // ctx.reply(<I WANT data.Body.toString() HERE>)
});
const { PORT = 3000 } = process.env
bot.startWebhook('/', null, PORT)
How do I use the data from the s3.getObject method in ctx.reply() ?
If you want to send the file as an attachment, you have to use ctx.replyWithDocument. Aside from that, your problem is: How do I return the response from an asynchronous call?
In this particular case you can use s3.getObject(params).promise() in order to avoid the callback API, and use it easily inside your bot.command listener.
Using async/await (Node >= 7.6) your code can be written like this
'use strict';
const Telegraf = require('telegraf');
const bot = new Telegraf('TOKEN');
const AWS = require('aws-sdk');
const s3 = new AWS.S3({
  accessKeyId: 'key',
  secretAccessKey: 'secret'
});

const params = {
  Bucket: 'myBucket',
  Key: 'ipsum.txt'
};

bot.command('s', async ctx => { // Bot command
  try {
    // If you're always sending the same file and it won't change
    // too much, you can cache it to avoid the external call every time
    const data = await s3.getObject(params).promise();
    ctx.reply('Check console'); // Message in Telegram
    // This will send the file as an attachment
    ctx.replyWithDocument({
      source: data.Body,
      filename: params.Key
    });
    // or just as text
    ctx.reply(data.Body.toString());
  } catch (e) {
    // S3 failed
    ctx.reply('Oops');
    console.log(e);
  }
});

const { PORT = 3000 } = process.env;
bot.startWebhook('/', null, PORT);
More info on how to work with files can be found in the Telegraf docs.
PS: I tested the code and it's fully working.
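As for the caching hint in the code comment above, a minimal sketch (a module-level cache with no expiry or invalidation; those are left out on purpose):
let cachedText = null; // survives across commands for the lifetime of the process

bot.command('s', async ctx => {
  try {
    if (cachedText === null) {
      const data = await s3.getObject(params).promise();
      cachedText = data.Body.toString();
    }
    ctx.reply(cachedText);
  } catch (e) {
    ctx.reply('Oops');
    console.log(e);
  }
});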
While I haven't used S3, I do know that AWS services added support for Promises to their implementations to avoid using callbacks. Personally, I much prefer the use of promises as I think they lead to more readable code.
I think the following should handle the issue you're having.
'use strict'
const Telegraf = require('telegraf');
const bot = new Telegraf('TOKEN')
var AWS = require('aws-sdk')
var s3 = new AWS.S3({
  accessKeyId: 'key',
  secretAccessKey: 'secret'
})
var params = { Bucket: 'myBucket', Key: 'ipsum.txt' };

bot.command('s', (ctx) => {
  s3.getObject(params).promise()
    .then(data => {
      ctx.reply('Check console');
      ctx.reply(data.Body.toString());
    }, err => console.log(err, err.stack));
})
const { PORT = 3000 } = process.env
bot.startWebhook('/', null, PORT)
As suggested by Luca, I called bot.command inside of s3.getObject and it works!
s3.getObject(params, function (err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else
    bot.command('s', (ctx) => {
      ctx.reply('Successfully read from S3:\n\n' + data.Body.toString())
    });
})

How do I upload a base64 encoded image (string) directly to a Google Cloud Storage bucket using Node.js?

Currently, I am using the @google-cloud/storage NPM package to upload a file directly to a Google Cloud Storage bucket. This requires some trickery, as I only have the image's base64 encoded string. I have to:
Decode the string
Save it as a file
Send the file path to the below script to upload to Google Cloud Storage
Delete the local file
I'd like to avoid storing the file in the filesystem altogether since I am using Google App Engine and I don't want to overload the filesystem / leave junk files there if the delete operation doesn't work for whatever reason. This is what my upload script looks like right now:
// Convert the base64 string back to an image to upload into the Google Cloud Storage bucket
var base64Img = require('base64-img');
var filePath = base64Img.imgSync(req.body.base64Image, 'user-uploads', 'image-name');

// Instantiate the GCP Storage instance
var gcs = require('@google-cloud/storage')(),
    bucket = gcs.bucket('google-cloud-storage-bucket-name');

// Upload the image to the bucket
bucket.upload(__dirname.slice(0, -15) + filePath, {
  destination: 'profile-images/576dba00c1346abe12fb502a-original.jpg',
  public: true,
  validation: 'md5'
}, function(error, file) {
  if (error) {
    sails.log.error(error);
  }
  return res.ok('Image uploaded');
});
Is there any way to directly upload the base64 encoded string of the image instead of having to convert it to a file and then upload using the path?
The solution, I believe, is to use the file.createWriteStream functionality that the bucket.upload function wraps in the Google Cloud Node SDK.
I've got very little experience with streams, so try to bear with me if this doesn't work right off.
First of all, we need to take the base64 data and drop it into a stream. For that, we're going to include the stream library, create a buffer from the base64 data, and add the buffer to the end of the stream.
var stream = require('stream');
var bufferStream = new stream.PassThrough();
bufferStream.end(Buffer.from(req.body.base64Image, 'base64'));
More on decoding base64 and creating the stream.
We're then going to pipe the stream into a write stream created by the file.createWriteStream function.
var gcs = require('@google-cloud/storage')({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Define bucket.
var myBucket = gcs.bucket('my-bucket');

// Define file & file name.
var file = myBucket.file('my-file.jpg');

// Pipe the 'bufferStream' into a 'file.createWriteStream' method.
bufferStream.pipe(file.createWriteStream({
  metadata: {
    contentType: 'image/jpeg',
    metadata: {
      custom: 'metadata'
    }
  },
  public: true,
  validation: "md5"
}))
.on('error', function(err) {})
.on('finish', function() {
  // The file upload is complete.
});
Info on file.createWriteStream, File docs, bucket.upload, and the bucket.upload method code in the Node SDK.
So the way the above code works is to define the bucket you want to put the file in, then define the file and the file name. We don't set upload options here. We then pipe the bufferStream variable we just created into the file.createWriteStream method we discussed before; in its options we define the metadata and anything else you want to set. It was very helpful to look directly at the Node code on GitHub to figure out how they break down the bucket.upload function, and I recommend you do so as well. Finally, we attach a couple of events for when the upload finishes and when it errors out.
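Wrapped up as a reusable, promise-returning helper (a sketch; the option values are the same ones used in the snippet above):
const stream = require('stream');

function uploadBase64(file, base64Data, contentType) {
  return new Promise((resolve, reject) => {
    const bufferStream = new stream.PassThrough();
    bufferStream.end(Buffer.from(base64Data, 'base64'));
    bufferStream
      .pipe(file.createWriteStream({
        metadata: { contentType: contentType },
        public: true,
        validation: 'md5'
      }))
      .on('error', reject)
      .on('finish', resolve);
  });
}

// Usage: uploadBase64(myBucket.file('my-file.jpg'), req.body.base64Image, 'image/jpeg')
//   .then(() => console.log('upload complete'));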
Posting my version of the answer in response to @krlozadan's request above:
// Convert the base64 string back to an image to upload into the Google Cloud Storage bucket
var mimeTypes = require('mimetypes');
var image = req.body.profile.image,
    mimeType = image.match(/data:([a-zA-Z0-9]+\/[a-zA-Z0-9-.+]+).*,.*/)[1],
    fileName = req.profile.id + '-original.' + mimeTypes.detectExtension(mimeType),
    base64EncodedImageString = image.replace(/^data:image\/\w+;base64,/, ''),
    imageBuffer = Buffer.from(base64EncodedImageString, 'base64'); // Buffer.from replaces the deprecated new Buffer()

// Instantiate the GCP Storage instance
var gcs = require('@google-cloud/storage')(),
    bucket = gcs.bucket('my-bucket');

// Upload the image to the bucket
var file = bucket.file('profile-images/' + fileName);
file.save(imageBuffer, {
  metadata: { contentType: mimeType },
  public: true,
  validation: 'md5'
}, function(error) {
  if (error) {
    return res.serverError('Unable to upload the image.');
  }
  return res.ok('Uploaded');
});
This worked just fine for me. Ignore some of the additional logic in the first few lines; it is only relevant to the application I am building.
If you want to save a string as a file in Google Cloud Storage, you can do it easily using the file.save method:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
const myBucket = storage.bucket('my-bucket');
const file = myBucket.file('my-file.txt');
const contents = 'This is the contents of the file.';
file.save(contents).then(() => console.log('done'));
What an issue! I tried it and hit the same problem: the image uploaded to Firebase Storage, but it would not download and the loader just kept spinning. After spending some time on it, I got the image to upload to Firebase Storage with downloading working as well. The problem was the access token.
Check the screenshot: in the file location section at the bottom right there is a "create access token" option, and no access token is shown there. If you create an access token manually and refresh the page, the image shows. So the question is how to create it in code.
Just use the code below to create the access token and pass it in the upload metadata:
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
// later, in the upload options:
metadata: { firebaseStorageDownloadTokens: uuid }
The full code for uploading an image to Firebase Storage is given below:
const functions = require('firebase-functions')
var firebase = require('firebase');
var express = require('express');
var bodyParser = require("body-parser");
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
const os = require('os')
const path = require('path')
const cors = require('cors')({ origin: true })
const Busboy = require('busboy')
const fs = require('fs')
var admin = require("firebase-admin");

var serviceAccount = {
  "type": "service_account",
  "project_id": "xxxxxx",
  "private_key_id": "xxxxxx",
  "private_key": "-----BEGIN PRIVATE KEY-----\jr5x+4AvctKLonBafg\nElTg3Cj7pAEbUfIO9I44zZ8=\n-----END PRIVATE KEY-----\n",
  "client_email": "xxxx@xxxx.iam.gserviceaccount.com",
  "client_id": "xxxxxxxx",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/firebase-adminsdk-5rmdm%40xxxxx.iam.gserviceaccount.com"
}

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  storageBucket: "xxxxx-xxxx" // use your storage bucket name
});

const app = express();
app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());

app.post('/uploadFile', (req, response) => {
  response.set('Access-Control-Allow-Origin', '*');
  const busboy = new Busboy({ headers: req.headers })
  let uploadData = null

  busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
    const filepath = path.join(os.tmpdir(), filename)
    uploadData = { file: filepath, type: mimetype }
    console.log("-------------->>", filepath)
    file.pipe(fs.createWriteStream(filepath))
  })

  busboy.on('finish', () => {
    const bucket = admin.storage().bucket();
    bucket.upload(uploadData.file, {
      uploadType: 'media',
      metadata: {
        metadata: {
          firebaseStorageDownloadTokens: uuid,
          contentType: uploadData.type,
        },
      },
    })
    .catch(err => {
      // note: `response`, not `res` — the latter is undefined in this handler
      response.status(500).json({
        error: err,
      })
    })
  })

  busboy.end(req.rawBody)
});
exports.widgets = functions.https.onRequest(app);
You have to convert the base64 string to an image buffer and then upload it as below; provide the image_data_from_html variable as the data you extract from the HTML event.
const base64Text = image_data_from_html.split(';base64,').pop();
const imageBuffer = Buffer.from(base64Text, 'base64');
const contentType = image_data_from_html.split(';base64,')[0].split(':')[1];
const fileName = 'myimage.png';
const imageUrl = 'https://storage.googleapis.com/bucket-url/some_path/' + fileName;

await admin.storage().bucket().file('some_path/' + fileName).save(imageBuffer, {
  public: true,
  gzip: true,
  metadata: {
    contentType,
    cacheControl: 'public, max-age=31536000',
  }
});
console.log(imageUrl);
I was able to get the base64 string over to my Cloud Storage bucket with just one line of code.
var decodedImage = Buffer.from(poster64, 'base64'); // Buffer.from replaces the deprecated new Buffer()
// Store poster to storage
let posterFile = await client.file(decodedImage, `poster_${path}.jpeg`, { path: 'submissions/dev/', isBuffer: true, raw: true });
let posterUpload = await client.upload(posterFile, { metadata: { cacheControl: 'max-age=604800' }, public: true, overwrite: true });
let permalink = posterUpload.permalink;
Something to be aware of: if you are inside a Node.js environment, you won't be able to use atob().
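A Buffer-based replacement looks like this (base64String is a placeholder):
// In Node.js, Buffer replaces the browser's atob()/btoa()
const decoded = Buffer.from(base64String, 'base64'); // atob() equivalent, as raw bytes
const reencoded = decoded.toString('base64');        // btoa() equivalent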
The top answer of this post showed me the errors of my ways!
NodeJS base64 image encoding/decoding not quite working
