I am new to JavaScript. I am trying to implement OAuth 2.0 for Server to Server Applications, and for that I am using this library. While I was doing this:
googleAuth.authenticate(
  {
    email: 'my.gserviceaccount.com',
    keyFile: fs.readFileSync("./accesstoken/key.pem"),
    scopes: ['https://www.googleapis.com/auth/drive.readonly']
  },
  function (err, token) {
    console.log(token);
    console.log("err:" + err);
  });
it gave me the following exception:
ENOENT: no such file or directory, open '-----BEGIN PRIVATE KEY-----asdasxxx---END PRIVATE KEY-----
My key.pem file is in the same directory as my js file.
There is no need for fs.readFileSync:
keyFile: fs.readFileSync("./accesstoken/key.pem"),
Just give a simple path to the file:
keyFile: "./key.pem", // if file is in same folder
As given in the original docs:
// the path to the PEM file to use for the cryptographic key (ignored if 'key' is also defined)
// the key will be used to sign the JWT and validated by Google OAuth
keyFile: 'path/to/key.pem',
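Putting it together, a minimal sketch of the corrected call, reusing the call shape and placeholder values from the question (the email, path, and scope are the question's own, not verified here):
// Sketch of the corrected call: pass the path, not the file contents.
googleAuth.authenticate(
  {
    email: 'my.gserviceaccount.com',
    keyFile: './accesstoken/key.pem', // a plain path, no fs.readFileSync
    scopes: ['https://www.googleapis.com/auth/drive.readonly']
  },
  function (err, token) {
    if (err) return console.error(err);
    console.log(token);
  });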
I'm working on an automated testing project using PuppeteerJS in headless Chrome, and I'm trying to integrate the existing screenshot functionality with the AWS SDK to upload images to an AWS S3 bucket on test failure.
The problem I'm having is that the subdirectories in the screenshots folder and the image file names are generated randomly in another file, based on the current date and test environment, every time a test runs. The format of the generated directories/files is "screenshots/year/month/day/randomname.png".
The next step in the test is that, after the screenshots are generated, the folder containing the newly created images should be uploaded to AWS. I've tried to achieve this using a glob to get every subdirectory and file with a png extension, like "screenshots/**/**/**/*.png", but I get a "no such file or directory" error. The folder/file names will be different every time the tests run.
I've just started using AWS and I haven't been able to find a specific answer to my problem while researching.
import { PutObjectCommand } from "@aws-sdk/client-s3";
import { s3Client } from "../libs/s3Client.js";
import path from "path";
import fs from "fs";

const file = "../../screenshots/**/**/**/*.png";
const fileStream = fs.createReadStream(file);

// Set the parameters
export const uploadParams = {
  Bucket: "bucket-name",
  Key: path.basename(file),
  // Add the required 'Body' parameter
  Body: fileStream,
};

// Upload file to specified bucket.
export const run = async () => {
  try {
    const data = await s3Client.send(new PutObjectCommand(uploadParams));
    console.log("Success", data);
    return data; // For unit tests.
  } catch (err) {
    console.log("Error", err);
  }
};

run();
Worked this out with the help of Jarmod. I needed to use the Node.js fs module to get the file paths recursively, which returns strings that can be passed into the AWS fileStream variable so each file can be uploaded to AWS. Jarmod shared the WebMound article, and I found the Coder Rocket Fuel article helpful also.
https://www.webmound.com/nodejs-get-files-in-directories-recursively/
https://coderrocketfuel.com/article/recursively-list-all-the-files-in-a-directory-using-node-js
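For reference, a rough sketch of that approach might look like the following (assuming the same s3Client helper, bucket name, and screenshots layout as in the question): it walks the screenshots folder recursively with fs and uploads each .png it finds, whatever the generated sub-folder names are.
import { PutObjectCommand } from "@aws-sdk/client-s3";
import { s3Client } from "../libs/s3Client.js";
import path from "path";
import fs from "fs";

// Collect every .png file under a directory, recursing into sub-folders.
const listPngs = (dir) =>
  fs.readdirSync(dir, { withFileTypes: true }).flatMap((entry) => {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) return listPngs(full);
    return entry.name.endsWith(".png") ? [full] : [];
  });

export const run = async () => {
  for (const file of listPngs("../../screenshots")) {
    // Upload each screenshot individually; the key here is just the file name.
    await s3Client.send(
      new PutObjectCommand({
        Bucket: "bucket-name",
        Key: path.basename(file),
        Body: fs.createReadStream(file),
      })
    );
    console.log("Uploaded", file);
  }
};

run();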
Please help me to solve this.
My JS file is in the Next JS app at pages/api/profile, and google-cloud-key.json is in the Next JS app root folder itself, where package.json is.
Everything works fine locally, but the error below is thrown on Vercel:
ENOENT: no such file or directory, open '/var/task/google-cloud-key.json' vercel
My google-cloud-key.json is in the root directory of the Next JS app.
I have also tried the code below:
const storage = new Storage({
  projectId: 'careful-relic-319511',
  keyFilename: __dirname + '/../../../google-cloud-key.json'
});
This is now giving the error below:
ENOENT: no such file or directory, open '/var/task/.next/server/google-cloud-key.json'] {
  errno: -2,
  code: 'ENOENT',
  syscall: 'open',
  path: '/var/task/.next/server/google-cloud-key.json'
}
END
Hey, I found a great solution:
Base64 encode the json file. (https://www.base64encode.org/)
Set an env variable (both locally and in the Vercel settings) to this encoded string (in my case GCP_CRED_FILE).
const gcsKey = JSON.parse(
  Buffer.from(process.env.GCP_CRED_FILE, 'base64').toString()
);

const storage = new Storage({
  credentials: {
    client_email: gcsKey.client_email,
    private_key: gcsKey.private_key
  },
  projectId: gcsKey.project_id
});
This works for me; tested using Next.js 11 and Vercel deployments.
Moreover, you don't need to store the file or push it to a repo (which can be a security threat).
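If it helps, a one-off local snippet like this (an assumption on my part, not part of the Vercel setup itself) can produce the base64 string to paste into the env variable:
// Run once locally to print the base64-encoded key file for GCP_CRED_FILE.
const fs = require("fs");
console.log(fs.readFileSync("./google-cloud-key.json").toString("base64"));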
I have a use case where I have to read a ZIP file and pass it to the creation of a Lambda as a template.
Now I want to read the zip file from a public S3 bucket. How can I read the file from the public bucket?
The S3 zip file I am reading is https://lambda-template-code.s3.amazonaws.com/LambdaTemplate.zip
const zipContents = 'https://lambda-template-code.s3.amazonaws.com/LambdaTemplate.zip';
var params = {
  Code: {
    // here below I have to pass the zip file, reading it from the S3 public bucket
    ZipFile: zipContents,
  },
  FunctionName: 'testFunction', /* required */
  Role: 'arn:aws:iam::149727569662:role/ROLE', /* required */
  Description: 'Created with template',
  Handler: 'index.handler',
  MemorySize: 256,
  Publish: true,
  Runtime: 'nodejs12.x',
  Timeout: 15,
  TracingConfig: {
    Mode: "Active"
  }
};

lambda.createFunction(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(data); // successful response
});
The above code gives the error: Could not unzip uploaded file. Please check your file, then try to upload again.
How can I read the file from the URL and pass it in the params?
Can anyone help me here?
Using the createFunction docs and specifically the Code docs, you can see that ZipFile expects
The base64-encoded contents of the deployment package. AWS SDK and AWS CLI clients handle the encoding for you.
and not a URL of where it is. Instead, you need to use S3Bucket and S3Key.
It is not clear from the docs that public buckets are allowed for this purpose, but the docs do say
An Amazon S3 bucket in the same AWS Region as your function. The bucket can be in a different AWS account.
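A minimal sketch of the Code block using S3Bucket and S3Key instead of a URL; the bucket and key names below are inferred from the question's URL, so treat them as assumptions:
// Reference the deployment package by bucket and key rather than a URL.
var params = {
  Code: {
    S3Bucket: 'lambda-template-code', // inferred from the question's URL
    S3Key: 'LambdaTemplate.zip',
  },
  FunctionName: 'testFunction', /* required */
  Role: 'arn:aws:iam::149727569662:role/ROLE', /* required */
  Handler: 'index.handler',
  Runtime: 'nodejs12.x',
};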
I want to read a zip file from local disk in Node JS (I would also be hosting this backend on a Linux-based server).
In the code below, the exampleLambda.zip file is in the same folder where the .js code file exists.
const zipContents = fs.readFileSync('exampleLambda.zip');
var params = {
  Code: {
    // here below I pass the zip file, read from the local folder
    ZipFile: zipContents,
  },
  FunctionName: 'testFunction', /* required */
  Role: 'arn:aws:iam::149727569662:role/ROLE', /* required */
  Description: 'Created with template',
  Handler: 'index.handler',
  MemorySize: 256,
  Publish: true,
  Runtime: 'nodejs12.x',
  Timeout: 15,
  TracingConfig: {
    Mode: "Active"
  }
};

lambda.createFunction(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(data); // successful response
});
While reading the file, it throws the error Error: ENOENT: no such file or directory, open 'exampleLambda.zip'. I have also tried:
const zipContents = fs.readFileSync('./exampleLambda.zip');
The file is in the same folder where the .js file exists.
Why does this throw an error? I am using the fs lib from Node.js. Is there any way around this?
Also, I want a solution that also works on the server, which is Linux-based, because I am going to upload it there.
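One common cause (an assumption here, since the working directory isn't shown) is that fs.readFileSync resolves relative paths against process.cwd(), not the script's folder, so a sketch that builds an absolute path from __dirname is more robust on both local and Linux servers:
const fs = require('fs');
const path = require('path');

// Resolve the zip relative to this script's folder rather than the
// process working directory, which may differ when run on the server.
const zipContents = fs.readFileSync(path.join(__dirname, 'exampleLambda.zip'));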
I am making a Twitch chat bot using the tmi.js module. It dawned on me that having the OAuth token within the main js file may not be the most secure practice. How do I separate the token from the main file and include it in my main app?
let opts = {
  identity: {
    username: <BOT USERNAME>,
    password: 'oauth:' + <OAUTH TOKEN>
  },
  channels: [
    <CHANNEL NAME>
  ]
}
You can create a .env file and add it to your .gitignore file.
In the .env file, insert your variable like this:
OAUTH_TOKEN=yourToken
In the opts object you can reference the token like this:
process.env.OAUTH_TOKEN
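For example, a minimal sketch assuming the dotenv package is installed to load the .env file; BOT_USERNAME and CHANNEL_NAME are extra variables I'm assuming you would add the same way:
// Load .env into process.env before building the bot options.
require('dotenv').config();

let opts = {
  identity: {
    username: process.env.BOT_USERNAME,          // assumed additional env variable
    password: 'oauth:' + process.env.OAUTH_TOKEN
  },
  channels: [process.env.CHANNEL_NAME]           // assumed additional env variable
};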