Can't access EC2 instance IAM Role Credentials from NodeJS JavaScript

I've successfully added an AWS Role to an EC2 instance. When I ssh into the instance and curl http://169.254.169.254/latest/meta-data/iam/security-credentials/, I can see the temporary credentials just fine.
On this EC2 instance, I have an NGINX reverse proxy serving a NodeJS web app. I want to be able to access DynamoDB from the JavaScript of the web app, but AWS.config.credentials is null.
From what I've read, shouldn't these credentials be loaded automatically since the role was applied to the EC2 instance?
Is there some way to pass those credentials into the web app that I'm missing?
The DynamoDB client is being set up like this:
private dynamoDB = new AWS.DynamoDB();
private dynamoDBClient = new AWS.DynamoDB.DocumentClient({ service: this.dynamoDB, convertEmptyValues: true });
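
For reference, a minimal server-side sketch (assuming the aws-sdk v2 for Node.js and code running on the EC2 instance itself; the region is a placeholder) of loading the instance-profile credentials explicitly:

// Sketch only: explicitly load instance-profile credentials in server-side Node.js.
// Browser JavaScript cannot reach http://169.254.169.254, so this has to run in the
// Node process on the instance, not in the client bundle served to the browser.
const AWS = require('aws-sdk');

AWS.config.credentials = new AWS.EC2MetadataCredentials();
AWS.config.update({ region: 'us-east-1' }); // assumed region

const dynamoDB = new AWS.DynamoDB();
const dynamoDBClient = new AWS.DynamoDB.DocumentClient({
  service: dynamoDB,
  convertEmptyValues: true
});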

Related

Can't connect to Azure Cache for Redis from Azure Web App

I have a NodeJS Web App which is trying to connect to an Azure Cache for Redis instance that is part of the same subscription.
const redis = require('redis')
const redisConnectionConfig = {
  host: REDIS_HOST,
  port: REDIS_PORT,
  auth_pass: REDIS_PASSWORD,
  tls: { servername: REDIS_HOST }
}
...
redis.createClient(redisConnectionConfig)
I'm able to connect to Redis from my local machine after adding my IP to the Redis firewall rules.
I've also added all 'Outbound IPs and Additional Outbound IP Addresses' from the App Service properties.
I've even tried allowing access from all IPs, but it still doesn't get through.
But it does not connect, and if I try to use Redis I receive the following connection error:
MaxRetriesPerRequestError: Reached the max retries per request limit (which is 20). Refer to "maxRetriesPerRequest" option for details.
Something similar was solved for a VM here, but in the case of App Service, Azure manages that layer.
It doesn't look like a connectivity issue.
You can always check the network side via WebApp -> Console and use the command:
tcpping redisservername:redisserverport
The problem is probably related to your Redis cache size. Which size are you using now?
You can find the Azure Redis limits here.
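
Separately, when debugging from code, a small sketch (assuming the node_redis v3-style API that matches the config above) of attaching event handlers so the underlying connection error is logged instead of failing silently:

// Sketch only: log connection state instead of waiting for command retries to fail.
const client = redis.createClient(redisConnectionConfig);

client.on('connect', () => console.log('Connected to Azure Cache for Redis'));
client.on('error', (err) => console.error('Redis connection error:', err));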

How to use .pem file inside Amazon Lambda to access EC2 instance

I'm currently working on a project that takes place inside the AWS environment. I have configured an S3 bucket to receive mails (the mails are coming from SES, but that's not relevant).
What I want to do is create a Lambda function that can access an EC2 instance and launch a Python script. So far I have the code below. The problem is that when I created my EC2 instance, I didn't create any username or password to connect via SSH. I only have a .pem file (certificate file) to authenticate to the instance.
I did some research but I couldn't find anything useful.
var SSH = require('simple-ssh');

var ssh = new SSH({
  host: 'localhost',
  user: 'username',
  pass: 'password'
});

ssh.exec('python3.6 path/to/my/python/script.py', {
  out: function(stdout) {
    console.log(stdout);
  }
}).start();
I've been thinking of several solutions, but I'm not sure about any of them:
find an SSH library in JavaScript that handles .pem files
convert the .pem into a String (not secure at all, in my opinion)
maybe create a new SSH user in EC2?
Thank you for your time.
A better option would be to use AWS Systems Manager to remotely run commands on your Amazon EC2 instances.
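A rough sketch of that approach (assuming the aws-sdk v2 SSM client, the stock AWS-RunShellScript document, and a placeholder instance ID; the Lambda role would need ssm:SendCommand and the instance must be registered with Systems Manager):

// Sketch only: run the script on the instance via SSM Run Command instead of SSH.
const AWS = require('aws-sdk');
const ssm = new AWS.SSM();

exports.handler = async () => {
  const result = await ssm.sendCommand({
    DocumentName: 'AWS-RunShellScript',
    InstanceIds: ['i-0123456789abcdef0'], // placeholder instance ID
    Parameters: { commands: ['python3.6 path/to/my/python/script.py'] }
  }).promise();

  console.log('Command ID:', result.Command.CommandId);
};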
If you still choose to use simple-ssh then you need to supply an SSH key in config.key when creating your SSH object. You can store the private key in Parameter Store or Secrets Manager and retrieve it within the Lambda. In this case, you should definitely use passwordless SSH (with keypair).
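If you do go the simple-ssh route, here is a minimal sketch (the parameter name, host, and user are assumptions for illustration) of fetching the private key from Parameter Store and passing it as config.key:

// Sketch only: read the .pem contents from SSM Parameter Store (stored as a
// SecureString) and hand it to simple-ssh as the key.
const AWS = require('aws-sdk');
const SSH = require('simple-ssh');

exports.handler = async () => {
  const ssm = new AWS.SSM();
  const param = await ssm.getParameter({
    Name: '/my-app/ec2-ssh-key',   // hypothetical parameter name
    WithDecryption: true
  }).promise();

  const ssh = new SSH({
    host: 'ec2-private-ip-or-dns', // assumed instance address
    user: 'ec2-user',              // typical user on Amazon Linux AMIs
    key: param.Parameter.Value     // contents of the .pem private key
  });

  // Wrap exec in a promise so the Lambda waits for the command to finish.
  await new Promise((resolve, reject) => {
    ssh.exec('python3.6 path/to/my/python/script.py', {
      out: (stdout) => console.log(stdout),
      exit: () => resolve()
    }).on('error', reject).start();
  });
};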

Amazon S3 serving file only to app user in react-native

I am building a social network in react-native / nodejs and using the Amazon S3 service to handle users' personal photos (upload, serve).
But I can't seem to wrap my head around how to serve those images. My question is: how do I serve those uploaded images only to the application users and not to the whole world?
At first I tried to explicitly fetch the image myself; this allowed me to directly put in the S3 credentials, but it doesn't seem practical.
Is there a way to make every GET call made by the app authorized to fetch from my bucket?
What you need are S3 pre-signed URLs: http://docs.aws.amazon.com/AmazonS3/latest/dev/ShareObjectPreSignedURL.html
Instead of fetching images directly, create unique signed URLs to these images (with a custom expiration time, say 1 week) and pass those to your application. This way you can close your S3 bucket to the world, but your application will be able to obtain the images through these private links.
Thanks to @sergey I got to what fits my needs: the getSignedUrl method.
Here is the code that worked for me:
import AWS from 'aws-sdk/dist/aws-sdk-react-native';

const credentials = new AWS.Credentials({ accessKeyId: '', secretAccessKey: '' });
const s3 = new AWS.S3({ credentials, signatureVersion: 'v4', region: '' });

// and there it is.
const url = s3.getSignedUrl('getObject', { Bucket: 'your bucket name', Key: 'the filename' });
Now, each time I loop through an array containing references to my photos, I create a single pre-signed URL for each item and put it into my component.
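For instance, a sketch of that loop (the bucket name, the photoKeys array, and the one-week expiry are placeholders):

// Sketch only: turn an array of S3 object keys into pre-signed URLs.
const photoUrls = photoKeys.map((key) =>
  s3.getSignedUrl('getObject', {
    Bucket: 'my-photos-bucket', // assumed bucket name
    Key: key,
    Expires: 60 * 60 * 24 * 7   // one week, as suggested above
  })
);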
You can use the new AWS Amplify library to accomplish this: https://github.com/aws/aws-amplify
There is an Auth component for getting user credentials and establishing identities, in both an authenticated and an unauthenticated state, as well as a Storage component which has public and private access profiles.
Install via npm:
npm install --save aws-amplify-react-native
You will need to link the project if using Cognito User Pools:
react-native link amazon-cognito-identity-js
More info here: https://github.com/aws/aws-amplify/blob/master/media/quick_start.md#react-native-development
Then pull in the modules:
import Amplify, {Auth, Storage} from 'aws-amplify-react-native'
Amplify.configure('your_config_file');
Storage.configure({level: 'private'});
Storage.get('myfile');
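
A small usage sketch on top of that (assuming the default behaviour where Storage.get resolves to a pre-signed URL for the stored object):

// Sketch only: the resolved URL can be handed to an <Image> source in react-native.
Storage.get('myfile', { level: 'private' })
  .then((url) => console.log('signed URL:', url))
  .catch((err) => console.log(err));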

Firebase Authentication using JavaScript

We are building a web app using Firebase. We are using Firebase Hosting to host static files like CSS, images, and JavaScript. We are also using Firebase Functions to make the web app dynamic.
const app = express();
...
app.use(express.static(__dirname + '/public'));

app.get('/', (req, res) => {
  res.render('index');
});
I have a signIn.js in the public folder, hosted on Firebase Hosting. When I try to call firebase.auth().signInWithPhoneNumber(phoneNumber, appVerifier) in signIn.js, I get an error that "firebase" is not found.
I want to send an OTP to the user's phone using Firebase. Now I am confused whether we should handle it on the client side in the hosted JS files or on the server side in server.js.
Also, how can I invoke a function in the server.js file hosted on Firebase Functions from a static JS file hosted on Firebase Hosting?
firebase.auth().signInWithPhoneNumber is client-side only. You also need to render an invisible reCAPTCHA with it, which can only be done client side. If you are using the Firebase Admin SDK, it allows you to create a user or update an existing user with a phone number, but it assumes you verified the number via your own means.
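A minimal client-side sketch of that flow (the container ID 'sign-in-button' and the phone number are placeholders; it assumes the Firebase JS SDK has already been loaded and initialized on the page):

// Sketch only: invisible reCAPTCHA plus phone sign-in in the hosted signIn.js.
const appVerifier = new firebase.auth.RecaptchaVerifier('sign-in-button', {
  size: 'invisible'
});

firebase.auth().signInWithPhoneNumber('+15551234567', appVerifier)
  .then((confirmationResult) => {
    // SMS sent; confirm with the OTP the user types in
    return confirmationResult.confirm(window.prompt('Enter the OTP'));
  })
  .catch((err) => console.error('Phone sign-in failed:', err));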
If you need to provide authenticated access to some HTTP endpoints via Firebase Functions, you can send the Firebase ID token of the phone-authenticated user to that endpoint (you can get it on the client via currentUser.getIdToken()) and verify it via the Admin SDK's verifyIdToken(idToken).
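A sketch of that server-side check inside the existing Express app (the /private route and the Bearer-token header convention are assumptions):

// Sketch only: verify the client's Firebase ID token in a Functions endpoint.
const admin = require('firebase-admin');
admin.initializeApp();

app.get('/private', async (req, res) => {
  try {
    const idToken = (req.headers.authorization || '').replace('Bearer ', '');
    const decoded = await admin.auth().verifyIdToken(idToken);
    res.send(`Hello ${decoded.phone_number || decoded.uid}`);
  } catch (err) {
    res.status(401).send('Unauthorized');
  }
});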

How to connect to elasticache in my case?

I am trying to connect to aws elasticache from my app.
I know the endpoint and the port but for some reason I can't connect to it.
I used this npm package:
https://www.npmjs.com/package/node-memcached-client
code:
const Memcached = require('node-memcached-client');

const client = new Memcached({
  host: 'mycache.aa11c.0001.use2.cache.amazonaws.com', // fake aws cache endpoint
  port: 11211
});

console.log(client); // I can see it outputs stuff

client.connect()
  .then(c => {
    console.log('connected');
    console.log(c);
  })
  .catch(function(err) {
    console.log('error connecting');
    console.log(err);
  });
For some reason, when I run the code, all I see is
[Memcached] INFO: Nothing any connection to mycache.aa11c.0001.use2.cache.amazonaws.com:11211, created sid:1
with no errors and no 'connected' message in the console. Am I doing something wrong here?
Thanks!
You may want to go over the document below from AWS on accessing ElastiCache resources from outside of AWS:
Access AWS Elasticache from outside
I would recommend setting up a local memcached instance for development and debugging, and connecting to ElastiCache from an EC2 instance in the test and production environments.
The ROI for trying to set up NAT and mapping the IP addresses is not justifiable for dev/test unless absolutely necessary.
I ended up using port forwarding to bypass the issue of the local instance not being able to connect to ElastiCache, by using
ssh -L 11211:{your elasticache endpoint}:11211 {your ec2 instance ip}
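With that tunnel running, the original client code can connect through its local end; a sketch (same node-memcached-client API as above):

// Sketch only: point the client at the local end of the ssh -L tunnel.
const Memcached = require('node-memcached-client');

const client = new Memcached({
  host: '127.0.0.1', // local end of the tunnel; traffic is forwarded to ElastiCache
  port: 11211
});

client.connect()
  .then(() => console.log('connected through the tunnel'))
  .catch((err) => console.log('error connecting', err));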
