I use AWS SSM send-command with the AWS-RunShellScript document, through the AWS command line interface, to send commands to a number of EC2 instances. The command runs a JS script that generates runtime console logs; is there a way to view them on the EC2 instances?
You can use Amazon CloudWatch Logs to monitor, store, and access your
log files from Amazon Elastic Compute Cloud (Amazon EC2) instances. CloudWatch Logs enables
you to centralize the logs from all of your systems, applications, and
AWS services that you use, in a single, highly scalable service.
To install and configure the CloudWatch agent, refer to the AWS documentation on setting up the CloudWatch agent on EC2 instances.
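If the script is launched through Run Command, SSM itself can also capture stdout/stderr. As a sketch (the instance ID, log group name, and script path below are placeholders), the command's output can be shipped to CloudWatch Logs via CloudWatchOutputConfig, or fetched afterwards with get_command_invocation:

```python
def build_send_command_args(instance_ids, commands, log_group):
    """Arguments for ssm.send_command that also ship stdout/stderr to CloudWatch Logs."""
    return {
        "InstanceIds": instance_ids,
        "DocumentName": "AWS-RunShellScript",
        "Parameters": {"commands": commands},
        "CloudWatchOutputConfig": {
            "CloudWatchOutputEnabled": True,
            "CloudWatchLogGroupName": log_group,
        },
    }

if __name__ == "__main__":
    import boto3  # needs AWS credentials with SSM permissions

    ssm = boto3.client("ssm")
    args = build_send_command_args(
        ["i-0123456789abcdef0"],      # placeholder instance ID
        ["node /opt/app/script.js"],  # placeholder script path
        "/ssm/run-command-output",    # placeholder log group
    )
    command_id = ssm.send_command(**args)["Command"]["CommandId"]
    # The same output can also be read back per instance once the command finishes:
    result = ssm.get_command_invocation(
        CommandId=command_id, InstanceId="i-0123456789abcdef0"
    )
    print(result["StandardOutputContent"])
```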
I'm trying to install a Node.js server for a sentiment analysis service that uses my Twitter account: it retrieves the tweets on my profile, produces statistical output, and saves the results in a MongoDB instance.
I have uploaded my Node.js code to an AWS virtual machine that has a public IP address and permission to create an endpoint with the HTTP and HTTPS protocols.
I installed the Node.js code successfully on an AWS virtual machine running Windows Server 2019, using the npm install -g -n command with no dependency errors, but when I try to connect to the AWS virtual machine at http://ip_public_address:8080 I get the error "Unable to connect - ERR_CONNECTION_TIMED_OUT".
This is the link to the github project that I need to install and to work on AWS virtual machine:
https://github.com/thisandagain/sentiment
Maybe I am confused about how to reach the index.html page on the AWS virtual machine: I don't know whether the page must be requested via the public IP address or via localhost, and what is required at the Node.js level for the AWS virtual machine to respond to my browser with the content of the index.html page.
Can you give me any advice on how to implement this project successfully?
Thanks
Filippo
You don't mention security groups in your question at all, so the likely cause is that you never opened port 8080 in the security group assigned to the EC2 instance. You may also need to open that port in the Windows firewall on that server.
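As a sketch of opening the port (the security group ID and rule description are placeholders), the ingress rule can be added with boto3; on the Windows side, an equivalent firewall rule would be something like netsh advfirewall firewall add rule name="node8080" dir=in action=allow protocol=TCP localport=8080:

```python
def build_ingress_rule(port, cidr="0.0.0.0/0"):
    """One TCP ingress permission for ec2.authorize_security_group_ingress."""
    return {
        "IpProtocol": "tcp",
        "FromPort": port,
        "ToPort": port,
        "IpRanges": [{"CidrIp": cidr, "Description": "Node.js app"}],
    }

if __name__ == "__main__":
    import boto3  # needs AWS credentials with EC2 permissions

    ec2 = boto3.client("ec2")
    ec2.authorize_security_group_ingress(
        GroupId="sg-0123456789abcdef0",  # placeholder security group ID
        IpPermissions=[build_ingress_rule(8080)],
    )
```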
I configured a VPC for my Lambda function because I wanted to use AWS EFS, and now I get the following error when the function tries to get data from a third-party application.
"Error: connect ETIMEDOUT 35.157.139.105:443 at
TCPConnectWrap.afterConnect [as oncomplete] (net.js:1159:16) "
Can someone please point out what I'm missing here?
I had the same issue; the problem was with the VPC.
From AWS's Troubleshoot networking issues in Lambda:
By default, Lambda runs your functions in an internal virtual private
cloud (VPC) with connectivity to AWS services and the internet. To
access local network resources, you can configure your function to
connect to a VPC in your account. When you use this feature, you
manage the function's internet access and network connectivity with
Amazon Virtual Private Cloud (Amazon VPC) resources.
I solved it by creating a VPC with two private subnets, VPC endpoints, a public subnet with a NAT gateway, and an internet gateway. Internet-bound traffic from functions in the private subnets is routed to the NAT gateway via a route table, according to the AWS instructions here.
I built the VPC in CloudFormation using this YAML (from the AWS documentation above) and connected the Lambda function to the new VPC, its subnets, and its security group.
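The key piece is the private subnets' route table sending internet-bound traffic to the NAT gateway. A minimal boto3 sketch of that one route (the route table and NAT gateway IDs are placeholders):

```python
def build_nat_route(route_table_id, nat_gateway_id):
    """Default route sending a private subnet's internet traffic to a NAT gateway."""
    return {
        "RouteTableId": route_table_id,
        "DestinationCidrBlock": "0.0.0.0/0",
        "NatGatewayId": nat_gateway_id,
    }

if __name__ == "__main__":
    import boto3  # needs AWS credentials with EC2/VPC permissions

    ec2 = boto3.client("ec2")
    # Attach the default route to the route table used by the private subnets.
    ec2.create_route(**build_nat_route("rtb-0123456789abcdef0",
                                       "nat-0123456789abcdef0"))
```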
The question is pretty straightforward. I would like to control a drone (a Bitcraze Crazyflie) using a Google Home. The input is "Drone, fly to x3 y4", processed as usual by Firebase etc., resulting in the Google Assistant output "Flying to x3 y4", but also in an output in e.g. JSON format to navigate the drone. Because the drone is controlled with Python, that is the preferred output language.
EDIT: Added more context.
Currently I'm using a Node server running this code:
'use strict';
// Import the Dialogflow module from the Actions on Google client library.
const {dialogflow} = require('actions-on-google');
// Import the firebase-functions package for deployment.
const functions = require('firebase-functions');
// Instantiate the Dialogflow client.
const app = dialogflow({debug: true});
// Handle the Dialogflow intent named 'fly'.
// The intent collects parameters named 'xaxis, yaxis'.
app.intent('fly', (conv, {xaxis,yaxis}) => {
const xAxis = xaxis;
const yAxis = yaxis;
// Respond with the user's coordinates and end the conversation.
conv.close('Roger that, flying to ' + xAxis + ", " + yAxis);
});
// Set the DialogflowApp object to handle the HTTPS POST request.
exports.dialogflowFirebaseFulfillment = functions.https.onRequest(app);
Now I would like to get the const xAxis and yAxis values and use them in a Python program. I've tried using
process.stdout.write(xAxis + yAxis);
and listening in Python with something like
out = sensor.stdout.read(1)
but the code runs on Google's servers, so listening on a local port does not work.
Thanks for your help.
The best approach is having another machine on GCP rather than communicating with your home PC. You'll learn more and have an easier time, in the long run, building solutions. As I'm more familiar with AWS than with GCP, I can't cite the exact network/security components you need to configure, but the docs say you don't have to. So, in theory, it should just be a matter of spinning up another compute machine with your Python code running on it.
Were you to decide on speaking to your home PC, you'd need to forward ports on your router. It currently acts as a firewall for your LAN devices and doesn't permit outside machines to initiate connections to your internal addresses (e.g. your GCP machine initiating a connection to your home PC); the other way around is permitted by default. If you think about it, your router has one WAN IP address, but your LAN can have multiple devices (multiple LAN IPs). If your GCP machine connects to your router's WAN IP on port 8080, which LAN IP should the router forward it to? You have to help your router and tell it explicitly.
Once you have a networking solution in place, you can debug the connectivity itself (server can talk to client) by using netcat (nc/ncat, depending on Linux distro). Netcat is a versatile networking tool with which you can purely open connections (before you add in your program to the debugging stack) and assure the networking part of your solution is working as intended.
nc -v <destination_ip> <port>
Simple.
This should get you to where you want to be. A working connection between your GCP drone controller and the Python processor machine.
Bonus - If you want a quick way to have your machine (PC or otherwise) listen on a port, you can use Python's built-in HTTP file server module with
python -m http.server 8080
This will serve files from the directory you ran this command. So, keep that in mind if you're open to the world.
Or, a simple "echo server", using netcat.
nc -v -l 8080
Lastly, for a solid Python HTTP API framework, I highly recommend FastAPI. It lets you quickly write an HTTP API server with, for example, a POST method that your GCP drone controller can call. It has the great bonus of generating interactive OpenAPI docs for your code (example) and, using third-party tools from Swagger (which you can see in the linked example), generating server/client/testing "boilerplate" code. Did I also mention their docs are great?
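To make the last mile concrete, here is a dependency-free sketch of the receiving side using only Python's standard library (the /fly path and the JSON field names x and y are made up for illustration; a FastAPI version would look similar but shorter). The Dialogflow fulfillment, or any HTTP client, can POST the coordinates to it:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class FlyHandler(BaseHTTPRequestHandler):
    """Accepts POST /fly with a JSON body like {"x": 3, "y": 4}."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length))
        x, y = body["x"], body["y"]
        # Hand the coordinates to the drone-control code here.
        reply = json.dumps({"status": "flying", "x": x, "y": y}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(reply)))
        self.end_headers()
        self.wfile.write(reply)

    def log_message(self, fmt, *args):
        pass  # silence per-request logging

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), FlyHandler).serve_forever()
```

From Node, the same coordinates would go out as a plain HTTPS POST instead of process.stdout.write.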
I want to run some load tests, but my PC cannot generate more requests than the server can handle. So I would like to run these tests on Amazon ECS. Is there a way to run k6 on the Amazon cloud instead of their LoadImpact cloud, and if so, how?
Yes, you can run k6 on the Amazon cloud. The easiest way is probably to stand up a CentOS or Ubuntu server in EC2 and then install k6 on it; this is how I have run it. Then install InfluxDB along with Grafana on the server, feed the output of the load test from k6 into InfluxDB, and there is a community Grafana dashboard that will display stats from the load test. It will be close to using LoadImpact. You will still need to use LoadImpact to create the scripts if you are using the web browser plugin, but that is free.
You can run k6 inside a Docker container too. I have not done that yet, but a coworker did. I am going to look into using ECS to run the container version of k6, but I have not tried it yet. You would still want an EC2 instance with InfluxDB to grab the data out of k6.
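As a sketch of that ECS idea (the cluster name, task definition, subnet/security group IDs, container name, script path, and InfluxDB URL are all placeholders; it assumes a task definition whose container runs the k6 image), a one-off k6 task could be launched with boto3, pointing k6's --out flag at the InfluxDB instance:

```python
def build_k6_task(cluster, task_def, subnets, security_groups, script, influx_url):
    """Arguments for ecs.run_task that launch one Fargate task running k6."""
    return {
        "cluster": cluster,
        "taskDefinition": task_def,
        "launchType": "FARGATE",
        "count": 1,
        "networkConfiguration": {
            "awsvpcConfiguration": {
                "subnets": subnets,
                "securityGroups": security_groups,
                "assignPublicIp": "ENABLED",
            }
        },
        "overrides": {
            "containerOverrides": [{
                "name": "k6",  # container name from the task definition
                "command": ["run", "--out", "influxdb=" + influx_url, script],
            }]
        },
    }

if __name__ == "__main__":
    import boto3  # needs AWS credentials with ECS permissions

    ecs = boto3.client("ecs")
    ecs.run_task(**build_k6_task(
        "load-testing",              # placeholder cluster
        "k6-runner:1",               # placeholder task definition
        ["subnet-0123456789abcdef0"],
        ["sg-0123456789abcdef0"],
        "/scripts/test.js",
        "http://10.0.0.5:8086/k6",   # placeholder InfluxDB URL
    ))
```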
I need to execute a command on an AWS EC2 instance on demand, rather than on a scheduled cron. How can I trigger a script on the command line of my EC2 instance just by using the web browser?
I have made an attempt with a Zapier webhook, but I'm struggling to link it to the EC2 CLI.
Use Systems Manager Run Command or Fabric over SSH (example here, though it won't work from within a browser).
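One hedged sketch of the Run Command route: put a small Lambda function behind an HTTPS endpoint (API Gateway or a Lambda function URL), so that opening the URL in a browser sends the command to the instance via SSM. The instance ID and script path are placeholders, and the instance needs the SSM agent plus an instance profile that allows Run Command:

```python
import json

def build_run_args(instance_id, command):
    """Arguments for ssm.send_command to run one shell command on one instance."""
    return {
        "InstanceIds": [instance_id],
        "DocumentName": "AWS-RunShellScript",
        "Parameters": {"commands": [command]},
    }

def lambda_handler(event, context):
    """Invoked via an HTTPS endpoint; triggers the script, returns the command ID."""
    import boto3  # available in the Lambda Python runtime

    ssm = boto3.client("ssm")
    resp = ssm.send_command(**build_run_args(
        "i-0123456789abcdef0",         # placeholder instance ID
        "/home/ec2-user/myscript.sh",  # placeholder script path
    ))
    return {
        "statusCode": 200,
        "body": json.dumps({"commandId": resp["Command"]["CommandId"]}),
    }
```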