simple node.js example in aws lambda - javascript

I am trying to send a simple request with aws lambda.
My module structure is as follows:
mylambda
|-- index.js
|-- node_modules
| |-- request
I zip the file up and it is uploaded to lambda.
Then I invoke it, and it returns the following error. "errorMessage": "Cannot find module 'index'"
Here is the contents of the index.js file
var request = require('request');

exports.handler = function(event, context) {
    var headers = {
        'User-Agent': 'Super Agent/0.0.1',
        'Content-Type': 'application/x-www-form-urlencoded'
    };
    // Configure the request
    var options = {
        url: 'https://myendpoint',
        method: 'POST',
        headers: headers,
        form: { 'payload': { "text": "" } }
    };
    // Start the request
    request(options, function (error, response, body) {
        if (!error && response.statusCode == 200) {
            console.log(body);
        }
    });
    console.log('value1 =', event.key1);
    context.succeed(event.key1); // Echo back the first key value
};
Any help is appreciated, Thanks

All working now. I had to increase the Timeout(s) value in the advanced settings, as the request was taking longer than the default 3 seconds.
I also had to make sure my node modules were correctly installed; I had broken the request module while trying to figure out what was wrong.
To reinstall the module, I deleted node_modules and re-installed request:
deleted node_modules
npm init
added the dependency "request": "*" to package.json
npm install
Then I compressed the zip and uploaded it; all working now. :)

You have to zip and upload the contents of the folder, not the root folder itself. For your example, zip the following and then upload:
|-- index.js
|-- node_modules
| |-- request

Task: write an AWS Lambda function.
How we usually see it done:
1. Write the code in the AWS editor and run it.
2. When it doesn't run as expected, scatter a lot of console.log calls through it (because we can't step-debug our code).
3. Wait a few seconds, then check the console output in another window, and keep switching windows until the problem is resolved.
4. Switching windows takes a lot of time and effort.
Why can't we write the code on our own server (not in the AWS editor) and then send that code to AWS?
Yes, we can, thanks to the new Function concept (https://davidwalsh.name/new-function), a blessing in disguise.
Sample code:
let fs = require('fs');
const aws = require("aws-sdk");
const s3 = new aws.S3(),
    async = require('async');

aws.config = {
    "accessKeyId": "xyz",
    "secretAccessKey": "xyz",
    "region": "us-east-1"
};

// This runs inside an Express route handler, so `req` and `res` are available.
fs.readFile('path to your code file', 'utf-8', async (err, code) => {
    if (err) return res.status(500).send({ err });

    // Only this function has to go into the AWS editor.
    async function uploadToS3(docs) {
        let func = new Function('docs', 'aws', 's3', 'async', `${code}`);
        return func(docs, aws, s3, async);
    }

    // This line calls the AWS Lambda function from our server.
    let resp = await uploadToS3(req.files.docs);
    return res.send({ resp });
});
Code which I have written in my file:
docs = Array.isArray(docs) ? docs : [docs];
let funArray = [];
docs.forEach((value) => {
    funArray.push(function (callback) {
        s3.upload({
            Bucket: "xxx",
            Body: value.data,
            Key: "anurag" + "/" + new Date(),
            ContentType: value.mimetype
        }, function (err, res) {
            if (err) {
                return callback(err, null);
            }
            return callback(null, res);
        });
    });
});
return new Promise((resolve, reject) => {
    async.parallel(funArray, (err, data) => {
        resolve(data);
    });
});
Benefits:
As the whole code is written in our familiar IDE, it is easy to debug.
Only two lines have to go into the AWS editor (isn't that easy?).
It is quite easy to modify/update the code (the code lives in our repo, so we may not even have to open the AWS editor).
Yes, we can use other third-party libraries too, but the code above is written in pure JavaScript; no third-party library is used there.
Also, you don't have to deploy your code this way.
Sometimes the total size of our libraries grows to over 5 MB and the AWS Lambda editor stops supporting it. This approach solves that problem as well: we now send only the required functions from a library, not the whole library. On average, an async library contains hundreds of functions, but we use only one or two of them, so we send just the functions we are going to use.
Note:
I searched a lot for this but found this kind of approach nowhere.
The piece of code above uploads docs to the S3 bucket.


How to download file from deployed server?

Can someone tell me why the function below properly downloads my file from the server when I work locally (on localhost), but returns a 500 internal server error when I deploy my app on a remote server?
async downloadFile(fileId: number): Promise<Buffer> {
    const fileName = await this.getFileName(fileId);
    const fileBuffer = await new Promise<Buffer>((resolve, reject) => {
        fs.readFile(process.cwd() + '/files/' + fileName + '.xlsx', {}, (err, data) => {
            if (err) reject(err);
            else resolve(data);
        });
    });
    return fileBuffer;
}
thanks for any help
EDIT, ERROR FROM LOG:
ENOENT: no such file or directory
If you want to access your file relative to your script's directory, you should use __dirname.
Also, using the path module to build your file location in a platform-agnostic way is good practice.
const path = require('path')
const filePath = path.join(__dirname, 'files', `${fileName}.xlsx`)
process.cwd() refers to your Node process's working directory. Using it in your context ties your code to how the entry point was called, which is bad: code should not have to be aware of its execution context whenever that is avoidable.
An even better way would be to make your file location configurable (using an environment variable or a config file) and pass your download folder value to your code this way.
see https://12factor.net/config
example
const baseDir = process.env.FILES_PATH || '/some/default/location';
const filePath = path.join(baseDir, 'files', `${fileName}.xlsx`);
then run your program with
FILES_PATH=/your/directory node your_script.js

npm project behind a corporate proxy global

I have already found a lot of helpful tutorials on setting the proxy locally and globally to install packages and so on.
Now I have started a new project and figured out how to reuse the proxy settings:
#! /usr/bin/env node
var http = require("http");
var shell = require('shelljs');
var request = require('request');
var iplocation = require('iplocation');

// setup proxy
var proxyUrl = shell.exec('npm config get proxy', {silent: true}).stdout;
var proxiedRequest = request.defaults({
    'proxy': proxyUrl,
    'https-proxy': proxyUrl,
    'strict-ssl': false
});

// get location (works)
proxiedRequest('http://ipinfo.io/216.58.194.46', function (error, response, body) {
    console.log('error:', error);
    console.log('statusCode:', response && response.statusCode);
    console.log('body:', body);
});

// doesn't work
iplocation('56.70.97.8').then(res => {
    console.log(res.iplocation);
}).catch(err => {
    console.error(err);
});
Is there a way to set it globally for the project somehow, so other npm packages could use it too?
I tried a local .npmrc file in the project folder but it doesn't affect the environment at all.
Any hints are welcome. Thanks
This SO answer1 and SO answer2 explain different ways to set an npm proxy. See if they help you.
You could add functions like proxy_on and proxy_off to your bashrc which will let you set global npm config and toggle it from your command line.
Refer to this gist for the code.
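As a rough sketch of what such bashrc helpers look like (the proxy URL is a placeholder for your corporate proxy; see the gist for the full version):

```shell
# Toggle the global npm proxy settings from the command line.
proxy_on() {
  local url="http://proxy.example.com:8080"  # placeholder proxy URL
  npm config set proxy "$url"
  npm config set https-proxy "$url"
  echo "npm proxy enabled"
}

proxy_off() {
  npm config delete proxy
  npm config delete https-proxy
  echo "npm proxy disabled"
}
```

After sourcing your bashrc, run proxy_on before installing behind the proxy and proxy_off when off the corporate network.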

Can't load scripts because of node.js

I can't use <script src="node_modules/jquery/dist/jquery.min.js"></script>
in my index.html file because of:
Failed to load resource: the server responded with a status of 404 (Not Found)
http://localhost:8080/node_modules/jquery/dist/jquery.min.js
This happens because of my else statement in server file code:
var http = require('http')
var url = require('url')
var fs = require('fs')

var server = http.createServer((req, res) => {
    let parsedUrl = url.parse(req.url, true)
    if (parsedUrl.pathname === '/') {
        console.log('home page')
        fs.readFile('./index.html', (err, data) => {
            res.writeHead(200, { 'Content-Type': 'text/html' })
            res.end(data)
        })
    } else if (parsedUrl.pathname === '/readJson') {
        console.log('read json')
        fs.readFile('./data.json', (err, data) => {
            res.writeHead(200, { 'Content-Type': 'application/json' })
            res.end(data)
        })
    } else {
        console.log('We can\'t load any resources because of this statement')
        res.writeHead(404)
        res.end()
    }
})

server.listen(8080)
I've read about how to fix this problem when using express module. Is there any way to solve the problem without using that module?
The easiest way would be to simply load jQuery from a CDN instead of serving it from your own server. This is a widely accepted best practice.
Example:
<script src="https://code.jquery.com/jquery-3.2.1.min.js" integrity="sha256-hwg4gsxgFZhOsEEamdOYGBf13FyQuiTwlAQgxVSNgt4=" crossorigin="anonymous"></script>
You can find various options for loading jQuery from a CDN here: https://code.jquery.com
If you create an HTTP server like in your code example and want it to serve jQuery, then you'd have to read the jquery.min.js using fs.readFile and serve its contents, just like you're doing with your data.json file.
I recommend using a CDN instead. If you install the jQuery module you can use it in your back-end JavaScript; however, you want to use it on the front-end. You are using Node.js as a web server to serve an HTML page, and the http module doesn't know anything about other files because it only reads index.html.
So you might want to look for a solution that reads/serves a complete folder. Within this folder, let's call it public, you can store the HTML, CSS and JS files that are publicly available. Once the HTTP server knows about the entire folder, all of its files can be used on the front-end. This is also a good way to separate your back-end and front-end JavaScript.

Serverless lambda function throwing error when trying to insert data into dynamoDB Table

I am trying to test CRUD functionality on an AWS Lambda function using the Serverless Framework.
Here is the command I am running:
sudo serverless invoke local -f createArticle -p articles/event.json
When I try to create a record an error is thrown. Here is the error in the console:
Syntax Error -------------------------------------------
Unexpected token u in JSON at position 0
For debugging logs, run again after setting the "SLS_DEBUG=*" environment variable.
Get Support --------------------------------------------
Docs: docs.serverless.com
Bugs: github.com/serverless/serverless/issues
Forums: forum.serverless.com
Chat: gitter.im/serverless/serverless
Your Environment Information -----------------------------
OS: darwin
Node Version: 6.10.3
Serverless Version: 1.13.1
Now I have linted my javascript and validated my event.json file.
Here is my javascript:
'use strict';
const AWS = require('aws-sdk');
const dynamo = new AWS.DynamoDB.DocumentClient();
const uuid = require('uuid');

module.exports.handler = (event, context, callback) => {
    const data = JSON.parse(event.body);
    if (data.text && typeof data.text !== 'string') {
        console.log('Validation Failed');
        callback(new Error('Body did not contain a text property.'));
        return;
    }
    const params = {
        TableName: 'BlogTable',
        Item: {
            article_id: "1",
            text: data.text
        }
    };
    const putCallback = (error, result) => {
        if (error) {
            console.log(error);
            callback(new Error('Could not save record.'));
            return;
        }
        //console.log(result);
        const response = {
            statusCode: 200,
            body: JSON.stringify(result.Item)
        };
        callback(null, response);
    };
    dynamo.put(params, putCallback);
};
Here is my event.json file:
{"text":"Hello World"}
Why is this error thrown? And how do I set that environment variable? I tried it in my serverless.yml file but did not get any output.
If you use the "Lambda proxy" integration, you need to use a specific format for the http events that directly invoke your Lambda functions (e.g. when you use serverless invoke from the CLI, as opposed to the AWS API Gateway Management Console, where you can invoke the Lambda function through your API using the "Test" button).
With a Lambda proxy, in this example, you need to create a json file with a "body" property and stringify the value, like this:
{
"body": "{\"text\":\"hello\"}"
}
The reason is: "With the Lambda proxy integration, API Gateway maps the entire client request to the input event parameter of the back-end Lambda function"
http://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-set-up-simple-proxy.html#api-gateway-simple-proxy-for-lambda-input-format
In my event.json files I had to put my JSON like this:
{"body": "{\"text\": \"Hello World\"}"}
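To see why the nesting matters, here is a small sketch: the handler receives the outer event, and the actual payload arrives as a JSON string in event.body that JSON.parse must recover (the sample value mirrors the event.json above):

```javascript
// The event a Lambda proxy handler receives: body is a JSON *string*.
const event = { "body": "{\"text\": \"Hello World\"}" };

// JSON.parse(event.body) recovers the inner payload, as the handler does.
const data = JSON.parse(event.body);
console.log(data.text); // prints "Hello World"
```

Without the outer "body" wrapper, JSON.parse(event.body) receives undefined, which is exactly the "Unexpected token u in JSON at position 0" error above.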

Upload S3 knox node js (signature doesnt match)

I've been trying for many days now to upload a file (message.txt) to AWS S3 using knox and Node.js.
I keep getting a "signature doesn't match" error.
Here is my code in Node.js (the upload was not working, so I'm just trying a GET):
var client = knox.createClient({
    key: 'myAWSkey',
    secret: 'mySecretKey',
    bucket: 'mybucket',
    endpoint: 'mybucket.s3-eu-west-1.amazonaws.com'
});

client.get('/').on('response', function(res) {
    console.log(res.statusCode);
    console.log(res.headers);
    res.setEncoding('utf8');
    res.on('data', function(chunk) {
        console.log(chunk);
    });
}).end();
I also tried comparing against Amazon's test signature using many different methods, like this one: the HTML and Python version.
Nothing worked for me, I'm probably a bit lost in the process...
If someone could give me some big lines to guide me and/or a script to generate correctly a signature in javascript/node js I would be very grateful.
You might want to try the AwsSum library. It's actively maintained and also comes with a load of examples and another repo with more fully featured scripts.
https://github.com/appsattic/node-awssum/
And for your needs, there is an example upload script in the scripts repo (separate GitHub project):
https://github.com/appsattic/node-awssum-scripts/blob/master/bin/amazon-s3-upload.js
Let me know if you need any help or if you get on ok. Disclaimer: I'm the author of AwsSum. :)
I just struggled with this issue for a few days. Assuming you're on Windows, it seems like it's an issue on Knox's end. Apparently the problem has been solved, but the solution has not been pulled into the main project yet.
See this thread: https://github.com/LearnBoost/knox/issues/56
My workaround was to just remove the knox library and clone this repository into my node_modules folder: https://github.com/domenic/knox.git
Hope that helps!
For Node.js, there is an API which helps generate the signature. It is available in the NPM repo under the name aws4.
You can refer to this link: https://www.npmjs.com/package/aws4
To download and install it, use the command below:
npm i aws4
You can also add it to your package.json file (note that https and crypto are Node.js built-ins, so only aws4 actually needs to be installed):
{
    "dependencies": {
        "aws4": "^1.11.0",
        "https": "^1.0.0",
        "crypto": "^1.0.1"
    }
}
The following parameters are used for generating the signature:
host: host of service, mandatory
path: Path of the file being uploaded, mandatory
service: Service name e.g. s3
region: Region name e.g. us-east-1
method: HTTP method e.g. GET, PUT
accessKeyId: Access Key ID for the service, mandatory
secretAccessKey: Secret for access key id, mandatory
sessionToken: Temporary session token, optional
Use the code below to upload a file:
var https = require('https');
var aws4 = require('aws4');
var crypto = require('crypto');
var fs = require('fs');

var fileBuffer = fs.readFileSync('1.jpg'); // Local file which needs to be uploaded
var hashSum = crypto.createHash('sha256');
hashSum.update(fileBuffer);
var hex = hashSum.digest('hex'); // Generate SHA256 of the file

var opts = aws4.sign({
    host: '<host name for the s3 service>',
    path: '<bucket and file path in s3>',
    service: 's3',
    region: 'us-east-1',
    method: 'PUT',
    headers: {
        'X-Amz-Content-Sha256': hex
    },
    body: undefined
}, {
    accessKeyId: '<access key>',
    secretAccessKey: '<secret key>',
    sessionToken: '<session token>'
});

opts.path = '<complete path: https://host+bucket+filepath>';
opts.headers['Content-Type'] = 'image/jpeg'; // Content type of the file
opts.headers['User-Agent'] = 'Custom Agent - v0.0.1'; // Agent name, optional
opts.headers['Agent-Token'] = '47a8e1a0-87df-40a1-a021-f9010e3f6690'; // Agent unique token, optional
opts.headers['Content-Length'] = fileBuffer.length; // Content length of the file being uploaded

console.log(opts); // Prints the generated signature

var req = https.request(opts, function(res) {
    console.log('STATUS: ' + res.statusCode);
    console.log('HEADERS: ' + JSON.stringify(res.headers));
    res.setEncoding('utf8');
    res.on('data', function (chunk) {
        console.log('BODY: ' + chunk);
    });
});
req.on('error', function(e) {
    console.log('problem with request: ' + e.message);
});
req.write(fileBuffer);
req.end();
Note: the SHA256 is generated manually and passed in via the X-Amz-Content-Sha256 header, with body set to undefined in the aws4.sign() call. This matters when uploading a file as binary data, which is why the SHA256 is generated and set before the aws4.sign() call.
The same API can be used for other calls as well, e.g. a GET call for file download.
The sessionToken is optional; it is only required when a temporary session token has been generated for accessing the S3 service.
