I'm returning a response from a Node.js Lambda function and it is giving a blank response like this:
{
"version": "1.0",
"sessionAttributes": {
"outputSpeech": {
"type": "PlainText",
"text": "Welcome to Alexa Skill"
},
"shouldEndSession": true
},
"response": {}
}
The sessionAttributes object should be empty and response should contain the speech output, but exactly the reverse is happening. Here is the code that generates the response:
context.succeed(
buildResponse(
buildSpeechletResponse("Welcome to Alexa Skill", true),
{}
)
)
and these are the helper functions:
function buildSpeechletResponse(outputText, shouldEndSession) {
return {
outputSpeech: {
type: "PlainText",
text: outputText
},
// card: {
// type: "Simple",
// title: title,
// content: output
// },
// reprompt: {
// outputSpeech: {
// type: "PlainText",
// text: repromptText
// }
// },
shouldEndSession: shouldEndSession
};
}
function buildResponse(sessionAttributes, speechletResponse) {
return {
version: "1.0",
sessionAttributes: sessionAttributes,
response: speechletResponse
};
}
Look at the argument order in the buildResponse helper function: you are passing the arguments in reverse. Change the call as follows.
context.succeed(
buildResponse({},
buildSpeechletResponse("Welcome to Alexa Skill", true)
)
)
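With the arguments in that order, sessionAttributes stays empty and the speechlet output ends up in response, so the returned document looks like this:
{
    "version": "1.0",
    "sessionAttributes": {},
    "response": {
        "outputSpeech": {
            "type": "PlainText",
            "text": "Welcome to Alexa Skill"
        },
        "shouldEndSession": true
    }
}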
An action calls the helper function inside a loop. If the helper function raises an error, it exits with a specific code, queryFailed, as follows:
helpers/a/execute.js
module.exports = {
friendlyName: '',
description: '',
inputs: {},
exits: {
queryError: {
description: 'Query error'
},
success: {
description: 'yayyy!! success!'
}
},
fn: async function ({ conditions }, exits) {
let records = [],
MYSQL_QUERY = `SELECT * FROM model WHERE COLUMN = $1`;
try {
records = await Model.getDatastore().sendNativeQuery(MYSQL_QUERY, [['true']]);
}
catch (error) {
return exits.queryFailed(error);
}
return exits.success(records);
}
};
I have the following action that calls the above-mentioned helper function.
controllers/action.js:
module.exports = {
friendlyName: 'Action',
description: 'Performs some action',
inputs: {
param1: {
description: 'param 1',
type: 'string'
},
param2: {
description: 'param 2',
type: 'ref'
}
},
exits: {
invalid: {
description: 'Invalid request',
responseType: 'invalid',
statusCode: 400
},
unexpected: {
description: 'Unexpected error',
responseType: 'unexpected',
statusCode: 500
},
success: {
description: 'success',
statusCode: 200,
outputType: 'ref'
}
},
fn: async function (inputs, exits) {
// Helper Ids
const arr = ['a', 'b'];
let records;
let response = [];
for (const element of arr) {
try {
records = await sails.helpers[element].execute.with({
conditions: conditions
});
}
catch (err) {
if (err.code === 'queryError') {
LOGGER.error('Database Error', err);
return exits.unexpected();
}
return exits.unexpected();
}
response.push(records);
}
return exits.success(response);
}
};
The issue is that, in case of an invalid query, the helper function exits with the queryError code as follows:
return exits.queryFailed(error);
Assuming helper a executes successfully, if there is an error in helper b then ideally the action should not exit. It should continue executing and report the error in the final response for that block.
Expected Response:
{
"rows": [
{
"value": {
"id": "a",
"data": {},
"meta": {}
}
},
{
"error": {
"name": "serverError",
"statusCode": 500,
"message": "Internal server error.",
"id": 2
}
}
]
}
Current behaviour: the action catches the queryError and exits with the error response:
{
"trace": "",
"error": {
"name": "serverError",
"statusCode": 500,
"message": "Internal server error"
}
}
Thank you in advance!
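One way to get the expected behaviour, sketched below, is to keep the try/catch inside the loop but push an error entry onto the response instead of returning from the action. This is only a sketch: it assumes the helper's exit is actually named queryError as declared in its exits, and it reuses the LOGGER and conditions variables from the original action as-is.
fn: async function (inputs, exits) {
    const arr = ['a', 'b'];
    let response = [];
    for (const element of arr) {
        try {
            const records = await sails.helpers[element].execute.with({
                conditions: conditions // as in the original action
            });
            response.push({ value: records });
        } catch (err) {
            // Log the failure for this helper, but keep looping instead of
            // exiting the whole action.
            LOGGER.error('Database Error', err);
            response.push({
                error: {
                    name: 'serverError',
                    statusCode: 500,
                    message: 'Internal server error.'
                }
            });
        }
    }
    return exits.success({ rows: response });
}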
In a test Elasticsearch index, I have indexed a document, and I now want to update the document by setting its length property to 100. I want to do this through scripting (as this is a simplified example to illustrate my problem) via the elasticsearch package.
client.update({
index: 'test',
type: 'object',
id: '1',
body: {
script: 'ctx._source.length = length',
params: { length: 100 }
}
})
However, I receive the following error:
{
"error": {
"root_cause": [
{
"type": "remote_transport_exception",
"reason": "[6pAE96Q][127.0.0.1:9300][indices:data/write/update[s]]"
}
],
"type": "illegal_argument_exception",
"reason": "failed to execute script",
"caused_by": {
"type": "script_exception",
"reason": "compile error",
"script_stack": [
"ctx._source.length = length",
" ^---- HERE"
],
"script": "ctx._source.length = length",
"lang": "painless",
"caused_by": {
"type": "illegal_argument_exception",
"reason": "Variable [length]is not defined."
}
}
},
"status": 400
}
This happens even though I have included the length property in body.params.
Using the following:
Elasticsearch server v6.1.1
Elasticsearch JavaScript client v14.1.0
How can I resolve this issue?
The documentation at https://www.elastic.co/guide/en/elasticsearch/client/javascript-api/current/api-reference.html#api-update is wrong.
In their example, they put:
client.update({
index: 'myindex',
type: 'mytype',
id: '1',
body: {
script: 'ctx._source.tags += tag',
params: { tag: 'some new tag' }
}
}, function (error, response) {
// ...
});
In fact, body.script should be an object:
client.update({
index: 'myindex',
type: 'mytype',
id: '1',
body: {
script: {
lang: 'painless',
source: 'ctx._source.tags += params.tag',
params: { tag: 'some new tag' }
}
}
}, function (error, response) {
// ...
});
Therefore, if you change your script to:
script: {
lang: 'painless',
source: 'ctx._source.length = params.length',
params: { length: 100 }
}
it should work!
You may want to reference the Painless Examples - Updating Fields with Painless page!
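Applied to the call in the question, the full corrected request would look like this (same index, type, and id as above):
client.update({
    index: 'test',
    type: 'object',
    id: '1',
    body: {
        script: {
            lang: 'painless',
            source: 'ctx._source.length = params.length',
            params: { length: 100 }
        }
    }
})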
I'm pushing DynamoDB rows into an Elasticsearch cluster. The date fields are Unix timestamps and are not recognized by Kibana as dates.
I read about Elasticsearch mapping types and found this post, but don't know where to implement the mapping in my Lambda script:
/* ... requires and config ... */
exports.handler = (event, context, callback) => {
event.Records.forEach((record) => {
var dbRecord = JSON.stringify(record.dynamodb);
postToES(dbRecord, context, callback);
});
};
function postToES(doc, context, lambdaCallback) {
var req = new AWS.HttpRequest(endpoint);
req.method = 'POST';
req.path = path.join('/', esDomain.index, esDomain.doctype);
req.region = esDomain.region;
req.headers['presigned-expires'] = false;
req.headers['Host'] = endpoint.host;
req.body = doc;
// Maybe here?
var signer = new AWS.Signers.V4(req, 'es');
signer.addAuthorization(creds, new Date());
var send = new AWS.NodeHttpClient();
send.handleRequest(req, null, function(httpResp) {
var respBody = '';
httpResp.on('data', function (chunk) {
respBody += chunk;
});
httpResp.on('end', function (chunk) {
lambdaCallback(null,'Lambda added document ' + doc);
});
}, function(err) {
console.log('Error: ' + err);
lambdaCallback('Lambda failed with error ' + err);
});
}
Elasticsearch document
{
_index: "posts",
_type: "post",
_id: "6YKF2AAV06RSSRrzv6R-",
_version: 1,
found: true,
_source: {
ApproximateCreationDateTime: 1499922960,
Keys: {
id: {
S: "7asda8b0-628a-11e7-9e5e-25xyc7179dx7"
}
},
NewImage: {
posted_at: {
N: "1499922995401"
},
id: {
S: "7asda8b0-628a-11e7-9e5e-25xyc7179dx7"
}
},
SequenceNumber: "2442423900000000003279639454",
SizeBytes: 221,
StreamViewType: "NEW_AND_OLD_IMAGES"
}
}
Dynamoose Schema
var Schema = dynamoose.Schema;
var s = new Schema({
id: {
type: String,
hashKey: true,
required: true
},
posted_at: {
type: Date,
required: true
}
});
module.exports = dynamoose.model('posts', s);
Example: in my DynamoDB table I have the field posted_at. The content is a Unix timestamp. In Kibana it's indexed as
NewImage.posted_at.N (type: string, searchable, analyzed) and
NewImage.posted_at.N.keyword (type: string, searchable, aggregateable)
I'm confused by the N and type: string.
Any ideas?
Thanks!
OK, it turns out that the N is there to denote the DynamoDB attribute type (i.e. N stands for Number).
The problem is that the number gets stringified and thus indexed as a string in ES (i.e. what you currently see in your mapping).
We can get around this using a dynamic template definition. First delete your index in ES and the corresponding index pattern in Kibana. Then run this command:
curl -XPUT localhost:9200/_template/post_template -H 'Content-Type: application/json' -d '{
"template": "posts",
"mappings": {
"post": {
"dynamic_templates": [
{
"dates": {
"path_match": "NewImage.posted_at.N",
"mapping": {
"type": "date"
}
}
},
{
"strings": {
"match_mapping_type": "string",
"mapping": {
"type": "text",
"fields": {
"raw": {
"type": "keyword",
"ignore_above": 256
}
}
}
}
}
]
}
}
}'
Finally you can reindex your data through Dynamoose and you should be able to find a date field in Kibana afterwards.
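To double-check that the template was picked up, you can look at the mapping of the newly created index (the posts index from the example above) and confirm that NewImage.posted_at.N now has type date:
curl -XGET 'localhost:9200/posts/_mapping?pretty'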
I am very new to GraphQL and trying to understand how to get one record from a query.
This is the result of my current query:
{
"data": {
"todos": null
}
}
I am not sure what is wrong. I would like the result to be this:
{
"data": {
"todos": {
"id": 1,
"title": "wake up",
"completed": true
}
}
}
Here is my code that I've created as I try to learn GraphQL.
schema.js:
var graphql = require('graphql');
var TODOs = [
{
"id": 1,
"title": "wake up",
"completed": true
},
{
"id": 2,
"title": "Eat Breakfast",
"completed": true
},
{
"id": 3,
"title": "Go to school",
"completed": false
}
];
var TodoType = new graphql.GraphQLObjectType({
name: 'todo',
fields: function () {
return {
id: {
type: graphql.GraphQLID
},
title: {
type: graphql.GraphQLString
},
completed: {
type: graphql.GraphQLBoolean
}
};
}
});
var queryType = new graphql.GraphQLObjectType({
name: 'Query',
fields: function () {
return {
todos: {
type: new graphql.GraphQLList(TodoType),
args: {
id: { type: graphql.GraphQLID }
},
resolve: function (source, args, root, ast) {
if (args.id) {
return TODOs.filter(function(item) {
return item.id === args.id;
})[0];
}
return TODOs;
}
}
}
}
});
module.exports = new graphql.GraphQLSchema({
query: queryType
});
index.js:
var graphql = require ('graphql').graphql;
var express = require('express');
var graphQLHTTP = require('express-graphql');
var Schema = require('./schema');
var query = 'query { todos(id: 1) { id, title, completed } }';
graphql(Schema, query).then( function(result) {
console.log(JSON.stringify(result,null," "));
});
var app = express()
.use('/', graphQLHTTP({ schema: Schema, pretty: true }))
.listen(8080, function (err) {
console.log('GraphQL Server is now running on localhost:8080');
});
To run this code I just run node index from the root directory. How can I get one specific record returned by the record's id?
You have the wrong type for the todos field of your queryType. It should be TodoType, not a list of TodoType. You're getting an error because GraphQL expects to see a list, but your resolver is just returning a single value.
By the way, I suggest passing the graphiql: true option to graphqlHTTP, which will let you use GraphiQL to explore your schema and make queries.
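Concretely, the todos field could then look something like this. This is only a sketch based on the schema above: the list fallback from the original resolver is dropped because the field now returns a single todo, and both sides of the id comparison are cast to strings since GraphQLID arguments arrive as strings.
todos: {
    type: TodoType,
    args: {
        id: { type: graphql.GraphQLID }
    },
    resolve: function (source, args) {
        // GraphQLID values are serialized as strings, so normalize before comparing
        return TODOs.filter(function (item) {
            return String(item.id) === String(args.id);
        })[0];
    }
}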
I'm new to AngularJS and so far haven't had any problems until this one...
I am trying to display JSON data returned from my REST service call, without any luck. I can hard-code a data array into my controller script file and it displays on my web page just fine; however, when trying to display my JSON data I'm not having any luck.
This is what I currently have coded...
Web page-
<div ng-controller="ExceptionLogDataController">
<div ui-grid="gridOptions" class="vertexGrid"></div>
</div>
ExceptionLogDataController-
$scope.vertexData = [];
$scope.gridOptions = {
enableSorting: true,
data: "vertexData",
columnDefs: [
{ name: 'Data Id', field: 'DataId' },
{ name: 'Source Date Time', field: 'SourceDateTime' },
{ name: 'Message Text', field: 'MessageText' },
{ name: 'IsDirty', field: 'IsDirty' }
// { name: 'FileName', field: 'FileName' },
// { name: 'GenJIRATicket', field: 'GenJIRATicket' },
// { name: 'MessageCount', field: 'MessageCount' },
// { name: 'MachineName', field: 'MachineName' },
// { name: 'AppDomainName', field: 'AppDomainName' },
// { name: 'ProcessName', field: 'ProcessName' },
// { name: 'StackTrace', field: 'StackTrace' }
],
};
//$scope.vertexData = [
// {
// "First Name": "John",
// "Last Name": "Smith",
// },
// {
// "First Name": "Jane",
// "Last Name": "Doe",
// }
//];
$scope.load = function () {
ExceptionLogDataFactory()
.then(function (response) {
$scope.vertexData = JSON.parse(response.data);
});
}
$scope.load();
}
ExceptionLogDataController.$inject = ['$scope', 'ExceptionLogDataFactory'];
ExceptionLogDataFactory-
var ExceptionLogDataFactory = function ($http, $q, SessionService) {
return function () {
var result = $q.defer();
$http({
method: 'GET',
url: SessionService.apiUrl + '/api/ExceptionLogData',
headers: { 'Content-Type': 'application/json', 'Authorization': 'Bearer ' + SessionService.getToken() }
})
.success(function (response) {
result.resolve(response);
})
.error(function (response) {
result.reject(response);
});
return result.promise;
}
}
ExceptionLogDataFactory.$inject = ['$http', '$q', 'SessionService'];
I've verified that my REST call is returning JSON data through Postman, so the problem lies with my front-end code.
Making progress...
I'm getting my JSON object successfully returned and am trying to display it with the following...
$scope.data = [];
$scope.gridOptions = {
enableSorting: true,
data: 'data',
};
ExceptionLogDataService() //Call to Service that returns json object
.then(function (data) {
$scope.data = data;
$scope.gridOptions.data = $scope.data;
console.log($scope.data);
});
And this is the JSON object that is being returned via the console.log call...
Object { DataId: 1074, SourceDateTime: "2016-01-19T13:29:01.2512456-05:00", MessageText: "There is an error in XML document (…", IsDirty: false, StatusList: Object, FileName: "D:\ProdMonitorSiteDev\ErrorFiles\…", GenJIRATicket: false, MessageCount: 1, MachineName: "VERTEXCUTIL01", AppDomainName: "", 2 more… }
This is the error that I am getting...
Error: newRawData.forEach is not a function
Well, I figured it out!
I finally got my head out of the weeds and really looked at the JSON object being returned from my service, and noticed that it was a single object wrapped in '{}' (curly braces), which is what was causing the newRawData.forEach error. So what I did was the following...
.then(function (data) {
// Stringify the object, wrap it in square brackets '[]',
// then use JSON.parse to parse the new string into a JSON array.
$scope.data = "[" + JSON.stringify(data) + "]";
$scope.data = JSON.parse($scope.data);
// Worked like a champ!
$scope.gridOptions.data = JSON.stringify($scope.data);
});
You don't need to parse the JSON.
$http.get(url)
.success(function (data) {
$scope.gridOptions.data = data;
});
This should work just fine for what you are doing.
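If you prefer to keep the existing factory from the question (ExceptionLogDataFactory) rather than calling $http directly, a minimal sketch of the same idea would be the following; it assumes the factory resolves with the already-parsed response body and wraps a lone object in an array, since ui-grid expects an array:
ExceptionLogDataFactory()
    .then(function (response) {
        // ui-grid needs an array; wrap a single object so its internal forEach works
        $scope.gridOptions.data = angular.isArray(response) ? response : [response];
    });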