I need to store errors in the logs table. If I understood correctly, I should handle this in Handler.ts. This is my Handler.ts:
export default class ExceptionHandler extends HttpExceptionHandler {
  constructor() {
    super(Logger)
  }

  public async handle(error: any, ctx: HttpContextContract) {
    console.log(error.messages)
    await Log.create({
      type: 'menus.store', // if we get an error in the menu store function
      message: error.messages,
    })
    return super.handle(error, ctx)
  }
}
I have two problems. First, how can I know in which method the error occurred, so I can add it to `type` when creating the log? Second, I get the same error response that I'm getting in Insomnia when testing. For example:
{"errors":[{"rule":"number","field":"menuId","message":"number validation failed on menuId"},{"rule":"exists","field":"menuId","message":"exists validation failed on menuId"}]}
But I need the full error; I'm getting the full error in the Visual Studio terminal. How can I add this error to the Log message?
FATAL [17-09-2022, 9:07:53 AM] (projectName/30718 on system): "exists" validation rule failed
err: {
"type": "DatabaseError",
"message": "select 1 from \"menus\" where \"id\" = $1 limit $2 - invalid input syntax for type integer: \"sd\"",
"stack":
error: select 1 from "menus" where "id" = $1 limit $2 - invalid input syntax for type integer: "sd"
at Parser.parseErrorMessage (/home/biping/Development/projectName/backend/node_modules/pg-protocol/src/parser.ts:369:69)
at Parser.handlePacket (/home/biping/Development/projectName/backend/node_modules/pg-protocol/src/parser.ts:188:21)
at Parser.parse (/home/biping/Development/projectName/backend/node_modules/pg-protocol/src/parser.ts:103:30)
at Socket.<anonymous> (/home/biping/Development/projectName/backend/node_modules/pg-protocol/src/index.ts:7:48)
at Socket.emit (node:events:513:28)
at addChunk (node:internal/streams/readable:315:12)
at readableAddChunk (node:internal/streams/readable:289:9)
at Socket.Readable.push (node:internal/streams/readable:228:10)
at TCP.onStreamRead (node:internal/stream_base_commons:190:23)
at TCP.callbackTrampoline (node:internal/async_hooks:130:17)
"length": 103,
"name": "error",
"severity": "ERROR",
"code": "22P02",
"file": "numutils.c",
"line": "256",
"routine": "pg_strtoint32"
}
How can I log this error?
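For both questions, here is a hedged sketch. The route the error came from can usually be read off `ctx` (for example `ctx.routeKey` in AdonisJS v5, though verify that against your version), and the full error can be captured by copying every own property before stringifying, because `message` and `stack` are non-enumerable and are dropped by a plain `JSON.stringify(error)`:

```javascript
// Hedged sketch: Error properties such as `message` and `stack` are
// non-enumerable, so JSON.stringify(error) silently drops them. Copying
// every own property first preserves the full error, including driver
// fields like `code`, `severity`, and `routine` from the pg error above.
function serializeError(error) {
  const plain = {};
  for (const key of Object.getOwnPropertyNames(error)) {
    plain[key] = error[key];
  }
  return JSON.stringify(plain);
}
```

Inside `handle`, something like `await Log.create({ type: ctx.routeKey, message: serializeError(error) })` would then store the whole thing; `ctx.routeKey` is an assumption about the Adonis API, not something confirmed by the question.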
I have introduced a NestJS HTTP exception filter in the project and registered it globally in main.ts:
const { httpAdapter } = app.get(HttpAdapterHost);
app.useGlobalFilters(new HttpExceptionFilter(httpAdapter));
The NestJS Jest tests are failing with the error below.
[Nest] 13076 - 09/03/2022, 2:16:50 PM ERROR [ExceptionsHandler] Invalid role(s): []
AccessControlError: Invalid role(s): []
at new AccessControlError (/Users/shubhamjain/SJ/Projects/amplication/packages/amplication-data-service-generator/generated/server/node_modules/accesscontrol/lib/core/AccessControlError.js:24:28)
at Object.getFlatRoles (/Users/shubhamjain/SJ/Projects/amplication/packages/amplication-data-service-generator/generated/server/node_modules/accesscontrol/lib/utils.js:550:19)
at Object.getUnionAttrsOfRoles (/Users/shubhamjain/SJ/Projects/amplication/packages/amplication-data-service-generator/generated/server/node_modules/accesscontrol/lib/utils.js:739:27)
at new Permission (/Users/shubhamjain/SJ/Projects/amplication/packages/amplication-data-service-generator/generated/server/node_modules/accesscontrol/lib/core/Permission.js:51:43)
at RolesBuilder.Object.<anonymous>.AccessControl.permission (/Users/shubhamjain/SJ/Projects/amplication/packages/amplication-data-service-generator/generated/server/node_modules/accesscontrol/lib/AccessControl.js:481:16)
at AclFilterResponseInterceptor.intercept (/Users/shubhamjain/SJ/Projects/amplication/packages/amplication-data-service-generator/generated/server/src/interceptors/aclFilterResponse.interceptor.ts:25:42)
at /Users/shubhamjain/SJ/Projects/amplication/packages/amplication-data-service-generator/generated/server/node_modules/@nestjs/core/interceptors/interceptors-consumer.js:23:36
at InterceptorsConsumer.intercept (/Users/shubhamjain/SJ/Projects/amplication/packages/amplication-data-service-generator/generated/server/node_modules/@nestjs/core/interceptors/interceptors-consumer.js:25:24)
at /Users/shubhamjain/SJ/Projects/amplication/packages/amplication-data-service-generator/generated/server/node_modules/@nestjs/core/router/router-execution-context.js:46:60
at processTicksAndRejections (node:internal/process/task_queues:96:5)
Edit 1:
Every controller's endpoint has something related to using roles:
@common.UseInterceptors(AclValidateRequestInterceptor)
@nestAccessControl.UseRoles({
  resource: "Customer",
  action: "create",
  possession: "any",
})
@common.Post()
@swagger.ApiCreatedResponse({ type: Customer })
@swagger.ApiForbiddenResponse({ type: errors.ForbiddenException })
async create(@common.Body() data: CustomerCreateInput): Promise<Customer> {
Any pointers to debug this problem would be appreciated.
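One pointer: the stack trace shows accesscontrol's `getFlatRoles` rejecting an empty roles list, which usually means the test request reaches `AclFilterResponseInterceptor` with a user whose roles array is empty or missing (for example, a Jest fixture without roles). A hedged way to confirm is to guard and log the roles before they reach the permission check; `userRoles` below is my stand-in for whatever your interceptor reads off the request:

```javascript
// Minimal reproduction of the failure mode: accesscontrol rejects an
// empty roles list. Guarding before the permission call surfaces WHICH
// request arrived without roles.
function assertRoles(userRoles) {
  if (!Array.isArray(userRoles) || userRoles.length === 0) {
    // Mirrors the library's message: AccessControlError: Invalid role(s): []
    throw new Error(`Invalid role(s): ${JSON.stringify(userRoles)}`);
  }
  return userRoles;
}
```

In the failing tests, check that the mocked request (or the test user fixture) actually carries roles before the interceptor runs.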
I'm getting two errors when I try adding new data to an array with mongoose. Here is my code:
return await db.fileMeta.findOneAndUpdate({
  username: username,
  'files.fileUID': {
    $ne: data.fileUID
  }
}, {
  $addToSet: {
    files: data
  }
}).exec();
data is a JavaScript object with 25 items amounting to 789 bytes, nowhere near 16 MB. My code worked fine until recently (the last few days); then I started getting this error:
MongoError: BSONObj size: 16829075 (0x100CA93) is invalid. Size must be between 0 and 16793600(16MB) First element: $v: 1
at MessageStream.messageHandler (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/connection.js:268:20)
at MessageStream.emit (events.js:314:20)
at processIncomingData (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/message_stream.js:144:12)
at MessageStream._write (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/message_stream.js:42:5)
at doWrite (_stream_writable.js:403:12)
at writeOrBuffer (_stream_writable.js:387:5)
at MessageStream.Writable.write (_stream_writable.js:318:11)
at TLSSocket.ondata (_stream_readable.js:719:22)
at TLSSocket.emit (events.js:314:20)
at addChunk (_stream_readable.js:298:12)
at readableAddChunk (_stream_readable.js:273:9)
at TLSSocket.Readable.push (_stream_readable.js:214:10)
at TLSWrap.onStreamRead (internal/stream_base_commons.js:188:23) {
ok: 0,
code: 10334,
codeName: 'BSONObjectTooLarge'
}
Then the above error stopped and I got another one instead:
MongoError: Resulting document after update is larger than 16777216
at MessageStream.messageHandler (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/connection.js:268:20)
at MessageStream.emit (events.js:314:20)
at processIncomingData (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/message_stream.js:144:12)
at MessageStream._write (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/message_stream.js:42:5)
at doWrite (_stream_writable.js:403:12)
at writeOrBuffer (_stream_writable.js:387:5)
at MessageStream.Writable.write (_stream_writable.js:318:11)
at TLSSocket.ondata (_stream_readable.js:719:22)
at TLSSocket.emit (events.js:314:20)
at addChunk (_stream_readable.js:298:12)
at readableAddChunk (_stream_readable.js:273:9)
at TLSSocket.Readable.push (_stream_readable.js:214:10)
at TLSWrap.onStreamRead (internal/stream_base_commons.js:188:23) {
ok: 0,
code: 17419,
codeName: 'Location17419'
}
Using MongoDB Compass I can see this:
Note the total size of 16.2 MB; this is the only thing I can think of that is even close to 16 MB.
I would understand if the error said my data object was too large, but since it is so small (789 bytes) I don't understand why I'm getting the error or how to fix it. If the error means the entire DB is larger than 16 MB, then something must be wrong, because databases obviously need to scale well beyond 16 MB.
How can I prevent this error?
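Some back-of-the-envelope arithmetic (my numbers, based on the 789-byte figure in the question) shows why the array design, not the single `data` object, hits the cap: every `$addToSet` grows one user's document, and the 16 MB limit applies to that whole document:

```javascript
// Each ~789-byte entry is appended to ONE document's `files` array, so
// the 16 MB BSON document limit is reached after roughly 21k entries,
// consistent with a document Compass shows at 16.2 MB.
const entryBytes = 789;          // size of one metadata entry (from the question)
const mongoDocLimit = 16777216;  // 16 MB BSON document limit (from the error text)
const maxEntries = Math.floor(mongoDocLimit / entryBytes);
console.log(maxEntries); // 21263
```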
I took @Evert's advice and flattened my DB so that each file's metadata is its own document instead of an array within a document. Now it works fine.
const fileMetadataDocument = new db.FileMetadata(data);
const saved = await fileMetadataDocument.save();
// Works
Error Type:
error: A hook (`orm`) failed to load!
error: Could not tear down the ORM hook. Error details: Error: Consistency violation: Attempting to tear down a datastore (`default`) which is not currently registered with this adapter. This is usually due to a race condition in userland code (e.g. attempting to tear down the same ORM instance more than once), or it could be due to a bug in this adapter. (If you get stumped, reach out at http://sailsjs.com/support.)
at Object.teardown (D:\HTML\Sails\test\node_modules\sails-mongo\lib\index.js:390:19)
at D:\HTML\Sails\test\node_modules\waterline\lib\waterline.js:758:27
at D:\HTML\Sails\test\node_modules\waterline\node_modules\async\dist\async.js:3047:20
at eachOfArrayLike (D:\HTML\Sails\test\node_modules\waterline\node_modules\async\dist\async.js:1002:13)
at eachOf (D:\HTML\Sails\test\node_modules\waterline\node_modules\async\dist\async.js:1052:9)
at Object.eachLimit (D:\HTML\Sails\test\node_modules\waterline\node_modules\async\dist\async.js:3111:7)
at Object.teardown (D:\HTML\Sails\test\node_modules\waterline\lib\waterline.js:742:11)
at Hook.teardown (D:\HTML\Sails\test\node_modules\sails-hook-orm\index.js:246:30)
at Sails.wrapper (D:\HTML\Sails\test\node_modules\@sailshq\lodash\lib\index.js:3282:19)
at Object.onceWrapper (events.js:421:28)
at Sails.emit (events.js:315:20)
at Sails.EventEmitter.emit (domain.js:482:12)
at Sails.emitter.emit (D:\HTML\Sails\test\node_modules\sails\lib\app\private\after.js:56:26)
at D:\HTML\Sails\test\node_modules\sails\lib\app\lower.js:67:11
at beforeShutdown (D:\HTML\Sails\test\node_modules\sails\lib\app\lower.js:45:12)
at Sails.lower (D:\HTML\Sails\test\node_modules\sails\lib\app\lower.js:49:3)
at Sails.wrapper [as lower] (D:\HTML\Sails\test\node_modules\@sailshq\lodash\lib\index.js:3282:19)
at whenSailsIsReady (D:\HTML\Sails\test\node_modules\sails\lib\app\lift.js:68:13)
at D:\HTML\Sails\test\node_modules\async\dist\async.js:3861:9
at D:\HTML\Sails\test\node_modules\async\dist\async.js:421:16
at iterateeCallback (D:\HTML\Sails\test\node_modules\async\dist\async.js:924:17)
at D:\HTML\Sails\test\node_modules\async\dist\async.js:906:16
error:
error: Error: Consistency violation: Unexpected error creating db connection manager:
```
MongoError: failed to connect to server [localhost:27017] on first connect [Error: connect ECONNREFUSED 127.0.0.1:27017
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1141:16) {
name: 'MongoError'
}]
at flaverr (D:\HTML\Sails\test\node_modules\flaverr\index.js:94:15)
at Function.module.exports.parseError (D:\HTML\Sails\test\node_modules\flaverr\index.js:371:12)
at Function.handlerCbs.error (D:\HTML\Sails\test\node_modules\machine\lib\private\help-build-machine.js:665:56)
at connectCb (D:\HTML\Sails\test\node_modules\sails-mongo\lib\private\machines\create-manager.js:130:22)
at connectCallback (D:\HTML\Sails\test\node_modules\mongodb\lib\mongo_client.js:428:5)
at D:\HTML\Sails\test\node_modules\mongodb\lib\mongo_client.js:335:11
at processTicksAndRejections (internal/process/task_queues.js:79:11)
```
at Object.error (D:\HTML\Sails\test\node_modules\sails-mongo\lib\index.js:268:21)
at D:\HTML\Sails\test\node_modules\machine\lib\private\help-build-machine.js:1514:39
at proceedToFinalAfterExecLC (D:\HTML\Sails\test\node_modules\parley\lib\private\Deferred.js:1153:14)
at proceedToInterceptsAndChecks (D:\HTML\Sails\test\node_modules\parley\lib\private\Deferred.js:913:12)
at proceedToAfterExecSpinlocks (D:\HTML\Sails\test\node_modules\parley\lib\private\Deferred.js:845:10)
at D:\HTML\Sails\test\node_modules\parley\lib\private\Deferred.js:303:7
at D:\HTML\Sails\test\node_modules\machine\lib\private\help-build-machine.js:952:35
at Function.handlerCbs.error (D:\HTML\Sails\test\node_modules\machine\lib\private\help-build-machine.js:742:26)
at connectCb (D:\HTML\Sails\test\node_modules\sails-mongo\lib\private\machines\create-manager.js:130:22)
at connectCallback (D:\HTML\Sails\test\node_modules\mongodb\lib\mongo_client.js:428:5)
at D:\HTML\Sails\test\node_modules\mongodb\lib\mongo_client.js:335:11
at processTicksAndRejections (internal/process/task_queues.js:79:11)
error: Could not load Sails app.
error:
error: Tips:
error: • First, take a look at the error message above.
error: • Make sure you've installed dependencies with `npm install`.
error: • Check that this app was built for a compatible version of Sails.
error: • Have a question or need help? (http://sailsjs.com/support)
config/datastores.js
module.exports.datastores = {
  default: {
    adapter: "sails-mongo",
    url: "mongodb://root@localhost/datab",
  },
};
config/models.js
module.exports.models = {
  schema: true,
  migrate: "alter",
  attributes: {
    createdAt: { type: "number", autoCreatedAt: true },
    updatedAt: { type: "number", autoUpdatedAt: true },
    id: { type: "string", columnName: "_id" },
    deleted: { type: "boolean", defaultsTo: false },
  },
  dataEncryptionKeys: {
    default: "t9AkMiCRfZeODiZKQsgGif2zsE40wKJK6Uudr51L4hU=",
  },
  cascadeOnDestroy: true,
};
Could you try:
module.exports.datastores = {
  default: {
    adapter: require("sails-mongo"),
    url: "mongodb://root@localhost/datab",
  },
};
I've got something really weird happening in Lambda. I have a JS function that interprets a string input. The code is covered by unit tests confirming that the validation logic works as expected, but once it's deployed to Lambda I get different behavior. Is there a really noob error I'm overlooking, or is there some setting in Lambda that might explain why my code is being interpreted differently?
Here is my function:
public async processMessage(message: string): Promise<void> {
  logger.info(`Message: ${message}`);
  const params = JSON.parse(message);
  if ('fileId' in params && 'location' in params && 'fileType' in params) {
    return this.fileLoader.load(params.fileId, params.location, fromString(params.fileType));
  } else {
    throw new InvalidArgumentError('A required field on the message is missing.');
  }
}
The validation works as expected, the function is invoked correctly. Everything seems great!
Then I deploy it up to lambda and call it there. I get this error:
Cannot use 'in' operator to search for 'fileId' in {\"fileId\":\"1234\",\"fileType\": \"TEST_FILE\",\"location\": \"https://d4-file-proc-staging.s3.amazonaws.com/testFile.csv\"}
Here's the CloudWatch Logs:
Message: "{\"fileId\":\"1234\",\"fileType\": \"TEST_FILE\",\"location\": \"https://d4-file-proc-staging.s3.amazonaws.com/testFile.csv\"}"
ERROR Unhandled Promise Rejection
{
"errorType": "Runtime.UnhandledPromiseRejection",
"errorMessage": "TypeError: Cannot use 'in' operator to search for 'fileId' in {\"fileId\":\"1234\",\"fileType\": \"TEST_FILE\",\"location\": \"https://d4-file-proc-staging.s3.amazonaws.com/testFile.csv\"}",
"reason": {
"errorType": "TypeError",
"errorMessage": "Cannot use 'in' operator to search for 'fileId' in {\"fileId\":\"1234\",\"fileType\": \"TEST_FILE\",\"location\": \"https://d4-file-proc-staging.s3.amazonaws.com/testFile.csv\"}",
"stack": [
"TypeError: Cannot use 'in' operator to search for 'fileId' in {\"fileId\":\"1234\",\"fileType\": \"TEST_FILE\",\"location\": \"https://d4-file-proc-staging.s3.amazonaws.com/testFile.csv\"}",
" at FileLoaderMessageHandler.processMessage (/var/task/dist/infrastructure/fileLoaderMessageHandler.js:14:22)",
" at /var/task/dist/aws/fileLoader.js:23:57",
" at Array.forEach (<anonymous>)",
" at Runtime.exports.handle [as handler] (/var/task/dist/aws/fileLoader.js:23:18)",
" at Runtime.handleOnce (/var/runtime/Runtime.js:66:25)"
]
},
"promise": {},
"stack": [
"Runtime.UnhandledPromiseRejection: TypeError: Cannot use 'in' operator to search for 'fileId' in {\"fileId\":\"1234\",\"fileType\": \"TEST_FILE\",\"location\": \"https://d4-file-proc-staging.s3.amazonaws.com/testFile.csv\"}",
" at process.<anonymous> (/var/runtime/index.js:35:15)",
" at process.emit (events.js:210:5)",
" at process.EventEmitter.emit (domain.js:476:20)",
" at processPromiseRejections (internal/process/promises.js:201:33)",
" at processTicksAndRejections (internal/process/task_queues.js:94:32)"
]
}
I tried refactoring it to use Object.keys as well, but it's still behaving oddly.
function isFileLoaderParams(params: any): params is FileLoaderParams {
  logger.info(`INPUT: ${params}`);
  if (!params.fileId) {
    logger.warn('FileId not found');
    logger.info(`DoubleCheck: ${JSON.stringify(params)}`);
    logger.info(`Value Is: ${JSON.stringify(params.fileId)}`);
    logger.info(`ArrayAccess?: ${JSON.stringify(params['fileId'])}`);
    return false;
  }
  if (!params.location) {
    logger.warn('Location not found');
    return false;
  }
  if (!params.fileType) {
    logger.warn('FileType not found');
    return false;
  }
  return true;
}
Which generates this output:
INPUT: {"fileId":"1234","fileType": "TEST_FILE","location": "https://d4-file-proc-staging.s3.amazonaws.com/testFile.csv"}
FileId not found
DoubleCheck: "{\"fileId\":\"1234\",\"fileType\": \"TEST_FILE\",\"location\": \"https://d4-file-proc-staging.s3.amazonaws.com/testFile.csv\"}"
Value Is: undefined
ArrayAccess?: undefined
Configuration/environment details
Written in typescript
Compiled by tsc using TypeScript 8.3.0
Deployed using Serverless framework
Lambda Runtime: Node.js 12.x (Also ran it in Node 10, but upgraded to 12 trying to fix this)
That error means that params is a string, not an object. Perhaps your message is a double-encoded JSON string (a JSON string containing a JSON string). I'm not sure how it got there, but look for logic that is doing a JSON.stringify on something that's already JSON, or, as @Titus mentioned, do a double decode:
const params = JSON.parse(JSON.parse(message));
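A runnable illustration of this diagnosis (plain Node, no Lambda required): one parse of a double-encoded payload still yields a string, which is exactly what makes the `in` operator throw. The defensive loop at the end assumes every layer is valid JSON:

```javascript
const original = { fileId: '1234', fileType: 'TEST_FILE' };

// Somewhere upstream the pipeline stringifies twice:
const doubleEncoded = JSON.stringify(JSON.stringify(original));

const once = JSON.parse(doubleEncoded);
console.log(typeof once); // 'string' -- still not an object, so `'fileId' in once` throws

// Defensive fix: keep parsing until something other than a string comes out.
let params = doubleEncoded;
while (typeof params === 'string') {
  params = JSON.parse(params);
}
console.log(params.fileId); // '1234'
```

The cleaner fix is to find and remove the extra `JSON.stringify` upstream rather than to double-decode, but the loop makes the handler tolerant of either shape.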
Normally this type of error would not be a problem, but I simply cannot understand where this is happening.
Here is my setup:
router.route('/api/academyModule')
  .post(function (req, res) {
    req.body.academyModule.module_id = req.body.academyModule.module.id;
    req.body.academyModule.module_module_type_id = req.body.academyModule.module.module_type.id;

    var am = AcademyModule.build(req.body.academyModule);
    am.add(req.body.academyModule.requirements, req.body.academyModule, function (success) {
      res.json({ id: this[null] });
    }, function (err) {
      res.status(err).send(err);
    });

    if (req.body.teams != null) {
      req.body.teams.forEach(function (y) {
        var atm = academy_team_has_academy_module.build({
          academy_team_id: y.id,
          academy_id: y.academy_id,
          academy_module_module_id: req.body.academyModule.module_id
        });
        atm.add(function (success) {
        }, function (err) {
          res.status(err).send(err);
        });
      });
    }
  })
For this I have the following model:
academy_team_has_academy_module = sequelize.define('academy_team_has_academy_module', {
  academy_team_id: {
    type: DataTypes.INTEGER,
    primaryKey: true,
    autoIncrement: false
  },
  academy_id: DataTypes.INTEGER,
  academy_module_module_id: DataTypes.INTEGER
}, {
  freezeTableName: true,
  instanceMethods: {
    add: function (onSuccess, onError) {
      academy_team_has_academy_module.build(this.dataValues)
        .save().ok(onSuccess).error(onError);
    }
  }
});
I know for sure this happens in this model and not in AcademyModule, because when I remove this code it runs without any issues. In my console I get the following printout:
Executing (default): INSERT INTO `requirements` (`id`,`value`,`requirement_type_id`) VALUES (DEFAULT,5,'2');
Executing (default): INSERT INTO `academy_team_has_academy_module` (`academy_team_id`,`academy_id`,`academy_module_module_id`) VALUES (1,3,11);
Executing (default): INSERT INTO `academy_module` (`academy_id`,`module_id`,`module_module_type_id`,`sort_number`,`requirements_id`) VALUES ('3',11,4,4,40);
And right after, I get:
/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/Parser.js:82
throw err;
^
TypeError: Cannot read property '1' of null
at module.exports.Query.formatError (/var/www/learningbankapi/src/node_modules/sequelize/lib/dialects/mysql/query.js:155:23)
at Query.module.exports.Query.run [as _callback] (/var/www/learningbankapi/src/node_modules/sequelize/lib/dialects/mysql/query.js:38:23)
at Query.Sequence.end (/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/sequences/Sequence.js:96:24)
at Query.ErrorPacket (/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/sequences/Query.js:93:8)
at Protocol._parsePacket (/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/Protocol.js:271:23)
at Parser.write (/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/Parser.js:77:12)
at Protocol.write (/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/Protocol.js:39:16)
at Socket.Connection.connect (/var/www/learningbankapi/src/node_modules/mysql/lib/Connection.js:82:28)
at Socket.EventEmitter.emit (events.js:95:17)
at Socket.stream.pause.paused (_stream_readable.js:746:14)
at Socket.EventEmitter.emit (events.js:92:17)
at emitReadable_ (_stream_readable.js:408:10)
at emitReadable (_stream_readable.js:404:5)
at readableAddChunk (_stream_readable.js:165:9)
at Socket.Readable.push (_stream_readable.js:127:10)
at TCP.onread (net.js:526:21)
I've debugged the whole thing and I can't seem to find any undefined variables. What might have gone wrong here?
Your problem is a ForeignKeyConstraintError. Sequelize is trying to throw a new error but gets an unexpected error.message. If your configuration is fine, this may be a bug in Sequelize.
Maybe deactivating foreign key constraint checks (if possible) will solve your problem.
See: https://github.com/sequelize/sequelize/blob/master/lib/dialects/mysql/query.js#L153
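A rough, runnable illustration of that failure mode (not Sequelize's actual source): an error formatter matches the MySQL message against a pattern for a different error family and indexes the result without a null check, which is exactly the kind of code that produces "Cannot read property '1' of null":

```javascript
// The real error is a foreign-key violation, but the formatter only
// expects duplicate-entry messages, so the regex match comes back null.
const message = 'ER_NO_REFERENCED_ROW_2: a foreign key constraint fails';
const match = message.match(/Duplicate entry '(.*)' for key '(.*)'/);
console.log(match); // null -- the FK message doesn't fit the pattern

let thrown;
try {
  const value = match[1]; // this unchecked access is what blows up
} catch (err) {
  thrown = err;
}
console.log(thrown instanceof TypeError); // true
```

So the stack trace points at Sequelize's formatError, but the underlying cause is the failed INSERT (likely a foreign key that doesn't exist), which is worth checking first.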