I'm getting two errors when I try to add new data to an array with Mongoose. Here is my code:
// Add `data` to the user's `files` array only if no element already has this fileUID
return await db.fileMeta.findOneAndUpdate({
  username: username,
  'files.fileUID': {
    $ne: data.fileUID
  }
}, {
  $addToSet: {
    files: data
  }
}).exec();
data is a JavaScript object with 25 items that amounts to 789 bytes... nothing near 16 MB. My code worked fine until recently (the last few days), then I started getting this error:
MongoError: BSONObj size: 16829075 (0x100CA93) is invalid. Size must be between 0 and 16793600(16MB) First element: $v: 1
at MessageStream.messageHandler (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/connection.js:268:20)
at MessageStream.emit (events.js:314:20)
at processIncomingData (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/message_stream.js:144:12)
at MessageStream._write (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/message_stream.js:42:5)
at doWrite (_stream_writable.js:403:12)
at writeOrBuffer (_stream_writable.js:387:5)
at MessageStream.Writable.write (_stream_writable.js:318:11)
at TLSSocket.ondata (_stream_readable.js:719:22)
at TLSSocket.emit (events.js:314:20)
at addChunk (_stream_readable.js:298:12)
at readableAddChunk (_stream_readable.js:273:9)
at TLSSocket.Readable.push (_stream_readable.js:214:10)
at TLSWrap.onStreamRead (internal/stream_base_commons.js:188:23) {
ok: 0,
code: 10334,
codeName: 'BSONObjectTooLarge'
}
Then the above error stopped and I got another one instead:
MongoError: Resulting document after update is larger than 16777216
at MessageStream.messageHandler (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/connection.js:268:20)
at MessageStream.emit (events.js:314:20)
at processIncomingData (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/message_stream.js:144:12)
at MessageStream._write (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/message_stream.js:42:5)
at doWrite (_stream_writable.js:403:12)
at writeOrBuffer (_stream_writable.js:387:5)
at MessageStream.Writable.write (_stream_writable.js:318:11)
at TLSSocket.ondata (_stream_readable.js:719:22)
at TLSSocket.emit (events.js:314:20)
at addChunk (_stream_readable.js:298:12)
at readableAddChunk (_stream_readable.js:273:9)
at TLSSocket.Readable.push (_stream_readable.js:214:10)
at TLSWrap.onStreamRead (internal/stream_base_commons.js:188:23) {
ok: 0,
code: 17419,
codeName: 'Location17419'
}
Using MongoDB Compass I can see a total size of 16.2 MB; this is the only thing I can think of that is even close to 16 MB.
I would understand the error if it were saying that my data object was too large, but since it is so small (789 bytes) I don't understand why I'm getting it or how to fix it. If the error is because the entire DB is larger than 16 MB, then something must be wrong, because databases obviously need to scale well beyond 16 MB.
How can I prevent this error?
I took @Evert's advice and flattened my DB so that each file's metadata is its own document instead of an element of an array inside one big document, and now it works fine.
const fileMetadataDocument = new db.FileMetadata(data);
const saved = await fileMetadataDocument.save();
//Works
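For reference, a minimal sketch of what the flattened schema could look like; everything beyond username and fileUID is a placeholder, not my real schema:
// Sketch only: one document per file, so no single document grows without bound.
const mongoose = require('mongoose');

const fileMetadataSchema = new mongoose.Schema({
  username: { type: String, index: true }, // look up a user's files with a plain find()
  fileUID: { type: String, unique: true }, // assuming fileUIDs are globally unique, the index replaces the old $ne guard
  // ...the remaining ~25 metadata fields for a single file
});

db.FileMetadata = mongoose.model('FileMetadata', fileMetadataSchema);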
I need to store errors in the logs table. If I understood correctly, I should handle reporting this in Handler.ts.
This is my Handler.ts:
// Import paths below assume a standard AdonisJS 5 app; Log is this app's own Lucid model.
import Logger from '@ioc:Adonis/Core/Logger'
import HttpExceptionHandler from '@ioc:Adonis/Core/HttpExceptionHandler'
import { HttpContextContract } from '@ioc:Adonis/Core/HttpContext'
import Log from 'App/Models/Log'

export default class ExceptionHandler extends HttpExceptionHandler {
  constructor() {
    super(Logger)
  }

  public async handle(error: any, ctx: HttpContextContract) {
    console.log(error.messages)
    await Log.create({
      type: 'menus.store', // if we get an error in the menu store function
      message: error.messages,
    })
    return super.handle(error, ctx)
  }
}
I have two problems. First, how can I know in which method the error occurred, so that I can put it in type when creating the log entry?
Second, error.messages only contains the same error that I get in Insomnia when testing, for example:
{"errors":[{"rule":"number","field":"menuId","message":"number validation failed on menuId"},{"rule":"exists","field":"menuId","message":"exists validation failed on menuId"}]}
But I need the full error, which I do see in the Visual Studio terminal.
How can I add this error to the Log message?
FATAL [17-09-2022, 9:07:53 AM] (projectName/30718 on system): "exists" validation rule failed
err: {
"type": "DatabaseError",
"message": "select 1 from \"menus\" where \"id\" = $1 limit $2 - invalid input syntax for type integer: \"sd\"",
"stack":
error: select 1 from "menus" where "id" = $1 limit $2 - invalid input syntax for type integer: "sd"
at Parser.parseErrorMessage (/home/biping/Development/projectName/backend/node_modules/pg-protocol/src/parser.ts:369:69)
at Parser.handlePacket (/home/biping/Development/projectName/backend/node_modules/pg-protocol/src/parser.ts:188:21)
at Parser.parse (/home/biping/Development/projectName/backend/node_modules/pg-protocol/src/parser.ts:103:30)
at Socket.<anonymous> (/home/biping/Development/projectName/backend/node_modules/pg-protocol/src/index.ts:7:48)
at Socket.emit (node:events:513:28)
at addChunk (node:internal/streams/readable:315:12)
at readableAddChunk (node:internal/streams/readable:289:9)
at Socket.Readable.push (node:internal/streams/readable:228:10)
at TCP.onStreamRead (node:internal/stream_base_commons:190:23)
at TCP.callbackTrampoline (node:internal/async_hooks:130:17)
"length": 103,
"name": "error",
"severity": "ERROR",
"code": "22P02",
"file": "numutils.c",
"line": "256",
"routine": "pg_strtoint32"
}
How can I log this error?
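A rough sketch of one way to cover both (assuming ctx.request.method() and ctx.route are available on the AdonisJS 5 context, and keeping the Log fields from the Handler.ts above); serializing with Object.getOwnPropertyNames keeps non-enumerable fields such as message and stack that JSON.stringify would otherwise drop:
// Sketch only: drop-in replacement for the handle method in the Handler.ts above.
public async handle(error: any, ctx: HttpContextContract) {
  await Log.create({
    // e.g. "POST /api/menus" instead of a hard-coded 'menus.store'
    type: `${ctx.request.method()} ${ctx.route?.pattern ?? ctx.request.url()}`,
    // keeps message, stack and driver fields that would otherwise be dropped
    message: JSON.stringify(error, Object.getOwnPropertyNames(error)),
  })
  return super.handle(error, ctx)
}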
I want to update an object key in MongoDB, but when I try to update that key I get this error:
MongoServerError: Nested BSON depth greater than 50 not allowed
at Connection.onMessage (C:\Users\Mahabub Saki\Desktop\my-projects\asgmt-11\server-side\node_modules\mongodb\lib\cmap\connection.js:203:30)
at MessageStream.<anonymous> (C:\Users\Mahabub Saki\Desktop\my-projects\asgmt-11\server-side\node_modules\mongodb\lib\cmap\connection.js:63:60)
at MessageStream.emit (node:events:390:28)
at processIncomingData (C:\Users\Mahabub Saki\Desktop\my-projects\asgmt-11\server-side\node_modules\mongodb\lib\cmap\message_stream.js:108:16)
at MessageStream._write (C:\Users\Mahabub Saki\Desktop\my-projects\asgmt-11\server-side\node_modules\mongodb\lib\cmap\message_stream.js:28:9)
at writeOrBuffer (node:internal/streams/writable:389:12)
at _write (node:internal/streams/writable:330:10)
at MessageStream.Writable.write (node:internal/streams/writable:334:10)
at TLSSocket.ondata (node:internal/streams/readable:754:22)
at TLSSocket.emit (node:events:390:28) {
ok: 0,
code: 8000,
codeName: 'AtlasError',
[Symbol(errorLabels)]: Set(0) {}
}
I am a novice to Node.js and am working with the ebay-api package.
I found this great example on GitHub:
One strange issue: when I run the JS file via CMD it sometimes works, but sometimes it shows an error; after clearing the cache it sometimes works and sometimes shows the error again, even though the code is exactly the same as when I got correct output. Did anyone face the same issue, or have any idea where the problem might be?
var ebay = require('../index.js');

var params = {
  keywords: ["Canon", "Powershot"],
  // add additional fields
  outputSelector: ['AspectHistogram'],
  paginationInput: {
    entriesPerPage: 10
  },
  itemFilter: [
    {name: 'FreeShippingOnly', value: true},
    {name: 'MaxPrice', value: '150'}
  ],
  domainFilter: [
    {name: 'domainName', value: 'Digital_Cameras'}
  ]
};

ebay.xmlRequest({
    serviceName: 'Finding',
    opType: 'findItemsByKeywords',
    appId: '<your app id>',           // FILL IN YOUR OWN APP KEY
    params: params,
    parser: ebay.parseResponseJson    // (default)
  },
  // gets all the items together in a merged array
  function itemsCallback(error, itemsResponse) {
    if (error) throw error;

    var items = itemsResponse.searchResult.item;
    console.log('Found', items.length, 'items');

    for (var i = 0; i < items.length; i++) {
      console.log('- ' + items[i].title);
      console.log('- ' + items[i].galleryURL);
      console.log('- ' + items[i].viewItemURL);
    }
  }
);
I'm getting the following errors:
C:\node_modules\ebay-api\examples> node H:\NodeJs\app.js //Run via NodeJS CMD
H:\NodeJs\app.js:36
if (error) throw error;
^
Error
at Request._callback (C:\Users\shiva raju\node_modules\ebay-api\lib\xml-request.js:151:23)
at Request.self.callback (C:\Users\shiva raju\node_modules\ebay-api\node_modules\request\request.js:200:22)
at emitTwo (events.js:106:13)
at Request.emit (events.js:194:7)
at Request.<anonymous> (C:\Users\shiva raju\node_modules\ebay-api\node_modules\request\request.js:1067:10)
at emitOne (events.js:101:20)
at Request.emit (events.js:191:7)
at IncomingMessage.<anonymous> (C:\Users\shiva raju\node_modules\ebay-api\node_modules\request\request.js:988:12)
at emitNone (events.js:91:20)
at IncomingMessage.emit (events.js:188:7)
Your suggestions would be appreciated. Thanks
You can use the node module ebay-node-api, which gives you the response data as JSON.
You can check this example to see how to consume ebay-node-api:
https://github.com/pajaydev/ebay-node-api/
You are throwing an error object in the callback but you are not catching it anywhere in the code. Please handle the error you are throwing.
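For instance, a minimal sketch of handling the failure instead of rethrowing it from the async callback (the console output above shows an Error with an empty message, so it is worth dumping the whole error object):
// Sketch only: report the failure instead of throwing from the callback.
function itemsCallback(error, itemsResponse) {
  if (error) {
    // the message can be empty, so fall back to printing the whole object
    console.error('eBay request failed:', error.message || error);
    return;
  }

  var items = itemsResponse.searchResult.item;
  console.log('Found', items.length, 'items');
}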
I have a local instance of MongoDB, started with --master, and I can see in the start log that a replica set oplog (file: /data/db/local.1) is created. It also appears that local.1 is updated every time I perform an insert().
I am trying to tail / stream the oplog with the following code...
MongoDB.connect('mongodb://localhost:27017/test', (e, db) => {
  if (e) { return console.log('Connect MongoDB Error', e); }
  console.log('Connected MongoDB');

  Server.DB = db;

  // one-off dump of whatever is currently in oplog.rs
  db.collection('oplog.rs').find({}).toArray((e, d) => {
    console.log('oplog.rs');
    console.log(e);
    console.log(d);
  });

  // tailable cursor, streamed
  var updates = db.collection('oplog.rs').find({}, {
    tailable: true,
    awaitData: true,
    noCursorTimeout: true,
    oplogReplay: true,
    numberOfRetries: Number.MAX_VALUE
  }).stream();

  updates.on('data', (data) => { console.log('data', data); });
  updates.on('error', (e) => { console.log('error', e); });
  updates.on('end', (end) => { console.log('end', end); });
});
The code logs the following on start:
oplog.rs
null
[]
However, there is no output from the stream. If I set numberOfRetries to a lower value, I get this error:
error { MongoError: No more documents in tailed cursor
at Function.MongoError.create (/Users/peter/node_modules/mongodb-core/lib/error.js:31:11)
at nextFunction (/Users/peter/node_modules/mongodb-core/lib/cursor.js:637:50)
at /Users/peter/node_modules/mongodb-core/lib/cursor.js:588:7
at queryCallback (/Users/peter/node_modules/mongodb-core/lib/cursor.js:211:18)
at Callbacks.emit (/Users/peter/node_modules/mongodb-core/lib/topologies/server.js:116:3)
at Connection.messageHandler (/Users/peter/node_modules/mongodb-core/lib/topologies/server.js:282:23)
at Socket.<anonymous> (/Users/peter/node_modules/mongodb-core/lib/connection/connection.js:273:22)
at emitOne (events.js:96:13)
at Socket.emit (events.js:189:7)
at readableAddChunk (_stream_readable.js:176:18)
name: 'MongoError',
message: 'No more documents in tailed cursor',
tailable: true,
awaitData: true }
end undefined
oplog.rs is used in replica sets. By starting mongod with --master you are using long-deprecated master-slave replication, which writes to the local.oplog.$main collection, so there is nothing in oplog.rs.
You need to start mongod with --replSet to get a replica set oplog. Read more about how to deploy and configure replica sets.
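As a rough sketch, assuming a single-node replica set named rs0 that was started with mongod --replSet rs0 and initialized once with rs.initiate() in the mongo shell, the tailing code stays much the same; the main point is that oplog.rs lives in the local database (the namespace filter below is just a hypothetical example):
// Sketch only: connect to the `local` database, where oplog.rs actually lives.
MongoDB.connect('mongodb://localhost:27017/local', (e, db) => {
  if (e) { return console.log('Connect MongoDB Error', e); }

  var updates = db.collection('oplog.rs').find(
    { ns: 'test.mycollection' },   // hypothetical "db.collection" to watch
    {
      tailable: true,
      awaitData: true,
      noCursorTimeout: true,
      numberOfRetries: Number.MAX_VALUE
    }
  ).stream();

  updates.on('data', (data) => { console.log('oplog entry', data); });
  updates.on('error', (e) => { console.log('error', e); });
});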
Normally this type of error would not be a problem, but I simply cannot understand where it is happening.
Here is my setup:
router.route('/api/academyModule')
  .post(function (req, res) {
    req.body.academyModule.module_id = req.body.academyModule.module.id;
    req.body.academyModule.module_module_type_id = req.body.academyModule.module.module_type.id;

    var am = AcademyModule.build(req.body.academyModule);

    am.add(req.body.academyModule.requirements, req.body.academyModule, function (success) {
      res.json({id: this[null]});
    },
    function (err) {
      res.status(err).send(err);
    });

    if (req.body.teams != null) {
      req.body.teams.forEach(function (y) {
        var atm = academy_team_has_academy_module.build({
          academy_team_id: y.id,
          academy_id: y.academy_id,
          academy_module_module_id: req.body.academyModule.module_id
        });
        atm.add(function (success) {
        }, function (err) {
          res.status(err).send(err);
        });
      });
    }
  })
For this I have the following model:
academy_team_has_academy_module = sequelize.define('academy_team_has_academy_module', {
  academy_team_id: {
    type: DataTypes.INTEGER,
    primaryKey: true,
    autoIncrement: false
  },
  academy_id: DataTypes.INTEGER,
  academy_module_module_id: DataTypes.INTEGER
}, {
  freezeTableName: true,
  instanceMethods: {
    add: function (onSuccess, onError) {
      academy_team_has_academy_module.build(this.dataValues)
        .save().ok(onSuccess).error(onError);
    }
  }
});
I know for sure this happens in this model and not in AcademyModule, because when I remove this code it runs without any issues. In my console I get the following printout:
Executing (default): INSERT INTO `requirements` (`id`,`value`,`requirement_type_id`) VALUES (DEFAULT,5,'2');
Executing (default): INSERT INTO `academy_team_has_academy_module` (`academy_team_id`,`academy_id`,`academy_module_module_id`) VALUES (1,3,11);
Executing (default): INSERT INTO `academy_module` (`academy_id`,`module_id`,`module_module_type_id`,`sort_number`,`requirements_id`) VALUES ('3',11,4,4,40);
And right after that I get:
/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/Parser.js:82
throw err;
^
TypeError: Cannot read property '1' of null
at module.exports.Query.formatError (/var/www/learningbankapi/src/node_modules/sequelize/lib/dialects/mysql/query.js:155:23)
at Query.module.exports.Query.run [as _callback] (/var/www/learningbankapi/src/node_modules/sequelize/lib/dialects/mysql/query.js:38:23)
at Query.Sequence.end (/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/sequences/Sequence.js:96:24)
at Query.ErrorPacket (/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/sequences/Query.js:93:8)
at Protocol._parsePacket (/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/Protocol.js:271:23)
at Parser.write (/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/Parser.js:77:12)
at Protocol.write (/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/Protocol.js:39:16)
at Socket.Connection.connect (/var/www/learningbankapi/src/node_modules/mysql/lib/Connection.js:82:28)
at Socket.EventEmitter.emit (events.js:95:17)
at Socket.stream.pause.paused (_stream_readable.js:746:14)
at Socket.EventEmitter.emit (events.js:92:17)
at emitReadable_ (_stream_readable.js:408:10)
at emitReadable (_stream_readable.js:404:5)
at readableAddChunk (_stream_readable.js:165:9)
at Socket.Readable.push (_stream_readable.js:127:10)
at TCP.onread (net.js:526:21)
I've debugged the whole thing and I can't seem to find any undefined variables. What might have gone wrong here?
Your problem is a ForeignKeyConstraintError. Sequelize is trying to throw a new error but gets an unexpected error.message. If your configuration is fine, this may be a bug in Sequelize.
Maybe deactivating foreign key constraint checks (if possible) will solve your problem.
See: https://github.com/sequelize/sequelize/blob/master/lib/dialects/mysql/query.js#L153
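For what it's worth, a minimal sketch of what temporarily disabling the checks could look like via a raw MySQL query, written in the promise style of newer Sequelize versions and reusing the values from the logged INSERT above; this only works around the underlying foreign key mismatch rather than fixing it:
// Sketch only: switch FK checks off, run the insert, then switch them back on.
sequelize.query('SET FOREIGN_KEY_CHECKS = 0')
  .then(function () {
    return academy_team_has_academy_module.create({
      academy_team_id: 1,
      academy_id: 3,
      academy_module_module_id: 11
    });
  })
  .then(function () {
    return sequelize.query('SET FOREIGN_KEY_CHECKS = 1');
  });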