How to send a bulk request? Elasticsearch - javascript

I have an object with data, and I want to send this data to an Elasticsearch container:
for (let key in params) {
    bulk.push(JSON.stringify({
        index: {
            _id: params[key]['id'],
            _type: 'id',
            _index: 'geo'
        }
    }));
    bulk.push(JSON.stringify(params[key]));
}
let bulks = bulk.join("\n") + "\n";
Then I make the request:
let cat = request({
    'method'  : 'PUT',
    'uri'     : 'http://dev4.int10h.net:40024/_bulk',
    'body'    : bulks,
    'json'    : true,
    'headers' : [
        'Content-Type: application/x-ndjson'
    ],
    'agent'   : false
});
but I get this error:
Unhandled rejection StatusCodeError: 400 - {"error":{"root_cause":[{"type":"illegal_argument_exception","reason":"The bulk request must be terminated by a newline [\n]"}],"type":"illegal_argument_exception","reason":"The bulk request must be terminated by a newline [\n]"},"status":400}
at new StatusCodeError (/usr/lib/node_modules/request-promise/node_modules/request-promise-core/lib/errors.js:32:15)
at Request.plumbing.callback (/usr/lib/node_modules/request-promise/node_modules/request-promise-core/lib/plumbing.js:104:33)
at Request.RP$callback [as _callback] (/usr/lib/node_modules/request-promise/node_modules/request-promise-core/lib/plumbing.js:46:31)
at Request.self.callback (/usr/lib/node_modules/request/request.js:185:22)
at Request.emit (events.js:182:13)
at Request. (/usr/lib/node_modules/request/request.js:1161:10)
at Request.emit (events.js:182:13)
at IncomingMessage. (/usr/lib/node_modules/request/request.js:1083:12)
at Object.onceWrapper (events.js:273:13)
at IncomingMessage.emit (events.js:187:15)
at endReadableNT (_stream_readable.js:1098:12)
at process.internalTickCallback (internal/process/next_tick.js:72:19)
How do I properly send the bulk request? (bulks is of type string.)

Not sure if this helps you, but I found a similar problem here with _bulk and JSON.stringify.
The answer is:
It looks like the meta characters in your payload do not get
translated into newlines. If you instead used the
elasticsearch.js client, it would handle this for you.
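Building on that answer, here is a minimal sketch of a hand-built NDJSON body plus corrected `request` options. This assumes the `request` library from the question; the key points are that the body must end with a newline, `headers` must be an object rather than an array, and `json` should stay `false` so the prebuilt NDJSON string is not re-serialized (which would escape the newline separators).

```javascript
// Sketch (not the official elasticsearch.js client): build the NDJSON bulk
// body by hand. Every action line and source line is standalone JSON, and
// the whole body must end with a newline.
function buildBulkBody(params, indexName) {
  const lines = [];
  for (const key of Object.keys(params)) {
    // action line: tells Elasticsearch where the next source line goes
    lines.push(JSON.stringify({
      index: { _index: indexName, _type: 'id', _id: params[key].id }
    }));
    // source line: the document itself
    lines.push(JSON.stringify(params[key]));
  }
  return lines.join('\n') + '\n'; // the trailing newline is mandatory
}

// Options for the `request` library: `headers` is an object (not an array),
// and `json` stays false so the NDJSON string is sent exactly as built.
const options = {
  method: 'POST',
  uri: 'http://dev4.int10h.net:40024/_bulk',
  body: buildBulkBody({ a: { id: 1, name: 'somewhere' } }, 'geo'),
  headers: { 'Content-Type': 'application/x-ndjson' },
  json: false
};
```

The example document and key names here are made up for illustration; the point is the shape of the body and the options object.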

Related

Store log and send it to slack

I need to store errors in the logs table. If I understood correctly, I should write this logic in Handler.ts.
This is my Handler.ts:
export default class ExceptionHandler extends HttpExceptionHandler {
  constructor() {
    super(Logger)
  }

  public async handle(error: any, ctx: HttpContextContract) {
    console.log(error.messages)
    await Log.create({
      type: 'menus.store', // if we get an error in the menu store function
      message: error.messages,
    })
    return super.handle(error, ctx)
  }
}
I have two problems. First, how can I know in which method the error occurred, so that I can add it to type when creating the log entry?
The second question: I get the same error that I am getting in Insomnia when testing, for example:
{"errors":[{"rule":"number","field":"menuId","message":"number validation failed on menuId"},{"rule":"exists","field":"menuId","message":"exists validation failed on menuId"}]}
But I need the full error; I'm getting the full error in the Visual Studio terminal.
How can I add this error to the Log message?
FATAL [17-09-2022, 9:07:53 AM] (projectName/30718 on system): "exists" validation rule failed
err: {
"type": "DatabaseError",
"message": "select 1 from \"menus\" where \"id\" = $1 limit $2 - invalid input syntax for type integer: \"sd\"",
"stack":
error: select 1 from "menus" where "id" = $1 limit $2 - invalid input syntax for type integer: "sd"
at Parser.parseErrorMessage (/home/biping/Development/projectName/backend/node_modules/pg-protocol/src/parser.ts:369:69)
at Parser.handlePacket (/home/biping/Development/projectName/backend/node_modules/pg-protocol/src/parser.ts:188:21)
at Parser.parse (/home/biping/Development/projectName/backend/node_modules/pg-protocol/src/parser.ts:103:30)
at Socket.<anonymous> (/home/biping/Development/projectName/backend/node_modules/pg-protocol/src/index.ts:7:48)
at Socket.emit (node:events:513:28)
at addChunk (node:internal/streams/readable:315:12)
at readableAddChunk (node:internal/streams/readable:289:9)
at Socket.Readable.push (node:internal/streams/readable:228:10)
at TCP.onStreamRead (node:internal/stream_base_commons:190:23)
at TCP.callbackTrampoline (node:internal/async_hooks:130:17)
"length": 103,
"name": "error",
"severity": "ERROR",
"code": "22P02",
"file": "numutils.c",
"line": "256",
"routine": "pg_strtoint32"
}
How can I log this error?
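One likely reason the stored message loses detail: `JSON.stringify` on an `Error` drops `message` and `stack`, because those properties are non-enumerable on Error instances. A minimal sketch of a serializer (the helper name is hypothetical, not part of Adonis):

```javascript
// Sketch: copy the non-enumerable Error fields into a plain object so the
// full detail (including the stack) survives JSON serialization.
function serializeError(error) {
  return {
    name: error.name,
    message: error.message,
    stack: error.stack,
    // keep any extra enumerable fields, e.g. `messages` from the validator
    ...error,
  };
}

// In the handler, one could then store the whole thing:
//   message: JSON.stringify(serializeError(error))
```

For the first question, the `ctx` argument carries the request, so something like `ctx.request.url()` (in AdonisJS v5) could stand in for the hand-written type string, rather than hard-coding 'menus.store'.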

Send WebSocket data to a specific client

I made a WebSocket server using Node.js, and my problem is as follows: two clients connect to the server. When client 1 sends data to the server, I want to forward it to client 2.
To do this, I took inspiration from this kind of solution: https://github.com/websockets/ws/issues/367
It looks great, but when I apply it, it gives me the following error:
TypeError [ERR_INVALID_ARG_TYPE]: The first argument must be of type string or an instance of Buffer, ArrayBuffer, or Array or an Array-like Object. Received an instance of Object
at new NodeError (node:internal/errors:371:5)
at Function.from (node:buffer:323:9)
at toBuffer (/home/ttt/node_modules/ws/lib/buffer-util.js:95:18)
at Sender.send (/home/ttt/node_modules/ws/lib/sender.js:315:14)
at WebSocket.send (/home/ttt/node_modules/ws/lib/websocket.js:467:18)
at WebSocket.<anonymous> (/home/ttt/STT_project/sttproject/IHM_server/websocketServer.js:43:15)
at WebSocket.emit (node:events:526:28)
at Receiver.receiverOnMessage (/home/ttt/node_modules/ws/lib/websocket.js:1137:20)
at Receiver.emit (node:events:526:28)
at Receiver.dataMessage (/home/ttt/node_modules/ws/lib/receiver.js:528:14)
code: 'ERR_INVALID_ARG_TYPE'
I tried different methods, but the result remains the same. Can anyone help me with this, or propose a better solution to my problem?
Here is a sample of my code:
wss.on("connection", ws => {
    ws.id = id++;
    lookup[ws.id] = ws;
    ws.on("message", data => {
        if (data.toString() != "\n") {
            message = {
                "message": {
                    "text": data.toString(),
                    "end": false
                }
            }
        } else {
            message = {
                "message": {
                    "text": "",
                    "end": true
                }
            }
        }
        lookup[0].send('message');
    })
...
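The stack trace points at `ws.send` receiving a plain object: the `ws` library sends strings and Buffers, not object literals. Note also that `lookup[0].send('message')` sends the literal string 'message', not the variable. A sketch of a fix, mirroring the question's message shape (the helper name is made up):

```javascript
// Sketch: build the message object and serialize it before sending, since
// ws's send() accepts strings/Buffers but not plain objects.
function encodeMessage(text) {
  const message = text !== '\n'
    ? { message: { text: text, end: false } }
    : { message: { text: '', end: true } };
  return JSON.stringify(message);
}

// Inside the "message" handler the send would then become something like:
//   lookup[targetId].send(encodeMessage(data.toString()));
```

The receiving client would `JSON.parse` the string on its side to get the object back.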

Resulting document after update is larger than 16777216

I'm getting two errors when I try to add new data to an array with Mongoose. Here is my code:
return await db.fileMeta.findOneAndUpdate({
    username: username,
    'files.fileUID': {
        $ne: data.fileUID
    }
}, {
    $addToSet: {
        files: data
    }
}).exec();
data is a JavaScript object with 25 items amounting to 789 bytes... nothing near 16MB. My code worked fine until recently (the last few days); then I started getting this error:
MongoError: BSONObj size: 16829075 (0x100CA93) is invalid. Size must be between 0 and 16793600(16MB) First element: $v: 1
at MessageStream.messageHandler (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/connection.js:268:20)
at MessageStream.emit (events.js:314:20)
at processIncomingData (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/message_stream.js:144:12)
at MessageStream._write (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/message_stream.js:42:5)
at doWrite (_stream_writable.js:403:12)
at writeOrBuffer (_stream_writable.js:387:5)
at MessageStream.Writable.write (_stream_writable.js:318:11)
at TLSSocket.ondata (_stream_readable.js:719:22)
at TLSSocket.emit (events.js:314:20)
at addChunk (_stream_readable.js:298:12)
at readableAddChunk (_stream_readable.js:273:9)
at TLSSocket.Readable.push (_stream_readable.js:214:10)
at TLSWrap.onStreamRead (internal/stream_base_commons.js:188:23) {
ok: 0,
code: 10334,
codeName: 'BSONObjectTooLarge'
}
Then the above error stopped and I got another one instead:
MongoError: Resulting document after update is larger than 16777216
at MessageStream.messageHandler (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/connection.js:268:20)
at MessageStream.emit (events.js:314:20)
at processIncomingData (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/message_stream.js:144:12)
at MessageStream._write (/home/user/projects/web-app/node_modules/mongoose/node_modules/mongodb/lib/cmap/message_stream.js:42:5)
at doWrite (_stream_writable.js:403:12)
at writeOrBuffer (_stream_writable.js:387:5)
at MessageStream.Writable.write (_stream_writable.js:318:11)
at TLSSocket.ondata (_stream_readable.js:719:22)
at TLSSocket.emit (events.js:314:20)
at addChunk (_stream_readable.js:298:12)
at readableAddChunk (_stream_readable.js:273:9)
at TLSSocket.Readable.push (_stream_readable.js:214:10)
at TLSWrap.onStreamRead (internal/stream_base_commons.js:188:23) {
ok: 0,
code: 17419,
codeName: 'Location17419'
}
Using MongoDB Compass I can see a total size of 16.2MB; this is the only thing I can think of that is even close to 16MB.
I would understand if the error said that my data object was too large, but since it is so small (789 bytes) I don't understand why I'm getting the error or how to fix it. If the error means the entire DB is larger than 16MB, then something must be wrong, because obviously databases should scale beyond 16MB.
How can I prevent this error?
I took @Evert's advice and flattened my DB so that each file's metadata is its own document instead of an element of an array within one document. Now it works fine.
const fileMetadataDocument = new db.FileMetadata(data);
const saved = await fileMetadataDocument.save();
//Works
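For anyone hitting the same wall: the 16MB limit is per document, so an embedded files array that keeps growing will eventually exceed it even though each individual update is tiny. A sketch of the flattening idea (field names are hypothetical, mirroring the question):

```javascript
// Sketch: instead of $addToSet-ing every file into one ever-growing array
// inside a single document, give each file's metadata its own document,
// tied to its owner by the username field.
function toFlatDocuments(username, files) {
  return files.map((file) => ({ username: username, ...file }));
}
```

Each element of the returned array would then be saved as its own document (as in the snippet above with new db.FileMetadata(data).save()), so no single document can grow toward the 16MB cap.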

Node Js and Ebay api

I am a novice at Node.js and am working with ebay-api.
I found this great example on GitHub.
One strange issue: when I run the JS file via CMD, it sometimes works, but sometimes it shows an error; after I clear the cache it works, and sometimes even after clearing the cache it shows an error. But the code is exactly the same as the one that gave correct output. Did anyone face the same issue, or have any idea where the problem might be?
var ebay = require('../index.js');

var params = {
    keywords: ["Canon", "Powershot"],
    // add additional fields
    outputSelector: ['AspectHistogram'],
    paginationInput: {
        entriesPerPage: 10
    },
    itemFilter: [
        {name: 'FreeShippingOnly', value: true},
        {name: 'MaxPrice', value: '150'}
    ],
    domainFilter: [
        {name: 'domainName', value: 'Digital_Cameras'}
    ]
};

ebay.xmlRequest({
        serviceName: 'Finding',
        opType: 'findItemsByKeywords',
        appId: '<your app id>', // FILL IN YOUR OWN APP KEY
        params: params,
        parser: ebay.parseResponseJson // (default)
    },
    // gets all the items together in a merged array
    function itemsCallback(error, itemsResponse) {
        if (error) throw error;
        var items = itemsResponse.searchResult.item;
        console.log('Found', items.length, 'items');
        for (var i = 0; i < items.length; i++) {
            console.log('- ' + items[i].title);
            console.log('- ' + items[i].galleryURL);
            console.log('- ' + items[i].viewItemURL);
        }
    }
);
I'm getting the following errors:
C:\node_modules\ebay-api\examples> node H:\NodeJs\app.js //Run via NodeJS CMD
H:\NodeJs\app.js:36
if (error) throw error;
^
Error
at Request._callback (C:\Users\shiva raju\node_modules\ebay-api\lib\xml-request.js:151:23)
at Request.self.callback (C:\Users\shiva raju\node_modules\ebay-api\node_modules\request\request.js:200:22)
at emitTwo (events.js:106:13)
at Request.emit (events.js:194:7)
at Request. (C:\Users\shiva raju\node_modules\ebay-api\node_modules\request\request.js:1067:10)
at emitOne (events.js:101:20)
at Request.emit (events.js:191:7)
at IncomingMessage. (C:\Users\shiva raju\node_modules\ebay-api\node_modules\request\request.js:988:12)
at emitNone (events.js:91:20)
at IncomingMessage.emit (events.js:188:7)
Your suggestions would be appreciated. Thanks
You can use the node module ebay-node-api, which gives you the response data as JSON.
You can check this example to see how to consume ebay-node-api:
https://github.com/pajaydev/ebay-node-api/
You are throwing an error object in the callback but you are not catching it anywhere in the code. Please handle the error you are throwing.
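Following on from that last answer, a sketch of what handling the error inside the callback could look like, instead of `throw error` (which crashes the process on any transient API failure):

```javascript
// Sketch: log and return instead of throwing inside the async callback.
// The itemsResponse shape follows the question's example output.
function itemsCallback(error, itemsResponse) {
  if (error) {
    console.error('eBay request failed:', error.message || error);
    return; // bail out gracefully instead of `throw error`
  }
  var items = itemsResponse.searchResult.item;
  console.log('Found', items.length, 'items');
  for (var i = 0; i < items.length; i++) {
    console.log('- ' + items[i].title);
  }
}
```

With this shape, an intermittent API failure (which would explain the "sometimes it works" behavior) is reported rather than crashing Node with an unhandled throw.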

Sequelize TypeError: Cannot read property '1' of null

Normally this type of error would not be a problem, but I simply cannot understand where it is happening.
Here is my setup:
router.route('/api/academyModule')
    .post(function (req, res) {
        req.body.academyModule.module_id = req.body.academyModule.module.id;
        req.body.academyModule.module_module_type_id = req.body.academyModule.module.module_type.id;
        var am = AcademyModule.build(req.body.academyModule);
        am.add(req.body.academyModule.requirements, req.body.academyModule, function (success) {
            res.json({id: this[null]});
        },
        function (err) {
            res.status(err).send(err);
        });
        if (req.body.teams != null) {
            req.body.teams.forEach(function (y) {
                var atm = academy_team_has_academy_module.build({
                    academy_team_id: y.id,
                    academy_id: y.academy_id,
                    academy_module_module_id: req.body.academyModule.module_id
                });
                atm.add(function (success) {
                }, function (err) {
                    res.status(err).send(err);
                });
            });
        }
    })
For this i have the following model:
academy_team_has_academy_module = sequelize.define('academy_team_has_academy_module', {
academy_team_id: {
type: DataTypes.INTEGER,
primaryKey: true,
autoIncrement: false
},
academy_id: DataTypes.INTEGER,
academy_module_module_id: DataTypes.INTEGER
}, {
freezeTableName: true,
instanceMethods: {
add: function (onSuccess, onError) {
academy_team_has_academy_module.build(this.dataValues)
.save().ok(onSuccess).error(onError);
}
}
});
I know for sure this happens in this model and not AcademyModule, because when I remove this code it runs without any issues. In my console I get the following printout:
Executing (default): INSERT INTO `requirements` (`id`,`value`,`requirement_type_id`) VALUES (DEFAULT,5,'2');
Executing (default): INSERT INTO `academy_team_has_academy_module` (`academy_team_id`,`academy_id`,`academy_module_module_id`) VALUES (1,3,11);
Executing (default): INSERT INTO `academy_module` (`academy_id`,`module_id`,`module_module_type_id`,`sort_number`,`requirements_id`) VALUES ('3',11,4,4,40);
And right after, I get:
/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/Parser.js:82
throw err;
^
TypeError: Cannot read property '1' of null
at module.exports.Query.formatError (/var/www/learningbankapi/src/node_modules/sequelize/lib/dialects/mysql/query.js:155:23)
at Query.module.exports.Query.run [as _callback] (/var/www/learningbankapi/src/node_modules/sequelize/lib/dialects/mysql/query.js:38:23)
at Query.Sequence.end (/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/sequences/Sequence.js:96:24)
at Query.ErrorPacket (/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/sequences/Query.js:93:8)
at Protocol._parsePacket (/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/Protocol.js:271:23)
at Parser.write (/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/Parser.js:77:12)
at Protocol.write (/var/www/learningbankapi/src/node_modules/mysql/lib/protocol/Protocol.js:39:16)
at Socket.Connection.connect (/var/www/learningbankapi/src/node_modules/mysql/lib/Connection.js:82:28)
at Socket.EventEmitter.emit (events.js:95:17)
at Socket.stream.pause.paused (_stream_readable.js:746:14)
at Socket.EventEmitter.emit (events.js:92:17)
at emitReadable_ (_stream_readable.js:408:10)
at emitReadable (_stream_readable.js:404:5)
at readableAddChunk (_stream_readable.js:165:9)
at Socket.Readable.push (_stream_readable.js:127:10)
at TCP.onread (net.js:526:21)
I've debugged the whole thing and I can't seem to find any undefined variables. What might have gone wrong here?
Your problem is a ForeignKeyConstraintError. Sequelize is trying to throw a new error but gets an unexpected error.message. If your configuration is fine, this may be a bug in Sequelize.
Maybe deactivating foreign key constraint checks (if possible) will solve your problem.
See: https://github.com/sequelize/sequelize/blob/master/lib/dialects/mysql/query.js#L153
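To illustrate what the linked line is doing: older versions of the Sequelize MySQL dialect parsed the MySQL error message with a regex and indexed the match without a null check, so any unexpected message text turned into the misleading TypeError. A simplified, hypothetical reproduction (not Sequelize's actual code, and the regex pattern is invented):

```javascript
// Sketch of the failure mode: String.prototype.match() returns null when
// the error message doesn't fit the expected pattern, and indexing null
// throws "Cannot read property '1' of null".
function extractConstraintName(message) {
  const match = message.match(/CONSTRAINT `(.*)` FOREIGN KEY/); // hypothetical pattern
  return match[1]; // TypeError here when match is null
}
```

This is why the answer says the root cause is really a foreign key constraint failure: the TypeError is just Sequelize's error formatter choking while trying to report it.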