Elasticsearch bulk set _id - javascript

When I'm adding docs to Elasticsearch with _id set I get:
Field [_id] is a metadata field and cannot be added inside a document. Use the index API request parameters.
I'm using client.bulk:
const body = dataset.flatMap(doc => [{ index: { _index: 'myindex' } }, doc])
const { body: bulkResponse } = await client.bulk({ refresh: true, body })
I don't see a place to put the _id in the parameters.
https://www.elastic.co/guide/en/elasticsearch/client/javascript-api/current/api-reference.html
Am I supposed to use a different method?
Thanks.

The _id needs to go inside the action (command) part of each bulk pair, and you also need to remove it from the source document itself:
// Pull _id out of each source document and pass it via the index action instead.
const body = dataset.flatMap(({ _id, ...doc }) => [{ index: { _index: 'myindex', _id } }, doc])
const { body: bulkResponse } = await client.bulk({ refresh: true, body })
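If you also want to verify that every document made it in, the bulk response flags per-item failures. A minimal sketch of the usual check (following the pattern in the Elasticsearch JS client docs; the logging is just illustrative):
if (bulkResponse.errors) {
  // The top-level flag only says that at least one action failed; inspect each item to find out which.
  bulkResponse.items.forEach((action, i) => {
    const operation = Object.keys(action)[0]
    if (action[operation].error) {
      // body[i * 2] is the action line, body[i * 2 + 1] is the document it belonged to.
      console.log(action[operation].status, action[operation].error, body[i * 2], body[i * 2 + 1])
    }
  })
}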


Apollo GraphQL updateQuery to typePolicy

I am beating my head against a wall. I have updated to Apollo 3 and cannot figure out how to migrate an updateQuery to a typePolicy. I am doing basic continuation-based pagination, and this is how I used to merge the results of fetchMore:
await fetchMore({
  query: MessagesByThreadIDQuery,
  variables: {
    threadId: threadId,
    limit: Configuration.MessagePageSize,
    continuation: token
  },
  updateQuery: (prev, curr) => {
    // Extract our updated message page.
    const last = prev.messagesByThreadId.messages ?? []
    const next = curr.fetchMoreResult?.messagesByThreadId.messages ?? []
    return {
      messagesByThreadId: {
        __typename: 'MessagesContinuation',
        messages: [...last, ...next],
        continuation: curr.fetchMoreResult?.messagesByThreadId.continuation
      }
    }
  }
})
I have made an attempt to write the merge typePolicy myself, but it just continually loads and throws errors about duplicate identifiers in the Apollo cache. Here is what my typePolicy looks like for my query.
typePolicies: {
  Query: {
    fields: {
      messagesByThreadId: {
        keyArgs: false,
        merge: (existing, incoming, args): IMessagesContinuation => {
          const typedExisting: IMessagesContinuation | undefined = existing
          const typedIncoming: IMessagesContinuation | undefined = incoming
          const existingMessages = (typedExisting?.messages ?? [])
          const incomingMessages = (typedIncoming?.messages ?? [])
          const result = existing ? {
            __typename: 'MessageContinuation',
            messages: [...existingMessages, ...incomingMessages],
            continuation: typedIncoming?.continuation
          } : incoming
          return result
        }
      }
    }
  }
}
So I was able to solve my use case, though it seems harder than it really needs to be. I essentially have to locate existing items that match the incoming ones and overwrite them, as well as add any new items that don't yet exist in the cache.
I also have to apply this logic only if a continuation token was provided; if it's null or undefined I should just use the incoming value, because that indicates we are doing an initial load.
My document is shaped like this:
{
"items": [{ id: string, ...others }],
"continuation": "some_token_value"
}
I created a generic type policy that I can use for all my documents that have a similar shape. It allows me to specify the name of the items property, what the key args are that I want to cache on, and the name of the graphql type.
export function ContinuationPolicy(keyArgs: Array<string>, itemPropertyKey: string, typeName: string) {
  return {
    keyArgs,
    merge(existing: any, incoming: any, args: any) {
      if (!!existing && !!args.args?.continuation) {
        const existingItems = (existing ? existing[itemPropertyKey] : [])
        const incomingItems = (incoming ? incoming[itemPropertyKey] : [])
        let items: Array<any> = [...existingItems]
        for (let i = 0; i < incomingItems.length; i++) {
          const current = incomingItems[i] as any
          const found = items.findIndex(m => m.__ref === current.__ref)
          if (found > -1) {
            // Overwrite the existing reference with the incoming one.
            items[found] = current
          } else {
            items = [...items, current]
          }
        }
        // This new data is a continuation of the last data.
        return {
          __typename: typeName,
          [itemPropertyKey]: items,
          continuation: incoming.continuation
        }
      } else {
        // When we have no existing data in the cache, we'll just use the incoming data.
        return incoming
      }
    }
  }
}
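For completeness, here is how a policy built this way could be plugged into the cache for the messages query above. This is only a sketch: the keyArgs value and the type name are assumptions based on the shapes shown earlier.
import { InMemoryCache } from '@apollo/client'

const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        // Cache per thread, merge the 'messages' array, and tag merged results as MessagesContinuation.
        messagesByThreadId: ContinuationPolicy(['threadId'], 'messages', 'MessagesContinuation')
      }
    }
  }
})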

mongoose check if id exists but that id is nested inside an array

When I fetch new alerts, I want to check whether the ID of the new alert was already recorded. The issue is that the ID is nested inside an array: there's the alertDetails array, which contains objects, and those objects have an _id field, which is what I want to check. I am not sure how to achieve that. I have the code below, but then I have to iterate over the result to check the exists value. I'm sure there must be a better way.
const mongoose = require('mongoose');
const { Schema } = mongoose;
const G2AlertsSchema = new Schema(
  {
    status: { type: String, required: true },
    openDate: { type: Date, required: true },
    alertType: { type: Array, required: true },
    severity: { type: Array, required: true },
    locationName: { type: Array, required: true },
    history: { type: Array, required: true },
    alertDetails: { type: Array, required: false },
    assignedTo: { type: Schema.Types.ObjectId, ref: 'user' },
  },
  {
    timestamps: true,
  },
);
const G2Alerts = mongoose.model('G2Alert', G2AlertsSchema);
module.exports = G2Alerts;
This is the code I found on MongoDB's website. I just want to see whether the ID exists. Basically, when I fetch the new alerts I get an array and I iterate over it; I want to check each item's ID against what's in the database. If it's there, skip it and go to the next one. If it's new, create a new alert and save it.
const exists = await G2Alerts.aggregate([
  {
    $project: {
      exists: {
        $in: ['5f0b4f508bda3805754ab343', '$alertDetails._id'],
      },
    },
  },
]);
EDIT: Another thing. I am getting an ESLint warning saying I should use array iteration instead of a for loop. The issue is, I need to use await when looking up the alert ID. If I use reduce or filter, I can't use await; if I make the reduce or filter callback async, it will return promises or just an empty array.
The code below works, based on the answer provided by Tom Slabbaert:
const newAlertsData = [];
for (let item of alertData.data.items) {
  const exists = await G2Alerts.find({ 'alertDetails._id': `${item._id}` });
  if (exists.length === 0) {
    newAlertsData.push(item);
  }
}
if (newAlertsData.length !== 0) {......
But this does not:
const filteredAlerts = alertData.data.items.reduce((filtered, item) => {
  const exists = await G2Alerts.find({ 'alertDetails._id': `${item._id}` });
  if (exists.length === 0) {
    filtered.push(item);
  }
  return filtered;
}, []);
You're not far off, here is an example using the correct syntax:
const exists = await G2Alerts.findOne({ 'alertDetails._id': '5f0b4f508bda3805754ab343' });
if (!exists) {
  ... do something
}
This can also be achieved using aggregate with a $match stage instead of a $project stage, or even better with countDocuments, which just returns the count instead of the entire document if you do not need it.
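A minimal sketch of the countDocuments variant (assuming the same G2Alerts model and the same sample id):
const count = await G2Alerts.countDocuments({ 'alertDetails._id': '5f0b4f508bda3805754ab343' });
if (count === 0) {
  // No alert references this id yet, so it is safe to create and save a new one.
}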
One more thing I'd like to add: make sure alertDetails._id is a string type, since you're using a string in your $in. Otherwise you'll need to cast to the ObjectId type in Mongoose, like so:
new mongoose.Types.ObjectId('5f0b4f508bda3805754ab343')
And for Mongo:
import {ObjectId} from "mongodb"
...
new ObjectId('5f0b4f508bda3805754ab343')
EDIT
Try something like this?
let ids = alertData.data.items.map(item => item._id.toString());
let existing = await G2Alerts.distinct('alertDetails._id', { 'alertDetails._id': { $in: ids } });
const filteredAlerts = alertData.data.items.reduce((filtered, item) => {
  if (!existing.includes(item._id.toString())) {
    return [item].concat(filtered)
  }
  return filtered;
}, []);
This way you only need to call the db once and not multiple times.
Final code based on the provided answer.
const ids = alertData.data.items.map(item => item._id);
const existing = await G2Alerts.find({ 'alertDetails._id': { $in: ids } }).distinct(
  'alertDetails._id',
  (err, alerts) => {
    if (err) {
      res.send(err);
    }
    return alerts;
  },
);
const filteredAlerts = alertData.data.items.reduce((filtered, item) => {
  if (!existing.includes(item._id.toString()) && item.openDate > dateLimit) {
    return [item].concat(filtered);
  }
  return filtered;
}, []);

MongoDB: Why is Array.find on comments (an array) property returning undefined?

Probably a silly issue, but why is the Array.find method not working as expected in this case? I'm trying to query a specific comment, which involves fetching from the DB the post document that has a comments property. It is from this comments array that I'd like to extract said comment object. For whatever reason, the code below doesn't work. Why?
Below are the code snippets
// Post document from which the comments are extracted
const post = await Post.findById(postId).populate({
path: "comments",
select: "addedBy id"
});
// Resulting post.comments array
[
{ "id": "5d9b137ff542a30f2c135556", "addedBy": "5b8528131719dc141cf95c99" },
{ "id": "5d9b0ba2f28afc5c3013d4df", "addedBy": "5b8528131719dc141cf95c99" },
{ "id": "5d9b0c26f28afc5c3013d4e0", "addedBy": "5b8528131719dc141cf95c99" }
];
// For instance if commentId is '5d9b137ff542a30f2c135556'
// the resulting comment object should be {"id":"5d9b137ff542a30f2c135556","addedBy":"5b8528131719dc141cf95c99"}
// However, commentToDelete is undefined
const commentId = "5d9b137ff542a30f2c135556";
const commentToDelete = comments.find(comment => comment["id"] === commentId);
Edit: Here's the full deleteComment controller code
async function deleteComment(req, res, userId, postId, commentId) {
  const post = await Post.findById(postId).populate({
    path: 'comments',
    select: 'addedBy id',
  });
  const commentToDelete = post.comments.find(
    comment => comment['id'] === commentId
  );
  if (commentToDelete.addedBy !== userId) {
    return res
      .status(403)
      .json({ message: 'You are not allowed to delete this comment' });
  }
  await Comment.findByIdAndDelete(commentId);
  const updatedPost = await Post.findByIdAndUpdate(
    post.id,
    { $pull: { comments: { id: commentId } } },
    { new: true, safe: true, upsert: true }
  ).populate(populateFields);
  return res.status(200).json({ updatedPost });
}
comment => comment['id'] === commentId
Your comment subdocument comes from MongoDB/Mongoose, so comment['id'] will likely be of type ObjectId, which will never strictly equal a string. Explicitly call toString() (or use some other approach to convert it to a string) before comparing:
comment => comment['id'].toString() === commentId
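Alternatively, here is a small sketch that sidesteps the string conversion, assuming the populated subdocument still carries _id as a Mongoose ObjectId (ObjectId's equals() accepts a string id directly):
const commentToDelete = post.comments.find(comment => comment._id.equals(commentId));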
The find itself works fine in the snippet below, copied from your post. I am assuming it is post.comments in your case and not a bare comments variable? Check for typos.
const comments = [
  { "id": "5d9b137ff542a30f2c135556", "addedBy": "5b8528131719dc141cf95c99" },
  { "id": "5d9b0ba2f28afc5c3013d4df", "addedBy": "5b8528131719dc141cf95c99" },
  { "id": "5d9b0c26f28afc5c3013d4e0", "addedBy": "5b8528131719dc141cf95c99" }
];
// For instance if commentId is '5d9b137ff542a30f2c135556'
// the resulting comment object should be {"id":"5d9b137ff542a30f2c135556","addedBy":"5b8528131719dc141cf95c99"}
const commentId = "5d9b137ff542a30f2c135556";
// However, commentToDelete is undefined
const commentToDelete = comments.find(comment => comment["id"] === commentId);
console.log(commentToDelete);
You can use this:
const result = comments.find(({ id }) => id === commentId);
console.log(result);
// should return { id: '5d9b137ff542a30f2c135556', addedBy: '5b8528131719dc141cf95c99' }

apollo-link-state: How to write Query resolvers?

I create my state link with default values, something like this:
const stateLink = withClientState({
  cache,
  resolvers,
  defaults: {
    quote: {
      __typename: 'Quote',
      name: '',
      phoneNumber: '',
      email: '',
      items: []
    }
  }
})
So my cache should not be empty. Now my resolvers map looks like this:
resolvers = {
  Mutation: { ... },
  Query: {
    quote: (parent, args, { cache }) => {
      const query = gql`query getQuote {
        quote @client {
          name phoneNumber email items
        }
      }`
      const { quote } = cache.readQuery({ query, variables: {} })
      return ({ ...quote })
    }
  }
}
The data source of my resolvers is the cache, right? So I have to query the cache somehow. But this is not working; I guess it is because I am trying to respond to the quote query, and to do that I am making another quote query.
I think I should get the quote data without querying for quote, but how?
I am getting this error:
Can't find field quote on object (ROOT_QUERY) undefined
Please help.
I just wanted to post the same question, and fortunately figured it out.
The readQuery method only allows you to query from the root. Instead you should use readFragment, because it lets you access any normalized object in the cache, as long as you have its id (something like GraphQlTypeName:0, typically constructed from the id and __typename fields). Your Query resolver should then look something like this:
protected resolvers = {
  Query: {
    getProdConfig: (parent, args, { cache, getCacheKey }) => {
      const id = getCacheKey({ __typename: 'ProdConfig', id: args.id });
      const fragment = gql`fragment prodConfig on ProdConfig {
        id,
        apiKey,
        backupUrl,
        serverUrl,
        cache,
        valid
      }`;
      const data = cache.readFragment({ fragment, id });
      return ({ ...data });
    }
  }
}
And the call through this.$apollo looks like this:
let result = this.$apollo.query({
  query: gql`
    query prodConfig($id: Int!) {
      getProdConfig(id: $id) @client {
        apiKey,
        backupUrl,
        serverUrl,
        cache,
        valid
      }
    }
  `,
  variables: { id: 0 }
});
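One caveat worth adding, stated as an assumption about the setup above: readFragment can only find the object (here ProdConfig:0) if something with that id has actually been written to the cache in normalized form, for example via cache.writeData from the Apollo Client 2.x API (the field values below are placeholders):
cache.writeData({
  id: 'ProdConfig:0',
  data: {
    __typename: 'ProdConfig',
    id: 0,
    apiKey: '',
    backupUrl: '',
    serverUrl: '',
    cache: true,
    valid: false
  }
});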

Mongoose - Populate Array With Reference Property

I have an array of Tags in my Post schema:
tags: [ { type: Schema.Types.ObjectId, ref: 'Tag' } ],
Tag looks like this:
{ name: String }
When I populate the tags array it is of course populated with tag object literals.
Is there a way I can instead have mongoose populate the array with only the name string from the tag?
I have tried only specifying the name, but then name is returned within an object literal.
Currently the population outputs:
[ { name: 'React' }, { name: 'JavaScript' } ]
But I would like it to be:
[ 'React', 'JavaScript']
Is there a way to do this with Mongoose ?
You can make use of a 'post' query middleware function. It is triggered after Model.find() or Model.findOne() runs, but before the results are handed back to your code, so inside it you can use Array.map to transform the data into the required format.
schema.post('findOne', function (doc) {
  // Replace each populated tag document with just its name.
  // Assumes 'tags' was populated on this query.
  if (doc) {
    doc.tags = doc.tags.map(tag => tag.name);
  }
});
You could also do the same for handling Model.find().
schema.post('find', function (docs) {
  // Transform every returned doc in the same way.
  docs.forEach(doc => {
    doc.tags = doc.tags.map(tag => tag.name);
  });
});
You can use a virtual that returns a reduction of the tags array:
schema.virtual('plainTags').get(function () {
  // First, check whether the 'tags' path is populated.
  if (!this.populated('tags')) {
    return this.tags
  }
  return this.tags.reduce(function (col, Tag) {
    col.push(Tag.name)
    return col
  }, [])
})
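A follow-up note, as an assumption about how you serialize these documents: virtuals such as plainTags are not included when a document goes through toJSON/toObject unless you opt in on the schema:
schema.set('toJSON', { virtuals: true });
schema.set('toObject', { virtuals: true });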
