How to Insert items into multiple Arrays in Node.js & MongoDB - javascript

This might be a weird question, but I believe nothing is completely impossible.
I have a list of users in MongoDB; each user has, among other things, a properties array which is currently empty.
In an Excel sheet, I have data that represents each user's properties, which I want to programmatically insert into each user's properties array.
Importing the Excel sheet is fast and easy; populating each user's properties is what gives me the problem.
I have added userId and propeId from the users and the properties they bought to identify them, as seen below:
router.put('/importdata', async (req, res) => {
  // upload queries
  const imported = req.files.importdata;
  const uploadpath = path.resolve(`public/excel_maneger/uploads/${imported.name}`);
  if (imported.truncated) {
    throw new Error("Uploaded file is too big, should not be more than 20 MB");
  }
  await imported.mv(uploadpath);
  const file = render.readFile(uploadpath);
  const sheets = file.SheetNames;
  const data = [];
  for (let i = 0; i < sheets.length; i++) {
    const sheetname = sheets[i];
    const sheetData = render.utils.sheet_to_json(file.Sheets[sheetname]);
    sheetData.forEach((item) => {
      data.push(item);
    });
  }
  try {
    const users = await User.find({ role: 'Customer' });
    for (let i = 0; i < users.length; i++) {
      data.forEach((d) => {
        if (users[i].id == d.id) {
          User.updateMany(
            {},
            {
              $set: {
                properties: {
                  propeId: d.propeId,
                },
              },
            },
            (err, d) => {
              if (err) console.log(err);
            }
          );
        }
      });
    }
  } catch (err) {
    console.log(err);
  }
});
The problem is that this code updates everyone in the database (including non-specified users) with the same information. Please, I need help; I am trying to import 11 thousand users' information from Excel into the database.

When you call User.updateMany(), you are not passing in the id: with an empty filter, whenever the if statement is true, it updates every user. You can use findByIdAndUpdate instead.
Also, you should be using async/await, since that is what you are using to find the users:
await User.findByIdAndUpdate(users[i]._id, { $set: { properties: { propeId: d.propeId } } })
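In context, the matching block could look like this (a sketch using the users and data variables from the question; swap $set for $push if you want to append to the properties array rather than overwrite it):
for (let i = 0; i < users.length; i++) {
  for (const d of data) {
    if (users[i].id == d.id) {
      // await each update so failures surface in the surrounding try/catch
      await User.findByIdAndUpdate(users[i]._id, {
        $set: { properties: { propeId: d.propeId } },
      });
    }
  }
}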

I have finally figured out where I made the mistake: I am supposed to use the $in operator to match multiple ids, as desired.
Here's the solution:
data.forEach((d) => {
  User.updateMany(
    { _id: { $in: [d.id] } },
    {
      $push: {
        properties: d.propeId,
      },
    },
    (err, d) => {
      if (err) return false;
    }
  );
});
The above solved the problem amazingly.
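A side note for an import of this size: instead of issuing one updateMany call per row, you could build a single bulkWrite payload so all ~11,000 updates go to MongoDB in one round trip. A minimal sketch, assuming the same data array and a Mongoose User model:
const ops = data.map((d) => ({
  updateOne: {
    filter: { _id: d.id },
    update: { $push: { properties: d.propeId } },
  },
}));
await User.bulkWrite(ops); // one round trip instead of thousands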

Related

How do you replace and save documents in MongoDB?

I currently have a database that holds 5 entries per document in my MongoDB. Here is an example.
I would like to update the documents in my database with more fields and information. Currently I am doing that with the code below.
NFT.findOne({ "name": name }, (err, value) => {
  if (err) { console.error(err) }
  if (!value) {
    console.log('no value')
    let newNFT = new NFT({name, slug, symbol, description, verifiedStatus, bannerUrl, logoUrl, floorPrice, stats, social})
    newNFT.save()
  } else {
    let newNFT = new NFT({name, slug, symbol, description, verifiedStatus, bannerUrl, logoUrl, floorPrice, stats, social})
    NFT.replaceOne({"name": name}, newNFT, {returnDocument: "after"})
  }
})
The reason for this question is: I have run console.log(NFT.find({ "name": name })) and gotten an object back with all of the fields; however, they don't seem to update in my online database. Because it pulls information from the web database, I know I'm connected, but the documents just aren't updating. What am I doing wrong here?
save() returns a Promise; you should await it.
Also, try changing the code that updates your existing value:
NFT.findOne({ name }, async (err, value) => {
  if (err) {
    console.error(err);
  }
  if (!value) {
    // Create new NFT
    let newNFT = new NFT({
      name,
      slug,
      symbol,
      description,
      verifiedStatus,
      bannerUrl,
      logoUrl,
      floorPrice,
      stats,
      social,
    });
    await newNFT.save();
  } else {
    // Update existing NFT
    value.name = name;
    value.slug = slug;
    value.symbol = symbol;
    value.description = description;
    value.verifiedStatus = verifiedStatus;
    value.bannerUrl = bannerUrl;
    value.logoUrl = logoUrl;
    value.floorPrice = floorPrice;
    value.stats = stats;
    value.social = social;
    await value.save();
  }
});
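As an aside, both branches could likely be collapsed into a single upsert; a sketch of that idea with the same Mongoose model and fields:
// Insert the document if no { name } match exists, otherwise update its fields
await NFT.findOneAndUpdate(
  { name },
  { name, slug, symbol, description, verifiedStatus, bannerUrl, logoUrl, floorPrice, stats, social },
  { upsert: true, new: true }
);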

Express does not return valid response

I am trying to use MongoDB to get some data for our analytics page. There's a lot of data being processed, but for some reason Express returns an empty array as the response, even though when I use console.log I see the data in the terminal.
This is my endpoint:
import Analytics from '../classes/Analytics';
import { Router } from 'express';
import role from '../middleware/role';

const router = Router();
router.use(role('admin'));

router.get('/analytics/weekly', async (req, res) => {
  try {
    const data = await Analytics.getWeekly();
    return res.status(200).send({ data });
  } catch (err) {
    console.log(err);
    return res.status(500).send({ message: 'Something went wrong, please try again later!' });
  }
});

module.exports = router;
This is where all the magic happens for the data variable:
class Analytics {
  static async getWeekly() {
    // ignore this one-day difference thing
    const startDate = moment().subtract(1, 'days').startOf('week').format();
    const endDate = moment().subtract(1, 'days').endOf('week').format();
    try {
      const orders = await Order.find(
        {
          sent_at: {
            $gte: startDate,
            $lte: endDate,
          },
        },
        { user: 1, created_at: 1, sent_at: 1 }
      );
      let counter = [];
      for await (const item of orders) {
        const date = moment(item.sent_at).format('YYYY-MM-DD HH');
        if (!counter[date]) counter[date] = [];
        if (!counter[date][item.user.username]) counter[date][item.user.username] = 0;
        counter[date][item.user.username] += 1;
      }
      return counter;
    } catch (err) {
      console.log(err);
    }
  }
}
The point of the static method above is to fetch all orders and count how many times each user has handled an order.
Now, when I console.log the data from the router endpoint, I see the data in the console perfectly, just how I wanted it to be:
'2021-07-03 22': [
Johnson: 10,
Mikaels: 15,
Vasquez: 24,
Blaskovich: 3
],
'2021-07-03 23': [
Johnson: 2,
Vasquez: 12,
Mikaels: 15,
Blaskovich: 5
]
The problem is that when I make a request to my endpoint, it returns an empty array []. What am I missing here?
Change this:
if (!counter[date]) counter[date] = [];
to this:
if (!counter[date]) counter[date] = {};
And this:
let counter = [];
to this:
let counter = {};
Your code here:
if (!counter[date][item.user.username]) counter[date][item.user.username] = 0;
is adding a property to an array, not adding an array element. JSON.stringify() ignores properties on an array; it only serializes actual array elements, so when you try to send the JSON version of the array, it always appears empty.
So, when you call res.status(200).send({ data }), the .send() method serializes your object, which contains a bunch of arrays that have no actual array elements, only properties.
By changing the arrays to objects, JSON.stringify() will serialize all those properties. Arrays and objects have many things in common, but they are not the same; you should use the appropriate type for your situation.
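You can see the difference directly in the Node REPL:
const arr = [];
arr['2021-07-03 22'] = { Johnson: 10 };
JSON.stringify(arr); // '[]' (named properties on an array are dropped)

const obj = {};
obj['2021-07-03 22'] = { Johnson: 10 };
JSON.stringify(obj); // '{"2021-07-03 22":{"Johnson":10}}'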
In addition, change this:
} catch (err) {
  console.log(err);
}
to this:
} catch (err) {
  console.log(err);
  throw err;
}
That way you are properly propagating errors back to the caller. Otherwise, you're just swallowing the error and returning undefined, both of which will cause problems for the caller.

Firebase functions: sync with Algolia doesn't work

I am currently trying to sync my Firestore documents with Algolia upon creation or update of a document. The path to the collection in Firestore is videos/video. The function seems to be triggering fine; however, after triggering, the Firebase function does not seem to relay any of the information to Algolia (no records are being created). I am not getting any errors in the log. (I also double-checked the rules and made sure the node could be read by default, and yes, I am on the Blaze plan.) Does anyone know how to sync a Firestore node with Algolia? Thanks for all your help!
"use strict";
const functions = require("firebase-functions");
const admin = require("firebase-admin");
const algoliasearch_1 = require("algoliasearch");

// Set up Firestore.
admin.initializeApp();
const env = functions.config();

// Set up Algolia.
// The app id and API key are coming from the cloud functions environment, as we set up in Part 1.
const algoliaClient = algoliasearch_1.default(env.algolia.appid, env.algolia.apikey);
// Since I'm using develop and production environments, I'm automatically defining
// the index name according to which environment is running. functions.config().projectId is a default property set by Cloud Functions.
const collectionindexvideo = algoliaClient.initIndex('videos');

exports.collectionvideoOnCreate = functions.firestore.document('videos/{uid}').onCreate(async (snapshot, context) => {
  await savevideo(snapshot);
});
exports.collectionvideoOnUpdate = functions.firestore.document('videos/{uid}').onUpdate(async (change, context) => {
  await updatevideo(change);
});
exports.collectionvideoOnDelete = functions.firestore.document('videos/{uid}').onDelete(async (snapshot, context) => {
  await deletevideo(snapshot);
});

async function savevideo(snapshot) {
  if (snapshot.exists) {
    const document = snapshot.data();
    // Essentially, you want your records to contain any information that facilitates search,
    // display, filtering, or relevance. Otherwise, you can leave it out.
    const record = {
      objectID: snapshot.id,
      uid: document.uid,
      title: document.title,
      thumbnailurl: document.thumbnailurl,
      date: document.date,
      description: document.description,
      genre: document.genre,
      recipe: document.recipe
    };
    if (record) { // Removes the possibility of snapshot.data() being undefined.
      if (document.isIncomplete === false) {
        // In this example, we are including all properties of the Firestore document
        // in the Algolia record, but do remember to evaluate if they are all necessary.
        // More on that in Part 2, Step 2 above.
        await collectionindexvideo.saveObject(record); // Adds or replaces a specific object.
      }
    }
  }
}

async function updatevideo(change) {
  const docBeforeChange = change.before.data();
  const docAfterChange = change.after.data();
  if (docBeforeChange && docAfterChange) {
    if (docAfterChange.isIncomplete && !docBeforeChange.isIncomplete) {
      // If the doc was COMPLETE and is now INCOMPLETE, it was
      // previously indexed in Algolia and must now be removed.
      await deletevideo(change.after);
    } else if (docAfterChange.isIncomplete === false) {
      await savevideo(change.after);
    }
  }
}

async function deletevideo(snapshot) {
  if (snapshot.exists) {
    const objectID = snapshot.id;
    await collectionindexvideo.deleteObject(objectID);
  }
}
I still don't know what I did wrong; however, if anyone else is stuck in this situation, this repository is a great resource: https://github.com/nayfin/algolia-firestore-sync. I used it and was able to properly sync Firebase and Algolia. Cheers!
// Takes the Algolia index and the key of the document to be deleted
const removeObject = (index, key) => {
  // then it deletes the document
  return index.deleteObject(key, (err) => {
    if (err) throw err
    console.log('Key Removed from Algolia Index', key)
  })
}

// Takes the Algolia index and the data to be added or updated
const upsertObject = (index, data) => {
  // then it adds or updates it
  return index.saveObject(data, (err, content) => {
    if (err) throw err
    console.log(`Document ${data.objectID} Updated in Algolia Index`)
  })
}

exports.syncAlgoliaWithFirestore = (index, change, context) => {
  const data = change.after.exists ? change.after.data() : null;
  const key = context.params.id; // gets the id of the document changed
  // If there is no data, then it was a delete event,
  if (!data) {
    // so delete the document from the Algolia index
    return removeObject(index, key);
  }
  data['objectID'] = key;
  // upsert the data to the Algolia index
  return upsertObject(index, data);
};
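For reference, the helper above is meant to be wired up from a single onWrite trigger, roughly like this (a sketch; the wildcard must be named {id} because the helper reads context.params.id, and the require path for the helper is hypothetical):
const functions = require('firebase-functions');
const algoliasearch = require('algoliasearch');
const { syncAlgoliaWithFirestore } = require('./algolia-sync'); // hypothetical module containing the helper above

const env = functions.config();
const client = algoliasearch(env.algolia.appid, env.algolia.apikey);
const index = client.initIndex('videos');

// onWrite fires on create, update and delete, so one trigger covers all three cases
exports.syncVideos = functions.firestore
  .document('videos/{id}')
  .onWrite((change, context) => syncAlgoliaWithFirestore(index, change, context));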

How can I use a variable saved from a mysql connection with NodeJS to an asynchronous function?

I'm trying to scrape a website with Puppeteer. I want to select the date of the last post inserted in my database and compare it to the dates collected by the scraper, so I can see whether a post is already in the database (using the date as the reference to see if it has been modified).
Here is my code:
const connection = mysql.createConnection({
  host: 'localhost',
  user: 'root',
  password: '',
  database: 'db_webcrawler_coches'
});

connection.connect((err) => {
  if (err) throw err;
  console.log('Connected!');
});

let lastPublishedDate;
let idCoches;

connection.query("SELECT id_coches, publish_date FROM coches ORDER BY publish_date DESC LIMIT 1", function (err, row) {
  if (err) throw err;
  lastPublishedDate = row[0].publish_date;
  idCoches = row[0].id_cochesNet;
  console.log("Published in", lastPublishedDate);
  console.log("Id Coches", idCoches);
});
const run = async () => {
  try {
    const options = {
      headless: false,
    };
    ...
    const news = await page.evaluate(() => {
      const idsList = [...document.querySelectorAll('div.mt-SerpList-item')].map(elem => elem.getAttribute("id")).filter(elem => elem.includes("#"));
      const datePost = [...document.querySelectorAll('span.mt-CardAd-date')].map(elem => elem.innerText);
      for (let i = 0; i < titlesCar.length; i++) {
        const finalDate = parsedDates[i];
        if (finalDate > lastPublishedDate || idCoches !== idsList[i]) {
          console.log("Not repeated");
          carsList[i] = [
            idsList[i],
            parsedDates[i]
          ];
        } else {
          console.log("Repeated");
        }
      }
      return carsList;
    });
    ...
  } catch (err) {
    console.log(err);
    await browser.close();
    console.log("Browser Closed");
  }
};

run();
As you can see, I want to check whether the date is the same, as well as the id taken from the query. However, an error appears that says Evaluation failed: ReferenceError: lastPublishedDate is not defined, and I imagine it will be the same with idCoches. I wrote some console.logs to see when it crashes, and it seems to happen when the code reaches the news function.
I'm not sure if it is because of the scope or because of the function. What do you think I should do to make it work? Could it be the scope?
Thank you!
EDIT: SOLVED!
I am posting this in case anyone faces a similar issue.
Indeed it was the scope; it is a quirk of Puppeteer. The function passed to page.evaluate() cannot access any variable outside of it, because it runs in the browser context. You have to pass the variables in explicitly:
await page.evaluate((variable_1, variable_2) => { /* ... */ }, variable_1, variable_2);
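Applied to the code above, it becomes something like this (note that only serializable values survive the trip into the browser context):
const news = await page.evaluate((lastPublishedDate, idCoches) => {
  // lastPublishedDate and idCoches are now defined inside the page context
  ...
  return carsList;
}, lastPublishedDate, idCoches);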
The callback to your query has probably not returned yet when the async function runs, so whatever you're trying to reference is not defined.
I'm not sure if your MySQL client supports promises, but if it does, you could do something like this:
const run = async () => {
  const row = await connection.query("SELECT id_coches, publish_date FROM coches ORDER BY publish_date DESC LIMIT 1");
  lastPublishedDate = row[0].publish_date;
  idCoches = row[0].id_cochesNet;
  ...
}
If that does not work, you could also run everything inside the callback of the query. Hope that helps.
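For instance, with the mysql2 package, which ships a promise API, the query could be awaited directly inside run (a sketch under that assumption):
const mysql = require('mysql2/promise');

const run = async () => {
  const connection = await mysql.createConnection({
    host: 'localhost',
    user: 'root',
    password: '',
    database: 'db_webcrawler_coches',
  });
  // query() resolves to [rows, fields]
  const [rows] = await connection.query(
    'SELECT id_coches, publish_date FROM coches ORDER BY publish_date DESC LIMIT 1'
  );
  const lastPublishedDate = rows[0].publish_date;
  const idCoches = rows[0].id_coches;
  // ... continue with Puppeteer, passing both values into page.evaluate
};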

MongoDB to Elasticsearch indexing

I am stuck at the point of indexing a data collection in Elasticsearch.
The following is the code I'm using to try to index the data from Mongo:
const elasticsearch = require('elasticsearch');

// instantiate an Elasticsearch client
var MongoClient = require('mongodb').MongoClient;
var ObjectID = require('mongodb').ObjectID;
var mongoDBName = 'mydb'; // Name of mongodb goes here
var mongoCollectionName = 'mycollection'; // Collection name of mongodb goes here
var connectionString = 'mongodb://127.0.0.1:27017/'; // put username and password for mongo here
var esIndexName = 'new-collection'; // Elasticsearch index name will go here
var bulk = [];

const client = new elasticsearch.Client({
  hosts: ['http://localhost:9200']
});

// ping the client to be sure Elasticsearch is up
client.ping({
  requestTimeout: 30000,
}, function (error) {
  // At this point, Elasticsearch is down, please check your Elasticsearch service
  if (error) {
    console.error('Elasticsearch cluster is down!');
  } else {
    console.log('Everything is ok');
  }
});

MongoClient.connect(connectionString + mongoDBName, function (err, db) {
  if (err) throw err;
  // for each object in a collection
  var collection = db.collection(mongoCollectionName);
  var counter = 0;
  collection.find().each(function (err, item, response, status) {
    console.log(item);
    Array.from(item).forEach(itemdata => {
      bulk.push({
        index: {
          _index: esIndexName,
          _type: mongoCollectionName,
        }
      });
      bulk.push(itemdata);
    });
    // perform bulk indexing of the data passed
    client.bulk({ body: bulk }, function (err, response) {
      if (err) {
        console.log("Failed Bulk operation".red, err);
      } else {
        console.log("Successfully imported %s".green, mongoCollectionName.length);
      }
      console.log(response);
    });
    if (item != null) {
      if (counter % 100 == 0) console.log("Syncing object id: " + item['_id'] + " #: " + counter);
      client.indices.create(
        { index: esIndexName },
        function (error, response) {
          if (error) {
            console.log(error);
          } else {
            console.log("created a new index", response);
          }
        }
      );
    }
    counter += 1;
  });
});
So here I'm trying to index data into Elasticsearch. I'm able to create the index, but I fail to insert the data into it. Can anyone help me here?
Where am I going wrong, and what mistake am I making?
I'm using Node.js here, just a simple function to test; later I will add a Lambda function to handle updates/deletes and any other change.
First of all, I would suggest tidying up your code; it's very difficult to see how the blocks are nested.
Now, there are several problems with your code:
Why are you doing Array.from(item).forEach(itemdata => {? item is a document object from Mongo, so calling Array.from on it has no effect.
You are calling the bulk API inside the .each callback, meaning you'll make an API call for each document. I don't think this is what you want.
You are creating the index after the bulk operation. This is wrong: you should create your ES index once and for all before inserting documents. This matters because in the future you'll want a more advanced configuration to process your documents.
Your ping call to ES is nice, but it doesn't prevent the rest of your code from running if the cluster is down.
So what you should do:
Create your ES index before iterating over your documents.
Iterate over your MongoDB documents and accumulate them in your body object.
When you have a batch of n documents, call the bulk API and reset your body, as in the sketch below.
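A minimal sketch of that flow, inside an async function and assuming an esClient like the one in the next answer, flushing every 500 documents:
const BATCH_SIZE = 500;
let body = [];

async function flush() {
  if (body.length === 0) return;
  await esClient.bulk({ body });
  body = [];
}

// 1. create the index up front
await esClient.indices.create({ index: esIndexName }, { ignore: [400] });

// 2. iterate over the Mongo cursor and accumulate pairs of action + document
for await (const doc of collection.find()) {
  const { _id, ...data } = doc;
  body.push({ index: { _index: esIndexName } });
  body.push(data);
  // each document adds two entries, so flush at 2 * BATCH_SIZE
  if (body.length >= BATCH_SIZE * 2) await flush();
}
await flush(); // send the final partial batch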
Here is the solution you are looking for
index.js
// MongoDB client config
var MongoClient = require('mongodb').MongoClient;
var mongoDBName = 'mydb'; // Name of mongodb goes here
var mongoCollectionName = 'mycollection'; // Collection name of mongodb goes here
var connectionString = 'mongodb://127.0.0.1:27017/'; // put username and password for mongo here

// Elasticsearch client config
const { Client } = require('@elastic/elasticsearch')
const esClient = new Client({ node: 'http://localhost:9200' });
var esIndexName = 'new-collection'; // Elasticsearch index name will go here
let bulk = [];

async function indexData() {
  const client = await MongoClient.connect(connectionString, { useNewUrlParser: true })
    .catch(err => { console.log(err); });
  if (!client) {
    return;
  }
  try {
    const db = client.db(mongoDBName);
    let collection = db.collection(mongoCollectionName);
    await collection.find().forEach((doc) => {
      bulk.push({
        index: {
          _index: esIndexName,
        }
      })
      let { _id, ...data } = doc;
      bulk.push(data);
    })
    console.log(bulk);
    await esClient.indices.create({
      index: esIndexName,
    }, { ignore: [400] })
    const { body: bulkResponse } = await esClient.bulk({ refresh: true, body: bulk })
    if (bulkResponse.errors) {
      const erroredDocuments = []
      // The items array has the same order as the dataset we just indexed.
      // The presence of the `error` key indicates that the operation
      // that we did for the document has failed.
      bulkResponse.items.forEach((action, i) => {
        const operation = Object.keys(action)[0]
        if (action[operation].error) {
          erroredDocuments.push({
            // If the status is 429 it means that you can retry the document,
            // otherwise it's very likely a mapping error, and you should
            // fix the document before trying it again.
            status: action[operation].status,
            error: action[operation].error,
            operation: bulk[i * 2],
            document: bulk[i * 2 + 1]
          })
        }
      })
      console.log(erroredDocuments)
    }
    const { body: count } = await esClient.count({ index: esIndexName })
    console.log(count)
  } catch (err) {
    console.log(err);
  } finally {
    client.close();
  }
}

indexData();
package.json
{
  "name": "elastic-node-mongo",
  "version": "1.0.0",
  "description": "Simple example to connect ElasticSearch, MongoDB and NodeJS",
  "main": "index.js",
  "dependencies": {
    "@elastic/elasticsearch": "^7.3.0",
    "mongodb": "^3.3.2",
    "nodemon": "1.18.3"
  },
  "scripts": {
    "dev": "nodemon",
    "start": "node index.js"
  },
  "keywords": [
    "nodejs",
    "node",
    "mongodb",
    "elasticsearch",
    "docker"
  ],
  "author": "Sathishkumar Rakkiasmy",
  "license": "ISC"
}
Clarifications
"I'm able to create the collection index but failed to insert the data in an index of elastic search."
The above sentence makes sense, because the bulk variable is unaltered. Refer to the links below for why the bulk variable is unaltered:
Why is my variable unaltered after I modify it inside of a function? - Asynchronous code reference
How do I return the response from an asynchronous call?
To learn more about asynchronous programming:
https://developer.mozilla.org/en-US/docs/Learn/JavaScript/Asynchronous
https://developer.mozilla.org/en-US/docs/Learn/JavaScript/Asynchronous/Async_await
You can use Logstash to import data from MongoDB into Elasticsearch. Please find the configuration below for your reference:
input {
  mongodb {
    codec => "json"
    uri => 'mongodb://localhost:27017/NewDb'
    placeholder_db_dir => '/home/devbrt.shukla/Desktop/scalaoutput/ELK/logstash-6.4.1/db_dir'
    placeholder_db_name => 'Employee_sqlite.db'
    collection => 'Employee'
    batch_size => 5000
    generateId => 'true'
    parse_method => "simple"
  }
}
filter {
  mutate {
    remove_field => [ "_id" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "employee-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
A Logstash configuration has three sections: Input, Filter and Output.
Input: takes data from SQL, MongoDB, MySQL, etc.
Filter: in this section, we can shape customized JSON to index into Elasticsearch.
Output: in this section we put the index name, doc type and address of the output, i.e. Elasticsearch.
