Using results from one RethinkDB query in another? - javascript

I want to send an HTTP POST with an address, which will grab a doc from the database. I would like to then use that doc's geodata in a getNearest function, ultimately returning the nearest four other locations. How do I go about stringing these two queries together?
r.table('dealer_locations').filter(function(dealer) {
    return {
        dealer: ("Address").match(request.body.Address)
            .getNearest(dealer, {
                index: 'geodata',
                maxResults: 4
            })
    }
}).run(conn, function(error, cursor) {
    if (error) {
        handleError(response, error);
    } else {
        cursor.toArray(function(error, results) {
            if (error) {
                handleError(response, error);
            } else {
                response.send(results);
            }
        });
    }
});

I'll reformulate the question just to make it a bit clearer:
Problem
Given a specific document with geodata, I want to also return the four nearest locations to that location.
Solution
First, make sure that you've created the geo index on the table you want to query:
r.table('dealer_locations').indexCreate('location', { geo: true })
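Note that index creation happens asynchronously; if you query right after creating the index, it may help to wait for it to finish building first, for example with indexWait:
r.table('dealer_locations').indexWait('location')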
After that, make sure that you've inserted an r.point object into the document. You can't use geo indexes on just any property; it has to be an r.point.
r.table('dealer_locations')
    .insert({
        name: 'San Francisco',
        // r.point takes (longitude, latitude), in that order
        location: r.point(-122.419416, 37.774929)
    })
After you've inserted all your documents, and they all have an r.point on the property you created the index for, you can start querying them.
If you want to get all the nearest locations for a location, you can do as follows:
r.table('dealer_locations')
    .filter({ name: 'San Francisco' })(0)
    .do(function (row) {
        return r.table('dealer_locations')
            .getNearest(row('location'), { index: 'location', maxResults: 4 })
    })
If you want to append the closest locations to a document, so that you can return both the document and the nearest locations at the same time, you can do that using the merge method.
r.table('dealer_locations')
    .filter({ name: 'San Francisco' })(0)
    .merge(function (row) {
        return {
            nearest_locations: r.table('dealer_locations')
                .getNearest(row('location'), { index: 'location', maxResults: 4 })
        }
    })
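Run against the sample document above, the merged result would look roughly like this; getNearest yields { dist, doc } pairs, and the dist value here is illustrative:
{
    name: 'San Francisco',
    location: { '$reql_type$': 'GEOMETRY', type: 'Point', coordinates: [ -122.419416, 37.774929 ] },
    nearest_locations: [
        { dist: 482.9, doc: { /* another dealer_locations document */ } },
        /* ... up to maxResults entries ... */
    ]
}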
Finally, if you want to get all the nearest locations based on an address (supposing your document has both an address property containing a string and a location property containing an r.point), you can do something like this:
r.table('dealer_locations')
    .filter(r.row('address').match(request.body.Address))
    (0) // Only get the first document
    .do(function (row) {
        // Return the 4 documents closest to the 'location' of the previous document
        return r.table('dealer_locations')
            .getNearest(row('location'), { index: 'location', maxResults: 4 })
    })
That being said, the problem with this approach is that the match might return multiple addresses, which are not necessarily the ones you want to match!
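Putting the pieces together with the Express handler from the original question (conn, handleError, request, and response are assumed to be the same ones from that snippet), a minimal sketch might look like this; the same caveat about multiple matches applies:
r.table('dealer_locations')
    .filter(r.row('address').match(request.body.Address))
    (0)
    .do(function (row) {
        return r.table('dealer_locations')
            .getNearest(row('location'), { index: 'location', maxResults: 4 });
    })
    .run(conn, function (error, results) {
        if (error) {
            handleError(response, error);
        } else {
            // getNearest returns an array of { dist, doc } pairs, already limited to 4
            response.send(results);
        }
    });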

Related

How to add a map object to mysql in nodeJS?

It's my first time creating a map object and I'm trying to add it to a mysql database but I have an error that says: "You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near 'key = '#42343731'' at line 1".
Examples of elements inside the map object are the following:
Map {
  '#44928649' => {
    id: '#44928649',
    name: '508',
    year: '2020',
    price: 34800,
    included_tax: true,
    state: true
  },
  '#44899990' => {
    id: '#44899990',
    name: 'yaris',
    year: '2018',
    price: 17800,
    included_tax: true,
    state: true
  }
}
My query is the following:
function addNewElement(value, key, map) {
    connection.query("INSERT INTO test(id, name, year, price, included_tax, status) VALUES ?", {key}, (err, res) => {
        if (err) throw err;
        console.log("new element: ", res.insertId);
    });
}
carsList.forEach(addNewElement);
carsList.forEach(addNewElement);
I don't think I'm adding the elements to the query the right way. Another question I have is: do you recommend keeping the object inside the value of every key of the map, or do you think it is better to make those maps as well?
Thank you!
{key} is shorthand for { key: key }, so you are passing the key of the map entry (i.e. '#44928649' or '#44899990') instead of its value.
This means that your SQL statement expands to something like:
INSERT INTO test(id, name, year, price, included_tax, status) VALUES key = '#44928649'
What you need to do is pass the value of the map entry instead.
Try the following (the mysql driver expands an object after SET ? into column = value pairs, so the column list is dropped):
connection.query("INSERT INTO test SET ?", value, (err, res)...

Issue in inserting bulk data in nodejs

Here are my objects:
{'name': 'Lipika', 'monday': 1}
{'name': 'Ram', 'monday': 0}
{'name': 'Lipika', 'tuesday': 1}
{'name': 'Ram', 'tuesday': 0}
Here are the two conditions:
I want to check whether Lipika is present in the DB. If not present, I should create the name in the User collection and then insert into the Attendance collection; if already present, I should just insert into the Attendance collection.
Here is what I have tried:
for (var key in row) {
    checkUser(name, function(data) {
        // Insert inside Attendance collection then..
    });
}
function checkUser(name, callback) {
    dbModel.User.findOne({ name: name }, function (err, data) {
        if (data) {
            callback('exist');
        } else {
            var User = new dbModel.User({ name: name });
            User.save(function (err) {
                if (err) {
                    console.log('Err', err);
                } else {
                    callback('not exist');
                }
            });
        }
    });
}
But it creates Lipika and Ram two times, since Node.js runs these calls asynchronously. How can I make it wait until the check and insert for one user have finished?
Try this code, it may help you!
function checkUser(name, callback) {
    dbModel.User.findOne({ name: name }).then((data) => {
        if (data) {
            // callback('exist');
            // user already present, insert attendance
        } else {
            var User = new dbModel.User({ name: name });
            User.save().then((data) => {
                // user not present, insert collection
            }, (err) => {
            });
        }
    }, (error) => {
        console.log('something went wrong');
    });
}
Instead of checking in the DB every time, why don't you make the name field unique? That will give a more accurate result, because separate reads and writes will cause problems if multiple requests come in at the same time.
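For reference, a unique index can be declared on the Mongoose schema; a minimal sketch, assuming the User schema is defined roughly like this:
var mongoose = require('mongoose');

var userSchema = new mongoose.Schema({
    // unique: true makes Mongoose build a unique index on 'name',
    // so duplicate inserts fail at the database level even when
    // several requests race each other
    name: { type: String, unique: true }
});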
Use an atomic upsert to avoid the two separate operations (read, then write, which is not a good approach). In Mongoose that is findOneAndUpdate (the underlying MongoDB command is findAndModify, which is not exposed on Mongoose models):
dbModel.User.findOneAndUpdate({
    name: name
}, {
    $setOnInsert: {
        name: name
    }
}, {
    new: true,
    upsert: true
}, (err, data) => {
    // 'data' is the existing or the newly created user
    callback('exist');
});

Push "programmatic" array to an Object's array property

I'm building a Thesaurus app, and for this question, the key point is that I'm adding a list of synonyms (words that have the same meaning) for a particular word (e.g. "feline", "tomcat", "puss" are synonyms of "cat").
I have a Word object, with a property - "synonyms" - which is an array.
I'm going to add an array of synonyms to the Word synonyms property.
According to the MongoDB documentation (see here), the only way to append all the elements of an array to a document's array property at once is the following:
db.students.update(
    { _id: 5 },
    {
        $push: {
            quizzes: {
                $each: [ { wk: 5, score: 8 }, { wk: 6, score: 7 }, { wk: 7, score: 6 } ]
            }
        }
    }
)
Let's re-write that solution to suit my data, before we venture further.
db.words.update(
    { baseWord: 'cat' },
    {
        $push: {
            synonyms: {
                $each: [ { _id: 'someValue', synonym: 'feline' }, { _id: 'someValue', synonym: 'puss' }, { _id: 'someValue', synonym: 'tomcat' } ]
            }
        }
    }
)
Nice and concise, but not what I'm trying to do.
What if you don't know your data beforehand and have a dynamic array which you'd like to feed in?
My current solution is to split up the array and run a forEach() loop, pushing each synonym to the Word object's synonyms array property like so:
//req.body.synonym = 'feline,tomcat,puss';
var individualSynonyms = req.body.synonym.split(',');
individualSynonyms.forEach(function(synonym) {
    db.words.update(
        { "_id": 5 },
        { $push: // this is the Word.synonyms
            { synonyms:
                {
                    $each: [{ // pushing each synonym as a Synonym object
                        uuid: uuid.v4(),
                        synonym: synonym
                    }]
                }
            }
        }, { upsert: true },
        function(err, result) {
            if (err) {
                res.json({ success: false, message: 'Error adding base word and synonym, try again or come back later.' });
                console.log("Error updating word and synonym document");
            }
            // using an 'else' clause here will flag a "multiple headers" error due to multiple JSON messages being returned
            // because of the forEach loop
            /*
            else {
                res.json({ success: true, message: 'Word and synonyms added!' });
                console.log("Update of Word document successful, check document list");
            }
            */
        });
    // if each insert happens, we reach here
    if (!err) {
        res.json({ success: true, message: 'Word and synonyms added!' });
        console.log("Update of Word document successful, check document list");
    }
});
This works as intended, but you may notice an issue at the bottom, where there's a commented-out ELSE clause and a check for 'if (!err)'.
If the ELSE clause is executed, we get a "multiple headers" error, because the loop causes multiple JSON responses to a single request.
As well as that, 'if (!err)' will throw an error, because 'err' is not in scope there; it is a parameter of the update() callback.
- If there were a way to avoid the forEach loop and directly feed the array of synonyms into a single update() call, then I could make use of if (!err) inside that callback.
You might be thinking: "Just remove the 'if (!err)' clause", but it seems unclean to send a JSON response without some sort of final error check beforehand, whether an if, else, else if, etc.
I could not find this particular approach in the documentation or on this site, and to me it seems like best practice if it can be done, as it allows you to perform a final error check before sending the response.
I'm curious about whether this can actually be done.
I'm not using the console, but I included a namespace prefix before calling each object for easier reading.
There is no need to "iterate", since $each takes an "array" as the argument. Simply .map() the array produced by .split() with the additional data:
db.words.update(
    { "_id": 5 },
    { $push: {
        synonyms: {
            $each: req.body.synonym.split(',').map(synonym =>
                ({ uuid: uuid.v4(), synonym }) // note: v4 must be called, not passed by reference
            )
        }
    }},
    { upsert: true },
    function(err, result) {
        if (!err) {
            res.json({ success: true, message: 'Word and synonyms added!' });
            console.log("Update of Word document successful, check document list");
        }
    }
);
So .split() produces an "array" from the string, which you "transform" using .map() into an array of objects, each holding a fresh uuid value and the "synonym" from one element of .split(). This is then a direct "array" to be applied with $each to the $push operation.
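For instance, assuming req.body.synonym is 'feline,tomcat,puss', the mapped array would look something like this (the uuid values are illustrative):
// 'feline,tomcat,puss'.split(',').map(synonym => ({ uuid: uuid.v4(), synonym }))
[
    { uuid: '110ec58a-a0f2-4ac4-8393-c866d813b8d1', synonym: 'feline' },
    { uuid: '27b4c342-4a77-4b21-b7ef-0e2e6f4dd2c9', synonym: 'tomcat' },
    { uuid: '8c5a2f0d-9b3e-4f6a-8d1c-7e2b5a9c0e4f', synonym: 'puss' }
]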
One request, instead of a request per synonym.

How to get result with specific fields in StrongLoop?

I am currently using StrongLoop as my API backend server and MongoDB as the data storage engine.
Let's say there is a collection called article. It has two fields: title and content. And there are two frontend pages, one to display a list of articles and one to view a single article.
Obviously the list page only needs the title field, while the view page needs both. Currently the GET method of the StrongLoop API returns all fields, including content, which costs extra traffic. Is there any way to return only specific fields?
MongoDB supports projection in the find() method for this. How can I do the same thing with StrongLoop?
Have you taken a look at the filters offered? http://docs.strongloop.com/display/LB/Querying+models
Query for the Node API:
server.models.Student.findOne({ where: { RFID: id }, fields: { id: true, schoolId: true, classId: true } }, function (err, data) {
    if (err)
        callback(err);
    else {
        // pass the projected document on to the caller
        callback(null, data);
    }
})
Query for the REST API:
$http.get('http://localhost:3000/api/services?filter[fields][id]=true&filter[fields][make]=true&filter[fields][model]=true')
.then(function (response) {
}, function (error) {
});
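If you prefer, the same projection can be passed as stringified JSON, which is easier to build dynamically (this assumes LoopBack's REST filter syntax, which accepts both forms):
// Equivalent call with a JSON filter built programmatically
var filter = { fields: { id: true, make: true, model: true } };
$http.get('http://localhost:3000/api/services?filter=' + encodeURIComponent(JSON.stringify(filter)))
    .then(function (response) {
        // response.data contains only the projected fields
    }, function (error) {
    });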
You can use field projections.
Sample Record:
{ name: 'Something', title: 'mr', description: 'some desc', patient: { name: 'Asvf', age: 20, address: { street: 1 }}}
First Level Projection:
model.find({ fields: { name: 1, description: 1, title: 0 } })
I think StrongLoop does not yet support second-level object filters. Does anyone know how to filter second-level object properties, or is this yet to be implemented?
Second Level Projection: (Need help here)
Ex: 2
model.find({ fields: { name: 1, 'patient.name': 1, 'patient.age': 1, 'patient.address': 0 } })
// Which returns { name } only

Mongoose update previously fetched collection

I would like to update a collection. Docs seem unclear on this.
I am wondering how to achieve the following:
Order.find({ _id: { $in: ids } }).exec(function(err, items, count) {
    // Following gives error - same with save()
    items.update({ status: 'processed' }, function(err, docs) {
    });
});
I know how to batch save like this:
Model.update({ _id: id }, { $set: { size: 'large' }}, { multi: true }, callback);
But that requires setting my query again.
I've also tried:
Order.collection.update(items...
But that throws a max call stack error.
In Mongoose, model.find(callback) returns an array of documents via the callback. You can call save() on a document but not on an array, so you can use a for loop or forEach over the array.
Order
    .find({ _id: { $in: ids } })
    .exec(function (err, items) {
        items.forEach(function (it) {
            it.status = 'processed'; // apply the change before saving each document
            it.save(function (err) {
                console.log('you have saved ', it);
            });
        });
    });
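Alternatively, since the ids are already known, the whole batch can be updated in one call, reusing the Model.update form shown in the question, without fetching the documents at all:
Order.update(
    { _id: { $in: ids } },             // same query, reusing the id list
    { $set: { status: 'processed' } },
    { multi: true },                   // apply to every matched document
    function (err, raw) {
        if (err) return console.error(err);
        console.log('updated', raw);
    }
);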
