With the RethinkDB JavaScript driver, is there a way to determine if there is another document available while within a "cursor.each" or a "cursor.on('data')" event?
Via each
cursor.each(function(err, doc) {
  if (err) throw err;
  // process document here
  // Is there a way to tell if there is another doc coming?
}, function() {
  // End of data here, no document passed
});
or via Event Emitter
cursor.on('data', function(doc) {
  // handle doc, check if another doc is coming or not
});
cursor.on('end', function() {
  // no document passed, just indicating end of data
});
There are a couple of ways you could go about this. Keep in mind that these only work if you're not currently reading from a changefeed.
1. Using cursor.toArray()
You could first convert the cursor into an array and then just use the index to determine if there is another row on the cursor.
r
  .table('hello')
  .filter({ name: 'jorge' })
  .run(conn)
  .then(function (cursor) {
    return cursor.toArray();
  })
  .then(function (array) {
    array.forEach(function (row, i) {
      if (i === array.length - 1) {
        // This is the last row; there are no more rows after this one
      } else {
        // There are more rows
      }
    });
  });
As coffeemug mentioned, this approach won't work for large datasets, since converting the cursor into an array means loading all the data into memory.
2. Using cursor.next
A less convenient but probably more performant way is to use cursor.next. This works because the cursor's .next method returns an error when there are no more rows in the cursor.
Your code would look something like this:
r
  .table('hello')
  .filter({ name: 'jorge' })
  .run(conn)
  .then(function (cursor) {
    var hasNextRow = function (cb, prevRow) {
      cursor.next(function (err, row) {
        if (err) return cb(false, prevRow);
        cb(true, prevRow, row);
      });
    };
    var consoleRow = function (row) {
      hasNextRow(function (hasNextRow, prevRow, row) {
        if (prevRow) {
          if (!hasNextRow) console.log('isLast');
          // This is your row
          console.log(prevRow);
        }
        if (hasNextRow) {
          consoleRow(row);
        }
      }, row);
    };
    consoleRow();
  });
Basically, this code saves a reference to the previous row and then fetches the next one. At that point it knows whether a next row exists, and you can handle any last-row behavior using the prevRow variable.
The each cursor method is really meant to process the elements independently. If you need to do something more sophisticated, I'd suggest using next (http://rethinkdb.com/api/javascript/next/) -- see the examples for details on how to use the method.
If your dataset is short, you can also call cursor.toArray() and just get the array of elements.
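For reference, here is a minimal sketch of a next-based loop, assuming the driver's callback style where next reports an error once the cursor is exhausted:
// Drain a cursor with cursor.next, stopping when it reports that no rows remain.
function processAll(cursor, done) {
  cursor.next(function (err, row) {
    if (err) {
      // The driver errors here once there are no more rows;
      // this sketch treats any error as end-of-cursor.
      return done();
    }
    // process row here
    processAll(cursor, done); // recurse to fetch the next row
  });
}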
Related
So the document contains an array of objects, and each object contains its own array. How would I go about updating one of the elements in an array that's inside an object which is itself inside another array? I've read some things about $, but I don't completely understand how to use it to address a position. I know the position of the element, but I can't just say $[] because the position is defined in a variable and not a string...
I've tried doing a simple
db.collection.findOne({...}, (err, data) => {...});
and then changing the arrays in the objects in the outer array with a simple:
data.arr[x].type[y] = z; data.save().catch(err => {console.log(err)});
But it doesn't save the new values I set for the element of the array.
Sample structure after the proposed solution from @Tom Slabbaert:
Data.findOne({
  userID: 'CMA'
}, (err, CMA) => {
  if (err) { console.log(err) }
  if (CMA) {
    for (var i = 0; i < CMA.stockMarket.length; i++) {
      if (CMA.stockMarket[i].name == data.userID) {
        for (var z = 0; z < CMA.stockMarket[i].userStock.length; z++) {
          if (z == company) {
            var updateAmount = CMA.stockMarket[i].userStock[z] + args[1]
            var updateKey = `stockMarket.${i}.userStock.${z}`
            Data.updateOne({userID: 'CMA'}, {'$set': {[updateKey]: updateAmount}})
          }
        }
      }
    }
  }
});
-------------------------EDIT-------------------------
So I tried changing some things around in the database to see if that would fix the problem I was having. I modified the updated code that was provided by @Tom Slabbaert, but nothing seems to work for some reason :/ Here's what I have so far; at this point I hope it's just a syntax error somewhere, because this is really frustrating. Note that I'm still using the for loops here to check whether the info exists, and if not, to push that info into the database. This might only be temporary until I find a better way / if there is a better way.
for (var i = 0; i < CMA.userStocks.length; i++) {
  if (CMA.userStocks[i].name == data.userID) {
    for (var z = 0; z < CMA.userStocks[i].shares.length; z++) {
      //console.log(CMA.userStocks[i].shares[z].companyName)
      if (CMA.userStocks[i].shares[z].companyName == args[0]) {
        var updateKey = `CMA.userStocks.$[elem1].shares.$[elem2].amount`
        Data.updateOne(
          {userID: 'CMA'},
          {
            "$inc": {
              [updateKey]: args[1]
            }
          },
          {
            arrayFilters: [
              {
                "elem1.name": data.userID,
                "elem2.companyName": args[0]
              }
            ]
          }
        )
        purchaseComplete(); return;
      }
    }
    CMA.userStocks[i].shares.push({companyName: args[0], amount: parseInt(args[1])})
    CMA.save().catch(err => {console.log(err)});
    purchaseComplete(); return;
  }
}
CMA.userStocks.push({name: data.userID, shares: [{companyName: args[0], amount: parseInt(args[1])}]});
CMA.save().catch(err => {console.log(err)});
purchaseComplete(); return;
The data I'm trying to find and change is structured like the following:
And what I'm trying to change in the end is the 'amount' (which is an integer)
_id: (Not relevant in this question)
userID: 'CMA'
stockMarket: [...] (Not relevant in this question)
userStocks: [
  Object: (position 0 in userStocks array)
    name: 'string' (equal to data.userID in the code)
    shares: [
      Object: (position 0 in shares array)
        companyName: 'string' (this is args[0] in the code)
        amount: integer
    ]
]
You can just prepare the "key" ahead of time. like so:
const updateKey = `arr.${x}.type.${y}`
db.collection.updateOne(
  {...},
  {
    "$set": {
      [updateKey]: z
    }
  })
Mongo Playground
Mongo's positional operators ($ and $[]) are usually only required when you don't know the position in the array and want to use a condition to update the element.
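For comparison, the condition-based form looks something like this (a sketch with made-up field values):
// $ resolves to the index of the first array element matched by the query
// condition, so no numeric position is needed.
db.collection.updateOne(
  { userID: 'CMA', 'arr.name': 'someName' },
  { $set: { 'arr.$.type': z } }
)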
------ EDIT ------
Given your sample code, you just have a minor syntax error:
var updateKey = `stockMarket.${i}.userStock.${z}`
Should just be:
var updateKey = `CMA.stockMarket.${i}.userStock.${z}`
However, after seeing your code, I recommend the following solution instead, which uses a single update with arrayFilters; it cleans up the code quite a bit:
const updateKey = `CMA.stockMarket.$[elem1].userStock.${company}`;
db.collection.update(
  {userID: 'CMA'},
  {
    "$inc": {
      [updateKey]: args[1]
    }
  },
  {
    arrayFilters: [
      {
        "elem1.name": data.userID
      }
    ]
  })
Mongo Playground
Well, I found something that worked. Apparently the db.collection.updateMany call didn't save unless I put a .then() on the end? I have no idea why, but it's the same with an aggregate I made. (It basically does the same as a Data.findOne followed by a save, but it isn't limited by the parallel-save error.)
Solution I found with aggregation:
<collection field> = <new data for collection field>
Data.aggregate([
  {
    $match: { // This is used to create a filter
      ['<insert field>']: <insert filter>
    }
  }, {
    $addFields: { // This is used to update existing data, or create a new field containing the data if the field isn't found
      ['<collection field>']: <new data for collection field>
    }
  }, {
    $merge: { // This is used to merge the new data / document with the rest of the collection, which has the same effect as a standard save
      into: {
        db: '<insert database name>',
        coll: '<insert collection name>'
      }
    }
  }
]).then(() => {
  // After it's done, do something here, or nothing at all; it doesn't matter, as long as the .then() remains. I found that without this part the code will not save / work for some reason.
}); return;
Solution I found with db.collection.updateMany
db.collection.updateMany(
  {<insert field>: filter}, {$set: {'<insert field>': <new data>}}
).then(() => {
  // This .then() statement was needed in my case for updateMany to work correctly; it wouldn't save data without it for some reason. It does not need to contain any actual logic, as long as it's here.
});
With this new info I could simply access and change the data I was trying to before, using the earlier instructions provided by @Tom Slabbaert together with my new method of actually making it save the changes into the document.
I see that someone has given me a minus 1. I am a 55 year old mother who has no experience. I have many skills but this is not one of them. I am absolutely desperate and have bust myself to get this far. If you cannot help, I accept that, but please do not be negative towards me. I am now crying. Some encouragement would be much appreciated.
I have a page which displays items from a database in a repeater. The code searches the items using several drop-down filters, which are populated from the database. Intermittently, and seemingly at random (no pattern is emerging despite extensive testing), the code fails to populate one or more of the drop-down filters, which then show their default settings rather than the values populated from the database. I discovered this by repeatedly visiting or refreshing the page. Often the code works; then every 3 or 4 times, one or more of the drop-down filters shows its default settings instead (and the next time it goes wrong, it might be the same filter or a different set of filters that fails).
This is the code. On this page there are 3 drop-down filters, but I have several pages like this, each displaying and searching a different database with up to 10 drop-down filters per page, and they all have this intermittent problem...
import wixData from "wix-data";
$w.onReady(function () {
  $w('#iTitle')
  $w('#iCounty')
  $w('#iGeog')
  $w('#dataset1')
  $w('#text102')
});
let lastFilterTitle;
let lastFilterCounty;
let lastFilterGeog;
export function iTitle_change(event, $w) {
  filter($w('#iTitle').value, lastFilterCounty, lastFilterGeog);
}
export function iCounty_change(event, $w) {
  filter(lastFilterTitle, $w('#iCounty').value, lastFilterGeog);
}
export function iGeog_change(event, $w) {
  filter(lastFilterTitle, lastFilterCounty, $w('#iGeog').value);
}
function filter(title, county, geog) {
  if (lastFilterTitle !== title || lastFilterCounty !== county || lastFilterGeog !== geog) {
    let newFilter = wixData.filter();
    if (title)
      newFilter = newFilter.eq('title', title);
    if (county)
      newFilter = newFilter.eq('county', county);
    if (geog)
      newFilter = newFilter.eq('geog', geog);
    $w('#dataset1').setFilter(newFilter)
      .then(() => {
        if ($w('#dataset1').getTotalCount() === 0) {
          $w('#text102').show();
        } else {
          $w('#text102').hide();
        }
      })
      .catch((err) => {
        console.log(err);
      });
    lastFilterTitle = title;
    lastFilterCounty = county;
    lastFilterGeog = geog;
  }
}
// Run a query that returns all the items in the collection
wixData.query("Psychologists")
  // Get the max possible results from the query
  .limit(1000)
  .ascending("title")
  .distinct("title")
  .then(results => {
    let distinctList = buildOptions(results.items);
    // unshift() is like push(), but it prepends an item at the beginning of an array
    distinctList.unshift({ "value": '', "label": 'All Psychologists' });
    // Assign the options list built from the unique titles
    $w("#iTitle").options = distinctList
  });
function buildOptions(items) {
  return items.map(curr => {
    // Use map to build the options list in the format { label: uniqueTitle, value: uniqueTitle }
    return { label: curr, value: curr };
  })
}
// Run a query that returns all the items in the collection
wixData.query("Psychologists")
  // Get the max possible results from the query
  .limit(1000)
  .ascending("county")
  .distinct("county")
  .then(results => {
    let distinctList = buildOptions(results.items);
    // unshift() is like push(), but it prepends an item at the beginning of an array
    distinctList.unshift({ "value": '', "label": 'All Counties' });
    // Assign the options list built from the unique counties
    $w("#iCounty").options = distinctList
  });
function buildOptions1(items) {
  return items.map(curr => {
    // Use map to build the options list in the format { label: uniqueTitle1, value: uniqueTitle1 }
    return { label: curr, value: curr };
  })
}
// Run a query that returns all the items in the collection
wixData.query("Psychologists")
  // Get the max possible results from the query
  .limit(1000)
  .ascending("geog")
  .distinct("geog")
  .then(results => {
    let distinctList = buildOptions(results.items);
    // unshift() is like push(), but it prepends an item at the beginning of an array
    distinctList.unshift({ "value": '', "label": 'All Regions' });
    // Assign the options list built from the unique regions
    $w("#iGeog").options = distinctList
  });
function buildOptions2(items) {
  return items.map(curr => {
    // Use map to build the options list in the format { label: uniqueTitle2, value: uniqueTitle2 }
    return { label: curr, value: curr };
  })
}
export function button45_click(event, $w) {
  // Add your code for this event here:
  filter($w('#iTitle').value = '', $w('#iCounty').value = '', $w('#iGeog').value = '');
}
My experience and knowledge are very limited, so the answer may well be very simple. Any help would be much appreciated, as I will have to abandon my project if I can't find a solution. Thank you
For context: I have a cron-job.org job that fires an HTTPS function in my Firebase project.
In this function, I have to go through all docs inside a collection and update a counter (each doc might have a different counter value). If the counter reaches a limit, I'll update another collection (independent from the first one), and delete the doc entry that reached the limit. If the counter is not beyond the limit, I simply update the doc entry with the updated counter value.
I tried adapting examples from the documentation, and tried using transactions and batched writes, but I'm not sure how to proceed. According to the description of transactions, that's the way to go, but the examples only show how to edit a single doc.
This is what I have (tried adapting a realtime db sample):
function updateCounter() {
  var ref = db.collection('my_collection_of_counters');
  return ref.get().then(snapshot => {
    const updates = {};
    snapshot.forEach(child => {
      var docData = child.data();
      var newCounter = docData.counter + 1;
      if (newCounter == 10) {
        // TO-DO: add to stock
        updates[child.key] = null;
      } else {
        docData.counter = newCounter;
        updates[child.key] = docData;
      }
    });
    // execute all updates in one go and return the result to end the function
    return ref.update(updates);
  });
}
It doesn't work; collections don't have an update method. What is the best approach to updating each doc in a collection? One by one? A transaction? Is there an example?
PS: updateCounter is a function being called by the https trigger. Cron+trigger is working fine.
EDIT
When an item reaches the threshold, I want to update another collection, independent of the counter one. Are nested transactions a good solution?
Modified code:
function updateCounter() {
  var ref = db.collection('my_collection_of_counters');
  var transaction = db.runTransaction(t => {
    return t.get(ref)
      .then(snapshot => {
        let docs = snapshot.docs;
        for (let doc of docs) {
          var item = doc.data();
          var newCounter = item.counter + 1;
          if (newCounter == 10) {
            console.log("Update my_stock");
            // ADD item.quantity to stock collection
          } else {
            t.update(doc.ref, {counter: newCounter});
          }
        }
      });
  })
  .then(result => {
    console.log('Transaction success');
  })
  .catch(err => {
    console.log('Transaction failure:', err);
  });
}
As you already noted yourself, you'll want to do this in a transaction to ensure that you can update the current counter value in a single operation. You can also create the new document, and delete the existing one, in that same transaction once your counter reaches its threshold. I don't see any benefit of doing this for all documents in a single transaction, since the operation on each doc seems unrelated to the others.
In a Firestore transaction, you perform the operations on a Transaction object as shown in the documentation. In your case you'd:
Get the current document with transaction.get().
Get the counter from the document.
Increment the counter.
If the new value is below your threshold:
Call transaction.update() to write the new counter value into the database
If the new value is above your threshold:
Call transaction.create on the new collection to create the document there.
Call transaction.delete on the existing document, to delete it.
For more, I recommend scanning the reference documentation for the Transaction class.
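Put together, a minimal sketch of those steps (the collection names and the threshold of 10 come from your code; the stock payload is a placeholder):
// Increment one counter doc in a transaction; create a doc in the stock
// collection and delete the counter doc once the threshold is reached.
function incrementCounter(docId) {
  const counterRef = db.collection('my_collection_of_counters').doc(docId);
  return db.runTransaction(t => {
    return t.get(counterRef).then(snapshot => {
      const item = snapshot.data();
      const newCounter = item.counter + 1;
      if (newCounter < 10) {
        // Below the threshold: just write the new counter value.
        t.update(counterRef, { counter: newCounter });
      } else {
        // At the threshold: create the doc in the other collection and
        // delete this one, atomically within the same transaction.
        const stockRef = db.collection('my_stock').doc(docId);
        t.create(stockRef, { quantity: item.quantity });
        t.delete(counterRef);
      }
    });
  });
}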
I have a mongoose model that looks like this:
module.exports = mongoose.model('Item', {
  text : String,
  position: Number
});
And I'm looking to have the position field increment based on something like the .length of all the documents, so I can sort the results of a find-all:
// get All Items
app.get('/itemsList', function(req, res) {
  // use mongoose to get all items in the database
  Item.find({})
    .sort({ position: 1 }) // sort by ascending position
    .exec(function(err, items) {
      // if there is an error retrieving, send the error and stop
      if (err)
        return res.send(err);
      res.json(items); // return all items in JSON format
    });
});
Is there a way to auto-fill a number for the Position field with some javascript in node.js?
// create an item
app.post('/api/item', function(req, res) {
  // create an item; information comes from an AJAX request from Angular
  Item.create({
    text : req.body.text,
    position: // something using ++items.length
  }, function(err, item) {
    if (err)
      return res.send(err);
  });
});
Mongoose lets you hook into the save, validate and remove methods and execute code before and after they're executed.
This code can be asynchronous. For example, in your case you could probably do this:
var schema = mongoose.Schema({
  text : String,
  position: Number
});
schema.pre("validate", function(next) {
  var doc = this;
  // If 'position' is not filled in, fill it in.
  // Not using !position because 0 might be a valid value.
  if (typeof doc.position !== "number") {
    // Count the number of Items *
    mongoose.model("Item").count(function(err, num) {
      // If there was an error, pass it to next().
      if (err)
        return next(err);
      // Update the position, then call next();
      doc.position = num;
      return next();
    });
  } else {
    // There is no need to count, so call next().
    next();
  }
});
module.exports = mongoose.model('Item', schema);
More on this in the Mongoose middleware documentation.
Before validation starts, the number of Items is counted. Afterwards, the position is set.
Validation and other pre-validator ** hooks will not start until the above code has finished.
* I'm using mongoose.model here to fetch the model because the model is not compiled yet (that happens a bit below).
** The documentation shows you how you can make multiple pre-validator hooks execute in parallel. I've chosen not to do this in this example because the code is easier to read and because you might actually need the validators to run sequentially.
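For illustration, a parallel hook in the older Mongoose middleware API looks roughly like this (someAsyncWork is a hypothetical async task):
// A parallel pre-validate hook: next() lets the following middleware start
// immediately, while done() must be called when this hook's own work finishes.
schema.pre("validate", true, function(next, done) {
  next();
  someAsyncWork(this, done); // hypothetical; calls done() or done(err) when finished
});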
In the pre-validation hook, you could place some logic in the else-case. When inserting an Item with an existing position value, you'll want to move every record down. You can do this by doing the following:
Use this.isModified("position") to check if the value was changed since you last saved. You might also need this.isNew (a property, not a method).
Check if there is an existing document with the same position. Something like Item.where({_id: {$ne: this._id}, position: this.position}).count()
If there is, execute: Item.update({position: {$gte: this.position}}, {$inc: {position: 1}}, {multi: true})
Then call next() to save your doc.
The above should work. It will leave gaps when you remove documents however.
Also, look into indexes. You'll want to add one on the position field. Perhaps even a unique index.
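Declaring that index on the schema above is a one-liner (a sketch; making it unique is only safe once duplicate positions are impossible):
// Index the position field; unique: true additionally rejects duplicate positions.
schema.index({ position: 1 }, { unique: true });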
Per @RikkusRukkus's steps for moving records down, here's the logic for the else-case (to be tested):
// load mongoose since we need it to define a schema and model
var mongoose = require('mongoose');
var ItemSchema = mongoose.Schema({
  text : String,
  position: Number
});
// before validation starts, the number of Items is counted..afterwards, the position is set
ItemSchema.pre("validate", function(next) {
  var doc = this;
  // if 'position' is not filled in, fill it in..not using !position because 0 might be a valid value
  if (typeof doc.position !== "number") {
    // count the number of Items *
    // use mongoose.model to fetch the model because the model is not compiled yet
    mongoose.model("Item").count(function(err, num) {
      // if there was an error, pass it to next()
      if (err)
        return next(err);
      // set the position, then call next();
      doc.position = num;
      return next();
    });
  } else if (doc.isModified("position") || doc.isNew) {
    // check if there is an existing document with the same position
    // use mongoose.model to fetch the model because the model is not compiled yet
    mongoose.model("Item").where({_id: {$ne: doc._id}, position: doc.position}).count(function (err, count) {
      // if there was an error, pass it to next()
      if (err)
        return next(err);
      // if there is a doc with the same position, execute an update to move down all the $gte docs
      if (count > 0) {
        // use mongoose.model to fetch the model because the model is not compiled yet
        mongoose.model("Item").update({position: {$gte: doc.position}}, {$inc: {position: 1}}, {multi: true}, function(err, numAffected) {
          // call next() (with or without an error)
          next(err);
        });
      } else {
        // there are no docs that need to move down, so call next()
        next();
      }
    });
  } else {
    // there is no need to count or update positions, so call next()
    next();
  }
});
module.exports = mongoose.model('Item', ItemSchema);
I basically need to make about 3 calls to get the data for a JSON object. It's basically a JSON array of JSON objects, each with some attributes, one of which is an array of other values selected using a second query; each of those in turn has an array inside, selected with another db call.
I tried using async.concatSeries so that I can dig down into the bottom call and put together all the information I collected for one root JSON object, but that's creating a lot of unexpected behaviour...
Example of JSON
[
  {
    "item" : "firstDbCall",
    "children" : [
      {
        "name" : "itemDiscoveredWithSecondDBCall",
        "children" : [ "itemsDiscoveredWith3rdDBCall" ]
      }
    ]
  }
]
This is really difficult using node.js. I really need to figure out how to do this properly since I have to do many of these for different purposes.
EDIT
This is the code I have. There's some strange behaviour with async.concatSeries: the results callback gets called multiple times, after each one of the functions finishes for each array, so I had to put a check in place. I know it's very messy code, but I've just been putting band-aids all over it for the past 2 hours to make it work...
console.log("GET USERS HAREDQARE INFO _--__--_-_-_-_-_____");
var query = "select driveGroupId from tasks, driveInformation where agentId = '"
+ req.params.agentId + "' and driveInformation.taskId = tasks.id order by driveInformation.taskId desc;";
connection.query(query, function(err, rows) {
if (rows === undefined) {
res.json([]);
return;
}
if(rows.length<1) { res.send("[]"); return;}
var driveGroupId = rows[0].driveGroupId;
var physicalQuery = "select * from drives where driveGroupId = " + driveGroupId + ";";
connection.query(physicalQuery, function(err, rows) {
console.log("ROWSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSS");
console.log(rows);
async.concatSeries(rows, function(row, cb) {
console.log("-------------------------------SINGLE ROW-------------------------------------");
console.log(row);
if(row.hasLogicalDrives != 0) {
console.log("HAS LOGICAL DRIVES");
console.log(row.id);
var query = "select id, name from logicalDrives where driveId = " + row.id;
connection.query(query, function(error, drives) {
console.log("QUERY RETURNED");
console.log(drives);
parseDriveInfo(row.name, row.searchable, drives, cb);
});
}
else
var driveInfo = { "driveName" : row.name, "searchable" : row.searchable};
console.log("NO SUB ITEMS");
cb(null, driveInfo);
}, function(err, results) {
console.log("GEETTTTINGHERE");
console.log(results);
if(results.length == rows.length) {
console.log("RESULTS FOR THE DRIVE SEARCH");
console.log(results);
var response = {"id": req.params.agentId};
response.driveList = results;
console.log("RESPONSE");
console.log(response);
res.json(response);
}
});
});
});
};
parseDriveInfo = function(driveName, searchable, drives, cb) {
  async.concatSeries(drives, function(drive, callback) {
    console.log("SERIES 2");
    console.log(drive);
    console.log("END OF DRIVE INFO");
    var query = "select name from supportedSearchTypes where logicalDriveId = " + drive.id;
    connection.query(query, function(error, searchTypes) {
      drive.searchTypes = searchTypes;
      var driveInfo = {
        "driveName" : driveName,
        "searchable" : searchable,
        "logicalDrives" : drive
      };
      callback(null, driveInfo);
    });
  }, function (err, results) {
    console.log("THIS IS ISISIS ISISISSISISISISISISISISISIS");
    console.log(results);
    if (results.length === drives.length) {
      console.log("GOTHERE");
      cb(null, results);
    }
  });
}
Getting good enough with async to use exactly the right combination of methods under the right circumstances takes a fair amount of experience. Most likely your case can be handled with async.waterfall, since it's query1, then query2(dataFoundByQuery1), then query3(dataFoundByQuery2). But depending on the circumstances you need to mix and match async methods appropriately, and sometimes use two levels: for example, a "big picture" async.waterfall where some of the steps in the waterfall do an async.parallel or async.series as needed. I've never used async.concat, and given your needs I think you have chosen the wrong method. The workhorses are async.each, async.eachSeries, async.waterfall, and async.map, at least for the web app and DB query use cases I mostly encounter, so make sure you really understand those before exploring the more specific convenience methods.
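To make that concrete, here is a minimal async.waterfall sketch of a query1-then-query2-then-query3 chain; the SQL strings and result handling are placeholders, not your actual schema:
var async = require('async');
async.waterfall([
  function (cb) {
    connection.query("select ... /* query 1 */", function (err, rows1) {
      cb(err, rows1);
    });
  },
  function (rows1, cb) {
    // build the second query from what the first one found
    connection.query("select ... /* query 2 using rows1 */", function (err, rows2) {
      cb(err, rows1, rows2);
    });
  },
  function (rows1, rows2, cb) {
    connection.query("select ... /* query 3 using rows2 */", function (err, rows3) {
      cb(err, { rows1: rows1, rows2: rows2, rows3: rows3 });
    });
  }
], function (err, results) {
  if (err) return console.error(err);
  // assemble the nested JSON from results here
});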
EDIT: This is a more in-depth example based on the connection library you seem to be using. Please note, some of this is JavaScript pseudocode. Things like adding objects to the resultsArray are clearly not complete; the only thing I took time to make sure was correct is the "flow of logic" as it pertains to callbacks. Everything else is for you to implement. In order to support multiple calls to the same callback function and maintain state from call to call, the best way is to wrap the set of callbacks in a closure. This allows the callbacks to share some state with the main event loop, and allows you to pass arguments to the callbacks without actually having to pass them as arguments, much like class variables in C++ or even globals in JavaScript, but without polluting the global scope :)
function queryDataBase(query) {
  // wrap the whole query in a function so the callbacks can share some
  // variables with similar scope. This is called a closure
  var rowCounter = 0;
  var dataRowsFromStep2;
  var resultsArray = [];
  connection.query(query, dataBaseQueryStep2);
  function dataBaseQueryStep2(err, rows) {
    // do something with err and rows
    dataRowsFromStep2 = rows;
    var query = getQueryFromRow(dataRowsFromStep2[rowCounter++]); // always zero the first time; might need to double check rows isn't empty!
    connection.query(query, dataBaseQueryStep3);
  }
  function dataBaseQueryStep3(err, rows) {
    // do something with err and rows
    if (rowCounter < dataRowsFromStep2.length) {
      resultsArray.push(rows); // probably needs to be more interesting, but you get the idea
      // since this is within the same closure, rowCounter maintains its state
      var query = getQueryFromRow(dataRowsFromStep2[rowCounter++]);
      // recursively call query using dataBaseQueryStep3 as its callback repeatedly until
      // we run out of rows to call it on
      connection.query(query, dataBaseQueryStep3);
    } else {
      // when the if statement fails we have no more rows to run queries on, so return to main program flow
      returnToMainProgramLogic(resultsArray);
    }
  }
}
function returnToMainProgramLogic(results) {
  // continue running your program here
}
I personally like the above logic better than the syntax async produces... I believe the heart of your problem rests in your nested calls to async, and in the fact that async itself runs the series of functions asynchronously but in order (confusing, I know). If you write your program like this, you won't have to worry about it!
I would strongly suggest using sequelize.js. It provides a really powerful ORM that allows you to chain queries together. It also lets you load your data directly into JS objects, write dynamic SQL, and connect to many different databases. Picture ActiveRecord from the Ruby world, for Node.
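For a taste, here is a small sketch of the eager-loading style sequelize encourages; the models and connection string are invented for illustration, and this uses the modern promise API rather than the one from when this answer was written:
const { Sequelize, DataTypes } = require('sequelize');
const sequelize = new Sequelize('mysql://user:pass@localhost/mydb');
// Invented models standing in for the drives / logicalDrives tables above.
const Drive = sequelize.define('Drive', { name: DataTypes.STRING });
const LogicalDrive = sequelize.define('LogicalDrive', { name: DataTypes.STRING });
Drive.hasMany(LogicalDrive);
// One eager-loaded query replaces a layer of nested callback queries.
Drive.findAll({ include: LogicalDrive })
  .then(drives => console.log(JSON.stringify(drives, null, 2)))
  .catch(console.error);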