Uncaught MongoError: unrecognized field 'allowDiskUsage' - javascript

I installed MongoDB 2.5.5 so that I can try the new "$out" operator, which writes aggregation results to a new collection. My Node driver is mongodb#1.3.23. I don't have "allowDiskUsage" anywhere in my code, but I still get this error:
Uncaught MongoError: unrecognized field 'allowDiskUsage'
What do I need to do to update my project to run 2.5.5?

From a simple test on the same driver version I do not see the same results:
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost/test', function(err, db) {
  if (!err) {
    db.collection('data', function(err, collection) {
      if (!err) {
        collection.aggregate([
          { $out: "another" }
        ], function(err, result) {
          if (err) {
            console.log(err);
          }
          db.close();
        });
      }
    });
  }
});
There is an allowDiskUse option that can be passed to the aggregate command, but it does not directly affect the $out pipeline operator: it allows the pipeline stages to use disk storage rather than memory alone. As you will be aware, $out puts the results in an output collection rather than returning a cursor object.
If the same code run on its own reproduces the problem, check your installed driver version. As of 1.3.23 against a MongoDB 2.5.5 server, this code works as expected.
If this code passes, then some other call or overriding module in your project is likely setting the option named in the error.
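For reference, a small sketch of passing the option correctly: the server option is spelled allowDiskUse (the error's "allowDiskUsage" is not a valid field). The aggregate call itself is shown commented out since it assumes a running mongod; the pipeline and options objects stand on their own.

```javascript
// $out must be the last stage of the pipeline
function buildOutStage(target) {
  return { $out: target };
}

var pipeline = [buildOutStage('another')];
// Note the spelling: allowDiskUse, not allowDiskUsage
var options = { allowDiskUse: true };

// With a live connection (as in the test above) this would be:
// collection.aggregate(pipeline, options, function(err, result) {
//   if (err) console.log(err);
//   db.close();
// });
```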

Related

How to run knex migrations with JavaScript instead of CLI?

I have created four PostgreSQL tables. I am using Node.js with Knex.js as a query builder.
I can create and execute migrations from the command line without any problems. Now I want to run the migrations via JavaScript. How can I proceed with this?
Here is my code:
module.exports.runDBMigrations = async () => {
  knex.schema.hasTable(USERS_DB_NAME).then(function (exists) {
    if (!exists) {
      // Execute migrations to create users tables
    }
  })
  .catch(e => {
    console.log("Error creating tables: ", e);
  });
  knex.schema.hasTable(POSTS_DB_NAME).then(function (exists) {
    if (!exists) {
      // Execute migrations to create posts tables
    }
  })
  .catch(e => {
    console.log("Error creating tables: ", e);
  });
  knex.schema.hasTable(LIKES_DB_NAME).then(function (exists) {
    if (!exists) {
      // Execute migrations to create likes tables
    }
  })
  .catch(e => {
    console.log("Error creating tables: ", e);
  });
  knex.schema.hasTable(FOLLOWERS_DB_NAME).then(function (exists) {
    if (!exists) {
      // Execute migrations to create followers tables
    }
  })
  .catch(e => {
    console.log("Error creating tables: ", e);
  });
}
Hi,
For a couple of days I was looking into how to run migrations and seeds when the server starts.
After doing some research and reading the Knex documentation, this is the solution I arrived at. Maybe it can help you.
Stack

lib         version
nodejs      16.17.x
nodemon     ^2.0.19
npm         8.19.x
knex        ^2.3.0
express     ^4.18.1
pg          ^8.8.0
postgreSQL  14.5
typescript  ^4.8.2
Solution
1. Create a void function runMigrations(). (I put it in models/setup/, but you can name and place it however you want.)
This function will use the knex singleton to run the migrations and the seeds.
2. Run all the pending migrations by calling the knex.migrate.latest() method.
This method returns a promise, so you can capture the successful run or the error in case something goes wrong.
3. In this step you can run all the seeds by calling the knex.seed.run() method. You can call this method in the first callback.
4. Call the migrations function in your index or server file. In my case it is the runMigrations() method.
5. Restart your server, so you can check that everything works fine.
If you go to the terminal you can check the response status. I added some logging in order to know whether everything is working fine.
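The steps above can be sketched roughly as follows (the function and log messages are mine; the knex instance is passed in so the function stays testable):

```javascript
// Run pending migrations, then the seeds, logging the outcome of each step.
function runMigrations(knex) {
  return knex.migrate
    .latest()
    .then(function () {
      console.log('Migrations are up to date');
      return knex.seed.run();
    })
    .then(function () {
      console.log('Seeds executed');
    })
    .catch(function (err) {
      console.error('Error running migrations or seeds:', err);
    });
}

module.exports = { runMigrations };
```

In your index or server file you would then call runMigrations(knex) with your configured knex instance before starting the HTTP listener.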
Note
I provoked an error on purpose, so you can see the result when something goes wrong while executing the migrations or seeds.
Maybe you'll find (or have found) other alternative solutions. If you fixed the issue in a different way, please share it as a comment, so we can exchange knowledge and see a different way of solving this issue.
Regards

Storing MongoDB document inside a Node.js array

Apologies in advance for what is undoubtedly a silly question.
I'm trying to store the raw JSON documents from MongoDB inside a Node.js array. The following code gives me an atrocity of JSON inside an array, inside a string, inside an array.
let subscriptions = [];

MongoClient.connect(mongourl, function(err, db) {
  if (err) throw err;
  var dbo = db.db("sigdb");
  dbo.collection("customers").find({}).project({ _id: 0 }).toArray(function(err, result) {
    if (err) throw err;
    subscriptions.push(JSON.stringify(result));
    db.close();
  });
});
I have tried to exclude toArray(), using the syntax of findOne() - no luck. Declaring subscriptions as a standard variable only returned undefined. Not putting result through JSON.stringify() made the second part of the document appear as [Object].
Any suggestions on how to untangle this and just have JSON stored in an array would be much appreciated.
Edit: turns out that instead of subscriptions.push(result) I could just use subscriptions = result.
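For illustration, here is the nesting difference with mock documents (no database involved; the sample names are mine):

```javascript
// Mock documents standing in for the driver's toArray() result
var result = [{ name: 'alice' }, { name: 'bob' }];

// Original approach: stringify + push yields an array containing one string
var nested = [];
nested.push(JSON.stringify(result));
// nested is [ '[{"name":"alice"},{"name":"bob"}]' ]

// Fixed approach: the result is already an array of plain objects
var subscriptions = result;
// subscriptions is [ { name: 'alice' }, { name: 'bob' } ]
```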

arangodb Difficulty in Tutorial: Node.js (io.js) in 10 minutes

Dear arangodb community,
How mature is arangojs? When I tried "Tutorial: Node.js (io.js) in 10 minutes", exercises 1 through 4 worked as expected, but 5 through 10 failed. From the following exercise, I am getting
Database created: undefined
instead of
Database created: "mydb"
Thus the remaining exercises cannot continue, since the crucial object-bearing variable (mydb) is null. But, observing that the "mydb" database is correctly created in ArangoDB, my question is really about the maturity of arangojs (ArangoDB's JavaScript driver). Or how do I fix it?
db.createDatabase('mydb', function(err, newdb) {
  if (err) {
    console.log('Failed to create database: %j', err.message);
  } else {
    console.log('Database created: %j', newdb.name);
    mydb = newdb;
  }
});
Thanks
The announced updated Node tutorial using the 4.x version of arangojs is available now.
Creating a new database changed to:
db.createDatabase('mydb').then(
  () => console.log('Database created'),
  err => console.error('Failed to create database:', err)
);
All asynchronous methods in the ArangoDB driver return promises but you can also pass a node-style callback instead:
db.createDatabase('mydb', function (err) {
  if (!err) console.log('Database created');
  else console.error('Failed to create database:', err);
});
The Node tutorial is based on version 3.x of the arangojs driver. The driver has recently been updated to version 4.x, which contains a number of breaking API changes.
The tutorial will soon be updated to reflect these changes. In the meantime you can follow the tutorial by installing version 3 explicitly:
npm install arangojs#3

Using mysql node.js driver to get an entire database as JSON

I'm working on a JavaScript file, running server side, to get a JSON dump of an entire MySQL database. I found and am using the MySQL driver for node.js (https://www.npmjs.com/package/mysql) for queries, and it's been straightforward enough to start with. My issue is that I need to run multiple queries and collect the results from all of them into a single JSON file, and I can't quite get that to work. I'm entirely new to JavaScript (basically never touched it before now), so it's probably a relatively simple solution that I'm just missing.
Currently I do a 'SHOW TABLES' query to get a list of all the tables (this can change, so I can't just assume a constant list). I then want to loop through the list, call 'SELECT * from table_name' for each table, and combine the results as I go into one big JSON document. Unfortunately I haven't figured out how to get the code to finish all the queries before trying to combine them, so the results all come back as 'undefined'. Here is what I currently have:
var mysql = require('mysql');
var fs = require('fs');

var connection = mysql.createConnection({
  host: 'localhost',
  user: 'root',
  password: 'pass',
  database: 'test_data'
});

connection.connect();

connection.query('SHOW TABLES;', function(err, results, fields) {
  if (err) throw err;
  var name = fields[0].name;
  var database_json = get_table(results[0][name]);
  for (i = 1; i < results.length; i++) {
    var table_name = results[i][name];
    var table_json = get_table(table_name);
    database_json = database_json.concat(table_json);
  }
  fs.writeFile('test_data.json', JSON.stringify(database_json), function (err) {
    if (err) throw err;
  });
  connection.end();
});

function get_table(table_name) {
  connection.query('select * from ' + table_name + ';', function(err, results, fields) {
    if (err) throw err;
    return results;
  });
}
This gets the table list and goes through all of it with no issue, and the information returned by the second query is correct if I just do a console.log(results) inside the query, but the for loop just keeps going before any query is completed and thus 'table_json' just ends up being 'undefined'. I really think this must be an easy solution (probably something with callbacks which I don't quite understand fully yet) but I keep stumbling.
Thanks for the help.
I'm guessing that this is for some sort of maintenance function and not a piece that you need inside your application. You're probably safe to do this asynchronously. The async module is available here: https://github.com/caolan/async
You can also use Q promises, available here: https://github.com/kriskowal/q
This answer describes both approaches pretty well: Simplest way to wait some asynchronous tasks complete, in Javascript?
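As a library-free sketch of the callback idea (function and variable names follow the question; the connection is whatever mysql.createConnection returns): give get_table a callback, and only combine the results once every table has reported back.

```javascript
// get_table now takes a callback instead of returning from inside the
// asynchronous query handler (that return value was always lost)
function get_table(connection, table_name, callback) {
  connection.query('select * from ' + table_name + ';', function (err, results) {
    if (err) return callback(err);
    callback(null, results);
  });
}

// Run every per-table query, keep the rows in table order, and hand the
// flattened array to done() only once the last query has finished.
function dump_tables(connection, table_names, done) {
  var per_table = [];
  var remaining = table_names.length;
  if (remaining === 0) return done(null, []);
  table_names.forEach(function (name, i) {
    get_table(connection, name, function (err, rows) {
      if (err) return done(err);
      per_table[i] = rows;
      if (--remaining === 0) {
        done(null, [].concat.apply([], per_table));
      }
    });
  });
}
```

With a real connection you would JSON.stringify the combined array inside done(), write the file there, and call connection.end() afterwards.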

Node.js event loop blocked from express routes

I have an express app, and one of its features is "moving" files. For example, you can drag a folder onto another folder and have its contents moved. Once all the contents are moved, the server responds accordingly.
What I've found is that doing this with folders that contain many files (possibly including additional folders) can block the event loop for 3+ seconds, which I discovered using blocked.
Would anyone have any suggestions on how I could prevent this? One option I thought of is child_process.fork; I'm not sure how that would tie into an HTTP route, but I'll test it. Would there be any other ways to improve something like this?
Code example: This is one part I'm testing now, that basically is building a "tree" from a materialized path pattern in Mongo:
var items = [];

Items.find({parentId: null, user_id: '123', type: 'directory'})
  .exec(function(err, docs) {
    if (err) return res.send(err);
    async.each(docs, function(doc, cb) {
      doc.getArrayTree({
        condition: {type: 'directory'}
      }, function(err, childDocs) {
        if (_.isEmpty(childDocs)) return cb();
        items.push(childDocs[0]);
        cb();
      });
    }, function(err) {
      if (err) return res.send(err);
      res.send(items);
    });
  });
This is using mongoose-materialized; specifically I believe the issue/delay is with the getArrayTree call, which is made for every item returned. There must be a more efficient way, perhaps with an aggregate. This shows about a ±500 ms delay, and of course it degrades as there are more documents.
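One library-free way to keep the loop responsive, independent of the fork idea, is to process items in small batches and yield back to the event loop between batches. This is only a sketch; handleItem stands in for whatever per-item work (such as moving one file) is blocking.

```javascript
// Process items in batches, yielding to the event loop between batches so
// other requests can be served while a large folder is being worked through.
function processInBatches(items, batchSize, handleItem, done) {
  var index = 0;
  function runBatch() {
    var end = Math.min(index + batchSize, items.length);
    for (; index < end; index++) {
      handleItem(items[index]);
    }
    if (index < items.length) {
      setImmediate(runBatch); // yield before the next batch
    } else {
      done();
    }
  }
  runBatch();
}

// Hypothetical usage inside a route:
// processInBatches(files, 50, moveOne, function () { res.sendStatus(200); });
```

For truly CPU-heavy work, child_process.fork remains the stronger option: send the folder id to the child with child.send() and answer the HTTP request in the parent's 'message' handler.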
