I loop over some directories and then the files in them. I process the files by directory and then try to add the processed results into MySQL.
I call conn.query('INSERT QUERY HERE') and the script seems to continue on, but the query never runs on the server. If I tell it to process just one directory and wait until the end, it will run the queries, but I can't have it store all the queued queries in memory until the end of the script or node will fail out due to the memory cap. I have tried everything I can think of to force the queued queries to run, but no luck.
Here is an example of my code:
dirs.forEach(function(dir){
    var data = [];
    var connection = mysql.createConnection(conConfig);
    files.forEach(function(file){
        // do some processing on files, push into data array
        // (creating an array of objects)
    });
    data.forEach(function(record){
        connection.query('INSERT INTO TABLE SET ?', record);
    });
    connection.end();
});
The code just continues to loop over the directories without ever sending the queries to MySQL. I know the inserts work: if I limit the code to a single directory, it will run the queries once that directory is processed, but not if I let it run over all directories.
I have tried using mysql pooling as well, with no luck. The pool.on('enqueue', function... handler will fire, but the queries are never sent over to the server.
Edit:
So I tried calling the script with a for loop from bash to call every dir name individually, and all records were loaded. I'm dumbfounded as to why a mysql connection is never established in my original example.
JavaScript calls the mysql query asynchronously. That means the connection will likely be closed before all insert queries have finished.
What you can do is to use the callbacks that the query function provides:
var qCnt = array.length;
var connection = mysql.createConnection(conConfig);
connection.connect();
array.forEach(function(record){
    connection.query('INSERT INTO TABLE SET ?', record, function(err){
        if (err) throw err;
        qCnt--;
        if (qCnt === 0){
            connection.end();
        }
    });
});
This solution is not ideal, since all the insert queries are fired at once, regardless of your database connection limit. You may want to fire the next insert only after the former is done; that is also possible with a little extra bookkeeping.
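For example, a minimal sketch of that sequential approach (insertSequentially is a hypothetical helper; the 'INSERT INTO TABLE SET ?' placeholder is taken from the question):
function insertSequentially(connection, records, done) {
    var i = 0;
    (function next(err) {
        if (err) return done(err);               // stop on the first failed insert
        if (i === records.length) return done(); // every row has been inserted
        // fire the next insert only after the previous one has finished
        connection.query('INSERT INTO TABLE SET ?', records[i++], next);
    })();
}
Calling insertSequentially(connection, data, function(err){ if (err) throw err; connection.end(); }) then keeps at most one query in flight at a time.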
It is in fact an async issue. There does not seem to be any way to force queued queries to execute without stopping the current running process. I had to use the async module in order to make my code work.
@luksch: connection.end() will not close the connection until all queued queries have finished. I did use his iteration method to make the callback, though.
Here is how I did it:
var async = require('async');
var connection = mysql.createConnection(conConfig);
var dirs = fs.readdirSync('./rootdirectory');

async.eachSeries(dirs, function(dir, callback){
    var data = [];
    files.forEach(function(file){
        // do some processing on files, push into data array
        // (creating an array of objects)
    });
    var qCount = data.length;
    data.forEach(function(record){
        connection.query('INSERT INTO TABLE SET ?', record, function(err){
            if (err) return callback(err);
            qCount--;
            // move on to the next directory once this one's inserts are done
            if (qCount === 0) { callback(); }
        });
    });
}, function(){
    connection.end();
});
This iterates over the directories one at a time, queues all of one directory's queries and lets them run, repeating until all directories have been processed, and then calls the final function to close the connection.
I've been creating a small Node.js app that iterates through an array of names and queries an API for each name. The issue I have is that the array is very large (400,000+ words) and my application runs out of memory before the forEach completes.
I've been able to diagnose the issue by researching how JS works with the call stack, web API, and callback queue. I believe the forEach loop is blocking the call stack, so the HTTP requests keep clogging up the callback queue without getting resolved.
If anyone can provide a solution for unblocking the forEach loop, or an alternative way of coding this app, I would be very grateful.
Node JS App
const mongoose = require("mongoose");
const fs = require("fs");
const ajax = require("./modules/ajax.js");

// Bring in Models
let Dictionary = require("./models/dictionary.js");

//=============================
// MongoDB connection
//=============================
// Opens connection to database "test"
mongoose.connect("mongodb://localhost/bookCompanion");
let db = mongoose.connection;

// If database test encounters an error, output error to console.
db.on("error", (err)=>{
    console.error("Database connection failed.");
});

db.on("open", ()=>{
    console.info("Connected to MongoDB database...");
}).then(()=>{
    fs.readFile("./words-2.json", "utf8", (err, data)=>{
        if(err){
            console.log(err);
        } else {
            data = JSON.parse(data);
            data.forEach((word)=>{
                let search = ajax.get(`API url Here?=${word}`);
                search.then((response)=>{
                    let newWord = new Dictionary({
                        word: response.word,
                        phonetic: response.phonetic,
                        meaning: response.meaning
                    }).save();
                    console.log("word saved");
                }).catch((err)=>{
                    console.log("Word not found");
                });
            });
        }
    });
});
Check whether the API accepts multiple query params.
Try to use async Promises.
Resolve the promises and perform the save operations on them with Promise.all.
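As a sketch of that idea (the batch size is an assumption; ajax.get, the API URL, and the Dictionary model are reused from the question), processing the word list in fixed-size batches keeps only a bounded number of requests in flight at once:
// Hypothetical sketch: process words in small batches so requests and
// callbacks stay bounded. BATCH_SIZE is an assumed tuning knob.
const BATCH_SIZE = 100;

async function processWords(words){
    for(let i = 0; i < words.length; i += BATCH_SIZE){
        const batch = words.slice(i, i + BATCH_SIZE);
        // Wait for the whole batch to resolve before starting the next one.
        await Promise.all(batch.map((word)=>
            ajax.get(`API url Here?=${word}`)
                .then((response)=> new Dictionary({
                    word: response.word,
                    phonetic: response.phonetic,
                    meaning: response.meaning
                }).save())
                .catch(()=> console.log("Word not found"))
        ));
    }
}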
I have created a Redis cluster with 30 instances (15 masters / 15 nodes). With Python code I connected to these instances, found the masters, and then wanted to add some keys to them.
def settomasters(port, host):
    r = redis.Redis(host=host, port=port)
    r.set("key" + port, "value")
Error:
redis.exceptions.ResponseError: MOVED 12539 127.0.0.1:30012
If I try to set a key from redis-cli -c -p portofmyinstance, I sometimes get a redirection message that tells me where the key is stored.
I know that in the case of get requests, for example, a smart client is needed to redirect the request to the correct node (the node that holds the key); otherwise a MOVED error occurs. Is it the same situation here, i.e. do I need to catch the redis.exceptions.ResponseError and try the set again?
while True:
    try:
        r.set("key", "value")
        break
    except:
        print "error"
        pass
That was my first try, but without success: the set operation never succeeds.
On the other hand, the JavaScript code below does not throw an error, and I cannot figure out why:
var redis = require('redis-stream'),
    client = new redis(30001, '127.0.0.1');

// Open stream
var stream = client.stream();

// Example of setting 200 records
for(var record = 0; record < 200; record++) {
    var command = ['set', 'qwerty' + record, 'QWERTYUIOP'];
    stream.redis.write(redis.parse(command));
}

stream.on('close', function () {
    console.log('Completed!');
});

// Close the stream after batch insert
stream.end();
Any help will be appreciated, thanks.
With a Redis cluster you can use the normal redis client only if, as @Antonis said, you "find for the certain key the slot that belongs and then the slots that each master serves. With this information i can set keys to the correct node without moved redirection errors." Otherwise you need a cluster-aware client such as http://redis-py-cluster.readthedocs.io/en/master/
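The same applies on the Node side. As an illustration only (ioredis is not part of the question's setup; the seed host/port are taken from the snippet above), a cluster-aware client follows MOVED redirections automatically, unlike redis-stream:
// Hypothetical sketch using ioredis, a cluster-aware Node client.
var Redis = require('ioredis');

var cluster = new Redis.Cluster([{ host: '127.0.0.1', port: 30001 }]);

cluster.set('qwerty0', 'QWERTYUIOP').then(function (reply) {
    console.log(reply); // 'OK' -- the client routed the command to the right master
    return cluster.quit();
});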
I'm working on creating a JavaScript file to get a JSON dump of an entire MySQL database, running server-side. I found and am using the MySQL driver for node.js (https://www.npmjs.com/package/mysql) for queries; it's been straightforward enough to start. My issue is that I need to call multiple queries and get the results from all of them to put into a single JSON file, and I can't quite get that to work. I'm entirely new to JavaScript (basically never touched it before now), so it's probably a relatively simple solution that I'm just missing.
Currently I do a query of 'SHOW TABLES' to get a list of all the tables (this can change, so I can't just assume a constant list). I then want to loop through the list and call 'SELECT * FROM table_name' for each table, combining the results as I go to get one big JSON. Unfortunately I haven't figured out how to get the code to finish all the queries before trying to combine them, so it returns 'undefined' for all the results. Here is what I currently have:
var mysql = require('mysql');
var fs = require('fs');

var connection = mysql.createConnection({
    host: 'localhost',
    user: 'root',
    password: 'pass',
    database: 'test_data'
});

connection.connect();

connection.query('SHOW TABLES;', function(err, results, fields) {
    if (err) throw err;

    var name = fields[0].name;
    var database_json = get_table(results[0][name]);

    for (i = 1; i < results.length; i++) {
        var table_name = results[i][name];
        var table_json = get_table(table_name);
        database_json = database_json.concat(table_json);
    }

    fs.writeFile('test_data.json', JSON.stringify(database_json), function (err) {
        if (err) throw err;
    });

    connection.end();
});

function get_table(table_name) {
    connection.query('select * from ' + table_name + ';', function(err, results, fields) {
        if (err) throw err;
        return results;
    });
}
This gets the table list and goes through all of it without issue, and the information returned by the second query is correct if I just console.log(results) inside the query callback, but the for loop keeps going before any query completes, so 'table_json' ends up being 'undefined'. I really think there must be an easy solution (probably something with callbacks, which I don't fully understand yet), but I keep stumbling.
Thanks for the help.
I'm guessing that this is for some sort of maintenance-type task and not a piece that you need for your application, so you're probably safe to do this asynchronously. The async module is available here: https://github.com/caolan/async
You can also use Q promises, available here: https://github.com/kriskowal/q
This answer describes both approaches pretty well: Simplest way to wait some asynchronous tasks complete, in Javascript?
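For instance, a minimal sketch with the async module (an illustration only; the table handling and file output are adapted from the question's code):
// Run one SELECT per table with async.map, then combine the results
// once every query has finished.
var async = require('async');

connection.query('SHOW TABLES;', function(err, results, fields) {
    if (err) throw err;
    var name = fields[0].name;
    var tableNames = results.map(function(row) { return row[name]; });

    async.map(tableNames, function(tableName, done) {
        // done(err, rows) fires when this table's rows have arrived.
        connection.query('select * from ' + tableName + ';', function(err, rows) {
            done(err, rows);
        });
    }, function(err, allTables) {
        if (err) throw err;
        // allTables holds one row array per table, in the same order as tableNames.
        var database_json = [].concat.apply([], allTables);
        fs.writeFile('test_data.json', JSON.stringify(database_json), function(err) {
            if (err) throw err;
            connection.end();
        });
    });
});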
I'm attempting to use MongoJS as a wrapper for the native Mongo driver in Node. I'm modeling the documents in my collection as JavaScript classes with methods like populate(), save(), etc.
In most languages, like C# and Java, I'm used to explicitly connecting and then disconnecting for every query. Most examples only show how to connect, but never how to close the connection when done. I'm uncertain whether the driver can manage this on its own or whether I need to do so manually. Documentation is sparse.
Here's the relevant code:
User.prototype.populate = function(callback) {
    var that = this;
    this.db = mongo.connect("DuxDB");
    this.db.collection(dbName).findOne({email: that.email}, function(err, doc) {
        if (!err && doc) {
            that.firstName = doc.firstName;
            that.lastName = doc.lastName;
            that.password = doc.password;
        }
        if (typeof(callback) === "function") {
            callback.call(that);
        }
        that.db.close();
    });
};
I'm finding that as soon as I call the close() method on the MongoJS object, I can no longer open a new connection on subsequent calls. However, if I do not call this method, the Node process never terminates once all async calls finish, as if it is waiting to disconnect from Mongo.
What is the proper way to manage connections to Mongo with MongoJS?
You will get better performance from your application if you leave the connection(s) open, rather than disconnecting. Making a TCP connection, and, in the case of MongoDB, discovering the replica set/sharding configuration where appropriate, is relatively expensive compared to the time spent actually processing queries and updates. It is better to "spend" this time once and keep the connection open rather than constantly re-doing this work.
Don't open + close a connection for every query. Open the connection once, and re-use it.
Do something more like this, reusing your db connection for all calls:
User = function(db) {
    this.db = db;
};

User.prototype.populate = function(callback) {
    var that = this;
    this.db.collection(dbName).findOne({email: that.email}, function(err, doc) {
        if (!err && doc) {
            that.firstName = doc.firstName;
            that.lastName = doc.lastName;
            that.password = doc.password;
        }
        if (typeof(callback) === "function") {
            callback.call(that);
        }
    });
};
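A possible wiring for this (a sketch only; mongojs's one-argument connect form is assumed, with the "DuxDB" name taken from the question):
// Hypothetical usage: connect once at startup and share the handle.
var mongojs = require('mongojs');
var db = mongojs('DuxDB'); // one connection for the whole process

var user = new User(db);
user.populate(function() {
    console.log(this.firstName); // populate() calls back with the user as `this`
});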
I believe it actually closes the connection after each request, but it sets {auto_reconnect:true} in the mongodb server config, so it will reopen a new connection whenever one is needed.
I am using node.js with the native mongodb driver (node-mongodb-native).
My current project uses node.js + now.js + mongo-db.
The system basically sends data from the browser to node.js, which is processed with Haskell and later fed back to the browser again.
Via a form and node.js, the text is inserted into a mongo-db collection called "messages".
A Haskell thread reads the entry and stores the result in the db collection "results". This works fine.
But now I need the JavaScript code that waits for the result to appear in the collection "results".
Pseudo code:
wait until the collection "results" is non-empty.
findOne() from the collection "results".
delete the collection "results".
I currently connect to the mongodb like this:
var mongo = require('mongodb'),
    Server = mongo.Server,
    Db = mongo.Db;

var server = new Server('localhost', 27017, {
    auto_reconnect: true
});

var db = new Db('test', server);
My Haskell knowledge is quite good, but my JavaScript skills are not. I did extensive searches but didn't get far.
Glad you solved it; I was going to write something similar:
setTimeout(function(){
    db.collection('results', function(coll){
        coll.findOne({}, function(err, one){
            if (err) return callback(err);
            coll.drop(callback); // or destroy, not really sure <-- this will drop the whole collection
        });
    });
}, 1000);
The solution is to use the async library.
var async = require('async');

var globalCount = -1;

async.whilst(
    function () {
        // keep polling while no result document has been counted yet
        return globalCount < 1;
    },
    function (callback) {
        console.log("inner while loop");
        // check again after one second
        setTimeout(function () { db_count(callback); }, 1000);
    },
    function (err) {
        console.log(" || whilst loop finished!!");
    }
);
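db_count is not shown above; a plausible sketch (an assumption based on the surrounding code) would count the documents in "results" and update globalCount so the whilst test can eventually fail:
// Hypothetical db_count: count documents in "results", update the flag,
// then invoke the whilst iteration callback.
function db_count(callback) {
    db.collection('results', function(err, coll) {
        if (err) return callback(err);
        coll.count(function(err, count) {
            if (err) return callback(err);
            globalCount = count; // whilst re-checks its test after callback()
            callback();
        });
    });
}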