How can I perform multiple updates with one query and condition, using JavaScript?
For example, my docs:
[
  {
    "Author": "Auto",
    "Number": 5,
    "RandomText": "dddbd",
    "Tag": "Srebro",
    "id": "10fbd309-5a7a-4cd4-ac68-c71d7a336498"
  },
  {
    "Author": "Auto",
    "Number": 8,
    "RandomText": "cccac",
    "Tag": "Srebro",
    "id": "37f9694d-cde8-46bd-8581-e85515f8262f"
  },
  {
    "Author": "Auto",
    "Number": 6,
    "RandomText": "fffaf",
    "Tag": "Srebro",
    "id": "b7559a48-a01a-4f26-89cf-35373bdbb411"
  }
]
This is my query:
UpdateIndex()
{
  this.r.table(this.params.table).update((row) => {
    let result;
    console.log(this.r.expr([row]));
    this.r.branch(
      this.r.row(this.Index2).lt(10),
      result = "Lucky",
      result = "Good"
    );
    /*
    if (this.r.row("Number").lt(3)) result = "Bad";
    else if (this.r.row("Number").lt(5)) result = "Poor";
    else if (this.r.row("Number").lt(10)) result = "Lucky";
    else if (this.r.row("Number").lt(20)) result = "Good";
    else if (this.r.row("Number").lt(50)) result = "Great";
    else result = "Mystic";
    */
    console.log(result);
    return this.r.object(this.Index2, result);
  }).run(this.conn, this.CheckResult.bind(this));
}
Why do I want to do this? I created a secondary index (this.Index2 = 'Opinion'), and now I would like to populate this field with the values described by my condition.
But every document ends up with the same value (for example: Bad). How can I update the documents, running the condition against each document, in one query?
Assigning to a local variable like that (result in your case) doesn't work with the way RethinkDB's driver builds up the query object to send to the server. When you write code like the above, you're storing a literal string in the local variable once on the client (rather than once per row on the server), and then sending that literal to the server in the query you return at the bottom of your function. You also can't use console.log the way you're trying to: it runs on the client, but your query is executed on the server. You may find http://rethinkdb.com/blog/lambda-functions/ useful for understanding what the client does with the anonymous functions you pass to commands like update.
You should use do for variable binding instead:
r.table(params.table).update(function(row) {
  return r.branch(r.row(Index2).lt(10), "Lucky", "Good").do(function(res) {
    return r.object(Index2, res);
  });
})
Right, so I am trying to wrap my head around editing (appending data) to a JSON file.
The file (users.json) looks like this:
{
"users": {
"id": "0123456789",
"name": "GeirAndersen"
}
}
Now I want to add users to this file, and retain the formatting, which is where I can't seem to get going. I have spent numerous hours now trying, reading, trying again... But no matter what, I can't get the result I want.
In my .js file, I get the data from the json file like this:
const fs = require('fs').promises;
let data = await fs.readFile('./users.json', 'utf-8');
let users = JSON.parse(data);
console.log(JSON.stringify(users.users, null, 2));
This console log shows the contents like it should:
{
"id": "0123456789",
"name": "GeirAndersen"
}
Just to test, I have defined a new user directly in the code, like this:
let newUser = {
"id": '852852852',
"name": 'GeirTrippleAlt'
};
console.log(JSON.stringify(newUser, null, 2));
This console log also shows the data like this:
{
"id": "852852852",
"name": "GeirTrippleAlt"
}
All nice and good this far, BUT now I want to join this last one to users.users, and I just can't figure out how to do this correctly. I have tried so many versions and iterations, I can't remember them all.
Last tried:
users.users += newUser;
users.users = JSON.parse(JSON.stringify(users.users, null, 2));
console.log(JSON.parse(JSON.stringify(users.users, null, 2)));
console.log(users.users);
Both those console logs the same thing:
[object Object][object Object]
What I want to achieve is: I want to end up with:
{
"users": {
"id": "0123456789",
"name": "GeirAndersen"
},
{
"id": "852852852",
"name": "GeirTrippleAlt"
}
}
When I get this far, I am going to write back to the .json file, but that part isn't an issue.
That's not really a valid data structure, as you're trying to add another object to an object without giving that value a key.
I think what you're really looking for is for 'users' to be an array of users.
{
"users": [
{
"id": "0123456789",
"name": "GeirAndersen"
},
{
"id": "852852852",
"name": "GeirTrippleAlt"
}
]
}
You can easily create an array in JS and then push() new items into your array. You can JSON.stringify() that with no issue.
const myValue = {
  users: []
};
const newUser = {
  id: '0123456789',
  name: 'GeirAndersen'
};
myValue.users.push(newUser);
const stringified = JSON.stringify(myValue);
Here's my situation, I have a JSON that looks somewhat like this:
{
"items": [{
"type": "condition",
"data": {
"type": "comparison",
"value1": {
"source": "MyType1",
"component": "Attribute1"
},
"value2": {
"source": "MyType2",
"component": "Attribute2"
},
"operator": "gt"
}
},
{
"type": "then",
"data": {
"result": "failed",
"message": "value1 is too high"
}
}
]
}
and would want it to translate to:
if (MyType1.Attribute1 > MyType2.Attribute2) {
result = "failed";
console.log("value1 is too high");
}
Now my problem is: I don't know how I would translate the entries of value1 and value2 into actual code, or rather, how I could access the object MyType1 (maybe through something like getAttribute("MyType1")).
Since I am going to have a whole bunch of sources, each with different components, I can't really write a huge dictionary. Or I would like to avoid it.
The goal is to allow creating if-then statements via some interactive UI, and I figured it'd be best to save that code as .json files (think rule management system).
So, TL;DR: how would I access a class attribute this.MyType if I only have a string MyType to go from? And how would I access the value this.MyType.MyValue if I get another string MyValue?
Thanks in advance.
Edit:
I'd really like to avoid using eval, for obvious reasons. And if I have to - I guess I would need to create Dictionaries for possible JSON Values, to validate the input?
You need some kind of parser. At first we need some way to store variables and maybe flags:
const variables = {};
var inCondition = false; // `in` is a reserved word in JavaScript, so it can't be a variable name
Then we go through the code and execute it:
for (const command of items) {
  switch (command.type) {
    case "condition":
      // ...
    case "then":
      // ...
  }
}
To access a variable we can simply do
var current = variables[ identifier ];
To store, it's the other way round:
variables[ identifier ] = current;
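A minimal sketch of such an interpreter for the question's JSON shape, using a lookup table of operators instead of eval. The names sources, resolve, and evaluate are mine, not from the original; sources plays the role of the this.MyType lookup, so sources["MyType1"]["Attribute1"] replaces any need for eval:

```javascript
// Minimal rule-interpreter sketch for the question's JSON shape.
// All helper names here are illustrative assumptions, not an existing API.
const operators = {
  gt: (a, b) => a > b,
  lt: (a, b) => a < b,
  eq: (a, b) => a === b
};

// ref is e.g. { "source": "MyType1", "component": "Attribute1" }:
// string-keyed property access replaces eval entirely.
function resolve(sources, ref) {
  return sources[ref.source][ref.component];
}

function evaluate(rule, sources) {
  let conditionMet = false;
  let result = null;
  for (const item of rule.items) {
    if (item.type === "condition") {
      const a = resolve(sources, item.data.value1);
      const b = resolve(sources, item.data.value2);
      conditionMet = operators[item.data.operator](a, b);
    } else if (item.type === "then" && conditionMet) {
      result = item.data.result;
      console.log(item.data.message);
    }
  }
  return result;
}

// The rule from the question, as a plain object.
const rule = {
  items: [
    {
      type: "condition",
      data: {
        type: "comparison",
        value1: { source: "MyType1", component: "Attribute1" },
        value2: { source: "MyType2", component: "Attribute2" },
        operator: "gt"
      }
    },
    { type: "then", data: { result: "failed", message: "value1 is too high" } }
  ]
};
```

With sources = { MyType1: { Attribute1: 10 }, MyType2: { Attribute2: 5 } }, evaluate(rule, sources) runs the gt comparison, logs the message, and returns "failed"; validating the incoming JSON against the operators table keeps unknown input from executing anything.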
Trying to access the value field of this JSON file using JSON.parse() in Meteor, but I cannot get it to return anything. I suspect there is an error in my syntax in selecting the data from the imported JS object.
{"status":"success","data":{"subjects":[{"value":"ABC","descr":"Descriptions"}]},"message":null,"meta":{"copyright":"Copyright","referenceDttm":"Date"}}
I'm trying to store it into an array, subjectArray. This is the code I'm using:
var subjectArray = new Array();
subjectFile = HTTP.get("https://classes.cornell.edu/api/2.0/config/subjects.json?roster=FA15");
subjectJSON = JSON.parse(subjectFile);
for (int i=0; i<subjectJSON.length; i++) {
subjectArray.push(subjectJSON[i].value)
}
Pretty printed this is:
{
"data": {
"subjects": [
{
"descr": "Descriptions",
"value": "ABC"
}
]
},
"message": null,
"meta": {
"copyright": "Copyright",
"referenceDttm": "Date"
},
"status": "success"
}
Responses from HTTP calls can take a while to come back, so you should read your replies in an async way. You should move all of your code related to the "get" inside a callback function. If you want to find out more about HTTP and callbacks in Meteor, make sure you check the docs.
If you know what "value" is in your for loop ('cause I don't), then this is your answer:
HTTP.get("https://classes.cornell.edu/api/2.0/config/subjects.json?roster=FA15", function (err, res) {
  if (err) return false;
  // res.content holds the raw response body; the subjects array sits under data.subjects
  var subjectJSON = JSON.parse(res.content);
  var subjects = subjectJSON.data.subjects;
  for (var i = 0; i < subjects.length; i++) {
    subjectArray.push(subjects[i].value);
  }
  return true;
});
Also, there is no int in JavaScript.
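Whatever HTTP wrapper you use, note that the parsed response is not itself an array: the subjects sit under data.subjects. The extraction step on its own (here against a literal object matching the pretty-printed shape above, independent of Meteor) would look like:

```javascript
// Extracting the "value" fields from the response shape shown above.
// Plain JavaScript, independent of Meteor's HTTP wrapper.
const subjectJSON = {
  status: "success",
  data: { subjects: [{ value: "ABC", descr: "Descriptions" }] },
  message: null,
  meta: { copyright: "Copyright", referenceDttm: "Date" }
};

const subjectArray = subjectJSON.data.subjects.map(function (s) {
  return s.value;
});
```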
On the site I am creating, users can enter different tags and separate them with commas. ExpressJS should then check whether they exist or not. If they do not exist, it should create an object for each of them. I have an array and am iterating through it with a for loop; however, only one object is created thanks to the callback... Is there any possible way to create multiple objects at once depending on the array's length?
for (i=0;i<postTopics.length;i++) {
var postTopic = postTopics[i],
postTopicUrl = postTopic.toString().toLowerCase().replace(' ', '-');
Topic.findOne({ "title": postTopics[i] }, function (err, topic) {
if (err) throw err;
if (!topic) {
Topic.create({
title: postTopic,
url: postTopicUrl
}, function (err, topic) {
if (err) throw err;
res.redirect('/');
});
}
});
}
Try out async.parallel.
$ npm install async
// Get the async module so we can do our parallel asynchronous queries much easier.
var async = require('async');
// Create a hash to store your query functions on.
var topicQueries = {};
// Loop through your postTopics once to create a query function for each one.
postTopics.forEach(function (postTopic) {
// Use postTopic as the key for the query function so we can grab it later.
topicQueries[postTopic] = function (cb) {
// cb is the callback function passed in by async.parallel. It accepts err as the first argument and the result as the second.
Topic.findOne({ title: postTopic }, cb);
};
});
// Call async.parallel and pass in our topicQueries object.
// If any of the queries passed an error to cb then the rest of the queries will be aborted and this result function will be called with an err argument.
async.parallel(topicQueries, function (err, results) {
if (err) throw err;
// Create an array to store our Topic.create query functions. We don't need a hash because we don't need to tie the results back to anything else like we had to do with postTopics in order to check if a topic existed or not.
var createQueries = [];
// All our parallel queries have completed.
// Loop through postTopics again, using postTopic to retrieve the resulting document from the results object, which has postTopic as the key.
postTopics.forEach(function (postTopic) {
// If there is no document at results[postTopic] then none was returned from the DB.
if (results[postTopic]) return;
// I changed .replace to use a regular expression. Passing a string only replaces the first space in the string whereas my regex searches the whole string.
var postTopicUrl = postTopic.toString().toLowerCase().replace(/ /g, '-');
// Since this code is executing, we know there is no topic in the DB with the title you searched for, so create a new query to create a new topic and add it to the createQueries array.
createQueries.push(function (cb) {
Topic.create({
title: postTopic,
url: postTopicUrl
}, cb);
});
});
// Pass our createQueries array to async.parallel so it can run them all simultaneously (so to speak).
async.parallel(createQueries, function (err, results) {
// If any one of the parallel create queries passes an error to the callback, this function will be immediately invoked with that err argument.
if (err) throw err;
// If we made it this far, no errors were made during topic creation, so redirect.
res.redirect('/');
});
});
First we create an object called topicQueries and we attach a query function to it for each postTopic title in your postTopics array. Then we pass the completed topicQueries object to async.parallel which will run each query and gather the results in a results object.
The results object ends up being a simple object hash with each of your postTopic titles as the key, and the value being the result from the DB. The if (results[postTopic]) return; line returns if results has no document under that postTopic key. Meaning, the code below it only runs if there was no topic returned from the DB with that title. If there was no matching topic then we add a query function to our createQueries array.
We don't want your page to redirect after just one of those new topics finishes saving. We want to wait until all your create queries have finished, so we use async.parallel yet again, but this time we use an array instead of an object hash because we don't need to tie the results to anything. When you pass an array to async.parallel the results argument will also be an array containing the results of each query, though we don't really care about the results in this example, only that no errors were thrown. If the parallel function finishes and there is no err argument then all the topics finished creating successfully and we can finally redirect the user to the new page.
PS - If you ever run into a similar situation, except each subsequent query requires data from the query before it, then checkout async.waterfall :)
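The .replace change mentioned in the code comments is worth a concrete illustration: with a string pattern only the first space is replaced, while a /g regex replaces all of them.

```javascript
// String pattern: only the first occurrence is replaced.
const withString = "my long topic title".replace(' ', '-');
// "my-long topic title"

// Global regex: every space is replaced.
const withRegex = "my long topic title".replace(/ /g, '-');
// "my-long-topic-title"
```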
If you really want to check whether things exist already and avoid errors on duplicates, note that the .create() method already accepts a list. You don't seem to care about getting the created document in the response, so just check for the documents that are there and send in the new ones.
So with "finding first", run the tasks in succession, with async.waterfall just to tame the indent creep:
// Just a placeholder for your input
var topics = ["A Topic","B Topic","C Topic","D Topic"];
async.waterfall(
[
function(callback) {
Topic.find(
{ "title": { "$in": topics } },
function(err,found) {
// assume ["Topic B", "Topic D"] are found
found = found.map(function(x) {
return x.title;
});
var newList = topics.filter(function(x) {
return found.indexOf(x) == -1;
});
callback(err,newList);
}
);
},
function(newList,callback) {
Topic.create(
newList.map(function(x) {
return {
"title": x,
"url": x.toString().toLowerCase().replace(' ','-')
};
}),
function(err) {
if (err) throw err;
console.log("done");
callback();
}
);
}
]
);
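The find-then-filter step above is just set subtraction on titles. Stripped of the database calls (the found array below is made-up sample data standing in for what Topic.find returns), the core logic is:

```javascript
// The dedupe step from the waterfall, without the database calls.
// `found` stands in for the titles Topic.find returned from the DB.
const topics = ["A Topic", "B Topic", "C Topic", "D Topic"];
const found = ["B Topic", "D Topic"]; // pretend these already exist

// Keep only the topics that are not already present.
const newList = topics.filter(function (x) {
  return found.indexOf(x) === -1;
});

// Build the documents to insert, slugifying the title with a global regex.
const docs = newList.map(function (x) {
  return {
    title: x,
    url: x.toLowerCase().replace(/ /g, '-')
  };
});
```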
You could move the "url" generation to a "pre" save schema hook. But again if you really don't need the validation rules, go for "bulk API" operations provided your target MongoDB and mongoose version is new enough to support this, which really means getting a handle to the underlying driver:
// Just a placeholder for your input
var topics = ["A Topic","B Topic","C Topic","D Topic"];
async.waterfall(
[
function(callback) {
Topic.find(
{ "title": { "$in": topics } },
function(err,found) {
// assume ["Topic B", "Topic D"] are found
found = found.map(function(x) {
return x.title;
});
var newList = topics.filter(function(x) {
return found.indexOf(x) == -1;
});
callback(err,newList);
}
);
},
function(newList,callback) {
var bulk = Topic.collection.initializeOrderedBulkOp();
newList.forEach(function(x) {
bulk.insert({
"title": x,
"url": x.toString().toLowerCase().replace(' ','-')
});
});
bulk.execute(function(err,results) {
console.log("done");
callback();
});
}
]
);
That is a single write operation to the server, though of course all inserts are actually done in order and checked for errors.
Otherwise just swallow the errors from duplicates and insert as an "unordered op", then check for "non-duplicate" errors afterwards if you want:
// Just a placeholder for your input
var topics = ["A Topic","B Topic","C Topic","D Topic"];
var bulk = Topic.collection.initializeUnorderedBulkOp();
topics.forEach(function(x) {
bulk.insert({
"title": x,
"url": x.toString().toLowerCase().replace(' ','-')
});
});
bulk.execute(function(err,results) {
if (err) throw err;
console.log(JSON.stringify(results,undefined,4));
});
Output in results looks something like the following indicating the "duplicate" errors, but does not "throw" the error as this is not set in this case:
{
  "ok": 1,
  "writeErrors": [
    {
      "code": 11000,
      "index": 1,
      "errmsg": "insertDocument :: caused by :: 11000 E11000 duplicate key error index: test.topic.$title_1 dup key: { : \"B Topic\" }",
      "op": {
        "title": "B Topic",
        "url": "b-topic",
        "_id": "53b396d70fd421057200e610"
      }
    },
    {
      "code": 11000,
      "index": 3,
      "errmsg": "insertDocument :: caused by :: 11000 E11000 duplicate key error index: test.topic.$title_1 dup key: { : \"D Topic\" }",
      "op": {
        "title": "D Topic",
        "url": "d-topic",
        "_id": "53b396d70fd421057200e612"
      }
    }
  ],
  "writeConcernErrors": [],
  "nInserted": 2,
  "nUpserted": 0,
  "nMatched": 0,
  "nModified": 0,
  "nRemoved": 0,
  "upserted": []
}
Note that when using the native collection methods, you need to take care that a connection is already established. The mongoose methods will "queue" up until the connection is made, but these will not. More of a testing issue unless there is a chance this would be the first code executed.
Hopefully versions of those bulk operations will be exposed in the mongoose API soon, but the general back end functionality does depend on having MongoDB 2.6 or greater on the server. Generally it is going to be the best way to process.
Of course, in all but the last sample which does not need this, you can go absolutely "async nuts" by calling versions of "filter", "map" and "forEach" that exist under that library. Likely not to be a real issue unless you are providing really long lists for input though.
The .initializeOrderedBulkOP() and .initializeUnorderedBulkOP() methods are covered in the node native driver manual. Also see the main manual for general descriptions of Bulk operations.
I have a json url that returns data in the format
{
"photos" : [
{
"id": 1, "image":"https://path/to/my/image/1.jpg"
},
{
"id": 2, "image":"https://path/to/my/image/2.jpg"
}
]
}
I'm using the json in a javascript function, and need to manipulate it to remove the root key. i.e. I want something that looks like
[
{
"id": 1, "image":"https://path/to/my/image/1.jpg"
},
{
"id": 2, "image":"https://path/to/my/image/2.jpg"
}
]
I've been hacking around with various approaches, and have referred to several similar posts on SO, but nothing seems to work. The following seems like it should.
var url = 'http://path/to/my/json/feed.json';
var jsonSource = $.getJSON( url );
var jsonParsed = $.parseJSON(jsonSource);
var jsonFeed = jsonParsed.photos
What am I doing wrong?
A couple of issues there.
That's invalid JSON, in two different ways. A) The : after "photos" means that it's a property initializer, but it's inside an array ([...]) when it should be inside an object ({...}). B) There are extra " characters in front of the image keys. So the first thing is to fix that.
You're trying to use the return value of $.getJSON as though it were a string containing the JSON text. But $.getJSON returns a jqXHR object. You need to give it a success callback. That callback will be called with an object graph, the JSON is automatically parsed for you.
Assuming the JSON is fixed to look like this:
{
"photos": [
{
"id": 1,
"image": "https://path/to/my/image/1.jpg"
},
{
"id": 2,
"image": "https://path/to/my/image/2.jpg"
}
]
}
Then:
$.getJSON(url, function(data) {
var photos = data.photos;
// `photos` now refers to the array of photos
});
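Once the parsed object is in hand, "removing the root key" is just reading the property off it. The same extraction on a literal string of the fixed shape (no network call needed):

```javascript
// "Removing the root key" is just reading the property off the parsed object.
const data = JSON.parse(
  '{"photos":[{"id":1,"image":"https://path/to/my/image/1.jpg"},' +
  '{"id":2,"image":"https://path/to/my/image/2.jpg"}]}'
);
const photos = data.photos; // the bare array, no root key
```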