I need to format an SQL query with a default value for missing object fields. I can do it with an external call to pgp.as.format:
let formattedQuery = pgp.as.format('INSERT INTO some_table (a,b,c) VALUES ($(a), $(b), $(c))', object, {default: null});
db.none(formattedQuery);
Is it possible to pass the default option directly, without pre-formatting the query? Basically, I would like to do something like this:
db.none('INSERT INTO some_table (a,b,c) VALUES ($(a), $(b), $(c))', object, {default: null})
I'm the author of pg-promise.
All query methods in pg-promise rely on the default query formatting, for better reliability, i.e. when a query template refers to a property, the property must exist, or else an error is thrown. It is logical to keep it that way, because a query cannot execute correctly while it still contains properties that haven't been replaced with values.
Internally, the query engine does support advanced query formatting options, via method as.format, such as partial and default. And there are several objects in the library that make use of those options.
One in particular that you should use for generating inserts is helpers.insert, which can generate both single-insert and multi-insert queries. That method, along with the even more useful helpers.update, makes use of type ColumnSet, which is highly configurable, supporting default values for missing properties (among other things) via type Column.
Using ColumnSet, you can specify a default value either for selective columns or for all of them.
For example, let's assume that column c may be missing, in which case we want to set it to null:
var pgp = require('pg-promise')({
    capSQL: true // to capitalize all generated SQL
});

// declaring a reusable ColumnSet object:
var csInsert = new pgp.helpers.ColumnSet(['a', 'b',
    {
        name: 'c',
        def: null
    }
], {table: 'some_table'});
var data = {a:1, b:'text'};
// generating our insert query:
var insert = pgp.helpers.insert(data, csInsert);
//=> INSERT INTO "some_table"("a","b","c") VALUES(1,'text',null)
This makes it possible to generate multi-insert queries automatically:
var data = [{a:1, b:'text'}, {a:2, b:'hello'}];
// generating a multi-insert query:
var insert = pgp.helpers.insert(data, csInsert);
//=> INSERT INTO "some_table"("a","b","c") VALUES(1,'text',null),(2,'hello',null)
The same approach works nicely for single-update and multi-update queries.
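For illustration, here is a minimal single-update sketch that reuses csInsert from above (in practice you would normally declare a separate ColumnSet for updates); the id column and the value 123 are hypothetical, the WHERE clause has to be appended manually, and the output shown is roughly what it generates:
var dataUpdate = {a: 2, b: 'new text'}; // column c is missing, so def: null kicks in
var update = pgp.helpers.update(dataUpdate, csInsert) + ' WHERE id = 123';
//=> UPDATE "some_table" SET "a"=2,"b"='new text',"c"=null WHERE id = 123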
In all, to your original question:
Is it possible to pass the default option directly, without pre-formatting the query?
No, and nor should it be. Instead, you should use the aforementioned methods within the helpers namespace to generate correct queries. They are way more powerful and flexible ;)
Related
I'm just importing some stuff from a .csv file to Neo4j. I've always used MERGE to create a node, but now, when trying to import from the .csv, some of the data is null, e.g. the address column. When I do MERGE instead of CREATE it gives an error, but when I do CREATE it works fine. The only difference I know between MERGE and CREATE is that if the node already exists, MERGE doesn't make a new one.
My query:
LOAD CSV WITH HEADERS FROM '<path>' as line
CREATE (a:Address {
    address: line.address,
    postalCode: toInteger(line.postalCode),
    town: line.town,
    municipalityNr: toInteger(line.municipalityNr),
    municipality: line.municipality,
    countryCode: line.countryCode,
    country: line.country
})
RETURN a.address
When doing a MERGE, Neo4j expects a value that it can merge on. MERGE using a null will always result in an error.
In general, this is the approach:
Only MERGE on properties that are relevant to finding a unique node. So for instance, if you want to MERGE cars, the licence plate would be the property to use.
Make sure you have a CONSTRAINT for the property you MERGE on. This will help speed up the import.
To avoid nulls in the MERGE, you can use COALESCE().
After the MERGE, you can SET the other properties, which may have nulls.
MERGE (c:Car {licensePlate: COALESCE(line.licensePlate, 'Unknown')})
SET c.color = line.color,
    c.someproperty = line.someproperty
At the end of the run, you will find a single :Car node with licensePlate:'Unknown'
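For the constraint mentioned in the steps above, a minimal sketch in Cypher (Neo4j 3.x syntax; newer versions spell it CREATE CONSTRAINT FOR (c:Car) REQUIRE c.licensePlate IS UNIQUE):
// a uniqueness constraint on the MERGE property also creates an index,
// which speeds up the MERGE lookups during the import
CREATE CONSTRAINT ON (c:Car) ASSERT c.licensePlate IS UNIQUE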
Agree with Graphileon.
There is a trap you should avoid: properties that may vary within a data set. For instance, if the same person has a birth date in different formats (e.g., 5/16/85, 05/16/1985), you would add duplicate nodes if the birth date were part of the merge criteria. To avoid this, you can do the MERGE without the date and then, in a subsequent step, add the property; of course, only the last such write would endure.
You can check that there are no duplicate nodes. In the example given, if you're sure the name (or the license number of the car) is unique:
match (p:Person) return p.name, count(*) as ct order by ct desc
The internet is full of resources for dealing with arrays, but often objects are a more natural fit for data and seemingly more efficient.
I want to store key-value objects under dynamic field names like this:
project['en-US'] = { 'nav-back': 'Go back', ... }
project['pt-BR'] = { 'nav-back': 'Volte', ... }
Doing this seems like it would be more efficient than keeping an array of all languages and having to filter it to get all language entries for a given language.
My question is: How can I insert a key-value pair into an object with a dynamic name using mongoose? And would the object need to exist already, or can I create it in the same operation if it doesn't?
I tried this:
await Project.update(
    { _id: projectId },
    {
        $set: {
            [`${language}.${key}`]: value,
        },
    });
But no luck, regardless of whether I have an empty object there to begin with or not: { ok: 0, n: 0, nModified: 0 }.
Bonus: Should I index these objects and how? (I will want to update single items)
Thanks!
In mongoose, the schema is everything. It describes the data you are going to read/store from the database. If you want to dynamically add a new key to the schema, it is going to be hard.
In this particular case I would recommend using the native MongoDB driver, which is far more permissive about data manipulation. That way you can read the data in a specific format and dynamically add your field to it.
To summarize my thought, here is how the dynamic change should happen:
Use the native MongoDB driver to insert the new key into the stored data
Modify the mongoose schema you have in the code (push the new key into it)
Use mongoose to manipulate the data afterwards
Do not forget to dynamically update your mongoose model as well, or you won't read the new key on the next find.
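A minimal sketch of that flow, assuming the collection behind the Project model is named projects (an assumption) and reusing projectId, language, key and value from the question:
const mongoose = require('mongoose');

// use the native driver handle that mongoose exposes to set the dynamically named key;
// note the explicit ObjectId cast: unlike mongoose, the native driver does not cast strings
await mongoose.connection.db
    .collection('projects')
    .updateOne(
        { _id: new mongoose.Types.ObjectId(projectId) },
        { $set: { [`${language}.${key}`]: value } }
    );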
I solved this using the original code snippet unchanged, but adding { strict: false } to the schema:
const projectSchema = new Schema({ ...schema... }, { strict: false });
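For context, a condensed sketch of the pieces together (the declared name field is a placeholder; projectId, language, key and value are the variables from the question):
const mongoose = require('mongoose');
const { Schema } = mongoose;

// with strict disabled, paths that are not declared in the schema
// are no longer stripped out of updates
const projectSchema = new Schema({ name: String /* ...other declared fields... */ }, { strict: false });
const Project = mongoose.model('Project', projectSchema);

// the dynamic key from the question now persists
await Project.update(
    { _id: projectId },
    { $set: { [`${language}.${key}`]: value } }
);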
From pg-promise's example, one can format a query like below, where ${this~} becomes all of the keys in the object that is the second parameter of "format()".
// automatically list object properties as sql names:
format('INSERT INTO table(${this~}) VALUES(${one}, ${two})', {
one: 1,
two: 2
});
//=> INSERT INTO table("one","two") VALUES(1, 2)
Is it possible to also get all of the values of the object, without explicitly typing all of them? I want to do it like below (should do the same thing as the snippet above, but without typing all of the values):
format('INSERT INTO table(${this~}) VALUES(${this#})', {
one: 1,
two: 2
});
Is it possible to also get all of the values of the object, without explicitly typing all of them?
No, it is not possible, because while column names all require the same type of SQL Name escaping, values do not; they require templating that is possible only via explicitly defined variables.
I want to do it like below...
For that you should use the helpers methods of the library:
const cs = new pgp.helpers.ColumnSet(['one', 'two'], {table: 'my-table'});
const query = pgp.helpers.insert(values, cs);
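For example, assuming the object from the question, a hypothetical table name of my-table, and capSQL: true in the initialization options (for uppercase keywords), that would produce:
const pgp = require('pg-promise')({capSQL: true});

const values = {one: 1, two: 2};
const cs = new pgp.helpers.ColumnSet(['one', 'two'], {table: 'my-table'});
const query = pgp.helpers.insert(values, cs);
//=> INSERT INTO "my-table"("one","two") VALUES(1,2)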
I'm a little bit confused about relations in Sequelize.
Is it the same to do this:
var User = this.sequelize.define('user', {/* attributes */}),
Company = this.sequelize.define('company', {/* attributes */});
User.belongsTo(Company);
as doing this:
var User = this.sequelize.define('user', {
        company_id: {
            type: Sequelize.INTEGER, // the attribute needs a data type
            references: {
                model: 'Company',
                key: 'id'
            }
        }
        /* more attributes */
    }),
    Company = this.sequelize.define('company', {/* attributes */});
What is the difference? Apparently both snippets produce the same tables as a result, where a companyId foreign key referencing Company is added to the user table.
Thanks!
They are just different methods to achieve the same result. Depending on the patterns you are following in your code or the specific options, one may end up being more clear than the other. Notice that in your example the former is much more compact than the latter, which might make it a better choice.
There is a bug in your example, however. The first would indeed result in a foreign key of companyId, whereas in your second example you use company_id. If you want to use underscored table and column names instead of camel case, you will need to pass the underscored: true option to sequelize.define().
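If you do want the snake_case company_id column while still letting Sequelize manage the association, a minimal sketch (sequelize here stands for your initialized Sequelize instance):
var User = sequelize.define('user', {/* attributes */}, {underscored: true}),
    Company = sequelize.define('company', {/* attributes */}, {underscored: true});

// naming the foreign key explicitly removes any ambiguity about the column name
User.belongsTo(Company, {foreignKey: 'company_id'});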
I have an existing "blackbox" web service. I need to append a session ID to the end of that output so that Javascript and similar clients can resume the stateful session.
Given the output below, what is the correct syntax to append or prepend an arbitrary GUID, so that it can be properly deserialized as valid JSON?
Note: The data below is perfect as-is. If I could somehow add a "removable" bit of information, the string GUID, using JSON.NET, that would be ideal.
Output from REST call
"{\"sa\":[\"BHDQ9TLPeaeVuSSgXv9bsOIVFUWbOpivMKhGki7YPLzIXEyHuxRAZhDgts2sEcBQpLBuKJZCtcmSlzWZ9iK0AAA=\",\"BAhyo7T0Wq1WBLXnyN4vo1L94rWLhCCv4DqROi+p9XHO6UeS0Gw6xh1JAKOtXBU2fA432LkNqng8cUt1eAX0bqs=\",\"BGFmyTreWY5pICAcf3itoqbfhs5brOmIDLNF3V7p7slPYdCSVhwWUT5mHD6Lb5kNi\/Qy9tracNUtVgvo3f51FrI=\",\"BMV7RIwoz+LdFgD2fq7UZ7E88KFq\/03381NDYFIKYgUKxEzuXoj6hZfSB0slX5fdaL44Lf6i\/UjDzPQt2XUG8NE=\",\"BL8BnU5WvFn7vIlKi14dWsqykNf1\/nmE55YXFGwLx9Qu3VvDblULt\/U8CXPI1vD8+wMXCRnkunXqxlsFqgghf8w=\"],\"sb\":[\"BInTtgTAn\/zkmrkporhV5DvPZRq5YWm8e\/m02oq55UfY3RxIhOplJgwLjgKMHKYDthYEBcqNNNuVbbWnbtKVAqA=\",\"BJbh5y95wHGjmAPDFNqgewnBxtqVke0sloDD2S3IdrWZ95JfP77rtXZ4lTG8g9PuTLJbl4exZUnM16260WxJ9wU=\",\"BKevE9i2J8CicXHX3elCoQPEpTOmJyGOlBskIbFMFGQFhJ5TD7N1221rhhH9HY6DsfRojmefozsQYzo7Pokp+Hg=\",\"BJbVTRyh8WwCxfR7jRXnran4td7k5+vEfM+HWxeAibneSjdMRQ1Fg6QxKLu+Zu1aPdXqD8M29kABOTAiYopVuQE=\",\"BFv3alDqjo7ckdB2vuxJ15Gur1xsgATjLe9drt\/XU9AkbN+AELCv+mF1Xy8+83L2A1p8aGxF4b7dsrMed27u1j4=\"],\"sz\":\"BF1IiqMz0KmT4gZN6euJquWFt2UmVjyOEdaX0jH8uQMAPG8DBoyneT2PJ9NQTE2xBOP9TtAb1d2O+iCojFqzkvI=\"}"
The output above comes from Chrome. I'm not sure if Chrome adds the additional quotes, etc., but when I debug the System.String on the server, I see the same thing being sent to the WCF service.
The end usage for this will be a Chrome and Firefox plug-in.
Well, if I am understanding correctly:
You get JSON from a blackbox service. It contains some properties and values. You want to add a new property with some GUID and send it to the browser.
If this is correct, try the following:
var json=<WHAT YOU GET FROM SERVICE>;
var converter = new ExpandoObjectConverter();
dynamic obj = JsonConvert.DeserializeObject<ExpandoObject>(json, converter);
obj.sid="this is the new session id"; //ADD NEW PROPERTY
var j=JsonConvert.SerializeObject(obj); //GET BACK JSON STRING WITH NEW PROPERTY
Or, if you just want to add the session id on the client side (inside your plugin), then use the JSON2 JavaScript library and the following code (as also suggested by Josh in the comments):
var o = JSON.parse(<REST OUTPUT>);
o.sid = <YOUR SESSION ID>;
To convert back to a JSON string:
var jsn = JSON.stringify(o);
There is no way to modify that particular response without breaking existing clients. If you can break existing clients, or if you are working with clients that you control, you could wrap the object in another object, setting two keys: GUID and data. For example:
var json = JsonConvert.SerializeObject(new {
data = foo,
GUID = bar,
});
Where bar is the GUID that you want to use, and foo is one of two things:
The JSON string from the response. This will result in the final object looking like so:
{
data: "{\"sa\":[\"BHDQ9TLPeaeVuSSgXv9bsOIVFUWbOpivMKhGki7YPLzIXEyHuxRAZhDgts2sEcBQpLBuKJZCtcmSlzWZ9iK0AAA=\",\"BAhyo7T0Wq1WBLXnyN4vo1L94rWLhCCv4DqROi+p9XHO6UeS0Gw6xh1JAKOtXBU2fA432LkNqng8cUt1eAX0bqs=\",\"BGFmyTreWY5pICAcf3itoqbfhs5brOmIDLNF3V7p7slPYdCSVhwWUT5mHD6Lb5kNi\/Qy9tracNUtVgvo3f51FrI=\",\"BMV7RIwoz+LdFgD2fq7UZ7E88KFq\/03381NDYFIKYgUKxEzuXoj6hZfSB0slX5fdaL44Lf6i\/UjDzPQt2XUG8NE=\",\"BL8BnU5WvFn7vIlKi14dWsqykNf1\/nmE55YXFGwLx9Qu3VvDblULt\/U8CXPI1vD8+wMXCRnkunXqxlsFqgghf8w=\"],\"sb\":[\"BInTtgTAn\/zkmrkporhV5DvPZRq5YWm8e\/m02oq55UfY3RxIhOplJgwLjgKMHKYDthYEBcqNNNuVbbWnbtKVAqA=\",\"BJbh5y95wHGjmAPDFNqgewnBxtqVke0sloDD2S3IdrWZ95JfP77rtXZ4lTG8g9PuTLJbl4exZUnM16260WxJ9wU=\",\"BKevE9i2J8CicXHX3elCoQPEpTOmJyGOlBskIbFMFGQFhJ5TD7N1221rhhH9HY6DsfRojmefozsQYzo7Pokp+Hg=\",\"BJbVTRyh8WwCxfR7jRXnran4td7k5+vEfM+HWxeAibneSjdMRQ1Fg6QxKLu+Zu1aPdXqD8M29kABOTAiYopVuQE=\",\"BFv3alDqjo7ckdB2vuxJ15Gur1xsgATjLe9drt\/XU9AkbN+AELCv+mF1Xy8+83L2A1p8aGxF4b7dsrMed27u1j4=\"],\"sz\":\"BF1IiqMz0KmT4gZN6euJquWFt2UmVjyOEdaX0jH8uQMAPG8DBoyneT2PJ9NQTE2xBOP9TtAb1d2O+iCojFqzkvI=\"}",
guid: "00000000-0000-0000-0000-000000000000"
}
And you would get at the data through two calls to JSON.parse (or the equivalent), as sketched after the second option below.
The deserialized object from the JSON response. This will result in the final object looking like so (most data removed for brevity):
{
data: {
sa: [],
sb: [],
sz: ""
},
guid: "00000000-0000-0000-0000-000000000000"
}
And you would access data through response.data.
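A small client-side sketch of the difference (responseText is a placeholder for the raw REST output):
// option 1: the payload is still an embedded string, so it takes two parses
var wrapper = JSON.parse(responseText);   // { data: "...", guid: "..." }
var payload = JSON.parse(wrapper.data);   // { sa: [...], sb: [...], sz: "..." }
var guid = wrapper.guid;

// option 2: the payload was embedded as an object, so one parse is enough
var response = JSON.parse(responseText);  // { data: {...}, guid: "..." }
var sa = response.data.sa;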
Why any modification can break existing clients
Where the current response is an object, there are only a few ways to modify it:
Injecting a key into the object. This assumes that no client uses Object.keys() or in any way iterates the key set (e.g. for (k in obj)). While this may be true, it is still an assumption.
Adding another object to the end: }, {. Doing so would require that the response be transformed into an array:
[{}, {}]
This would break any client that assumes the response is an object.
Wrapping the current response in a surrounding object (as proposed above). This, too, breaks any clients that assume a certain structure for the response.
{data:{}, guid: ""}