I have been looking everywhere on AWS docs for any information on this and can find absolutely none. The only answer I keep getting everywhere I look is how to query or scan using a secondary index, on already-indexed data. But how do you add a value to the index-attribute of an item in the first place? I am using AWS SDK for JavaScript so JS-specific info would be most helpful, but any info on this would be so much better than what AWS has provided.
I tried to add an item with params like the following, where I simply used the names of indexes as attributes (date and timestamp):
const params = {
  TableName: 'Posts_Table',
  Item: {
    'username': user,
    'image_id': uuid(),
    'date': date,
    'timestamp': timestamp
  }
}
But what ended up happening is date and timestamp were simply added as normal attributes that aren't able to be queried.
You've got some fundamental misunderstanding going on. You don't give enough code or examples for me to guess what you're really attempting. For example, I don't know what your table's keys are. So here's a primer:
You only write items to the base table (never directly to an index). Items can have a variety of attributes. Each item must have unique key attributes in the base table.
You can create a GSI against the table, including after the table has data. When constructing the GSI you select what its key attributes will be.
When you want to use the GSI, you must specify it as the target of your Scan or Query (see the sketch below).
Are you trying to write to the index? You can't.
Are you trying to query the index by pointing at the base table? You can't.
Are you trying to write an item to the base table without specifying its primary keys? You can't.
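As a rough sketch with the AWS SDK for JavaScript (v2 DocumentClient): the write still goes to the base table, and only the read names the index. The index name 'date-index' is an assumption; use whatever you named your GSI.

const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

// Writes always target the base table; 'date' is just a normal attribute here.
await docClient.put({
  TableName: 'Posts_Table',
  Item: { username: user, image_id: uuid(), date: date, timestamp: timestamp }
}).promise();

// Reads that key on 'date' must name the GSI explicitly.
const result = await docClient.query({
  TableName: 'Posts_Table',
  IndexName: 'date-index',
  KeyConditionExpression: '#d = :d',
  ExpressionAttributeNames: { '#d': 'date' },   // 'date' is a reserved word
  ExpressionAttributeValues: { ':d': date }
}).promise();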
How to create an item with an index in DynamoDB?
You cannot create an item without an index in DynamoDB.
When you create a table, you specify the Primary Key, which is your index.
When you add an item, you have to provide the Primary Key.
You can also make use of Global Secondary Indexes, which technically create a new table with that index under the hood.
But what ended up happening is date and timestamp were simply added as normal attributes that aren't able to be queried.
If you want to be able to query an attribute, that attribute has to be a Primary Key (Partition or Composite) or a Global Secondary Index.
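As a minimal sketch with the AWS SDK for JavaScript (v2), this is roughly how the Primary Key and a GSI on date would be declared at table-creation time. The key schema and the index name 'date-index' are assumptions, since the question doesn't say what the table's keys are.

const AWS = require('aws-sdk');
const dynamodb = new AWS.DynamoDB();

await dynamodb.createTable({
  TableName: 'Posts_Table',
  AttributeDefinitions: [
    { AttributeName: 'username', AttributeType: 'S' },
    { AttributeName: 'image_id', AttributeType: 'S' },
    { AttributeName: 'date', AttributeType: 'S' }
  ],
  KeySchema: [
    { AttributeName: 'username', KeyType: 'HASH' },   // partition key
    { AttributeName: 'image_id', KeyType: 'RANGE' }   // sort key
  ],
  GlobalSecondaryIndexes: [{
    IndexName: 'date-index',
    KeySchema: [{ AttributeName: 'date', KeyType: 'HASH' }],
    Projection: { ProjectionType: 'ALL' },
    ProvisionedThroughput: { ReadCapacityUnits: 1, WriteCapacityUnits: 1 }
  }],
  ProvisionedThroughput: { ReadCapacityUnits: 1, WriteCapacityUnits: 1 }
}).promise();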
I have a main table with the primary key 'main_id'.
This 'main_id' is being used as a foreign key in two tables: 'sub_table1' and 'sub_table2'.
I have a requirement where I need to change 'main_id' and its references in 'sub_table1' and 'sub_table2'.
I am on Node.js and using Bookshelf and Knex.
So far I tried updating the sub_tables first and then updating the main_table,
but it threw the error 'Foreign_key constraint being violated'.
I'm new to Bookshelf.
Please help.
Foreign keys are meant to stay constant, and changing them might not be what you actually want to do. There is no real reason to change the id: it's an identifier for you as a developer to work with, not something that should be overwritten. If for some reason you need the user to have something like an identifier that they can overwrite, you can simply create a new column called, for example, 'code' or 'customId' (or whatever, you get the point) and make it unique.
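As a sketch of that suggestion, a Knex migration could add such a unique, user-editable column; the table and column names here are just placeholders.

// Hypothetical migration: add a unique 'custom_id' column to 'main_table'
// instead of ever touching the primary key.
exports.up = function (knex) {
  return knex.schema.alterTable('main_table', function (table) {
    table.string('custom_id').unique();
  });
};

exports.down = function (knex) {
  return knex.schema.alterTable('main_table', function (table) {
    table.dropColumn('custom_id');
  });
};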
I have an attribute that is of string data type. I am using the algoliasearchHelper object to search through my Algolia database. What I want to do is create a facet filter that takes a specified prefix and returns all objects that have a value with that prefix inside a specified attribute.
For example, so far I am using this:
Helper.addDisjunctiveFacetRefinement("attributeName","Can");
This returns all objects that have the value of "Can" in the "attributeName" attribute, but it doesn't return any objects that have the value of "Canada" or "Canadians" in the "attributeName" attribute, even though those have a prefix of "Can".
How can I make it so that when it filters, it filters by a specified prefix?
A disjunctive facet refinement is used for an exact match of a filter. This is mainly used to select a certain filter out of a list of possible filters. What I assume you want, is to get a subset of all possible filters.
This is possible with our searchForFacetValues function. First this needs to be set up in your indexing settings as searchable(attributeName) so we can generate the additional data structures to make the facets easily searchable. You can also read more about that setup in the documentation.
So once the attribute is searchable, you can use the helper function to refine the list of filters like this:
Helper.searchForFacetValues("attributeName","Can");
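For completeness, a sketch of both halves. The index settings part uses the plain algoliasearch client (assumed to be loaded; credentials and index name are placeholders), and the helper call returns a promise whose result holds the matching facet values.

// One-time index configuration so the attribute's facet values become searchable.
var client = algoliasearch('YOUR_APP_ID', 'YOUR_ADMIN_API_KEY');
client.initIndex('your_index').setSettings({
  attributesForFaceting: ['searchable(attributeName)']
});

// Then the helper can search facet values by prefix.
Helper.searchForFacetValues('attributeName', 'Can').then(function (content) {
  // content.facetHits should include values such as 'Canada' and 'Canadians'.
  console.log(content.facetHits);
});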
Have a great day!
I am trying to get a specific menuitem and store it in a variable in JavaScript:
var Menus = new openerp.web.Model('ir.ui.menu');
Now, we can apply a filter to Menus to get the menuitem, for example, by looking for its name, but the thing is that there are a lot of menuitems with the same name. So I think that the only attribute which identifies my menuitem and differentiates it from the others is the XML ID.
But I do not know how to get it from JavaScript code. Is there any built-in function to obtain it? How can I achieve this?
Well, I have found a workaround. Maybe there is a better solution; in that case, please post it.
In the database, there is a table named ir_model_data. This table stores the XML IDs under the column name. The columns model and res_id tell you the model that XML ID record belongs to and its ID. There is also a column named module, which can be prepended to the extracted XML ID (column name) to get the module_name.xml_id notation.
For example:
I have a record from ir.ui.menu model with ID 303, and I want to get its XML ID from Javascript:
var Menus = new openerp.web.Model('ir.model.data');
Menus.query(['name'])
    .filter(['&', ['model', '=', 'ir.ui.menu'], ['res_id', '=', 303]])
    .all()
    .then(function (ir_model_datas) {
        for (var i in ir_model_datas) {
            console.log(ir_model_datas[i].name);
        }
    });
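Building on the same workaround, the module column can be fetched too, to produce the full module_name.xml_id notation mentioned above (still assuming the record with ID 303; the variable name is arbitrary):

var ModelData = new openerp.web.Model('ir.model.data');
ModelData.query(['module', 'name'])
    .filter([['model', '=', 'ir.ui.menu'], ['res_id', '=', 303]])
    .first()
    .then(function (record) {
        if (record) {
            // e.g. 'base.menu_administration'
            console.log(record.module + '.' + record.name);
        }
    });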
How do I add one-to-many relations with a Parse pointer? Please check this screenshot:
It's a pointer column, and I can only add one pointer per row (one-to-one). I can't add a one-to-many relationship to it using JS; it throws this error:
{"code":111,"error":"invalid type for key members, expected *_User, but got array"}
And I don't want to use Parse's 'Relation' type column, because querying a 'Relation' is complicated.
If you want to have a multiple-pointer column, just create an array column and fill it with _User objects. The result will look like this, and it will really be an array of pointers:
[{"__type":"Pointer","className":"_User","objectId":"kIg9Kzzls9"},
{"__type":"Pointer","className":"_User","objectId":"TGCBZm52zW"},
{"__type":"Pointer","className":"_User","objectId":"YfGT9GvJs6"}]
Using include in a query to get the full objects also works, since it is an array of pointers.
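A sketch of that from the JS side; the class name 'Team' is made up, and user1, user2, user3 stand for already-fetched Parse.User instances, but the 'members' column matches the one in the error message.

// Parse.User objects stored in an array column are saved as pointers.
var team = new Parse.Object('Team');
team.set('members', [user1, user2, user3]);
team.save().then(function (saved) {
    // include() expands the pointers back into full _User objects on fetch.
    var query = new Parse.Query('Team');
    query.include('members');
    return query.get(saved.id);
}).then(function (fetched) {
    console.log(fetched.get('members'));
});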
I have bumped into this error before.
It is because the <Type> of your member column is already automatically set to *_User. If you try to set a many-to-many relation into the same column, it just doesn't work since the types are different.
You can solve the issue by manually deleting the column in your dashboard and set it in your code again.
I want to query objects from the Parse DB through JavaScript that have only 1 of some specific relation object. How can this criterion be achieved?
So I tried something like this, but equalTo() acts as a "contains" and it's not what I'm looking for. My code so far, which doesn't work:
var query = new Parse.Query("Item");
query.equalTo("relatedItems", someItem);
query.lessThan("relatedItems", 2);
It seems Parse does not provide an easy way to do this.
Without any other fields, if you know all the items then you could do the following:
var innerQuery = new Parse.Query('Item');
innerQuery.containedIn('relatedItems', [all items except someItem]);
var query = new Parse.Query('Item');
query.equalTo('relatedItems', someItem);
query.doesNotMatchKeyInQuery('objectId', 'objectId', innerQuery);
...
Otherwise, you might need to get all records and do filtering.
Update
Because of the Relation data type, there is no way to include the relation content in the results; you need to do another query to get it.
The workaround might be to add an itemCount column, keep it updated whenever the item relation is modified, and do:
query.equalTo('relatedItems', someItem);
query.equalTo('itemCount', 1);
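A sketch of keeping that hypothetical itemCount column in step whenever the relation is modified from client code (the helper names are made up):

function addRelatedItem(item, relatedItem) {
    item.relation('relatedItems').add(relatedItem);
    item.increment('itemCount');            // counter moves with the relation
    return item.save();
}

function removeRelatedItem(item, relatedItem) {
    item.relation('relatedItems').remove(relatedItem);
    item.increment('itemCount', -1);
    return item.save();
}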
There are a couple of ways you could do this.
I'm working on a project now where I have cells composed of users.
I currently have an afterSave trigger that does this:
const count = await cell.relation("members").query().count();
cell.set("memberCount", count);
This works pretty well.
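Spelled out, the trigger might look roughly like this in Parse Cloud Code; the class name 'Cell' and the guard against re-triggering are my assumptions.

Parse.Cloud.afterSave('Cell', async (request) => {
  const cell = request.object;
  const count = await cell.relation('members').query().count({ useMasterKey: true });
  // Only save when the count actually changed, so this hook doesn't re-fire forever.
  if (cell.get('memberCount') !== count) {
    cell.set('memberCount', count);
    await cell.save(null, { useMasterKey: true });
  }
});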
There are other ways that I've considered in theory, but I've not used them yet.
The right way would be to hack the ability to use select with dot notation to grab a virtual field called relatedItems.length in the query, but that would probably only work for me because I use Postgres ... Mongo seems to be extremely limited in its ability to do this sort of thing, which is why I would never make a database out of blobs of JSON in the first place.
You could do a similar thing with an afterFind trigger. I'm experimenting with that now. I'm not sure if it will confuse Parse to get back an attribute which does not exist in its schema, but I'll find out by the end of today. I have found that if I jam an artificial attribute into the objects in the trigger, they are returned along with the other data. What I'm not sure about is whether Parse will decide that the object is dirty, or, worse, decide that I'm creating a new attribute and store it to the database ... which could be filtered out with a beforeSave trigger, but not until after the data had all been sent to the cloud.
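For what it's worth, a sketch of that afterFind experiment; the class and attribute names are guesses based on the question.

Parse.Cloud.afterFind('Item', async (request) => {
  const items = request.objects;
  for (const item of items) {
    const count = await item.relation('relatedItems').query().count({ useMasterKey: true });
    // Artificial attribute attached to the returned objects, not part of the schema.
    item.set('relatedItemsCount', count);
  }
  return items;
});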
There is also a place where I had to do several queries from several tables, and would have ended up with a lot of redundant data. So I wrote a cloud function which did the queries, and then returned a couple of lists of objects, and a few lists of objectId strings which served as indexes. This worked pretty well for me. And tracking the last load time and sending it back when I needed to update my data allowed me to limit myself to objects which had changed since my last query.