I am new to DynamoDB and am trying to perform some basic operations to learn the subject.
I have successfully created a table using the AWS SDK (so there is no credentials issue) like this:
const newTable = async () => {
  //* it's working!!!
  try {
    const params = {
      AttributeDefinitions: [
        {
          AttributeName: 'email',
          AttributeType: 'S',
        },
        {
          AttributeName: 'password',
          AttributeType: 'S',
        },
      ],
      KeySchema: [
        {
          AttributeName: 'email',
          KeyType: 'HASH',
        },
        {
          AttributeName: 'password',
          KeyType: 'RANGE',
        },
      ],
      ProvisionedThroughput: {
        ReadCapacityUnits: 5,
        WriteCapacityUnits: 5,
      },
      TableName,
      StreamSpecification: {
        StreamEnabled: false,
      },
    };
    const command = new CreateTableCommand(params);
    const data = await client.send(command);
    console.log(data);
  } catch (err) {
    console.log(err);
  }
};
I inserted a new item into the table using the AWS console, and now I'm trying to access it using the SDK as follows:
const getItem = async () => {
  try {
    const params = {
      TableName,
      Key: {
        email: { S: 'ofer#email.com' },
      },
    };
    const command = new GetItemCommand(params);
    const response = await client.send(command);
    console.log(response);
  } catch (err) {
    console.error(err);
  }
};
When I run the code, the following error is returned: "ValidationException: The provided key element does not match the schema"
I can't figure out where my mistake is.
Since you have a composite key, both the HASH and RANGE keys need to be specified when getting an item: in your case, both email and password.
For the primary key, you must provide all of the attributes. For example, with a simple primary key, you only need to provide a value for the partition key. For a composite primary key, you must provide values for both the partition key and the sort key.
https://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_GetItem.html#DDB-GetItem-request-Key
As a side note, you are unlikely to want to make a password a RANGE key.
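Putting it together, a sketch of the corrected call using the same client and TableName as in the question; the password value below is a placeholder and must match the sort key value actually stored on the item:

const getItem = async () => {
  try {
    const command = new GetItemCommand({
      TableName,
      Key: {
        email: { S: 'ofer#email.com' },
        password: { S: 'my-password' }, // placeholder: use the item's actual sort key value
      },
    });
    const response = await client.send(command);
    console.log(response.Item);
  } catch (err) {
    console.error(err);
  }
};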
I'm using an npm package (tcmb-doviz-kuru) to get currency data from an API, and I'm trying to insert it into my database. However, I couldn't map the data to insert the values. Here is my code:
var tcmbDovizKuru = require('tcmb-doviz-kuru');
var pg = require('pg');

function cb(error, data) {
  if (error) {
    console.log('error', error)
  }
  insertToDatabase(data);
}

function insertToDatabase(data) {
  var pgClient = new pg.Client(connectionString);
  pgClient.connect();
  const text = 'INSERT INTO currency.kur("date", forexBuying, forexSelling, banknoteBuying, banknoteSelling, unit, isim, currencyCode)VALUES(CURRENT_TIMESTAMP,$1,$2,$3,$4,$5,$6,$7)'
  const values = data['tarihDate']['currency'];
  pgClient.query(text, values).then(res => {
    console.log("inserted")
  })
  .catch(e => console.error("error"))
}

tcmbDovizKuru(cb);
and here is an example of the data from the API (data['tarihDate']['currency']):
[
  {
    attributes: { crossOrder: '0', kod: 'USD', currencyCode: 'USD' },
    unit: 1,
    isim: 'ABD DOLARI',
    currencyName: 'US DOLLAR',
    forexBuying: 8.2981,
    forexSelling: 8.313,
    banknoteBuying: 8.2922,
    banknoteSelling: 8.3255,
    crossRateUSD: null,
    crossRateOther: null
  },
  {
    attributes: { crossOrder: '1', kod: 'AUD', currencyCode: 'AUD' },
    unit: 1,
    isim: 'AVUSTRALYA DOLARI',
    currencyName: 'AUSTRALIAN DOLLAR',
    forexBuying: 6.1339,
    forexSelling: 6.1739,
    banknoteBuying: 6.1057,
    banknoteSelling: 6.211,
    crossRateUSD: 1.3496,
    crossRateOther: null
  }
]
How can I insert each of these values?
The data you are inserting is an object, but the $1, $2, ... placeholders you are using expect an array of values.
See What does the dollar sign ('$') mean when in the string to .query?
So you would need to map each entry of data['tarihDate']['currency'] to an array.
To insert the USD entry from your example data:
const usdInfo = data['tarihDate']['currency'][0];
const valuesArray = [
  usdInfo.forexBuying, usdInfo.forexSelling, usdInfo.banknoteBuying,
  usdInfo.banknoteSelling, usdInfo.unit, usdInfo.isim, usdInfo.attributes.currencyCode,
];
pgClient.query(text, valuesArray).then(res => {
  console.log("inserted", res)
})
.catch(e => console.error("error", e))
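To insert every entry rather than a single one, you could loop over the array and run one parameterized insert per currency. A minimal sketch of an insertToDatabase written that way, reusing the same text query and pg client setup as the question (the attributes.currencyCode access follows the sample data shown above):

async function insertToDatabase(data) {
  const pgClient = new pg.Client(connectionString);
  await pgClient.connect();
  const text = 'INSERT INTO currency.kur("date", forexBuying, forexSelling, banknoteBuying, banknoteSelling, unit, isim, currencyCode) VALUES(CURRENT_TIMESTAMP,$1,$2,$3,$4,$5,$6,$7)';
  // one parameterized INSERT per currency entry
  for (const cur of data['tarihDate']['currency']) {
    const values = [
      cur.forexBuying, cur.forexSelling, cur.banknoteBuying,
      cur.banknoteSelling, cur.unit, cur.isim, cur.attributes.currencyCode,
    ];
    try {
      await pgClient.query(text, values);
      console.log('inserted', cur.attributes.currencyCode);
    } catch (e) {
      console.error('error', e);
    }
  }
  await pgClient.end();
}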
I have a DynamoDB table with the following attributes:
SubscriptionsTable:
  Type: AWS::DynamoDB::Table
  DeletionPolicy: Retain
  Properties:
    TableName: ${self:custom.subscriptionsTableName}
    AttributeDefinitions:
      - AttributeName: eventName
        AttributeType: S
      - AttributeName: hookUrl
        AttributeType: S
    KeySchema:
      - AttributeName: eventName
        KeyType: HASH
      - AttributeName: hookUrl
        KeyType: RANGE
I would like to delete an item by its hookUrl and eventName. Here is the code of my repository:
const deleteSubscriptionFromRepository = async ({
  hookUrl,
  eventName,
}: {
  hookUrl: string;
  eventName: SubscriptionEventName;
}) => {
  const mapper = getDatabaseMapper();
  try {
    return await mapper.delete(Object.assign(new SubscriptionModel(), { hookUrl, eventName }));
  } catch (e) {
    throw new DatabaseAccessError(e.message, 'deleteSubscriptionFromRepository');
  }
};
Instead of using the DynamoDB client directly, I'm using the DataMapper from '@aws/dynamodb-data-mapper'.
But I'm receiving the following error:
DatabaseAccessError: The number of conditions on the keys is invalid
How is it possible to delete an item by its partition and range keys? What am I doing wrong in the code above?
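For context, the DataMapper builds the Key for a delete from the decorators on the model class, so this error usually means the request reached DynamoDB with only one of the two key attributes: either SubscriptionModel does not declare both eventName and hookUrl as key attributes, or one of the two values is undefined at call time. A rough sketch of a model that mirrors the table above, assuming the '@aws/dynamodb-data-mapper-annotations' package (the table name string is a placeholder, not taken from the question):

import { hashKey, rangeKey, table } from '@aws/dynamodb-data-mapper-annotations';

// Placeholder table name; it must match the deployed table
@table('subscriptions')
class SubscriptionModel {
  @hashKey()
  eventName?: string;

  @rangeKey()
  hookUrl?: string;
}

If the model already declares both keys, logging hookUrl and eventName right before the delete helps rule out an undefined value.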
When I'm adding docs to Elasticsearch with _id set, I get:
Field [_id] is a metadata field and cannot be added inside a document. Use the index API request parameters.
I'm using client.bulk:
const body = dataset.flatMap(doc => [{ index: { _index: 'myindex' } }, doc])
const { body: bulkResponse } = await client.bulk({ refresh: true, body })
I don't see a place to put the _id in the parameters.
https://www.elastic.co/guide/en/elasticsearch/client/javascript-api/current/api-reference.html
Am I supposed to use a different method?
Thanks.
It needs to be inside the command part (the _id added to each index action, as in the snippet below), but you also need to remove it from the source document in doc:
const body = dataset.flatMap(doc => [{ index: { _index: 'myindex', _id: doc._id } }, doc])
const { body: bulkResponse } = await client.bulk({ refresh: true, body })
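To also drop _id from the document body in the same pass (assuming every entry in dataset carries an _id field), destructuring it out works:

// move _id into the bulk action metadata and strip it from the document source
const body = dataset.flatMap(({ _id, ...doc }) => [
  { index: { _index: 'myindex', _id } },
  doc,
])
const { body: bulkResponse } = await client.bulk({ refresh: true, body })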
This is one of the fields, of type StringSet, that is returned from DynamoDB:
permissions: Set {
  wrapperName: 'Set',
  values: [
    'BannerConfigReadOnly',
    'CampaignBannerCreate',
    'CampaignPromoCreate',
    'CampaignReadOnly',
    'MasterplanReadOnly',
    'SegmentCreate',
    'SegmentDownload',
    'SegmentUpload'
  ],
  type: 'String'
}
Now, I am using the aws.DynamoDB.Converter.unmarshall function to get it in this format:
permissions: [
  'BannerConfigReadOnly',
  'CampaignBannerCreate',
  'CampaignPromoCreate',
  'CampaignReadOnly',
  'MasterplanReadOnly',
  'SegmentCreate',
  'SegmentDownload',
  'SegmentUpload'
]
But this is what I get:
{}
Any ideas what I may be doing wrong?
This is my code:
const aws = require('aws-sdk');
const documentClient = new aws.DynamoDB.DocumentClient();

documentClient.scan(params, (err, data) => {
  if (err) {
    reject(err);
  } else {
    let processedItems = [...data.Items];
    var test = aws.DynamoDB.Converter.unmarshall(processedItems[0].permissions);
    console.log(`test is ${JSON.stringify(test)}`);
  }
});
processedItems[0] is this:
{
  email: 'abc#gmail.com',
  tenant: 'Canada',
  permissions: Set {
    wrapperName: 'Set',
    values: [
      'BannerConfigReadOnly',
      'CampaignBannerCreate',
      'CampaignPromoCreate',
      'CampaignReadOnly',
    ],
    type: 'String'
  }
}
That data is already unmarshalled since you are using the DocumentClient. Consider just using processedItems[0].permissions.values to get the values of the set.
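Put differently, the DocumentClient already returns native JavaScript values, and a string set comes back as a Set wrapper whose values property holds the plain array. A minimal sketch inside your scan callback, with no Converter call needed:

// DocumentClient output is already unmarshalled; the set wrapper exposes the strings on .values
const permissions = processedItems[0].permissions.values;
console.log(permissions); // [ 'BannerConfigReadOnly', 'CampaignBannerCreate', ... ]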
I am very new to GraphQL. I'm trying to pass an object like this one as an argument:
{
  filters: {
    status: 'approved',
    id: {
      LESS_THAN: 200
    }
  }
}
Or the object can look like this instead:
{
  filters: {
    status: ['approved', 'pending'],
    id: 200
  }
}
I know all the properties that can appear in this object, but each of them can be either a string/int or an object.
I tried to define it like this but it obviously didn't work:
args: {
  filters: { type: new GraphQLNonNull(new GraphQLNonNull(GraphQLString)) },
},
Now I'm trying to define the argument with a GraphQLInputObjectType:
const OffersFiltersType = new GraphQLInputObjectType({
  name: 'Filters',
  description: '...',
  fields: () => ({
    id: {
      type: new GraphQLNonNull({
        name: 'Id',
        description: '...',
        fields: {
        }
      }),
      resolve: (offer) => offer.id
    },
  }),
});
But how can I specify on this type that my id can be either an int or an object?
This is my Query definition:
const QueryType = new GraphQLObjectType({
  name: 'Query',
  description: '...',
  fields: () => ({
    offers: {
      type: OffersType,
      args: {
        limit: { type: GraphQLInt },
        page: { type: GraphQLInt },
        sort: { type: GraphQLString },
        filters: { [HERE] }
      },
      resolve: (root, args, context, info) => {
        const gqlFields = graphqlFields(info);
        const fields = Object.keys(gqlFields.offer);
        const queryArgs = args;
        queryArgs.fields = fields;
        return getOffers(queryArgs);
      }
    },
  }),
});
And this is my request with superagent
const getOffers = (args) => {
  const queryArgs = args;
  if (typeof queryArgs.limit !== 'undefined') {
    queryArgs.limit = args.limit;
  } else {
    queryArgs.limit = Number.MAX_SAFE_INTEGER;
  }
  return new Promise((fulfill, reject) => {
    request
      .get(API_URL)
      .query(qs.stringify(args))
      .end((err, res) => {
        if (err) {
          reject(err);
        }
        fulfill(res);
      });
  });
};
I need this object to construct a query in my resolve function. Thank you all for your help! I only need simple advice!
This is not allowed, by design: https://github.com/graphql/graphql-js/issues/303
GraphQL does not support unknown property names, largely because it would make the schema meaningless. The example given is a simple typo:
If you have the query query ($foo: String) { field(arg: $foo) } and the variables { "fooo": "abc" }, we currently flag this as an error, but we could potentially miss this typo if we did not raise errors.
The schema is meant to ensure compatibility between servers and clients, even across versions, and allowing unknown properties would break that.
There is a merge request open for this in the GraphQL-JS repo, but it is still being debated and has the same problems with typos and general inconsistency.
The idea of a value that can be either a primitive or an object runs into a similar problem. When accepting an object, you need to list the properties you're expecting, and the query will validate those against the schema. The properties, along with their types and nullability, must be known ahead of time for you (and the parser) to build the query, and they definitely need to be known when you validate.
If you could accept a primitive or object, you would have to specify the fields on that object, but those could not possibly exist on the primitive. That's a problem.
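What does work is declaring every filter property explicitly in an input object type. A rough sketch of that approach for the filters shown in the question; IdFilterInput and its field names (eq, LESS_THAN) are made-up illustrations here, not anything your API already defines:

const { GraphQLInputObjectType, GraphQLList, GraphQLInt, GraphQLString } = require('graphql');

// Hypothetical input type covering both shapes of `id` from the question:
// a bare number goes in `eq`, and { LESS_THAN: 200 } maps onto the LESS_THAN field.
const IdFilterInput = new GraphQLInputObjectType({
  name: 'IdFilterInput',
  fields: {
    eq: { type: GraphQLInt },
    LESS_THAN: { type: GraphQLInt },
  },
});

const OffersFiltersType = new GraphQLInputObjectType({
  name: 'Filters',
  fields: {
    status: { type: new GraphQLList(GraphQLString) },
    id: { type: IdFilterInput },
  },
});

// In the Query definition, the argument then becomes:
// filters: { type: OffersFiltersType }

Because GraphQL coerces a single input value into a one-element list, status: "approved" and status: ["approved", "pending"] would both be accepted by the GraphQLList(GraphQLString) field.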