PactumJS Data:Template #OVERRIDE# limited to top-level JSON - javascript

Having a real problem with overriding a field in a data template.
It works fine with top-level JSON fields, but second-level or nested fields don't get overridden.
I have a request body that looks like this:
{
"method": "validateUserEmail",
"parameters": {
"email": "email#addr.ess"
}
}
stash.addDataTemplate():
stash.addDataTemplate({
'Generic1ParamRequestBody': {
"method": "validateUserEmail",
"parameters": {
"email": ""
}
}
});
**Call to override the method field:**
.withJson({
'#DATA:TEMPLATE#': 'Generic1ParamRequestBody',
'#OVERRIDES#': {
'method': 'validateUserEmail' // WORKS
}
})
**Call to override the email field:**
.withJson({
'#DATA:TEMPLATE#': 'Generic1ParamRequestBody',
'#OVERRIDES#': {
'email': 'email#addr.ess' // DOESN'T WORK
}
})
**All I get from the above is:**
"body": {
"method": "validateUserEmail",
"parameters": {
"email": ""
},
"email": "auto#api.test"
},
It's as if it's not smart enough to look for the email field at the second level of nesting.
I've tried a JSON path ('parameters.email') and replacing the entire parameters field with JSON.stringify({ parameters: { email: 'email#addr.ess' } }), but no luck at all.
Can anyone spot anything I am missing or doing daftly (instead of deftly)?

You are missing the parameters field. A nested field has to be overridden through its parent object:
.withJson({
'#DATA:TEMPLATE#': 'Generic1ParamRequestBody',
'#OVERRIDES#': {
'parameters': {
"email": "abc"
}
}
})
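To see why the top-level override doesn't reach the nested field, here is a simplified sketch of how a template/override merge behaves. This is an illustration of the merging idea, not PactumJS's actual implementation: override keys are matched against the template's keys at each level, so a key that doesn't exist at the top level just gets added there instead of replacing the nested one.

```javascript
// Illustration only: a simplified recursive merge, not PactumJS internals.
function applyOverrides(template, overrides) {
  const result = { ...template };
  for (const [key, value] of Object.entries(overrides)) {
    const isObj = (v) => v !== null && typeof v === 'object' && !Array.isArray(v);
    if (isObj(value) && isObj(result[key])) {
      // Both sides are objects: recurse so nested keys can be replaced.
      result[key] = applyOverrides(result[key], value);
    } else {
      // Scalars (or keys unknown to the template) land at this level.
      result[key] = value;
    }
  }
  return result;
}

const template = {
  method: 'validateUserEmail',
  parameters: { email: '' },
};

// Overriding 'email' at the top level just adds a stray top-level key:
const wrong = applyOverrides(template, { email: 'email#addr.ess' });

// Overriding through 'parameters' reaches the nested field:
const right = applyOverrides(template, {
  parameters: { email: 'email#addr.ess' },
});
```

This mirrors the observed output above: the `wrong` result keeps `parameters.email` empty and grows a top-level `email` key, while the `right` result replaces the nested value.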

Related

Elasticsearch, how to get a search result in all circumstances

I've been dealing with a project. My goal is to get a result from the search engine in all circumstances; for example, even if I enter a keyword that doesn't match the keys inside the data, or an empty string, I still need to get some result. How can I reach my goal?
you can see the query below :
query: {
regexp: {
title: "something to not found .*",
},
},
Try using "prefix" or "query_string".
You can also use title.keyword for an exact value.
1 -
{
"query": {
"prefix": {
"title": {
"value": "<data>"
}
}
}
}
2 -
{
"query": {
"query_string": {
"default_field": "title",
"query": "<data>*^0"
}
}
}
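For use from Node.js, the two suggested bodies can be built as plain objects and passed to the client's search call. This is a sketch; the client setup and the index name are assumptions for illustration, and the bodies simply reproduce the two JSON queries above.

```javascript
// Body for the "prefix" form: matches titles that start with the given text.
function prefixQuery(text) {
  return { query: { prefix: { title: { value: text } } } };
}

// Body for the "query_string" form: wildcard match against the title field.
function queryStringQuery(text) {
  return {
    query: {
      query_string: {
        default_field: 'title',
        query: text + '*',
      },
    },
  };
}

// With the JS client this would be used roughly as (index name assumed):
// client.search({ index: 'sample', body: prefixQuery('some') })
```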

google docs api documents.get multiple fields ? (nodejs)

How can I select multiple fields when using the documents.get ?
Right now I am getting the document like this:
const doc = await docs.documents.get({
documentId: copiedFile.data.id,
fields: 'body/content'
});
which returns this:
"data": {
"body": {
"content": [...]
}
}
However, I also need to get the inlineObjects, and the only way so far I have been able to do so is by removing the fields property completely:
const doc = await docs.documents.get({
documentId: copiedFile.data.id,
});
Then I get this:
"data": {
"title": "Some document title",
"body": {
"content": [...]
},
"headers": {
},
"footers": {
},
"documentStyle": {
},
"namedStyles": {
},
"lists": {
},
"revisionId": "some-long-id",
"suggestionsViewMode": "SUGGESTIONS_INLINE",
"inlineObjects": {
},
"documentId": "some-long-id"
}
But I am really only interested in data.body.content and data.inlineObjects
When selecting everything the response is many thousands of lines of json larger, which I don't want.
I have tried fields: ['body/content', 'inlineObjects'] but that only returns body.content and not the inlineObjects - also the documentation doesn't mention this syntax anywhere, it was just to experiment.
I think it doesn't return any inlineObjects when you don't have any inlineObjects in the document. To confirm that the format is working and the statement above is true, try using other fields whose values are confirmed to be returned, such as revisionId or title.
Test:
const doc = await docs.documents.get({
documentId: copiedFile.data.id,
fields: 'body/content,inlineObjects'
});
Output:

What is the correct way to handle validation with json-schema in objection.js?

I want to understand if using json-schema as my main validation tool would be a good choice. The main thing I'm not sure about is whether I should use it to validate the query params in my API besides it being used to validate DB input.
I'll try to provide a better explanation with code examples below:
Let's start with a json-schema for the User model:
{
"$schema": "http://json-schema.org/draft-07/schema",
"type": "object",
"title": "User",
"description": "User schema",
"required": ["email", "full_name", "password"],
"additionalProperties": false,
"properties": {
"id": {
"$id": "#/properties/id",
"title": "User ID",
"type": "integer"
},
"email": {
"$id": "#/properties/email",
"title": "User email. Must be unique",
"type": "string",
"format": "email"
},
"password": {
"$id": "#/properties/password",
"title": "Hashed password for the user",
"type": "string",
"maxLength": 128,
"pattern": "^(?=.*[a-z])(?=.*[A-Z])(?=.*\\d)(?=.*[#$!%*?&])[A-Za-z\\d#$!%*?&]{8,}$"
},
"full_name": {
"$id": "#/properties/full_name",
"title": "User first name and last name",
"type": "string",
"maxLength": 128,
"pattern": "^[a-zA-Z]+(?:\\s[a-zA-Z]+)+$"
},
"image_url": {
"$id": "#/properties/image_url",
"title": "URL to user image",
"type": "string"
},
"created_at": {
"$id": "#/properties/created_at",
"title": "The creation date of the user",
"type": "string"
},
"updated_at": {
"$id": "#/properties/updated_at",
"title": "The date the user was last updated",
"type": "string"
}
}
}
As you can see, I'm using regex to validate the input for each field to ensure the format is correct. I can specify which fields are required, which is very useful, and I set additionalProperties to false, which means that the schema/Objection will not accept properties that are not specified in the JSON schema.
Next let's take a look at an example of a registration API that I'm trying to use:
router.post("/registration", async (req, res, next) => {
try {
const { password, ...payload } = await User.query().insert(req.body);
const token = await jwt.sign(payload);
res.json({ user: payload, token });
} catch (error) {
next(error);
}
});
So there's no validation of the request body in the route itself, or really in any other place, I'm trying to delegate the validation entirely to json-schema. When the request comes in, the password is not hashed so it can pass the validation, but then I need a way of storing the hashed password.
Currently I'm using this solution but I'm not sure if it's smart/safe?
// User model
async $beforeInsert(queryContext) {
this.$setJson(
{ password: await bcrypt.hash(this.password, 12) },
{ skipValidation: true }
);
await super.$beforeInsert(queryContext);
}
This would enable the following:
validation to check for the correct params (full_name, email, password) and test whether the values are correct
after the validation passes, update the model with the hashed password and skip another validation as it's already been run
insert (safe?) data in the db
Now let's look at the login route:
router.post("/login", async (req, res, next) => {
try {
User.fromJson(req.body, { skipRequiredFields: ["full_name"] });
const { email, password } = req.body;
const user = await User.query().where({ email }).first();
if (!user) {
res.status(403);
throw new Error("Invalid credentials");
}
if (!(await bcrypt.compare(password, user.password))) {
res.status(403);
throw new Error("Invalid credentials");
}
const payload = { id: user.id, email, full_name: user.full_name };
const token = await jwt.sign(payload);
res.json({ user: payload, token });
} catch (error) {
next(error);
}
});
Because I want to leave the validation to the json-schema and I need to check the request payload, I have to create a model that is redundant for the route itself; it's only used to trigger the validation. Another reason to do this is that .where({ email }) doesn't trigger the validation, and it throws an error when email is undefined.
Code to remove specific required fields from the schema when needed:
// User model
$beforeValidate(jsonSchema, _, { skipRequiredFields }) {
if (skipRequiredFields) {
return {
...jsonSchema,
required: jsonSchema.required.filter(
(fieldName) => !skipRequiredFields.includes(fieldName)
),
};
}
}
This allows me to remove required fields from the validation schema. In this case it works perfectly for the /login route as I don't want the request body to have a full_name field.
So far this has been working for me, but it feels more like a workaround than an actual solution. I am also not entirely sure about the fact that I have to tap into Objection hooks and override the password like that. The last thing I don't like is that I have to create a redundant model just to trigger the validation, but I also understand that triggering a validation (or having the option to) on .where() doesn't make much sense.
I'm new to Objection, and this is not a production project but rather a side project I'm using to explore Objection alongside other frameworks/libraries. I'd like to know if there are better ways of doing this, or maybe my thinking is entirely wrong and I should have separate validation for the request body and leave the json-schema as DB validation only?
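One common direction for the "separate request validation" idea is to validate req.body against the same JSON schema before it ever reaches the model, keeping a single source of truth. The sketch below is a deliberately minimal, hand-rolled illustration of that idea (a real project would more likely compile the schema with a full validator such as Ajv); the simplified email pattern and the loginSchema shape are assumptions for the example, with the password pattern taken from the User schema above.

```javascript
// Minimal request-body check derived from a json-schema-like shape:
// verifies required fields are present and string patterns match.
// Illustration only -- not a complete JSON Schema validator.
const loginSchema = {
  required: ['email', 'password'],
  properties: {
    // Simplified email pattern (assumption for the sketch):
    email: { pattern: '^[^@\\s]+@[^@\\s]+$' },
    // Password pattern copied from the User schema above:
    password: {
      pattern:
        '^(?=.*[a-z])(?=.*[A-Z])(?=.*\\d)(?=.*[#$!%*?&])[A-Za-z\\d#$!%*?&]{8,}$',
    },
  },
};

function validateBody(schema, body) {
  const errors = [];
  for (const field of schema.required) {
    if (body[field] === undefined) errors.push(`${field} is required`);
  }
  for (const [field, rules] of Object.entries(schema.properties)) {
    const value = body[field];
    if (typeof value === 'string' && rules.pattern &&
        !new RegExp(rules.pattern).test(value)) {
      errors.push(`${field} has invalid format`);
    }
  }
  return errors;
}
```

A route could then call validateBody(loginSchema, req.body) and respond with 400 on errors, leaving the model's json-schema in place as the DB-level safety net and removing the need for the throwaway fromJson call.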

How do I query an index properly with Dynamoose

I'm using Dynamoose to simplify my interactions with DynamoDB in a node.js application. I'm trying to write a query using Dynamoose's Model.query function that will search a table using an index, but it seems like Dynamoose is not including all of the info required to process the query and I'm not sure what I'm doing wrong.
Here's what the schema looks like:
const UserSchema = new dynamoose.Schema({
"user_id": {
"hashKey": true,
"type": String
},
"email": {
"type": String,
"index": {
"global": true,
"name": "email-index"
}
},
"first_name": {
"type": String,
"index": {
"global": true,
"name": "first_name-index"
}
},
"last_name": {
"type": String,
"index": {
"global": true,
"name": "last_name-index"
}
}
})
module.exports = dynamoose.model(config.usersTable, UserSchema)
I'd like to be able to search for users by their email address, so I'm writing a query that looks like this:
Users.query("email").contains(query.email)
.using("email-index")
.all()
.exec()
.then( results => {
res.status(200).json(results)
}).catch( err => {
res.status(500).send("Error searching for users: " + err)
})
I have a global secondary index defined for the email field.
When I try to execute this query, I'm getting the following error:
Error searching for users: ValidationException: Either the KeyConditions or KeyConditionExpression parameter must be specified in the request.
Using the Dynamoose debugging output, I can see that the query winds up looking like this:
aws:dynamodb:query:request - {
"FilterExpression": "contains (#a0, :v0)",
"ExpressionAttributeNames": {
"#a0": "email"
},
"ExpressionAttributeValues": {
":v0": {
"S": "mel"
}
},
"TableName": "user_qa",
"IndexName": "email-index"
}
I note that the actual query sent to DynamoDB does not contain KeyConditions or KeyConditionExpression, as the error message indicates. What am I doing wrong that prevents this query from being written correctly such that it executes the query against the global secondary index I've added for this table?
As it turns out, calls like .contains(text) are used as filters, not query parameters. DynamoDB can't figure out if the text in the index contains the text I'm searching for without looking at every single record, which is a scan, not a query. So it doesn't make sense to try to use .contains(text) in this context, even though it's possible to call it in a chain like the one I constructed. What I ultimately needed to do to make this work is turn my call into a table scan with the .contains(text) filter:
Users.scan({ email: { contains: query.email }}).all().exec().then( ... )
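For contrast, an exact match on the index's hash key does work as a query, because it translates into a KeyConditionExpression rather than a FilterExpression. The sketch below builds the raw DynamoDB params for such a query (table and index names taken from the question; it does not call AWS):

```javascript
// What a valid exact-match query against the GSI looks like at the DynamoDB
// level: the index's hash key appears in KeyConditionExpression, unlike the
// contains() case above, which only produced a FilterExpression.
function buildEmailQuery(email) {
  return {
    TableName: 'user_qa',
    IndexName: 'email-index',
    KeyConditionExpression: '#e = :email',
    ExpressionAttributeNames: { '#e': 'email' },
    ExpressionAttributeValues: { ':email': { S: email } },
  };
}
```

In Dynamoose this corresponds to an exact-match query along the lines of Users.query("email").eq(someEmail).using("email-index").exec(), which succeeds where the contains() version fails.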
I am not too familiar with Dynamoose, but the following code will update a record using Node.js and DynamoDB. Note the Key parameter below; judging by the error message you got, it seems you are missing this.
To my knowledge, you must specify a key for an update request. You can check the AWS DynamoDB docs to confirm.
var params = {
TableName: table,
Key: {
"id": customerID,
},
UpdateExpression: "set customer_name = :s, customer_address = :p, customer_phone = :u",
ExpressionAttributeValues: {
":s": customer_name,
":p": customer_address,
":u": customer_phone
},
ReturnValues: "UPDATED_NEW"
};
await docClient.update(params).promise();

Multi-field search in the elasticsearch-js library

I am using elasticsearch (managed instance from searchly) with the elasticsearch-js npm client library. I want to search multiple fields in my index from a term. There seems to be loads of documentation on this, e.g.
GET /_search
{
"query": {
"bool": {
"should": [
{ "match": { "title": "War and Peace" }},
{ "match": { "author": "Leo Tolstoy" }}
]
}
}
}
where I could just set the same value for author and title. However, this is a GET request and is structured differently from the Node.js library, where I am doing this:
this._client.search({
index: 'sample',
body: {
query: {
match: {
name: 'toFind'
}
}
}
}).then(function (resp) {
var hits = resp.hits.hits;
}, function (err) {
console.trace(err.message);
});
I can't have multiple match fields, otherwise tsc complains in strict mode, and if I try something like:
query: {
match: {
name: 'toFind',
description: 'toFind'
}
}
then I get an error:
"type": "query_parsing_exception",
"reason": "[match] query parsed in simplified form, with direct field name, but included more options than just the field name, possibly use its \u0027options\u0027 form, with \u0027query\u0027 element?",
Since you want to match the same string on multiple fields, you need a multi_match query. Try something like this:
this._client.search({
index: 'sample',
body: {
query: {
multi_match: {
query: 'toFind',
fields: ['name','description']
}
}
}
}).then(function (resp) {
var hits = resp.hits.hits;
}, function (err) {
console.trace(err.message);
});
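For completeness, the bool/should form from the documentation snippet in the question can also be expressed as a body object for the JS client. When the same term is matched against both fields, it behaves much like the multi_match above (field names reuse the question's name/description):

```javascript
// The bool/should form as a client body: one match clause per field,
// documents matching either clause are returned.
function boolShouldBody(term) {
  return {
    query: {
      bool: {
        should: [
          { match: { name: term } },
          { match: { description: term } },
        ],
      },
    },
  };
}

// this._client.search({ index: 'sample', body: boolShouldBody('toFind') })
```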
