I have a couple of JS syntax questions.
First, what's a good resource to get up to speed with JS syntax like the snippets below?
In the code, are Q1 and Q2 labels? Also, what is the ... for?
const Q1: Query = {
  'isChild': {
    $ne: true,
  },
};

const Q2: Query = {
  ...Q1,
  'isL': true,
  'stat': {
    $in: ['1', '2', '3', '4'],
  },
};
Below, is : Promise<Event> similar to a then statement?
async update(event: Event): Promise<Event> {
  debug(`Updating event`, event);
  const { id, ...fields } = event;
  invariant(!!id, 'id is required');
  const fieldsWithTimestamps = withTimestamps<EventFields>(fields);
  debug(`Update ${id}`, fieldsWithTimestamps);
  await collection.updateOne({ _id: id }, fieldsWithTimestamps);
  return { id, ...fieldsWithTimestamps };
}
Thanks for your help!
For learning the syntax of a language quickly, I'm a fan of LearnXinYminutes. Here's their page for JavaScript.
Moreover, your snippets are written in TypeScript, which is a superset of JavaScript that adds type annotations. That's what the : Promise<Event> syntax is: a type annotation indicating that the update function returns a Promise<Event>.
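To illustrate with a minimal sketch (getGreeting is just a made-up name): the annotation only describes the return value, and you still consume the promise with await or .then().

// The return type annotation says that calling getGreeting() yields a
// Promise that resolves to a string; it changes nothing at runtime.
async function getGreeting(name: string): Promise<string> {
  return `Hello, ${name}`;
}

// Consuming the promise is where .then (or await) comes in:
getGreeting('world').then((greeting) => console.log(greeting));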
Finally, Q1 and Q2 are not labels, they are plain objects, and the ... syntax is the object spread: it copies the properties of Q1 into the new Q2 object.
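A small sketch of what the spread does with plain objects (made-up names):

const base = { isChild: { $ne: true } };

// The spread copies base's properties into the new object literal,
// and any keys listed afterwards are added alongside them.
const extended = { ...base, isL: true };
console.log(extended); // { isChild: { $ne: true }, isL: true }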
All of this is relatively well-known JavaScript (barring the TypeScript annotations), and you should familiarize yourself with the language elsewhere rather than asking here. Stack Overflow works better the more specific your question is.
I am developing an application that has quite a sizeable number of queries and mutations. The data structures are often not complex, but there are plenty of them, so I have made myself a snippet that generates the most common things that repeat throughout them. This snippet also generates an input for mutations, so it can be used for both simple and complex data structures. In quite a few instances, the input is just for adding a name. The API is meant to be used mainly by my frontend, but once the app is mature enough it should become publicly available. Is doing this a problem in terms of conventions?
Sample of what I mean
/*=============================================
Types
=============================================*/
interface AddSampleSchemaInput {
  input: AddSampleSchema
}

interface AddSampleSchema {
  name: string
}

/*=============================================
Main
=============================================*/
export const SampleSchemaModule = {
  typeDefs: gql`
    type Mutation {
      addSampleSchema(input: AddSampleSchemaInput): SampleSchema!
    }

    type SampleSchema {
      _id: ID!
      name: String!
    }

    input AddSampleSchemaInput {
      name: String!
    }
  `,
  resolvers: {
    Mutation: {
      addSampleSchema: async (parents: any, args: AddSampleSchemaInput, context: GraphqlContext) => {
      }
    }
  }
}
Sample of what I assume it should be.
/*=============================================
Main
=============================================*/
export const SampleSchemaModule = {
  typeDefs: gql`
    type Mutation {
      addSampleSchema(name: String): SampleSchema!
    }

    type SampleSchema {
      _id: ID!
      name: String!
    }
  `,
  resolvers: {
    Mutation: {
      addSampleSchema: async (parents: any, args: { name: string }, context: GraphqlContext) => {
      }
    }
  }
}

export default SampleSchemaModule
Would usage of the first code example be a problem? That is, using an input (input AddSampleSchemaInput) even if it were to contain just a single value (in this case name).
Or, in other words, is using an input for every mutation a problem, no matter the complexity?
And what about the impact on the frontend:
addDogBreed({
  variables: {
    input: {
      name: "Retriever",
      averageHeight: 0.65
    }
  }
})

addDog({
  variables: {
    input: {
      name: "Charlie"
    }
  }
})

// ======= VS =======

addDogBreed({
  variables: {
    input: {
      name: "Retriever",
      averageHeight: 0.65
    }
  }
})

addDog({
  variables: {
    name: "Charlie"
  }
})
In this case, is having the first one instead of the second one a problem?
Is having an input that only contains one key something problematic?
No, on the contrary, it is something desirable in GraphQL. While the nesting may sometimes seem superfluous, it is key to the forward compatibility and extensibility of your schema. You should not have different conventions for designing your mutation arguments depending on the number of inputs. If you always use an input object, you can easily deprecate existing fields or add new optional fields and stay compatible with all existing clients. If you were to completely change the shape of the mutation arguments just because the object happens to have a single key, it would break compatibility.
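As a sketch of that kind of evolution (the description field here is hypothetical):

# Before: the input only carries a name.
input AddSampleSchemaInput {
  name: String!
}

# Later: an optional field is added. Clients that still send only a name
# keep working, because the mutation signature itself never changed.
input AddSampleSchemaInput {
  name: String!
  description: String
}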
I'm not seeing a problem that would drive you to
"only use GraphQL when dealing with Fetching / Get Data, and normal
REST API Request for mutating data (create, update, delete)."
Like #Bergi said. Plus, you can provide your entity with multiple mutators, some of which can work like a PATCH or a PUT request.
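For example, a rough sketch (hypothetical Dog type and input names): a PUT-style mutation can require every field, while a PATCH-style one makes everything but the id optional.

type Dog {
  id: ID!
  name: String!
  breed: String!
}

type Mutation {
  # PUT-like: replaces the whole dog, so every field is required
  replaceDog(input: ReplaceDogInput!): Dog!
  # PATCH-like: only the provided fields are updated
  updateDog(input: UpdateDogInput!): Dog!
}

input ReplaceDogInput {
  id: ID!
  name: String!
  breed: String!
}

input UpdateDogInput {
  id: ID!
  name: String
  breed: String
}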
I'm on Joi 14 and can't seem to find any upgrade guides for moving to 17. I see people posting similar questions for Joi 16, but the last update was 3 months ago. It doesn't look like type was required back in 16, based on what I see in How to add custom validator function in Joi?.
I am looking at https://joi.dev/api/?v=17.3.0#extensions, and the description of type is "The type of schema. Can be a string, or a regular expression that matches multiple types."
I tried something like this:
const snakeAlpha = joi => {
  return {
    type: 'object',
    name: 'snakeAlpha',
    base: joi.string().regex(/^[a-z]+(_[a-z]+)*$/)
  };
};

const customJoi = Joi.extend({
  type: 'object',
  rules: {
    snakeAlpha
  }
});
It gives me this error:
ValidationError: {
  "type": "object",
  "rules": {
    "snakeAlpha" [1]: "[joi => {\n return {\n type: 'object',\n name: 'snakeAlpha',\n base: joi.string().regex(/^[a-z]+(_[a-z]+)*$/)\n };\n}]"
  }
}

[1] "rules.snakeAlpha" must be of type object
I am confused, since I did say object. I also tried string, since that's what the base is, but it gave the same error message.
Update
I also realize the original example only covers one simple rule that doesn't reference joi (just a regex). I also have validators that reference other custom ones, like the one below. Bonus points for solving this case too.
const arrayKebabAlpha = joi => {
  return {
    type: 'string',
    name: 'arrayKebabAlpha',
    base: joi.array().items(joi.kebabAlpha())
  };
};
The documentation for Joi extensions is disappointingly lacklustre for such a useful feature. Fortunately, a lot of Joi's core is written using extensions, so a lot can be learned from looking at the source.
If I were to write your rule as an extension, it'd be like this:
const customJoi = Joi.extend(joi => ({
  type: 'string',
  base: joi.string(),
  messages: {
    'string.snakeAlpha': '{{#label}} must be snake case'
  },
  rules: {
    snakeAlpha: {
      validate(value, helpers) {
        if (!/^[a-z]+(_[a-z]+)*$/.test(value)) {
          return helpers.error('string.snakeAlpha', { value });
        }
        return value;
      }
    }
  }
}));
Which can be used like:
customJoi.object().keys({
  foo: customJoi.string().snakeAlpha()
});
UPDATE
Whether this is the correct way of working with dependent extensions I'm not sure, but this is how I typically handle them...
I first define my extensions in an array, ensuring dependent extensions are defined first. Then I iterate through the array, re-using the previous customJoi instance so that the next extension includes those defined before it. A simple working example will probably explain this better than I can put into words!
(I've also simplified the extensions to be more in line with how you're used to using them.)
const Joi = require('joi');

let customJoi = Joi;

const extensions = [
  joi => ({
    type: 'snakeAlpha',
    base: joi.string().regex(/^[a-z]+(_[a-z]+)*$/)
  }),
  // this instance of 'joi' will include 'snakeAlpha'
  joi => ({
    type: 'kebabAlpha',
    base: joi.string().regex(/^[a-z]+(-[a-z]+)*$/)
  }),
  // this instance of 'joi' will include 'snakeAlpha' and 'kebabAlpha'
  joi => ({
    type: 'arrayKebabAlpha',
    base: joi.array().items(joi.kebabAlpha())
  })
];

extensions.forEach(extension =>
  customJoi = customJoi.extend(extension));

customJoi.assert(['hello-world'], customJoi.arrayKebabAlpha());
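And a value that doesn't match the pattern fails as expected (the exact wording of the message may differ between Joi versions):

// An underscore is not kebab-case, so validation reports an error.
const { error } = customJoi.arrayKebabAlpha().validate(['hello_world']);
console.log(error.message); // e.g. "[0]" fails to match the required pattern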
Knex's documentation for transactions has code that looks like this:
knex.transaction(function(trx) {
  var books = [
    {title: 'Canterbury Tales'},
    {title: 'Moby Dick'},
    {title: 'Hamlet'}
  ];

  return trx
    .insert({name: 'Old Books'}, 'id')
    .into('catalogues')
    .then(function(ids) {
      // Promise.map here is Bluebird's API, not the native Promise.
      return Promise.map(books, function(book) {
        book.catalogue_id = ids[0];
        // Some validation could take place here.
        return trx.insert(book).into('books');
      });
    });
})
Here on SO I've seen extensive use of the transacting() function, with examples that look like this:
knex.transaction(function(trx) {
  knex('foo')
    .transacting(trx)
    .insert({id: "bar", username: "bar"})
  // etc
})
Knex describes transacting() with examples similar to above:
Used by knex.transaction, the transacting method may be chained to any query and passed the object you wish to join the query as part of the transaction for.
My question is:
What is the difference between trx.insert().into('foo') and knex('foo').transacting(trx).insert() and why would you use one instead of the other?
It is convenient to use .transacting(trx) when you want to perform multiple operations in the same transaction:
knex.transaction(function (trx) {
  return Promise.all([
    knex('foo').insert({ name: 'My Name' }).transacting(trx),
    knex('bar').insert({ field: 'Value' }).transacting(trx)
  ])

  // ---- or something like ----

  return Promise.all(SOME_INPUT_VALUES.map(function (value) {
    return knex('foo_bar')
      .update('lul', value.lul)
      .where('id', value.id)
      .transacting(trx)
  }))
})
I don't really know of a particular use for the other method. It might just be a matter of style. You've got two interfaces and you can pick whichever you like most. As for me, I'm used to .transacting(trx).
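For completeness, here's a sketch of the other style (my own example, not from the Knex docs): the trx object is itself a query builder bound to the transaction, so queries built from it don't need .transacting(trx).

knex.transaction(function (trx) {
  // Both inserts run inside the same transaction and commit together.
  return trx('foo')
    .insert({ id: 'bar', username: 'bar' })
    .then(function () {
      // The same binding done explicitly with .transacting(trx):
      return knex('foo')
        .insert({ id: 'baz', username: 'baz' })
        .transacting(trx);
    });
})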
I typically follow Airbnb's ESLint configuration, and I noticed it was throwing errors for my use of for...in loops. Why is this the case?
I've read into it a bit here, but I would like a more detailed explanation or an example.
const data = {
  color: 'blue',
  movies: 'action',
  hobby: 'football',
};

for (let prop in data) {
  console.log(`prop: ${prop} and value is ${data[prop]}`);
}
// Throws: "Guarding for...in, should be wrapped in an if statement to
// filter unwanted properties from the prototype."

Object.keys(data).forEach((element) => {
  console.log(`prop: ${element} and value is ${data[element]}`);
});
// This is okay
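For context, the guard that the rule asks for looks something like this (using the same data object as above):

for (const prop in data) {
  // Only handle the object's own properties, not ones inherited
  // through the prototype chain.
  if (Object.prototype.hasOwnProperty.call(data, prop)) {
    console.log(`prop: ${prop} and value is ${data[prop]}`);
  }
}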
I have a schema defined in Mongoose and I just realized one attribute is being saved as an object (a kind of hash), but it can contain prohibited characters in its keys. By prohibited I mean those which are not very much liked by MongoDB, causing not okForStorage errors: dots, dollar signs, etc.
As I don't want to change all of my application, I want to define something on my model which reformats the object into an array before passing it to MongoDB and, of course, I also need something that reformats it back when loading such data from MongoDB.
I tried getters and setters and played a while with middleware, but could not make it work. Is there a best practice for this? What would be the best approach? I really wish I could just stick two functions somewhere on the schema and have it be a pure black box for the rest of my app.
UPDATE: What I want to achieve (example):
toMongo = function (mapping) {
  // from {'k': 'v', ...} makes [{key: 'k', value: 'v'}, ...]
  return ...
}

fromMongo = function (mapping) {
  // from [{key: 'k', value: 'v'}, ...] makes {'k': 'v', ...}
  return ...
}

schema = mongoose.Schema({
  mapping: mongoose.Schema.Types.Mixed
});

var Foo = mongoose.model('Foo', schema);
var foo = new Foo({ mapping: {'tricky.key': 'yes', 'another$key': 'no'} });

foo.mapping // results in {'tricky.key': 'yes', 'another$key': 'no'}

foo.save(function(err, doc) {
  // mapping is actually saved as
  // [{key: 'tricky.key', value: 'yes'}, {key: 'another$key', value: 'no'}] in mongo!
  doc.mapping // results in {'tricky.key': 'yes', 'another$key': 'no'}
});

Foo.find(function (err, foos) {
  foos[0].mapping // results in {'tricky.key': 'yes', 'another$key': 'no'}
});
The question is: where should I hook my two magic functions toMongo and fromMongo so that the interface works exactly as shown in the example?
(Disclaimer: at the time this question was asked, I was a Mongoose & Node.js noob, so even silly details could be helpful to me.)
I think I found the answer myself. It can be solved with middleware, this way:
schema.post('init', function (doc) {
  doc.mapping = fromMongo(doc.mapping);
});

schema.pre('save', function (next) {
  this.mapping = toMongo(this.mapping);
  next();
});
This way it's pretty isolated from the rest of the app, and so far I haven't had any problems with this solution. I'll try to keep updating this answer in case any problems arise.
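For completeness, a minimal sketch of the two helpers themselves (my own implementation, assuming the mapping is a flat object of primitive values):

// Hypothetical implementations of the two conversion helpers.
function toMongo(mapping) {
  // {'k': 'v', ...} -> [{key: 'k', value: 'v'}, ...]
  return Object.keys(mapping).map(function (key) {
    return { key: key, value: mapping[key] };
  });
}

function fromMongo(pairs) {
  // [{key: 'k', value: 'v'}, ...] -> {'k': 'v', ...}
  return pairs.reduce(function (mapping, pair) {
    mapping[pair.key] = pair.value;
    return mapping;
  }, {});
}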