I'm really new to Sequelize, but I've used another ORM in another language for a long time. I'm trying to figure out the magic behind getting model timestamps managed automatically, and I feel like I'm going around in circles.
What I'd like is for Sequelize to manage the job of knowing when to update "createdAt" and "modifiedAt" for me. However, I would strongly prefer that it use statement_timestamp() (mostly out of long habit and the advice of a much smarter database person than me). I don't want important values like those to depend on client clocks.
My previous ORM just maintained two prepared statements per table, one for creation and one for update, and the queries updated the right fields the right way because they were coded like that. Of course I could write raw SQL, but that kind of defeats the purpose of using something like Sequelize.
I've found some material online that seems to suggest that if I want to do what I'm describing, I'm on my own, because I would have to turn timestamps off on the models. The post I found is about MySQL, which has a trick to make that work.
I would love to post code, but it's a chicken-and-egg problem. Oh, also: I am creating my schema manually with scripts I run through psql. In the scripts I define the columns like I have been doing for years:
CREATE TABLE unit_type (
    utyp_id serial NOT NULL PRIMARY KEY,
    utyp_name varchar(32) NOT NULL,
    utyp_description varchar(512) NOT NULL,
    created timestamp without time zone NOT NULL DEFAULT statement_timestamp(),
    modified timestamp without time zone NOT NULL DEFAULT statement_timestamp(),
    version SMALLINT NOT NULL DEFAULT 0
);
(I don't care about the column names; "createdAt" would be fine.) My old ORM didn't pay attention to the column defaults; the timestamp stuff was an intrinsic feature of it. (It was home-made so it has no name. It worked though.)
My current code via Sequelize all works fine, but I'm interested in getting the real database stuff worked out, and I just don't understand how the timestamp feature is supposed to work.
You can override Sequelize's default timestamp behavior with hooks.
const sequelize = new Sequelize(
  ...
  {
    host: 'localhost',
    port: 5432,
    dialect: 'postgres',
    hooks: {
      beforeCreate: (record, options) => {
        record.dataValues.createdAt = Sequelize.fn('statement_timestamp');
        record.dataValues.updatedAt = Sequelize.fn('statement_timestamp');
      },
      beforeUpdate: (record, options) => {
        record.dataValues.updatedAt = Sequelize.fn('statement_timestamp');
      }
    }
  }
);
Related
I have a query that uses fetchMore and relayPagination, which works fine for lazy loading with the page and perPage variables. The issue now is that I'm trying to refresh the query whenever I update the other variables used for filtering, like date and type (whose value can be either debited or credited). It fetches the data, but then appends the incoming data to the Apollo cache instead of replacing the old data with the new.
For example, in this sample typePolicy:
PaginatedBooks: {
  fields: {
    allBooks: {
      merge: relayPagination()
    }
  }
},
AllProducts: {
  // Singleton types that have no identifying field can use an empty
  // array for their keyFields.
  keyFields: [],
},
I only want to reset PaginatedBooks.
I've tried using a fetchPolicy of no-cache, but that stops pagination and fetchMore from working, since I can't merge existing and incoming data. I opted to use client.resetStore(): https://www.apollographql.com/docs/react/api/core/ApolloClient/#ApolloClient.resetStore
but this also refetches the other active queries and causes the UI to flicker. So far, looking through the documentation and the GitHub repo, I can't find anything, or anyone who has tried something similar, so I'm hoping I can get some insight and perhaps a better solution. Thanks in advance.
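Edit: one direction I've been wondering about from the field policy docs is keyArgs, so that each combination of filter variables gets its own cached list instead of merging into the old one. Something like the fragment below (the argument names are just my guesses, and I haven't confirmed it plays well with the pagination helper):

```javascript
PaginatedBooks: {
  fields: {
    allBooks: {
      // Key the cached list by the filter arguments, so a new date/type
      // combination starts a fresh list rather than appending pages to
      // the previously cached one.
      keyArgs: ['date', 'type'],
      merge: relayPagination()
    }
  }
}
```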
I will try to be as clear as possible but please ask if I am missing some detail.
So I have an application that I've inherited, and it's running slow. It was built with Angular and TypeScript on the front end, Loopback handling our CRUD operations, and a Mongo DB. We isolated one of the causes of the slowness as excess REST calls (potentially tens of thousands at times), so I am looking at reducing them. The first case I found is relatively small, so I hoped it would help me learn, but I've already hit a challenge. Descriptions, table names, and finer detail have been removed for simplicity.
On the first page, we present to the user a list of links to areas that they have access to. This is stored in an "Areas" table. We pull back all of the areas in a function similar to this:
this.ourApi.areas.find({
  "filter": {
    "include": ["generalInfo"]
    /*
      This table is related to most tables and
      holds things like version numbers - it
      is not relevant to this question
    */
  }
}).$promise.then((returnedAreas) => {
  /* Stuff */
});
After we have all the areas, we then have to get some configuration values for these area links (icons, mostly). Our configuration table holds configuration data for a number of items, not just Areas. So to get the relevant config information, we loop through the returnedAreas and, for each area, get the config where the relatedId matches the areaId. A bit like this:
return this.ourApi.configs.find({
  "filter": {
    "where": {
      "relatedId": areaId
    },
    "include": ["generalInfo"]
  }
}).$promise.then((areaConfigurations) => {
  /* Stuff */
});
Ideally, I'd like to compress this down to just one call (and then mimic that pattern throughout the application). The only things we really care about (and the only objects we actually use) are the configuration objects, so we don't need the "Areas" objects at all. The challenge I am unable to solve is how to adjust that second call above (the one that gets the config) so that it first gets the Areas and then uses the areaId in the query that gets the data I care about. In SQL this would be quite easy, but I can't figure it out from the docs.
Essentially, I am looking for the Loopback equivalent of:
SELECT * FROM configs WHERE relatedId IN (SELECT areaId FROM areas)
Is this possible?
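For what it's worth, the closest I've constructed myself collapses the per-area loop into a single configs call by collecting the ids first and using Loopback's inq operator (its analogue of SQL's IN), though that is still two REST calls rather than one. A sketch of how I'd build the second filter, with the same property names as the snippets above:

```javascript
// Build the configs filter from the areas returned by the first call.
function buildConfigFilter(returnedAreas) {
  const areaIds = returnedAreas.map((area) => area.areaId);
  return {
    "filter": {
      "where": { "relatedId": { "inq": areaIds } },
      "include": ["generalInfo"]
    }
  };
}

// e.g. this.ourApi.configs.find(buildConfigFilter(returnedAreas))
console.log(JSON.stringify(buildConfigFilter([{ areaId: 1 }, { areaId: 7 }])));
// → {"filter":{"where":{"relatedId":{"inq":[1,7]}},"include":["generalInfo"]}}
```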
I am using node.js with bookshelf as an ORM. I am a serious novice with this technology.
I have a situation where I have several columns in a database table. For the sake of this question, these columns shall be named 'sold_by_id', 'signed_off_by_id' and 'lead_developer_id', and are all columns that will reference a User table with an ID.
In other words, different Users in the system could at any point be associated with three different roles, not necessarily uniquely.
Going forward, I would need to be able to retrieve information in such ways as:
let soldByLastName = JobTicket.soldBy.get('last_name');
I've tried searching around and reading the documentation but I'm still very uncertain about how to achieve this. Obviously the below doesn't work and I'm aware that the second parameter is meant for the target table, but it illustrates the concept of what I'm trying to achieve.
// JobTicket.js
soldBy: function() {
  return this.belongsTo(User, 'sold_by_id');
},
signedOffBy: function() {
  return this.belongsTo(User, 'signed_off_by_id');
},
leadDeveloper: function() {
  return this.belongsTo(User, 'lead_developer_id');
}
Obviously I would need a corresponding set of methods in User.js
I'm not sure where to start, can anyone point me in the right direction??
Or am I just a total idiot? ^_^
Your definitions look right. Using them will look something like:
new JobTicket({ id: 33 })
  .fetch({ withRelated: ['soldBy', 'signedOffBy'] })
  .then(jobTicket => {
    console.log(jobTicket.related('soldBy').get('last_name'));
  });
Besides that, I would recommend using the Registry plugin for referencing other models. It eases the pain of referencing models that are not yet loaded.
I've been searching a lot about Sails.js multi-tenancy capabilities, and I know that such a feature is not yet implemented. My initial idea was to build a multi-tenant app by creating one database per tenant.
Since I realized that I can't do such a thing in Sails.js yet, I tried a different approach: creating only one database (Postgres) but with lots of schemas, each one representing a tenant. My problem is that I can't work out (and don't even know whether it's possible with the Sails/Postgres adapter) how to define dynamically, at runtime, which schema a given object should query against, based on the logged-in user.
Has anyone faced a problem like this? How can I proceed?
Sorry for my English, and thanks.
In my experience, adding it in the model does not work.
The only thing that worked for me was using the meta call to specify the schema:
await Users.create(newUser).meta({ schemaName: 'admin' });
A bit cumbersome, but it is working.
Hope this helps someone.
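To make the schema dynamic per logged-in user, as the question asks, the schemaName passed to meta can simply come from the request. A small helper sketch (where the tenant name lives on the session is an assumption; adapt it to your app):

```javascript
// Derive the schema name for the current request, falling back to the
// adapter's default of 'public' when no tenant is present.
function tenantSchema(session) {
  return (session && session.tenant) ? session.tenant : 'public';
}

// Then, per query (inside a Sails action):
//   await Users.create(newUser).meta({ schemaName: tenantSchema(req.session) });
console.log(tenantSchema({ tenant: 'acme' })); // → acme
console.log(tenantSchema(null)); // → public
```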
I think this is an issue with the waterline-sequel adapter, based on this answer.
The suggested way is to add a property in the model:
meta: {
  schemaName: 'schema'
},
but it is not working: you can't define multiple schemas, and it only takes the user as the schema. If the schema property is set to true in config/models.js, defining a schema for every table still does not work.
The clue is inside the sails-postgresql adapter code - several of its helpers include this bit:
var schemaName = 'public';
if (inputs.meta && inputs.meta.schemaName) {
  schemaName = inputs.meta.schemaName;
} else if (inputs.datastore.config && inputs.datastore.config.schemaName) {
  schemaName = inputs.datastore.config.schemaName;
}
So indeed the driver looks for a schema named public by default, unless a different value is provided via a call to meta() as described above, or the schema name is configured application-wide.
To configure the schema name for all models, a schemaName property needs to be included in the configuration of the postgresql datastore, which lives in config/datastores.js:
...
default: {
  adapter: 'sails-postgresql',
  url: 'postgresql://username:password@localhost:5432/your_database_name',
  schemaName: 'your_schema_name_here'
}
Once this is in place, you don't have to append meta({ schemaName: 'blah'}) to any of the queries. I struggled with this for a couple of days and have finally solved it in this manner.
I'm building a node.js app and I'm evaluating Sequelize.js for persistent objects. One thing I need to do is publish new values when objects are modified. The most sensible place to do this would seem to be using the afterUpdate hook.
It almost works perfectly, but when I save an object the hook is passed ALL the values of the saved object. Normally this is desirable, but to keep the publish/subscribe chatter down, I would rather not republish fields that weren't saved.
So for instance, running the following
tasks[0].updateAttributes({assignee: 10}, ['assignee']);
Would automagically publish the new value for the assignee for that task on the appropriate channel, but not republish any of the other fields, which didn't change.
The closest I've come is with an afterUpdate hook:
Task.hook('afterUpdate', function(task, fn) {
  Object.keys(task).forEach(function publishValue(key) {
    pubSub.publish('Task:' + task.id + '#' + key, task[key]);
  });
  return fn();
});
which is pretty straightforward, but since the 'task' object has all the fields, I'm being unnecessarily noisy. (The pubSub system is ignorant of previous values and I'd like to keep it that way.)
I could override the setters in the task object (and all my other objects), but I would prefer not to publish until the object is saved. The object to be saved doesn't seem to have the old values (that I can find), so I can't base my publish on that.
So far the best answer I've come up with from a design standpoint is to tweak one line of dao.js to add the saved values to the returned object, and use that in the hook:
self.__factory.runHooks('after' + hook, _.extend({}, result.values, { savedVals: args[2] }), function(err, newValues) {
and then, in the hook:
Task.hook('afterUpdate', function(task, fn) {
  Object.keys(task.savedVals).forEach(function publishValue(key) {
    pubSub.publish('Task:' + task.id + '#' + key, task[key]);
  });
  return fn();
});
Obviously changing the Sequelize library is not ideal from a maintenance standpoint.
So my question is twofold: is there a better way to get the needed information to my hook without modifying dao.js, or is there a better way to attack my fundamental requirement?
Thanks in advance!
There is not, currently. When implementing exactly what you describe, we simply had to add logic that compares the old and new values and, where they differ, assumes they have changed.
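A minimal sketch of that comparison logic, independent of Sequelize (the function name is illustrative): keep the attribute values from before the save, and publish only the keys whose values differ afterwards.

```javascript
// Return the keys whose values differ between two snapshots of a row.
function changedKeys(previous, current) {
  return Object.keys(current).filter((key) => previous[key] !== current[key]);
}

const before = { id: 1, title: 'Fix bug', assignee: 7 };
const after = { id: 1, title: 'Fix bug', assignee: 10 };

console.log(changedKeys(before, after)); // → [ 'assignee' ]
```

In practice you would capture the previous values yourself (for example in a beforeUpdate hook) and run this in afterUpdate, publishing only the returned keys.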