I'm making a Node API using Prisma ORM and I'm trying to update a column that I set with the type DateTime. Here is the model; the column is the deleted_at one:
model Employee {
  id         Int      @id @default(autoincrement())
  name       String
  created_at DateTime
  deleted_at DateTime
}
How can I set it to the current time in my controller?
The controller looks like this:
export const DeleteCompany = async (req: IEmployee, res: Response) => {
  const data: ICompany = req

  const deletedCompany = await prisma.employee.update({
    where: {
      id: Number(data.id)
    },
    data: {
      deleted_at: // what should I put here?
    }
  })

  return res.status(200).json(deletedCompany)
}
I've tried using
now()
but it didn't work.
Prisma accepts plain JavaScript Date objects when setting date fields, so new Date() should work fine:
deleted_at: new Date()
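For context, here is a minimal sketch of the controller with that value plugged in; it assumes the same prisma client, IEmployee/ICompany types, and route wiring as in the question:

export const DeleteCompany = async (req: IEmployee, res: Response) => {
  const data: ICompany = req

  // soft delete: stamp the row with the current time
  const deletedCompany = await prisma.employee.update({
    where: { id: Number(data.id) },
    data: { deleted_at: new Date() }
  })

  return res.status(200).json(deletedCompany)
}

Prisma converts the Date to the column's native DateTime type, so no manual formatting should be needed.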
Related
I'm building a Spotify clone and I'm trying to add a song to a playlist, but my query doesn't work. Up to this point everything went fine following the Prisma docs, but I cannot get this query right; every time I get an error. If someone can show me how to do this with an example, I'll be very grateful.
My question is: given this schema, how can I add a song to a playlist? There are two models affected by the query, Song (where I am trying to add) and Playlist.
My schema:
generator client {
  provider = "prisma-client-js"
}

datasource db {
  provider          = "postgresql"
  url               = env("DATABASE_URL")
  shadowDatabaseUrl = env("SHADOW_DATABASE_URL")
}

model User {
  id        Int        @id @default(autoincrement())
  createdAt DateTime   @default(now())
  updatedAt DateTime   @updatedAt
  email     String     @unique
  firstName String
  lastName  String
  password  String
  playlists Playlist[]
}

// here
model Song {
  id        Int        @id @default(autoincrement())
  createdAt DateTime   @default(now())
  updatedAt DateTime   @updatedAt
  name      String
  artist    Artist     @relation(fields: [artistId], references: [id])
  artistId  Int
  playlists Playlist[]
  duration  Int
  url       String
}

model Artist {
  id        Int      @id @default(autoincrement())
  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt
  songs     Song[]
  name      String   @unique
}

// here
model Playlist {
  id        Int      @id @default(autoincrement())
  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt
  name      String
  songs     Song[]
  user      User     @relation(fields: [userId], references: [id])
  userId    Int
}
I am trying to add the song like this:
let songId = 1;
let playlistId = 1;
let lists;

// get the playlists the song is part of
lists = await prisma.song.findFirst({
  select: {
    playlists: true
  },
  where: {
    id: +songId
  }
})

// get the playlist data I need
const list = await prisma.playlist.findUnique({
  where: {
    id: playlistId
  }
})

// create the array for the update with the existing data
// plus the data I want to add
lists = [ ...lists.playlists, list ]

// trying to update the old array with the new data (lists)
// this is what I'm doing wrong, help please
await prisma.song.update({
  where: { id: +songId },
  data: {
    playlists: lists
  }
})
After many tries I finally got what I want. If anyone knows a better way, please tell me, I want to learn. For now, this is my solution:
I need to send each value as id: playlistId.
const song = await prisma.song.findUnique({
  select: {
    playlists: true
  },
  where: {
    id: +songId
  }
})

// get an array of objects, id: playlistId
const songPlaylistsIds = song.playlists.map(playlist => ({ id: playlist.id }))

// prepare the array with the content that already exists
// plus the new content that I want to add
const playlists = [...songPlaylistsIds, { id: playlistId }]

await prisma.song.update({
  where: { id: +songId },
  data: {
    playlists: {
      // finally, for each object in the array I send { id: playlistId } and it works
      set: playlists.map(playlistSong => ({ ...playlistSong }))
    }
  }
})
Problems I had doing this: I was wrong in thinking it would work as simply as playlists: lists. I wanted to replace the content with a new array, but I couldn't; I needed to send the values one by one.
Another error was in how I got the content of the playlists: I had the full objects but only needed to send the id.
And lastly, the Prisma documentation mentions methods like set and push, but I couldn't get push to work, or at least I don't know how to make it work.
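For what it's worth, with the implicit many-to-many relation between Song and Playlist in the schema above, Prisma's nested connect should attach a single playlist without re-reading and re-sending the whole list. A sketch under that assumption, reusing the songId and playlistId variables from the question:

// connect adds one relation and leaves the playlists already attached untouched
await prisma.song.update({
  where: { id: +songId },
  data: {
    playlists: {
      connect: { id: playlistId }
    }
  }
})

Unlike set, connect never removes existing relations, so the extra findUnique to rebuild the array isn't needed.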
I'm using TypeORM with an Entity that looks something like this:
@Entity('users')
export class UserEntity extends BaseEntity {
  @PrimaryColumn()
  id: string;

  @CreateDateColumn({ type: 'timestamp' })
  createdAt: Date;

  @UpdateDateColumn({ type: 'timestamp' })
  updatedAt: Date;
}
However, when I try to do any timestamp-equality SQL query using the TypeORM entity's repository, it does not work properly. For example, the query:
const id = 'dbe9e81d-aefa-459d-8460-707ade0fa156';
const userEntity = userRepository.findOne(id); // UserEntity(...)
const sameUserEntity = userRepository.findOne({ where: { createdAt: userEntity.createdAt } }); // undefined
Returns the correct entity for userEntity but undefined for sameUserEntity. I looked at the SQL that TypeORM constructed for this query, and it looks like this:
SELECT "UserEntity"."id" AS "UserEntity_id",
"UserEntity"."created_at" AS "UserEntity_created_at",
"UserEntity"."updated_at" AS "UserEntity_updated_at"
FROM "users" "UserEntity"
WHERE "UserEntity"."created_at" = $1 LIMIT 1 -- PARAMETERS: ["2022-02-19T22:10:13.564Z"]
It seems like TypeORM is not converting the JavaScript Date object to the correct PostgreSQL timestamp format. The timestamp in the database looks like 2022-02-19 22:10:13.564432, which is a completely different format and has higher precision.
Is there a specific way I should be doing timestamp related searches when using TypeORM?
Note: I've tried to look for people having this same issue, but I don't see any clear solution. I'm trying to implement cursor-based pagination around the createdAt date, but the greater-than and less-than operators are not working properly either.
I recently ran into the same problem and fixed it by adding precision: 3 to the column decorator. Please note that this is based on the assumption that you don't need that level of precision to begin with.
@Entity('users')
export class UserEntity extends BaseEntity {
  @PrimaryColumn()
  id: string;

  @CreateDateColumn({
    type: 'timestamp',
    precision: 3
  })
  createdAt: Date;

  @UpdateDateColumn({
    type: 'timestamp',
    precision: 3
  })
  updatedAt: Date;
}
Model:
const fooSchema = new mongoose.Schema({
  fooCreationDate: {
    type: Date
  },
  bar: [{
    barCreationDate: {
      type: Date
    }
  }]
});

const foo = mongoose.model(`foo`, fooSchema);
If we want to search for foo objects that were created between 2022-01-01 and 2022-01-02, we can use the following mongoose query:
foo.find({
  fooCreationDate: {
    $gte: "2022-01-01T00:00:00.000",
    $lt: "2022-01-02T00:00:00.000"
  }
});
Please note that I'm using strings instead of date objects. The reason is that the query is passed by the client through an AJAX call with dataType: "jsonp". Every date object that is passed like that to the backend is automatically converted to an ISO string. Despite that, the query works without any issues - the find function automatically parses dates represented as ISO strings.
We'd now like to extract every bar object that was created in the same time range, so we'll need to use an aggregation:
foo.aggregate([{
  $unwind: `$bar`,
}, {
  $match: {
    "bar.barCreationDate": {
      $gte: "2022-01-01T00:00:00.000",
      $lt: "2022-01-02T00:00:00.000"
    }
  }
}]);
Unfortunately, nothing is found despite the fact that the database contains matching bar objects. This can be confirmed by passing Date objects instead of strings to the $match aggregation:
foo.aggregate([{
  $unwind: `$bar`,
}, {
  $match: {
    "bar.barCreationDate": {
      $gte: new Date("2022-01-01T00:00:00.000"),
      $lt: new Date("2022-01-02T00:00:00.000")
    }
  }
}]);
This query returns some results, so the conclusion is that mongoose accepts ISO date strings in the find function, but can't handle them in the aggregate function.
Is there any known workaround? I could, for example, deep-scan every query object passed from the client and search for ISO date strings, then convert them to Date objects, but that's a bit dirty in my opinion. I'm using mongoose v5.6.4 and mongodb v4.2.2.
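For reference, the deep-scan workaround mentioned above could look roughly like this; it is only a sketch, and the ISO-string regex, the reviveDates name and the pipelineFromClient variable are assumptions, not part of any library:

// recursively replace ISO-8601 date strings with Date objects in a client-supplied query/pipeline
const ISO_DATE = /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d+)?(Z|[+-]\d{2}:\d{2})?$/;

function reviveDates(value) {
  if (typeof value === "string" && ISO_DATE.test(value)) {
    return new Date(value);
  }
  if (Array.isArray(value)) {
    return value.map(reviveDates);
  }
  if (value !== null && typeof value === "object") {
    const out = {};
    for (const key of Object.keys(value)) {
      out[key] = reviveDates(value[key]);
    }
    return out;
  }
  return value;
}

// usage: convert the pipeline before handing it to aggregate
foo.aggregate(reviveDates(pipelineFromClient));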
I'm using the Sequelize ORM in my ExpressJS application to communicate with MariaDB. I'm working with an existing database schema, so I can't change the data types of the fields. The existing database uses unix timestamps in the createdAt, updatedAt & deletedAt fields, and now I need to follow that in the new ExpressJS app too.
In each model I use the following Sequelize hooks to convert the createdAt & updatedAt fields to unix timestamps:
hooks: {
  beforeCreate: (instance, options) => {
    instance.dataValues.createdAt = Math.floor(Date.now() / 1000);
    instance.dataValues.updatedAt = Math.floor(Date.now() / 1000);
  },
  beforeUpdate: (instance, options) => {
    instance.dataValues.updatedAt = Math.floor(Date.now() / 1000);
  }
}
The thing is that I can't set deletedAt as a timestamp in the beforeBulkDestroy hook. Can anyone please help me resolve this?
Thanks!
One thing to try - the individualHooks option calls the beforeDestroy() hook for each instance. This can be applied at the query level:
db.myFunkyModel.destroy({
  where: {
    'field': { [Op.like]: '%someValue%' }
  },
  individualHooks: true
});
Or at a broader level:
const sequelizeDb = new Sequelize(
  ...
  {
    host: '127.0.0.1',
    ....
    define: {
      ....
      individualHooks: true
    }
    ....
  });
Take a look at the Model Hooks section of the manual for potential performance hits from individualHooks.
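With individualHooks in place, one possible way to stamp deletedAt with a unix timestamp is a beforeDestroy hook alongside the beforeCreate/beforeUpdate hooks from the question. This is only a sketch and assumes a paranoid model on a recent Sequelize version, which keeps a deletedAt value that a hook has already set instead of overwriting it:

hooks: {
  // runs once per matched row when destroy() is called with individualHooks: true
  beforeDestroy: (instance, options) => {
    instance.setDataValue('deletedAt', Math.floor(Date.now() / 1000));
  }
}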
In my collection I'd like to have automatically generated createdAt and updatedAt fields that contain the date when the object was inserted or last updated, kind of like it happens in Ruby on Rails. Currently I'm doing this with an observer similar to this one:
MyCollection.find({}).observeChanges({
  changed: function(id, changes) {
    MyCollection.update(id, ...);
  },
});
Is there a better / more efficient / more straightforward way?
I use Collection2. It supports autoValue in the schema, a function that computes the forced value of a field. As these two fields are used in all collections, you can save them to a variable:
@SchemaHelpers =
  createdAt:
    type: Date
    autoValue: ->
      if @isInsert
        return new Date
      if @isUpsert
        return $setOnInsert: new Date
      if @isUpdate
        @unset()
        return
  updatedAt:
    type: Date
    autoValue: ->
      return new Date
And then in the collection:
Schema = {}
Posts = new Meteor.Collection("posts")
Schema.Posts = new SimpleSchema
  createdAt: SchemaHelpers.createdAt
  updatedAt: SchemaHelpers.updatedAt
  title:
    type: String
    max: 30
  body:
    type: String
    max: 3000
Posts.attachSchema(Schema.Posts)
This solution makes updatedAt always present and its value will be very close to createdAt when it is just inserted (not necessarily the same). If you need updatedAt not to be set when inserting, you can use something like the example in the Collection2 readme:
updatedAt: {
  type: Date,
  autoValue: function() {
    if (this.isUpdate) {
      return new Date();
    }
  },
  denyInsert: true,
  optional: true
},
but this does not handle upserts. I don't know any good solution that handles upserts correctly and leaves the field empty at inserts.
I like https://github.com/matb33/meteor-collection-hooks
collection.before.insert (userId, doc) ->
  doc.createdAt = new Date().valueOf() # or: new Date().toISOString()
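The same package also exposes a before.update hook, so updatedAt can be kept current in the same way. A sketch, not from the answer above; the hook signature (userId, doc, fieldNames, modifier, options) is the one documented by meteor-collection-hooks, written here in plain JavaScript:

// stamp updatedAt on every update by extending the modifier
collection.before.update(function (userId, doc, fieldNames, modifier, options) {
  modifier.$set = modifier.$set || {};
  modifier.$set.updatedAt = new Date().valueOf();
});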