I have two tables: users is the parent and leaves is the child table.
Every user can have multiple leave requests.
users.id is the primary key and leaves.userID is the foreign key.
I want to get the related user's record along with every leave record.
Here is the users model:
import bookshelf from '../config/bookshelf';
const TABLE_NAME = 'users';
/**
* User model.
*/
class User extends bookshelf.Model {
/**
* Get table name.
*/
get tableName() {
return TABLE_NAME;
}
/**
* Table has timestamps.
*/
get hasTimestamps() {
return true;
}
verifyPassword(password) {
return this.get('password') === password;
}
}
export default User;
Here is the leaves model:
import bookshelf from '../config/bookshelf';
const TABLE_NAME = 'leaves';
/**
* Leaves model.
*/
class leaves extends bookshelf.Model {
/**
* Get table name.
*/
get tableName() {
return TABLE_NAME;
}
/**
* Table has timestamps.
*/
get hasTimestamps() {
return true;
}
verifyPassword(password) {
return this.get('password') === password;
}
}
export default leaves;
Here is my code to fetch the leaves records:
leaves.forge()
.fetchAll()
.then(leaves => res.json({
error: false,
data: leaves.toJSON()
})
)
.catch(err => res.status(HttpStatus.INTERNAL_SERVER_ERROR).json({
error: err
})
);
Unless absolutely necessary, I don't recommend performing such a function in your application code. What you are asking is very rudimentary in the data world, and your DBMS will be far more capable of processing such a request. No need to slow down your app with a request that should be off-loaded to the DBMS. It's what the DBMS is made for. Don't reinvent the wheel.
If your DBMS supports Views, then create a View to perform this function. You could also do this in a Stored Procedure. It is a very simple JOIN query that can be done in as few as 3 lines of SQL code. After you have the View, your app can read data from it just like any other table.
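The View the answer describes can be sketched as follows; the view name, the selected user columns (name, email), and the quoting of userID are assumptions, since the question doesn't show the full schema. In a knex/Bookshelf setup, the statement could be run from a migration via knex.raw:

```javascript
// Sketch of the View approach, using the tables from the question
// (users.id, leaves.userID). The view name and selected columns are
// assumed; whether "userID" needs quoting depends on your DBMS and on
// how the column was created.
const CREATE_LEAVES_WITH_USER = `
  CREATE VIEW leaves_with_user AS
  SELECT l.*, u.name AS user_name, u.email AS user_email
  FROM leaves l
  JOIN users u ON u.id = l."userID";
`;

// In a knex migration file:
// exports.up = knex => knex.raw(CREATE_LEAVES_WITH_USER);
// exports.down = knex => knex.raw('DROP VIEW leaves_with_user');
```

After that, the app can point a read-only Bookshelf model at leaves_with_user just like any other table.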
Whenever I try to create different stores in the same database at the same time, only one of them is created. Is there a way to resolve this synchronization issue?
I've already solved this issue, but I'll clarify what I meant in case it helps someone else. I have a GridView component which is mapped multiple times. This component saves its columns, how they are arranged, and their specific behaviors to indexedDB. The issue was that I used a function that created a DB named after the page, with one store for each GridView inside the same DB. In order to create all the stores at the same time (in this case, 9 of them), I had to trigger a version change for each new store to be able to persist the information. Inside the function I looked up the DB's current version and added 1 to trigger the version change event. The problem was that, because every iteration read the version synchronously inside this function, all of the iterations got the same version, and as a result only the first store was created, since only the first iteration of the map triggered a version change. To resolve this, I used the index prop inside the map function and passed it as an order prop to my GridView component. Then, instead of triggering the version change with version + 1 inside the function, I triggered it with version + order; this way all the stores are created, because each requested version is guaranteed to be higher than the previous one.
I'll give some code to help the explanation.
This is the map:
{status.map((status, index) => {
return (
<GridView
pageName={"Page"} // DB Name
gridName={status} // Store Name
order={index + 1} // index + 1 so that the order is never 0
//all the other props...
/>
);
})}
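The version arithmetic behind the fix can be sketched in isolation (the store names here are made up): with version + 1 every synchronous iteration computes the same target version, so only the first open() triggers onupgradeneeded, while version + order yields a strictly increasing version for each store.

```javascript
// Sketch: why `version + 1` collides while `version + order` does not.
// Every iteration reads the same current version synchronously.
const currentVersion = 1;
const stores = ["gridA", "gridB", "gridC"];

// Naive: every open() requests the same version -> one upgrade only.
const naiveVersions = stores.map(() => currentVersion + 1);
// naiveVersions is [2, 2, 2]

// Fixed: order = index + 1 makes each request strictly higher.
const fixedVersions = stores.map((_, index) => currentVersion + (index + 1));
// fixedVersions is [2, 3, 4]
```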
Inside the GridView component I have a function that runs on first render to look for the store inside the DB; if there is no information inside, it creates the store and then fills it with the information needed. This is the function:
/**
*
* @param {String} DB_NAME
* @param {String} STORE_NAME
* @param {String} keyPath
* @param {Int} order
* @param {Object|Array} info
* @returns {Object} resolve: {success, message, storeInfo}, reject: {error, message}
*/
const createAndPopulateStore = (
DB_NAME,
STORE_NAME,
keyPath,
order = 1,
info
) => {
return new Promise((resolve, reject) => {
const request = indexedDB.open(DB_NAME);
request.onsuccess = function (e) {
let database = e.target.result;
let version = parseInt(database.version);
database.close();
//This was the critical part in order to create multiple stores at the same time.
let secondRequest = indexedDB.open(DB_NAME, version + order);
secondRequest.onupgradeneeded = (e) => {
let database = e.target.result;
// Early return if the store already exists.
if (database.objectStoreNames.contains(STORE_NAME)) {
reject({
success: false,
message: `There is already a store named: ${STORE_NAME} created in the database: ${DB_NAME}`,
});
return;
}
let objectStore = database.createObjectStore(STORE_NAME, { keyPath });
if (info) {
// Populates the store within the db named after the gridName prop with the indexedColumns array.
if (Array.isArray(info)) {
info.map((item) => objectStore.put(item));
} else {
Object.entries(info).map((item) => objectStore.put(item));
}
}
};
secondRequest.onsuccess = function (e) {
resolve({
success: true,
message: `Store: ${STORE_NAME}, created successfully.`,
storeInfo: info ?? {},
});
let database = e.target.result;
database.close();
};
};
});
};
I hope this will help! Feel free to ask any questions regarding this issue and I'll try to answer them as soon as I can.
When I try to call a transaction on an asset that inherits from an abstract base class asset, the call fails with: Error: Invalid or missing identifier for Type <type> in namespace <name.spa.ce>
user.cto
namespace com.aczire.alm.base.user
import com.aczire.alm.base.*
abstract participant User {
o String uid
o String title optional
o String firstName optional
o String lastName optional
o UserTransactionLogEntry[] logEntries
}
concept UserTransactionLogEntry {
//--> User modified_by
o String comment optional
o DateTime timestamp
}
abstract transaction UserTransaction {
o String comment
}
abstract event UserTransactionEvent {
o String comment
}
admin.cto
namespace com.aczire.alm.base.user.admin
import com.aczire.alm.base.*
import com.aczire.alm.base.user.*
participant Admin identified by uname extends User {
o String uname
}
abstract transaction AdminUserTransaction extends UserTransaction {
o Admin user
--> Admin modified_by
}
abstract event AdminUserTransactionEvent extends UserTransactionEvent {
--> Admin user
--> Admin modified_by
}
transaction CreateUser extends AdminUserTransaction {
}
admin.js
/**
* Create a User
* @param {com.aczire.alm.base.user.admin.CreateUser} createUser - the CreateUser transaction
* @transaction
*/
function createUser(newuser) {
console.log('createUser');
var factory = getFactory();
var NS_AU = 'com.aczire.alm.base.user.admin';
var user = factory.newResource(NS_AU, 'Admin', newuser.uname);
user.uid = newuser.uid;
// save the order
return getAssetRegistry(NS_AU)
.then(function (registry) {
return registry.add(user);
})
.then(function(){
var userCreatedEvent = factory.newEvent(NS_AU, 'UserCreatedEvent');
userCreatedEvent.user = user;
userCreatedEvent.comment = 'Created new admin user - ' + newuser.uname + '!';
emit(userCreatedEvent);
});
}
I tried making the parameter to the TP a User or an Admin; moving the transaction arguments to the base class as the User type; moving the transaction to the base class, etc. But nothing seems to work.
Does inheritance work differently here?
Error shown in composer playground.
Your admin.js has some issues. The error occurs because Admin is a participant, so you have to use getParticipantRegistry(), not getAssetRegistry(), and then add the user. Also, in your model file admin.cto there is no event named UserCreatedEvent, so you first need to add it to the model before emitting the event. To add the user, try changing to this:
/**
* Create a User
* @param {com.aczire.alm.base.user.admin.CreateUser} createUser - the CreateUser transaction
* @transaction
*/
function createUser(newuser) {
console.log('createUser');
var NS_AU = 'com.aczire.alm.base.user.admin';
var factory = getFactory();
var testuser = factory.newResource(NS_AU, 'Admin', newuser.user.uname);
testuser.uid = newuser.user.uid;
testuser.logEntries = newuser.user.logEntries;
// save the user
return getParticipantRegistry('com.aczire.alm.base.user.admin.Admin')
.then(function (registry) {
return registry.add(testuser);
});
}
I have a MongoDB database with 2 collections: Customer and Order. These models' relationships in Loopback are as follows: Customer -hasMany-> Order & Order -belongsTo-> Customer (so an order has a foreignKey customerId).
I needed to query the orders of all customers that are age 20, for example. But I was surprised to find that Loopback doesn't support this kind of query. I searched a lot, and even the "include" option doesn't support the format I want for the final result, nor can it be combined with the filter-by-age functionality (Loopback-doc-about-it). So I wrote a remote method that makes 2 queries: the first finds all customers of a certain age (a basic where filter), and the other iterates over every customer in that list to find his orders (basically, for each one, it searches the Orders collection for orders whose customerId equals that customer's id).
Here's customer.js file:
'use strict';
module.exports = function(Customer) {
var app = require('../../server/server');
/**
*
* #param {number} age
* #param {Function(Error, array)} callback
*/
Customer.getOrdersByAge= function(age, callback) {
var customers;
var filter= { where: { 'age': age } };
var Order=app.models.Order;
var orders;
var elementary_orders;
Customer.find(filter, function(err, items) {
if (err !==null){
console.log("error1");
return callback(err);
}
console.log("items: "+ items);
customers=items;
for (let i of customers){
console.log(i+": "+i.id+" -lenght: "+customers.length);
var filter_order= { where: { 'customerId': i.id+'' } };
Order.find(filter_order, function(err2, items_orders) {
if (err2 !==null){
console.log(i+": "+"error2");
return callback(err2);
}
elementary_orders= items_orders;
orders=elementary_orders;
console.log("elementary_orders: "+ elementary_orders);
console.log("orders now: "+ elementary_orders);
});
}
console.log("-> orderssss: "+ orders);
callback(null,orders);
});
}
};
But then I found another problem: orders is always undefined. It appears that since find queries are asynchronous, orders stays undefined. So I looked for a way to make the call synchronous (impossible, since it's Loopback's spirit to always be asynchronous) and a way to control the flow through npm's async package, but even after trying its eachOf utility, orders is still undefined.
I am wondering not only how I can make this simple query work, but also why it is so hard to implement. Am I violating any conceptual or architectural patterns related to Loopback's models? Querying multiple collections is a common thing to do, though.
Thank you :)
You can use async.map on the customers to patch the models and get the orders added to them.
async.map(customers, function(customer, mapCallback) {
  var filter_order = { where: { customerId: customer.id + '' } };
  Order.find(filter_order, function(err, orders) {
    if (err) {
      return mapCallback(err);
    }
    customer.orders = orders;
    mapCallback(null, customer);
  });
}, callback); // This callback is the main one from the remote method.
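If you'd rather not depend on the async package, the same fan-out works with Promise.all. Here is a sketch with a stubbed Order.find, where the data shapes are assumptions based on the question:

```javascript
// Stubbed stand-in for Order.find({ where: { customerId } }); a real
// LoopBack model would hit the datasource instead.
const ordersByCustomerId = {
  '1': [{ id: 'a', customerId: '1' }],
  '2': [{ id: 'b', customerId: '2' }, { id: 'c', customerId: '2' }]
};

function findOrders(customerId) {
  return Promise.resolve(ordersByCustomerId[customerId] || []);
}

// Fan out one query per customer and wait for all of them;
// Promise.all preserves the order of the input array.
function getOrdersForCustomers(customers) {
  return Promise.all(
    customers.map(customer =>
      findOrders(String(customer.id)).then(orders =>
        Object.assign({}, customer, { orders })
      )
    )
  );
}

getOrdersForCustomers([{ id: 1 }, { id: 2 }]).then(result => {
  // result[0].orders has 1 order, result[1].orders has 2
});
```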
I have 2 tables, user and tn_user. The table user contains the information used to log in; I made it following the tutorial from https://laravel.com/ so it was basically created automatically, while tn_user is a table that I made myself.
USER TABLE
In case you can't see it, the attributes are id, name, email, password (those are the important ones); the email and password in this table are used for logging in.
TN_USER TABLE
The attributes are cn_id, cv_name, cv_email, cn_phone, cv_position, cv_address, cv_country, cv_username, cv_password, cv_privileges; those are the important ones.
Based on the form below, I want to insert the username and password into the table user and the rest into the table tn_user. How do I do that? I'm pretty new to Laravel so I don't really understand how; usually I use CI.
UserController.php
This is the code I use to insert data. I return the data as a JSON response, and I'm not quite sure how to insert data into 2 tables; a little help here?
public function createOrEdit(){
//get current user
$currentUserId = Auth::user()->id;
$isUpdate = false;
$id = Input::get('id');
$user = new UserCompany;
if($id != ""){
$user = UserCompany::where('cn_id', '=', $id)->firstOrFail();
$user->cv_updated_by = $currentUserId;
$user->cv_updated_at = Carbon::now();
$isUpdate = true;
}else{
$user->cv_created_by = $currentUserId;
$user->cv_created_at = Carbon::now();
}
$user->cv_name = Input::get('name');
$user->cv_position = Input::get('position');
$user->cv_email = Input::get('email');
$user->cn_phone = Input::get('phone');
$user->cv_address = Input::get('address');
$user->cv_username = Input::get('username');
$user->cv_password = Input::get('password');
$user->cv_country = Input::get('country');
if($isUpdate){
UserCompany::where('cn_id','=',$id)->update(['cv_updated_by' => $user->cv_updated_by,
'cv_updated_at' => $user->cv_updated_at,
'cv_name' => $user->cv_name,
'cv_position' => $user->cv_position,
'cv_email' => $user->cv_email,
'cn_phone' => $user->cn_phone,
'cv_country' => $user->cv_country,
'cv_username' => $user->cv_username,
'cv_password' => $user->cv_password,
'cv_address' => $user->cv_address]);
}else{
$user->save();
}
$returnedData = UserCompany::all();
$response = array(
'content' => $returnedData,
'status' => 'success',
);
return Response::json($response);
}
UserCompany.php is my model, but since I'm new, I don't really understand how to use relationships yet.
<?php namespace Activity;
use Illuminate\Database\Eloquent\Model;
class UserCompany extends Model {
protected $table = 'tn_user';
public $timestamps = false;
protected $fillable = [
];
/*public function usercompany(){
return $this->belongsTo('Activity\user');
}*/
}
You should know that in the UserCompany class, setting $fillable lists the table columns that may be mass-assigned, in this case on the tn_user table. So by setting
protected $fillable = [];
you are saying that no columns may be mass-assigned (e.g. via UserCompany::create()). Direct property assignment, like the following, still works:
$user_details->cv_name = Input::get('cv_name');
Okay, so the first thing you should know is that when creating two tables, i.e. users and tn_users, you should have a column which carries a value relating the two tables; I suggest that you use the id from the users table.
I have noticed that you used cn_id as a linker, but it is best if every table has its own auto-incrementing id column and, in this case, its own link_id column.
Let's say you are starting over:
Open the command prompt or terminal, go to your Laravel project directory, and type php artisan make:model User -m and again php artisan make:model TnUser -m.
What this does is create the User and TnUser models; adding -m also creates the migration associated with each model, create_users_table and create_tn_users_table.
In create_users_table, simply create the desired table columns as shown below:
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Database\Migrations\Migration;
class CreateUsersTable extends Migration
{
/**
* Run the migrations.
*
* @return void
*/
public function up()
{
Schema::create('users', function (Blueprint $table){
$table->increments('id');
$table->integer('auth');
$table->string('username')->unique();
$table->string('email');
$table->string('password');
$table->boolean('online');
$table->string('lang', 2);
$table->rememberToken();
$table->timestamps();
});
}
/**
* Reverse the migrations.
*
* @return void
*/
public function down()
{
//
Schema::drop('users');
}
}
Now, for the create_tn_users_table, it's important to set which user account each row links to, so that when you delete a user his credentials are also removed; you can make it behave otherwise if you want.
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Database\Migrations\Migration;
class CreateTnUsersTable extends Migration
{
/**
* Run the migrations.
*
* @return void
*/
public function up()
{
Schema::create('tn_users', function (Blueprint $table) {
$table->increments('id');
$table->string('full_name');
$table->string('username')->unique();
$table->integer('link_user_id')->unsigned();
$table->foreign('link_user_id')
      ->references('id')->on('users')
      ->onDelete('cascade'); // Relationship between tn_users and users
$table->string('phone');
});
}
/**
* Reverse the migrations.
*
* @return void
*/
public function down()
{
Schema::drop('tn_users');
}
}
Now go to the command prompt or terminal and type php artisan migrate to have the tables created.
Again on the command prompt or terminal, type php artisan make:controller UserController --resource to create the controller together with its resource methods.
In the create() function inside UserController, add the Request as a parameter.
The function is to be reached upon submission of the form that you have created:
namespace App\Http\Controllers;
use App\User;
use App\TnUser;
use ...
class UserController extends Controller{
public function create(Request $request){
$tn_user = new TnUser();
$user = new User();
$user->username = $request['username'];
$user->password = bcrypt($request['password']);
...
$user->save();
$tn_user->full_name = ucwords(strtolower($request['full_name']));
$tn_user->link_user_id = $user->id; // uses the previously saved id
$tn_user->phone = trim($request['phone']);
$tn_user->save();
}
}
I hope I have answered your questions. Here are some helpful links to learn more:
https://laravel.com/docs/5.1/migrations#creating-columns
https://laravel.com/docs/5.1/requests
You create 2 objects:
$user = new User();
$user->username = Input::get('username');
$user->password = $password; // hashed
$user->save();

$user_detail = new UserCompany(); // your detail-table model
$user_detail->cv_name = Input::get('cv_name');
// etc.
$user_detail->save();
I am using Knex.JS migration tools. However, when creating a table, I'd like to have a column named updated_at that is automatically updated when a record is updated in the database.
For example, here is a table:
knex.schema.createTable('table_name', function(table) {
table.increments();
table.string('name');
table.timestamp("created_at").defaultTo(knex.fn.now());
table.timestamp("updated_at").defaultTo(knex.fn.now());
table.timestamp("deleted_at");
})
The created_at and updated_at column defaults to the time the record is created, which is fine. But, when that record is updated, I'd like the updated_at column to show the new time that it was updated at automatically.
I'd prefer not to write in raw postgres.
Thanks!
With Postgres, you'll need a trigger. Here's a method I've used successfully.
Add a function
If you have multiple migration files in a set order, you might need to artificially change the datestamp in the filename to get this to run first (or just add it to your first migration file). If you can't roll back, you might need to do this step manually via psql. However, for new projects:
const ON_UPDATE_TIMESTAMP_FUNCTION = `
CREATE OR REPLACE FUNCTION on_update_timestamp()
RETURNS trigger AS $$
BEGIN
NEW.updated_at = now();
RETURN NEW;
END;
$$ language 'plpgsql';
`
const DROP_ON_UPDATE_TIMESTAMP_FUNCTION = `DROP FUNCTION on_update_timestamp`
exports.up = knex => knex.raw(ON_UPDATE_TIMESTAMP_FUNCTION)
exports.down = knex => knex.raw(DROP_ON_UPDATE_TIMESTAMP_FUNCTION)
Now the function should be available to all subsequent migrations.
Define a knex.raw trigger helper
I find it more expressive not to repeat large chunks of SQL in migration files if I can avoid it. I've used knexfile.js here but if you don't like to complicate that, you could define it wherever.
module.exports = {
development: {
// ...
},
production: {
// ...
},
onUpdateTrigger: table => `
CREATE TRIGGER ${table}_updated_at
BEFORE UPDATE ON ${table}
FOR EACH ROW
EXECUTE PROCEDURE on_update_timestamp();
`
}
Use the helper
Finally, we can fairly conveniently define auto-updating triggers:
const { onUpdateTrigger } = require('../knexfile')
exports.up = knex =>
knex.schema.createTable('posts', t => {
t.increments()
t.string('title')
t.string('body')
t.timestamps(true, true)
})
.then(() => knex.raw(onUpdateTrigger('posts')))
exports.down = knex => knex.schema.dropTable('posts')
Note that dropping the table is enough to get rid of the trigger: we don't need an explicit DROP TRIGGER.
This all might seem like a lot of work, but it's pretty "set-and-forget" once you've done it and handy if you want to avoid using an ORM.
You can create a knex migration using timestamps:
exports.up = (knex, Promise) => {
return Promise.all([
knex.schema.createTable('table_name', (table) => {
table.increments();
table.string('name');
table.timestamps(false, true);
table.timestamp('deleted_at').defaultTo(knex.fn.now());
})
]);
};
exports.down = (knex, Promise) => {
return Promise.all([
knex.schema.dropTableIfExists('table_name')
]);
};
With timestamps a database schema will be created which adds a created_at and updated_at column, each containing an initial timestamp.
To keep the updated_at column current, you'll need knex.raw (note that ON UPDATE CURRENT_TIMESTAMP is MySQL syntax; Postgres requires a trigger, as in the answer above):
table.timestamp('updated_at').defaultTo(knex.raw('CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP'));
To skip the knex.raw solution, I suggest using a high level ORM like Objection.js. With Objection.js you could implement your own BaseModel which then updates the updated_at column:
Something.js
const BaseModel = require('./BaseModel');
class Something extends BaseModel {
constructor() {
super();
}
static get tableName() {
return 'table_name';
}
}
module.exports = Something;
BaseModel.js
const Model = require('objection').Model;

class BaseModel extends Model {
  $beforeUpdate() {
    this.updated_at = new Date().toISOString();
  }
}

module.exports = BaseModel;
Source: http://vincit.github.io/objection.js/#timestamps
This is my way of doing it in MySQL 5.6+.
The reason I didn't use table.timestamps is that I use DATETIME instead of TIMESTAMP.
table.dateTime('created_on')
.notNullable()
.defaultTo(knex.raw('CURRENT_TIMESTAMP'))
table.dateTime('updated_on')
.notNullable()
.defaultTo(knex.raw('CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP'))
This is not a feature of Knex. Knex only creates the columns, but does not keep them up to date for you.
If you use the Bookshelf ORM, however, you can specify that a table has timestamps, and it will set and update the columns as expected:
Bookshelf docs
Github issue
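The underlying idea, maintaining the column in the application layer on every write, can be sketched without the ORM; this illustrates the pattern, not Bookshelf's actual internals:

```javascript
// Minimal sketch of ORM-style timestamp maintenance (no real DB):
// created_at is set only on the first save; updated_at on every save.
function save(record) {
  const now = new Date().toISOString();
  if (!record.created_at) {
    record.created_at = now;
  }
  record.updated_at = now;
  return record; // a real ORM would also persist the row here
}

const row = save({ name: 'example' });
// row.created_at and row.updated_at are now both set
```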
exports.up = (knex) => {
  return knex.raw(`
    CREATE OR REPLACE FUNCTION table_name_update() RETURNS trigger AS $$
    BEGIN NEW.updated_at = now(); RETURN NEW; END;
    $$ LANGUAGE 'plpgsql';
    CREATE TRIGGER tg_table_name_update BEFORE UPDATE ON table_name
    FOR EACH ROW EXECUTE PROCEDURE table_name_update();
  `);
};
exports.down = (knex) => {
  return knex.raw(`
    DROP TRIGGER IF EXISTS tg_table_name_update ON table_name;
    DROP FUNCTION IF EXISTS table_name_update;
  `);
};
You can directly use the function
table.timestamps()
This will create the created_at and updated_at columns by default. Note, though, that Knex only sets their default values; as noted in an earlier answer, the database will not keep updated_at current on subsequent updates without a trigger or ORM support.
https://knexjs.org/#Schema-timestamps