I have a list of more than 5k users in another DB, and I want to migrate all of them into the LoopBack DB. Many duplicate users exist, so I check whether a user exists: if so, I update the user; otherwise, I create a new one. I have written a script for that. It works fine, but after it updates the first user, LoopBack starts throwing a 401 Unauthorized error when I execute the GET user detail API. I have even allowed unauthenticated users to access the update and create methods with an ACL, but that does not work either.
I have extended the User model.
Can anyone please throw some light? Help will be appreciated!
Thanks
LoopBack's default ACLs are more specific than the ones you are defining, so yours have no effect in the end. The #authenticated and #unauthenticated ALLOW rules don't take precedence over the DENY-all rule, but custom roles do, and using a custom ADMINISTRATOR role is the idiomatic way in the framework.
You need to create a Role for a specific user.
Map that role to the user using the RoleMapping model.
Steps 1 and 2 can be done with this boot script (e.g. server/boot/create-admin-user.js):
module.exports = function(app) {
  var User = app.models.ExtendedUser;
  var Role = app.models.Role;
  var RoleMapping = app.models.RoleMapping;

  User.findOrCreate({ where: { username: 'admin', email: 'admin@admin.com' } },
    {
      username: 'admin',
      email: 'admin@admin.com',
      password: 'admin123'
    },
    function(err, user) {
      if (err) return console.log(err);
      // Create the admin role
      Role.findOrCreate({ where: { name: 'ADMINISTRATOR' } },
        { name: 'ADMINISTRATOR' },
        function(err, role) {
          if (err) return console.log(err);
          console.log('Role created: ' + role.name);
          // Assign the admin role to the user
          RoleMapping.findOrCreate({ where: { roleId: role.id, principalId: user.id } },
            { roleId: role.id, principalId: user.id, principalType: RoleMapping.USER },
            function(err, roleMapping) {
              if (err) return console.log(err);
              console.log('ADMINISTRATOR role assigned to ' + user.username);
            });
        });
    });
};
Create an ACL entry in your ExtendedUser model to ALLOW the ROLE ADMINISTRATOR to WRITE:
```
{
  "name": "ExtendedUser",
  "base": "User",
  /* ... */
  "acls": [
    {
      "accessType": "READ",
      "principalType": "ROLE",
      "principalId": "$authenticated",
      "permission": "ALLOW"
    },
    {
      "accessType": "WRITE",
      "principalType": "ROLE",
      "principalId": "ADMINISTRATOR",
      "permission": "ALLOW"
    }
  ]
}
```
I want to understand if using json-schema as my main validation tool would be a good choice. The main thing I'm not sure about is whether I should use it to validate the query params in my API besides it being used to validate DB input.
I'll try to provide a better explanation with code examples below:
Let's start with a json-schema for the User model:
{
"$schema": "http://json-schema.org/draft-07/schema",
"type": "object",
"title": "User",
"description": "User schema",
"required": ["email", "full_name", "password"],
"additionalProperties": false,
"properties": {
"id": {
"$id": "#/properties/id",
"title": "User ID",
"type": "integer"
},
"email": {
"$id": "#/properties/email",
"title": "User email. Must be unique",
"type": "string",
"format": "email"
},
"password": {
"$id": "#/properties/password",
"title": "Hashed password for the user",
"type": "string",
"maxLength": 128,
"pattern": "^(?=.*[a-z])(?=.*[A-Z])(?=.*\\d)(?=.*[#$!%*?&])[A-Za-z\\d#$!%*?&]{8,}$"
},
"full_name": {
"$id": "#/properties/full_name",
"title": "User first name and last name",
"type": "string",
"maxLength": 128,
"pattern": "^[a-zA-Z]+(?:\\s[a-zA-Z]+)+$"
},
"image_url": {
"$id": "#/properties/image_url",
"title": "URL to user image",
"type": "string"
},
"created_at": {
"$id": "#/properties/created_at",
"title": "The creation date of the user",
"type": "string"
},
"updated_at": {
"$id": "#/properties/updated_at",
"title": "The date the user was last updated",
"type": "string"
}
}
}
As you can see, I'm using regexes to validate the input for each field to ensure the format is correct. I can specify which fields are required, which is very useful, and I set additionalProperties to false, which means the schema/Objection will not accept properties that are not specified in the JSON schema.
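As a quick illustration of those two settings, here is a hand-rolled sketch of what they enforce. This is not the real validator (Objection delegates to Ajv under the hood); it only mimics the semantics of `required` and `additionalProperties: false`:

```javascript
// Hand-rolled sketch of the two schema settings discussed above.
// Objection delegates the real work to Ajv; this only illustrates
// the semantics of `required` and `additionalProperties: false`.
const schema = {
  required: ["email", "full_name", "password"],
  additionalProperties: false,
  properties: { id: {}, email: {}, full_name: {}, password: {} },
};

function check(schema, data) {
  const errors = [];
  // every required field must be present
  for (const field of schema.required) {
    if (!(field in data)) errors.push(`${field} is required`);
  }
  // with additionalProperties: false, unknown keys are rejected
  if (schema.additionalProperties === false) {
    for (const key of Object.keys(data)) {
      if (!(key in schema.properties)) errors.push(`${key} is not allowed`);
    }
  }
  return errors;
}

console.log(check(schema, { email: "ann@ann.com", full_name: "Ann Lee" }));
// → [ 'password is required' ]
console.log(check(schema, {
  email: "ann@ann.com", full_name: "Ann Lee",
  password: "x", role: "admin", // role is not in the schema
}));
// → [ 'role is not allowed' ]
```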
Next let's take a look at an example of a registration API that I'm trying to use:
router.post("/registration", async (req, res, next) => {
try {
const { password, ...payload } = await User.query().insert(req.body);
const token = await jwt.sign(payload);
res.json({ user: payload, token });
} catch (error) {
next(error);
}
});
So there's no validation of the request body in the route itself, or really in any other place, I'm trying to delegate the validation entirely to json-schema. When the request comes in, the password is not hashed so it can pass the validation, but then I need a way of storing the hashed password.
Currently I'm using this solution but I'm not sure if it's smart/safe?
// User model
async $beforeInsert(queryContext) {
this.$setJson(
{ password: await bcrypt.hash(this.password, 12) },
{ skipValidation: true }
);
await super.$beforeInsert(queryContext);
}
This would enable the following:
validation to check for the correct params (full_name, email, password) and test whether the values are correct
after the validation passes, update the model with the hashed password and skip another validation as it's already been run
insert (safe?) data in the db
Now let's look at the login route:
router.post("/login", async (req, res, next) => {
try {
User.fromJson(req.body, { skipRequiredFields: ["full_name"] });
const { email, password } = req.body;
const user = await User.query().where({ email }).first();
if (!user) {
res.status(403);
throw new Error("Invalid credentials");
}
if (!(await bcrypt.compare(password, user.password))) {
res.status(403);
throw new Error("Invalid credentials");
}
const payload = { id: user.id, email, full_name: user.full_name };
const token = await jwt.sign(payload);
res.json({ user: payload, token });
} catch (error) {
next(error);
}
});
Because I want to leave the validation to the json-schema and I need to check the request payload, I have to create a model that is redundant for the route itself; it's only used to trigger the validation. Another reason to do this is that .where({ email }) doesn't trigger the validation, and it throws an error when email is undefined.
Code to remove specific required fields from the schema when needed:
// User model
$beforeValidate(jsonSchema, _, { skipRequiredFields }) {
if (skipRequiredFields) {
return {
...jsonSchema,
required: jsonSchema.required.filter(
(fieldName) => !skipRequiredFields.includes(fieldName)
),
};
}
}
This allows me to remove required fields from the validation schema. In this case it works perfectly for the /login route as I don't want the request body to have a full_name field.
So far this has been working for me, but it feels more like a workaround than an actual solution. I am also not entirely sure about having to tap into Objection hooks and override the password like that. The last thing I don't like is that I have to create a redundant model just to trigger the validation, though I also understand that triggering validation (or having the option to) on .where() doesn't make much sense.
I'm new to Objection, and this is not a production project but rather a side project I'm using to explore Objection alongside other frameworks/libraries. I'd like to know if there are better ways of doing this, or maybe my thinking is entirely wrong and I should have a separate validation for the request body and leave the json-schema as a DB validation only?
I built a PUT method for updating/editing existing records in an SQLite DB, and when firing it, the following error message shows up in the network tab for the PUT request. The issue seems to be related to SequelizeUniqueConstraintError and the update auto-creating an id. Any thoughts or known workarounds?
{
"error":"An error has occured trying to create the user.",
"err":{
"name":"SequelizeUniqueConstraintError",
"errors":[
{
"message":"id must be unique",
"type":"unique violation",
"path":"id",
"value":1,
"origin":"DB",
"instance":{
"id":1,
"email":"el#gmail.com",
"password":"$2a$08$HnRjfTzqsvwzobMQYM8bT.IkPbTSe6aOxy50l8/cUuuVhOgbl513.",
"firstName":"el",
"lastName":"leo",
"jobDescription":"test",
"gains":"100",
"costs":"50",
"balance":50,
"isAdmin":null,
"youAgree":null,
"createdAt":"2019-03-03T09:39:31.295Z",
"updatedAt":"2019-03-26T19:03:21.665Z"
},
"validatorKey":"not_unique",
"validatorName":null,
"validatorArgs":[
]
}
],
"fields":[
"id"
],
"parent":{
"errno":19,
"code":"SQLITE_CONSTRAINT",
"sql":"INSERT INTO `Users` (`id`,`email`,`password`,`firstName`,`lastName`,`jobDescription`,`gains`,`costs`,`balance`,`isAdmin`,`youAgree`,`createdAt`,`updatedAt`) VALUES (1,'el#gmail.com','$2a$08$HnRjfTzqsvwzobMQYM8bT.IkPbTSe6aOxy50l8/cUuuVhOgbl513.','el','leo','test','100','50',50,NULL,NULL,'2019-03-03 09:39:31.295 +00:00','2019-03-26 19:03:21.665 +00:00');"
},
"original":{
"errno":19,
"code":"SQLITE_CONSTRAINT",
"sql":"INSERT INTO `Users` (`id`,`email`,`password`,`firstName`,`lastName`,`jobDescription`,`gains`,`costs`,`balance`,`isAdmin`,`youAgree`,`createdAt`,`updatedAt`) VALUES (1,'el#gmail.com','$2a$08$HnRjfTzqsvwzobMQYM8bT.IkPbTSe6aOxy50l8/cUuuVhOgbl513.','el','leo','test','100','50',50,NULL,NULL,'2019-03-03 09:39:31.295 +00:00','2019-03-26 19:03:21.665 +00:00');"
},
"sql":"INSERT INTO `Users` (`id`,`email`,`password`,`firstName`,`lastName`,`jobDescription`,`gains`,`costs`,`balance`,`isAdmin`,`youAgree`,`createdAt`,`updatedAt`) VALUES (1,'el#gmail.com','$2a$08$HnRjfTzqsvwzobMQYM8bT.IkPbTSe6aOxy50l8/cUuuVhOgbl513.','el','leo','test','100','50',50,NULL,NULL,'2019-03-03 09:39:31.295 +00:00','2019-03-26 19:03:21.665 +00:00');"
}
}
This is the sample PUT method for reference:
put (user) {
  console.log('put user: ', JSON.stringify(user))
  var userId = user.id
  console.log('put userId: ', userId)
  return Api().put(`users/${user.id}`, user)
  // return Api().put('users', userId, user)
}
I'm creating a LoopBack application and have created a custom user model based on the built-in User model.
{
"name": "user",
"base": "User",
"idInjection": true,
"properties": {
"test": {
"type": "string",
"required": false
}
},
"validations": [],
"acls": [],
"methods": []
}
Then in a boot script I'm creating (if they don't exist) a new user, a new role and a roleMapping.
User.create(
{ username: 'admin', email: 'admin@mail.com', password: 'pass' }
, function (err, users) {
if (err) throw err;
console.log('Created user:', users);
//create the admin role
Role.create({
name: 'admin'
}, function (err, role) {
if (err) throw err;
//make user an admin
role.principals.create({
principalType: RoleMapping.USER,
principalId: users.id
}, function (err, principal) {
if (err) throw err;
console.log(principal);
});
});
});
Then in a custom remote method I'm trying to get all roles for a user, using the user's id. LoopBack's documentation on this topic says:
Once you define a “hasMany” relation, LoopBack adds a method with the relation name to the declaring model class’s prototype automatically. For example: Customer.prototype.orders(...).
And gives this example:
customer.orders([filter],
function(err, orders) {
...
});
But when I try to use the User.roles() method (const User = app.models.user;), I get the following error:
TypeError: User.roles is not a function
But when I make a remote request to http://localhost:9000/api/users/5aab95a03e96b62718940bc4/roles, I get the desired roleMappings array.
So, I would appreciate it if someone could help me get this data using JS. I know I can probably just query the RoleMapping model, but I wanted to do it the documentation way.
LoopBack documentation suggests extending the built-in User model
to add more properties and functionality.
A good practice is creating a model Member that extends the built-in model User. In the new model declare the following relationship:
"relations": {
"roles": {
"type": "hasMany",
"model": "RoleMapping",
"foreignKey": "principalId"
}
}
Now, you can get all the user roles:
user.roles(function (err, roles) {
// roles is an array of RoleMapping objects
})
where user is an instance of Member.
This is an old question, but I faced the same issue and was able to solve it by having the relation Antonio Trapani suggested and accessing the roles like this:
const userInstance = await User.findById(userId);
const roles = await userInstance.roles.find();
Roles is not a function, it is an object. By the way, this is using LoopBack 3.
I'm writing an analytics application that collects events and associates them with visitors.
My Visitor mongoose schema is as follows:
var visitorSchema = new Schema({
created_at: { type: Date, default: Date.now },
identifier: Number,
client_id: Number,
account_id: Number,
funnels: [String],
goals: [Goal],
events: [Event]
});
The API accepts a mix of visitor info and the event:
{
"identifier": 11999762224,
"client_id": 1,
"account_id": 1,
"event": {
"context": "Home",
"action": "Click red button",
"value": ""
}
}
When Restify receives a request, it checks whether the visitor exists; if so, the app just pushes the event, as follows:
server.post('/event', function (req, res, next) {
  Visitor.findOne({
    identifier: req.params.identifier,
    client_id: req.params.client_id,
    account_id: req.params.account_id
  }, function(err, visitor) {
    if (err) {
      console.log(err);
      return res.send(500, visitor);
    }
    if (visitor) {
      visitor.events.push(req.params.event);
      visitor.save();
    } else {
      visitor = new Visitor({
        identifier: req.params.identifier,
        client_id: req.params.client_id,
        account_id: req.params.account_id,
        funnels: req.params.funnels,
        events: req.params.event
      });
      visitor.save();
    }
    res.send(200, visitor);
  });
});
Using this method, when I trigger several concurrent requests I get duplicated visitors instead of one visitor with multiple events.
How can I solve this issue? What's the best approach?
Add a unique index on identifier in the mongoose model. This way the second request will fail with a unique-index violation; just make sure you handle that error.
If you are using a single-page framework on the client side (Angular, Backbone, etc.), make sure you disable the button when you make the API call and re-enable it on the server response.
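To make the first suggestion concrete, here is a sketch under the assumptions of the question's schema. The index declaration (shown in the comment) is the mongoose API; the small helper below it illustrates the error-handling step, since MongoDB reports a unique-index violation with error code 11000:

```javascript
// Sketch for the unique-index approach. The declaration below is the
// mongoose API applied to the visitorSchema from the question; a
// compound index over identifier/client_id/account_id would also work
// if identifier alone is not unique across accounts:
//
//   visitorSchema.index({ identifier: 1 }, { unique: true });
//
// MongoDB signals a unique-index violation with error code 11000, so
// the losing concurrent insert can fall back to pushing the event
// onto the existing visitor instead of creating a duplicate.
function isDuplicateKeyError(err) {
  return Boolean(err) && err.code === 11000;
}

console.log(isDuplicateKeyError({ code: 11000 }));      // true
console.log(isDuplicateKeyError(new Error("timeout"))); // false
```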
I have a MongoDB schema like this:
var User = new Schema({
"UserName": { type: String, required: true },
"Email": { type: String, required: true, unique: true },
"UserType": { type: String },
"Password": { type: String }
});
I am trying to create a new user. This is done in Node.js using the Mongoose ODM, and this is the code for creating:
controller.createUser = function (req, res) {
var user = new models.User({
"UserName": req.body.UserName.toLowerCase(),
"Email": req.body.Email.toLowerCase(),
"UserType": req.body.UserType.toLowerCase()
});
models.User.findOne({ 'Email': user.Email }, function (err, olduser) {
if (!err) {
if (olduser) {
res.send({ 'statusCode': 409, 'statusText': 'Email Already Exists' });
}
else if (!olduser) {
user.setPassword(req.body.Password);
user.save(function (err, done) {
if (!err) {
console.log(user);
res.send({ 'statusCode': 201, 'statusText': 'CREATED' });
}
else {
res.send({ 'Status code': 500, 'statusText': 'Internal Server Error' });
}
});
}
}
else {
res.send({ 'statusCode': 500, 'statusText': 'ERROR' });
}
});
};
For creating a new user, I am giving attributes and values as follows:
{
"UserName": "ann",
"Email": "ann#ann.com",
"UserType": "normaluser",
"Password":"123456"
}
And I am getting error like this:
{"Status code":500,"statusText":"Internal Server Error","Error":{"name":"MongoError","err":"E11000 duplicate key error index: medinfo.users.$UserName_1 dup key: { : \"ann\" }","code":11000,"n":0,"connectionId":54,"ok":1}}
I understand that this error is because UserName is duplicated, but I haven't set a unique constraint on UserName. Whenever I add a new row, I only need Email to be unique; UserName can be repeated. How can I achieve this?
@ManseUK is probably right: that looks like UserName is a 'key', in this case an index. The _id attribute is the "primary" index created by default, but MongoDB allows you to have multiple indexes.
Start a mongo console, switch to the medinfo database, and run db.users.getIndexes(). Something must have added an index on 'UserName'.
required: true wouldn't do that, but you might have played with other settings previously and the index hasn't been removed?
There should be an index that is blocking. You can try the db.collection.dropIndex() method; from the medinfo database (the error names the offending index UserName_1):
db.users.dropIndex('UserName_1')
I got a similar issue on my project. I tried clearing out all the documents, but the dup issue kept popping up. Only after I dropped the collection and restarted my Node service did it work.
What I had realized is that my data-structures were changing -- this is where versioning comes in handy.
You may need to get a mongoose-version module, do a thing.remove({}, ...) or even drop the collection: drop database with mongoose
I use RoboMongo for an admin tool (and I highly recommend it!) so I just went in and right-clicked/dropped collection from the console.
If anyone knows how to easily version and/or drop a collection from within the code, feel free to post a comment below, as it would surely help this thread (and me :) ).