I have created four PostgreSQL tables. I am using Node.js with Knex.js as a query builder.
I can create and execute migrations from the command line without any problems. Now I want to run the migrations via JavaScript. How can I proceed with this?
Here is my code:
module.exports.runDBMigrations = async () => {
    knex.schema.hasTable(USERS_DB_NAME).then(function (exists) {
        if (!exists) {
            // Execute migrations to create users tables
        }
    })
    .catch(e => {
        console.log("Error creating tables: ", e);
    })

    knex.schema.hasTable(POSTS_DB_NAME).then(function (exists) {
        if (!exists) {
            // Execute migrations to create posts tables
        }
    })
    .catch(e => {
        console.log("Error creating tables: ", e);
    })

    knex.schema.hasTable(LIKES_DB_NAME).then(function (exists) {
        if (!exists) {
            // Execute migrations to create likes tables
        }
    })
    .catch(e => {
        console.log("Error creating tables: ", e);
    })

    knex.schema.hasTable(FOLLOWERS_DB_NAME).then(function (exists) {
        if (!exists) {
            // Execute migrations to create followers tables
        }
    })
    .catch(e => {
        console.log("Error creating tables: ", e);
    })
}
Hi,
For a couple of days I was wondering how to run migrations and seeds when the server starts.
After doing some research and reading the Knex documentation, this is the solution I came up with. Maybe it can help you.
Stack
lib         version
nodejs      16.17.x
nodemon     ^2.0.19
npm         8.19.x
knex        ^2.3.0
express     ^4.18.1
pg          ^8.8.0
postgreSQL  14.5
typescript  ^4.8.2
Solution
1. Create a function runMigrations(). (I put it in models/setup/; you can name it however you want.)
This function uses the knex singleton to run the migrations and the seeds.
2. Run all the pending migrations by calling the knex migrate.latest() method.
This method returns a promise, so you can capture the successful run or the error in case something goes wrong.
At this point you can also run all the seeds by calling the knex seed.run() method. You can call it in the success callback (see the sketch after these steps).
3. Call the migrations method in your index or server file. In my case it is the runMigrations() method.
4. Restart your server so you can check that everything works fine.
If you go to the terminal you can check the result. I added some log output in order to know whether everything is working fine.
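Putting the steps together, a minimal sketch of such a function could look like this (the file location, log messages and the ./knex import are placeholders, not the author's actual code; knex.migrate.latest() and knex.seed.run() are the standard Knex APIs for running migrations and seeds programmatically):

// models/setup/runMigrations.js (hypothetical path)
const knex = require('../../knex'); // assumed knex singleton configured elsewhere

async function runMigrations() {
    try {
        // Apply every migration that has not been run yet
        const [batchNo, migrations] = await knex.migrate.latest();
        console.log(`Migrations batch ${batchNo} applied:`, migrations);

        // Then run the seed files
        const [seeds] = await knex.seed.run();
        console.log('Seeds executed:', seeds);
    } catch (err) {
        console.error('Error running migrations or seeds:', err);
    }
}

module.exports = { runMigrations };

Then call runMigrations() from your index or server file before the app starts listening.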
Note
I deliberately provoked an error so you can see the result when something goes wrong while executing the migrations or seeds.
Maybe you'll find (or have already found) other alternative solutions; if you fixed the issue in a different way, please share it as a comment so we can exchange knowledge and see other ways to solve this issue.
Regards
Related
I am writing a discord bot using javascript (discord.js).
I use json files to store my data and of course always need the latest data.
I do the following steps:
I start the bot
I run a function that requires the config.json file every time a message is sent
I increase the xp a user gets from the message he sent
I update the users xp in the config.json
I log the data
So after logging the first time (i.e. after sending the first message) I get the data that was in the JSON file before I started the bot (which makes sense). But after sending the second message, I expect the xp value to be higher than before, because the data should have been updated, the file reloaded, and the data logged again.
(Yes, I do update the file every time. When I look in the file myself, the data is always up to date.)
So is there any reason the file is not updated after requiring it the second time? Does require not reload the file?
Here is my code:
function loadJson() {
    var jsonData = require("./config.json")
    //here I navigate through my json file and end up getting to the ... That won't be needed I guess :)
    return jsonData
}

//edits the xp of a user
function changeUserXP(receivedMessage) {
    let xpPerMessage = getJsonData(receivedMessage)["levelSystemInfo"].xpPerMessage
    jsonReader('./config.json', (err, data) => {
        if (err) {
            console.log('Error reading file:', err)
            return
        }
        //increase the users xp
        data.guilds[receivedMessage.guild.id].members[receivedMessage.author.id].xp += Number(xpPerMessage)
        data.guilds[receivedMessage.guild.id].members[receivedMessage.author.id].stats.messagesSent += 1
        fs.writeFile('./test_config.json', JSON.stringify(data, null, 4), (err) => {
            if (err) console.log('Error writing file:', err)
        })
    })
}

client.on("message", (receivedMessage) => {
    changeUserXP(receivedMessage)
    console.log(loadJson(receivedMessage))
});
I hope the code helps :)
If my question was not precise enough or if you have further questions, feel free to comment
Thank you for your help <3
This is because require() reads the file only once and caches it. In order to read the same file again, you first have to delete its key (the key is the resolved path to the file) from require.cache.
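As a minimal sketch of what that looks like (the ./config.json path is taken from the question; the rest is illustrative):

function loadJson() {
    // Drop the cached copy so the next require() re-reads the file from disk
    delete require.cache[require.resolve("./config.json")]
    return require("./config.json")
}

Alternatively, you can sidestep the require cache entirely by reading the file with fs.readFileSync and parsing it with JSON.parse whenever you need fresh data.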
I'm refactoring a monolith into microservices, and switching the DB of one of them to MySQL. I'm using knex.js to sanitize the queries. I need to build 3 tables. One of the tables only has three columns: its own id, and two foreign key ids: one from each of the other two tables.
When trying to build the knex.js query to build the tables, I get the error in the title.
I've attempted to rearrange my queries using the various modifiers Knex.js provides, and I also tried a raw MySQL foreign key query. The error persists. Here is my JS code:
const knex = require('knex')(CONFIG);

// Build images table
const imagesSchema = () => {
    knex.schema.createTable('images', (table) => {
        table.integer('id').primary();
        table.string('name');
        table.string('src');
        table.string('alt');
        table.string('category');
        table.string('subCategory');
    })
    .then(console.log('test'));
};
// Build users table
const usersSchema = () => {
    knex.schema.createTable('users', table => {
        table.increments('id').primary();
        table.string('session', 64).nullable();
    })
    .then(console.log('users schema built into DB!'))
    .catch(error => { console.log('cant build the users schema!\n', error) })
}

// Build userhistory table
const userHistorySchema = () => {
    knex.schema.createTable('userhistory', table => {
        table.increments('id').primary();
        table.integer('userid').nullable();
        table.integer('imageid').nullable();
        // add foreign keys:
        table.foreign('userid').references('users.id');
        table.foreign('imageid').references('images.id');
    })
    .then(console.log('userhistory schema built into DB!'))
    .catch(error => { console.log('cant build the userhistory schema!\n', error) })
}
I expect the table to be created with the userhistory.userid column pointing to the users.id column, and the userhistory.imageid column pointing to the images.id column. Instead, I receive this error:
Error: ER_CANNOT_ADD_FOREIGN: Cannot add foreign key constraint
code: 'ER_CANNOT_ADD_FOREIGN',
errno: 1215,
sqlMessage: 'Cannot add foreign key constraint',
sqlState: 'HY000',
index: 0,
sql: 'alter table `userhistory` add constraint `userhistory_userid_foreign` foreign key (`userid`) references `users` (`id`)'
The tables are created, but without the foreign keys where I would like them to be.
For MySQL the foreign key columns need to be defined as unsigned(), because increments() creates an unsigned integer column and MySQL requires the referencing column to have a matching type.
So your userhistory schema needs to be set up like this:
const userHistorySchema = () => {
    knex.schema.createTable('userhistory', table => {
        table.increments('id').primary();
        table.integer('userid').unsigned().nullable();
        table.integer('imageid').unsigned().nullable();
        // add foreign keys:
        table.foreign('userid').references('users.id');
        table.foreign('imageid').references('images.id');
    })
    .then(() => console.log('userhistory schema built into DB!'))
    .catch(error => { console.log('cant build the userhistory schema!\n', error) })
}
If you're creating tables with Knex, you should probably be using Knex's migration engine rather than rolling your own. However, your issue is that the two tables holding the referenced primary keys haven't been created yet when you try to create the linking table, because the createTable calls are neither chained nor awaited.
I would rewrite it into something like this instead:
exports.up = async function (knex) {
    await knex.schema.createTable('images', (table) => {
        table.integer('id').primary();
        table.string('name');
        table.string('src');
        table.string('alt');
        table.string('category');
        table.string('subCategory');
    });

    await knex.schema.createTable('users', table => {
        table.increments('id').primary();
        table.string('session', 64).nullable();
    });

    await knex.schema.createTable('user_history', table => {
        table.increments('id').primary();
        // unsigned() so the type matches the unsigned id column that increments() creates
        table.integer('userid').unsigned().nullable();
        table.integer('imageid').nullable();
        table.foreign('userid').references('users.id');
        table.foreign('imageid').references('images.id');
    });
};

exports.down = async function (knex) {
    await knex.schema.dropTable('user_history');
    await knex.schema.dropTable('images');
    await knex.schema.dropTable('users');
};
Note that instead of using callbacks, I have switched to async/await syntax which makes this appear more like synchronous code. This should be easier to understand since the callbacks appeared to be tripping you up. In this example, when Node tries to run await knex.schema.createTable, it will pause and wait for the table to get created instead of jumping to the next statement. This will make sure that your images and users tables exist.
In order to do it this way, you'll have to use Knex's migration engine. If you're unfamiliar with migrations, check out this documentation.
Run these commands inside your project directory (where knexfile.js is):
npm install -g knex
knex migrate:make <your_migration_name>
knex migrate:latest
You should see green console output saying that the migrations ran successfully.
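For completeness, those commands read their connection settings from knexfile.js in the project root. A minimal sketch for MySQL might look like this (the connection values and the migrations directory are placeholders, not taken from the question):

// knexfile.js (illustrative values only)
module.exports = {
    client: 'mysql',
    connection: {
        host: '127.0.0.1',
        user: 'root',
        password: 'secret',
        database: 'mydb'
    },
    migrations: {
        directory: './migrations'
    }
};

knex migrate:make creates a timestamped file in that directory, and knex migrate:latest applies all pending migrations in order.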
First of all, I'm absolutely new to Node.js and MongoDB.
I'm coding a back-end API with Node.js and MongoDB which will handle GET, POST and DELETE requests from the front end; quite simple stuff.
I'm stuck while working on the DELETE functionality.
Here is my posts.service.ts file, which contains this deletePost() function that sends the postId to the back-end app.js file.
deletePost(postId: string) {
    this.http.delete('http://localhost:3000/api/posts/' + postId)
        .subscribe(() => {
            console.log(postId);
            console.log('Deleted');
        });
}
I added this console.log(postId) to check whether it actually contains the real postId and found that it does. I have also matched it with the real id in MongoDB by checking through the mongo shell.
Here is the delete() function in the back end app.js file that should do the actual task.
app.delete("/api/posts/:id", (req, res, next) => {
    Post.deleteOne({ _id: req.params.id }).then(result => {
        console.log(result);
        res.status(200).json({ message: "Post deleted" });
    });
});
The console.log(result) line should print something in the terminal, but it does not, and the document is not deleted from the DB either.
I'm running this on an Ubuntu 16.04 LTS PC.
Any clue would be a great help. Thank you very much for your kind effort.
deleteOne doesn't return the deleted document. It deletes the first matching document and returns the number of documents deleted together with a boolean acknowledgement.
From the MongoDB docs for deleteOne:
Returns: A document containing:
- acknowledged: a boolean, true if the operation ran with write concern or false if write concern was disabled
- deletedCount: the number of deleted documents
From the mongoose docs
Deletes the first document that matches conditions from the
collection. Behaves like remove(), but deletes at most one document
regardless of the single option.
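So instead of expecting the deleted document back, you can check the result's deletedCount. A rough sketch of the route doing that (the 404 branch is illustrative and not part of the original code; the exact shape of result depends on the Mongoose/driver version, but deletedCount is available in both):

app.delete("/api/posts/:id", (req, res, next) => {
    Post.deleteOne({ _id: req.params.id })
        .then(result => {
            console.log(result); // e.g. { acknowledged: true, deletedCount: 1 }
            if (result.deletedCount === 0) {
                return res.status(404).json({ message: "Post not found" });
            }
            res.status(200).json({ message: "Post deleted" });
        })
        .catch(err => next(err));
});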
I was facing exactly the same issue. I solved it by returning the deleteOne promise and then using the promise's .then() method.
Something like this:
Model.js
...
bicicleSchema.statics.deleteById = function (id, cb) {
    return this.deleteOne({ code: id }, cb);
};
...
module.exports = mongoose.model('Bicicle', bicicleSchema);
Service.js
var Bicicle = require('../../../model/bicicle');
...
const id = req.params.id;
Bicicle.deleteById(id).then(() => {
    //your code
});
...
Dear ArangoDB community,
How mature is arangojs? When I tried "Tutorial: Node.js (io.js) in 10 minutes", exercises 1 through 4 worked as expected, but 5 through 10 failed. From the following exercise, I am getting
Database created: undefined
instead of
Database created: "mydb"
Thus the remaining exercises cannot continue, since the crucial object-bearing variable (mydb) is null. But, seeing that the "mydb" database is correctly created in ArangoDB, my question is really about the maturity of arangojs (ArangoDB's JavaScript driver). Or how do I fix this?
db.createDatabase('mydb', function (err, newdb) {
    if (err) {
        console.log('Failed to create database: %j', err.message);
    } else {
        console.log('Database created: %j', newdb.name);
        mydb = newdb;
    }
});
Thanks
The announced, updated node tutorial using the 4.x version of arangojs is available now.
Creating a new database changed to:
db.createDatabase('mydb').then(
    () => console.log('Database created'),
    err => console.error('Failed to create database:', err)
);
All asynchronous methods in the ArangoDB driver return promises but you can also pass a node-style callback instead:
db.createDatabase('mydb', function (err) {
    if (!err) console.log('Database created');
    else console.error('Failed to create database:', err);
});
The Node tutorial is based on version 3.x of the arangojs driver. The driver has recently been updated to version 4.x, which contains a number of breaking API changes.
The tutorial will soon be updated to reflect these changes. In the meantime you can follow the tutorial by installing version 3 explicitly:
npm install arangojs#3
I am currently building an API application with Sails.js that checks the status of and gets information (such as users) from various types of DBs (e.g. MongoDB, MySQL), depending on user input. Here is a snippet of the code I am working on. The localhost address is just the test database I am connecting to; in the future it will be supplied by the user.
var mp = require('mongodb-promise');
var MongoClient = require('mongodb');

mp.MongoClient.connect("mongodb://#localhost:27017/test")
    .then(function (db) {
        db.getUsers().then(function (users) {
            res.ok(users);
        })
    })
    .fail(function (err) {
        console.log(err);
    })
I am attempting to use promises for the async issue. The problem I am having is that it doesn't work: it tells me that Object [object Object] has no method 'getUsers'. I have searched and can't seem to find a solution that works.
If I change the function to the one below, I get some data back:
mp.MongoClient.connect("mongodb://#localhost:27017/IMS")
    .then(function (db) {
        db.stats().then(function (stats) {
            return res.ok(stats);
        })
    })
    .fail(function (err) {
        console.log(err);
        dbObject.vipUp = false;
    })
I am not sure what the issue is or how to solve it.
What you are doing here is using the node native driver methods to connect to and inspect the database. There is in fact "no such method" as .getUsers() in this API, or indeed in any other driver API.
The .getUsers() function is just a "shell helper" that is basically implemented like this:
function (args) {
    var cmdObj = { usersInfo: 1 };
    Object.extend(cmdObj, args);
    var res = this.runCommand(cmdObj);
    if (!res.ok) {
        var authSchemaIncompatibleCode = 69;
        if (res.code == authSchemaIncompatibleCode ||
            (res.code == null && res.errmsg == "no such cmd: usersInfo")) {
            // Working with 2.4 schema user data
            return this.system.users.find({}).toArray();
        }
        throw Error(res.errmsg);
    }
    return res.users;
}
So what you should be able to see here is that this normally wraps a "command" form, or otherwise falls back for compatibility with MongoDB 2.4 to querying the system.users collection on the current database.
Therefore, instead of calling a method that does not exist, you then need to use the .command() method instead:
mp.MongoClient.connect("mongodb://#localhost:27017/test")
    .then(function (db) {
        db.command({ "usersInfo": 1 }).then(function (users) {
            res.ok(users);
        })
    })
    .fail(function (err) {
        console.log(err);
    })
Or, in the case of connecting to a MongoDB 2.4 instance, fetch the documents from the .collection() instead:
mp.MongoClient.connect("mongodb://#localhost:27017/test")
    .then(function (db) {
        db.collection('system.users').find().toArray().then(function (users) {
            res.ok(users);
        })
    })
    .fail(function (err) {
        console.log(err);
    })
At any rate, you really should be establishing the database connection elsewhere in your application (or re-using the underlying driver connection from another store) and then calling methods on the already established connection. This is always preferable to creating a connection on every request for the information you want to retrieve.
Also, recent versions of the node native driver support promises right out of the box, so there may be no need to configure anything else, depending on how you intend to use it.
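As a rough sketch of that approach with the native driver's built-in promises (this uses the modern MongoClient API, which may differ from the mongodb-promise wrapper shown above; the URL and database name are placeholders):

const { MongoClient } = require('mongodb');

// Connect once at application startup and reuse the client afterwards
const client = new MongoClient('mongodb://localhost:27017');

async function listUsers() {
    await client.connect();
    const db = client.db('test');
    // Same "usersInfo" command form as above, via the driver's promise API
    const result = await db.command({ usersInfo: 1 });
    return result.users;
}

listUsers()
    .then(users => console.log(users))
    .catch(err => console.error(err));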