I have a Node.js app using the Sails.js framework and I'm trying to deploy it on the Bluemix cloud service.
I am using a MongoDB instance on compose.io and I have a fairly standard connection configuration in my local.js file:
connectMongo: {
adapter: 'sails-mongo',
host: 'sl-eu-lon-2-portal.1.dblayer.com',
port: 10438,
database: 'some-db'
}
It is not working; the deployment fails.
The error it gives is:
ERR error: A hook (`orm`) failed to load!
ERR error: Error: Failed to connect to MongoDB.
This means, of course, that the database connection is failing.
But strangely, it also gives this:
ERR { [MongoError: connect ECONNREFUSED 127.0.0.1:27017]
This doesn't make any sense, as I am not using port 27017; as noted above, I am using 10438.
The app runs fine locally, so I understand I'm missing something about connecting to the database via the Bluemix configuration, but I can't figure out where the 27017 comes from.
Your setup seems correct. Compare it with mine:
env/production.js
connections: {
prodMongoDb: {
adapter: 'sails-mongo',
host: process.env.MONGO_PORT_27017_TCP_ADDR,
port: 27017,
database: 'my_database'
}
},
models: {
connection: 'prodMongoDb',
migrate: 'safe'
}
env/development.js
connections: {
devMongoDb: {
adapter: 'sails-mongo',
host: 'localhost',
port: 27017,
database: 'my_database'
}
},
models: {
connection: 'devMongoDb',
migrate: 'safe'
}
The fact that you are specifying port 10438 but getting an error about 27017 (the MongoDB default port) means that Sails is not picking up your connection definition. How do you start your app?
Starting it like this:
NODE_ENV=production npm start
will make Sails pick up the production config.
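If it's still unclear which settings Sails actually loaded, one way to check is to log the resolved connection at startup. This is a minimal sketch, assuming a Sails 0.12-style config/bootstrap.js:
// config/bootstrap.js -- sketch: confirm which connection and settings were loaded
module.exports.bootstrap = function (cb) {
  sails.log.info('models.connection:', sails.config.models.connection);
  sails.log.info('connections:', sails.config.connections);
  cb();
};
If the log still shows a localhost:27017 connection, the environment-specific config file isn't being merged in.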
Related
I am trying to connect from Node.js on localhost to a MySQL instance running in Docker using docker-compose.
Node.js gives me this error: ENOTFOUND db. Full error message below.
[nodemon] restarting due to changes...
[nodemon] starting `node app.js`
Application Name: RESTFull API - Development
Environment: development
Server is listening on port 3000
getaddrinfo ENOTFOUND db # <------------ Error here
[nodemon] app crashed - waiting for file changes before starting...
Here is the docker-compose.yml that contains the MySQL and Adminer services.
## docker-compose.yml
version: '3.8'
services:
  db:
    image: mysql
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: root
      MYSQL_DATABASE: 'nodejs-restfull-api-development'
    expose:
      - 3306
    volumes:
      - db-config:/etc/mysql
      - db-data:/var/lib/mysql
  adminer:
    image: adminer:latest
    depends_on:
      - db
    environment:
      ADMINER_DEFAULT_DB_DRIVER: mysql
      ADMINER_DEFAULT_DB_HOST: db
      ADMINER_DESIGN: nette
      ADMINER_PLUGINS: tables-filter tinymce
    ports:
      - "8080:8080"
volumes:
  db-config:
  db-data:
Here is my node.js database connection config.
// assuming the mysql driver and the node-config package are in use
const mysql = require('mysql');
const config = require('config');

const database = mysql.createConnection({
host: 'db',
user: config.get('db.user'),
password: config.get('db.password'),
database: config.get('db.database'),
port: config.get('db.port'),
connectTimeout: config.get('db.connectTimeout')
});
database.connect(err => {
if (err) {
console.log(err.message);
process.exit(1);
} else {
console.log('Connected to database');
}
});
You don't tell us, but I assume your Node app is running on the host and not in a container? If that's the case, then you need to publish the MySQL port to the host (with ports:, not just expose:) so it's reachable. You also need to use localhost as the hostname in your configuration.
Publish the port by changing the db service in your docker-compose file to
db:
  image: mysql
  restart: always
  environment:
    MYSQL_ROOT_PASSWORD: root
    MYSQL_DATABASE: 'nodejs-restfull-api-development'
  ports:
    - 3306:3306
  volumes:
    - db-config:/etc/mysql
    - db-data:/var/lib/mysql
And change your Node configuration to
const database = mysql.createConnection({
host: 'localhost',
user: config.get('db.user'),
password: config.get('db.password'),
database: config.get('db.database'),
port: config.get('db.port'),
connectTimeout: config.get('db.connectTimeout')
});
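If you later run the Node app itself as a compose service instead, host: 'db' would be correct again, since compose services resolve each other by service name. One hedged way to support both setups is to read the host from the environment; the DB_* variable names below are assumptions, not part of the original config:
const mysql = require('mysql');

// Sketch: on the host leave DB_HOST unset (falls back to localhost);
// inside the compose network set DB_HOST=db.
const database = mysql.createConnection({
  host: process.env.DB_HOST || 'localhost',
  user: process.env.DB_USER || 'root',
  password: process.env.DB_PASSWORD || 'root',
  database: process.env.DB_NAME || 'nodejs-restfull-api-development',
  port: Number(process.env.DB_PORT) || 3306
});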
I keep getting this error:
Error: self signed certificate
When running this command in the terminal:
knex migrate:latest --env production
My knexfile.js
require('dotenv').config();
module.exports = {
development: {
client: "pg",
connection: {
host: "localhost",
database: "my-movies"
}
},
production: {
client: "pg",
connection: process.env.DATABASE_URL
}
};
My .env file:
DATABASE_URL=<my_database_url>?ssl=true
Heroku app info:
Addons: heroku-postgresql:hobby-dev
Auto Cert Mgmt: false
Dynos:
Git URL: https://git.heroku.com/path-name.git
Owner: xxxxxxxxx#xxxx.com
Region: us
Repo Size: 0 B
Slug Size: 0 B
Stack: heroku-18
Web URL: https://my-appname.herokuapp.com/
I've tried putting an ssl: true key/value pair in the production section of the knexfile and I get the same error. I've done it this way in the past many, many times and have never had this issue. I wonder if Heroku has changed anything, but while searching their docs I couldn't find anything.
The following config in knexfile.js worked for me.
...
production: {
client: 'postgresql',
connection: {
connectionString: process.env.DATABASE_URL,
ssl: { rejectUnauthorized: false }
}
}
...
where DATABASE_URL is what you get by running heroku config --app yourAppName
This is due to a breaking change in pg#^8 (2020/02/25); cf. this Heroku help forum thread.
You can read the full pg#^8 announcement, but here is the relevant passage:
Now we will use the default ssl options to tls.connect which includes rejectUnauthorized being enabled. This means your connection attempt may fail if you are using a self-signed cert.
And it seems heroku is using self-signed certificates somewhere.
Possible solutions:
downgrade to pg#^7
instruct pg#^8 to ignore problematic certificates with ssl: { rejectUnauthorized: false } (see the announcement linked above; sketched after this list)
find a way to download and trust the certificate (instructions)
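For the second option, the same override can also be passed straight to the pg driver; a minimal sketch, assuming pg#^8 and a DATABASE_URL env var:
const { Pool } = require('pg');

// Sketch: accept Heroku's self-signed certificate at the driver level
// (the same workaround as the knex ssl config shown above).
const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  ssl: { rejectUnauthorized: false }
});

pool.query('SELECT 1')
  .then(() => console.log('connected'))
  .catch(err => console.error(err.message));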
The ssl: { rejectUnauthorized: false } pg config isn't working for me at the moment either, but I found a temporary (maybe permanent) solution via the Heroku docs.
Set the following config var:
heroku config:set PGSSLMODE=no-verify
If you are using a config like:
...
production: {
client: 'postgresql',
connection: {
connectionString: process.env.DATABASE_URL,
ssl: { rejectUnauthorized: false }
}
}
...
...and it still isn't working for you, make sure you don't have ?ssl=true or sslmode set in your DB connection string.
If ssl is set in your connection string, it will override the ssl part of your config, meaning the behavior is equivalent to:
...
production: {
client: 'postgresql',
connection: {
connectionString: process.env.DATABASE_URL,
ssl: true
}
}
...
Removing the ssl entry from your connection string will fix the problem.
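For example (hypothetical URLs, just to illustrate the difference):
# overrides the ssl object in your knex config:
DATABASE_URL=postgres://user:pass@host:5432/dbname?ssl=true
# lets ssl: { rejectUnauthorized: false } take effect:
DATABASE_URL=postgres://user:pass@host:5432/dbname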
What worked for me was not just using a connection string, but also adding the CA certificate from my database as an option on the knex connection object.
production: {
client: 'postgresql',
connection: {
connectionString: process.env.DATABASE_URL,
ssl: {
rejectUnauthorized: false,
ca: process.env.POSTGRES_CA,
}
}
}
I can't connect to the MySQL database. I've googled some solutions, but none seems to work, or perhaps I haven't understood them. This is the setup:
let express = require("express");
let mysql = require("mysql");
let http = require("http");
let app = express();
http.createServer(app).listen(8000, function() {
console.log("Listening on http://localhost:" + port)
});
var connection = mysql.createConnection({
host: "127.0.0.1",
user: "root",
password: process.env.DB_PASSWORD,
database: "users",
port: 8000
});
connection.connect();
connection.query("SELECT * FROM user", function(err, rows, fields)
{
if (err) {
console.error("error connecting: " + err.stack);
return;
}
console.log(rows[0]);
});
connection.end();
I tried to set up the connection with "socketPath" and with another port, but both returned this error:
Error: connect ECONNREFUSED 127.0.0.1:3306
at Object.exports._errnoException (util.js:1034:11)
at exports._exceptionWithHostPort (util.js:1057:20)
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1099:14)
--------------------
at Protocol._enqueue (/vagrant/node_modules/mysql/lib/protocol/Protocol.js:141:48)
at Protocol.handshake (/vagrant/node_modules/mysql/lib/protocol/Protocol.js:52:41)
at Connection.connect (/vagrant/node_modules/mysql/lib/Connection.js:130:18)
at Object.<anonymous> (/vagrant/app.js:12:12)
at Module._compile (module.js:571:32)
at Object.Module._extensions..js (module.js:580:10)
at Module.load (module.js:488:32)
at tryModuleLoad (module.js:447:12)
at Function.Module._load (module.js:439:3)
at Module.runMain (module.js:605:10)
When the connection and the HTTP listener use the same port, it doesn't return any errors, but the connection doesn't seem to initialize.
console.log(connection.connect()); // -> undefined
I'm very new to using MySQL with Node, so I'm probably doing this all wrong, but I can't figure out what the problem is.
For Mac users using MAMP:
var con = mysql.createConnection({
host: "localhost",
user: "root",
password: "root",
database: "databasename",
socketPath: '/Applications/MAMP/tmp/mysql/mysql.sock'
});
socketPath points to MAMP's "mysql.sock" to allow the Node.js/MySQL connection.
NOTE: the above connection also solves "Error: connect ECONNREFUSED 127.0.0.1:3306".
I hope this helps someone.
Change your connection.connect() to
connection.connect(function(err) {
if (err) {
console.error('error connecting: ' + err.stack);
return;
}
console.log('connected as id ' + connection.threadId);
});
so you can see errors.
EDIT:
Check bind-address and port in the MySQL config file (/etc/mysql/my.cnf). If your MySQL server is running in the host environment and Node is running in a guest (any virtualization, like Docker), set the MySQL server to listen on your local IP (eth0, e.g. 192.168.0.x) and use the same address in your Node config.
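For example, a minimal sketch of the relevant lines in /etc/mysql/my.cnf (192.168.0.x is a placeholder for your host's LAN IP):
[mysqld]
# listen on the LAN interface so a guest VM or container can reach the server
bind-address = 192.168.0.x
port = 3306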
As it turns out, I was a bit stupid. I was running the application on a Vagrant server (a virtual server) and trying to connect to my local server. As soon as I tried to connect to the Vagrant server, everything worked properly. I also didn't have mysql-server properly installed on my Vagrant machine.
If you don't set the port, or set it to the MySQL default port, i.e. 3306, it works fine.
var connection = mysql.createConnection({
host: "127.0.0.1",
user: "mysqluser",
password: 'password',
database: "yourdatabase"
});
I don't know the exact cause (although I tried different ports) of why it doesn't work on other ports, but here is a link that shows how to use a different port number for connecting to MySQL.
FYI: You are setting your app to run on port 8000, and the MySQL port in your app is also 8000. You must change one of them.
You have to set socketPath, and before you run your script make sure MAMP is running.
var con = mysql.createConnection({
host: "localhost",
user: "root",
password: "root",
socketPath: '/Applications/MAMP/tmp/mysql/mysql.sock'
});
Remove the port key/value (port: 8000), and socketPath if you added it, from your SQL configuration and try again.
var connection = mysql.createConnection({
host: "127.0.0.1",
user: "root",
password: "root",
database: "users"
});
The Node server already occupies port 8000, so you can't assign the same port to a different process.
Second, MySQL is running on port 3306, so you can't connect to it through a different port.
You can use a different port for MySQL, but only via some proxy-pass mechanism or by running MySQL itself on that specific port.
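Putting that together, a minimal sketch of the corrected setup from the question (HTTP server on 8000, MySQL left on its default 3306):
const express = require('express');
const http = require('http');
const mysql = require('mysql');

const app = express();

// HTTP server keeps port 8000
http.createServer(app).listen(8000, function () {
  console.log('Listening on http://localhost:8000');
});

// MySQL connection: no port option, so the mysql module defaults to 3306
const connection = mysql.createConnection({
  host: '127.0.0.1',
  user: 'root',
  password: process.env.DB_PASSWORD,
  database: 'users'
});

connection.connect(function (err) {
  if (err) {
    console.error('error connecting: ' + err.stack);
    return;
  }
  console.log('connected as id ' + connection.threadId);
});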
Please follow these simple steps to get started (worked on a Windows machine):
Download and install MySQL from the following link: click here.
Configure the downloaded installer (I kept the default config parameters). Please check this video link for reference.
Make sure the MySQL server status is on.
Now you can check the connection status using the Node code below.
const mysql = require('mysql');
const connection = mysql.createConnection({
host: 'localhost',
user: 'root',
password: 'password',
port: 3306
});
connection.connect((err) => {
if (err) throw err;
console.log('Connected to MySQL Server');
})
Start your MySQL server from your XAMPP control panel.
I set up two Mosquitto brokers with WebSocket support and am able to connect to them with mqtt.js.
Now I'm trying to implement a fault-tolerant version with an array of possible MQTT brokers, which should be tried in order until a successful connection. If a connection fails, the next broker should be tried... so far so good, but if I try to connect to an offline broker, mqtt.js somehow tries to reconnect endlessly. I am not able to close the connection attempt and connect to the next one.
var client = mqtt.connect("ws://firstbrokerip:9001");
client.on('connect', function() {
//consoleLog("[BROWSER] MQTT js-Client:"," Connected","green");
client.subscribe("testchannel");
});
client.on('offline', function() {
//consoleLog("[BROWSER] MQTT js-Client:", ' Offline',"red");
client.end();
client = mqtt.connect("ws://secondbrokerip:9001");
});
Any ideas on how I can close the connection and connect to the next one?
(Please ignore the custom consoleLog function.)
You don't need to implement fail over, it's baked into the module:
From the mqtt.js doc (https://github.com/mqttjs/MQTT.js#connect)
You can also specify a servers options with content: [{ host: 'localhost', port: 1883 }, ... ], in that case that array is iterated at every connect.
So you pass the connect method an options object with a key called servers, which is an array of brokers to connect to.
client = mqtt.connect({
servers: [
{
host: 'firstbroker.ip',
port: 9001,
protocol: 'ws'
},
{
host: 'secondbroker.ip',
port: 9001,
protocol: 'ws'
}
]
});
I am attempting to connect my app, which is currently on localhost, to a RethinkDB server on AWS. I used the RethinkDB AMI to get the server configured and up and running. However, I keep getting a failed-to-connect message. I am using rethinkdbdash to attempt the connection (https://github.com/neumino/rethinkdbdash). The following is my connection code; there is no password on the db for right now. Does anyone know how to connect to it?
let r = rethinkdbdash({
db: 'test',
user: 'rethinkdb',
servers: [{host: 'my.aws.ip.addr', port: '28015'}]
});
I made a mistake in the connection syntax... the actual syntax should have been:
let r = rethinkdbdash(
{
host: 'ip.addr.here',
db: 'test',
});
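Once it connects, a quick sanity check (using the r instance from the snippet above and rethinkdbdash's promise-based API) is to run a trivial query:
// Sketch: list the databases to confirm the driver can reach the server.
r.dbList().run()
  .then(dbs => console.log('connected, databases:', dbs))
  .catch(err => console.error('still cannot connect:', err.message));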