ER_ACCESS_DENIED_ERROR CloudSQL - javascript

I am receiving an error which looks like this (in my Functions log):

Access denied for user 'varun_admin'@'cloudsqlproxy~84.117.112.32' (using password: YES)
sqlMessage:
"Access denied for user 'varun_admin'@'cloudsqlproxy~84.117.112.32' (using password: YES)",
sqlState: '28000',
fatal: true }

(The IP 84.117.112.32 has been intentionally modified.)
I have double-checked my username and password; in fact, I made a request from MySQL Workbench and it went through fine.
This is how I am creating/initialising my SQL connection:
const mysql = require('mysql')
const config = require('./../../config.js')

const connectionName = config.DB_CONNECTION_NAME
console.log(`Connection name: ${config.DB_CONNECTION_NAME}`)

const configSQL = {
  host: config.DB_HOST,
  user: config.DB_USER,
  password: config.DB_PASSWORD,
  database: config.DB_DATABASE
}

// Connect to Cloud SQL over the unix socket in production
if (!process.env.dev) {
  configSQL.socketPath = `/cloudsql/${connectionName}`;
}

// SQL config
const pool = mysql.createPool(configSQL)

// Check on server startup whether the pool connects successfully
pool.getConnection((err, connection) => {
  if (err) {
    console.error('error connecting: ' + err)
    return
  }
  console.log('connected as id ' + connection.threadId)
  connection.release()
})
And the following function would typically be called to fetch data:
const getEverythingFromTable = tableName => {
  return new Promise((resolve, reject) => {
    pool.getConnection((error, connection) => {
      if (error) return reject(error)
      // Note: tableName is interpolated directly into the SQL, so it must
      // never come from user input (SQL injection risk)
      const query = `SELECT * FROM ${tableName}`
      connection.query(query, (err, response) => {
        // destroy() closes the socket; release() would return it to the pool
        connection.destroy()
        if (err) return reject(err)
        return resolve(response)
      })
    })
  })
}
Any idea what I could be doing wrong?
SQL logs (screenshot omitted)
Update 1:
These are the environment values I am passing to the Cloud SQL config (please refer to the code snippet above), and this is what my Cloud SQL configuration looks like in the UI (screenshots omitted). The Node.js code showing how I invoke the functions is above.

The error you are getting can be caused by an issue with your password or with the SSL encryption being used, as mentioned in the Verify how you connect section of the documentation.
I actually tried to reproduce the issue by changing my instance configuration to Allow only SSL connections, as suggested by the Enforcing SSL/TLS section of the documentation. However, it did not cause the issue for me.
This would not usually be a problem since, as mentioned in this post, connections from Cloud Functions are encrypted by default when you use the cloudsqlproxy, but I had to test it in case something had changed.
I also tried changing the configuration to restrict access to my instance even further. However, the only thing that made my connection fail was disabling the connection through the public IP and only allowing it through the private one, and that made the connection not even reach the instance.
Since you mentioned you are able to connect with the Workbench, I believe there are 2 possible causes for your issue:
1. There could be a problem with the encoding of some characters in your password that only gets messed up when reading it from the env variables. I suggest you try with a very basic password to see if you get the same result (see the sketch after this list).
2. There could be an issue with the encryption of the connection from the Cloud Function. If that is the case, this would be very specific to your project, and the best way to address it would be to open an issue on Google's Issue Tracker, or to open a support case if you have a support plan.
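For the first point, a quick way to check what the function actually receives is to log the raw bytes of the environment variable. A minimal sketch, assuming the password is exposed as DB_PASSWORD (adjust to whatever your config maps to):

const pw = process.env.DB_PASSWORD || ''
// Dump the length and hex bytes so stray encoding artifacts become visible.
// Remove this log once verified: it leaks the password into the logs.
console.log('length:', pw.length)
console.log('bytes :', Buffer.from(pw, 'utf8').toString('hex'))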
I hope this helps you.

Make sure that the Cloud SQL user varun_admin has permission to connect from host cloudsqlproxy~84.117.112.32. The host could also be %, but I would rather recommend permitting only what is required to connect (a single host). Also make sure to run FLUSH PRIVILEGES on MySQL, so that the account changes are applied instantly. Also see Configuring SSL/TLS. A sketch of how to verify the user's allowed hosts follows below.
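A minimal sketch of that check, run with an admin account through the same mysql module (the admin credentials below are placeholders, not values from the original post):

const mysql = require('mysql')

// Placeholder admin credentials -- adjust to your instance.
const admin = mysql.createConnection({
  host: '127.0.0.1',
  user: 'root',
  password: 'ADMIN_PASSWORD'
})

// List which hosts varun_admin may connect from; the proxy connects as
// cloudsqlproxy~<ip>, so the host column must cover that (or be %).
admin.query(
  "SELECT user, host FROM mysql.user WHERE user = 'varun_admin'",
  (err, rows) => {
    if (err) throw err
    console.table(rows)
    admin.end()
  }
)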

Related

TypeError: Cannot read property 'rows' of undefined

When following Getting started with Postgres in your React app, at the point where you process and export the getMerchants, createMerchant, and deleteMerchant functions, I am getting this error: TypeError: Cannot read property 'rows' of undefined. What is going on here? It is probably something very small I am missing. The error occurs at getMerchants, at resolve(results.rows).
const Pool = require('pg').Pool
const pool = new Pool({
  user: 'my_user',
  host: 'localhost',
  database: 'my_database',
  password: 'root',
  port: 5432,
});

const getMerchants = () => {
  return new Promise(function(resolve, reject) {
    pool.query('SELECT * FROM userIds ORDER BY id ASC', (error, results) => {
      if (error) {
        reject(error)
      }
      resolve(results.rows);
    })
  })
}

const createMerchant = (body) => {
  return new Promise(function(resolve, reject) {
    const { name, email } = body
    pool.query('INSERT INTO userIds (name, email) VALUES ($1, $2) RETURNING *', [name, email], (error, results) => {
      if (error) {
        reject(error)
      }
      resolve(`A new merchant has been added: ${results.rows[0]}`)
    })
  })
}

const deleteMerchant = () => {
  return new Promise(function(resolve, reject) {
    const id = parseInt(Request.params.id)
    pool.query('DELETE FROM userIds WHERE id = $1', [id], (error, results) => {
      if (error) {
        reject(error)
      }
      resolve(`Merchant deleted with ID: ${id}`)
    })
  })
}

module.exports = {
  getMerchants,
  createMerchant,
  deleteMerchant,
}
As per my comment: the second parameter for query is the query parameters, not a callback function. Pass in [] if you do not have parameters. Other than that, your code looks largely redundant. You should follow the proper async pattern the library offers, and not re-create all those useless promises:
const getMerchants = () => pool.query('SELECT * FROM userIds ORDER BY id ASC');
And then use it like this:
const { rows } = await getMerchants();
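For completeness, a minimal sketch of calling that from an async route handler, with the error handling the one-liner above leaves out (the Express route is an assumption, not part of the original answer):

// pool.query returns a promise when no callback is passed, so a
// try/catch replaces the hand-rolled Promise wrappers entirely.
app.get('/merchants', async (req, res) => {
  try {
    const { rows } = await getMerchants()
    res.status(200).json(rows)
  } catch (err) {
    res.status(500).send(err.message)
  }
})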
Right code, wrong password
The code itself is right! The error is thrown:
if you have not set a password at all, or
if you enter a wrong password. For example, I accidentally copied 'root' as the password from the guide, although I had set the password of my_user to 'postgres'. When everything else was already running and only the password was wrong, you get the same error.
The error in detail:
myuser@mypc:~/project/node-postgres$ node index.js
App running on port 3001.
/home/myuser/myproject/node-postgres/merchant_model.js:17
resolve(results.rows);
^
TypeError: Cannot read property 'rows' of undefined
at pool.query (/home/myuser/myproject/node-postgres/merchant_model.js:17:23)
at PendingItem.connect [as callback] (/home/myuser/myproject/node-postgres/node_modules/pg-pool/index.js:363:16)
at Client.client.connect [as _connectionCallback] (/home/myuser/myproject/node-postgres/node_modules/pg-pool/index.js:246:23)
at Client._handleErrorWhileConnecting (/home/myuser/myproject/node-postgres/node_modules/pg/lib/client.js:305:19)
at Client._handleErrorMessage (/home/myuser/myproject/node-postgres/node_modules/pg/lib/client.js:325:19)
at Connection.emit (events.js:198:13)
at parse (/home/myuser/myproject/node-postgres/node_modules/pg/lib/connection.js:114:12)
at Parser.parse (/home/myuser/myproject/node-postgres/node_modules/pg-protocol/dist/parser.js:40:17)
at Socket.stream.on (/home/myuser/myproject/node-postgres/node_modules/pg-protocol/dist/index.js:11:42)
at Socket.emit (events.js:198:13)
I found the solution by following the guide at Getting started with Postgres in your React app: an end-to-end example, where you can also read more about the code used here. For digging deeper, How to quickly build an API using Node.js & PostgreSQL may help as well. To test this, you need to install PostgreSQL (or use a web server, but then you will need SSL; see How to create a pool object in javascript that can connect to a database backend using SSL?) and create a "Merchant" table in the database as is done in the guide.
As the files of the tutorial do not have any errors, the problem in the question most probably happens (as in my case) because you have not fully followed the guide's steps for setting up the database role and password. You can check whether your settings are right by running one of the following, depending on whether you chose the guide's my_user or the standard postgres user to connect (if the latter, change the pool settings in merchant_model.js to the postgres user):
psql -d my_database -U my_user
or:
psql -d my_database -U postgres
It should ask for a password:
Password for user postgres/my_user:
If your password attempts are not accepted (the standard test password is often postgres, and the guide uses root), you probably do not have any password set at all. That was the case for me: a fresh install of PostgreSQL gives the superuser postgres no password, and neither does any other role you create have one! It just had not caught my attention because I always used
sudo su postgres
to connect, which only asks for the password of your Linux user, not that of the postgres role.
Thus: you must use a postgres (or my_user) role that has a password. Follow Getting error: Peer authentication failed for user "postgres", when trying to get pgsql working with rails to give postgres a password (for example, ALTER USER postgres WITH PASSWORD '...' at a psql prompt), and if you want to use another user like my_user, give it a password as well.
Only then will the connection succeed, and the error will disappear.
Outlook
When you have added the two files from the repository:
merchant_model.js (the one from the guide, or the compact version above)
index.js (from the guide)
to the "node-postgres" directory, you can run it from the project directory with node index.js.
If that runs, you should see
App running on port 3001.
in the command prompt, with the command still running. Then open a browser and enter localhost:3001 to see the backend (screenshot omitted).
When you then follow the guide and add the files from the repository:
App.js
index.js
to the "react-postgres\src" directory, you can run:
npm start
to see the frontend and try the Add button (screenshots omitted).
PS: Bad coding? Probably not!
The first answer is wrong in saying that the alleged "bad coding" is the cause of the error. You can use full copies of the guide's GitHub repository and everything will work, since the error ultimately comes from something else. The code above may still be bad coding; at least, count null entries in database column in a RESTful way also uses the suggested await. I doubt it is that bad, since it works well, though when I later ran the code without the Promises on react-postgres, I vaguely remember getting some sort of message that you should use error handlers. Still, for anyone interested, here is the first answer's advice put into practice. It works, and the "rows" error disappears, but only because the "rows" are no longer in the code.
merchant_model.js:
const Pool = require('pg').Pool

const pool = new Pool({
  user: 'my_user',
  host: 'localhost',
  database: 'my_database',
  password: 'postgres',
  port: 5432,
});

const getMerchants = () => pool.query('SELECT * FROM Merchants ORDER BY id ASC')

const createMerchant = (body) => {
  const { name, email } = body
  // return the promise so callers can await the result
  return pool.query('INSERT INTO Merchants (name, email) VALUES ($1, $2) RETURNING *', [name, email])
}

const deleteMerchant = (request) => {
  // request must be passed in by the route handler
  const id = parseInt(request.params.id)
  return pool.query('DELETE FROM Merchants WHERE ID = $1', [id])
}

module.exports = {
  getMerchants,
  createMerchant,
  deleteMerchant,
}
Whether you use the code with or without Promises is up to you; perhaps one should rather stick to the guide and use it with Promises.

Node.js SQL How to Run Query that Spans Over Two Databases

I am building an application whose backend uses SQL queries to get data from a SQL Server database. However, I need to write a query that truncates and repopulates a table in that database using data from a second database. Here is what my code looks like:
// establishes a connection to serverName and uses DB1 as the database. But how can you access two?
global.config = {
  user: 'username',
  password: 'password',
  server: 'serverName',
  database: 'DB1'
};

// run this query. It's already been tested in SQL Server and works fine there
let query = "TRUNCATE TABLE [DB1].[dbo].[Shop]; INSERT INTO [DB1].[dbo].[Shop] (Shop, shopDescription, Address, City)" +
  " SELECT Shop, Description, Address, City FROM [DB2].[dbo].[ShopTable]"

new sql.ConnectionPool(config).connect().then(pool => {
  return pool.request().query(query)
}).then(result => {
  console.log(result.recordset)
  // result returns as "undefined"
  res.setHeader('Access-Control-Allow-Origin', '*')
  res.status(200);
  sql.close();
}).catch(err => { // error is not thrown
  res.status(500).send({ message: err })
  sql.close();
});
I get an "undefined" result and find that no update to the table was made. It isn't clear whether the query can't reach the table in DB2, or whether the command simply doesn't work with the Node.js mssql package.
It turned out this was a front-end to back-end connection issue. I was able to get the query to work without any changes to the database configuration, so the second database can be accessed without any issues.
component.ts
this.httpService.patch('update', {} ).subscribe()
index.js
app.patch('/api/update', controllers.manualQueries.update);
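For context, a minimal sketch of what the controller behind that route might look like (controllers.manualQueries.update is only named above; its body here is an assumption, reusing the query from the question):

const sql = require('mssql')

// Hypothetical controller for app.patch('/api/update', ...): runs the
// cross-database query and reports success or failure to the caller.
async function update(req, res) {
  try {
    const pool = await new sql.ConnectionPool(global.config).connect()
    const result = await pool.request().query(query) // query as defined above
    res.setHeader('Access-Control-Allow-Origin', '*')
    res.status(200).json({ rowsAffected: result.rowsAffected })
  } catch (err) {
    res.status(500).send({ message: err })
  } finally {
    sql.close()
  }
}

module.exports = { update }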

authentication conflict between sessions in Express server

I am playing around with this library and I am experiencing an annoying scenario which I believe comes from some sort of conflict in cookie or header authentication.
When I log in to one account, everything works great. But when I then try to log in to another account, it simply ignores the new data provided, moves through authentication with the old data, and connects to the old account, no matter whether the email or the password even exist (I tried with fake data as well).
The library doesn't have a proper logout method, which makes sense: you don't really need one when you run it simply using node on your machine, with no server involved and no cookies or any kind of data in memory. Everything works great and I can log in to as many accounts as I want.
The problem is when running it on an Express server.
CODE:
// api.js
const OKCupid = require("./okc_lib");
const Promise = require("bluebird");
const okc = Promise.promisifyAll(new OKCupid());

async function start(req, res, next) {
  const {
    body: { username, password }
  } = req;
  try {
    await okc.loginAsync(username, password);
    // searchOpt is defined elsewhere in the original code
    okc.search(searchOpt, (err, resp, body) => {
      if (err) return console.log({ err });
      const results = body.data;
      // do something with results
      return res.status(200).json({ message: "OK" });
    });
  } catch (error) {
    res.status(400).json({ message: "Something went wrong", error });
  }
}

module.exports = { start };
// routes.js
const express = require("express");
const router = express.Router();
const { start, login } = require("../actions/okc");

router.post("/login", login);
router.post("/start", start);

module.exports = router;
So posting to url/login works fine the first time. But when you try to do it again with a different username and password, it simply goes through, ignores the new data, and connects to the old account.
As part of my investigation I looked at the source code of the library and found a method clearOAuthToken, which clears the token from the header. However, it didn't really do anything. So I tried removing the jar initialisation from the requester helper, and that was the only thing that let me move on and log in to another account. BUT that was only for experimenting and can't be a solution, as you do need those cookies for other parts of the library. It was only proof that the problem isn't in the headers but in the cookies.
Any idea how I can "reset" the state of the server between each call?
"when trying to login to another account, it simply ignore the new data provided and move through the authentication with the old data and connecting to the old account."
As OP mentioned in the comment, this is not an authorization header issue, but a cookie issue.
To implement the logout interface, you can manually clear the cookies:
OKCupid.prototype.logout = function () {
  request = request.defaults({ jar: request.jar() }); // reset the cookie jar
  headers.clearOAuthToken(); // just in case
};
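A sketch of how this might be used from the Express handler, so each login starts from a clean state (calling the patched logout before loginAsync is an assumption, not part of the original answer):

async function start(req, res) {
  const { username, password } = req.body;
  try {
    okc.logout();                             // drop old cookies and token
    await okc.loginAsync(username, password); // authenticate fresh credentials
    res.status(200).json({ message: "OK" });
  } catch (error) {
    res.status(400).json({ message: "Something went wrong", error });
  }
}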

mongoose connection can't handle errors

I am building my first test REST API with Mongo and Node.
I am opening a connection to the database, and it works, but I can't handle the error case: even if I write a wrong URI, it reports a successful connection. I tried promises, callbacks, and events, but nothing works.
For example:
const mongoose = require('mongoose');
mongoose.Promise = global.Promise;
const express = require('express');
const bodyParser = require('body-parser');

const portApp = 1300;
const app = express();

app.listen(portApp, 'localhost', () => {
  console.log(`server works fine at ${portApp}`);
  mongoose.connect('mongodb://localhost:27017/RIGHTdbname')
    .then((res) => {
      console.log(`successful connection to BBDD`);
      // console.log(res);
    })
    .catch((error) => {
      console.log("error" + error.message);
    });
});
That's OK; it logs "successful connection to BBDD". The problem is that when I write a wrong database name, it logs the same!
I tried with a callback too, as suggested here:
mongoose.connect('mongodb://localhost:27017/WRONGdbname', function (err) {
  if (err) {
    throw err;
  }
});
And I tried using these events (taken from here; I actually don't understand them, having only used the jQuery .on() method in the past, for event delegation tasks), but that doesn't work either, because the "connected" event always fires, even if the database name is wrong.
// When successfully connected
mongoose.connection.on('connected', function () {
  console.log('Mongoose default connection opened');
});

// If the connection throws an error
mongoose.connection.on('error', function (err) {
  console.log('Mongoose default connection error: ' + err);
});
Can someone explain what I'm doing wrong? Thanks
The "database" in the Mongo connection string is used for authentication, and is only relevant if you pass the username and password in the URL using the mongodb://user:pass#host:port/database syntax.
From the reference
/database Optional. The name of the database to authenticate if the connection string includes authentication credentials in the form of username:password#. If /database is not specified and the connection string includes credentials, the driver will authenticate to the admin database.
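In other words, a wrong database name alone can never fail the connection, because MongoDB creates databases lazily on first use; only bad credentials or an unreachable host can. A minimal sketch, assuming a mongod with authentication enabled and a hypothetical user appUser defined in RIGHTdbname:

const mongoose = require('mongoose')

// Rejected: appUser is not defined in WRONGdbname, so authenticating
// against that database fails -- unlike the credential-less connect
// above, which succeeds for any database name.
mongoose.connect('mongodb://appUser:secret@localhost:27017/WRONGdbname')
  .then(() => console.log('connected'))
  .catch(err => console.log('connection error: ' + err.message))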

How to build local node.js mail server

I have spent a couple of days implementing my own mail server using node.js. I used modules like "smtp-server" for creating the SMTP server and "smtp-connection" to connect and send mail to it. But I'm getting confused because I don't know how to send mail from my SMTP server to providers' SMTP servers like Google or Yahoo.
Can anyone help me?
Here is my code for more information:
My index.js file:
var SMTPServer = require('smtp-server').SMTPServer;
var port = 9025;

var serverOptions = {
  name: "smtp-interceptor",
  onConnect: onConnect,
  onAuth: onAuth,
  onData: onData
};

var server = new SMTPServer(serverOptions);

server.listen(port, 'localhost', function () {
  console.log('SMTP server is listening on port ' + port);
});

function onConnect(session, callback) {
  console.log('Connected');
  return callback(); // Accept the connection
}

function onData(stream, session, callback) {
  stream.pipe(process.stdout); // print message to console
  console.log('Session \n', session.envelope);
  stream.on('end', callback);
}

function onAuth(auth, session, callback) {
  if (auth.username !== 'Mahan' || auth.password !== 'Tafreshi') {
    return callback(new Error('Invalid username or password'));
  }
  callback(null, { user: 123 }); // where 123 is the user id or similar property
}
And my connection.js file:
var SMTPConnection = require('smtp-connection');
process.env.NODE_TLS_REJECT_UNAUTHORIZED = "0";

var connection = new SMTPConnection({
  host: 'localhost',
  port: 9025,
  secure: false
});

connection.connect(function () {
  console.log('Connected to SMTP server');
  var auth = {
    user: 'Mahan',
    pass: 'Tafreshi'
  };
  connection.login(auth, function (err) {
    if (err)
      return console.log('Login Failed \n', err);
    console.log('Login Successful');
    var envelope = {
      from: "testapp@testapplocal.com",
      to: "mahantp19@gmail.com"
    };
    var message = 'Test message1';
    connection.send(envelope, message, function (err, info) {
      if (err)
        return console.log('Error : ' + err);
      console.log('Message sent');
      console.log('Accepted : ' + info.accepted);
      console.log('Rejected : ' + info.rejected);
      console.log(info.response);
      connection.quit();
      console.log('Quit connection');
      connection.close();
    });
  });
});
There are many checks an email must pass before it's accepted by most mail providers. These checks attempt to validate that the server sending the message is authorized to send on behalf of the sender.
I.e., my server can send an email claiming it's from "someone-special@somewhere-important.com", but that doesn't mean I'm anywhere important.
While you may have had success sending mail from an SMTP server in the past using another technology such as PHP or an Exchange Server, the rules have changed significantly; Gmail only began full enforcement this year.
I would assume your current issue has less to do with Node than with recent changes by the big providers.
Some of the checks that are needed include:
DKIM keys (DNS record)
SPF record (DNS record)
DMARC has been set up
A dedicated IP address for the server
Your server's IP not being blacklisted
The content of your email passing their filters
Not making an email sent from your server appear to come from a visitor or customer
Among many others.
Any domain you want to "act as sender" for must have these in place before most of the major providers will accept your message.
Google has a great set of tools and walkthroughs for getting an IP/domain set up.
Visit the Google MX record checker and enter the domain/subdomain you want to use as sender; it will tell you everything that is failing as well as passing.
Alternative Solutions
I use sendgrid.com. They have a node library that makes sending mail very easy. They also give me the ability to proxy messages via SMTP, which means you can keep the standard delivery methods and just swap out "localhost" for a hostname they provide (see the sketch after this list). However, if this is a new setup, go for the API.
Whoever is hosting your email should offer the ability to send messages via SMTP or an API.
An endless supply of other providers is out there, most of which allow low-volume senders to send for FREE.
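A minimal sketch of that swap, reusing the question's smtp-connection setup (the relay hostname and credentials below are placeholders, not a real provider's values):

var SMTPConnection = require('smtp-connection');

// Point the existing client at the provider's relay instead of the
// local server; the login()/send() flow from the question stays the same.
var connection = new SMTPConnection({
  host: 'smtp.example-provider.com', // placeholder: hostname from your provider
  port: 587,                         // common submission port
  secure: false                      // connection is upgraded via STARTTLS
});

var auth = {
  user: 'apikey-or-username',        // placeholder credentials
  pass: 'secret'
};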
Word of warning
I tried for a few years to keep up with all the changes, and I inevitably kept hitting barriers of blocked messages, with no way of knowing until someone did not get an email. If you're sending low volume, you should be able to use third parties without paying for it. If you are sending high volume, the cost of their service is cheap compared to the endless issues you will encounter even once you get things initially rolling.
PS: I have no affiliation with any email provider or sender. I pay them too.
