I'm creating a SailsJS application, and I want users to log in only with Steam authentication. I used sails-generate-auth to create some boilerplate code with sails routes, but I'm having trouble plugging passport-steam into it.
https://github.com/kasperisager/sails-generate-auth
https://github.com/liamcurry/passport-steam
The reported error is:
C:\Users\Joe\testProject\node_modules\passport-steam\lib\passport-steam\strategy.js:67
id: result.response.players[0].steamid,
^
TypeError: Cannot read property 'steamid' of undefined
at steamapi.getPlayerSummaries.callback (C:\Users\Joe\testProject\node_modules\passport-steam\lib\passport-steam\strategy.js:67:43)
at IncomingMessage.<anonymous> (C:\Users\Joe\testProject\node_modules\passport-steam\node_modules\steam-web\lib\steam.js:218:7)
at IncomingMessage.emit (events.js:117:20)
at _stream_readable.js:944:16
at process._tickDomainCallback (node.js:486:13)
I have a feeling that this is caused by SteamWebAPI returning an empty response: {"response":{"players":[]}}, which is caused by a bogus SteamID in the request. The offending line is here in passport-steam: https://github.com/liamcurry/passport-steam/blob/master/lib/passport-steam/strategy.js#L53
Looking at the identifier parameter passed to getUserProfile, it appears to be the entire Sails request scope. If I hardcode a valid SteamID into that array, I get this error:
C:\Users\Joe\testProject\api\services\passport.js:98
return next(new Error('Neither a username nor email was available'));
^
TypeError: undefined is not a function
at Authenticator.passport.connect (C:\Users\Joe\testProject\api\services\passport.js:98:12)
at module.exports (C:\Users\Joe\testProject\api\services\protocols\openid.js:24:12)
at steamapi.getPlayerSummaries.callback (C:\Users\Joe\testProject\node_modules\passport-steam\lib\passport-steam\strategy.js:72:11)
at IncomingMessage.<anonymous> (C:\Users\Joe\testProject\node_modules\passport-steam\node_modules\steam-web\lib\steam.js:218:7)
at IncomingMessage.emit (events.js:117:20)
at _stream_readable.js:944:16
at process._tickDomainCallback (node.js:486:13)
I think that makes sense, since the Steam response has neither a username nor an email, but this is the profile: {"emails":[{}],"name":{}}
This is my passport configuration:
steam: {
  name: 'Steam',
  protocol: 'openid',
  strategy: require('passport-steam').Strategy,
  options: {
    returnURL: 'http://localhost:1337/auth/steam/callback',
    realm: 'http://localhost:1337/',
    apiKey: 'STEAM-API-KEY-REMOVED'
  }
}
I'm not sure whether there is something simple I'm missing or whether I need to write a ton of custom handling. Is my configuration correct?
This is caused by out-of-date source code on npm: even the latest published version, 0.1.4, does not contain the corrected code. Replacing strategy.js in passport-steam with the latest version from the GitHub repository fixes this error.
Also, a little custom handling needs to be added to passport.connect() in api\services\passport.js. The Steam profile does not have a username, but it does have an id (the user's SteamID) and a displayName, which can be used to set the user model properties, e.g.
// steam auth
if (profile.hasOwnProperty('displayName')) {
  user.username = profile.displayName;
}
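For context, here is a sketch of how that check can sit alongside the existing username/email handling inside passport.connect(); the surrounding lines are paraphrased from memory of the sails-generate-auth boilerplate, so treat the exact shape as an assumption:
// inside passport.connect() in api/services/passport.js (boilerplate paraphrased)
if (profile.hasOwnProperty('emails') && profile.emails[0] && profile.emails[0].value) {
  user.email = profile.emails[0].value;
}
if (profile.hasOwnProperty('username')) {
  user.username = profile.username;
}
// steam auth: fall back to displayName, since the Steam profile has no username
if (!user.username && profile.hasOwnProperty('displayName')) {
  user.username = profile.displayName;
}
// only bail out if we still have no way to identify the user
if (!user.username && !user.email) {
  return next(new Error('Neither a username nor email was available'));
}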
Here is the ticket where the problem was solved: https://github.com/liamcurry/passport-steam/issues/10
Issue:
Local development in VS Code runs without problems or crashes.
However, when I deploy the application to a server (Node.js), it only runs until a Socket.IO connection is established or the connected web page is refreshed, and then it crashes.
I don't know if this relates to the admin UI specifically, as in the file path below, or if this is a Socket.IO issue in general.
Perhaps it also has to do with my middleware, which holds up a request until an action has completed so that non-existent data cannot be accessed.
I already tried updating the npm packages, but nothing changed.
// middleware is executed on every request
var Map = require('./file')

module.exports = (io, socket) => {
  socket.use(async (packet, next) => {
    var key = socket.handshake.auth.key
    var data = Map.get(key)
    while (!data.action) { // true for finished or false for loading
      await new Promise(resolve => setTimeout(resolve, 10))
    }
    socket.data = data
    next()
  })
}
Server console:
/usr/src/app/node_modules/@socket.io/admin-ui/dist/index.js:233
socket.data._admin.transport = transport.name;
^
TypeError: Cannot set properties of undefined (setting 'transport')
at Socket.<anonymous> (/usr/src/app/node_modules/@socket.io/admin-ui/dist/index.js:233:42)
at Socket.emit (node:events:390:28)
at WebSocket.onPacket (/usr/src/app/node_modules/engine.io/build/socket.js:214:22)
at WebSocket.emit (node:events:390:28)
at WebSocket.onPacket (/usr/src/app/node_modules/engine.io/build/transport.js:92:14)
at WebSocket.onData (/usr/src/app/node_modules/engine.io/build/transport.js:101:14)
at WebSocket.<anonymous> (/usr/src/app/node_modules/engine.io/build/transports/websocket.js:20:19)
at WebSocket.emit (node:events:390:28)
at Receiver.receiverOnMessage (/usr/src/app/node_modules/ws/lib/websocket.js:1022:20)
at Receiver.emit (node:events:390:28)
Node.js v17.3.1
I also asked the developers directly on GitHub and received an answer.
I was told that the assignment socket.data = data overwrites the data the admin UI keeps on the socket (including socket.data._admin).
If I store my data on a different property, for example socket.socketData, then it works fine.
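For example, the middleware above can keep the admin UI's socket.data._admin entry intact by merging into socket.data instead of replacing it, or by using a dedicated property; this is only a sketch of that idea (the Object.assign variant is my own suggestion, not taken from the GitHub answer):
// middleware is executed on every request
var Map = require('./file')

module.exports = (io, socket) => {
  socket.use(async (packet, next) => {
    var key = socket.handshake.auth.key
    var data = Map.get(key)
    while (!data.action) { // true for finished or false for loading
      await new Promise(resolve => setTimeout(resolve, 10))
    }
    // merge instead of assigning, so socket.data._admin set by @socket.io/admin-ui survives
    Object.assign(socket.data, data)
    // ...or keep it on a separate property instead: socket.socketData = data
    next()
  })
}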
See the full answer on GitHub with the referenced code of the Socket.IO admin UI:
github socket.io-admin-ui issue-55
I am currently debugging some tests written with Jest over TypeScript and I'm having a bit of a headache.
If a test, or a tested class, runs Postgres SQL and there is an error in the query, I get the wrong stack trace, for example this:
error: invalid input syntax for type integer: ""0""
at Parser.parseErrorMessage (/Users/sklivvz/src/xxx/node_modules/pg-protocol/src/parser.ts:369:69)
at Parser.handlePacket (/Users/sklivvz/src/xxx/node_modules/pg-protocol/src/parser.ts:188:21)
at Parser.parse (/Users/sklivvz/src/xxx/node_modules/pg-protocol/src/parser.ts:103:30)
at Socket.<anonymous> (/Users/sklivvz/src/xxx/node_modules/pg-protocol/src/index.ts:7:48)
at Socket.emit (node:events:365:28)
at addChunk (node:internal/streams/readable:314:12)
at readableAddChunk (node:internal/streams/readable:289:9)
at Socket.Readable.push (node:internal/streams/readable:228:10)
at TCP.onStreamRead (node:internal/stream_base_commons:190:23)
The "error" line is very useful, however, the stack trace only tells me that the error was thrown by the pg-protocol driver. I would like to know which line within my code generated the error.
I am exactly 82.7% sure that this is due to PG's query being async.
It is incredibly time-consuming to step-debug or (gasp) console.log my way to each error, when showing the correct call stack would be all it takes to make this better.
Has anyone found a way of making this developer-friendly?
Check if this is related to brianc/node-postgres issue 2484
Is there a preferred package, extension, or method for providing more detail when you get a syntax error back from the parser?
(For instance, one that lists the line number and column of the error.)
For instance, right now:
error: syntax error at or near "as"
at Parser.parseErrorMessage (/home/collspec/projects/staff-portal/sprint-server/node_modules/pg-protocol/dist/parser.js:278:15)
desired behavior:
error: syntax error at or near "as", line 5, column 7
at Parser.parseErrorMessage (/home/collspec/projects/staff-portal/sprint-server/node_modules/pg-protocol/dist/parser.js:278:15)
Possible workaround from that issue:
There are a bunch of additional fields on Error objects populated by the driver.
If you log the error object you can see them. They correspond to the error fields returned by the server:
For example with the command:
SELECT foo
FROM bar
You can get an error like this:
{
  length: 102,
  severity: 'ERROR',
  code: '42P01',
  detail: undefined,
  hint: undefined,
  position: '17',
  internalPosition: undefined,
  internalQuery: undefined,
  where: undefined,
  schema: undefined,
  table: undefined,
  column: undefined,
  dataType: undefined,
  constraint: undefined,
  file: 'parse_relation.c',
  line: '1180',
  routine: 'parserOpenTable'
}
The field you want is position: it gives you the character offset of the error within the SQL. In this example the position value of "17" refers to the start of the bar token in the SQL.
It's not always populated, though, as that depends on what caused the error (generally just parse errors).
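Building on that, the position field can be translated into the line/column format asked for above; here is a minimal sketch, where describePgError is a hypothetical helper (not part of pg):
// turn the 1-based character offset in err.position into a line/column hint
function describePgError (sql, err) {
  if (!err.position) return err.message
  const offset = Number(err.position) - 1
  const before = sql.slice(0, offset)
  const line = before.split('\n').length
  const column = offset - before.lastIndexOf('\n')
  return err.message + ' (line ' + line + ', column ' + column + ')'
}

// usage sketch:
// pool.query(sql).catch(err => { throw new Error(describePgError(sql, err)) })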
I ran into a similar issue with aws-sdk for DynamoDB. This is the stack trace I usually get from aws-sdk.
ResourceNotFoundException: Requested resource not found
at Request.extractError (D:\workspaces\typescript-starters\console-app\node_modules\aws-sdk\lib\protocol\json.js:52:27)
at Request.callListeners (D:\workspaces\typescript-starters\console-app\node_modules\aws-sdk\lib\sequential_executor.js:106:20)
at Request.emit (D:\workspaces\typescript-starters\console-app\node_modules\aws-sdk\lib\sequential_executor.js:78:10)
at Request.emit (D:\workspaces\typescript-starters\console-app\node_modules\aws-sdk\lib\request.js:688:14)
at Request.transition (D:\workspaces\typescript-starters\console-app\node_modules\aws-sdk\lib\request.js:22:10)
at AcceptorStateMachine.runTo (D:\workspaces\typescript-starters\console-app\node_modules\aws-sdk\lib\state_machine.js:14:12)
at D:\workspaces\typescript-starters\console-app\node_modules\aws-sdk\lib\state_machine.js:26:10
at Request.<anonymous> (D:\workspaces\typescript-starters\console-app\node_modules\aws-sdk\lib\request.js:38:9)
at Request.<anonymous> (D:\workspaces\typescript-starters\console-app\node_modules\aws-sdk\lib\request.js:690:12)
at Request.callListeners (D:\workspaces\typescript-starters\console-app\node_modules\aws-sdk\lib\sequential_executor.js:116:18)
My workaround is simply to catch async errors and overwrite their stack traces with Error.captureStackTrace. Alternatively, you can append the Postgres stack trace or error message to your own errors.
async function getPersonFromDb (personId: string): Promise<DocumentClient.AttributeMap | undefined> {
  const result = await documentClient.get({ // similar to postgres.query()
    TableName: 'wrong-name',
    Key: { pk: personId, sk: personId }
  }).promise().catch(error => {
    Error.captureStackTrace(error)
    throw error
  })
  return result.Item
}

test('Get a person from DynamoDB', async () => {
  const person = await getPersonFromDb('hello')
  expect(person).not.toBeUndefined()
})
// ========= new stacktrace ========
Error: Requested resource not found
at D:\workspaces\typescript-starters\console-app\test\abc.test.ts:12:13
at processTicksAndRejections (internal/process/task_queues.js:93:5)
at getPersonFromDb (D:\workspaces\typescript-starters\console-app\test\abc.test.ts:8:20)
at Object.<anonymous> (D:\workspaces\typescript-starters\console-app\test\abc.test.ts:18:20) // my code, and where my error is thrown
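The same trick should carry over to pg queries if you route them through a small wrapper; this is a sketch (pool and queryWithStack are my own names, not from the question):
async function queryWithStack (pool, text, params) {
  try {
    return await pool.query(text, params)
  } catch (error) {
    // rewrite the stack so it points at this call site instead of pg-protocol internals
    Error.captureStackTrace(error)
    throw error
  }
}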
I am currently learning JavaScript/HTML/CSS in order to build a data dashboard.
I have found this tutorial https://d3-dashboard.cube.dev/setting-up-a-database-and-cube-js
Currently I am getting stuck at this part:
The next step is to create a Cube.js data schema.
When I open the Cube.js playground at http://localhost:4000, I get the following output in my terminal:
🦅 Dev environment available at http://localhost:4000
🚀 Cube.js server (0.21.1) is listening on 4000
Error: getaddrinfo ENOTFOUND <YOUR_DB_HOST_HERE>
at GetAddrInfoReqWrap.onlookup [as oncomplete] (dns.js:66:26)
And in the Cube.js playground web page view:
Error while loading DB schema
Error: getaddrinfo ENOTFOUND <YOUR_DB_HOST_HERE> at
GetAddrInfoReqWrap.onlookup [as oncomplete] (dns.js:66:26)
I have edited the following file:
d3-dashboard/node_modules/@cubejs-backend/server-core/core/index.js
with:
const checkEnvForPlaceholders = () => {
  const placeholderSubstr = '<YOUR_DB_';
  const credentials = [
    'CUBEJS_API_SECRET=SECRET',
    'CUBEJS_DB_TYPE=postgres',
    'CUBEJS_DB_NAME=ecom',
    'CUBEJS_WEB_SOCKETS=true'
    /*'CUBEJS_DB_HOST',*/
    /*'CUBEJS_DB_NAME',*/
    /*'CUBEJS_DB_USER',*/
    /*'CUBEJS_DB_PASS'*/
  ];
Any input on what I am doing wrong here?
I am totally new to apps and front-ends, so it might be something "stupid" that I am asking, but I really would like to learn from my mistakes :)
Thank you for your time and potential inputs/help!
Have a great day :)
You definitely should not edit any files in the node_modules directory. You should store the env variables in the .env file.
-your-cubejs-server-root
--schema
--.env
--//..
And it may look like
CUBEJS_DB_HOST=localhost
CUBEJS_DB_NAME=cubejs
CUBEJS_DB_USER=root
CUBEJS_DB_PASS=
CUBEJS_DB_TYPE=mysql
CUBEJS_API_SECRET=secret
The error you're getting says that the connection to the DB cannot be established, because you're missing a proper CUBEJS_DB_HOST= variable: the literal <YOUR_DB_HOST_HERE> placeholder is being used as the hostname.
The minimum required set of variables differs for each database and can be found here https://cube.dev/docs/connecting-to-the-database
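For the Postgres database used in that tutorial, the .env could look roughly like this (the host, user, and password values are placeholders you need to replace with your own; the remaining entries are taken from the question):
CUBEJS_DB_TYPE=postgres
CUBEJS_DB_HOST=localhost
CUBEJS_DB_NAME=ecom
CUBEJS_DB_USER=postgres
CUBEJS_DB_PASS=yourpassword
CUBEJS_API_SECRET=SECRET
CUBEJS_WEB_SOCKETS=true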
I'm following the tutorial step by step. When I get to the part where I run npx mikro-orm migration:create, I get this error:
TypeError [ERR_INVALID_ARG_TYPE]: The "key" argument must be of type string or an instance of Buffer, TypedArray, DataView, or KeyObject. Received null
at prepareSecretKey (internal/crypto/keys.js:322:11)
at new Hmac (internal/crypto/hash.js:113:9)
at Object.createHmac (crypto.js:147:10)
at createHMAC (C:\lireddit-server\node_modules\pg\lib\sasl.js:133:17)
at Hi (C:\lireddit-server\node_modules\pg\lib\sasl.js:137:13)
at Object.continueSession (C:\lireddit-server\node_modules\pg\lib\sasl.js:32:24)
at Client._handleAuthSASLContinue (C:\lireddit-server\node_modules\pg\lib\client.js:248:10)
at Connection.emit (events.js:314:20)
at Connection.EventEmitter.emit (domain.js:483:12)
at C:\lireddit-server\node_modules\pg\lib\connection.js:109:12
at Parser.parse (C:\lireddit-server\node_modules\pg-protocol\src\parser.ts:102:9)
at Socket.<anonymous> (C:\lireddit-server\node_modules\pg-protocol\src\index.ts:7:48)
at Socket.emit (events.js:314:20)
at Socket.EventEmitter.emit (domain.js:483:12)
at addChunk (_stream_readable.js:298:12)
at readableAddChunk (_stream_readable.js:273:9)
I can't find any solution on Google, and the tutorial doesn't explain how to log in to PostgreSQL from the app.
You are missing some configuration, most probably the user or password fields. Here is a related issue:
https://github.com/mikro-orm/mikro-orm/issues/866
If you do not provide them, MikroORM picks the defaults for the given driver, which are the postgres user and an empty password; your PostgreSQL installation apparently does not have an empty password for that user.
If you are using Docker to create the Postgres server, this is how you can make it accept empty passwords:
postgre:
  image: postgres:12.4
  ports:
    - 5432:5432
  environment:
    POSTGRES_HOST_AUTH_METHOD: trust # <-- here
It seems like you have not specified the password or user property of the object which you are passing to MikroORM.init.
This should work:
export default {
  entities: [Entity],
  dbName: "yourDatabaseName",
  type: "postgresql",
  user: "yourUserName",
  password: "yourPassword"
} as Parameters<typeof MikroORM.init>[0];
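For completeness, that exported config is then passed to MikroORM.init, roughly like this (a sketch; the ./mikro-orm.config import path and the migrator call are assumptions based on a typical MikroORM setup, not taken from the question):
import { MikroORM } from "@mikro-orm/core";
import mikroOrmConfig from "./mikro-orm.config";

const main = async () => {
  const orm = await MikroORM.init(mikroOrmConfig);
  await orm.getMigrator().up(); // run pending migrations
  // ... use orm.em for queries
};

main().catch(console.error);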
I am using the adal-node library (1.22) and trying to authenticate a user with a username and password. I am getting an "unable to get local issuer certificate" error. The user is federated and the error happens during realm discovery.
var context = new AuthenticationContext(authorityUrl);

context.acquireTokenWithUsernamePassword(resource, sampleParameters.username, sampleParameters.password, sampleParameters.clientId, function (err, tokenResponse) {
  if (err) {
    console.log('well that didn\'t work: ' + err.stack);
  } else {
    console.log(tokenResponse);
  }
});
The error stack:
Stack:
Error: unable to get local issuer certificate
at Error (native)
at TLSSocket.<anonymous> (_tls_wrap.js:1092:38)
at emitNone (events.js:86:13)
at TLSSocket.emit (events.js:185:7)
at TLSSocket._finishInit (_tls_wrap.js:610:8)
at TLSWrap.ssl.onhandshakedone (_tls_wrap.js:440:38)
{ Error: unable to get local issuer certificate
at Error (native)
at TLSSocket.<anonymous> (_tls_wrap.js:1092:38)
at emitNone (events.js:86:13)
at TLSSocket.emit (events.js:185:7)
at TLSSocket._finishInit (_tls_wrap.js:610:8)
at TLSWrap.ssl.onhandshakedone (_tls_wrap.js:440:38) code: 'UNABLE_TO_GET_ISSUER_CERT_LOCALLY' }
Wed, 14 Jun 2017 08:44:17 GMT:079c7b70-6ae1-461c-b433-cc3fe0c22783 - TokenRequest: VERBOSE: getTokenFunc returned with err
well that didn't work: Error: unable to get local issuer certificate
at Error (native)
at TLSSocket.<anonymous> (_tls_wrap.js:1092:38)
at emitNone (events.js:86:13)
at TLSSocket.emit (events.js:185:7)
at TLSSocket._finishInit (_tls_wrap.js:610:8)
at TLSWrap.ssl.onhandshakedone (_tls_wrap.js:440:38)
Can you please advise on what certificate I am missing and where to find it?
EDIT
After digging through the code, I found that commenting out the global agent.ca part of the code resolved this issue and the library was able to perform a few more steps, but it then had a problem returning the token response from ADFS.
The log:
Wed, 14 Jun 2017 10:39:39 GMT:425e3117-a495-4f8e-8a12-e7e64dd0e37b - OAuth2Client: INFO: Get TokenServer returned this correlationId: 425e3117-a495-4f8e-8a12-e7e64dd0e37b
Wed, 14 Jun 2017 10:39:39 GMT:425e3117-a495-4f8e-8a12-e7e64dd0e37b - OAuth2Client: ERROR: Get Token request returned http error: 401 and server response: {"error":"invalid_client","error_description":"AADSTS70002: The request body must contain the following parameter: 'client_secret or client_assertion'.\r\nTrace ID: aadf1560-18ec-46f9-83b6-5932c2131200\r\nCorrelation ID: 425e3117-a495-4f8e-8a12-e7e64dd0e37b\r\nTimestamp: 2017-06-14 10:39:41Z","error_codes":[70002],"timestamp":"2017-06-14 10:39:41Z","trace_id":"aadf1560-18ec-46f9-83b6-5932c2131200","correlation_id":"425e3117-a495-4f8e-8a12-e7e64dd0e37b"}
Is there any configuration that I forgot?
if (!parametersFile) {
  sampleParameters = {
    tenant: 'tenant.onmicrosoft.com',
    authorityHostUrl: 'https://login.microsoftonline.com',
    clientId: 'aa461028-1fgf-46e5-ab9b-5adca324febc',
    username: 'user@domain.net',
    password: 'lamepassword'
  };
}

var authorityUrl = sampleParameters.authorityHostUrl + '/' + sampleParameters.tenant;
var resource = '00000002-0000-0000-c000-000000000000';
The resource owner password flow is strongly discouraged, and in some cases, such as federated users or users that require MFA, it simply will not work. In this flow your application handles the user's username and password directly and sends them in the request to the identity provider. That approach breaks down whenever any extra interaction is required as part of authentication, such as a second factor or federation. For these reasons, and for the basic security principle of removing the need for the application to handle the username and password at all, it's better to avoid this flow.
Since you are dealing with federated users, the resource owner flow won't work for you, leaving you with the two preferred alternatives:
Use the authorization code flow if you want to authenticate as a user
Use the client credentials flow if you want to authenticate as an application.
See the "Web Application to Web API" scenario in the "Azure AD Authentication Scenarios" documentation for more information about choosing between these two options.