I followed the authorized-https-endpoint sample and only added a console.log to print req.cookies. The problem is that the cookies are always empty: {}. I set the cookies using client-side JS calls and they do save, but for some reason I can't get them on the server side.
Here is the full code of index.js; it's exactly the same as the sample:
'use strict';
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);
const express = require('express');
const cookieParser = require('cookie-parser')();
const cors = require('cors')({origin: true});
const app = express();
const validateFirebaseIdToken = (req, res, next) => {
  console.log(req.cookies); // <----- issue: this is always empty {}, why?
  next();
};
app.use(cors);
app.use(cookieParser);
app.use(validateFirebaseIdToken);
app.get('/hello', (req, res) => {
  res.send(`Hello!!`);
});
exports.app = functions.https.onRequest(app);
This stores the cookie:
curl http://FUNCTION_URL/hello --cookie "__session=bar" // req.cookies = {__session: 'bar'}
This doesn't:
curl http://FUNCTION_URL/hello --cookie "foo=bar" // req.cookies = {}
If you are using Firebase Hosting + Cloud Functions, __session is the only cookie you can store, by design. This is necessary for us to be able to efficiently cache content on the CDN -- we strip all cookies from the request other than __session. This should be documented but doesn't appear to be (oops!). We'll update documentation to reflect this limitation.
Also, you need to set the Cache-Control header to private:
res.setHeader('Cache-Control', 'private');
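Putting the two constraints together, a minimal sketch of a handler that works behind Hosting might look like this (assuming the cookie-parser middleware from the sample is in place):
app.get('/hello', (req, res) => {
  // Responses that vary per user must not be cached on the CDN.
  res.setHeader('Cache-Control', 'private');
  // __session is the only cookie Hosting forwards to the function.
  const session = req.cookies.__session;
  res.send(`Hello! __session = ${session || 'not set'}`);
});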
Wow, this cost me two days of debugging. It is documented (under Hosting > Serve dynamic content and host microservices > Manage cache behavior), but not in a place that I found useful -- it's at the very bottom, under "Using Cookies". The sample code on Manage Session Cookies uses the cookie name session instead of __session, which, in my case, is what caused this problem.
I'm not sure if this is specific to Express.js served via Cloud Functions, but that was my use case. The most frustrating part was that when testing locally with firebase serve, caching doesn't factor in, so everything worked just fine.
Instead of reading req.cookies, use req.headers.cookie. You will have to handle the cookie string manually, but at least you don't need the Express cookie parser, if pulling that in is a problem for you.
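As a rough sketch (hand-rolled, so it won't handle quoted or encoded values robustly):
// Sketch: split the raw Cookie header into a name/value map.
const parseCookies = (cookieHeader) => {
  const cookies = {};
  (cookieHeader || '').split(';').forEach((pair) => {
    const idx = pair.indexOf('=');
    if (idx > -1) {
      cookies[pair.slice(0, idx).trim()] = decodeURIComponent(pair.slice(idx + 1).trim());
    }
  });
  return cookies;
};
// Usage inside a handler: const cookies = parseCookies(req.headers.cookie);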
Is the above answer and naming convention still valid? I can't seem to pass any cookie, including a session cookie named "__session", to a cloud function.
I set up a simple test function, with the proper firebase rewrite rules:
export const test = functions.https.onRequest((request, response) => {
  if (request.cookies) {
    response.status(200).send(`cookies: ${request.cookies}`);
  } else {
    response.status(200).send('no cookies');
  }
});
The function gets called every time I access https://www.xxxcustomdomainxxx.com/test, but request.cookies is always undefined and thus 'no cookies' is returned.
For example, the following always returns 'no cookies':
curl https://www.xxxcustomdomainxxx.com/test --cookie "__session=testing"
I get the same behavior using the browser, even after verifying a session cookie named __session was properly set via my authentication endpoint. Further, the link cited above (https://firebase.google.com/docs/hosting/functions#using_cookies) no longer specifies anything about cookies or naming conventions.
I'm making a web app. My back end uses Node.js, Express.js, and specifically, it uses the module express-session to create a session object for session-based authentication in order to persist a user login. I understand that when I use express-session to create a session, a cookie with the session ID is created on the back end and sent to the browser on the front end. I have verified that this cookie sends seamlessly when I use my browser and visit the page the Express app is hosted on (in my case, localhost:3001).
// My Express app's index.js file
// This code seamlessly sends a session ID cookie to the browser when I visit http://localhost:3001
const express = require('express');
const session = require('express-session');
const mongoDbStore = require('connect-mongodb-session')(session);
const app = express();
const store = new mongoDbStore({
  uri: 'mongodb://localhost:27017/GameDB',
  collection: 'UserSessions'
});
store.on('error', (err) => {
  console.log('Something exploded: ' + err);
});
app.use(session({
  secret: 'I am stuck, help me please!',
  store: store,
  saveUninitialized: true,
  resave: false,
}));
app.listen(3001, () => {
  console.log('server started on 3001');
});
And I get my cookie just fine, as I can verify in my Chrome developer tools.
However, I want my front end to be a separate app (I'm using React), and I want to use API requests from my front end to access everything on my back end. As the front end and back end are separate apps, they will need to run on different ports. My front end will run on port 3000 and my back end will continue to run on port 3001. And with that in mind, I'll be running localhost:3000 in my browser instead of localhost:3001.
The only problem is, with these changes, the cookie made by express-session no longer gets sent to my browser, even when I do an HTTP POST (via JavaScript fetch()) from my front end to my back end and get a valid response back.
In a nutshell, my question is: how can I have my express-session session ID cookie saved to my browser when I'm using a separate app for the front end?
Here's my front end API request:
fetch('http://localhost:3001/gimmecookie', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Accept': 'application/json',
  },
  body: JSON.stringify({ data: "data" })
})
  .then(response => response.json())
  .then(response => {
    console.log(response)
  })
  .catch((error) => {
    console.error(error);
  });
And here's the slightly-modified-from-before index.js file for my Express app:
// My Express app's index.js file that *should* create and
// send a session-id cookie to my React project
const express = require('express');
const cors = require('cors');
const session = require('express-session');
const mongoDbStore = require('connect-mongodb-session')(session);
const app = express();
const corsOptions = {
  origin: 'http://localhost:3000',
}
const store = new mongoDbStore({
  uri: 'mongodb://localhost:27017/GameDB',
  collection: 'UserSessions'
});
store.on('error', (err) => {
  console.log('Something exploded: ' + err);
});
app.use(cors(corsOptions));
app.use(session({
  secret: 'I am stuck, help me please!',
  store: store,
  saveUninitialized: true,
  resave: false,
}));
app.post('/gimmecookie', (req, res) => {
  res.json({ sendsome: "data" })
});
app.listen(3001, () => {
  console.log('server started on 3001');
});
Right now, I'm using a hack to send the cookie (creating a cookie in JS and sending it manually), but the hack is getting more and more tiresome as my project gets bigger. What do I need to add to this code to have the cookie send like when I was using just one app? Is express-session just not designed for this? (It seems like it would be, I know it is extremely common to have a separate app for both front end and back end.)
Should I expect the cookie to send automatically if the front end and back end have any sort of handshake? Do I need to mess with the cookie.domain or cookie.sameSite attributes in the express-session initialization object? Do I need to mess with CORS or the fetch() Accept header? Is res.json() the wrong method to use? Is it easier to deal with a real IP and not localhost? I've tried a bunch of things, but no matter what I do, I can't get that blasted express-session generated session ID cookie in my browser.
It turns out that the problem was with the front end, not the back end, and it was a CORS issue. The express-session code was making the cookie just fine, but the front end couldn't accept it because having the back end hosted on port 3001 made it a cross-origin request (you'll recall the front end was on port 3000), even though both the front and back ends were on the same machine, localhost.
All I had to do was use the proxy field in my package.json file in my React project, like so:
...
},
"proxy": "http://localhost:3001"
}
With the proxy change, I also had to change the fetch() in my React code, from this:
fetch('http://localhost:3001/gimmecookie', {
...
to this:
fetch('/gimmecookie', {
...
as my proxy field was tricking my React project into thinking my back end on a different port was actually on the same origin.
Side note: once I realized this was a front end CORS issue, I toyed around with some other solutions (such as using credentials: 'include' in the fetch() init object), but it quickly became apparent that these solutions had significant drawbacks, until I found the proxy solution. Flippin' CORS!
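For completeness, the credentials-based alternative mentioned above would look roughly like this; this is a sketch only, since both sides must opt in, and in modern browsers the session cookie may also need sameSite/secure tweaks:
// Backend sketch: allow credentialed cross-origin requests from the React dev server.
app.use(cors({
  origin: 'http://localhost:3000',
  credentials: true, // emits Access-Control-Allow-Credentials: true
}));
// Frontend sketch: tell fetch() to send and accept cookies cross-origin.
fetch('http://localhost:3001/gimmecookie', {
  method: 'POST',
  credentials: 'include',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ data: 'data' })
});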
I am trying to fetch a food item by its key. In Postman the API works fine, but on the frontend there is no response.
backend code
app.get('/foods/:key', (req, res) => {
  foodsCollection.find({ key: req.params.key }).toArray((err, documents) => {
    res.send(documents[0])
  })
})
frontend code
// inside a function component; useParams comes from react-router-dom
const { key } = useParams()
const [foodById, setFoodById] = useState({})
useEffect(() => {
  fetch(`http://localhost:5000/foods/${key}`)
    .then((res) => res.json())
    .then((data) => {
      setFoodById(data)
    })
}, [key])
Although you've added some images above, the most important one is missing, namely what the browser's developer tools say the problem is. You should see a message in the Console tab, as well as in the Network tab for that particular request, if it is indeed being made. Until anyone sees this, it will be very difficult to help fix your problem.
If you're not already, I suggest scaffolding any React app with create-react-app (CRA); this gives you a working app to start from. If you're using CRA, you can avoid CORS-related issues in development by adding "proxy": "http://localhost:5000" to your package.json file (see here for more on this method), but remember this only works for local development. You can also start Chrome with web security disabled by running it with the --disable-web-security flag, e.g. chromium --disable-web-security, but that isn't a great idea really; it's more a quick way to determine whether you are having CORS problems, as Chrome masks some problems as CORS-related when in fact they aren't.
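For this question, that would mean a proxy entry pointing at port 5000, for example:
...
},
"proxy": "http://localhost:5000"
}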
I'd also suggest changing your fetch code to use await, so instead you'd have:
// note: await requires an enclosing async function
const response = await fetch(`http://localhost:5000/foods/${key}`);
if (!response.ok) {
  console.error(`Error message: ${response.statusText} ${response.status}`);
}
const result = await response.json(); // json() also returns a Promise, so await it
console.log(result);
This isn't necessary, but I've always found it way easier to read than the then/catch/finally method.
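If you want the await version inside the component, note that an effect callback itself can't be async; here's a sketch wrapping it, building on your original useEffect:
useEffect(() => {
  // React effect callbacks must not return a Promise, so wrap the await logic.
  const loadFood = async () => {
    const response = await fetch(`http://localhost:5000/foods/${key}`);
    if (!response.ok) {
      console.error(`Error message: ${response.statusText} ${response.status}`);
      return;
    }
    setFoodById(await response.json());
  };
  loadFood();
}, [key]);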
Reason for error
You need to stringify an object with the JSON.stringify() method before sending it to the client. When we exchange data with a web server, the data must be a string.
Solution:
The proper way to send a response to the client would be to wrap the entire API in a try-catch block and explicitly specify the HTTP status code along with the stringified data in every response.
Note: although the 500 status code is used for error handling here, you should choose one appropriate to your use case.
app.get('/foods/:key', (req, res) => {
  try {
    /*
      rest of the code
    */
    foodsCollection.find({ key: req.params.key }).toArray((err, documents) => {
      if (err) {
        // 500 stands for internal server error
        return res.status(500).send(JSON.stringify('Here goes a meaningful error message!'));
      }
      // 200 stands for success
      res.status(200).send(JSON.stringify(documents[0]));
    });
    /*
      rest of the code
    */
  } catch (error) {
    // 500 stands for internal server error
    res.status(500).send(JSON.stringify('Here goes another meaningful error message!'));
  }
})
The problem is that you haven't set the CORS headers on the response in your backend code, and you are using different ports for your backend and frontend (5000 and 3000), so the Same-Origin Policy disallows reading the remote resource, indicating that the request was blocked for violating the CORS security rules.
You have to set the CORS headers.
You can install the cors npm package and follow its instructions to resolve the issue, like this:
var express = require('express')
var cors = require('cors')
var app = express()
app.use(cors())
// ...
And one other issue I'm seeing is that you've put the react-router default (catch-all) route before your specific path, so move <Route path="*"> after <Route path="/foods/:key">, roughly as in the sketch below.
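A rough sketch (assuming react-router v5-style syntax and hypothetical FoodDetail/NotFound components, since the actual routing code isn't shown):
<Switch>
  {/* specific routes first... */}
  <Route path="/foods/:key" component={FoodDetail} />
  {/* ...the catch-all default route last */}
  <Route path="*" component={NotFound} />
</Switch>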
I am currently using Chakram API Testing framework to test some REST API endpoints.
The first API call gets a CSRF token, which is then used in the headers for the rest of the endpoints.
The CSRF API returns a JSON object - something like this
{
  "csrf_token": "Aajkndaknsda99/adjaj29adja"
}
This is how I'm doing it right now
describe('Hits the CSRF API to get the token', () => {
  let csrf_tok;
  let response;
  before(() => {
    return chakram.wait(response = chakram.get(API_ENDPOINT, headers));
  });
  it('gets the csrf token from the response', () => {
    return response.then(function (resp) {
      csrf_tok = response.body.csrf_token;
      console.log(csrf_tok); // works fine and I can see the value correctly
      exports.csrf = csrf_tok;
    });
  });
});
In my other file, where I need to use the CSRF token, I'm doing something like this
var token = require('../test/csrf_token');
var options = {
  headers: {
    // other headers
    'CSRF-TOKEN': token.csrf
  }
};
However, this is not working, and the rest of the API endpoint tests fail because the token is passed as undefined. When I hard-coded the value of the token, the tests started working. However, I do not want to do this every time (I plan on deploying this as part of a pipeline).
The issue seems to be that the variable cannot be accessed outside of Mocha's describe context. Is that right? If so, how can I overcome it?
You can declare the variable outside describe, at module scope, and then export it from there; see the sketch after this answer.
Another thing I have noticed, regarding the line:
csrf_tok = response.body.csrf_token;
It should be :
csrf_tok = resp.response.body.csrf_token;
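A rough sketch of that restructuring (note the export still isn't assigned until the hook has run, so downstream files must read it only after this suite executes):
// Declare at module scope so it can be exported outside describe.
let csrf_tok;
describe('Hits the CSRF API to get the token', () => {
  before(() => {
    return chakram.get(API_ENDPOINT, headers).then((resp) => {
      csrf_tok = resp.response.body.csrf_token;
      module.exports.csrf = csrf_tok; // assigned once the token arrives
    });
  });
});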
This doesn't answer your specific question, but I needed something similar: an auth token that could then be passed to other tests.
I did this with a before hook in a shared.js file
before(function getToken(done) {
  chai.request(host)
    .post(tokenURL)
    .send({ /* my params */ })
    .end(function (err, response) {
      // ... getToken expectations
      this.myToken = response.token;
      done();
    });
});
Then in your test.js file you can just use this.myToken, as long as your shared.js file is in the root test dir (see the sketch below).
See https://gist.github.com/icirellik/b9968abcecbb9e88dfb2
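A hypothetical test.js consuming that shared context might look like this (regular function callbacks rather than arrows, so Mocha's this is visible; /protected is a made-up route):
// Sketch: this.myToken was stored by the root-level before hook in shared.js.
describe('an authenticated endpoint', function () {
  it('accepts the token', function () {
    return chai.request(host)
      .get('/protected') // hypothetical protected route
      .set('Authorization', 'Bearer ' + this.myToken)
      .then(function (res) {
        expect(res).to.have.status(200);
      });
  });
});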
I want to implement authentication and authorization in the Flatiron stack (using Flatiron, Resourceful and Restful). I want to require that a user has the necessary permissions, when trying to change a resource. In the Restful Readme file, there's a note about authorization:
There are several ways to provide security and authorization for
accessing resource methods exposed with restful. The recommended
pattern for authorization is to use resourceful's ability for before
and after hooks. In these hooks, you can add additional business logic
to restrict access to the resource's methods.
It is not recommended to place authorization logic in the routing
layer, as in an ideal world the router will be a reflected interface
of the resource. In theory, the security of the router itself should
be somewhat irrelevant since the resource could have multiple
reflected interfaces that all required the same business logic.
TL;DR; For security and authorization, you should use resourceful's
before and after hooks.
So authorization can be handled by Resourceful's hooking system.
My actual problem is the authentication process at the beginning of every HTTP request.
Let's say I have a Post resource, a User, and a Session resource. The REST API is being defined using Restful. My main concern for this question is to ensure that a user has a session when creating a post. Other methods like save and update, or other resources like creating a user, should work analogously.
File app.js:
var flatiron = require('flatiron');
var restful = require('restful'); // missing from the original snippet
var app = flatiron.app;
app.resources = require('./resources.js');
app.use(flatiron.plugins.http);
app.use(restful);
app.start(8080, function () {
  console.log('http server started on port 8080');
});
File resources.js:
var resourceful = require('resourceful');
var resources = exports;
resources.User = resourceful.define('user', function () {
  this.restful = true;
  this.string('name');
  this.string('password');
});
resources.Session = resourceful.define('session', function () {
  // note: this is not restful
  this.use('memory');
  this.string('session_id');
});
resources.Post = resourceful.define('post', function () {
  this.restful = true;
  this.use('memory');
  this.string('title');
  this.string('content');
});
resources.Post.before('create', function authorization(post, callback) {
  // What should happen here?
  // How do I ensure a user has a session id?
  callback();
});
There's also a runnable version of the code (thanks #generalhenry).
So assume a user trying to create a post has already been given a session id, which is sent in a cookie header with every request he makes. How can I access that cookie in the before hook (i.e. the authorization callback)?
The example can be started with node app.js and HTTP requests can be made using curl.
Keep in mind that these guidelines are for the authorization process. If you need the session id, you can access it either way: req.sessionID or req.cookies["connect.sid"].
By checking requests this way, you can be sure every user has a valid session id:
var connect = require('connect'); // the middleware below comes from connect

app.use(flatiron.plugins.http, {
  before: [
    connect.favicon(),
    connect.cookieParser('catpsy speeds'),
    function (req, res) {
      if (req.originalUrl === undefined) {
        req.originalUrl = req.url;
      }
      res.emit('next');
    },
    connect.session({ secret: 'secter' }),
    function (req, res) {
      console.log('Authenticating...');
      console.dir(req.session);
      // any other validation logic
      if (req.url !== '/login' && typeof req.session.user == 'undefined') {
        res.redirect('/login');
      } else {
        res.emit('next');
      }
    }
  ]
});
Here is project example using this approach.
NodeJS is a fantastic tool and blazing fast.
I'm wondering if HTTPClient supports cookies and whether it can be used to simulate very basic browser behaviour!
Help would be very much appreciated! =)
EDIT:
Found this: node-httpclient (seems useful!), but it's not working!
Short answer: no. And it's not so great.
I implemented this as part of npm so that I could download tarballs from github. Here's the code that does that: https://github.com/isaacs/npm/blob/master/lib/utils/fetch.js#L96-100
var cookie = get(response.headers, "Set-Cookie")
if (cookie) {
  cookie = (cookie + "").split(";").shift()
  set(opts.headers, "Cookie", cookie)
}
The file's got a lot of npm-specific stuff (log, set, etc.) but it should show you the general idea. Basically, I'm collecting the cookies so that I can send them back on the next request when I get redirected.
I've talked with Mikeal Rogers about adding this kind of functionality to his "request" util, complete with supporting a filesystem-backed cookiejar, but it's really pretty tricky. You have to keep track of which domains to send the cookies to, and so on.
This will likely never be included in node directly, for that reason. But watch for developments in userspace.
EDIT: This is now supported by default in Request.
If you are looking to handle cookies client side, you can use https://github.com/mikeal/request
Zombie.js is another choice if you want browser-like behaviour. It "maintains state across requests: history, cookies, HTML5 local and session storage, etc.". More info on zombie's cookie API: http://zombie.labnotes.org/API
There is also PhantomJS and PhantomJS-based frameworks, like CasperJS.
A feature-complete solution for cookies
The self-made solutions proposed in the other answers here don't cover a lot of special cases, can easily break and lack a lot of standard features, such as persistence.
As mentioned by isaacs, the request module now has true cookie support. They provide examples with cookies on their Github page. The examples explain how to enable cookie support by adding a "tough-cookie" cookie jar to your request.
NOTE: A cookie jar contains and helps you manage your cookies.
To quote their Readme (as of April 2015):
Cookies are disabled by default (else, they would be used in
subsequent requests). To enable cookies, set jar to true (either in
defaults or options) and install tough-cookie.
The cookie management is provided through the tough-cookie module. It is a stable, rather feature-complete cookie management tool that implements the "HTTP State Management Mechanism" - RFC 6265. It even offers a variety of options to persist (store) cookies, using a cookie store.
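Enabling the jar takes only a couple of lines; a minimal sketch (example.com is a placeholder):
var request = require('request');

// A fresh cookie jar, backed by tough-cookie.
var jar = request.jar();

request({ url: 'http://www.example.com', jar: jar }, function (err, res, body) {
  // Cookies from Set-Cookie are now stored in the jar and will be sent
  // automatically by any later request that passes the same jar.
  console.log(jar.getCookieString('http://www.example.com'));
});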
The code below demonstrates handling cookies on the server side. Here's a demo API server that parses the cookies from an HTTP client and checks the cookie hash:
var express = require("express"),
app = express(),
hbs = require('hbs'),
mongoose = require('mongoose'),
port = parseInt(process.env.PORT, 10) || 4568;
app.configure(function () {
app.use(express.bodyParser());
app.use(express.methodOverride());
app.use(express.static(__dirname + '/public_api'));
app.use(express.errorHandler({ dumpExceptions: true, showStack: true }));
});
app.get('/api', function (req, res) {
var cookies = {};
req.headers.cookie && req.headers.cookie.split(';').forEach(function( cookie ) {
var parts = cookie.split('=');
cookies[ parts[ 0 ].trim() ] = ( parts[ 1 ] || '' ).trim();
});
if (!cookies['testcookie']) {
console.log('First request');
res.cookie('testcookie','testvaluecookie',{ maxAge: 900000, httpOnly: true });
res.end('FirstRequest');
} else {
console.log(cookies['testcookie']);
res.end(cookies['testcookie']);
}
});
app.listen(port);
On the client side, just make a normal request to the server API above. I'm using the request module, which by default transfers cookies with each request.
// inside an Express route handler; `options` targets the API above
request(options, function (err, response, body) {
  console.log(util.inspect(response.headers));
  res.render("index.html", {
    layout: false,
    user: { username: req.session.user + body }
  });
});
Just get the cookies from the Set-Cookie header in the response and send them back with future requests. It should not be hard.
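For instance, a bare-bones sketch with Node's built-in http module (assuming the demo server above on port 4568):
var http = require('http');

http.get('http://localhost:4568/api', function (res) {
  // Node exposes Set-Cookie as an array of strings.
  var setCookies = res.headers['set-cookie'] || [];
  // Keep only the name=value pairs, dropping attributes like Max-Age.
  var cookieHeader = setCookies.map(function (c) { return c.split(';')[0]; }).join('; ');
  res.resume(); // drain the first response
  http.get({
    host: 'localhost',
    port: 4568,
    path: '/api',
    headers: { Cookie: cookieHeader }
  }, function (res2) {
    res2.on('data', function (chunk) { process.stdout.write(chunk); });
  });
});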