Express proxy server: routing outbound traffic through Fiddler - javascript

I have a Node.js Express proxy server which I would like to debug, so I want to capture its traffic through Fiddler:
const express = require('express');
const proxy = require('http-proxy-middleware'); // pre-1.0 API: the module exports proxy() directly
const expressApp = express();
const proxyTarget = 'https://my-proxy.azurewebsites.net/proxy';

// attempting to configure the Fiddler proxy
process.env.https_proxy = 'http://127.0.0.1:8888';
process.env.http_proxy = 'http://127.0.0.1:8888';
process.env.NODE_TLS_REJECT_UNAUTHORIZED = '0';

expressApp.use('/api', proxy({
  target: proxyTarget,
  secure: false, // don't verify https certificates
  logLevel: 'debug'
}));
I've tried to capture traffic using Fiddler, but Fiddler doesn't capture the request to Azure, only the wrapped proxy request on localhost.
How can I configure my proxy settings so that the request passes through Fiddler first?

Unfortunately, Node.js doesn't listen to the http_proxy/https_proxy environment variables (see https://github.com/nodejs/node/issues/8381 for the full debate).
That means this is difficult with Fiddler - you need to change every place where HTTP requests are made to use the proxy settings, or to replace Node's default agent to do so (and then ensure no requests set a custom agent). There are some node modules that can help here, like https://www.npmjs.com/package/global-agent, but you'll also need to handle HTTPS trust separately (or you'll just see unreadable CONNECT requests, as mentioned by the commenter above).
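For example, here is a minimal sketch of that approach using global-agent to route every outgoing request through Fiddler (the environment variable names are global-agent's; the certificate path is a placeholder you would point at Fiddler's exported root certificate):
// Run with:
//   GLOBAL_AGENT_HTTP_PROXY=http://127.0.0.1:8888 node server.js
// and, rather than disabling TLS checks, trust Fiddler's root certificate:
//   NODE_EXTRA_CA_CERTS=./FiddlerRoot.pem
require('global-agent/bootstrap'); // patches the global http/https agents for outgoing requests

const https = require('https');

// Any request made after the bootstrap call now goes via the proxy configured above
https.get('https://my-proxy.azurewebsites.net/proxy', (res) => {
  console.log('status:', res.statusCode);
  res.resume();
});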
I have hit this same issue myself though, so I've been building an open-source alternative to Fiddler to fix it! HTTP Toolkit can do what you're looking for: you can open a terminal from there (or enable it in an existing terminal) and any node processes started there will automatically have all HTTP & HTTPS requests intercepted, so you can examine and/or rewrite them, just like Fiddler. It handles all the cert trust stuff too.
Under the hood, this just wraps node on the command line to do the same thing you'd do manually, reconfiguring settings & defaults to ensure everything plays nicely. If you do want to do that manually, the full source is here: https://github.com/httptoolkit/httptoolkit-server/blob/master/overrides/js/prepend-node.js

Related

CORS policy error while calling remote URL in Angular

I'm trying to call a remote API URL but I'm getting an Access-Control-Allow-Origin error. I've tried many things, like the following, but nothing works.
proxy.conf.js
const PROXY_CONFIG = [
  {
    context: [
      "/api/planets"
    ],
    target: "https://swapi.co",
    secure: false,
    changeOrigin: true,
    logLevel: "debug",
    bypass: function (req, res, proxyOptions) {
      req.headers["Access-Control-Allow-Origin"] = "*";
      req.headers["X-Forwarded-Host"] = "localhost:8090";
      req.headers["X-Forwarded-For"] = "localhost";
      req.headers["X-Forwarded-Port"] = "8090";
      req.headers["X-Forwarded-Proto"] = "http";
    }
  }
];

module.exports = PROXY_CONFIG;
Running with ng serve --port 8090 --proxy-config proxy.conf.js
I can't make any changes on the server side because I am using a third-party API.
Try adding a plugin like https://chrome.google.com/webstore/detail/allow-cors-access-control/lhobafahddgcelffkeicbaginigeejlf?hl=en to your Chrome browser.
Since you can't change the server-side config, this plugin can do the trick. The browser blocks cross-origin responses like this by default.
Since you cannot make any changes on the remote server, you can work around this with a reverse proxy server. I faced the same issue while calling LinkedIn services.
a. There is https://cors-anywhere.herokuapp.com/ : you can prepend this to your URL and it will temporarily resolve CORS issues.
Since in an enterprise scenario you cannot put herokuapp.com in front of your application-specific names, it is better to set up your own proxy server, as below.
b. The second approach is to use a reverse proxy and set up your own server (local or remote) to do the proxying; see the sketch after this answer.
https://stackoverflow.com/q/29670703/7562674
You can also implement a reverse proxy using Spring and Jersey:
https://github.com/hhimanshusharma70/cors-escape
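For a Node/Express variant of option (b), here is a minimal sketch of a reverse proxy that answers CORS itself and forwards API calls to swapi.co (it assumes the cors and http-proxy-middleware packages; the port matches the question):
const express = require('express');
const cors = require('cors');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();

// Answer CORS preflights and add Access-Control-Allow-Origin to responses
app.use(cors());

// Forward /api/* to the third-party API, rewriting the Host header
app.use('/api', createProxyMiddleware({
  target: 'https://swapi.co',
  changeOrigin: true
}));

app.listen(8090, () => console.log('CORS proxy listening on http://localhost:8090'));
The Angular app would then call http://localhost:8090/api/planets instead of the remote URL directly.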
As the error says, a header named Access-Control-Allow-Origin must be present in a CORS response.
Since swapi.co responses include the Access-Control-Allow-Origin header for correct CORS requests (can be tested with a simple fetch('https://swapi.co/api/planets/') from your browser's dev console), the issue may be because of your proxy settings.
Try modifying the response in the proxy bypass method:
bypass: function (req, res, proxyOptions) {
  // ...
  // Note that this is "res", not "req": set the header on the response.
  res.setHeader("Access-Control-Allow-Origin", "*");
  // ...
}
You can't! End of story. If the owner of the API has decided not to allow cross-origin requests, then you can't. If you are not going to host your app on the https://swapi.co domain, you will not be able to use the API directly from Angular, and you will need some kind of pass-through API call on the server from .NET, Node, PHP, Java, etc.

Enabling http compression with express

I'm messing around with the Darksky API and under one of the query parameters it states:
extend=hourly optional
When present, return hour-by-hour data for the next 168 hours, instead
of the next 48. When using this option, we strongly recommend enabling
HTTP compression.
I'm using Express as a node proxy which hits the Darksky api (i.e. localhost:3000/api/forecast/LATITUDE, LONGITUDE).
What does "HTTP compression" mean and how would I go about enabling it?
Here, compression means gzip compression on the Express server. You can use the compression middleware to add easy gzip compression to your server.
Read more about how to install that middleware here:
https://github.com/expressjs/compression
An example implementation would look like this:
var compression = require('compression')
var express = require('express')
var app = express()
// compress all responses
app.use(compression())
// add all routes
To quote from https://darksky.net/dev/docs
The Forecast Data API supports HTTP compression. We heartily recommend using it, as it will make responses much smaller over the wire. To enable it, simply add an Accept-Encoding: gzip header to your request. (Most HTTP client libraries wrap this functionality for you, please consult your library’s documentation for details.)
I'm not familiar with the Dark Sky API but I would imagine it returns a large amount of highly redundant data, which is ideal for compression. HTTP requests have a compression mechanism built in via Accept-Encoding, as mentioned above.
In your case that data will be travelling across the wire twice, once from Dark Sky to your server and then again from your server to your end user. You could compress just one of these two transmissions or both, it's up to you but it's likely you'd want both unless the end user is on the same local network as your server.
There are various SO questions about making compressed requests, such as:
node.js - easy http requests with gzip/deflate compression
The key decision for you is whether you want to decompress and recompress the data in your proxy or just stream it through. If you don't need a decompressed copy of the data in the server then it would be more efficient to skip the extra steps. You'd need to be careful to ensure all the headers are set correctly but if you just pass on the relevant headers that you receive (in both directions) it should be relatively simple to pipe through the response from Dark Sky.
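For example, here is a minimal sketch of the pass-through approach: request gzip from Dark Sky and pipe the still-compressed body straight to the client (the route shape, the api.darksky.net URL and the DARKSKY_KEY environment variable are assumptions for illustration):
const express = require('express');
const https = require('https');

const app = express();

app.get('/api/forecast/:latlng', (req, res) => {
  const url = `https://api.darksky.net/forecast/${process.env.DARKSKY_KEY}/${req.params.latlng}?extend=hourly`;

  // Ask Dark Sky for a gzipped response
  https.get(url, { headers: { 'Accept-Encoding': 'gzip' } }, (upstream) => {
    // Copy the headers that matter, including Content-Encoding, so the
    // browser knows the body is still compressed and decompresses it itself.
    const headers = {};
    ['content-type', 'content-encoding'].forEach((name) => {
      if (upstream.headers[name]) headers[name] = upstream.headers[name];
    });
    res.writeHead(upstream.statusCode, headers);
    upstream.pipe(res); // stream through without decompressing
  }).on('error', (err) => {
    res.status(502).send(err.message);
  });
});

app.listen(3000);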

Is there a way to set the source port for a node js https request?

Is there a way to set the source port for a node js https request? I am not asking about the destination, rather the source, ie the port used to send the request.
The context is I am trying to send https requests from a specific port, rather than random ports, thus allowing for locking down iptables. Node is not running as root, thus the port is not 443.
Update:
It appears there is a bug in Node. The options localAddress and localPort do not work, at least with a TLS socket.
Update:
Found a similar question from last year. The answers were "don't do that", which seems dumb given that node is supposed to be a generic tool: Nodejs TCP connection client port assignment
The feature appears to be undocumented, but you can achieve this by setting BOTH the localAddress and localPort parameters for the options argument in https.request.
For more information, see the code here:
https://github.com/nodejs/node/blob/b85a50b6da5bbd7e9c8902a13dfbe1a142fd786a/lib/net.js#L916
A basic example follows:
var https = require('https');

var options = {
  hostname: 'example.com',
  port: 8443,
  localAddress: '192.168.0.1',
  localPort: 8444
};

var req = https.request(options, function (res) {
  console.log(res);
});
req.end();
Unfortunately, it looks like Node does not support binding a client port. Apparently this isn't a feature that is used much, but it is possible at the OS level. This link explains port binding fairly well: https://blog.cloudflare.com/cloudflare-now-supports-websockets/. Not sure how to get the Node.js people to consider this change.
You can use the localPort option in the http.request options to configure a source port. It seems to be available since the v14.x LTS. I tested it with Node v18.12.0 and it works. :)
https://nodejs.org/docs/latest-v14.x/api/http.html#http_http_request_url_options_callback
options.localPort <number> Local port to connect from.
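For example, a minimal sketch based on those documented options (the target URL, address and port numbers are placeholders):
const https = require('https');

// Bind the outgoing connection to a fixed source address and port,
// so iptables can be locked down to that port.
const req = https.request('https://example.com/', {
  localAddress: '0.0.0.0',
  localPort: 8444
}, (res) => {
  console.log('status:', res.statusCode);
  res.resume(); // drain the response body
});

req.on('error', console.error);
req.end();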

How do I serve socket.io over HTTPS with Node?

The problem:
I just successfully got my Node.js server all working properly and setup with SSL, great. But then I see this:
[blocked] The page at 'https://www.mywebsite.com/' was loaded over
HTTPS, but ran insecure content from
'http://54.xxx.xxx.77:4546/socket.io/socket.io.js': this content
should also be loaded over HTTPS.
I tried just changing the socket.io URL to https, but of course that doesn't work because socket.io insists on serving its own generated file, a process that I don't think I control. I can change the port that socket.io listens on, that's it.
The question:
So how do I securely serve socket.io (1.0)?
The codez:
var port = 4546;
var io = require('socket.io')(port);
As a side note, I think socket.io (its back and forth communication) should run over HTTPS properly without any extra work; I read that somewhere. Perhaps someone can confirm. It is important that the web socket's communications be securely transferred.
This is one of those occasions where the questions aren't quite duplicates, but the selected answer to a question answers this one as well.
It's simple: just pass your HTTPS server object to socket.io where you would otherwise pass the port.
// ... require stuff
var app = express();
// ... set up your express middleware, etc
var server = https.createServer(sslOptions, app);
// attach your socket.io server to the express server
var io = require("socket.io").listen(server);
server.listen(port);
code by aembke
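On the client side (a sketch, using the asker's domain purely for illustration), load socket.io.js from the same HTTPS origin and connect to it, so the transport is upgraded to wss:// rather than ws://:
// Page served from https://www.mywebsite.com, with /socket.io/socket.io.js
// loaded from that same HTTPS origin instead of the raw IP and port.
var socket = io('https://www.mywebsite.com');
socket.on('connect', function () {
  console.log('connected over a secure transport');
});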

Socket.io and WebSocket listening on the same server

I need to share the same HTTP server between socket.io and WebSocket (from the 'ws' package) handlers.
Unfortunately, even though they listen on different prefixes (the first on /socket.io and the second on /websocket URLs), for some reason when they run on the same server the WebSocket side does not work properly.
I did some debugging, and it seems that the requests are handled by both libraries, but in the end only socket.io works properly.
Any idea how to solve that?
The way sockets work in node.js is quite a bit different from the way normal requests work. There is no routing, so rather than listening to a url, you have to listen to all sockets. The default behavior of socket.io is to close any socket connections that it doesn't recognize. To fix this, you'll need to add the flag 'destroy upgrade': false to the options (server is an express server):
require('socket.io').listen(server, {'destroy upgrade': false, ...})
You'll also need to check the url when a client connects (in the code handling /websocket) and ignore it if it looks like it belongs to socket.io. You can find the url from the client object (passed in to the on connection handler) as client.upgradeReq.url.
OK, the solution is simple (unfortunately after half a day of debugging, and now it's simple :)).
There is an option, 'destroy upgrade', which destroys upgrade requests coming from non-socket.io clients. Since WebSocket (module 'ws') uses the same upgrade requests, some of them may be meant for 'ws' rather than for 'socket.io', so this option should be disabled.
io = require('socket.io').listen(server, { log: false, 'resource': '/socket.io' });
io.disable('destroy upgrade');
Update for 2016:
io.disable('destroy upgrade');
no longer seems to be available.
But I succeeded by giving the WebSocket module ws its own path (using Express):
var wss = new WebSocketServer({ server: server, path: '/ws' }); // do not interfere with socket.io
Of course the client uses the same path: ws://theserver.com/ws
I did not have to alter the socket.io side at all.
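Putting that together, here is a minimal sketch of both libraries sharing one HTTP server, with ws pinned to /ws (the port and event handlers are illustrative):
var http = require('http');
var express = require('express');
var WebSocketServer = require('ws').Server;

var app = express();
var server = http.createServer(app);

// socket.io claims its default /socket.io path on this server
var io = require('socket.io')(server);
io.on('connection', function (socket) {
  console.log('socket.io client connected');
});

// plain WebSocket clients connect to ws://host/ws and never touch socket.io
var wss = new WebSocketServer({ server: server, path: '/ws' });
wss.on('connection', function (ws) {
  console.log('ws client connected');
});

server.listen(3000);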
