Calling a Node.js HTTP server from JavaScript

I am trying to set up a very simple Node.js HTTP server. When I call it from the browser, like this: http://localhost:8081, it works fine, but when I call it using the JS fetch() method, I get a 404 error:
GET http://localhost/:8081?q=hi
JS:
fetch(":8081/?q=hi")
NODE JS:
const http = require('http');

const requestListener = function (req, res) {
  res.writeHead(200);
  res.end('Hello, World!');
};

const server = http.createServer(requestListener);
server.listen(8081);

Everything is fine; you just need to enable CORS. Use the code below:
const http = require('http');

const requestListener = function (req, res) {
  const headers = {
    'Access-Control-Allow-Origin': '*', /* #dev First, read about security */
    'Access-Control-Allow-Methods': 'OPTIONS, POST, GET',
    'Access-Control-Max-Age': 2592000, // 30 days
    /** add other headers as per requirement */
  };
  res.writeHead(200, headers);
  res.end(JSON.stringify({ "key": "value" }));
};

const server = http.createServer(requestListener);
server.listen(8081);
If you are running both the frontend and the backend code on the same origin, you don't have to use the complete URL, while if you are running the frontend and the backend on different servers (or different ports), you need to enable CORS and use the complete URL.
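For example, assuming the page is served from a different origin than the API, the client call against the server above might look like this (the .then() handling is just an illustration, not part of the original question):
// Use the full URL because the page's origin differs from the API's origin.
fetch('http://localhost:8081/?q=hi')
  .then(response => response.json())    // the server above returns JSON
  .then(data => console.log(data.key))  // logs "value"
  .catch(console.error);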

When you're calling your local server through JS fetch from a page served by that same server, you don't need to add the port number; you can call it like below:
fetch('/?q=hi')

The URL handed to the fetch function looks wrong; it would work if you adjust it to:
fetch('http://localhost:8081/?q=hi');
// or
fetch('/?q=hi');
It should work just fine.
Also, ensure that you enable CORS if it needs to work from another domain.

Related

Clone/relay exact request to another URL (nodejs)

I'd like to clone/relay the exact request to another URL in native NodeJs. For example, if you send a POST request to my site "example.com", it will send the exact same request you sent to another URL "example2.com" (data, headers etc). How could I achieve that?
You can use proxy middleware to duplicate the request. For example, http-proxy-middleware will allow you to proxy the request to another server, but from what I can tell, you can only modify the response, which isn't optimal if you don't want to wait on the proxy. You might just grab the main proxy library (http-proxy) itself, something like:
const httpProxy = require('http-proxy');

// Proxy instance pointed at the second server. With selfHandleResponse: true,
// http-proxy expects you to handle the response yourself via the 'proxyRes' event.
const proxy = httpProxy.createProxyServer({
  target: 'http://www.example2.com',
  selfHandleResponse: true
});

const customProxyMiddleware = (req, res, next) => {
  proxy.web(req, res);
  next();
};

// This passes all incoming requests to the proxy but does not handle
// any of them. It simply passes them along.
app.use('/', customProxyMiddleware);
This code may not work exactly as intended but it should be a good starting point for what you are attempting.
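Since the question asks for native Node.js, here is a minimal sketch without any proxy library, assuming you simply want to stream each incoming request through to example2.com unchanged and answer the original caller immediately (the port, the plain 'OK' reply, and the fire-and-forget error handling are illustrative assumptions):
const http = require('http');

// Relay an incoming request to example2.com with the same method,
// path, headers (including the original Host header) and body,
// without waiting for the relayed response.
const relay = (req) => {
  const relayed = http.request({
    hostname: 'www.example2.com',
    port: 80,
    path: req.url,
    method: req.method,
    headers: req.headers
  });
  relayed.on('error', (err) => console.error('relay failed:', err));
  req.pipe(relayed); // streams the original body and ends the relayed request
};

http.createServer((req, res) => {
  relay(req);      // duplicate the request to the other URL
  res.writeHead(200);
  res.end('OK');   // respond to the original caller right away
}).listen(8080);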

http get vue.js + node.js returns empty data

I'm doing a project with Vue + NativeScript.
The app.get handler is not triggered when I'm calling it from the Vue project.
This call:
const urlChannels = 'http://localhost:3001/sources';
axios.get(urlChannels)
  .then(response => {
    store.commit('setTasks', {
      channels: response.data,
    });
  });
returns :"data":"","status":null,"statusText":"" as if the server is off,(the call itself is valid it works with other apis)
but simple test with angularjs on the browser returns the valid needed data
this is my nodejs :
app.get('/sources', function (req, res) {
  res.set({
    'Access-Control-Allow-Origin': '*',
    'Access-Control-Allow-Methods': 'GET'
  });
  res.writeHead(200, { 'Content-Type': 'application/json' });
  let data = getNews.getSources();
  res.send(JSON.stringify(data));
  // res.json(data); also tried this, same result
});
res.end() is intended for ending the response without data (see https://expressjs.com/en/api.html).
If you want to return json, the easiest way is to use res.json():
app.get('/sources', function (req, res) {
  res.set({
    'Access-Control-Allow-Origin': '*',
    'Access-Control-Allow-Methods': 'GET'
  });
  let data = getNews.getSources();
  res.json(data);
});
I found the problem: it's a security thing with iOS; they don't allow HTTP calls, only HTTPS (I run the project on an iOS emulator):
The resource could not be loaded because the App Transport Security policy requires the use of a secure connection

Use Express server to control a nodejs application

I have a node.js application which I use to interact with a REST API provided by another server. I would like to expose a web interface (html + css + javascript) using express.js in order to control the first application.
How can I let the browser talk to the server and have it perform Node.js actions, like making HTTP requests from that machine or writing to its filesystem? I tried using XMLHttpRequest, but the HTTP requests are sent from my local PC instead of from my server.
The only solution I found is using XMLHttpRequest in the JavaScript of my web interface to invoke some middleware functions on my server, but I had some problems: when I make POST requests, I cannot read the posted data on the server. I used FormData and its append method to build the body of the POST request, then used body-parser in Express to read that body, but it turns out to be always empty. I also tried changing the 'Content-Type' header.
Any suggestions? Any better solution than mine (I think it is not efficient)?
As pointed out by Jonas, using the Node server as a proxy would be the right approach.
I'm providing sample code for both the frontend and the backend app. Hope this helps you.
Frontend App Code
<html>
<head>
  <script type="text/javascript">
    function sendData(data) {
      if (!data) {
        // let's define some dummy data for testing
        data = { somekey: "somevalue", anotherkey: "anothervalues" };
      }
      var XHR = new XMLHttpRequest();
      var FD = new FormData();
      // Push our data into our FormData object
      for (name in data) {
        FD.append(name, data[name]);
      }
      // Define what happens on successful data submission
      XHR.addEventListener("load", function(event) {
        alert("Yeah! Data sent and response loaded.");
        alert(event.target.responseText);
      });
      // Define what happens in case of error
      XHR.addEventListener("error", function(event) {
        alert("Oops! Something went wrong.");
      });
      // Set up our request
      XHR.open("POST", "http://path/to/your/nodejs/server/app");
      // Send our FormData object; HTTP headers are set automatically
      XHR.send(FD);
    }
  </script>
</head>
<body>
  <button onclick="sendData()">Send Test Request to the Server</button>
</body>
</html>
Backend App code
const http = require('http');
const express = require('express');
const bodyParser = require('body-parser');

const app = express();
app.use(bodyParser.urlencoded({ extended: false }));

app.get('/', (req, res) => res.send('Yeah! Server is UP! Post some data'));
app.post('/', (req, res) => {
  // You'll see the posted data in req.body; simply for testing purposes, return it back to the caller
  res.json(req.body || {});
});

const server = http.createServer(app);
server.listen(3000);
server.on('error', console.error);
server.on('listening', () => console.log('Listening on 3000'));
process.on('exit', (code) => console.warn('Server terminated with code=' + code));
Please note that for this backend app to run, you must have installed the following npm packages: express, body-parser.
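One caveat to add here: XHR.send(FD) submits the body as multipart/form-data, which body-parser does not parse, so req.body can still come back empty. A minimal sketch of one way around that, assuming you are willing to add the multer package (that package choice is my assumption, not part of the original answer):
const express = require('express');
const multer = require('multer');

const app = express();
const upload = multer(); // no storage config: we only expect text fields, not files

// upload.none() parses the text fields of a multipart/form-data body into req.body
app.post('/', upload.none(), (req, res) => {
  res.json(req.body || {}); // echo the parsed fields back, as in the answer above
});

app.listen(3000);
Alternatively, keep body-parser and send a URL-encoded string (or JSON together with express.json()) from the client instead of a FormData object.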

express-http-proxy still gets blocked by CORS policy

I have an express server statically serving my Polymer project. I have a REST API query that I need to make, but if I make it from the client it will be blocked by CORS. So I used express-http-proxy to try to get around that; I send my request to it, and it redirects to the server that has the REST API endpoint on it. This is the entirety of my server code that's running with node server.js:
var express = require('express');
var proxy = require('express-http-proxy');
var server = express();
server.use('/', express.static(__dirname + '/'));
server.listen(8080);
server.use('/rest/api/2/search', proxy('restserver:8877'));
console.log("Server listening on localhost:8080");
When I access restserver:8877/rest/api/2/search in a browser it returns a bunch of json as a 'default' search.
On the client side, I have iron-ajax making this request:
<iron-ajax
id="getBugs"
url="/rest/api/2/search"
params=''
on-response="handleResponse"
debounce-duration="300">
</iron-ajax>
And in the script section, I'm using this.$.getBugs.generateRequest() in the ready function to send the request. So I load this up, expecting the request to not be blocked by CORS, since... it's being proxied by the server. Instead, Chrome devtools gives me this:
XMLHttpRequest cannot load http://restserver:8877/secure/MyJiraHome.jspa. Redirect from 'http://restserver:8877/secure/MyJiraHome.jspa' to 'http://restserver:8877/secure/Dashboard.jspa' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:8080' is therefore not allowed access.
I don't understand why it's giving me those URLs, since I never reference them, and why it's blocking due to CORS, since it's going from the server, not the client, that being the whole point of the proxy.
It may be that express-http-proxy simply forwards the Origin header coming from your client, which is http://localhost:8080, causing the end server to reject it.
Try modifying it with proxyReqOptDecorator:
server.use('/rest/api/2/search', proxy('restserver:8877', {
  proxyReqOptDecorator(proxyReqOpts) {
    proxyReqOpts.headers['Origin'] = 'http://accepted.origin.com';
    return proxyReqOpts;
  }
}));
I've never used express-http-proxy and didn't test this, though, so tell me if it's not a solution. Also, I think using cors as other people suggested may simplify things a lot, but I don't know your development constraints, so I could be wrong.
The server is probably returning a 302 redirect, which is not handled correctly by the middleware you're using.
Read more about how the redirect works: https://en.wikipedia.org/wiki/HTTP_location
You can modify the Location response header to overcome the CORS issue, or you can try:
var proxy = require('http-proxy-middleware');
var restServerProxy = proxy({target: 'http://restserver:8877', autoRewrite: true});
server.use('/rest/api/2/search', restServerProxy);
The above example should handle redirects automatically.
You don't need any proxy. Since you are calling an endpoint on your own server, you can just whitelist your client side for calling your server. You can do that with the cors package.
https://www.npmjs.com/package/cors
First, define your CORS policy logic in one file (let's name it cors-policy-logic.js), and then export it so you can use it in other files.
const cors = require('cors');

const whitelist = ['http://localhost:8080', 'http://localhost:your_client_url'];

var corsOptionsDelegate = (req, callback) => {
  var corsOptions;
  if (whitelist.indexOf(req.header('Origin')) !== -1) {
    corsOptions = { origin: true };
  } else {
    corsOptions = { origin: false };
  }
  callback(null, corsOptions);
};

exports.cors = cors();
exports.corsWithOptions = cors(corsOptionsDelegate);
Now, import it and use it wherever you define an endpoint:
var express = require('express');
const cors = require('./cors-policy-logic.js');

var server = express();
server.use('/', express.static(__dirname + '/'));
server.listen(8080);

server.route('/rest/api/2/search')
  .options(cors.corsWithOptions, (req, res) => { res.sendStatus(200); })
  .get(cors.cors, (req, res, next) => {
    // Your business logic
  });

console.log("Server listening on localhost:8080");
An alternative solution would be to use http-proxy-middleware, as mentioned by @chimurai.
If you want to proxy to an HTTPS server to avoid CORS:
const proxy = require('http-proxy-middleware');

app.use('/proxy', proxy({
  pathRewrite: {
    '^/proxy/': '/'
  },
  target: 'https://server.com',
  secure: false
}));
Here, secure: false needs to be set to avoid an UNABLE_TO_VERIFY_LEAF_SIGNATURE error.
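With that pathRewrite rule the browser only ever talks to your own origin, so no CORS headers are needed on the client side; for example (the /api/items path is just an illustrative assumption):
// The request goes to your own server; the proxy forwards
// /proxy/api/items to https://server.com/api/items.
fetch('/proxy/api/items')
  .then(response => response.json())
  .then(data => console.log(data))
  .catch(console.error);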

Using custom headers when piping external image

I am trying to pipe images from an Amazon S3 server through my node server while adding a custom header to the response.
Right now, however, the server will respond with a plain "Document" that will download to my computer with no file extension declared. The "Document" still contains the desired image data, but how can I make it clear that this is a PNG that can be viewed in my browser?
Here's my current code:
app.get('/skin', function (req, res) {
  res.writeHead(200, {'Content-Type': 'image/png', 'access-control-allow-origin': '*'});
  http.get("http://s3.amazonaws.com/MinecraftSkins/clone1018.png").pipe(res);
});
You might want to use http.request in order to do proper proxying and resource loading while duplicating the headers.
Here is an example in Express that listens on port 8080 and requests from the upstream server whatever URL you request under the /skin/* route:
var http = require('http'),
    express = require('express'),
    app = express();

app.get('/skin/*', function(req, res, next) {
  var request = http.request({
    hostname: 's3.amazonaws.com',
    port: 80,
    path: '/' + req.params[0],
    method: req.method
  }, function(response) {
    if (response.statusCode == 200) {
      res.writeHead(response.statusCode, response.headers);
      response.pipe(res);
    } else {
      res.writeHead(response.statusCode);
      res.end();
    }
  });
  request.on('error', function(e) {
    console.log('something went wrong');
    console.log(e);
  });
  request.end();
});

app.listen(8080);
In order to test it out, run it on your machine, and then go to: http://localhost:8080/skin/nyc1940/qn01_GEO.png
It will load that image by proxying it from Amazon and returning its headers as well. You might customize the headers in order to prevent XML being sent from S3 (when the file does not exist).
You don't need to set any headers yourself, as they are proxied from s3.amazonaws.com, which reliably sets the right headers for you.
Nor do you need access-control-allow-origin; it is only required for AJAX requests to a resource on another domain. But feel free to modify response.headers before sending it out; it is a simple object (console.log it for tests).
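If you do want to add custom headers (as the question title asks), one sketch is to merge them into the proxied headers before writing them out; this would replace the statusCode == 200 branch of the example above (the chosen header values are just assumptions):
if (response.statusCode == 200) {
  // Copy S3's headers, then add/override the custom ones we care about.
  var headers = Object.assign({}, response.headers, {
    'Content-Type': 'image/png',
    'access-control-allow-origin': '*'
  });
  res.writeHead(response.statusCode, headers);
  response.pipe(res);
}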
