Hiding api url in react/redux application (proxy) - javascript

I am concerned about the security of my react/redux application, as my API URL is exposed to the public inside the bundled app.js file. I've been researching this, and some developers proxy it somehow: instead of using my API URL I can use api/ whenever I perform calls with libraries like axios or superagent, and it gets proxied to my API URL, so users can only see api/ on their side.
I'm trying to figure this out. I assume this is set up within the express config?

You have a valid concern.
Typically you would have your client-side code make calls to, say, /api, and in express (or whatever server you use) create a route for "/api" that proxies that request to the actual API URL.
This way you can hide any sensitive information from the client, for example authentication tokens, API keys, etc.
In Express you could do something like this:
const request = require('request');

app.use('/api', (req, res) => {
  const method = req.method.toLowerCase();
  const headers = req.headers;
  const url = 'your_actual_api_url';

  // Pipe the incoming request into a new request aimed at the real API
  const proxyRequest = req.pipe(
    request({
      url,
      headers,
      method,
    })
  );

  // Collect the proxied response and forward it back to the client
  const data = [];
  proxyRequest.on('data', (chunk) => {
    data.push(chunk);
  });
  proxyRequest.on('end', () => {
    const { response } = proxyRequest;
    const buf = Buffer.concat(data).toString();
    res.status(response.statusCode).send(buf);
  });
});
This example is a bit more elaborate than it has to be, but it will probably work for you.
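A more concise way to do the same thing is to let a dedicated middleware handle the piping. This is only a sketch, assuming the http-proxy-middleware package is installed and that https://your-actual-api.example.com stands in for your real API URL:
const { createProxyMiddleware } = require('http-proxy-middleware');

// Forward anything under /api to the real API and strip the /api prefix.
// The real URL (and any secret headers you add here) stay on the server.
app.use('/api', createProxyMiddleware({
  target: 'https://your-actual-api.example.com', // placeholder - your real API URL
  changeOrigin: true,
  pathRewrite: { '^/api': '' },
}));
With this in place the client only ever sees requests to /api, just like in the hand-rolled version above.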

Related

Clone/relay exact request to another URL (nodejs)

I'd like to clone/relay the exact request to another URL in native NodeJs. For example, if you send a POST request to my site "example.com", it will send the exact same request you sent to another URL "example2.com" (data, headers etc). How could I achieve that?
You can use proxy middleware to duplicate the request. For example, http-proxy-middleware will allow you to proxy the request to another server, but from what I can tell, you can only modify the response, which isn't optimal if you don't want to wait on the proxy. You might just grab the main proxy library (http-proxy) itself, something like:
const httpProxy = require('http-proxy');

// Create a proxy instance pointed at the second server
const proxy = httpProxy.createProxyServer({
  target: 'http://www.example2.com',
  selfHandleResponse: true
});

// Relay the incoming request to the proxy, then keep handling it locally
const customProxyMiddleware = (req, res, next) => {
  proxy.web(req, res);
  next();
};

// This passes all incoming requests to the proxy but does not handle
// any of them. It simply passes it along.
app.use('/', customProxyMiddleware);
This code may not work exactly as intended but it should be a good starting point for what you are attempting.

Route that is executed within another route in Node.js is not being executed

Good Evening,
I have a function that contains a route that is a call to the Auth0 API and contains the updated data that was sent from the client. The function runs, but the app.patch() does not seem to run and I am not sure what I am missing.
function updateUser(val) {
  app.patch(`https://${process.env.AUTH0_BASE_URL}/api/v2/users/${val.id}`, (res) => {
    console.log(val);
    res.header('Authorization: Bearer <insert token>');
    res.json(val);
  });
}

app.post('/updateuser', (req, res) => {
  const val = req.body;
  updateUser(val);
});
app.patch() does NOT send an outgoing request to another server. Instead, it registers a listener for incoming PATCH requests. It does not appear from your comments that that is what you want to do.
To send a PATCH request to another server, you need to use a library that is designed for sending http requests. There's a low-level API built into the nodejs http module (you could use http.request() to construct a PATCH request with it), but it's generally a lot easier to use a higher-level library such as any of the ones listed here.
My favorite in that list is the got() library, but many in that list are popular and used widely.
Using the got() library, you would send a PATCH request like this:
const got = require('got');

const options = {
  headers: { Authorization: `Bearer ${someToken}` },
  body: someData
};
const url = `https://${process.env.AUTH0_BASE_URL}/api/v2/users/${val.id}`;

got.patch(url, options).then(result => {
  console.log(result);
}).catch(err => {
  console.log(err);
});
Note: a PATCH request needs body data (the same way that a POST needs body data).
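If the data you are sending is a plain JavaScript object, a minimal sketch (assuming got v11 and the same hypothetical val and someToken as above) would use the json option so got serializes the body and sets the Content-Type header for you:
const got = require('got');

const url = `https://${process.env.AUTH0_BASE_URL}/api/v2/users/${val.id}`;

// `json` serializes the object body; `responseType: 'json'` parses the reply
got.patch(url, {
  headers: { Authorization: `Bearer ${someToken}` },
  json: val,
  responseType: 'json'
}).then(result => {
  console.log(result.body);
}).catch(err => {
  console.log(err);
});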

https api request is not working with https.get(url)

I need to pull the weather API data from
https://api.weatherapi.com/v1/current.json?key=f5f45b956fc64a2482370828211902&q=London
It gives a response when pasted in the web browser as well as in Postman. But once it is incorporated in JavaScript with https.get, it fails: the browser keeps loading and the terminal hangs, printing unwanted information.
app.js:
//jshint esversion:6
const express = require("express");
const https = require("https");
const app = express();

app.get("/", function(req, res) {
  const url = "https://api.weatherapi.com/v1/current.json?key=f5f45b956fc64a2482370828211902&q=London";
  https.get(url, function(response) {
    //console.log(response);
    response.on("data", function(data) {
      const weatherData = JSON.parse(data);
      const temp = weatherData.current.temp_c;
      const weatherDescription = weatherData.current.condition.text;
    });
  });
  res.send("Server is up and running");
});

app.listen(3000, function() {
  console.log("Started on port 3000");
});
I tried the specific request that you posted using https.get() and it worked for me. So it will be difficult to figure out what the exact problem is without more information, for example about how exactly it is failing for you. What messages are you seeing on the console? How are you accessing the result of the request, i.e. what makes you think that the request didn't work?
But apart from that, that is not how you usually make requests in Node. The "data" event may be emitted multiple times if the response arrives in multiple chunks, so the way you do it will only work if you are lucky and the response arrives in a single chunk. The proper way to do this by hand would be to listen to all "data" events, concatenate the result, and then when the "end" event is emitted, parse the concatenated data. However, it is rather uncommon to do this by hand.
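A minimal sketch of that by-hand approach, reusing the url from your handler, would look something like this:
https.get(url, function(response) {
  const chunks = [];
  // "data" may fire several times, once per chunk
  response.on("data", (chunk) => chunks.push(chunk));
  // only parse once the whole body has arrived
  response.on("end", () => {
    const weatherData = JSON.parse(Buffer.concat(chunks).toString());
    console.log(weatherData.current.condition.text);
  });
});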
A more common way to do this would be to use a library such as node-fetch to make the request for you. For example like this: fetch(url).then((res) => res.json()).then((weatherData) => { const weatherDescription = weatherData.current.condition.text; }).

Yelp API authentication with Express

I'm attempting to issue a GET request to Yelp's API in order to perform a simple search using both Express and Nodejs, however I'm having trouble understanding how to set the request header with the provided API key. Using request I attempted to pass basic authentication following the documentation yet I'm receiving errors. Using setHeader I also received errors.
Using Postman I am able to enter the API keys and receive responses with no issue. I understand packages are out there for this but I think it might be good to learn without the use of additional dependencies if possible.
var express = require("express");
var app = express();
var request = require("request");
request.get('https://api.yelp.com/v3/businesses/search', function (error, response, body) {
  'auth': {
    'bearer': 'api_key_here'
  }
});
app.listen(3000)
Edit: I was able to fix the callback function (it was passing the header) and am now able to run, however I am now getting {"error": {"code": "TOKEN_MISSING", "description": "An access token must be supplied in order to use this endpoint."}}. Still confused about what I'm doing wrong here, as the API key is working properly in Postman.
var express = require("express");
var app = express();
var request = require("request");
request.get('https://api.yelp.com/v3/businesses/search', {
    'Authorization': {
      Bearer: 'api key'
    }
  },
  function(error, response, body) {
    console.log(body);
  });
I tried using express.js to implement the Yelp API, but I found using yelp-fusion much easier.
I implemented it like below:
require('dotenv').config();
const yelp = require('yelp-fusion');

const apiKey = process.env.YELP_API_KEY;
const searchRequest = {
  term: 'restaurants',
  location: 'Los Angeles',
};

const client = yelp.client(apiKey);

client.search(searchRequest)
  .then((response) => {
    console.log(response.jsonBody);
  })
  .catch((error) => {
    console.log(error);
  });
The first line is there to load the .env file.
I saved my Yelp API key in the .env file like this: YELP_API_KEY=6fNc0sj5Oyt_jsU2gdeDrlo_1NLm5c-df3f.
Then I assigned it to const apiKey and used it as the argument of yelp.client().
yelp-fusion's docs give you more details on the parameters for business search.
Lastly, the client.search() method returns response.jsonBody containing all the data that you need.

how to use a nodeJS output in a javascript file

I would like to use the output of my nodeJS code. This is my code:
var fs = require('fs'); //File System
var rutaImagen = 'C:/Users/smontesc/Desktop/imagenes1/'; //Location of images

fs.readdir(rutaImagen, function(err, files) {
  if (err) { throw err; }
  var imageFile = getNewestFile(files, rutaImagen);
  //process imageFile here or pass it to a function...
  console.log(imageFile);
});

function getNewestFile(files, path) {
  var out = [];
  files.forEach(function(file) {
    var stats = fs.statSync(path + "/" + file);
    if (stats.isFile()) {
      out.push({"file": file, "mtime": stats.mtime.getTime()});
    }
  });
  out.sort(function(a, b) {
    return b.mtime - a.mtime;
  });
  return (out.length > 0) ? out[0].file : "";
}
The result is console.log(imageFile). I want to use that result in my javascript project, like:
<script>
document.write(imageFile)
</script>
All this is to get the newest file created in a directory, because I can't do it directly in JS.
Thanks a lot
First, there are several fundamental things about how the client/server relationship of the browser and a web server work that we need to establish. That will then offer a framework for discussing solving your problem.
Images are displayed in a browser, not with document.write(), but by inserting an image tag in your document that points to the URL of a specific image.
For a web page to get some result from the server, it has to either have that result embedded in the web page when the web page was originally fetched from the server, or the Javascript in the web page has to request information from the server with an Ajax request. An Ajax request is an http request where the Javascript in your web page forms an http request that is sent to your server; your server receives that request and sends back a response, which the Javascript in your web page receives and can then do something with.
To implement something where your web page requests some data from your back-end, you will have to have a web server in your back-end that can respond to Ajax requests sent from the web page. You cannot just run a script on your server and magically modify a web page displayed in a browser. Without the type of structure described in the previous points, your web page has no connection at all to your server. The web page can't directly reach your server's file system, and the server can't directly touch the displayed web page.
There are a number of possible schemes for implementing this type of connection. What I would think would work best would be to define an image URL that, when requested by any browser, it returns an image for the newest image in your particular directory on your server. Then, you would just embed that particular URL in your web page and anytime that image was refreshed or displayed, your server would send it the newest version of that image. Your server probably also needs to make sure that the browser does not cache that URL by setting appropriate cache headers so that it won't mistakenly just display the previously cached version of that image.
The web page could look like this:
<img src='http://mycustomdomain.com/dimages/newest'>
Then, you'd set up a web server at mycustomdomain.com that is publicly accessible (from the open internet - you choose your own domain obviously) that has access to the desired images and you'd create a route on that web server that answers to the /dimages/newest request.
Using Express as your web server framework, this could look like this:
const app = require('express')();
const fs = require('fs');
const util = require('util');
const readdir = util.promisify(fs.readdir);
const stat = util.promisify(fs.stat);

// middleware to use in some routes that you don't want any caching on
function nocache(req, res, next) {
  res.header('Cache-Control', 'private, no-cache, no-store, must-revalidate, proxy-revalidate');
  res.header('Expires', '-1');
  res.header('Pragma', 'no-cache');
  next();
}

const rutaImagen = 'C:/Users/smontesc/Desktop/imagenes1/'; //Location of images

// function to find newest image
// returns promise that resolves with the full path of the image
// or rejects with an error
async function getNewestImage(root) {
  let files = await readdir(root);
  let results = [];
  for (const f of files) {
    const fullPath = root + "/" + f;
    const stats = await stat(fullPath);
    if (stats.isFile()) {
      results.push({file: fullPath, mtime: stats.mtime.getTime()});
    }
  }
  results.sort(function(a, b) {
    return b.mtime - a.mtime;
  });
  return (results.length > 0) ? results[0].file : "";
}

// route for fetching that image (the path comes first, then the nocache middleware)
app.get('/dimages/newest', nocache, function(req, res) {
  getNewestImage(rutaImagen).then(img => {
    res.sendFile(img, {cacheControl: false});
  }).catch(err => {
    console.log('getNewestImage() error', err);
    res.sendStatus(500);
  });
});

// start your web server
app.listen(80);
To be able to use that result in your Javascript project, you definitely have to create an API with a particular route that responds with the imageFile. Then, in your Javascript project, you can use XMLHttpRequest (XHR) objects or the Fetch API to talk to the server and get the result.
The core idea is that you definitely need both server-side and client-side programming to implement that functionality.
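For instance, here is a minimal sketch, assuming a hypothetical /api/newest-image route added to the Express server above; the route returns the file name as JSON and the page fetches it:
// Server: reuse getNewestImage() and respond with the newest file name as JSON
app.get('/api/newest-image', nocache, function(req, res) {
  getNewestImage(rutaImagen).then(img => {
    res.json({ imageFile: img });
  }).catch(err => {
    res.sendStatus(500);
  });
});
And in the web page:
<script>
  // Client: ask the server for the newest file name, then use it
  fetch('/api/newest-image')
    .then(res => res.json())
    .then(data => {
      console.log(data.imageFile);
    });
</script>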
