Alternative to sending a lot of requests to server side - javascript

I have cart products on my eCommerce website, which I built with Angular, NodeJS and MongoDB.
When my client wants to increase or decrease the quantity of a product, for example from 4 to 2, it sends 2 PATCH requests to update the quantity (the first from 4 to 3 and the second from 3 to 2). I want a better approach that does it in 1 request (for example at the end of the session, when the user leaves the website, etc.).
I tried using navigator.sendBeacon, and it sometimes works and sometimes doesn't, so I can't use it; I need something that works all the time.
I don't want to show the user any message before he leaves, which I know would fix the issue and make it work with navigator.sendBeacon.
Here is what I made with navigator.sendBeacon :
Client Side:
@HostListener('window:beforeunload', ['$event'])
onBeforeUnload(): void {
const data = new FormData();
data.append('cartProducts', JSON.stringify(this.cartProducts));
navigator.sendBeacon("http://localhost:3000/api/shopping-online/get-beacon", data)
}
public cartProducts: CartItem[] = [];
cartProducts holds the array of objects from the database.
Server Side:
router.post("/get-beacon", async (request, response) => {
try {
console.log("hi");
}
catch (err) {
console.log("bye");
}
})
Sometimes I get the "hi" message in the NodeJS terminal, sometimes it arrives a few seconds late, and sometimes it doesn't arrive at all.
I would be glad for any idea with navigator.sendBeacon, or any other idea to stop this bad pattern where every click on change quantity sends a request to the server (it can even be 10 clicks in a row, which is very bad).

Have you tried listening for visibilitychange instead of using onBeforeUnload?
document.addEventListener('visibilitychange', () => {
if (document.visibilityState === 'hidden') {
const data = new FormData();
data.append('cartProducts', JSON.stringify(this.cartProducts));
navigator.sendBeacon("http://localhost:3000/api/shopping-online/get-beacon", data);
}
});
MDN docs

With Alan's great advice I got exactly what I wanted; here is the solution.
By the way, there is no mention anywhere of how to receive the sendBeacon payload in NodeJS, so I will include the whole setup here.
Client Side (Angular):
@HostListener('document:visibilitychange', ['$event'])
visibilitychange() {
if (this.cartProducts && this.isCartProductsChanged) {
const blob = new Blob([JSON.stringify(this.cartProducts)], { type: 'application/json' });
navigator.sendBeacon("http://localhost:3000/api/shopping-online/get-beacon", blob)
this.isCartProductsChanged = false;
}
}
public cartProducts: CartItem[] = [];
public isCartProductsChanged: boolean = false;
It sends the beacon only if there are products in the cart and only if the cart was changed (quantity increased or product added).
Server side (NodeJS):
In the controller:
router.post("/get-beacon", async (request, response) => {
try {
console.log(request.body);
}
catch (err) {
console.log("bye");
}
})
In app.js:
const shoppingController = require("./controllers/shopping-controller");
const express = require("express");
const server = express();
const cors = require("cors");
server.use(function (req, res, next) {
res.header("Access-Control-Allow-Credentials", "true");
res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
next();
});
server.use(express.text());
let port = process.env.PORT || 3000;
server.use("/api/shopping-online", shoppingController);
server.listen(port, () => console.log(`Listening on http://localhost:${port}`));
Since express version 4.16.0 you use the following instead of body-parser to make it work:
server.use(express.text());
Before express version 4.16.0 you have to install body-parser
npm i body-parser
And add at app.js:
const bodyParser = require("body-parser");
server.use(bodyParser.text());
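One detail the solution above leaves implicit: with server.use(express.text()) the route receives request.body as a raw string, so it still needs JSON.parse before the cart can be saved. A minimal sketch; the helper name parseCartBody is made up here, and it assumes the text parser is configured to match the beacon's content type:

```javascript
// Hypothetical helper (not in the original post): turn the raw text body
// produced by express.text() back into the cart array.
function parseCartBody(bodyText) {
  try {
    return JSON.parse(bodyText); // the beacon sent JSON.stringify(this.cartProducts)
  } catch (err) {
    return null; // empty or malformed beacon body
  }
}

// Inside the route handler, where request.body is the raw string:
// const cart = parseCartBody(request.body);
// if (cart) { /* persist the new quantities in MongoDB */ }
```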

Related

Is it possible to implement socket.io connection in express route?

I implemented a payment service which depends on one of my express routes as a callback route, so whenever a user wants to make a payment, they are redirected to the payment service link, which is on an entirely different domain from my backend/frontend. After a successful payment, the user is then redirected to my express GET route (the callback route); in this route I give users their asset and then redirect them to the frontend.
EXPECTATION
My expectation is that whenever a user makes a purchase, there is a real-time update on the frontend so others can see some details about the purchase without refreshing their browser.
WHAT I'VE TRIED
I thought socket.io would solve this, e.g. by adding a socket connection in the route to push the data to the frontend. But after a lot of research, no solution seems to work for me.
HERE IS A SIMPLE CODE OF WHAT I'VE TRIED
=============================== server.js ========================
const express = require("express")
const app = express()
const http = require("http")
const cors = require("cors")
const session = require("express-session")
const runSocket = require("./runSocket")
const { Server } = require("socket.io")
app.use(cors())
app.use(express.json())
const server = http.createServer(app)
server.listen(3004, () => {
console.log("SERVER IS RUNNING")
})
const io = new Server(server, {
cors: {
origin: "http://localhost:3000",
methods: ["GET", "POST"],
},
})
const postRoute = require("./routes/postData")(io)
app.use("/post-data", postRoute)
==================================== postData Route ======================================
module.exports = function (io) {
router.post("/", async (req, res) => {
const data = req?.body?.data.message
const room = req?.body?.data?.room
io.on("connection", (socket) => {
console.log("Socket Running...")
socket.to(room).emit("the_message", data)
})
console.log("Under socket...")
return res.status(200).json({ data: req.body.data })
})
return router
}
This log in the postData route is never printed: console.log("Socket Running...")
EXPECTATION
My expectation is, whenever a user make a purchase, I would like to make a real time update on the frontend for others to see some details about the purchase.
UPDATE: The payment gateway config looks something like this:
const { body } = await got.post("https://payment-provider-link", {
headers: { Authorization: "Bearer token for payment" },
json: {
email: "email@gmail.com",
amount: amount * 100,
initiate_type: "inline",
callback_url: `${BackendBaseUrl}/payment-callback`, // <<<============
},
})
Okay, so you don't need the io.on("connection") in your route. Remove that piece of code and simply change it to io.to(room).emit("the_message", data). Also make sure the other sockets have joined the room you're trying to emit to, otherwise they won't receive the data.
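Applied to the route module from the question, the handler shrinks to a direct emit. A sketch of the corrected handler; the factory name makePostDataHandler is made up here, and the optional chaining on .message is completed for safety:

```javascript
// Corrected handler per the answer: no io.on("connection") inside the route.
// Sockets are expected to have joined `room` elsewhere (e.g. at connect time).
function makePostDataHandler(io) {
  return async function (req, res) {
    const data = req?.body?.data?.message;
    const room = req?.body?.data?.room;
    io.to(room).emit("the_message", data); // emit straight to the room
    return res.status(200).json({ data: req.body.data });
  };
}

// In the route module this would be wired as:
// router.post("/", makePostDataHandler(io));
```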

Why downloading a file from node.js server multiple times results in empty files

I am glad to get some help.
Here is my problem:
I have built a web server with node.js that should send a csv file to the client when requested through a certain route. The csv file is created from json using the fast-csv package. The json data comes from a mongoDB and is processed with mongoose.
When I request this route once, it works fine. However, when it is requested a second time, an empty file is sent to the client. By the way, the headers reach the client correctly.
I have to restart the server to download my file again.
What did I try:
Basically, I have now lost track of everything I have tried. This behavior occurs both when using postman and when querying via the browser.
I've tried implementing promises in my handler function.
I've tried to unsubscribe res somehow (but yes, that was a stupid approach).
I've tried to write the file into the fs and to send it on a second request. ...
Maybe one of you can tell what's going wrong here at first glance:
const { format } = require("@fast-csv/format");
const csvStream = format({ delimiter: ";", headers: true });
const router = express.Router();
router.route("/:collection/csv").get(requireModel, createCsv);
const csvFromDatabase = (data, res) => {
csvStream.pipe(res);
const processData = (data) => {
data.forEach((row) => {
const { _id, __v, ...newRow } = row._doc;
csvStream.write({ ...newRow });
});
csvStream.end();
};
processData(data);
};
const createCsv = async (req, res) => {
const { model } = req;
const items = await model.find({});
res.setHeader("Content-disposition", "attachment; filename=file.csv");
res.setHeader("Content-type", "text/html; charset=UTF-8");
csvFromDatabase(items, res);
};
Thank you very much for your help. I hope I didn't bore you with too stupid questions.
You need to recreate csvStream for each new request:
const csvFromDatabase = (data, res) => {
const csvStream = format({ delimiter: ";", headers: true });
csvStream.pipe(res);
…
};

How to combine requests coming from two different applications: take data from one request and send it as a response to the other request?

I am working on a project for my class. The application helps users upgrade their PCs. It has three parts: a React front-end, a Node.js server, and a C# application (scanner) that the user downloads from the front-end and runs locally on their PC. When the scanner runs, it extracts the existing hardware specs from the user's PC and sends them as a JSON object to the server through an HTTP request. The server API takes this information and generates a list of compatible upgrade parts from our database. I am struggling to find a way to return this API-generated data to the React front-end open on the user's end.
The API is a REST API
I am new to using Node.js and dealing with HTTP requests. It would be helpful if someone could provide a solution, documentation or article that could help me implement this functionality. Also feel free to give any suggestions on how to achieve this connection between the three parts.
Here is the React front-end code that sends the request for scanner.exe:
handleDownload = () =>{
const scn_URL = "http://localhost:5000/fileApi?name=User_scanner.exe";
Axios({
method: 'GET',
url: scn_URL,
responseType: 'blob'
})
.then(response =>{
console.log(response.data);
const url = window.URL.createObjectURL(new Blob([response.data]));
const link = document.createElement('a');
link.href = url;
link.setAttribute('download', 'scanner.exe');
document.body.appendChild(link);
link.click();
})
}
Here is the server-side API code that serves the scanner.exe file:
var express = require("express");
var router = express.Router();
router.get("/", function (req, res, next) {
const fileName = req.query.name;
const filepath = "C:\\Users\\Sakib_PC\\my files\\senior_Design\\Senior_DesignS19_group_5\\back-end-testing\\serverAPI\\public\\User_scanner.exe"
console.log(`Got file name: ${fileName}`);
res.set({'Content-Type':'application/octet-stream'})
res.sendFile(filepath, function (err) {
if (err) {
next(err)
} else {
console.log('Sent:', fileName)
}
});
});
module.exports = router;
The scanner sends the information as JSON to the server; it is named "data" in the code below:
private async void SendJson_Click(object sender, EventArgs e)
{
Console.WriteLine("send button was clicked");
const string url = "http://localhost:5000/testApi";
var data = new StringContent(this.jsonData, Encoding.UTF8, "application/json");
string response;
HttpClient client = new HttpClient();
HttpResponseMessage res = await client.PostAsync(url, data);
response = res.Content.ReadAsStringAsync().Result;
Console.WriteLine(response);
if (response == "message recieved")
{
Console.WriteLine(response);
this.finishButton.Enabled = true;
}
else
{
Console.WriteLine("nothing ");
}
}
The code for the API that catches this data is given below:
var express = require("express");
var router = express.Router();
router.post("/", function (req, res, next) {
const msg = req.body;
console.log(msg);
const modMsg = "message recieved";
res.send( modMsg);
})
module.exports = router;
The data received by the server from the scanner is shown in the link below, which is what is intended.
image of the JSON data being sent by the scanner
Is there any way I can send the scanner data the API caught to the front end that made the download request in the first place? Keep in mind that the download GET request made from the website and the POST request made from the scanner are two different requests. My goal is to take the data from the scanner POST request and serve it to the client that made the download request.
thank you!

Express post request timing out in chrome

I am new to API development and am trying to create a POST request and send data to an API, but it keeps timing out in Chrome. The error I am getting is net::ERR_EMPTY_RESPONSE.
This is my JS where I am trying to send the info. It is called in another method called addToCart() where I pass in the cart as a parameter.
function sendToAPI(cart) {
var req = new XMLHttpRequest();
req.open('POST', '/add');
req.setRequestHeader('Content-Type', 'application/json');
req.send(JSON.stringify({cart : cart}));
req.addEventListener('load', () => {
console.log(req.resonseText);
})
req.addEventListener('error', () => {
console.log('There was an error');
console.log(error);
});
}
This is where I am creating the API:
const express = require("express");
const bodyParser = require("body-parser");
const api = express();
api.use(express.static(__dirname + '/public'));
api.use(bodyParser);
api.listen(3000, function() {
console.log("Server is running on port 3000");
});
api.post('/add', function(req, res) {
console.log(req.body);
res.send("It works");
});
I see a couple problems. First, this is not correct:
api.use(bodyParser);
For a JSON request body, you would do this:
api.use(bodyParser.json());
And, body-parser is built into Express so you don't need to manually load the body-parser module. You can just do this:
api.use(express.json());
Then, in your client-side code, this:
console.log(req.resonseText);
is misspelled and should be this:
console.log(req.responseText);
And, in your client-side code, you should also be checking the status code returned by the response.
FYI, the new fetch() interface in the browser is soooo much nicer to use than XMLHttpRequest.
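For reference, a fetch()-based version of the question's sendToAPI() might look roughly like this; the '/add' path and the { cart } body shape come from the question's code, while the baseUrl parameter is added here purely so the function is easy to point at a test server (it defaults to same-origin):

```javascript
// Hypothetical fetch() rewrite of sendToAPI() from the question.
async function sendToAPI(cart, baseUrl = "") {
  const res = await fetch(baseUrl + "/add", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ cart: cart }),
  });
  if (!res.ok) {
    // check the status code, as recommended above
    throw new Error("Request failed with status " + res.status);
  }
  return res.text();
}
```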

Hapi.js - adding mechanism to check every route

I am trying to implement a mechanism that will run before any route is hit. In that mechanism I want to take a value from the header and check it for authentication.
I have come up with this:
server.js:
// Create a server with a host and port
'use strict';
var Hapi = require('hapi');
var mongojs = require('mongojs');
var plugins = [
require('./routes/entities')
];
var server = new Hapi.Server();
server.connection({
port: 3000
});
//Connect to db
server.app.db = mongojs('hapi-rest-mongo', ['entities']);
server.app.checkHeader = function (request) {
var header = request.headers['x-authorization'];
if(header === "letmein"){
return true
}
return false
};
//Load plugins and start server
server.register(plugins, function (err) {
if (err) {
throw err;
}
// Start the server
server.start(function (err) {
console.log('Server running at:', server.info.uri);
});
});
and in routes.entities:
'use strict';
var Boom = require('boom');
var uuid = require('node-uuid');
var Joi = require('joi');
exports.register = function (server, options, next) {
var db = server.app.db;
server.route({
method: 'GET',
path: '/entities',
handler: function handler(request, reply) {
if(!server.app.checkHeader(request))
{
return reply(Boom.unauthorized());
};
//request.server.myFunc();
db.entities.find(function (err, docs) {
if (err) {
return reply(Boom.wrap(err, 'Internal MongoDB error'));
}
reply(docs);
});
}
});
So, in short: while starting the server I registered my function server.app.checkHeader, and in the routes I call it, passing the request object to it. The request object contains information about the headers.
While this works, I have a feeling I am not following best practices with Hapi.
How could I do it more elegantly?
There are a few options.
You can, of course, tap into the request lifecycle - note the events that occur in the pipeline prior to the route handler.
Although, I'd urge you to consider implementing an auth strategy that can be set as the default for all routes or selectively on appropriate routes.
The best way to require authentication for all or selected routes is to use hapi's integrated functionality.
You should set a default authentication strategy that is applied to each route handler. The sample below uses basic auth. You'd want to create a custom authentication strategy for hapi to check your x-authorization header.
const Hapi = require('hapi')
const BasicAuth = require('hapi-auth-basic')
const server = new Hapi.Server()
server.register(BasicAuth, function (err) {
if (err) {
console.log('error', 'failed to install plugins')
throw err
}
// TODO: add authentication strategy & set as default
server.auth.strategy('simple', 'basic', true, { validateFunc: basicValidationFn })
// or set strategy separately as default auth strategy
server.auth.strategy('simple', 'basic', { validateFunc: basicValidationFn })
server.auth.default('simple')
// TODO: add routes
server.start(function (err) {
})
})
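To flesh out the TODO, a custom scheme for the x-authorization header could look roughly like this, in the same hapi v16 callback style as the sample above. The scheme/strategy names are illustrative, and options.unauthorized is injected here so the sketch has no hard dependency on Boom; in a real app you would pass Boom.unauthorized:

```javascript
// Illustrative custom auth scheme for the x-authorization header (hapi v16 style).
function headerTokenScheme(server, options) {
  return {
    authenticate: function (request, reply) {
      if (request.headers["x-authorization"] === options.token) {
        // continue the request lifecycle with the resolved credentials
        return reply.continue({ credentials: { app: "trusted" } });
      }
      return reply(options.unauthorized("invalid x-authorization header"));
    },
  };
}

// Registration before the routes are added:
// server.auth.scheme("header-token", headerTokenScheme);
// server.auth.strategy("header", "header-token", true,
//   { token: "letmein", unauthorized: Boom.unauthorized });
```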
You can also hook into hapi's request lifecycle and extend it at given points. Extending the request lifecycle should be done using plugins:
register: function (server, options, next) {
// do some processing before 'onPreAuth'
// or pick another extension point
server.ext('onPreAuth', (request, reply) => {
// your functionality
reply.continue() // hand the request back to the lifecycle
})
next()
}
Hope that helps!
