WebSockets vs Server-Sent Events - NodeJS backend and VueJS client - javascript

I have a front-end client written in VueJS and a backend API written in Node.js. The Node API communicates with other third-party APIs and in turn sends responses back to the client. For some of these APIs, it takes a long time, more than a minute, to complete the request and send the response back to the client. Because the Node app is proxied over Akamai, Akamai returns a 503 error after a certain time, and that error is shown to the end user. But the actual process on the third-party side is still in progress, and it will send a success response back to the Node app once it completes. Since the client has already received the error, it never receives the success message.
I have this issue with the account-creation flow. The client form data is posted to the Node.js backend, which in turn posts it to another third-party API. While waiting for that call to finish, the Akamai proxy returns a 503 HTTP status with a zero-size object response. The client receives this error and shows a custom error message. But the account is still being created in the backend, and eventually a success response reaches the Node app; it just never reaches the client, and so never reaches the user. There is a chance the user will then create another account.
The front end call is as follows:
return new Promise((resolve, reject) => {
  const config = {}
  config.method = 'POST'
  config.url = APIaddress
  config.data = data
  config.params = params
  config.withCredentials = true
  config.httpsAgent = new https.Agent({ keepAlive: true })
  console.log('Config: ', config)
  axios(config).then(response => {
    console.log('RESPONSE: ', response)
    resolve(response)
  }).catch(error => {
    console.log('ERROR: ', error.response)
    reject(error.response.data)
  })
})
Here I added the KeepAlive option, but it has no effect and I still get the error.
Now, in the backend also, I use agentkeepalive, and the call is as follows:
const agentkeepalive = require('agentkeepalive')
const HttpsAgent = agentkeepalive.HttpsAgent
const keepaliveAgent = new HttpsAgent({
  timeout: 120000,
  freeSocketTimeout: 60000
});
const options = {
  method: 'POST',
  url: config.endpoint.url,
  headers: {
    'Content-Type': 'application/json;charset=utf-8',
    'Accept': 'application/json',
    authorization: 'Bearer ' + token
  },
  data: data,
  json: true,
  httpsAgent: keepaliveAgent
};
axios(options)
  .then(response => response.data)
  .then(response => {
    resolve(response)
  })
  .catch(function (error) {
    logger.error({
      message: `Error while creating account: ${error}`
    });
    reject(error);
  });
Now, to account for these delays, I am planning to use Server-Sent Events or WebSockets. I am very new to both and not sure which one to use. I think that by using one of them I can send a response back to the client once the account is created. Currently the client waits for the account to be created, but I want to change it so that the client sends the initial request and the server then notifies the client once the account is created. This would avoid the unnecessary timeouts and related issues.
I am not sure which solution should be used here. It would be helpful if someone could shed some light. Thanks for reading.

I switched from SSE and a REST API to WebSockets on my Node and React app. My setup is as follows:
Create WebSocket server in Node
Create connection from client to server
Then I use the "publish-subscribe" pattern.
When the client needs something from the server, it sends a WebSocket message with a specific field (in my case it is called "route"). The server filters the message and dispatches it to the proper route handler (not the Express ones; these are routes in my server that handle the WebSocket requests).
Once it is processed, the server sends a WebSocket message back to the client, which filters it and processes it.
This gives me an always-open connection to the server, which is very fast, and, which is what you are looking for, it lets the client wait for a message from the server without blocking the connection or risking a timeout.
Very simple code example:
server:
ws.on('message', m => {
  if (m.route === DO_SOMETHING) {
    // ...do something...
    ws.send(JSON.stringify({ route: DO_SOMETHING_RESPONSE, message: 'Something was done' }))
  }
})
client:
// I want something to be done by the server:
ws.send(JSON.stringify({ route: DO_SOMETHING, message: 'something should be done' }))
// this is sent, and you can wait a year for a response, which is caught with:
ws.on('message', m => {
  if (m.route === DO_SOMETHING_RESPONSE) {
    console.log('Yupeee, something was done!')
  }
})
This way you can handle an unlimited number of requests. You can make them independent, as in this example, or you can force the client to wait for the answer from the server.
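For reference, here is a minimal, hedged sketch of the two setup steps the example above assumes (creating the WebSocket server and connecting to it). It assumes the ws npm package on the server and the browser's built-in WebSocket on the client; the port and the DO_SOMETHING route constants are placeholders, and the slow third-party call from the question would go where the comment indicates.
server:
const WebSocket = require('ws') // npm install ws
const DO_SOMETHING = 'DO_SOMETHING'
const DO_SOMETHING_RESPONSE = 'DO_SOMETHING_RESPONSE'

const wss = new WebSocket.Server({ port: 8081 })
wss.on('connection', ws => {
  ws.on('message', raw => {
    const m = JSON.parse(raw) // messages arrive as strings, so parse the JSON
    if (m.route === DO_SOMETHING) {
      // ...kick off the slow third-party call here; when it finally resolves:
      ws.send(JSON.stringify({ route: DO_SOMETHING_RESPONSE, message: 'Something was done' }))
    }
  })
})
client:
const ws = new WebSocket('ws://localhost:8081')
ws.onopen = () => {
  ws.send(JSON.stringify({ route: 'DO_SOMETHING', message: 'something should be done' }))
}
ws.onmessage = event => {
  const m = JSON.parse(event.data) // the payload is a JSON string as well
  if (m.route === 'DO_SOMETHING_RESPONSE') {
    console.log('Yupeee, something was done!')
  }
}
Because the connection stays open, the server can push the response minutes later without the client having an HTTP request pending, which is what avoids the proxy timeout described in the question.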

Related

Twilio function writing to AWS RDS Postgres error

I wrote the below Twilio function to store incoming messages in a Postgres database in RDS on AWS. I'm getting a 504 timeout error. Details of how I am running it:
I'm running this by deploying the function to Twilio and adding it as a widget to a Twilio Studio flow.
My Postgres database is in RDS. It is publicly accessible and I'm able to access it from my local machine (I added a security rule for my IP; I'm not sure if I need to add a rule for Twilio, as I couldn't find a specific IP they would be on).
This is my personal computer, so it shouldn't have any extra firewalls. Any thoughts would be appreciated!
I'm not sure if:
There is something wrong with the script below, or
The AWS/RDS database is refusing the connection; I don't know how to properly update my security rules to allow traffic from Twilio.
const { Client } = require("pg");
exports.handler = async function (context, event, callback) {
  console.log("HELLO");
  // Not sure what the below does
  // context.callbackWaitsForEmptyEventLoop = false;
  // Add database config
  const client = new Client({
    host: context.host,
    port: context.port,
    user: context.user,
    password: context.password,
    database: context.database
  });
  // Try actually connecting to the database
  try {
    await client.connect();
    console.log("connected successfully");
    const user = ["12222222222", "2022-07-10T00:22:10Z", '{"question1": {"question": "How are you?", "answer": "great"}}'];
    // Try to actually store the data
    await client.query("INSERT INTO text_responses (phone_number, datetime, response) VALUES ($1, $2, $3::jsonb)", user);
    await client.end();
    console.log("Executed!");
    callback(null, "12222222222");
  } catch (e) {
    console.log(`error: ${e}`);
    callback(e);
  }
};
Twilio error
Outbound HTTP Request Failed: Request URL: https://example-2039-dev.twil.io/log-sms-db
Request Method: POST
Response Status Code: 502
Response Content Type:
Error - 81016
Outgoing HTTP request failed
The outgoing HTTP request from a Studio widget failed.
Possible Causes
The URL you are requesting is incorrect
The response is badly formed
The URL returned a 4xx or 5xx error code
Possible Solutions
Make sure the request results in a response code 2xx or 3xx

Why is request.on data firing with a delay on NodeJS?

There is a simple web server that accepts data. Sample code below.
The idea is to track in real time how much data has reached the server and immediately inform the client about it. If you send a small amount of data, everything works well, but if you send more than X data in size, the on('data') event on the server fires with a huge delay. I can see that data has been transferring for 5 seconds already, but the on('data') event has not been triggered.
The on('data') event seems to be triggered only when the data has been uploaded completely to the server, which is why it works fine with small payloads (~2-20 MB) but not with big ones (50-200 MB).
Or maybe it is due to some kind of buffering?
Do you have any suggestions why on('data') is triggered with a delay and how to fix it?
const express = require('express');
const app = express();
const port = 3000;

// PUBLIC API
// upload file
app.post('/upload', function (request, response) {
  request.on('data', chunk => {
    // message appears with delay
    console.log('upload on data', chunk.length);
    // send message to the client about chunk.length
  });
  response.send({
    message: `Got a POST request ${request.headers['content-length']}`
  });
});
app.listen(port, () => {
  console.log(`Example app listening at http://localhost:${port}`);
});
TL;DR:
The delay you are experiencing is probably the queueing caused by the browser's resource scheduling.
The Test
I did some tests with Express and found that it uses http to handle requests/responses, so I used a raw http server listener to test this scenario, which shows the same behaviour.
Backend code
This code, based on the sample from Node's anatomy-of-an-HTTP-transaction guide, creates an HTTP server and logs the time in three situations:
When a request is received
When the first data event fires
When the end event fires
const http = require('http');
var firstByte = null;
var server = http.createServer((request, response) => {
  const { headers, method, url } = request;
  let body = [];
  request.on('error', (err) => {
  }).on('data', (chunk) => {
    if (!firstByte) {
      firstByte = Date.now();
      console.log('received first byte at: ' + Date.now());
    }
  }).on('end', () => {
    console.log('end receive data at: ' + Date.now());
    // body = Buffer.concat(body).toString();
    // At this point, we have the headers, method, url and body, and can now
    // do whatever we need to in order to respond to this request.
    if (url === '/') {
      response.statusCode = 200;
      response.setHeader('Content-Type', 'text/html');
      response.write('<h1>Hello World</h1>');
    }
    firstByte = null;
    response.end();
  });
  console.log('received a request at: ' + Date.now());
});
server.listen(8083);
Frontend code (snippet from devtools)
This code fires an upload to /upload with some array data. I originally filled the array with random bytes, but then removed that and saw it had no effect on my timing log, so the upload content for now is just an array of zeros.
console.log('building data');
var view = new Uint32Array(new Array(5 * 1024 * 1024));
console.log('start sending at: ' + Date.now());
fetch("/upload", {
  body: view,
  method: "post"
}).then(async response => {
  const text = await response.text();
  console.log('got response: ' + text);
});
Running the backend code and then the frontend code, I get the following logs.
Log capture (screenshots): the backend log, the frontend log, and the time differences between them.
Results
Looking at the screenshots, I see two differences between the logs:
The first, and most important, is the difference between the frontend fetch start and the backend receiving the request: I got 1613 ms, which is "close" to the 1430 ms of Resource Scheduling in the network timing tab. I think more things happen between the frontend fetch call and the Node backend event, so I can't directly compare the times:
log.backendReceivedRequest - log.frontEndStart
1613
The second is the difference between receiving the first and the last data on the backend, which came out to 578 ms, close to Request sent (585 ms) in the network timing tab:
log.backendReceivedAllData - log.backendReceivedFirstData
578
I also changed the frontend code to send different sizes of data, and the network timing tab still matches the log.
The thing that remains unknown to me is: why is Google Chrome queueing my fetch when I'm not running any other requests and not using the server/host's bandwidth? I read the conditions for queueing but did not find the reason; maybe it is allocating the resources on disk, but I'm not sure: https://developer.chrome.com/docs/devtools/network/reference/#timing-explanation
References:
https://nodejs.org/es/docs/guides/anatomy-of-an-http-transaction/
https://developer.chrome.com/docs/devtools/network/reference/#timing-explanation
I found the problem. It was in the nginx config. Nginx was set up as a reverse proxy, and by default proxy request buffering is enabled, so nginx reads the whole request body first and only then forwards it to Node.js; that is why I saw the delay. The buffering can be turned off with the proxy_request_buffering off; directive:
https://nginx.org/en/docs/http/ngx_http_proxy_module.html#proxy_request_buffering

Process multiple unique Express JS requests

I've got a small Express JS API that I'm building to handle and process multiple incoming requests from the browser, and I'm having some trouble figuring out the best approach to handle them.
The use case is that there's a form, with potentially up to 30 or so people submitting form data to the Express JS API at any given time. The API then POSTs this data off to some other place using axios, and each one needs to return a response back to the browser of the person that submitted the data. My endpoint so far is:
app.post('/api/process', (req, res) => {
  if (!req.body) {
    res.status(400).send({ code: 400, success: false, message: "No data was submitted" })
    return
  }
  const application = req.body.Application
  axios.post('https://example.com/api/endpoint', application)
    .then(response => {
      res.status(200).send({ code: 200, success: true, message: response })
    })
    .catch(error => {
      res.status(200).send({ code: 200, success: false, message: error })
    });
})
If John and James submit form data from different browsers to my Express JS API, which forwards it to another API, I need the respective responses to go back to the respective browsers...
To make this clear: the response to a request is only sent to that requester. But if you want to accept a processing request and reply with something like "hey, I received your request, you can fetch the result from another GET route a bit later", then you need a way to identify which job is meant. You can generate a UUID when the server receives a process request and send it back to the sender as the response: "hey, I received your process request, you can check the result of the process later, and this UUID is your reference code." The client then passes the UUID as a path or query parameter, and the server returns the correct result.
This is also the usual approach when you are using WebSockets: send a process request to the server, the server sends back a UUID reference code, and some time later the server pushes the result to the requester's WebSocket, saying "hey, this is the result of the process with that UUID reference code."
I hope that is clear enough.
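As a rough sketch of that idea, assuming Express and an in-memory job map (the route names and the slowThirdPartyCall helper are made up for illustration; real code would persist the jobs and call the actual third-party API):
const express = require('express')
const { randomUUID } = require('crypto')

const app = express()
app.use(express.json())

const jobs = {} // jobId -> { status, result }; in-memory only for this sketch

app.post('/api/process', (req, res) => {
  const jobId = randomUUID()
  jobs[jobId] = { status: 'pending', result: null }

  // start the slow work without holding the HTTP request open
  slowThirdPartyCall(req.body) // hypothetical helper that returns a Promise
    .then(result => { jobs[jobId] = { status: 'done', result } })
    .catch(error => { jobs[jobId] = { status: 'failed', result: String(error) } })

  // reply immediately with the reference code
  res.status(202).send({ jobId })
})

// the client polls this route later with the UUID it was given
app.get('/api/process/:jobId', (req, res) => {
  const job = jobs[req.params.jobId]
  if (!job) return res.status(404).send({ message: 'Unknown job' })
  res.send(job)
})

app.listen(3000)
Each browser only ever sees its own jobId, so John's and James's results cannot get mixed up.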

nodejs - how to write a client that connects to a Node.js server and receives broadcast messages

I have the following Node.js server that seems to work fine. I would like to write a client that receives messages from the server and invokes some JS based on the message.
The steps involved are:
User accesses the url http://server.xyz.com:8080/pa
The Node.js server receives that call and broadcasts to the connected clients that pa is the API call received.
The Node.js clients that are connected to the server invoke some JS related to the pa action.
My questions are:
1. How do I make sure the server broadcasts that message as in Step 2?
2. How do I write a client that performs Step 3 above?
For the client, I am seeing a lot of references to socket.io, but I am not sure what the best framework is in this case.
server.js
var http = require('http');
http.createServer(function(request, response) {
request.on('error', function(err) {
console.error(err);
response.statusCode = 400;
response.end();
});
response.on('error', function(err) {
console.error(err);
});
response.writeHead(200, {'Content-Type': 'application/json'});
var body=[];
if (request.method === 'GET' && request.url === '/pa') {
response.end(JSON.stringify({"action": "pa"}));
}
else if (request.method === 'GET' && request.url === '/pi') {
response.end(JSON.stringify({"action": "pi"}));
}
else {
response.statusCode = 404;
response.end();
}
}).listen(8080);
If the clients are also in Node.js then you should be able to set up a webhook/push service. Webhooks are used in every major API today; they are extremely prevalent in Slack's and Microsoft's APIs. The following is inspired by Microsoft's Office 365 push and streaming services APIs.
Add a route that clients can POST to; let's call it /subscribe. In the request body to the /subscribe route, they include a URL, which we'll call the paReceiverUrl.
If a client wishes to be notified when someone else hits the /pa endpoint, it can send a request to the /subscribe endpoint and include the location to notify it at - the paReceiverUrl (along with any other info, probably some auth data). You should store the subscribed clients' information in non-volatile storage to be safe; have a database do this for you, or simply write it to a file until it gets more complex.
Now your '/pa' route becomes something more like this (your Step 2):
if (request.method === 'GET' && request.url === '/pa') {
  // Send response to the request sender
  response.end(JSON.stringify({"action": "pa"}));
  // Get subscribed clients' information
  var subscribedClients = <Read from file or db>
  // Broadcast
  for (var i = 0; i < subscribedClients.length; i++) {
    // Send them a request at subscribedClients[i].paReceiverUrl
  }
}
Now, for your Step 3, the clients simply need to have their servers accepting requests at their paReceiverUrl, and there they can invoke whatever JS they want.
If you need it to be more real-time, then I would go with the WebSocket protocol to set up a persistent connection to stream data over.
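Putting the pieces together, here is a rough, hedged sketch of what the /subscribe route and the broadcast could look like on top of the raw http server from the question. The in-memory subscribers array stands in for the file or database, and axios is assumed to be available for the outgoing webhook calls.
var http = require('http');
var axios = require('axios'); // assumed available for the outgoing notifications

var subscribers = []; // list of { paReceiverUrl }; use a file or db in real code

http.createServer(function (request, response) {
  if (request.method === 'POST' && request.url === '/subscribe') {
    // a client registers the URL it wants to be notified at
    var body = [];
    request.on('data', function (chunk) { body.push(chunk); });
    request.on('end', function () {
      subscribers.push(JSON.parse(Buffer.concat(body).toString()));
      response.writeHead(200, {'Content-Type': 'application/json'});
      response.end(JSON.stringify({subscribed: true}));
    });
  }
  else if (request.method === 'GET' && request.url === '/pa') {
    response.writeHead(200, {'Content-Type': 'application/json'});
    response.end(JSON.stringify({"action": "pa"}));
    // broadcast: POST to every subscriber's paReceiverUrl
    subscribers.forEach(function (sub) {
      axios.post(sub.paReceiverUrl, {action: 'pa'}).catch(console.error);
    });
  }
  else {
    response.writeHead(404);
    response.end();
  }
}).listen(8080);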

HTML5 Event Source API and node.js | How to send the stream to the correct client?

For some reason, which I think there is no point mentioning, I can't use socket.io, and I decided to use the HTML5 EventSource API (Server-Sent Events) to send a message to the client. The message tells the client that his payment has been received via a third-party callback.
I've got an ID that identifies each client, and it is also received in the callback. I've got two questions so far:
I suppose every message sent is broadcast to all the clients. Is there a way to select a specific client by its ID?
Currently I am implementing this on the client using an if statement, but it would be great if I could send the message directly from the server to the right client to improve performance.
When I close the connection on the client, I guess I am not closing all the connections established, am I?
My code:
Node.js
app.get('/payments', function(req, res) {
  res.header("Access-Control-Allow-Origin", "*");
  res.header("Access-Control-Allow-Headers", "X-Requested-With");
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Connection': 'keep-alive',
    'Cache-Control': 'no-cache',
  });
  ee.on("payment", function (data) {
    res.write("data: "
      + JSON.stringify({'wallet': data.address, 'refund_address': data.refund_address, 'payment_status': 'paid'})
      + "\n\n"
    );
  });
});
Client
var source = new EventSource("/payments");
source.addEventListener('message', function(e) {
  if (e.origin == 'http://localhost:3000') {
    var data = JSON.parse(e.data);
    if (btcwalletdir == data.wallet) { // each client filters
      // do whatever here
      source.close();
    }
    return;
  }
}, false);
Is it a valid solution for a production environment?
Regards,
Regarding EventSource, each client has its own connection. Each time someone accesses that endpoint, a request/response pair is assigned to it, so when you call res.write you are only writing to one client.
But of course, if you broadcast the "payment" event to every connection, each one will send the JSON message to its respective client. So you need to find a way to send the data only to the correct connection (e.g. adding a user/connection ID to the data you emit and storing the res objects in an array/object).
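For example, here is a hedged sketch of that idea on top of the code from the question: the wallet address is passed as a query parameter and used as the connection ID, clients is a plain in-memory object, and ee is the same event emitter that fires the "payment" event.
var clients = {}; // wallet address -> res object, one per open SSE connection

app.get('/payments', function (req, res) {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Connection': 'keep-alive',
    'Cache-Control': 'no-cache'
  });
  // the client connects with new EventSource('/payments?wallet=' + btcwalletdir)
  var wallet = req.query.wallet;
  clients[wallet] = res;
  req.on('close', function () { delete clients[wallet]; }); // drop it when the client disconnects
});

// one listener for all connections: look up the right one instead of broadcasting
ee.on('payment', function (data) {
  var res = clients[data.address];
  if (res) {
    res.write("data: "
      + JSON.stringify({wallet: data.address, refund_address: data.refund_address, payment_status: 'paid'})
      + "\n\n");
  }
});
That way each payment event only reaches the connection whose wallet matches data.address, and closing one EventSource does not affect the others.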
