Waiting for/receiving an external web service request in cucumberjs - javascript

I think I just need a pointer in the right direction.
I am testing an application server (through its REST API). I'm using cucumberjs
When I invoke a specific method, some time later the application will send a notification to a URL. The URL is configurable, but only in the app settings - i.e. I can't give a callback URL for each invocation.
In my test step, I need to wait for (and receive) that notification, and extract some data from the request body to use in later steps.
How do I go about this? I guess I could set up a web server for each scenario, pass it a reference to my World object, and have it update something there with the details of the notification (it's OK to configure the app settings to point at my testing app).
But how do I wait (with a timeout) for the notification to be received?
(Ideas, and pointers to doc I should have found, suggestions for node.js packages etc. all welcomed)

I am performing similar types of testing, where I need to trigger something in a cloud-based system and then wait for some behavior to occur. To achieve this, I'm using the promise-retry NPM package to perform polling (when needed) and configuring the default timeouts in CucumberJS to be higher than normal (usually around 60 seconds). I also use config to make all these timing settings easier to manage.
As far as how you're interacting with your system, it depends on what type of system it is. If you're using Azure, AWS, or Firebase there are API clients that they provide.
If you need to poll, I would recommend a promise-based HTTP client like flashheart, axios, or superagent. CucumberJS itself does not provide these capabilities, but it's easy to bring in other modules for CucumberJS to integrate with.
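The polling pattern can be sketched with plain Promises and no external packages (promise-retry adds configurable backoff on top of essentially this loop; the function name and defaults here are illustrative):

```javascript
// Poll an async check function until it resolves truthy, retrying up to
// `retries` times with a fixed delay between attempts. A plain-Promise
// sketch of the polling pattern; promise-retry adds backoff and jitter.
function pollUntil(check, { retries = 10, delayMs = 500 } = {}) {
  return check().then(result => {
    if (result) return result;                  // condition met, done
    if (retries <= 0) throw new Error('condition never became true');
    return new Promise(r => setTimeout(r, delayMs))
      .then(() => pollUntil(check, { retries: retries - 1, delayMs }));
  });
}
```

In a step definition, `check` would typically be an HTTP call (via axios, superagent, etc.) that resolves truthy once the system has reached the expected state.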
If you want to receive notifications directly, you could use some sort of cloud-based event hub like Azure Service Bus. Alternatively, if your tests are running on the same network as the system you could easily run an Express HTTP server within CucumberJS. As you receive messages, push them onto an array and then have a step definition to assert that the array contains the expected message.

Related

Making web server run scripts

I'm sure a hundred people have asked this before, but I couldn't find the right keywords to google it.
Simply put: How do I make a web client send information to the web server the right way?
I have a simple website set up with Apache 2 on a Raspberry Pi running Ubuntu. Something very basic. I'd like a button on my website that makes my server run a script (I wrote the script in C++, but I don't mind translating it). One solution is using client-side JS to send a message to my server on a specific port (say 50000) and having the server listen on that port with a custom listener.
That works fine, but I'm sure there is a right way to do this. How should I do it so that people won't be too pissed by my architecture? (Plus, using ports other than 80 and 443 in the browser may not work if the client blocks other ports.)
What you want to do is create a RESTful web API. The server can listen on a specific port and handle different HTTP requests in different ways. You will want to use Controllers, Services, and potentially a Data Access Object layer. You could either have the script you want to run in code, or you could have your code simply make a call to execute a bash or shell script once you get a valid request of the correct type (GET, POST, PATCH, PUT, etc) with the correct parameters.
https://www.tutorialspoint.com/nodejs/nodejs_restful_api.htm
You're correct, it would be a bad decision to have a single endpoint on one port do one thing and another endpoint on another port do something else. First, a single application can only listen on a single port (or at least should only listen on a single port), so you'd need to spin up a new application for everything you want your back-end server to do. Second, your API can't be semantic. Your users would have to look up a dictionary of port-to-action mappings, instead of being able to (for example) send a request to yourService.com/run/script/1234 to run the script with ID 1234.
Here is a bit of information on HTTP requests to get you started: https://developer.mozilla.org/en-US/docs/Web/HTTP/Methods

Client access vs broadcast data from web server

I'm looking for techniques or advice to help me decide on an approach for a new web site.
The site shows real-time data that lives on the server, either as a file or as data in memory.
I'll use Node.js on the server side, but I can't decide how to get the data and show it to site users,
because the data has to update at least once per second.
I think it is similar to a stock price page.
I know there are a lot of ways to access data, like AJAX, Angular.js, Socket.io...
Each has its pros and cons.
Which platform or framework is good in this situation?
This ultimately depends on how much control you have over the server side. For data that needs to be refreshed every second, doing the polling on client side would place quite the load on the browser.
For instance, you could simply use one of the many available frameworks to make HTTP requests inside some form of interval. The downsides of this approach include:
the interval needs to be run in the background all the time while the user is on the page
the http request needs to be made for every interval to check if the data has changed
comparison of data also needs to be performed by the browser, which can be quite heavy at 1 sec intervals
If you have some server control, it would be advisable to poll the data source on the server (e.g. using a proxying microservice) and have the server perform the change checking, sending data to clients only when it has changed.
You could use Websockets to communicate those changes via a "push" style message instead of making the client browser do the heavy lifting. The flow would go something like:
server starts polling when a new client starts listening on its socket
server makes http requests for each polling interval, runs comparison for each result
when result has changed, server broadcasts a socket message to all connected clients with new data
The main advantage to this is that all the client needs to do is "connect and listen". This even works with data sources you don't control – the server you provide can perform any data manipulation needed before it sends a message to the client, the source just needs to provide data when requested.
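The poll/compare/broadcast loop above can be sketched transport-agnostically; the function below is illustrative, with the actual push (ws, Socket.io, etc.) supplied as a callback:

```javascript
// The poll/compare/broadcast loop: `fetchData` grabs the current state,
// `broadcast` pushes it to clients (e.g. over WebSockets). The server
// does the comparison once and only broadcasts when the data changed.
function startChangeBroadcaster(fetchData, broadcast, intervalMs = 1000) {
  let last; // JSON snapshot of the previous result
  const timer = setInterval(async () => {
    try {
      const data = await fetchData();
      const snapshot = JSON.stringify(data);
      if (snapshot !== last) {   // change check happens on the server
        last = snapshot;
        broadcast(data);         // push to every connected client
      }
    } catch (err) {
      // ignore transient fetch errors and keep polling
    }
  }, intervalMs);
  return () => clearInterval(timer); // call to stop polling
}
```

With ws, for example, `broadcast` would iterate the server's connected clients and call `send` on each; the browsers just listen.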
EDIT: I just published a small library that accomplishes this goal: Mighty Polling ⚡️ Socket Server. It's still young, so evaluate it carefully before using it.

"Direct Response with Node.js" - Sending HTTP response with different Node.js process (different than main process)

Using Node.js servers, I am wondering if it is both possible and recommended to send an HTTP response from a delegated worker process, instead of the main process. These worker processes could be Node.js servers themselves, or simply Node.js child processes that communicate via IPC.
I don't think the cluster core module https://nodejs.org/api/cluster.html can do what I want to do, because in that model, all the workers are listening on the same port, and they process all requests on behalf of the master process. What I am looking for is one main Node.js process that responds to all HTTP requests, perhaps does the authentication and processes some requests, but is also capable of delegating data-intensive or CPU intensive requests to a worker pool.
Imagine that we have a GET request for a large amount of data, say 2-3 MB.
We have at least 3 possible scenarios:
The main process receives the request, asks the database for the large amount of data and then sends the data back to the requestor.
The main process receives the request and sends some data to a worker process using IPC; the worker gets the data from the DB, does some heavy operations, and then uses IPC to send all 3 MB of data back to the main process, which sends back the response.
The main process receives the request and sends as little information as possible about the request stream to the worker; the worker does all the work and sends back the HTTP response itself.
I am particularly curious about making #3 possible.
A simple depiction of scenario 3 is below:
(Just to be clear, I don't want 3 responses for one request, I am just trying to show that a worker could possibly send the response on behalf of the main process).
Does anyone know how this might work with Node.js? How might it work in other languages? Normally I have no problems with the Node.js concurrency model, but with some types of data, using the Cluster module is probably not the best way to achieve the highest levels of concurrency.
I believe one term for this model is "direct response", meaning the worker responds directly to the request. And perhaps it's possible to simply use the cluster core module https://nodejs.org/api/cluster.html for this.
I am wondering if it is both possible and recommended to send an HTTP response from a delegated worker process
Yes, this is possible, and it is probably the easiest and most common way to scale out your application servers. Unlike IPC, it can work across hosts over a network. (It will also work locally if you want it to... but do make sure you actually are CPU-bound in your application. Despite JavaScript itself being single-threaded, most of the libraries for IO and some NPM modules use thread pools.)
There's no reason to use Node.js as the load balancer in front of the backend servers. Node.js is better suited as your application server. For something that just proxies HTTP requests, I'd use Nginx or similar: Nginx can efficiently handle all the dealing with the client and can easily be configured to load balance.
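A minimal sketch of such an Nginx configuration, assuming two Node.js instances on ports 3001 and 3002 (the upstream name and all ports are placeholders):

```nginx
# Hypothetical Nginx config: balance incoming traffic across two
# Node.js application servers, each of which responds directly.
upstream node_app {
    server 127.0.0.1:3001;
    server 127.0.0.1:3002;
}

server {
    listen 80;

    location / {
        proxy_pass http://node_app;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Each Node instance handles its requests end to end, so the response comes straight from the worker that did the work, with Nginx only shuttling bytes.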
If you're trying to exploit multiple processors in your machine(s) (executing Node by itself only uses a single process), just use PM2:
https://www.npmjs.com/package/pm2
PM2 starts multiple instances of your application on the processors you dictate to PM2. If your application is stateless (as it ideally is with Node), an instance of your app will run on each processor and PM2 will do the routing.
If I can verbally redraw the diagram you posted for scenario 3: PM2 would take the place of "MAIN", "W" would be replaced with your app, and there would be no need to worry about workers and forking.
We use PM2 in production and it performs well for us.

Implementing a client-side WebHook handler?

I am a bit of a newbie in Webhooks, so excuse me if this is a simple question.
I am clear about how Webhook providers work: whenever there is information to be pushed, the provider sends the payload to the URL specified as the callback.
Now my question is: how do I write a client-side Webhook handler that can detect/process the callback and update my client side accordingly? For example, if my client side is a simple web page with bullet points, I would like to append new data to the list whenever it comes through.
Preferably, I would be after a complete JavaScript solution...
Is there perhaps a JS WebHook Client/Handler that already exists? It seems that this should be so common, that it should exist, although I haven't been able to find anything.
Take a look at WebSockets. Depending on your needs, this could be exactly what you need to avoid polling and keep things in sync, particularly if you have lots of clients who need to see the same updates from your server.
I highly recommend Socket.IO
To consume a webhook API endpoint, or in other words to "listen for changes", you'd poll for changes, long-poll for changes, or do anything else clever you'd like.
Or you can use any JavaScript publisher/subscriber module to do this easily. Try googling for PubSub tools; here's an example of one such tool: http://www.pubnub.com/tutorial/javascript-push-api
Webhooks are not made for this. Event notification with webhooks is done through POST requests, meaning your client app cannot be notified about new events unless it listens for incoming HTTP requests (usually the client is behind a firewall, so this will not be feasible in most cases).
If you'd like to avoid polling the server for status updates, use WebSockets as matthewhudson pointed out.

Forward get request from one node.js server to another

I currently have two instances of nodejs servers running, both listening on localhost, with instance 1 on port 15000 and instance 2 on port 16000. The first is going to work as a master with the second as part of a group of slaves, where any request coming into the first gets forwarded to the second.
I'm having trouble sending any messages from the first to the second.
var jquery = require('jquery');
jquery.get('http://localhost:16000');
called from the first does not get received by the second (jquery is loaded correctly). I'm about to try Mootools, but would like some advice on the best way to forward an incoming nodejs request directly to another instance of a nodejs server.
You want cluster.
You simply call cluster with an instance of a http.Server or net.Server and it does the load balancing for you.
If you want to roll out something yourself then call your clients with http.request which is a sensible way to send a HTTP request to a particular server in node.js.
Using jQuery or MooTools to do this is horrible (they don't use the native C goodness that standard node.js modules do!). Don't do this. The only reason you would want jQuery/MooTools in node is to manipulate a DOM with jsdom.
