I am in a project where I need to connect to a client's server and retrieve data for some graphs.
The client wants this to happen in real time (they have a script listening for connections that feeds data constantly).
My big problem is that I don't think this can be done as easily in IE as in Firefox or Chrome. Firefox and Chrome both support WebSockets, so there I could build something that does exactly what the client wants.
In IE the only way I can see to do this is a setTimeout loop with an Ajax call.
Is there any other alternative for IE?
For now we're using jQuery's .ajax() and setting a callback for the onreadystatechange event that handles the server response when readyState == 3. But this doesn't work as expected, because the connection closes after the first response.
Also, I am using IIS with the Application Request Routing module as a proxy, plus a URL rewriting rule, because we're using cross-domain services. But this module buffers responses in chunks of at least 1 KB, which corrupts the server response. The services were created by another company that won't implement the CORS specification on their server.
How can I work around this and receive the response without buffering?
Thank you
Instead of WebSockets you can use long polling with a server-side promise.
On the server you suspend the HTTP request so it is handled asynchronously.
Example jQuery:
poll();

function poll() {
    $.ajax({
        url: "http://my-own-domain.com/data/",
        success: function (data) {
            // do what you want with data
            poll(); // immediately issue the next long poll
        },
        error: function (data) {
            poll(); // retry after an error or timeout
        }
    });
}
Server-side example (Play! Framework):
WSRequest req = WS.url("http://the-other-domain.org/data/");
Promise<HttpResponse> respAsync = req.getAsync(); // non-blocking call
HttpResponse resp = await(respAsync); // suspends this request without blocking a thread
renderJSON(resp.getString());
For server side take a look at Play! Framework Async to get an idea how it can be done.
I have a webpage consisting of HTML/CSS/JS/JQUERY on server 1.
On server 2, I have a PHP file to get around CORS.
So I make requests from server 1 to server 2. The PHP code then uses cURL to interact with the API I am working with.
This API sends back cookies, which I want to forward to server 1 (the website).
I've been searching for a while, but I have no idea how to go about doing this.
I'm guessing it has something to do with catching the response headers?
Any help would be nice! I'm new to PHP.
Here's more info:
// Here is how I am calling the server 2 file.
$.ajax({
    url: "http://localhost......server.php", // note: the option name must be lowercase "url"
    type: "post",
    data: $(this).serialize(),
    success: .....,
});
Is there any way I can just read the headers in this file directly?
E.g. make it automatically set the cookies here?
If I understand this question correctly, you are hosting a website, and the website makes a request to a PHP script that in turn calls an API that returns some Set-Cookie headers.
For the cookies to be passed on to the client viewing the website, you'd need to extract the cookie values and call PHP's setcookie() function before returning the response to the client from PHP.
Most frameworks provide a mechanism to specify the headers of the response to the client, including the ability to set a cookie.
Based on your update:
// Here is how I am calling the server 2 file.
$.ajax({
    url: "http://localhost......server.php",
    type: "post",
    data: $(this).serialize(),
    success: .....,
});
So you have some JS making a request to the "server 2" file. This "server 2" file should be able to extract the values from the response to the cURL call it is making and return Set-Cookie headers to the client. The client will then store those cookies and can provide them on subsequent requests to http://localhost. How long they persist, for which paths, etc. are configurable as part of creating the cookie.
It might be worth posting more code, or making it clearer what your aim is, as there may be a better solution beyond extracting cookie data from the cURL call and passing it back to the client.
I'm trying to build an Outlook add-in for encrypting/decrypting emails. The add-in itself uses a web app, which in turn uses JavaScript.
I basically want to pass the unencrypted/encrypted email to a local server running Python using GET/POST requests, perform some encryption/decryption on it in Python, and then pass it back to the Outlook add-in.
The problem is that the add-in just won't pass any data to the local server nor get anything in return. I have tried working with both Django and Flask, with the same results. I also configured the local servers to use locally generated certificates and keys, with no improvement. I even tried routing the HTTP connection through ngrok, but nothing came of it.
(Screenshots: server response to a GET request over HTTP, and over HTTPS using Werkzeug.)
The code given below never writes "Here5" to #item-status. Clearly the success callback isn't being run.
$.ajax({
    url: "https://127.0.0.1:8000/getPost/hello/",
    type: 'GET',
    dataType: 'json', // added data type
    success: function (res) {
        $('#item-status').text("Here5");
        $('#item-message').text(res);
    }
});
I've also added the required URLs in the App Domain part of the manifest file:
<AppDomain>https://127.0.0.1:8000/getPost/hello/</AppDomain>
The Add-in performs fine when running GET requests from other sites on the internet, like:
https://api.github.com/users
However, I just can't seem to run it on my local server. How do I solve this?
Edit: After some debugging, I've learned that the XMLHttpRequest status is 0 and responseText is empty. I think it might have something to do with CORS. Am I correct in that assumption? If so, how do I allow cross-origin requests to localhost in this particular case?
Thank You
I have a React application for my business that has multiple clients. My architecture is set-up like this:
Server A - Main Application Server, only serves React app (url = https://example.net)
Server B - Business Client #1, serves their slightly customized React app (url = https://example2.net)
Server C - Business Client #2, serves their slightly customized React app (url = https://example3.net)
Microservice Server - Running a Java microservice (url = https://api.example.net - notice it's a subdomain of Server A)
Server A, B, and C are all talking to the Microservice Server for API calls. The API calls are using the native browser fetch() call with these options
var options = {
    method: "{method}",
    mode: "cors",
    credentials: "include",
    headers: { "Content-Type": "application/json" }
};
The server response has this header:
Access-Control-Allow-Credentials: true
and uses the microservice's built-in
enableCorsForAllOrigins()
Now, this all works fine on Firefox, Chrome, IE, and Edge. However, it doesn't work on Safari (MacOS or iPhone) for Server B or Server C. It works fine on Safari on Server A.
The API calls are in this sequence...
POST https://api.example.net/login
GET https://api.example.net/users
On Safari, the first call is successful, but the second call fails with a 401 error. The 401 error is thrown from the microservice trying to get the Session attribute for the "currentUserId", which is null. In other words, Safari is not sending the cookie to the server. I've figured out that Safari doesn't like that the API calls are going to a different URL base than the client's server. (In other words, Server B making API calls to Server A).
To further verify this, if a user starts on Server B, then navigates to Server A and makes an API call (a bad login for example), and then goes back to Server B, then it works for them on Safari. It seems like Safari just needs to exchange a cookie on the Server A URL to get it working on every other server.
Now, my issue...how do I fix this? Users on Mac Safari and iPhone Safari on Server B and Server C are not able to use the application because their session cookies are not sent to the server. Has anyone seen this before? Does anyone have solutions? My first attempt was to create an iframe on Server B that loaded Server A's page, but that didn't work.
Many thanks in advance!
So after reading the links provided by @str above, and reading the links from those links, I came to the conclusion that getting this working with cookies and sessions wouldn't fly with Safari. So I scrapped that entire setup and went with JWT instead.
So, my solution now is:
POST https://api.example.net/login -> the response includes a server-generated JWT with the userId in it.
The JavaScript code then saves this JWT into sessionStorage (not a cookie, or you'll run into the same issue as when you used sessions):
window.sessionStorage.setItem("jwtToken", data.jwtToken);
Set up the API calls to always pass the JWT as a header:
"Authorization": "Bearer " + window.sessionStorage.getItem("jwtToken")
Then, using a server-side library, I can extract the JWT token and get the userId from it.
It works on Safari and is the preferred approach nowadays for web authorization, so I'd say go with this solution. The downside is that storing the JWT in sessionStorage opens the user up to XSS attacks.
The npm-request library allows me to construct HTTP requests using a nice JSON-style syntax, like this:
request.post(
    {
        url: 'https://my.own.service/api/route',
        formData: {
            firstName: 'John',
            lastName: 'Smith'
        }
    },
    (err, response, body) => {
        console.log(body);
    }
);
But for troubleshooting, I really need to see the HTTP message body of the request as it would appear on the wire. Ideally I'm looking for a raw bytes representation with a Node.js Buffer object. It seems easy to get this for the response, but not the request. I'm particularly interested in multipart/form-data.
I've looked through the documentation and GitHub issues and can't figure it out.
Simplest way to do this is to start a netcat server on any port:
$ nc -l -p 8080
and change the URL in your code to point at it over plain HTTP (netcat can't terminate TLS):
http://localhost:8080/api/route
Now any request you make will print the entire, raw HTTP message sent to the localhost server.
You obviously won't get a response back, but the entire raw request data will be available for you to inspect in the terminal where netcat is running.
I figured out how to dump the HTTP message body with Request. In both cases, I'm just copying the same approach that request uses internally.
Multipart Form Uploads
req._form.pipe(process.stdout);
URL-encoded Forms
console.log(req.body);
You could try @jfriend00's suggestion and use a network sniffer like Wireshark, but since you're fetching an https URL this might not be the easiest route, as it requires some setup to intercept TLS connections.
So maybe it would be enough to turn on debug mode for the request module itself; you can do that by simply setting require('request').debug = true. As a third option you could go with the dedicated request-debug module, which allows you to view request and response headers and bodies.
I can think of a number of ways to see the bytes of the request:
Turn on debugging in the request module. There are multiple ways to do that, including setting NODE_DEBUG=request, setting require('request').debug = true, or using the request-debug module.
Use a network sniffer to see what's actually being sent over the socket, independent of your node.js code.
Create your own dummy http server that does nothing but log the exact incoming request and send your same request to that dummy server so it can log it for you.
Create or use a proxy (like nginx) that can dump the exact incoming request before forwarding it on to its final destination and send the request to the proxy.
Step through the sending of the request in the debugger to see exactly what it is writing to the socket (this may be time consuming, particularly with async callbacks, but will eventually work).
You could use a Node.js server capable of logging the raw request/response string, then direct your request to that server.
I gave an example using both an HTTP and an HTTPS server, with no dependencies:
Node.js: getting the raw HTTPS request and response
My web application uses javascript to communicate with a server by HTTP requests. However the server software is changed so that instead of using HTTP requests, it uses WebSocket communication.
Rewriting the entire web application to use open communication (i.e. WebSockets) instead of a request-respond mechanism would be an incredibly tedious job, and I don't really gain anything from using WebSockets in this web application.
What I want to be able to achieve is to add a javascript module that communicates by WebSockets, but still in a request-respond manner. I guess this means that I have to "pause" the javascript until the server responds to my WebSocket messages? Because with HTTP requests my web application waited until the request was responded to.
So I guess it boils down to: How can I make javascript wait until the server sends a WebSocket message?
Cheers
EDIT: To specify, my Javascript code uses HTTP requests throughout the source code via a function I named "get(URL)". I want to modify this function so that it sends a WebSocket message, waits for the response and returns the response. Any ideas?
If I understand you correctly from the comments, you do not in fact want to "pause" your JavaScript, but merely to replace async Ajax calls with WebSocket communication.
You would emit a message to the server with a pre-defined key that the server listens for. You can include some sort of payload in the message if you need to, and then pass a callback function that receives the response data. The following is a sample where I did something similar for a simple chat application using socket.io.
My client code:
socket.emit('join', options, function(res){
    _printToChat(res);
});
Server code:
socket.on('join', function(roomname, fn){
    socket.join(roomname);
    fn('You joined ' + roomname);
});
This sends a message with the key "join", and the callback is called with the response from the server. In this case res on the client would be "You joined " + roomname.
You would likely do something like:
Client code:
socket.emit('getSomeResource', options, function(res){
    // do something with the response in the res variable
});
Server code:
socket.on('getSomeResource', function(yourOptions, fn){
    fn(resourceToReturn);
});