My question seems pretty basic, but I've gone through a lot of documentation and questions on this forum without finding a proper way to get this done.
I have a secured web app in which I handle redirections programmatically so that authentication headers are sent with each request. Thus, instead of href links, I have buttons, which trigger the following function:
route_accessor.mjs
const access = (path = '') => {
const token = localStorage.getItem('anov_auth_token');
const dest = `http://localhost:8080/${path}`;
const headers = new Headers();
if (token) headers.append('authorization', token);
fetch(
dest,
{
method: 'GET',
headers,
mode: 'cors',
redirect: 'follow'
}
)
.then(response => {
if (response.url.includes('error/403')) {
localStorage.removeItem('anov_auth_token');
}
// Here I need to redirect to the response page
})
.catch(error => {
console.error(error);
});
};
export default access;
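For context, the buttons wire into this function roughly like the sketch below; the data-path attribute and the selector are just illustrative, not my actual markup.
import access from './route_accessor.mjs';

// Hypothetical wiring: each "link" button carries its target path in a data attribute
document.querySelectorAll('button[data-path]').forEach((btn) => {
  btn.addEventListener('click', () => access(btn.dataset.path));
});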
Then I have a Node.js backend, which determines where I should go, either:
My requested page
A 403 page (if I sent a wrong token)
A login page (if I haven't sent any token)
The backend works perfectly so far. The problem is that once I've made my request, I can't display the result the way I'd like. Here is what I tried; the following code goes where I put the comment inside route_accessor.mjs.
Using window.location
window.location.href = response.url;
I tried every variant, such as window.location.replace(), but always ran into the same issue: those methods launch a second request to the requested URL without sending the headers. So I end up in an infinite 403 redirection loop even when the token is acceptable to my server.
I tried the methods listed in the following post: redirect after fetch response
Using document.write()
A second acceptable answer I found was manually updating the page content and location. The following almost achieves what I want, with a few flaws:
response.text().then(htmlResponse => {
document.open();
document.write(htmlResponse);
document.close();
// This doesn't do what I want, even without the second argument
window.document.dispatchEvent(new Event("DOMContentLoaded", {
bubbles: true,
cancelable: true
}));
});
With this, I get my view updated. However, the URL remains the same (I want it to change). And although every script is loaded, I have a few DOMContentLoaded listeners needed to make my page fully functional, and they aren't triggered at all. I can't manage to dispatch a new DOMContentLoaded properly after my document is closed.
Some other minor problems come up too, such as the console not being cleared.
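For reference, the only mechanism I'm aware of that changes the visible URL without firing a new request is the History API. A minimal sketch (this assumes the page itself is served from http://localhost:8080, otherwise replaceState throws a SecurityError, and it does nothing for the DOMContentLoaded problem):
// Sketch only: update the address bar after document.close(), without a new request
window.history.replaceState(null, '', response.url);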
Conclusion
I've been stuck on this issue for quite a while now, and all my research hasn't led me to what I'm looking for so far. Maybe I missed an important point here, but anyway...
This only concerns GET requests.
Is there a proper way to make them behave like a link tag, with a single href, but with additional headers? Can I do this with JavaScript alone, or is there a limitation to it?
Thanks in advance for any helpful answer!
Related
I am trying to intercept a fetch request and assert that the payload / query params are correct. I have searched the Cypress documentation, but everything mentioned is about setting query params. For example, I need to assert that a fetch request such as https://helloWorld.com/proxy/service has query parameters / payload of ?service=request&layers=demo. Is there a way to do this?
I've tried almost everything but something similar to this is what I'm shooting for. Any ideas?
cy.location("https://helloWorld/proxy/service").should((loc) => {
expect(loc.search).to.eq('?service=request&layers=demo')
})
Setting up an intercept can check the shape of the request
cy.intercept('https://helloworld.com/proxy/service').as('requestWithQuery')
// trigger the request
cy.wait('@requestWithQuery').then(interception => {
  expect(interception.request.url.includes('service=request')).to.eq(true)
})
I'm not sure if the assertion above is exactly what you need, but put a console.log(interception.request) in there to check out what you need to assert.
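If you need to assert individual parameters rather than substring-match the whole URL, one option (my sketch, not something the answer above relies on) is to parse the intercepted URL with the standard URL API:
cy.wait('@requestWithQuery').then(interception => {
  // Pull the query string out of the intercepted request URL
  const params = new URL(interception.request.url).searchParams;
  expect(params.get('service')).to.eq('request');
  expect(params.get('layers')).to.eq('demo');
});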
The intercept can also specify the query
cy.intercept({
  pathname: '/proxy/service',
  query: {
    service: 'request',
    layers: 'demo'
  },
}).as('requestWithQuery')
// trigger the request
cy.wait('@requestWithQuery')
By the way, your use of cy.location() is incorrect; you would use
cy.location('search').should('eq', '?service=request&layers=demo')
// or
cy.location().should((loc) => {
expect(loc.href).to.eq(
'https://helloworld/proxy/service?service=request&layers=demo'
)
})
But the app would already have to have navigated to https://helloWorld/proxy/service and it's not clear from your question if that is happening.
Catching "helloWorld/proxy/service"
When your app uses fetch, the hostname part of the URL is converted to lower case.
So sending fetch('https://helloWorld/proxy/service') can be intercepted with
cy.intercept('https://helloworld/proxy/service') // all lower-case url
There's a clue in the Cypress log: the logged fetch is shown as being all lower-case characters
(fetch) GET https://helloworld/proxy/service
BaseUrl and intercept
When baseUrl is a different domain from the intercept hostname, you can specify the hostname as an additional route-matcher option, although in practice I found that it also works with the full URL as shown above.
cy.intercept({ hostname: 'helloworld', pathname: '/proxy/service' })
I've now spent countless hours trying to get the Cache API to cache a simple request. I had it working once in between, but forgot to add something to the cache key, and now it's not working anymore. Needless to say, cache.put() not having a return value that specifies whether the request was actually cached does not exactly help, and I am left with trial and error. Can someone give me a hint on what I'm doing wrong and what is actually required? I've read all the documentation more than three times now and I'm at a loss…
Perhaps noteworthy: this REST endpoint sets pragma: no-cache and everything else cache-related to no-cache, but I want to forcibly cache the response anyway, which is why I tried to completely rewrite the headers before caching. It still isn't working (not matching or not storing, no one knows…).
async function apiTest(token, url) {
let apiCache = await caches.open("apiResponses");
let request = new Request(
new URL("https://api.mysite.com/api/"+url),
{
headers: {
"Authorization": "Bearer "+token,
}
}
)
// Check if the response is already in the cloudflare cache
let response = await apiCache.match(request);
if (response) {
console.log("Serving from cache");
}
if (!response) {
// if not, ask the origin if the permission is granted
response = await fetch(request);
// cache response in cloudflare cache
response = new Response(response.body, {
status: response.status,
statusText: response.statusText,
headers: {
"Cache-Control": "max-age=900",
"Content-Type": response.headers.get("Content-Type"),
}
});
await apiCache.put(request, response.clone());
}
return response;
}
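For context, this function is called from a Cloudflare Worker roughly like the sketch below; the token extraction and the "some/endpoint" path are placeholders, not my real code.
addEventListener("fetch", (event) => {
  // Placeholder wiring: pull the bearer token off the incoming request
  const token = (event.request.headers.get("Authorization") || "").replace("Bearer ", "");
  event.respondWith(apiTest(token, "some/endpoint"));
});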
Thanks in advance for any help. I asked the same question on the Cloudflare community first and didn't receive an answer in two weeks.
This might be related to your use of caches.default, instead of opening a private cache with caches.open("whatever"). When you use caches.default, you are sharing the same cache that fetch() itself uses. So when your worker runs, your worker checks the cache, then fetch() checks the cache, then fetch() later writes the cache, and then your worker also writes the same cache entry. Since the write operations in particular happen asynchronously (as the response streams through), it's quite possible that they are overlapping and the cache is getting confused and tossing them all out.
To avoid this, you should open a private cache namespace. So, replace this line:
let cache = caches.default;
with:
let cache = await caches.open("whatever");
(This await always completes immediately; it's only needed because the Cache API standard insists that this method is asynchronous.)
This way, you are reading and writing a completely separate cache entry from the one that fetch() itself reads/writes.
The use case for caches.default is when you intentionally want to operate on exactly the cache entry that fetch() would also use, but I don't think you need to do that here.
EDIT: Based on conversation below, I now suspect that the presence of the Authorization header was causing the cache to refuse to store the response. But, using a custom cache namespace (as described above) means that you can safely cache the value using a Request that doesn't have that header, because you know the cached response can only be accessed by the Worker via the cache API. It sounds like this approach worked in your case.
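A minimal sketch of that last approach, reusing the names from the question (apiTest, apiCache, token, url); the key point is that the cache key is a Request without the Authorization header, while the actual origin fetch still sends it:
async function apiTest(token, url) {
  let apiCache = await caches.open("apiResponses");

  // Cache key deliberately has no Authorization header,
  // so the cache has no reason to refuse to store the response
  let cacheKey = new Request("https://api.mysite.com/api/" + url);

  let response = await apiCache.match(cacheKey);
  if (!response) {
    // Only the real origin request carries the token
    let originResponse = await fetch(cacheKey.url, {
      headers: { "Authorization": "Bearer " + token }
    });

    // Rebuild the response with cache-friendly headers before storing it
    response = new Response(originResponse.body, originResponse);
    response.headers.set("Cache-Control", "max-age=900");
    await apiCache.put(cacheKey, response.clone());
  }
  return response;
}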
So I want to add some functionality to an already existing site, to make my life easier. One of the things I need, and can't seem to figure out, is how to capture the body payload data of a specific outgoing "POST" request. I found the code to do it before but didn't save it, and I've been searching for that code for two days to no avail.
So here is an example of the request the site is making to the server.
fetch("https://my.site/api/req", {"credentials":"include","headers":{"accept":"*/*","content-type":"application/json"},"referrerPolicy":"no-referrer-when-downgrade","body":"{\"symbol\":\"mySYM\",\"results\":[{\"data\":{\"id\":\"dataID\"},\"result\":\"signature\"}]}","method":"POST","mode":"cors"});
and the part I need to catch is the "body" portion, and then unescape it so it looks like this:
{"symbol":"mySYM","results":[{"data":{"id":"dataID"},"result":"signature"}]}
Also, if possible, I would like it to capture data only when the method is POST and the request goes to a specific URL, so it catches /api/req and ignores other URLs and requests whose method is GET or HEAD.
Currently, I manually get the data from the request using dev tools by clicking on the correct request and scrolling down to find the POST data.
In case you need to know the reason for this: the server signs the data through the websocket connection, and I am essentially trying to capture that signature to be able to replay it. I am not trying to catch the websocket data, as it's incomplete for my needs; I need to catch the whole outgoing request body data.
Thanks in advance.
Chosen Solution:
Thanks @thirtydot for your responses. Note that my specific situation involved only fetch requests, so that is the reason I went with this route. With your response, a bit more of my own research, and the help of this post, I came up with this solution. (I don't really care to see the responses; I have other functions taking care of the responses that matter to me.)
const constantMock = window.fetch;
window.fetch = function() {
if (arguments[0] === '/api/req' && arguments[1] && arguments[1].method.toUpperCase() === 'POST'){
bodyResults(arguments[1].body)
}
return constantMock.apply(this, arguments)
}
function bodyResults(reqBody){
console.log(reqBody)
}
which put the following in the console (exactly as I wanted):
{"symbol":"NEON","results":[{"data":{"expires_at":"1561273300","id":"2469c8dd"},"signature":"6d712b9fbb22469c8dd240be13a2c261c7af0dfbe3328469eeadbf6cda00475c"}]}
except now I can return this data through that function and continue to run the rest of my script, fully automated.
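One caveat I'll add (my own note, not part of the original solution): fetch can also be called with a Request object instead of a (url, options) pair, in which case arguments[0] is not a string and arguments[1] may be undefined. A slightly more defensive sketch of the same wrapper:
const originalFetch = window.fetch;
window.fetch = function(input, init) {
  // Handle both call styles: fetch(url, options) and fetch(new Request(...))
  const url = typeof input === 'string' ? input : input.url;
  const method = ((init && init.method) || (input && input.method) || 'GET').toUpperCase();
  if (url.includes('/api/req') && method === 'POST') {
    // Note: reading the body of a Request object would need input.clone().text(); only init.body is read here
    if (init && init.body) bodyResults(init.body);
  }
  return originalFetch.apply(this, arguments);
};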
Extra Solution:
In case there are others struggling with similar issues and care to catch the responses of those fetch requests I could have alternatively used:
const constantMock = window.fetch;
window.fetch = function() {
if (arguments[0] === '/api/req' && arguments[1] && arguments[1].method.toUpperCase() === 'POST'){
bodyResults(arguments[1].body)
}
return new Promise((resolve, reject) => {
constantMock.apply(this, arguments)
.then((response) => {
if(response.url.indexOf("/me") > -1 && response.type != "cors"){
console.log(response);
// do something for specific conditions
}
resolve(response);
})
.catch((error) => {
reject(error);
})
});
}
function bodyResults(reqBody){
console.log(reqBody)
}
Possible XHR Solution
NOTE: this one is untested! An alternative solution for XHR requests could be done similarly, using something along the lines of:
(function(open) {
XMLHttpRequest.prototype.open = function(method, url, async, user, pass) {
alert('Intercept');
open.call(this, method, url, async, user, pass);
};
})(XMLHttpRequest.prototype.open);
Hope this helps!
So I came across this today and I'm not quite sure if it's exactly what you were looking for over two years ago, but it solved my problem and I thought I should share it in case others need it.
I'm currently using a marketing automation tool which is quite limiting when it comes to landing pages, but I wanted the client to be able to update the content whenever needed and still have access to custom functionality, so I needed the payload which was being sent by the form submission.
Here is what I used to get the form submission payload:
(function() {
var origSend = XMLHttpRequest.prototype.send;
XMLHttpRequest.prototype.send = function() {
console.log('request started!');
console.log(arguments[0]);
this.addEventListener('load', function() {
console.log('request completed!');
console.log(this.status);
});
origSend.apply(this, arguments);
};
})();
The arguments[0] piece is actually the JSON sent as the payload, and the status code of the response (200) states that the request was successful.
I partially used code from this other response here: https://stackoverflow.com/a/27363569/1576797
I am getting an error with status 302.
But while trying to log the error in the catch handler, I am getting 200.
post(url, data, successCallBack, errCallback) {
return this.http.post(apiDomain + url, JSON.stringify(data), {
headers: this.headers
}).catch(this.handleError).subscribe(
(res) => {
successCallBack(res.json());
},
(err) => {
errCallback(err);
}
);
}
private handleError(error: any) {
let errMsg = (error.message) ? error.message :
error.status;
console.log(error.status); // log is 200
console.log(error)
console.error(errMsg);
return Observable.throw(errMsg);
}
Requirement: I want to send another POST call to the URL that the redirect points to.
How do I get the redirect URL?
Need help.
Late answer I know, but for anyone stumbling across this.
The short answer is you can't, as the browser handles 302s itself and won't tell Angular anything about them. What you can do is set up an interceptor-style class that monitors what is going on.
Google for "angular2 http interceptor" or similar; it's a little beefier than your example above and can monitor every XHR connection. An example is here:
https://www.illucit.com/blog/2016/03/angular2-http-authentication-interceptor/
What this allows is that any connection will come through your interceptor. As we won't be able to monitor 302s directly, we have to think about what the redirect changes. For example, in my case the request suddenly changes the URL to something with my auth in it.
Great, so my first bit of pseudo code would be:
if (response.url.includes('my-auth string')) {
redirect....
}
I can also see from the headers provided that instead of application/json I've suddenly got text/html. Hmm, that's another change I can check for:
if (response.url.includes('my-auth string') && response.headers.get('content-type') === 'text/html') {
redirect....
}
You may have other parameters you can check; these were good enough to detect a redirect for me. Admittedly this is with respect to being redirected to a login page rather than another case, but hopefully you can find enough distinct changes to check for to decide whether you have hit a 302.
I have an error reporting beacon I created using Google Apps Script, and it is published to run as myself and to be accessible to "anyone, even anonymous," which should mean that cross-domain requests to GAS are allowed.
However, my browsers are now indicating there is no Access-Control-Allow-Origin header on the response after the code posts to the beacon.
Am I missing something here? This used to work as recently as two months ago. So long as the GAS was published for public access, then it was setting the Access-Control-Allow-Origin header.
In Google Apps Script:
Code.gs
function doPost(data){
if(data){
//Do Something
}
return ContentService.createTextOutput(JSON.stringify({status: 'okay'})).setMimeType(ContentService.MimeType.JSON);
}
Client Side:
script.js
$.post(beacon_url, data, null, "json");
When making calls to a ContentService script I have always sent a callback for JSONP. Since GAS does not support CORS, this is the only reliable way to ensure your app doesn't break when cross-domain issues arise.
When making the call in jQuery, just add "&callback=?". It will figure everything else out.
var url = "https://script.google.com/macros/s/{YourProjectId}/exec?offset="+offset+"&baseDate="+baseDate+"&callback=?";
$.getJSON( url,function( returnValue ){...});
On the server side
function doGet(e){
var callback = e.parameter.callback;
//do stuff ...
return ContentService.createTextOutput(callback+'('+ JSON.stringify(returnValue)+')').setMimeType(ContentService.MimeType.JAVASCRIPT);
}
I've lost a couple of hours with the same issue. The solution was trivial.
When you deploy the script as a web app, you get two URLs: the /dev one and the /exec one. You should use the /exec one to make cross-domain POST requests. The /dev one is always private: it requires authorization and doesn't set the *Allow-Origin header.
PS: The /exec one seems to be frozen; it doesn't reflect any code changes until you manually deploy it with a new version string (dropdown list in the deploy dialog). To debug the most recent version of the script with the /dev URL, just install an alternative browser and disable its web-security features (--disable-web-security in Google Chrome).
Just to make it simpler for those who, like me, are only interested in a POST request:
function doPost(e){
//do stuff ...
var MyResponse = "It Works!";
return ContentService.createTextOutput(MyResponse).setMimeType(ContentService.MimeType.JAVASCRIPT);
}
I stumbled upon the same issue:
calling /exec URLs from the browser went fine when running a webpage on localhost
it threw a cross-origin error when called from an https domain
I was trying to avoid refactoring my POST JSON client code into JSONP (I was skeptical, since things had always worked before).
Possible Fix #1
Luckily, after I did one non-CORS request (fetch() in the browser from an https domain, using mode: no-cors), the usual CORS requests worked fine again.
last thoughts
A last explanation might be: every new Apps Script deployment needs a bit of time/usage before its configuration actually settles down at the server level.
The following solution works for me.
In Google Apps Script
function doPost(e) {
return ContentService.createTextOutput(JSON.stringify({status: "success", "data": "my-data"})).setMimeType(ContentService.MimeType.JSON);
}
In JavaScript
fetch(URL, {
redirect: "follow",
method: "POST",
body: JSON.stringify(DATA),
headers: {
"Content-Type": "text/plain;charset=utf-8",
},
})
Notice the attribute redirect: "follow", which is very, very important. Without it, this doesn't work for me.
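For completeness, here is how the call above might be consumed end to end; this is just a usage sketch built on the doPost shown earlier, with URL and DATA standing in for your deployment URL and payload:
fetch(URL, {
  redirect: "follow",
  method: "POST",
  body: JSON.stringify(DATA),
  headers: {
    "Content-Type": "text/plain;charset=utf-8",
  },
})
  .then((res) => res.json()) // doPost returns JSON text, so parse it here
  .then((json) => console.log(json.status, json.data))
  .catch((err) => console.error(err));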
I faced a similar issue with a CORS policy error when I tried to integrate the Apps Script application with another Vue application.
Please be careful with the following configurations:
Project version should be New for every deployment.
Execute the app as: Me, in case you want to give access to all.
Who has access to the app: Anyone, even anonymous.
Hope this works for you.
In your calling application, just set the Content-Type to text/plain, and you will be able to parse the JSON returned from GAS as a valid JSON object.
Here is my JSON object in my Google Apps Script doPost function:
var result = {
status: 200,
error: 'None',
rowID: rowID
};
ws.appendRow(rowContents);
return ContentService.createTextOutput(JSON.stringify(result))
.setMimeType(ContentService.MimeType.JSON);
and here I am calling my Apps Script API from Node.js:
const requestOptions = {
method: 'POST',
headers: {'Content-Type': 'text/plain'},
body: JSON.stringify({param1: value, param2:value})
};
const response = await fetch(server_URL, requestOptions);
const data = await response.json();
console.log(data);
console.log(data.status);
My case is different; I'm facing the CORS error in a very weird way.
My code worked normally with no CORS errors, until I added a constant:
const MY_CONST = "...";
It seems that Google Apps Script (GAS) won't allow the 'const' keyword; GAS is based on ES3 or something earlier than ES5. The error on 'const' redirects to an error page URL with no CORS headers.
Reference:
https://stackoverflow.com/a/54413892/5581893
In case this helps anyone like me:
I have a .js file which contains all my utility functions, including ones which call a GAS. I keep forgetting to clear my cache when I go to test updates, so I'll often get this kind of error because the cached code is using the /dev link instead of the /exec one.