How to get response header from useQuery of Apollo Client - javascript

I haven't been able to find a way to do this at all. Does anyone know if this is supported? Thanks.

ApolloClient's methods for making requests, and the React Hooks that use them, serve as an abstraction over how the data is actually fetched. It could come from a remote server over HTTP, from the cache, from directly executing the request against a schema, etc. As a result, they don't expose any information regarding how the data was fetched in the first place, including transport-specific information like HTTP headers.
If you need to access this information, the appropriate place to do so would be inside a Link that you'd prepend to your HttpLink -- either an existing one like a ContextLink or ErrorLink, or some custom Link you roll yourself. If you're doing this in an error-handling context, then ErrorLink would be your best bet, as suggested in the comments.
HttpLink injects the raw response from the server into the context object used by all Links (see here). Assuming you're using the default fetch API as the fetcher, this response will be a Response object.
So you can do something like this:
import { onError } from "@apollo/client/link/error";

const link = onError(({ graphQLErrors, networkError, operation }) => {
  const { response } = operation.getContext();
  const { headers, status } = response;
  // do something with the headers
});
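If you also need the headers of successful responses (not just errors), the same context trick works from a small custom "afterware" link placed before the HttpLink. A rough sketch, assuming Apollo Client 3; the x-request-id header is just a placeholder:
import { ApolloClient, ApolloLink, HttpLink, InMemoryCache } from "@apollo/client";

// Afterware: runs after HttpLink has put the raw fetch Response on the context
const headerLink = new ApolloLink((operation, forward) =>
  forward(operation).map((result) => {
    const { response } = operation.getContext();
    if (response) {
      // e.g. read a tracing or rate-limit header
      console.log(response.headers.get("x-request-id"));
    }
    return result;
  })
);

const client = new ApolloClient({
  cache: new InMemoryCache(),
  link: ApolloLink.from([headerLink, new HttpLink({ uri: "/graphql" })]),
});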

Related

Route that is executed within another route in Node.js is not being executed

Good Evening,
I have a function that contains a route that is a call to the Auth0 API and contains the updated data that was sent from the client. The function runs, but the app.patch() does not seem to run and I am not sure what I am missing.
function updateUser(val) {
  app.patch(`https://${process.env.AUTH0_BASE_URL}/api/v2/users/${val.id}`, (res) => {
    console.log(val);
    res.header('Authorization: Bearer <insert token>')
    res.json(val);
  })
}

app.post('/updateuser', (req, res) => {
  const val = req.body;
  updateUser(val);
})
app.patch() does NOT send an outgoing request to another server. Instead, it registers a listener for incoming PATCH requests. It does not appear from your comments that that is what you want to do.
To send a PATCH request to another server, you need to use a library that is designed for sending HTTP requests. There's a low-level http.request() built into the Node.js http module that you could use to construct a PATCH request, but it's generally a lot easier to use a higher-level library such as any of those listed here.
My favorite in that list is the got() library, but many in that list are popular and used widely.
Using the got() library, you would send a PATCH request like this:
const got = require('got');

const options = {
  headers: {Authorization: `Bearer ${someToken}`},
  body: someData   // use json: someData instead if you're sending a plain object as JSON
};
const url = `https://${process.env.AUTH0_BASE_URL}/api/v2/users/${val.id}`;

got.patch(url, options).then(result => {
  console.log(result);
}).catch(err => {
  console.log(err);
});
Note: A PATCH request needs body data, the same way a POST needs body data.
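Putting that together with the /updateuser route from the question, the flow might look roughly like this (someToken is a placeholder for however you obtain the Auth0 Management API token):
const got = require('got');

app.post('/updateuser', async (req, res) => {
  const val = req.body;
  const url = `https://${process.env.AUTH0_BASE_URL}/api/v2/users/${val.id}`;
  try {
    // outgoing PATCH to the Auth0 Management API
    const result = await got.patch(url, {
      headers: { Authorization: `Bearer ${someToken}` },  // placeholder token
      json: val,                                          // send the update as JSON
      responseType: 'json'
    });
    res.json(result.body);
  } catch (err) {
    console.log(err);
    res.sendStatus(500);
  }
});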

Trouble with Cloudflares Worker Cache API

I’ve now spent countless hours trying to get the Cache API to cache a simple request. I had it working once in between but forgot to add something to the cache key, and now it's not working anymore. Needless to say, cache.put() not having a return value that specifies whether the request was actually cached does not exactly help, and I am left with trial and error. Can someone maybe give me a hint on what I'm doing wrong and what is actually required? I've read all the documentation more than 3 times now and I'm at a loss…
Noteworthy maybe is that this REST endpoint sets pragma: no-cache and everything else cache-related to no-cache, but I want to forcibly cache the response anyway, which is why I tried to completely re-write the headers before caching. It still isn't working (not matching or not storing, no one knows…).
async function apiTest(token, url) {
  let apiCache = await caches.open("apiResponses");
  let request = new Request(
    new URL("https://api.mysite.com/api/" + url),
    {
      headers: {
        "Authorization": "Bearer " + token,
      }
    }
  )
  // Check if the response is already in the cloudflare cache
  let response = await apiCache.match(request);
  if (response) {
    console.log("Serving from cache");
  }
  if (!response) {
    // if not, ask the origin if the permission is granted
    response = await fetch(request);
    // cache response in cloudflare cache
    response = new Response(response.body, {
      status: response.status,
      statusText: response.statusText,
      headers: {
        "Cache-Control": "max-age=900",
        "Content-Type": response.headers.get("Content-Type"),
      }
    });
    await apiCache.put(request, response.clone());
  }
  return response;
}
Thanks in advance for any help. I asked the same question on the Cloudflare community first and haven't received an answer in two weeks.
This might be related to your use of caches.default, instead of opening a private cache with caches.open("whatever"). When you use caches.default, you are sharing the same cache that fetch() itself uses. So when your worker runs, your worker checks the cache, then fetch() checks the cache, then fetch() later writes the cache, and then your worker also writes the same cache entry. Since the write operations in particular happen asynchronously (as the response streams through), it's quite possible that they are overlapping and the cache is getting confused and tossing them all out.
To avoid this, you should open a private cache namespace. So, replace this line:
let cache = caches.default;
with:
let cache = await caches.open("whatever");
(This await always completes immediately; it's only needed because the Cache API standard insists that this method is asynchronous.)
This way, you are reading and writing a completely separate cache entry from the one that fetch() itself reads/writes.
The use case for caches.default is when you intentionally want to operate on exactly the cache entry that fetch() would also use, but I don't think you need to do that here.
EDIT: Based on conversation below, I now suspect that the presence of the Authorization header was causing the cache to refuse to store the response. But, using a custom cache namespace (as described above) means that you can safely cache the value using a Request that doesn't have that header, because you know the cached response can only be accessed by the Worker via the cache API. It sounds like this approach worked in your case.
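For reference, a rough sketch of that approach (the URL and header come from the question; everything else is illustrative and assumes the response is safe to share inside this Worker's private cache):
async function apiTestCached(token, url) {
  const apiCache = await caches.open("apiResponses");

  // Cache key without the Authorization header, so the Cache API will accept it
  const cacheKey = new Request("https://api.mysite.com/api/" + url);

  let response = await apiCache.match(cacheKey);
  if (!response) {
    // Only the origin request carries the Authorization header
    const originResponse = await fetch(cacheKey.url, {
      headers: { "Authorization": "Bearer " + token }
    });

    // Re-wrap the response so its headers are mutable, then force caching
    response = new Response(originResponse.body, originResponse);
    response.headers.set("Cache-Control", "max-age=900");
    response.headers.delete("Pragma");

    await apiCache.put(cacheKey, response.clone());
  }
  return response;
}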

Sapper.js - Preload with cookies/headers?

I'm using Sapper.js to power my application but only using the static content created by running sapper export. So there is no server rendering the pages.
I'm using AWS CloudFront with Lambda#Edge to perform authentication on the user's HttpOnly cookies whenever they request a page. If the user is authenticated, Lambda will then fetch user data such as the user's profile picture, username, etc and set these values in custom headers/cookies (non HttpOnly) on the pages returned by CloudFront.
These values can be set in either headers or cookies, there are no requirements for either.
But I need to have this dynamic content available to the client before the page is rendered in order to avoid an ugly flash of empty content. So it should be retrieved inside of sapper's preload function instead of onMount in order to stall any other html from being rendered until the data is returned.
I know how to fetch inside of the preload function like so:
<script context="module">
export async function preload(page, session) {
const res = await this.fetch("SOME_ENDPOINT");
const data = await res.json();
return {data};
}
</script>
but I'm not sure on how to get access to headers or cookies from within this function.
EDIT: NEW APPROACH?
So I've been thinking and it seems like the best way to go at this point is to try and transform Sapper's sapper.middleware function so that it accepts a custom req object and returns the res object instead of trying to serve it to the client.
Then we can run npm run build and use the entire build directory inside of Lambda. We're free to pass any user data into the middleware session object afterwards, as explained in the docs:
sapper.middleware({session: (CUSTOM_REQ, CUSTOM_RES) => ({user: CUSTOM_REQ.user})})
No need to fetch any data as it should now be available in the store.
Any thoughts?
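Roughly what I have in mind (just a sketch based on the stock Sapper server entry; the x-user header is a made-up example of whatever the Lambda would attach after verifying the HttpOnly cookies, and adapting this to take Lambda's event instead of listening on a port is the part I'm unsure about):
import polka from 'polka';
import * as sapper from '@sapper/server';

polka()
  .use(
    sapper.middleware({
      // build the session from whatever the Lambda attached to the request
      session: (req, res) => ({
        user: req.headers['x-user'] ? JSON.parse(req.headers['x-user']) : null
      })
    })
  )
  .listen(3000);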
You can pass { credentials: 'include' } as the second option to this.fetch (same as regular fetch):
export async function preload(page, session) {
  const res = await this.fetch("SOME_ENDPOINT", {
    credentials: 'include'
  });
  const data = await res.json();
  return {data};
}
This will cause cookies to be sent with the request. By definition though, this won't work with exported apps — the response must be constructed per-user.

Axios POST params show empty on server - using MERN stack

I want to update a document in Mongo, but when I send an Axios POST request to the server with params for the updates I receive nothing but a blank object on the server side - I'm using Node.js with an Express server (MERN stack).
I have tried the qs library module and Node's querystring module. I tried including headers with
'Content-Type': 'application/x-www-form-urlencoded' and 'application/json'.
My Axios POST request:
const A = 1;
const B = 2;
const data = { A, B };
console.log(qs.stringify(data)); // A=1&B=2
axios.post(url('upVote'), qs.stringify(data));
The server route:
app.post('/upVote', async (req, res) => {
  console.log(req.params); // {}
  await DB.updateVote(ID, collection, voteCount);
  res.end();
});
The headers as shown by Chrome's DevTools.
... Also, all my axios.get() requests work fine and grab data from Mongo and send it back to my app properly, and the url/endpoints match.
There are a couple of ways to send data to the server with axios.
I can see the confusion with the axios documentation; I had not seen this usage before, and it does seem to be broken when looking at the request logs and object.
1) axios.post receives the body of the request as the second parameter. So if you want to pass parameters to axios, you should do something like this:
const B = 2;
const data = { A: 1, B };
axios.post(url('upVote'), {}, { params: data });
Note that axios will handle the stringification on its own and that the third parameter is a config object.
On the server the params will be available at request.query
2) If you want to stringify the parameters yourself, then you should append them to your URL like so:
axios.post(`${url('upVote')}?${qs.stringify(data)}`);
Same here: the data on the server will be under request.query.
3) It's generally better to use the body of the POST request to transfer large data payloads, for convenience. You should also consider your caching strategy: if it relies on the request URL without considering the request body, that may be a concern.
axios.post(url('upVote'), data);
In this case data on the server will be under request.body
UPD: I originally forgot to mention that you will need body-parser middleware to access request.body (see the server-side sketch after this list).
4) You can use axios without the method shorthands, which may be useful for some people:
axios({
  method: 'POST',
  url: url('upVote'),
  params: data
})
This is identical to the example in 1.
And all of them return a Promise which you can .then().catch() or await.
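For completeness, a minimal server-side counterpart for option 3 might look like this (a sketch assuming Express 4.16+, where express.json() replaces the separate body-parser package):
const express = require('express');
const app = express();

// parse JSON request bodies so req.body is populated
app.use(express.json());

app.post('/upVote', async (req, res) => {
  console.log(req.body); // { A: 1, B: 2 }
  // ...update the document in Mongo here...
  res.sendStatus(200);
});

app.listen(3000);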
I think you want .body instead of .params. Since you are sending the data in the body of a POST request with axios, printing req.params will print nothing for this URL/API.
Try
console.log(req.body) // instead of req.params
If this does not work, please show us your React code.
Moreover, in React you have to add .then() after axios, otherwise it will warn about an unhandled promise.
To get params on the server side, you have to make some changes:
In axios (React):
axios.post(url('upVote/param'), qs.stringify(data));
On the server:
app.post('/upVote/:params', async (req, res) => {
  console.log(req.params)
  .....
})
I think the issue is that you are calling res.end(); it should be res.send(...).
This answer should help: https://stackoverflow.com/a/29555444/1971378

webRequest API: How to get the requestId of a new request?

The chrome.webRequest API has the concept of a request ID (source: Chrome webRequest documention):
Request IDs
Each request is identified by a request ID. This ID is unique within a browser session and the context of an extension. It remains constant during the life cycle of a request and can be used to match events for the same request. Note that several HTTP requests are mapped to one web request in case of HTTP redirection or HTTP authentication.
You can use it to correlate the requests even across redirects. But how do you initially get hold of the ID when starting a new request with fetch or XMLHttpRequest?
So far, I have not found anything better than to use the URL of the request as a way to make the initial link between the new request and the requestId. However, if there are overlapping requests to the same resource, this is not reliable.
Questions:
If you make a new request (either with fetch or XMLHttpRequest), how do you reliably get access to the requestId?
Does the fetch API or XMLHttpRequest API allow access to the requestId?
What I want to do is to use the functionality provided by the webRequest API to modify a single request, but I want to make sure that I do not accidentally modify other pending requests.
To the best of my knowledge, there is no direct support in the fetch or XMLHttpRequest API. Also, I'm not aware of a completely reliable way to get hold of the requestId.
What I ended up doing was installing a onBeforeRequest listener, storing the requestId, and then immediately removing the listener again. For instance, it could look like this:
function makeSomeRequest(url) {
  let listener;
  const removeListener = () => {
    if (listener) {
      chrome.webRequest.onBeforeRequest.removeListener(listener);
      listener = null;
    }
  };

  let requestId;
  listener = (details) => {
    if (!requestId && urlMatches(details.url, url)) {
      requestId = details.requestId;
      removeListener();
    }
  };
  chrome.webRequest.onBeforeRequest.addListener(listener, { urls: ['<all_urls>'] });

  // install other listeners, which can then use the stored "requestId"
  // ...

  // finally, start the actual request, for instance
  const promise = fetch(url).then(doSomething);

  // and make sure to always clean up the listener
  promise.then(removeListener, removeListener);
}
It is not perfect, and matching the URL is a detail that I left open. You could simply compare whether the details.url is identical to url:
function urlMatches(url1, url2) {
  return url1 === url2;
}
Note that it is not guaranteed that you will see the identical URL. For instance, if you make a request against http://some.domain.test, you will see http://some.domain.test/ in your listener (see my other question about the details). Or http:// could have been replaced by https:// (here I'm not sure, but it could be because of other extensions like HTTPS Everywhere).
That is why the code above should only be seen as a sketch of the idea. It seems to work well enough in practice, as long as you do not start multiple requests to the identical URL. Still, I would be interested in learning about a better way to approach the problem.
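If exact string comparison turns out to be too strict, one slightly more tolerant variant (just a sketch) is to normalize both URLs with the URL constructor, which at least handles the trailing-slash case described above (though not a scheme upgrade to https://):
function urlMatches(url1, url2) {
  try {
    // new URL(...) normalizes, e.g. "http://some.domain.test" becomes "http://some.domain.test/"
    return new URL(url1).href === new URL(url2).href;
  } catch (e) {
    // fall back to exact comparison if either string is not a valid absolute URL
    return url1 === url2;
  }
}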
