I am working on a website that uses Next.js and React. We use a mix of static generation and SSR. One of the components we're using, a header, is shared by every page on the site. The issue is that this header requires some very large queries to get all the information it needs to render properly. Ideally, we want to "statically render" this component so that it can be used by all pages at build time, even for pages that are not statically rendered. As it stands, rendering this header every time a user visits our website significantly increases load times and puts unnecessary load on our DB.
Is there any way to go about doing something like "statically rendering" a single component that's shared by every page? Part of the code for the header looks something like
const { loading, error, data } = useQuery(QUERY);
const { loading: productLoading, data: productData } = useQuery(QUERY_PRODUCT);
const { data: otherData } = useQuery(OTHER_DATA);
const { data: otherOtherData } = useQuery(OTHER_OTHER_DATA);
Ideally we don't want to run these queries every time a user visits the website.
If you are going to use the same data for every page, you may consider using a Redux store or storing the data in a context provider.
You can use a code sample similar to https://stackoverflow.com/a/64924281/2822041, where you create a context and provider that is accessible in all your pages.
The other method, ideally, would be to "cache" the results using Redis so that you do not need to run the queries over and over again.
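As a sketch of that caching idea (the cachedQuery helper and the TTL value are my own, and a plain Map stands in for Redis here; in production you would swap it for a Redis client):

```javascript
// Hypothetical helper: a small TTL cache standing in for Redis.
const headerCache = new Map();

async function cachedQuery(key, fetcher, ttlMs = 60 * 1000) {
  const hit = headerCache.get(key);
  if (hit && Date.now() - hit.time < ttlMs) {
    return hit.value; // serve the cached header data, skipping the DB
  }
  const value = await fetcher(); // only runs on a cold or expired cache
  headerCache.set(key, { value, time: Date.now() });
  return value;
}
```

The header's queries would then go through something like `cachedQuery('header', runHeaderQueries)`, so the expensive DB work runs at most once per TTL window regardless of how many visitors hit the site.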
Related
I'm using a Headless CMS based in another country that is not near my own, and each request to this CMS takes at least 1.5s to return a payload (which can be a ridiculously long time to wait when you're making multiple requests).
Now, I can use getStaticProps for each page requiring CMS content. But, since I have global components that need information from the CMS too (such as the navigation and footer), this destroys Next's ability to render each page statically, and it forces me to use getInitialProps in _app.js.
Now, before you say:
Just create a request to the CMS on every page using getStaticProps.
That will remove the ability for my website to be a proper single-page application and cause a re-render for elements in the layout that don't need to be re-mounted while navigating. This ability to navigate without re-mounting is vital since the navigation has functionality that relies on it.
Here is an example of what my _App.getInitialProps looks like:
MyApp.getInitialProps = async (context) => {
  const appProps = await App.getInitialProps(context);
  try {
    // Retrieve data from CMS.
    // Yes, I do have to create two requests to the CMS to retrieve a full payload,
    // as that is how their API is set up, unfortunately.
    const dataOne = await fetch(...);
    const dataTwo = await fetch(...);
    // Yes, I'm aware this isn't actually how it would work.
    return { ...dataOne, ...dataTwo, ...appProps };
  } catch (error) {
    console.log("Error in _App component:", error);
  }
  return appProps;
};
The ideal workaround for this would be to use getStaticProps in each component that requires it - but this isn't a feature that Next offers, unfortunately.
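One partial mitigation (a sketch; getCmsPayload and fetchCmsPayload are hypothetical names, not a Next.js API) is to memoize the CMS payload at module scope inside _app.js, so each server process pays the 1.5s CMS round-trip only once instead of on every request:

```javascript
// Module-level promise cache: all getInitialProps calls in this server
// process share one CMS round-trip. fetchCmsPayload is a stand-in for
// the two fetch(...) calls to the CMS.
let cmsPayloadPromise = null;

function getCmsPayload(fetchCmsPayload) {
  if (!cmsPayloadPromise) {
    cmsPayloadPromise = fetchCmsPayload().catch((error) => {
      cmsPayloadPromise = null; // don't cache failures; retry on next request
      throw error;
    });
  }
  return cmsPayloadPromise;
}
```

This doesn't restore static rendering (getInitialProps in _app.js still opts the app out of it), but it does stop the CMS latency from hitting every request.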
So this is a fairly new topic: React Server Components were recently released. Compared to SSR/Next.js, how do they affect SEO?
Since the component is rendered on the server dynamically when it is requested, it is not really as static as SSR with Next.js. Will search engines fail to index those components if I use them?
A demo can be found here.
We can see that in api.server.js,
async function renderReactTree(res, props) {
  await waitForWebpack();
  const manifest = readFileSync(
    path.resolve(__dirname, '../build/react-client-manifest.json'),
    'utf8'
  );
  const moduleMap = JSON.parse(manifest);
  pipeToNodeWritable(React.createElement(ReactApp, props), res, moduleMap);
}

function sendResponse(req, res, redirectToId) {
  const location = JSON.parse(req.query.location);
  if (redirectToId) {
    location.selectedId = redirectToId;
  }
  res.set('X-Location', JSON.stringify(location));
  renderReactTree(res, {
    selectedId: location.selectedId,
    isEditing: location.isEditing,
    searchText: location.searchText,
  });
}
I understand this can help reduce the workload on the client's device, since the components are rendered on the server and sent to the client, and a component can be rendered with a secret stored on the server, as we can just pass it in as a prop rather than sending the secret to the client.
But if SEO matters, is SSR preferred over React Server Component?
Server Components are complementary to rendering into HTML, not an alternative. The plan is to have both.
Server Components were not released. What was released is an early tech preview in the spirit of sharing our research. This preview doesn’t include an HTML renderer. The api.server.js file from the demo you mentioned contains a comment about this:
const html = readFileSync(
  path.resolve(__dirname, '../build/index.html'),
  'utf8'
);
// Note: this is sending an empty HTML shell, like a client-side-only app.
// However, the intended solution (which isn't built out yet) is to read
// from the Server endpoint and turn its response into an HTML stream.
res.send(html);
By the time Server Components are officially released, there will be a streaming HTML renderer for the first render.
It’s not built yet.
It should be the same, from an SEO point of view, as an SPA.
What happens with a classic React SPA is: it loads the React components (which are essentially JS functions) as part of the JS bundle, and then it starts to request data from the backend in JSON format. After the JSON is fetched, it is rendered via the React component functions and inserted into the DOM. Modern crawlers use the V8 engine (or maybe something else if it's Bing :D); they wait until the page is fully loaded, all JSON data has been fetched, and all components are actually rendered, and only then crawl the resulting DOM.
GoogleBot has been crawling SPAs that way for at least 3 years now, probably more, so if you were thinking that SSR is essential for SEO: no, it is not. There have been plenty of investigations into this; a random example: https://medium.com/#l.mugnaini/spa-and-seo-is-googlebot-able-to-render-a-single-page-application-1f74e706ab11
So essentially, for the crawler it doesn't really matter how exactly a React component is rendered. In the case of React Server Components, the component function resides on the server and is never transferred to the client as part of the JS bundle. So instead of requesting JSON data, the application requests the rendered component in some intermediate format (not HTML, by the way). The result is transferred to the client and rendered into the DOM. So the end result is still the same: some DOM elements that the bot can crawl.
I am working on a Nuxt project, and I have a folder structure like this:
_category/
  _subCategory/
    _pageAlias/
      _post.vue
    _pageAlias.vue
    anotherPage.vue
    samplePage.vue
  _subCategory.vue
_pageAlias.vue, anotherPage.vue, and samplePage.vue are at the same level and working fine. The problem comes when I have to add another page which is similar to samplePage but has a different URL. All the above pages have different page structures and functionality. But now I want to merge the functionality, and I'm looking for a way to define a custom route so that, based on the URL parameter, I can make an API call to find out the type of page and load the respective component.
E.g. a middleware where I can do something like
if (route.name === 'category-subcategory-pagealias') {
  const { data: type } = await axios.get('api' + route.params._pageAlias)
  if (type === 'sample') {
    next({ name: 'category-subcategory-samplepage' })
  }
}
In short: I want to load a page component like a dynamic component, based on route parameters, so that I don't lose any of the page methods like asyncData.
I have tried using extendRoutes in the Nuxt config, but that doesn't work for this case, because it would require a dynamic change to the router config.
I cannot use dynamic components inside a page, because then I lose all the page methods.
Do you know if it's possible to manually re-execute Gatsby page queries (normal queries)?
Note: this should happen in dev mode while gatsby develop runs.
Background info: I'm trying to set up a draft environment with Gatsby and a Headless CMS (Craft CMS in my case). I want gatsby develop to run on, say, heroku. The CMS requests a Gatsby page, passing a specific draft-token as an URL param, and then the page queries should be re-executed, using the token to re-fetch the draft content from the CMS rather than the published content.
I'm hooking into the token-request via a middleware defined in gatsby-config.js. This is all based on https://gist.github.com/monachilada/af7e92a86e0d27ba47a8597ac4e4b105
I tried
createSchemaCustomization({ refresh: true }).then(() => {
  sourceNodes()
})
but this completely re-creates all pages. I really only want the page queries to be extracted/executed.
You are probably looking for this. Basically, you need to set an environment variable (ENABLE_GATSBY_REFRESH_ENDPOINT) which exposes a /__refresh webhook that accepts POST requests to refresh the sourced content. This webhook can be triggered whenever remote data changes, which means you can update your data without relaunching the development server.
You can also trigger it manually using: curl -X POST http://localhost:8000/__refresh
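Tying that to your draft-token setup, a minimal sketch of triggering the refresh from a develop middleware (the /preview path and the use of node-fetch are assumptions, following the pattern of the gist you linked):

```javascript
// gatsby-config.js (sketch): re-source content whenever the CMS requests
// a preview URL, by POSTing to the exposed refresh webhook.
// Requires ENABLE_GATSBY_REFRESH_ENDPOINT=true and only works in develop.
const fetch = require("node-fetch");

module.exports = {
  developMiddleware: (app) => {
    app.use("/preview", async (req, res, next) => {
      await fetch("http://localhost:8000/__refresh", { method: "POST" });
      next(); // continue to the normal page response with fresh data
    });
  },
  // ...plugins etc.
};
```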
If you need a detailed explanation of how to set .env variables in Gatsby, just tell me and I will provide one. In short, you need to create a .env file with your variables (ENABLE_GATSBY_REFRESH_ENDPOINT=true) and place this snippet in your gatsby-config.js:
require("dotenv").config({
path: `.env.${activeEnv}`,
})
Of course, it will only work under the development environment but in this case, it fits your requirements.
A rebuild of all pages is needed, for example, when you have index pages.
It looks like you need some logic to conditionally call createPage (with all data re-fetched), or even to conditionally fetch data for selected pages only.
If the number of pages is relatively small, I would fetch all the data to get the page update times, then in a loop conditionally call createPage (if the update time is within the last few minutes; no need to pass a parameter).
If develop doesn't call createPage on /__refresh, dive deeper into the Gatsby code and find the logic and a way to modify the Redux touched nodes.
...or search for other optimization techniques you can use for this scenario (caching queried data into JSON files?).
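The conditional-createPage idea might be sketched like this (pagesToRecreate is a hypothetical helper, and the page metadata shape is an assumption):

```javascript
// Hypothetical helper: given page metadata from the CMS, pick only the
// pages whose content changed within the given time window, so that
// gatsby-node can call createPage just for those instead of recreating all.
function pagesToRecreate(pages, now, windowMs) {
  return pages.filter(
    (page) => now - new Date(page.updatedAt).getTime() <= windowMs
  );
}

// In gatsby-node.js (sketch) one would then loop over the result and call
// createPage({ path, component, context }) for each entry.
```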
I am using React with react-router and Reflux as my datastore, but I am unsure on how to best deal with persistence to allow page refresh.
My components connect to the store with Reflux.connect, but since the store fetches its data from a backend, the data is not yet available when the components first initialize and render.
When the user enters my app from the start, all this data is loaded in order and is available when it needs to be; but if you trigger a page refresh further down a route, React tries to render components that rely on data that is not there yet.
I solved this by constantly keeping a copy of data in LocalStorage and serving that from the Reflux store getInitialState(), so that all components get the data before they render.
I wonder if this is the proper way to do it. When for some reason the local storage data gets cleared or corrupted, the interface goes blank, because the components cannot access the correct data. Substructures and properties don't exist and trigger javascript errors. It seems like a messy and unreliable solution.
I am curious to know what patterns are used to solve this.
------ edit -----
To answer to the comment of WiredPrairie:
1) Why are you initializing components with data in getInitialState?
When my components use Reflux.connect, they don't have the data in their state yet on the first render as the store still needs to fetch its data. My views currently don't work gracefully with undefined data. By returning the locally stored cache from the Reflux store in getInitialState(), all connected components will get that data before their first render call.
2) What's causing a page refresh and why can't the data be loaded in the same manner as it was the first time?
It's mainly a workaround I had to build because livereload refreshes the page when I make edits (I will look into using react-hotloader later, but it is not an option yet), but users can also just hit refresh when they are somewhere in my nested views, which has the same effect. When they manually refresh, they are not entering the app at the start.
3) When components are wired to the change events of a store, why don't they update then?
They do update, but like I said they don't deal with empty data right now and on first render they will miss it waiting for the store to fetch things. I can make all my views work gracefully with empty data, but that would add a lot of boilerplate code.
From the replies so far, I get the feeling that what I'm doing with localStorage is the common way to do it. Cache stuff locally in localStorage or sessionStorage or something similar, and serve that data immediately after a refresh.
I should make my views a bit more robust by gracefully handling empty data on the odd occasion that localStorage doesn't work properly.
I've been caching each Store in sessionStorage when its emitChange() fires, and initializing the store from sessionStorage if cached data exists, or null values otherwise. This seems to work provided that the views can handle null values, which is probably a good idea anyway (it sounds like this is your main problem).
I'd suggest making your views handle the case of no data gracefully, initialize everything with null values if the cache isn't available, and then call the backend to update the null values with an Action whenever the data returns.
I haven't tried Reflux, but in regular Flux it would look like this (maybe you can apply the same principle):
var _data;
if (sessionStorage.PostStore) {
  _data = JSON.parse(sessionStorage.PostStore);
} else {
  _data = {
    posts: null
  };
  BackendAPI.getPosts(function(err, posts) {
    if (posts) {
      PostActions.setPosts(posts);
    }
  });
}
...
AppDispatcher.register(function(payload) {
  var action = payload.action;
  switch (action.actionType) {
    ...
    case Constants.SET_POSTS:
      _data.posts = action.data.posts;
      break;
    default:
      return true;
  }
  // Update cache because state changed
  sessionStorage.PostStore = JSON.stringify(_data);
  PostStore.emitChange();
  return true;
});