I am using Next.js's Static HTML Export for my site, which has 10 million static pages, but I am running into RAM issues when building the app.
Is it even possible to export it in parts, like 100k pages on the first build, then 100k on the second build, and so on?
I do not want to use Incremental Static Regeneration or getServerSideProps, to cut costs.
The site uses MongoDB and has only two pages, a home page and a posts page:
index.js
[postPage].js
On the home page I use this code:
export async function getStaticProps() {
  const { db } = await connectToDatabase();
  const postsFeed = await db
    .collection("myCollection")
    .aggregate([{ $sample: { size: 100 } }])
    .toArray();
  return {
    props: {
      postsFeed: JSON.parse(JSON.stringify(postsFeed)),
    },
  };
}
On the posts page I use this code:
export async function getStaticPaths() {
  const { db } = await connectToDatabase();
  const posts = await db
    .collection("myCollection")
    .find({})
    .toArray();
  const paths = posts.map((data) => {
    return {
      params: {
        postPage: data.slug.toString(),
      },
    };
  });
  return {
    paths,
    fallback: 'blocking',
  };
}
export async function getStaticProps(context) {
  const postSlug = context.params.postPage;
  const { db } = await connectToDatabase();
  const posts = await db
    .collection("myCollection")
    .find({ slug: { $eq: postSlug } })
    .toArray();
  const postsFeed = await db
    .collection("myCollection")
    .aggregate([{ $sample: { size: 100 } }])
    .toArray();
  return {
    props: {
      posts: JSON.parse(JSON.stringify(posts)),
      postsFeed: JSON.parse(JSON.stringify(postsFeed)),
    },
  };
}
There doesn't seem to be a built-in option to process batches of static pages: https://github.com/vercel/next.js/discussions/14929
The only approach I can think of is dividing the work with a bash script: set an env variable, use it in the code that fetches the data to generate the paths, and run the build command as many times as the number of parts you split the data into, moving the generated files to a consolidated output directory on each iteration.
COUNTER=0 # start at 0 so the loop builds parts 1..PARTS
PARTS=100 # change this to control the number of parts
while [ $COUNTER -lt $PARTS ]; do
  let COUNTER=COUNTER+1
  CURRENT=$COUNTER PARTS=$PARTS next build
  # move the generated files to another directory
done
In your getStaticPaths:
export async function getStaticPaths() {
  const currentPercentage = process.env.CURRENT / process.env.PARTS
  // logic to fetch the corresponding percentage of the data:
  // 1% per part when there are 100 parts, 0.5% when 200 parts, etc.
}
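For example, a minimal sketch of that logic, assuming the same connectToDatabase helper as above and a stable sort key (without one, the skip/limit slices could overlap or miss documents between builds):
export async function getStaticPaths() {
  const current = Number(process.env.CURRENT);
  const parts = Number(process.env.PARTS);
  const { db } = await connectToDatabase();
  // Total documents, used to derive the size of each part
  const total = await db.collection("myCollection").countDocuments();
  const chunkSize = Math.ceil(total / parts);
  const posts = await db
    .collection("myCollection")
    .find({}, { projection: { slug: 1 } })
    .sort({ _id: 1 }) // stable order so every part gets a disjoint slice
    .skip((current - 1) * chunkSize)
    .limit(chunkSize)
    .toArray();
  const paths = posts.map((data) => ({
    params: { postPage: data.slug.toString() },
  }));
  return { paths, fallback: 'blocking' };
}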
Be aware that if the data changes often, you'll see incorrect results, such as repeated or skipped pages, since each pagination happens at a different moment while the script runs. You could create an auxiliary Node (or other language) script to better handle that quantity of records, perhaps in a streaming fashion, and generate JSON files for each chunk of data to use in getStaticPaths instead of fetching them directly from the DB.
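A rough sketch of such an auxiliary script (the import path and chunks directory are hypothetical), streaming the collection once with a cursor instead of toArray() and writing one JSON file per part:
// generate-chunks.mjs — run once before the build loop
import fs from 'fs';
import { connectToDatabase } from './lib/mongodb.js'; // hypothetical path

const PARTS = 100;
const { db } = await connectToDatabase();
const cursor = db
  .collection('myCollection')
  .find({}, { projection: { slug: 1 } });

const chunks = Array.from({ length: PARTS }, () => []);
let i = 0;
// Iterating the cursor streams documents instead of loading them all at once
for await (const doc of cursor) {
  chunks[i % PARTS].push(doc.slug);
  i++;
}
fs.mkdirSync('./chunks', { recursive: true });
chunks.forEach((slugs, index) => {
  fs.writeFileSync(`./chunks/part-${index + 1}.json`, JSON.stringify(slugs));
});
process.exit(0);
getStaticPaths can then read ./chunks/part-${process.env.CURRENT}.json, so every build works from the same snapshot of the data.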
Related
I'm working on a Nuxt 3 application where I need to generate a sitemap dynamically from a CMS, using the method described for Nuxt Content. I'm using the built-in /server/api/ directory to get this data for the rest of the app. My sitemap.xml.js file is in /server/routes/.
I'd like to be able to make $fetch requests from my sitemap file to those APIs instead of rewriting all of that code (making the $fetch requests to the CMS again).
sitemap.xml.js
export default defineEventHandler(async (event) => {
  const { auth, posts } = useRuntimeConfig()
  //[...]
  const { records: postList } = await $fetch(`${posts}?api_key=${auth}`)
  postList.forEach((post) => {
    if (post.fields.Status) {
      sitemap.write({
        url: 'posts/' + post.fields.Slug + '/',
        lastmod: post.lastMod,
      })
    }
  })
  //[...]
})
What I would like to do:
export default defineEventHandler(async (event) => {
  //[...]
  const { records: postList } = await $fetch('/api/posts')
  postList.forEach((post) => {
    sitemap.write({
      url: 'posts/' + post.fields.Slug + '/',
      lastmod: post.lastMod,
    })
  })
  //[...]
})
My sitemap code all works fine; I just can't get the data from my other server routes. Does anyone know if this is possible, and how to do it?
I have a problem with my website deployed to Vercel with Next.js ISR. When I navigate to a page that uses getStaticPaths, the page reloads endlessly. Pages that use only getStaticProps don't have this error. I am on the Hobby plan. Everything works when I am on localhost and use "npm run build" and "npm run start".
Here is my code:
export async function getStaticProps({ params: { slug } }) {
  const res = await fetch(`${API_URL}/post/${slug}`);
  const posts = await res.json();
  return {
    props: {
      posts,
    },
    // Next.js will attempt to re-generate the page:
    revalidate: 10, // In seconds
  };
}
export async function getStaticPaths() {
  // this URL was created specifically for the first ISR fetch
  // My first post reloads endlessly
  const res = await fetch(`${API_URL}/postforisrfirst`);
  // const res = await fetch(`${API_URL}/post`);
  const posts = await res.json();
  // Get the paths we want to pre-render based on posts
  const paths = posts.map((post) => ({
    params: { slug: post.slug },
  }));
  return { paths, fallback: "blocking" };
}
In SingleBlogPost.jsx I have:
export async function getStaticPaths() {
  const res = await fetch("http://localhost:1337/api/posts");
  let { data } = await res.json();
  const paths = data.map((data) => ({
    params: { slug: data.attributes.slug },
  }));
  return {
    paths,
    fallback: "blocking",
  };
}
This is where I generate blog pages by their slug.
But then in getStaticProps I need to fetch a single post by slug, though I want to do it by id.
export async function getStaticProps(context) {
  console.log("context", context);
  const { slug } = context.params;
  console.log("slug is:", slug);
  const res = await fetch("http://localhost:1337/api/posts");
  const { data } = await res.json();
  return {
    props: {
      data,
    },
    revalidate: 10, // In seconds
  };
}
I want to keep the URL like /blog/:slug; I don't want to include the id in the URL. Since I already fetch all posts in getStaticPaths, how can I access the post's id in getStaticProps to avoid fetching by slug?
You can filter your API response by your slug to get the same result:
const res = await fetch(`http://localhost:1337/api/posts?filters[slug][$eq]=${slug}`);
This will generate your desired result.
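For example, getStaticProps could then look like this (assuming Strapi v4's REST filter syntax, where the matching posts come back under data):
export async function getStaticProps(context) {
  const { slug } = context.params;
  // Only the post(s) whose slug matches is returned,
  // so no client-side filtering is needed
  const res = await fetch(
    `http://localhost:1337/api/posts?filters[slug][$eq]=${slug}`
  );
  const { data } = await res.json();
  return {
    props: {
      data,
    },
    revalidate: 10, // In seconds
  };
}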
It looks like a workaround using a file system cache was recently released.
The crux of the solution is that they save the body object in memory, using something like this:
this.cache = Object.create(null)
and create methods to update and fetch data from the cache.
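A minimal sketch of that idea (a hypothetical helper, not the exact code from the discussion):
// buildCache.js — naive shared cache for build-time data
export class BuildCache {
  constructor() {
    // Object.create(null) gives a plain map with no prototype keys
    this.cache = Object.create(null);
  }
  set(key, value) {
    this.cache[key] = value;
  }
  get(key) {
    return this.cache[key];
  }
}

// A single module-level instance, imported by both callbacks.
// Note: Next.js may run getStaticPaths and getStaticProps in separate
// workers, which is why the linked discussion falls back to the file system.
export const buildCache = new BuildCache();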
Discussion here: https://github.com/vercel/next.js/discussions/11272#discussioncomment-2257876
Example code:
https://github.com/vercel/examples/blob/main/build-output-api/serverless-functions/.vercel/output/functions/index.func/node_modules/y18n/index.js#L139:10
I found a concise workaround that uses the object-hash package. I basically create a hash of the params object and use it to build the tmp filename on both set and get. The tmp file contains JSON with the data I want to pass between the two infamous static callbacks.
The gist of it:
import fs from 'fs'
import objectHash from 'object-hash'

function setParamsData({ params, data }) {
  const hash = objectHash(params)
  const tmpFile = `/tmp/${hash}.json`
  fs.writeFileSync(tmpFile, JSON.stringify(data))
}

function getParamsData(context) {
  const hash = objectHash(context.params)
  const tmpFile = `/tmp/${hash}.json`
  context.data = JSON.parse(fs.readFileSync(tmpFile))
  return context
}
We can then use these helpers in the getStaticPaths and getStaticProps callbacks to pass data between them.
export function getStaticPaths(context) {
  setParamsData({ ...context, data: { some: 'extra data' } })
  return {
    paths: [],
    fallback: false,
  }
}

export function getStaticProps(context) {
  context = getParamsData(context)
  context.data // => { some: 'extra data' }
  return { props: { data: context.data } }
}
I'm sure someone can think of a nicer API than reassigning an argument variable.
The tmp file creation is likely not OS-independent enough and could use some improvement.
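One portable variation would be to build the path with os.tmpdir() instead of hard-coding /tmp:
import os from 'os';
import path from 'path';
import objectHash from 'object-hash';

function tmpFileFor(params) {
  const hash = objectHash(params);
  // os.tmpdir() resolves the platform's temp directory
  // (e.g. /tmp on Linux, %TEMP% on Windows)
  return path.join(os.tmpdir(), `${hash}.json`);
}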
So, I am working on a simple Next.js app. It's just a lyrics page, but I'm running into some issues. I'm new to Next.js and this is my first project. Here is my problem:
I have a dynamic set of pages/folders which looks like this:
songs (empty folder)
[singer] (child of songs, containing just an index.jsx)
[title] (child of [singer], containing another index.jsx)
Now in my [title]/index.jsx I'm rendering a simple page providing the lyrics for a specific song.
My problem is that I want to count views (every time someone opens the page) for each song separately. I have the following code:
export const getStaticProps = async (context) => {
  const res = await fetch(`${process.env.PROXY}api/lyrics/post/${encodeURIComponent(context.params.title)}`);
  const data = await res.json();
  const send = await fetch(`${process.env.PROXY}api/lyrics/views/${data.post._id}`);
  return {
    props: {
      data: data.post,
      message: 200,
    },
  };
}
export const getStaticPaths = async () => {
  const res = await fetch(`${process.env.PROXY}api/lyrics/getAll`);
  const data = await res.json();
  const paths = data.all.map((name) => ({ params: { singer: name.singer.toString(), title: name.title.toString() } }));
  return {
    paths: paths,
    fallback: 'blocking',
  };
}
The problem is that getStaticProps runs only at build time; however, I want it to run on every request so that I can count the views with my send request.
Can someone please help me figure this out? Any help will be appreciated!
I have an application with React as the client framework and GraphQL for the API.
The plan is to fetch user posts and the post count.
Client side:
const Profile = () => (
  <div>
    <PostCount />
    <Posts />
  </div>
)
const PostCount = () => ...
const Posts = () => ...
We need to display the posts in the Posts component and the post count in the PostCount component.
So my question is: which is better?
Fetch all of the data in the Profile component in one request and pass it to the Posts and PostCount components as props, or fetch the post count in the PostCount component and the posts in the Posts component?
Scenario one involves one request to the server and a bigger chunk of data.
Scenario two involves two requests to the server and smaller chunks of data.
Scenario one server side:
const schema = `
  type Query {
    feeds: Feed!
  }
  type Feed {
    posts: [Post!]!
    count: Int!
  }
  type Post {
    ...
  }
`
async function feed() {
  const posts = await Post.findAll();
  const count = await Post.count();
  return {
    count,
    posts,
  };
}
Scenario two server side:
const schema = `
  type Query {
    posts: [Post!]!
    count: Int!
  }
  type Post {
    ...
  }
`
async function posts() {
  const posts = await Post.findAll();
  return posts;
}
async function count() {
  const count = await Post.count();
  return count;
}
P.S. Also consider bigger data; posts and counts are just an example. Think of user posts and user comments, for instance.
Both ways are correct! It depends on your application which approach is better.
A count query is usually faster than fetching the data (someone might mess it up! :D), so if you fetch it separately, you can show the count while your posts are still loading.
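For example, a rough client-side sketch of scenario two using Apollo's useQuery (an assumption, since the question doesn't name a GraphQL client, and assuming Post has id and title fields):
import { gql, useQuery } from '@apollo/client';

const COUNT_QUERY = gql`query { count }`;
const POSTS_QUERY = gql`query { posts { id title } }`;

// Each component owns its own query, so the (fast) count can render
// while the (slower) posts list is still loading.
const PostCount = () => {
  const { data, loading } = useQuery(COUNT_QUERY);
  if (loading) return <span>…</span>;
  return <span>{data.count} posts</span>;
};

const Posts = () => {
  const { data, loading } = useQuery(POSTS_QUERY);
  if (loading) return <p>Loading…</p>;
  return (
    <ul>
      {data.posts.map((post) => (
        <li key={post.id}>{post.title}</li>
      ))}
    </ul>
  );
};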
BTW, GraphQL handles Promises by itself, so there's no need for those async/awaits!
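For instance, the resolvers from scenario two could just return the promises directly (a sketch assuming the same Post model as in the question):
const resolvers = {
  Query: {
    // GraphQL resolves returned Promises itself, so no async/await is needed
    posts: () => Post.findAll(),
    count: () => Post.count(),
  },
};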