I'm trying to deploy my first Next.js project on Vercel. Locally everything works.
My problem: when deploying the web app (command: next build), I get the message "Generating static pages (0/2000)" and then nothing happens. The deployment is cancelled after 1 hour (time limit exceeded).
The problem is somewhere in the (simplified) code below. Here's why: when I deploy the project without the part that follows const content, the deployment succeeds - so basically, instead of having both singleProductResponse AND contentResponse as props, I only have singleProductResponse.
I'm a little stuck and don't know how to solve this. Can someone tell me what I'm doing wrong? Thanks a lot!
const Item = ({ singleProductResponse, contentResponse }) => {
const router = useRouter();
if (router.isFallback) {
return <div>Loading...</div>;
}
return (
<div className={styles.section}>
<Overview
singleProductResponse={singleProductResponse}
contentResponse={contentResponse}
/>
</div>
);
};
export async function getStaticPaths() {
const itemResponse = await knex("Items");
const paths = itemResponse.map((product) => ({
params: {
brand: product.brand,
item: product.item,
},
}));
return {
paths,
fallback: true,
};
}
export async function getStaticProps({ params }) {
try {
const itemData = await knex("Items").where(
"item",
"like",
`${params.item}`
);
const singleProductResponse = itemData[0];
//!!!!!!!!!when leaving the following part out: deployment is successful!!!!!!!!!!!!
const content = itemData[0].contentList;
// split(", ") always returns an array, so no extra string check is needed
const contentArray = content.split(", ");
const contentResponse = await knex("Cos")
.leftJoin("Actives", "Cos.content", "ilike", "Actives.content")
.leftJoin("Alcohol", "Cos.content", "ilike", "Alcohol.content")
.column([
"Cos.content",
"function",
"translationPlant",
"categoryAlcohol",
])
.where((qb) => {
contentArray.forEach((word) => {
qb.orWhere("Cos.content", word);
});
return qb;
});
return {
props: { singleProductResponse, contentResponse },
revalidate: 1,
};
} catch (err) {
console.log(err);
return {
redirect: {
destination: "/",
permanent: false,
},
};
}
}
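Since getStaticProps runs once per generated page, the Cos/Actives/Alcohol join above executes roughly 2000 times per build. A rough sketch of one way to reduce that: load the joined rows once per build process and filter them in memory (this assumes the joined table fits in memory, and note that Next.js may spawn several build workers, each with its own cache):
// Hypothetical module-level cache: fetch the joined ingredient rows once
// per build process instead of once per generated page.
let cosRowsCache;

async function getCosRows() {
  if (!cosRowsCache) {
    cosRowsCache = await knex("Cos")
      .leftJoin("Actives", "Cos.content", "ilike", "Actives.content")
      .leftJoin("Alcohol", "Cos.content", "ilike", "Alcohol.content")
      .column(["Cos.content", "function", "translationPlant", "categoryAlcohol"]);
  }
  return cosRowsCache;
}

// Inside getStaticProps, replace the per-page query with an in-memory filter:
// const rows = await getCosRows();
// const contentResponse = rows.filter((row) => contentArray.includes(row.content));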
UPDATE
After digging deeper: I think the problem is that the part after const content is too slow during the build process.
When running next build locally, as well as when deploying on Vercel, the build stops after approx. 1 hour. I suppose this is because of the 45-minute limit on the build process.
The last message I get is "Generating static pages (1000/2000)" (on Vercel and locally) and "Build failed" (on Vercel). I don't get any other error messages (not even in the catch block).
I've already tried to optimize the part after const content: each table has an index (clustered indexes -> primary keys), I've redesigned the tables (only 4 tables to join instead of 6), eliminated everything unnecessary from the query, and checked that the database (Postgres hosted on Heroku - hobby-basic) is also in the US. The performance is better, but still not enough. Does anyone have suggestions for improvement? TTFB might also be somewhat slow.
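One common way to stay under the build-time limit is to pre-render only a subset of the 2000 items in getStaticPaths and let fallback: true generate the rest on first request (with revalidate they are then kept fresh afterwards). A minimal sketch, reusing the same knex("Items") table; the limit of 100 is just an illustrative cutoff:
export async function getStaticPaths() {
  // Pre-render only the most important items at build time;
  // everything else is generated on demand thanks to fallback: true.
  const itemResponse = await knex("Items").limit(100);
  const paths = itemResponse.map((product) => ({
    params: { brand: product.brand, item: product.item },
  }));
  return { paths, fallback: true };
}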
I have an SPA; everything works well, but when the user reloads the page while on Winners or Garage, they get:
Cannot GET /Garage. They then have to go back to the default URL. How do I make a reload work on the current page?
https://darogawlik-async-race-api.netlify.app/ (my app)
const navigateTo = url => {
history.pushState(null, null, url)
router()
}
const router = async () => {
const routes = [
{ path: '/Garage', view: garage },
{ path: '/Winners', view: winners },
]
// Test each route for potential match
const potentialMatches = routes.map(route => ({
route,
isMatch: location.pathname === route.path,
}))
let match = potentialMatches.find((potentialMatch) => potentialMatch.isMatch)
if (!match) {
match = {
route: routes[0],
isMatch: true,
}
}
const view = new match.route.view(document.querySelector('#main'))
}
window.addEventListener('popstate', router)
document.addEventListener('DOMContentLoaded', () => {
document.body.addEventListener('click', e => {
if (e.target.matches('[data-link]')) {
e.preventDefault()
navigateTo(e.target.href)
}
})
router()
})
window.addEventListener('load', router)
This is a problem with default document handling in the web host - it is not a page load problem. E.g. just click this link to reproduce it:
https://darogawlik-async-race-api.netlify.app/Garage
Since you are using path based routing, your web host must serve the default document for all paths, including /Garage and /Winners. As an example, in Node.js Express you write code like this. For other web hosts you either write similar code or there is a configuration option that will do it for you.
// Serve static content for physical files, e.g. .js and .css files
expressApp.use('/', express.static('public')); // 'public' is an illustrative root folder
// Serve index.html for all other paths
expressApp.get('*', (request, response) => {
  response.sendFile('index.html', { root: __dirname }); // sendFile needs an absolute path or a root option
});
According to this post on Netlify, you can add a netlify.toml file with something like this. I'm no expert on this platform, but hopefully this gives you the info you need to resolve your issue:
[[redirects]]
from = "/*"
to = "/index.html"
status = 200
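Netlify also supports an equivalent plain-text _redirects file in the publish directory, if you prefer that over netlify.toml:
/*    /index.html    200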
Problem:
Revalidate doesn't work when I'm using serverSideTranslations from next-i18next. Please see my code below.
When a new user registers, a static page is generated for each language: da (default), en, sv, de.
Everything works fine with the code below for pages already generated at build time (when deployed on Vercel).
When a new user registers, the page won't get revalidated and throws a 500 Internal Server Error / "The page was not generated".
Please see the comment in the code!
export async function getStaticPaths({ locales }) {
const { data: profiles, error } = await supabase.from("profiles").select("*");
console.log(locales);
const paths = profiles
.map((profile) =>
locales.map((locale) => ({
params: { profileName: profile.username },
locale, // Pass locale here
}))
)
.flat();
console.log(paths);
return { paths, fallback: "blocking" };
}
export async function getStaticProps({ params, locale }) {
const { data: profiles, error } = await supabase
.from("profiles")
.select("*")
.eq("username", params.profileName)
.single();
return {
props: {
profiles,
...(await serverSideTranslations(locale, ["Profil"])), //When this line is removed everything works fine --- When its added GetStaticPaths and GetStaticProps breaks.
},
revalidate: 30,
};
}
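One thing worth checking (an assumption on my side, not something visible in the snippet): when a page is regenerated on demand on Vercel, serverSideTranslations runs inside a serverless function and has to find the translation JSON files at runtime. A commonly suggested workaround is an explicit localePath in next-i18next.config.js, for example:
// next-i18next.config.js - localePath is the workaround often suggested for
// serverless/ISR setups; the locales below are the ones from the question.
const path = require("path");

module.exports = {
  i18n: {
    defaultLocale: "da",
    locales: ["da", "en", "sv", "de"],
  },
  localePath: path.resolve("./public/locales"),
};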
I would like to obtain all (or a subset) of my records from an Algolia index and access them via GraphQL.
I know there is a Gatsby plugin that lets you do the opposite, i.e. add data from a GraphQL query to Algolia, but not the other way around.
I have been able to get the tutorial for adding GraphQL data to work, but I have not had any success going beyond hardcoded arrays (this is in the gatsby-node.js file):
const algoliasearch = require("algoliasearch/lite");
const searchClient = algoliasearch(
process.env.GATSBY_ALGOLIA_APP_ID,
process.env.GATSBY_ALGOLIA_SEARCH_KEY
)
const searchIndex = searchClient.initIndex(process.env.GATSBY_ALGOLIA_INDEX_NAME)
exports.sourceNodes = ({ actions, createNodeId, createContentDigest }) => {
searchIndex.search("", {
attributesToRetrieve: ["name", "url"]
}).then(({ hits }) => {
hits.forEach(hit => {
const node = {
name: hit.name,
url: hit.url,
id: createNodeId(`hit-${hit.name}`),
internal: {
type: "hit",
contentDigest: createContentDigest(hit),
},
}
actions.createNode(hit)
})
});
}
While the console successfully logs the array of nodes, and the verbose Gatsby deploy output includes the "hit" node as a node type, they do not appear in the GraphQL explorer.
Any help is greatly appreciated, thank you!
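Two details in the snippet above may be worth checking: createNode is called with the raw hit instead of the node object that carries the internal.type and contentDigest Gatsby needs, and sourceNodes never returns the promise, so Gatsby can finish sourcing before the Algolia request resolves. A sketch with both adjusted (same fields as above):
exports.sourceNodes = ({ actions, createNodeId, createContentDigest }) => {
  // Returning the promise lets Gatsby wait for the async Algolia request.
  return searchIndex
    .search("", { attributesToRetrieve: ["name", "url"] })
    .then(({ hits }) => {
      hits.forEach((hit) => {
        actions.createNode({
          name: hit.name,
          url: hit.url,
          id: createNodeId(`hit-${hit.name}`),
          internal: {
            type: "Hit", // queried as hit / allHit in GraphQL
            contentDigest: createContentDigest(hit),
          },
        });
      });
    });
};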
I am using https://github.com/lxieyang/chrome-extension-boilerplate-react as the basis to build a Chrome extension. It all works fine, and everything hot-reloads (popup, background, options, newtab) except for the content script. Reloading the matching pages does not reload the underlying .js; I have to reload or turn the whole extension off and on for the changes to take effect.
So, in webpack.config.js I commented out 'contentScript', hoping that would fix it, but it makes no difference.
...
chromeExtensionBoilerplate: {
notHotReload: [
//'contentScript'
],
},
...
In src/pages/Content/index.js it actually states
console.log('Must reload extension for modifications to take effect.');
When developing another extension in plain vanilla JS, I dropped in the hot-reload.js from https://github.com/xpl/crx-hotreload, which worked perfectly. From what I understand, it is the chrome.runtime.reload() call that makes Chrome completely reload the extension.
So my questions actually are:
When I change src/pages/Content/index.js, webpack does rebuild build/contentScript.bundle.js. But why doesn't manually reloading the tab/page pick up these changes, when it does for the popup, background, etc.?
And if there is no way to make the above boilerplate reload the extension (I don't mind the hard reload), how would I integrate hot-reload.js (or rather its effect) into this boilerplate? That is, how do I reload the extension when build/contentScript.bundle.js changes?
Thanks in advance!
For whoever is interested: I ended up placing the mentioned hot-reload.js in my extension and loading it from the background script. That breaks webpack's hot-reloading by reloading the entire extension on any file change, but as long as I only work on the content script, that's fine. I can remove it once I'm done, or when I work on other scripts.
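For reference, the core of that hot-reload.js is small: it polls the unpacked extension's files from the background page and calls chrome.runtime.reload() when anything changes. A condensed sketch of that idea (it relies on chrome.runtime.getPackageDirectoryEntry, which works in MV2 background pages but not in MV3 service workers):
// Development-only: poll the unpacked extension's files and reload on change.
const filesInDirectory = (dir) =>
  new Promise((resolve) =>
    dir.createReader().readEntries((entries) =>
      Promise.all(
        entries
          .filter((e) => e.name[0] !== '.')
          .map((e) => (e.isDirectory ? filesInDirectory(e) : new Promise((r) => e.file(r))))
      ).then((files) => resolve([].concat(...files)))
    )
  );

const timestampDirectory = (dir) =>
  filesInDirectory(dir).then((files) => files.map((f) => f.name + f.lastModified).join());

const watchChanges = (dir, lastTimestamp) => {
  timestampDirectory(dir).then((timestamp) => {
    if (!lastTimestamp || lastTimestamp === timestamp) {
      setTimeout(() => watchChanges(dir, timestamp), 1000); // poll again in 1s
    } else {
      chrome.runtime.reload(); // something changed: hard-reload the whole extension
    }
  });
};

chrome.management.getSelf((self) => {
  if (self.installType === 'development') {
    chrome.runtime.getPackageDirectoryEntry((dir) => watchChanges(dir));
  }
});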
Use server-sent events (SSE):
start.js
const SSEStream = require('ssestream').default;
let sseStream;
...
setupMiddlewares: (middlewares, _devServer) => {
if (!_devServer) {
throw new Error('webpack-dev-server is not defined');
}
/** Change: SSE endpoint on the /reload path */
middlewares.unshift({
name: 'handle_content_change',
// `path` is optional
path: '/reload',
middleware: (req, res) => {
console.log('sse reload');
sseStream = new SSEStream(req);
sseStream.pipe(res);
res.on('close', () => {
sseStream.unpipe(res);
});
},
});
return middlewares;
}
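For context, setupMiddlewares is a webpack-dev-server (>= 4.7) option, so the handler above sits in the devServer options that start.js passes to the server. Roughly like this (the host, port and config path are assumptions based on the boilerplate's defaults):
// start.js (sketch): where the setupMiddlewares handler from above plugs in.
const WebpackDevServer = require('webpack-dev-server');
const webpack = require('webpack');
const config = require('../webpack.config');

const setupMiddlewares = (middlewares, devServer) => {
  // ... the /reload SSE middleware shown above goes here ...
  return middlewares;
};

const server = new WebpackDevServer(
  { host: 'localhost', port: 3000, hot: true, setupMiddlewares },
  webpack(config)
);
server.start();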
webpack.compiler.hook
let contentOrBackgroundIsChange = false;
compiler.hooks.watchRun.tap('WatchRun', (comp) => {
if (comp.modifiedFiles) {
const changedFiles = Array.from(comp.modifiedFiles, (file) => `\n ${file}`).join('');
console.log('FILES CHANGED:', changedFiles);
// watchRunDir (defined elsewhere) lists the content/background source paths to watch
if (watchRunDir.some((p) => changedFiles.includes(p))) {
contentOrBackgroundIsChange = true;
}
}
});
compiler.hooks.done.tap('contentOrBackgroundChangedDone', () => {
if(contentOrBackgroundIsChange) {
contentOrBackgroundIsChange = false;
console.log('--------- triggering chrome extension reload ---------');
sseStream?.writeMessage(
{
event: 'content_changed_reload',
data: {
action: 'reload extension and refresh current page'
}
},
'utf-8',
(err) => {
sseStream?.unpipe();
if (err) {
console.error(err);
}
},
);
}
});
crx background
if(process.env.NODE_ENV === 'development') {
const eventSource = new EventSource(`http://${process.env.REACT_APP__HOST__}:${process.env.REACT_APP__PORT__}/reload/`);
console.log('--- start listen ---');
eventSource.addEventListener('content_changed_reload', async ({ data }) => {
const [tab] = await chrome.tabs.query({ active: true, lastFocusedWindow: true });
const tabId = tab.id || 0;
console.log(`tabId is ${tabId}`);
await chrome.tabs.sendMessage(tabId, { type: 'window.location.reload' });
console.log('chrome extension will reload', data);
chrome.runtime.reload();
});
}
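The background script above sends a { type: 'window.location.reload' } message to the active tab just before reloading the extension, so the content script needs a matching listener. A minimal counterpart (development only; the message type is the one used above):
// Content-script side: refresh the page when asked, so the reloaded
// extension re-injects the freshly built content script.
if (process.env.NODE_ENV === 'development') {
  chrome.runtime.onMessage.addListener((message) => {
    if (message.type === 'window.location.reload') {
      window.location.reload();
    }
  });
}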
How do you usually handle the loading part of your React app? Since pages load really fast now, it's my API requests that have a hard time keeping up. For Express, I have a Promise that waits for the API's data before serving the pages.
export default function fetchComponentData(dispatch, components, params) {
const needs = components.reduce((prev, current) => {
return current ? (current.needs || []).concat(prev) : prev;
}, []);
const promises = needs.map(need => dispatch(need(params)));
return Promise.all(promises);
}
...
fetchComponentData(store.dispatch, props.components, _.merge({}, props.params, props.location.query))
.then(setupHTML)
.then(html => res.end(html))
But should I do this in React as well? Or how do you usually display pages while they're loading? Currently I render the page without the data, start fetching, and then re-render to display the page with the content, but that gives a brief flicker, which I bet would be annoying once deployed. I usually have my reducers (state) in this format:
const defaultState = {
ui: {
loading: false
},
metadata: {},
data: {}
}
I was wondering how you usually approach this?
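For reference, a common pattern with a state shape like the one above is to flip ui.loading around the request and render a skeleton or spinner instead of the empty page, which hides the flicker. A rough sketch assuming redux-thunk (action names and the endpoint are illustrative):
// Hypothetical thunk: toggle ui.loading around the request so the component
// can show a placeholder instead of flashing an empty page.
export function fetchItems(params) {
  return async (dispatch) => {
    dispatch({ type: "ITEMS_REQUEST" }); // reducer sets ui.loading = true
    try {
      const res = await fetch(`/api/items?${new URLSearchParams(params)}`);
      const data = await res.json();
      dispatch({ type: "ITEMS_SUCCESS", data }); // ui.loading = false, data filled
    } catch (err) {
      dispatch({ type: "ITEMS_FAILURE", err }); // ui.loading = false, keep old data
    }
  };
}

// In the component, render a placeholder while ui.loading is true instead of the
// real content, so the layout does not jump when the data arrives.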