I would like to obtain all (or a subset) of my records from an Algolia index and access them via GraphQL.
I know there is a Gatsby plugin that does the opposite, i.e. adds data from a GraphQL query to Algolia, but not the other way around.
I was able to get the tutorial for adding GraphQL data working, but I have not had any success going beyond hard-coded arrays (this is in the gatsby-node.js file):
const algoliasearch = require("algoliasearch/lite");

const searchClient = algoliasearch(
  process.env.GATSBY_ALGOLIA_APP_ID,
  process.env.GATSBY_ALGOLIA_SEARCH_KEY
)

const searchIndex = searchClient.initIndex(process.env.GATSBY_ALGOLIA_INDEX_NAME)

exports.sourceNodes = ({ actions, createNodeId, createContentDigest }) => {
  searchIndex.search("", {
    attributesToRetrieve: ["name", "url"]
  }).then(({ hits }) => {
    hits.forEach(hit => {
      const node = {
        name: hit.name,
        url: hit.url,
        id: createNodeId(`hit-${hit.name}`),
        internal: {
          type: "hit",
          contentDigest: createContentDigest(hit),
        },
      }
      actions.createNode(hit)
    })
  });
}
While the console successfully logs the array of nodes and the verbose Gatsby deploy output lists "hit" as a node type, the nodes do not appear in the GraphQL explorer.
Any help is greatly appreciated, thank you!
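For reference, here is a minimal sketch of the sourceNodes pattern described in the Gatsby docs, reusing the same searchIndex as above. It assumes the constructed node object (rather than the raw hit) is what gets passed to createNode, and that the promise is returned so Gatsby waits for the async sourcing to finish before building the schema:

exports.sourceNodes = ({ actions, createNodeId, createContentDigest }) => {
  // Returning the promise lets Gatsby wait for the async sourcing to complete.
  return searchIndex
    .search("", { attributesToRetrieve: ["name", "url"] })
    .then(({ hits }) => {
      hits.forEach(hit => {
        const node = {
          name: hit.name,
          url: hit.url,
          id: createNodeId(`hit-${hit.name}`),
          internal: {
            type: "Hit", // queried as hit / allHit in GraphQL
            contentDigest: createContentDigest(hit),
          },
        }
        // Pass the node object that carries the required id/internal fields.
        actions.createNode(node)
      })
    })
}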
When using Apollo Client, I find it quite tedious to manually update the cache for every mutation that requires an immediate UI update, so I decided to try writing a custom hook that updates the cache automatically.
The hook works, but it feels a little "hacky", and I'm worried it might interfere with the normal functioning of the cache. So I just wanted to ask: does this hook look like it should work OK?
Here's the code (where mutationName is the actual GraphQL mutation name and fieldName is the original GraphQL query name corresponding to the mutation):
export const useMutationWithCacheUpdate = (
  mutation,
  mutationName,
  fieldName
) => {
  const [createMutation, { data, loading, error }] = useMutation(mutation, {
    update(cache, { data }) {
      // Unwrap the payload returned under the mutation's field name.
      data = data[mutationName];
      cache.modify({
        fields: {
          // Append a reference to the newly created item to the cached list field.
          [fieldName]: (existingItems = []) => {
            const newItemRef = cache.writeFragment({
              data: data,
              fragment: gql`
                fragment newItem on ${fieldName} {
                  id
                  type
                }
              `,
            });
            return [...existingItems, newItemRef];
          },
        },
      });
    },
  });
  return [createMutation, { data, loading, error }];
};
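For what it's worth, here is a hypothetical usage sketch; ADD_ITEM, "addItem", "items", and the import path are made-up names for illustration, not part of the original code:

import React from "react";
import { gql } from "@apollo/client";
import { useMutationWithCacheUpdate } from "./useMutationWithCacheUpdate"; // hypothetical path

// Placeholder mutation for illustration only.
const ADD_ITEM = gql`
  mutation AddItem($input: AddItemInput!) {
    addItem(input: $input) {
      id
      type
    }
  }
`;

const AddItemButton = () => {
  const [addItem, { loading }] = useMutationWithCacheUpdate(
    ADD_ITEM,
    "addItem", // mutationName: key of the payload in the mutation result
    "items" // fieldName: cached query field that should receive the new reference
  );

  return (
    <button
      disabled={loading}
      onClick={() => addItem({ variables: { input: { type: "example" } } })}
    >
      Add item
    </button>
  );
};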
I'm new to both Strapi and Mongoose, so I apologise if this is a stupid question.
Following the docs (https://strapi.io/documentation/developer-docs/latest/development/backend-customization.html), I'm trying to create a custom query in Strapi that returns the whole people collection sorted by name in descending order. But when I hit the endpoint I get a 500 error, and the terminal shows the error message CastError: Cast to ObjectId failed for value "alldesc" at path "_id" for model "people".
Here's my code:
services/people.js
module.exports = {
  findByNameDesc() {
    const result = strapi
      .query("people")
      .model.find()
      .sort({ name: "descending" });
    return result.map((entry) => entry.toObject());
  },
};
controllers/people.js
module.exports = {
  async alldesc(ctx) {
    const entities = await strapi.services.people.findByNameDesc(ctx);
    return entities.map((entity) =>
      sanitizeEntity(entity, { model: strapi.models.people })
    );
  },
};
config/routes.json
{
  "routes": [
    ...
    {
      "method": "GET",
      "path": "/people/alldesc",
      "handler": "people.alldesc",
      "config": {
        "policies": []
      }
    }
  ]
}
What am I doing wrong?
UPDATE: even when removing .sort({ name: "descending" }); from the query, the error is still there, so I'm thinking that maybe there's something wrong in the way I use the service in the controller?
The problem was in routes.json. It seems the custom path /people/alldesc was being matched against the default GET /people/:id route (hence the CastError on "alldesc"), so changing the path to /people-alldesc made it work.
Also, in the service there's no need for return result.map((entry) => entry.toObject()); (that causes another error); simply doing return result works.
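For reference, a sketch of the adjusted service and route entry described above (same handler and model names as in the question):

services/people.js

module.exports = {
  findByNameDesc() {
    // Return the query directly; no extra .map()/.toObject() needed.
    return strapi
      .query("people")
      .model.find()
      .sort({ name: "descending" });
  },
};

config/routes.json (custom route, renamed so it no longer collides with GET /people/:id)

{
  "method": "GET",
  "path": "/people-alldesc",
  "handler": "people.alldesc",
  "config": {
    "policies": []
  }
}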
I'm trying to deploy my first Next.js project on Vercel. Locally everything is working.
My problem: when deploying the web app (command: next build), I get the message "Generating static pages (0/2000)" and then nothing happens. The deployment is cancelled after 1 hour (time expiration).
The problem is somewhere in the (simplified) code below. Here's why: when I deploy the project without the part that follows const content, the deployment is successful. So basically, instead of having singleProductResponse AND contentResponse as props, I only have singleProductResponse.
I'm a little stuck and don't know how to solve this. Can someone tell me what I'm doing wrong? Thanks a lot!
const Item = ({ singleProductResponse, contentResponse }) => {
  const router = useRouter();

  if (router.isFallback) {
    return <div>Loading...</div>;
  }

  return (
    <div className={styles.section}>
      <Overview
        singleProductResponse={singleProductResponse}
        contentResponse={contentResponse}
      />
    </div>
  );
};
export async function getStaticPaths() {
  const itemResponse = await knex("Items");

  const paths = itemResponse.map((product) => ({
    params: {
      brand: product.brand,
      item: product.item,
    },
  }));

  return {
    paths,
    fallback: true,
  };
}
export async function getStaticProps({ params }) {
  try {
    const itemData = await knex("Items").where(
      "item",
      "like",
      `${params.item}`
    );
    const singleProductResponse = itemData[0];

    // !!! when leaving the following part out: deployment is successful !!!
    const content = itemData[0].contentList;
    const splitContent = content.split(", ");
    const contentArray =
      typeof splitContent === "string" ? [splitContent] : splitContent;

    const contentResponse = await knex("Cos")
      .leftJoin("Actives", "Cos.content", "ilike", "Actives.content")
      .leftJoin("Alcohol", "Cos.content", "ilike", "Alcohol.content")
      .column([
        "Cos.content",
        "function",
        "translationPlant",
        "categoryAlcohol",
      ])
      .where((qb) => {
        contentArray.forEach((word) => {
          qb.orWhere("Cos.content", word);
        });
        return qb;
      });

    return {
      props: { singleProductResponse, contentResponse },
      revalidate: 1,
    };
  } catch (err) {
    console.log(err);
    return {
      redirect: {
        destination: "/",
        permanent: false,
      },
    };
  }
}
UPDATE
After digging deeper, I think the problem is that the part after const content is too slow during the build process.
When running next build locally, as well as when deploying on Vercel, the build stops after approximately 1 hour. I suppose this is because of the 45-minute limit on the build process.
The last message I get is "Generating static pages (1000/2000)" (on Vercel and locally) and "Build failed" (Vercel). I don't get any other error messages (not even in the catch block).
I've already tried to optimize the part after const content: each table has an index (clustered indexes -> primary keys), I've redesigned the tables (only 4 tables to join instead of 6), eliminated everything unnecessary from the query, and checked that the database (Postgres hosted on Heroku, hobby-basic) is also in the US. The performance is better, but still not enough. Does anyone have some suggestions for improvement? TTFB might also be somewhat slow.
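One direction that might help (a sketch only, not tested against this project; the limit of 50 is an arbitrary placeholder): pre-render just a subset of items in getStaticPaths and let fallback rendering generate the remaining pages on demand, so the build no longer runs the expensive content join 2000 times:

export async function getStaticPaths() {
  // Pre-render only a subset at build time; the rest are generated on first request.
  const itemResponse = await knex("Items").limit(50);

  const paths = itemResponse.map((product) => ({
    params: {
      brand: product.brand,
      item: product.item,
    },
  }));

  return {
    paths,
    fallback: true, // or "blocking" to skip the Loading state entirely
  };
}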
I just started working with Gatsby to see if it would be a good choice to rebuild my company's CraftCMS website with Craft as the backend and Gatsby as the frontend. So far everything has been working well until it came time to query for the individual entries inside our "campaign" channel.
For the record, I have been able to render a complete list of my campaign entries using .map() on an "overall view" page to see all the campaigns. I have also been able to recursively build out each campaign page so that it uses my /src/templates/campaign-page.js template and gets the correct slug from my site's Craft API with no issue. For some reason, I just can't get the individual campaign data to query inside the campaign-page.js template.
I've read just about every page in the Gatsby docs and every tutorial that currently exists, but for the life of me I can't figure out why my GraphQL query will not filter for my individual campaign entries. It just keeps telling me, "GraphQL Error Expected type [String], found {eq: $slug}."
I've also tried wrapping slug: {eq: $slug} in a filter: based on some Markdown-focused docs, but that just tells me "filter" does not exist. I'm beginning to think the issue is in my gatsby-node.js file, but I'm not seeing any problem when I compare it to the docs.
gatsby-node.js
const path = require(`path`)

exports.createPages = async ({ actions, graphql }) => {
  const { data } = await graphql(`
    query {
      api {
        entries(section: "campaigns") {
          slug
        }
      }
    }
  `)

  data.api.entries.forEach(({ slug }) => {
    actions.createPage({
      path: "/campaigns/" + slug,
      component: path.resolve(`./src/templates/campaign-page.js`),
      context: {
        slug: slug,
      },
    })
  })
}
campaign-page.js
export default ({ data }) => {
  const post = data.api.entries

  return (
    <div className={"campaign-page-single"} style={{ marginTop: "-21px" }}>
      <Header />
      <div>
        <h1>{post.id}</h1>
      </div>
    </div>
  )
}
export const campaignQuery = graphql`
  query($slug: String!) {
    api {
      entries(slug: { eq: $slug }) {
        slug
        id
        ... on API_campaigns_campaigns_Entry {
          id
          campaignTitle
          slug
        }
      }
    }
  }
`
For reference, here's what a typical working query looks like on my main campaigns.js page that lists all available campaigns:
query = {graphql`
  {
    api {
      entries(section: "campaigns") {
        ... on API_campaigns_campaigns_Entry {
          id
          campaignTitle
          uri
          slug
        }
      }
    }
  }
`}
I expect my /src/templates/campaign-page.js template to render the individual campaign data.
I finally had one of my coworkers take a look at my code. All I had to do was wrap my $slug variable in brackets, like so:
entries (section: "campaigns", slug: [$slug] )
That's two days I wish I could have back.
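For completeness, a sketch of the page query with that change applied (field names taken from the question):

export const campaignQuery = graphql`
  query($slug: String!) {
    api {
      entries(section: "campaigns", slug: [$slug]) {
        slug
        id
        ... on API_campaigns_campaigns_Entry {
          id
          campaignTitle
          slug
        }
      }
    }
  }
`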
I have the following GraphQL query that executes at build time from my gatsby-node.js file, which I'm using to bring in my article data. Is there a way to use Apollo/Axios to retrieve new articles without having to rebuild the site, essentially rehydrating my site in between builds? Any help is greatly appreciated!!
I'm using Gatsby v2, Drupal as my CMS and GraphQL.
exports.createPages = ({ graphql, actions }) => {
  const { createPage } = actions;
  const blogTemplate = path.resolve('src/templates/blog-post.js');

  return graphql(`
    {
      blog: allNodeArticle {
        edges {
          node {
            id
            path {
              alias
            }
          }
        }
      }
    }
  `).then(result => {
    if (result.errors) {
      return Promise.reject(result.errors);
    }

    // Create blog pages
    result.data.blog.edges.forEach(({ node }) => {
      createPage({
        path: node.path.alias,
        component: blogTemplate,
        context: {
          alias: node.path.alias,
        },
      });
    });
  });
}
I would like to merge in the new data as it becomes available while keeping older data completely static (a massive benefit of Gatsby).
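One possible direction, sketched below: keep the build-time pages fully static and fetch any newer articles on the client after the page mounts, then merge them with the statically rendered list. The Drupal JSON:API endpoint URL and the response field names here are assumptions for illustration, not taken from the question:

import React, { useEffect, useState } from "react";
import axios from "axios";

// staticArticles would come from the page's build-time GraphQL query.
const ArticleList = ({ staticArticles }) => {
  const [articles, setArticles] = useState(staticArticles);

  useEffect(() => {
    // Hypothetical Drupal JSON:API endpoint; adjust to the real site URL.
    axios
      .get("https://example.com/jsonapi/node/article")
      .then(({ data }) => {
        const fresh = data.data.map(node => ({
          id: node.id,
          title: node.attributes.title,
          alias: node.attributes.path && node.attributes.path.alias,
        }));
        // Merge, keeping the static articles and appending anything new.
        setArticles(prev => {
          const known = new Set(prev.map(a => a.id));
          return [...prev, ...fresh.filter(a => !known.has(a.id))];
        });
      })
      .catch(() => {
        // Fall back silently to the static data if the request fails.
      });
  }, []);

  return (
    <ul>
      {articles.map(article => (
        <li key={article.id}>{article.title}</li>
      ))}
    </ul>
  );
};

export default ArticleList;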