Gatsby createPage Actions - javascript

This is my gatsby-node.js file. I'm trying to get Gatsby to dynamically create a page for each user in my API, with the path being node.example. I'm currently running a local dev environment and getting my data from a local JSON server. It appears that a page is only created for the users where I manually add a "path" field with "/example" to their corresponding JSON element. Also, in GraphiQL, allRestApiEmployees only returns data for the users that have an "ID" field. Any idea how to fix this?
const path = require("path")

exports.createPages = ({ actions, graphql }) => {
  const { createPage } = actions
  // this is going to be referencing fields.js - not that specific node
  const profileTemplate = path.resolve("src/components/layout.js")
  // query to return user data
  return graphql(`
    {
      allRestApiEmployees {
        edges {
          node {
            example
          }
        }
      }
    }
  `).then(res => {
    if (res.errors) {
      return Promise.reject(res.errors)
    }
    res.data.allRestApiEmployees.edges.forEach(({ node }) => {
      var path = "/" + node.example
      console.log(node.example)
      createPage({
        path,
        component: profileTemplate,
      })
    })
  })
}
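For reference, the usual Gatsby pattern passes each node's data into the page through context so the template can query for it. A minimal sketch, assuming a dedicated template at src/templates/profile.js (a hypothetical path) rather than the layout component:

const path = require("path")

exports.createPages = async ({ actions, graphql }) => {
  const { createPage } = actions
  // hypothetical template path - not the asker's confirmed setup
  const profileTemplate = path.resolve("src/templates/profile.js")
  const res = await graphql(`
    {
      allRestApiEmployees {
        edges {
          node {
            example
          }
        }
      }
    }
  `)
  if (res.errors) throw res.errors
  res.data.allRestApiEmployees.edges.forEach(({ node }) => {
    createPage({
      path: `/${node.example}`,
      component: profileTemplate,
      // context becomes available to the template as a page query variable
      context: { example: node.example },
    })
  })
}

This does not by itself explain the missing "ID" fields: if the source plugin skips records without an id, that has to be fixed at the data source or in the plugin's configuration.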

Related

Is there a way to access data from a Nuxt 3 server api route inside of another server route?

I'm working on a Nuxt 3 application where I need to generate a sitemap dynamically from a CMS, using the method described for Nuxt Content. I'm using the built-in /server/api/ directory to get this data for the rest of the app. My sitemap.xml.js file is in /server/routes/.
I'd like to be able to make $fetch requests from my sitemap file to those APIs instead of rewriting all of that code (making the $fetch requests to the CMS again).
sitemap.xml.js
export default defineEventHandler(async (event) => {
  const { auth, posts } = useRuntimeConfig()
  // [...]
  const { records: postList } = await $fetch(`${posts}?api_key=${auth}`)
  postList.forEach((post) => {
    if (post.fields.Status) {
      sitemap.write({
        url: 'posts/' + post.fields.Slug + '/',
        lastmod: post.lastMod
      })
    }
  })
  // [...]
})
What I would like to do
export default defineEventHandler(async (event) => {
  // [...]
  const { records: postList } = await $fetch('/api/posts')
  postList.forEach((post) => {
    sitemap.write({
      url: 'posts/' + post.fields.Slug + '/',
      lastmod: post.lastMod
    })
  })
  // [...]
})
My sitemap code all works fine; I just can't get the data from my other server routes. Does anyone know if this is possible, and how to do it?
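As far as I know, Nitro (the server engine behind Nuxt 3) does support calling internal routes with $fetch from another server route, so the second snippet should work as written; internal requests are dispatched directly rather than over the network. An alternative that avoids the extra hop entirely is to lift the shared fetching logic into server/utils/, which Nuxt auto-imports in server code. A hedged sketch, with the file and function names being illustrative:

// server/utils/posts.js - illustrative names, not the asker's actual code
export const getPosts = async () => {
  const { auth, posts } = useRuntimeConfig()
  const { records } = await $fetch(`${posts}?api_key=${auth}`)
  return records
}

Both /server/api/posts.js and sitemap.xml.js could then call getPosts() directly, so the CMS request logic lives in one place.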

Dynamic Page Render On Demand Next Js

So, I am working on a simple Next.js app. It's just a lyrics page, but I'm running into some issues. I'm new to Next.js and this is my first project. So, here is my problem:
I have a dynamic set of pages/folders which looks like this:
songs (empty folder)
[singer] (child of songs, containing just an index.jsx)
[title] (child of [singer], containing another index.jsx)
Now in my [title] index.jsx I'm rendering a simple page providing lyrics for a specific song.
My problem is that I want to count views (every time someone opens this page) for each song separately. I have the following code:
export const getStaticProps = async (context) => {
  const res = await fetch(`${process.env.PROXY}api/lyrics/post/${encodeURIComponent(context.params.title)}`);
  const data = await res.json();
  const send = await fetch(`${process.env.PROXY}api/lyrics/views/${data.post._id}`);
  return {
    props: {
      data: data.post,
      message: 200
    }
  }
}

export const getStaticPaths = async () => {
  const res = await fetch(`${process.env.PROXY}api/lyrics/getAll`);
  const data = await res.json();
  const paths = data.all.map((name) => ({ params: { singer: name.singer.toString(), title: name.title.toString() } }));
  return {
    paths: paths,
    fallback: 'blocking'
  }
}
The problem is that getStaticProps runs only at build time; however, I want it to run on every request so that I can count the views with my send request.
Can someone please help me figure this out? Any help will be appreciated!
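One option, assuming server-side rendering is acceptable here, is to switch the page to getServerSideProps, which runs on every request instead of once at build time (getStaticPaths is then dropped, since paths are resolved per request). A minimal sketch based on the code above:

export const getServerSideProps = async (context) => {
  const res = await fetch(`${process.env.PROXY}api/lyrics/post/${encodeURIComponent(context.params.title)}`);
  const data = await res.json();
  // this endpoint is now hit on every page view, so each open counts
  await fetch(`${process.env.PROXY}api/lyrics/views/${data.post._id}`);
  return {
    props: {
      data: data.post,
      message: 200
    }
  }
}

If the pages should stay statically generated, another common approach is to fire the view-count request from the client in a useEffect instead.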

Fetching multiple raw md files from GitHub in React JS

I'm trying to fetch data from multiple raw .md files in a GitHub repo. Currently I'm able to fetch only one, yet I need to get all of them.
I have a GitHub repo and I'm looking to fetch data from a raw .md file, which is not a problem. The problem is that the repo has a bunch of folders and each folder has its own .md file. I need to map through all the folders and fetch all of the .md files.
Let's say I have a GitHub repo with the following folders:
folder1 -> text1.md
folder2 -> text2.md
folder3 -> text3.md
I'm currently able to fetch only one raw .md file, using the following method:
let fetchData = () => {
  axios.get("https://raw.githubusercontent.com/user-name/repo-name/master/folder1/text1.md").then(response => {
    console.log(response)
  }).catch(error => {
    console.log(error)
  })
}
My goal is to fetch all of text1.md, text2.md, and text3.md so I can map through them and display them in a table.
Based on your comments, I would say that your best bet is to make a Node worker that you can run weekly (or during deployments) that crawls the information (filename and content) from the folders you point it at, and then saves that information in some form you can later consume from Gatsby (ideally by putting it into Gatsby's GraphQL layer).
This is a vague idea of how that worker could look, with the limited information I have:
let repoBaseUrl = 'https://raw.githubusercontent.com/user-name/repo-name/master/';

let folders = [
  'folder1',
  'folder2',
  'folder3'
];

let fetchFileName = async (folder) => {
  // Your function to get the filename
  return filename;
}

let fetchFileContent = async (folder, filename) => {
  try {
    // note the slash between folder and filename
    const response = await axios.get(`${repoBaseUrl}${folder}/${filename}`);
    return response.data;
  }
  catch (error) {
    // do something with the error
  }
}

let fetchFolderContent = async () => {
  const data = {};
  // a for...of loop so each await finishes before returning;
  // folders.forEach(async ...) would return before the requests resolve
  for (const folder of folders) {
    const filename = await fetchFileName(folder);
    const content = await fetchFileContent(folder, filename);
    data[folder] = {
      filename,
      content,
    }
  }
  return data;
}

let main = async () => {
  const data = await fetchFolderContent();
  // Process your data
  // e.g. save it to GraphQL so you can consume it from Gatsby
}

main();
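The stubbed fetchFileName could, for instance, list each folder through GitHub's contents API. A hedged sketch, with the owner and repo names as placeholders:

let fetchFileName = async (folder) => {
  // GitHub's contents API returns the entries of a directory
  const url = `https://api.github.com/repos/user-name/repo-name/contents/${folder}`;
  const response = await axios.get(url);
  // pick the first markdown file in the folder
  const file = response.data.find((entry) => entry.name.endsWith('.md'));
  return file && file.name;
}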

how to copy an image and save it in a new folder in electron

I am trying to make an image organizer app which searches images using tags.
I want the user to select the image they want; so far I have done this with the following code:
// renderer process
$("#uploadImage").on("click", (e) => {
  ipcRenderer.send('dialoguploadImage')
});
this is the main process
ipcMain.on('dialoguploadImage', (e) => {
  dialog.showOpenDialog({
    properties: ['openFile']
  }).then(result => {
    sendBackimagePathFromMain(result.filePaths[0])
  }).catch(err => {
    console.log(err)
  })
});

function sendBackimagePathFromMain(result) {
  mainWindow.webContents.send('imagePathFromMain', result)
}
So I have the image path, and the only thing I want to know is: how can I duplicate this image, rename it, create a new folder, and save the image in that folder?
For example, to this folder:
('./currentDirectory/imageBackup/dognothapppy.jpg')
You can use fs.mkdirSync() to make the folder and fs.copyFileSync() to 'duplicate and rename' the file (in a file system you don't need to duplicate and rename a file in two separate steps; copying the file to its new name does both at once), or their async counterparts.
const { mkdirSync, copyFileSync } = require('fs')
const { join } = require('path')

const folderToCreate = 'folder'
const fileToCopy = 'selectedFile.txt'
const newFileName = 'newFile.txt'

const dest = join(folderToCreate, newFileName)

// pass { recursive: true } to mkdirSync if the folder may already exist
mkdirSync(folderToCreate)
copyFileSync(fileToCopy, dest)
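Tied back to the Electron flow above, the main process could back up the selected image right after the dialog resolves. A hedged sketch; the backup folder and the renaming scheme are illustrative (the file name is the example from the question):

const fs = require('fs')
const path = require('path')

function backupImage(srcPath) {
  const backupDir = path.join(__dirname, 'imageBackup')
  // recursive: true also avoids an error if the folder already exists
  fs.mkdirSync(backupDir, { recursive: true })
  // rename the copy while keeping the original extension
  const dest = path.join(backupDir, 'dognothapppy' + path.extname(srcPath))
  fs.copyFileSync(srcPath, dest)
  return dest
}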

Why 'fs' only persists file changes after end of program?

I have an application that persists its state on disk: when any state change occurs, it reads the old state from a file, changes the state in memory, and persists it to disk again. But the problem is that the store function only writes to disk after the program closes, and I don't know why.
const load = (filePath) => {
  const fileBuffer = fs.readFileSync(filePath, "utf8");
  return JSON.parse(fileBuffer);
}

const store = (filePath, data) => {
  const contentString = JSON.stringify(data);
  fs.writeFileSync(filePath, contentString);
}
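For what it's worth, fs.writeFileSync hands the data to the operating system synchronously, so a quick way to rule out the write itself is to read the file back immediately after storing. A minimal sanity check, assuming the load and store functions above and const fs = require("fs"):

const assert = require("assert");

store("/tmp/state.json", { datasets: [] });
const roundTrip = load("/tmp/state.json");
// if this passes, the data is on disk before the program exits
assert.deepStrictEqual(roundTrip, { datasets: [] });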
To create a complete example, let's use the load-dataset command in the file "src/interpreter/index.js".
while (this.isRunning) {
  readLineSync.promptCL({
    "load-dataset": async (type, name, from) => {
      await loadDataset({ type, name, from });
    },
    ...
  }, {
    limit: null,
  });
}
In general, this calls loadDataset, which reads JSON or CSV files.
export const loadDataset = async (options) => {
  switch (options.type) {
    case "csv":
      await readCSVFile(options.from).then(data => {
        app.createDataset(options.name, data);
      });
      break;
    case "json":
      const data = readJSONFile(options.from);
      app.createDataset(options.name, data);
      break;
  }
}
The method createDataset() reads the file on disk, updates it, and writes it again.
createDataset(name, data) {
  const state = loadState();
  state.datasets = [
    ...state.datasets,
    { name, size: data.length }
  ];
  storeState(state);

  const file = loadDataset();
  file.datasets = [
    ...file.datasets,
    { name, data }
  ];
  storeDataset(file);
}
The methods loadState(), storeState(), loadDataset(), and storeDataset() all use the load and store functions shown above.
const loadState = () => load(stateFilePath);
const storeState = state => store(stateFilePath, state);
...
const loadDataset = () => load(datasetFilePath);
const storeDataset = dataset => store(datasetFilePath, dataset);
I'm using a package from npm called readline-sync to create a simple "terminal"; I don't know if it causes any conflicts.
The source code is on GitHub: Git repo. In the file "index.js", the method createDataset() calls loadState() and storeState(), which both use the methods shown above.
The package readline-sync is used in the interpreter (here: Interpreter file), which basically loops until an exit command.
Just as a note, I'm using Ubuntu 18.04.2 and Node.js 10.15.0. I wrote this code following an example from a YouTube video. That guy is using macOS, and I really hope the operating system won't be the problem.
