Error creating custom pages from local plugin within GatsbyJS

My goal is to create pages from a local plugin. I wrote a custom plugin named my-custom-plugin. I've also installed the gatsby-plugin-page-creator plugin to automatically create pages from components outside the default pages directory.
This is my project structure:
plugins
  /my-custom-plugin
    /gatsby-node.js
    /package.json
src
  /components
    /pages
      /single.js
gatsby-config.js
gatsby-node.js
...etc
gatsby-config.js (from root):
module.exports = {
  plugins: [
    `my-custom-plugin`,
    {
      resolve: `gatsby-plugin-page-creator`,
      options: {
        path: `${__dirname}/src/components/pages`,
      },
    },
  ],
}
plugins/my-custom-plugin/gatsby-node.js
const path = require('path')
const location = path.resolve(__dirname, '..', '..', '/src/components/pages')

exports.createPages = ({ actions }) => {
  const { createPage } = actions

  createPage({
    path: `/sample-page`,
    component: `${location}/single.js`,
    context: {
      slug: 'sample-page'
    }
  })
}
Unfortunately, I get the error message The plugin "my-custom-plugin" created a page with a component that doesn't exist when running gatsby develop. Am I doing something wrong? Any help?
Regards.

https://github.com/gatsbyjs/gatsby/tree/master/packages/gatsby-plugin-page-creator
You don't need a custom plugin. The README states that you only need to add the configuration to gatsby-config.js.
Your current local plugin tries to do exactly what the page-creator plugin already does.
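For illustration, a minimal gatsby-config.js that relies on gatsby-plugin-page-creator alone might look like this; it is simply the config from the question with the custom plugin entry removed:
// gatsby-config.js: gatsby-plugin-page-creator picks up every component under the given path
module.exports = {
  plugins: [
    {
      resolve: `gatsby-plugin-page-creator`,
      options: {
        path: `${__dirname}/src/components/pages`,
      },
    },
  ],
}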

Related

Web workers inside a JS library with Rollup

I am building a negamax engine in TypeScript that uses Thread.js web workers. It is an npm library that will be imported by an application built using webpack.
I am using Rollup to build the engine - how can I export the web-worker files so they are copied into the client's build directory as a separate chunk?
There are plugins for that: Alorel/rollup-plugin-web-worker, darionco/rollup-plugin-web-worker-loader
...but I ended up doing it from scratch, using a separate build configuration for the worker(s). This simply gives me more control over the situation.
Attached is the rollup.config.worker.js that I use.
The main rollup.config.mjs imports this file and uses it as the first build configuration. The real build config uses @rollup/plugin-replace to inject the worker's hash into the code that loads it.
/*
 * Rollup config for building web worker(s)
 *
 * Imported by the main rollup config.
 */
import sizes from '@atomico/rollup-plugin-sizes'
import resolve from '@rollup/plugin-node-resolve'
import replace from '@rollup/plugin-replace'
import { terser } from 'rollup-plugin-terser'

import { dirname } from 'path'
import { fileURLToPath } from 'url'

const myPath = dirname(fileURLToPath(import.meta.url));
const watch = process.env.ROLLUP_WATCH;

const REGION = process.env.REGION;
if (!REGION) throw new Error("'REGION' env.var. not provided");

let loggingAdapterProxyHash;

const catchHashPlugin = {
  name: 'my-plugin',

  // Below, one can define hooks for various stages of the build.
  //
  generateBundle(_ /*options*/, bundle) {
    Object.keys(bundle).forEach(fileName => {
      // filename: "proxy.worker-520aaa52.js"
      //
      const [_, c1] = fileName.match(/^proxy.worker-([a-f0-9]+)\.js$/) || [];
      if (c1) {
        loggingAdapterProxyHash = c1;
        return;
      }
      console.warn("Unexpected bundle generated:", fileName);
    });
  }
};

const pluginsWorker = [
  resolve({
    mainFields: ["esm2017", "module"],
    modulesOnly: true   // "inspect resolved files to assert that they are ES2015 modules"
  }),
  replace({
    'env.REGION': JSON.stringify(REGION),
    //
    preventAssignment: true   // to mitigate a console warning (Rollup 2.44.0); remove with 2.45?
  }),
  //!watch && terser(),
  catchHashPlugin,
  !watch && sizes(),
];

const configWorker = {
  input: './adapters/logging/proxy.worker.js',
  output: {
    dir: myPath + '/out/worker',   // under which 'proxy.worker-{hash}.js' (including imports, tree-shaken-not-stirred)
    format: 'es',   // "required"
    entryFileNames: '[name]-[hash].js',   // .."chunks created from entry points"; default is: '[name].js'
    sourcemap: true,   // have source map even for production
  },
  plugins: pluginsWorker
}

export default configWorker;
export { loggingAdapterProxyHash }
Using in main config:
replace({
  'env.PROXY_WORKER_HASH': () => {
    const hash = loggingAdapterProxyHash;
    assert(hash, "Worker hash not available, yet!");
    return JSON.stringify(hash);
  },
  //
  preventAssignment: true   // to mitigate a console warning (Rollup 2.44.0); remove with 2.45?
}),
...and in the Worker-loading code:
const PROXY_WORKER_HASH = env.PROXY_WORKER_HASH; // injected by Rollup build
...
new Worker(`/workers/proxy.worker-${PROXY_WORKER_HASH}.js?...`);
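For context, here is a sketch of how the main rollup.config.mjs could tie the two builds together. Only the worker-config import and the replace() call mirror what is shown above; the main build's input, output and remaining plugins are placeholders:
// rollup.config.mjs (sketch; adapt the main build's input/output/plugins to your project)
import { strict as assert } from 'assert'
import replace from '@rollup/plugin-replace'

import configWorker, { loggingAdapterProxyHash } from './rollup.config.worker.js'

const configMain = {
  input: './src/index.js',   // placeholder: the library's main entry
  output: { dir: 'dist', format: 'es', sourcemap: true },
  plugins: [
    replace({
      // the worker build runs first, so its hash is available by the time this callback runs
      'env.PROXY_WORKER_HASH': () => {
        const hash = loggingAdapterProxyHash;
        assert(hash, "Worker hash not available, yet!");
        return JSON.stringify(hash);
      },
      preventAssignment: true
    })
  ]
};

// Worker config first, so its bundle (and hash) exists before the main bundle is generated.
export default [configWorker, configMain];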
If anyone wants to get a link to the whole repo, leave a message and I'll post it there. It's still in flux.
Edit:
After writing the answer I came across this: Building module web workers for cross-browser compatibility with rollup (blog, Jul 2020)
TL;DR If you wish to use ECMAScript Modules for the worker, watch out! Firefox and Safari don't support module workers, as of today. Also, the Worker constructor needs to be told that the worker source is ESM, as shown below.
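For reference, telling the Worker constructor that the source is an ES module is done with the standard type option (plain Web API, nothing Rollup-specific):
// 'type: "module"' marks the worker script as an ES module
const worker = new Worker(`/workers/proxy.worker-${PROXY_WORKER_HASH}.js`, { type: 'module' });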

Sapper/Svelte.js - How to specify client-side assets location?

I have a Sapper.js application that I have successfully running on AWS Lambda. Lambda is able to deliver the server-side generated HTML created by Sapper to AWS API Gateway, which then serves the app to the user. I am using S3 to host the client-side assets (scripts, webpack chunks, etc.). The S3 bucket is on a different domain than API Gateway.
The issue I'm having is that I need to set an asset prefix for these scripts so that Sapper can find them. Currently all of my client-side scripts use relative links and look like this: <script src="/client/be33a1fe9c8bbaa6fa9d/SCRIPT_NAME.js"></script> I need them to look like this: <script src="https://AWS_S3_BUCKET_ENDPOINT.com/client/be33a1fe9c8bbaa6fa9d/SCRIPT_NAME.js"></script>
Looking in the Sapper docs, I see that I can specify a base url for the client and server. However, changing this base url breaks my app and causes the Lambda rendering the pages to return 404 errors.
I know that when using, say, Next.js, I can accomplish this by modifying the next.config.js file to include the following:
module.exports = {
  assetPrefix: "https://AWS_S3_BUCKET_ENDPOINT.com/client",
}
But I don't know how to do this in Sapper. Do I need to modify the bundler (using webpack) config? Or is there some other way?
Thank you.
I think I've figured it out.
I had to change two sapper files. First I went into sapper/dist/webpack.js and modified it like so:
'use strict';

var __chunk_3 = require('./chunk3.js');

var webpack = {
  dev: __chunk_3.dev,

  client: {
    entry: () => {
      return {
        main: `${__chunk_3.src}/client`
      };
    },
    output: () => {
      return {
        path: `${__chunk_3.dest}/client`,
        filename: '[hash]/[name].js',
        chunkFilename: '[hash]/[name].[id].js',
        // change this line to point to the s3 bucket client key
        publicPath: "https://AWS_S3_BUCKET_ENDPOINT.com/client"
      };
    }
  },

  server: {
    entry: () => {
      return {
        server: `${__chunk_3.src}/server`
      };
    },
    output: () => {
      return {
        path: `${__chunk_3.dest}/server`,
        filename: '[name].js',
        chunkFilename: '[hash]/[name].[id].js',
        libraryTarget: 'commonjs2'
      };
    }
  },

  serviceworker: {
    entry: () => {
      return {
        'service-worker': `${__chunk_3.src}/service-worker`
      };
    },
    output: () => {
      return {
        path: __chunk_3.dest,
        filename: '[name].js',
        chunkFilename: '[name].[id].[hash].js',
        // change this line to point to the s3 bucket root
        publicPath: "https://AWS_S3_BUCKET_ENDPOINT.com"
      }
    }
  }
};

module.exports = webpack;
//# sourceMappingURL=webpack.js.map
Then I had to modify sapper/runtime/server.mjs so that the main variable points to the bucket like so:
...
const main = `https://AWS_S3_BUCKET_ENDPOINT.com/client/${file}`;
...
Testing with the basic sapper webpack template, I can confirm that the scripts are loading from the s3 bucket successfully. So far this all looks good. Next I will mess around with the sapper build command so I can pass these hacks in as command-line arguments instead of hardcoding them every time (a rough sketch of the idea is below).
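Until that is in place, one way to avoid hardcoding the bucket URL is to read it from an environment variable inside the patched sapper/dist/webpack.js. This is only a sketch of the idea; the SAPPER_ASSET_PREFIX variable name is made up:
// at the top of the patched sapper/dist/webpack.js
// SAPPER_ASSET_PREFIX is a hypothetical env var; falls back to the hardcoded bucket endpoint
const ASSET_PREFIX = process.env.SAPPER_ASSET_PREFIX || 'https://AWS_S3_BUCKET_ENDPOINT.com';

// then, in the output() functions above:
//   publicPath: `${ASSET_PREFIX}/client`   (client bundle)
//   publicPath: ASSET_PREFIX               (service worker)
Invocation would then look something like:
SAPPER_ASSET_PREFIX=https://AWS_S3_BUCKET_ENDPOINT.com npx sapper build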
Now, I'm not sure if this will hold up as the app becomes more complicated. Looking into the sapper/runtime/server.mjs file, I see that the req.baseUrl property is referenced in several different locations and I don't know if my hacks will cause any issues with this. Or anywhere else in sapper for that matter.
If anyone with more experience with the Sapper internals is reading, let me know in the comments if I screwed something up 👍

Storybook webpack absolute import

In our app we are using absolute paths to import modules. We have a react folder in our resolve root:
(screenshot of the folder structure)
We are using webpack to build and develop the app, and it works fine with the following options:
resolve: {
  modules: [
    'node_modules',
    path.resolve('src')
  ]
},
I'm working on integrating Storybook and found that it can't resolve any module from this react folder:
ERROR in ./stories/index.stories.js
Module not found: Error: Can't resolve 'react/components/Button' in 'project_name/stories'
 @ ./stories/index.stories.js
for the following line:
import Button from 'react/components/Button';
As a note: I added resolve.modules to the .storybook webpack config, and if I import anything else, for example from services/xxx, it works.
Issues
The react folder name conflicts with the actual React package location, node_modules/react. Webpack falls back to resolve.modules (default: node_modules) if the file does not exist at the given path.
resolve.modules is not appropriate for this sort of usage; it is mostly meant for package resolution, because it can't tell which import strings refer to your own sources.
To change paths selectively, use an alias instead.
Solution
1. Change your component folder's name so that it does not collide with node_modules/react. A good example is view/components/Button.
2. Add an alias to the .storybook/main.js settings:
// .storybook/main.js
const path = require('path');

module.exports = {
  /* ... other settings go here ... */

  /**
   * @param {import('webpack').Configuration} config
   */
  webpackFinal: async (config, { configType }) => {
    if (!config.resolve) config.resolve = {};

    // this config allows resolving `view/...` as `src/view/...`
    config.resolve.alias = {
      ...(config.resolve.alias || {}),
      view: path.resolve(__dirname, '../src/view'),
    };

    return config;
  },
};
3. Change the story code in accordance with (1):
// Button.stories.jsx
import Button from 'view/components/Button';
//...

How do I generate SVG sprites with GatsbyJS

In my gatsby-browser.js file, I have two imports that look similar to:
import 'npm-package/lib/icons.svg'
import 'npm-package/lib/icons-rich.svg'
My current gatsby-node.js file is as follows:
const path = require('path')
const SpriteLoaderPlugin = require('svg-sprite-loader/plugin')

exports.onCreateWebpackConfig = ({ actions, getConfig }) => {
  const config = getConfig()

  config.resolve.alias = {
    ...config.resolve.alias,
    // aliases working fine
  }

  config.module.rules = [
    ...config.module.rules,
    {
      test: /(icons|icons-rich).svg$/,
      loader: 'svg-sprite-loader',
      options: {
        extract: true,
        publicPath: './'
      },
    },
  ]

  config.plugins = [
    ...config.plugins,
    new SpriteLoaderPlugin()
  ]

  actions.replaceWebpackConfig(config)
}
When I run gatsby develop, I get the following error:
Module Warning (from ./node_modules/svg-sprite-loader/lib/loader.js):
svg-sprite-loader exception. Some loaders will be applied after svg-sprite-loader in extract mode
and no files are ever output.
When I run gatsby build, I get a sprite.svg file output to the public directory, but it doesn't seem like the SVG sprite gets added to the HTML document body.
1) How do I get the gatsby develop command to process the SVGs and output the sprite to the proper directory?
2) I suspect the issue with gatsby build is related to the SVG file I'm trying to include in the HTML document, which is in my Layout.jsx file and looks like:
<Icon icon="all" iconPath="./public/sprite.svg" />
I would guess ./public/sprite.svg is missing for some reason, but I can't figure out what the correct file path is (I've tried everything except the right thing, apparently).

Vue js babel build error after installing prerender-spa-plugin

So I have a Vue.js application, and today I started using prerender-spa-plugin to generate some static pages for better SEO. When I run npm run build, everything works perfectly, no errors. But when I run the development server with npm run serve, I get the following error (only part of it):
error in ./src/main.js
Module build failed (from ./node_modules/babel-loader/lib/index.js):
Error: .plugins[0] must include an object
    at assertPluginItem (/Users/user/Desktop/app/node_modules/@babel/core/lib/config/validation/option-assertions.js:231:13)
So I guess the problem has to do with the babel plugin loader. I commented out every part of my code that uses prerender-spa-plugin, but I still get the same error. I hope someone can point me in the right direction.
My babel.config.js:
const removeConsolePlugin = []

if (process.env.NODE_ENV === 'production') {
  removeConsolePlugin.push("transform-remove-console")
}

module.exports = {
  presets: [
    '@vue/app'
  ],
  plugins: [removeConsolePlugin]
}
My vue.config.js:
const path = require('path');
const PrerenderSpaPlugin = require('prerender-spa-plugin');

const productionPlugins = [
  new PrerenderSpaPlugin({
    staticDir: path.join(__dirname, 'dist'),
    routes: ['/', '/documentation'],
    renderer: new PrerenderSpaPlugin.PuppeteerRenderer({
      // We need to inject a value so we're able to
      // detect if the page is currently pre-rendered.
      inject: {},
      // Our view component is rendered after the API
      // request has fetched all the necessary data,
      // so we create a snapshot of the page after the
      // `data-view` attribute exists in the DOM.
      //renderAfterElementExists: '[data-view]',
    }),
  }),
];

module.exports = {
  configureWebpack: (config) => {
    if (process.env.NODE_ENV === 'production') {
      config.plugins.push(...productionPlugins);
    }
  }
}
