Webpack dev server hot mode not working - javascript

Here's my config:
devServer: {
  contentBase: '/web/dist/',
  hot: true,
  stats: { colors: true },
  inline: true
}
And here's the gulp task I'm running:
gulp.task('build', ['clean', 'styles', 'bower', 'media', 'data', 'homepage'], function (done) {
  es6promise.polyfill();
  console.log('STARTING DEV SERVER...');
  server = new WebpackDevServer(webpack(webpackDevConfig), webpackDevConfig.devServer);
  server.listen(8080, '0.0.0.0', function (err, stats) {
    if (err) {
      throw new gutil.PluginError("webpack-dev-server", err);
    }
    console.log('DEV SERVER STARTED');
    done();
  });
});
Everything works as expected except the hot loading (no refresh or change when I make changes to files). What am I doing wrong here?

You need to add <script src="http://localhost:8080/webpack-dev-server.js"></script> to your index.html. It is not added automatically when you use the API:
"Notice that webpack configuration is not passed to WebpackDevServer API, thus devServer option in webpack configuration is not used in this case. Also, there is no inline mode for WebpackDevServer API. <script src="http://localhost:8080/webpack-dev-server.js"></script> should be inserted to HTML page manually."
(http://webpack.github.io/docs/webpack-dev-server.html)
You may also need to add 'webpack/hot/dev-server' as an entry point in your webpack config.

Be sure to set
webpackConfig.plugins.push(new webpack.HotModuleReplacementPlugin());
in the webpackConfig as well
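Putting the two answers above together, here is a rough sketch of how the dev config might end up looking ('./src/index.js' and the output settings are placeholders, not taken from the question):

// Hypothetical webpackDevConfig sketch; adapt the entry and output paths to your setup.
var webpack = require('webpack');

var webpackDevConfig = {
  entry: [
    'webpack/hot/dev-server', // accepts the hot updates pushed by the dev server
    './src/index.js'          // your existing entry point (placeholder)
  ],
  output: {
    path: '/web/dist/',
    filename: 'bundle.js'
  },
  plugins: [
    new webpack.HotModuleReplacementPlugin()
  ],
  devServer: {
    contentBase: '/web/dist/',
    hot: true,
    stats: { colors: true },
    inline: true
  }
};

module.exports = webpackDevConfig;

Since the server is started through the API, the inline option in devServer is ignored, which is why the script tag mentioned above has to be added to the page by hand.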

If you are using Redux, you can try this.
For some reason, redux-devtools was not allowing hot reload for me. Try removing it from the root component and from the Redux compose config.
Note: use the Redux DevTools browser extension instead, with this in your store configuration: window.devToolsExtension ? window.devToolsExtension() : f => f
Also, this is a must-read: https://medium.com/@rajaraodv/webpacks-hmr-react-hot-loader-the-missing-manual-232336dc0d96#.ejpsmve8f
Or try React Hot Loader 3:
example: https://github.com/gaearon/redux-devtools/commit/64f58b7010a1b2a71ad16716eb37ac1031f93915
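For reference, a minimal sketch of the store setup described above (the reducer and middleware names are placeholders; adapt it to your own store configuration):

// Hypothetical store configuration: the DevTools browser extension is wired in
// through compose instead of mounting redux-devtools in the root component.
import { createStore, applyMiddleware, compose } from 'redux';
import rootReducer from './reducers'; // placeholder for your root reducer

const enhancer = compose(
  applyMiddleware(/* your middleware here */),
  window.devToolsExtension ? window.devToolsExtension() : f => f
);

export default createStore(rootReducer, enhancer);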

Related

Write to disk option for Vite

I recently started working with Vite on a couple of small projects and found it very interesting; however, I hit a blocker once I tried it on a coupled ExpressJS + Svelte project.
I usually use Express as a BFF (Backend For Frontend) on more serious projects, since it lets me use HTTPOnly cookies as well as a proxy gateway for the frontend. However, for development (especially when OAuth2 is involved) it is hard to develop the SPA separately from the server, so what I usually do with webpack is enable the writeToDisk option of the dev server, which gives me a development build in the dist folder.
With webpack, the frontend config looks something like this:
module.exports = {
  devServer: {
    devMiddleware: {
      writeToDisk: true,
    },
  },
  //...
}
and then on the server, basically serving dist as a static folder:
app.get(
  "*",
  (req, res, next) => {
    if (req.session.isAuth) return next();
    else return res.redirect(staticURL);
  },
  (req, res) => {
    return res.sendFile(staticProxyPage());
  }
);
My problem
I cannot find any API in Vite's documentation to do something like this. Does anyone have experience with such cases?
If it is possible with the help of plugins, can you please provide references to the plugin or its dev logs?
Many thanks :)
Here is an example plugin I was able to hack together. You might need to modify it to suit your needs:
// https://vitejs.dev/guide/api-plugin.html#universal-hooks=
import {type Plugin} from 'vite';
import fs from 'fs/promises';
import path from 'path';

const writeToDisk: () => Plugin = () => ({
  name: 'write-to-disk',
  apply: 'serve',
  configResolved: async config => {
    config.logger.info('Writing contents of public folder to disk', {timestamp: true});
    await fs.cp(config.publicDir, config.build.outDir, {recursive: true});
  },
  handleHotUpdate: async ({file, server: {config, ws}, read}) => {
    if (path.dirname(file).startsWith(config.publicDir)) {
      const destPath = path.join(config.build.outDir, path.relative(config.publicDir, file));
      config.logger.info(`Writing contents of ${file} to disk`, {timestamp: true});
      await fs.access(path.dirname(destPath)).catch(() => fs.mkdir(path.dirname(destPath), {recursive: true}));
      await fs.writeFile(destPath, await read());
    }
  },
});
In short, it copies the contents of the publicDir (public) to the outDir (dist).
Via the handleHotUpdate hook it will also re-write a file whenever the corresponding file within the public folder changes, so the public folder does not need to be watched separately.
Note: the use of fs.cp relies on an experimental API from Node 16.
Feel free to modify as you please. It would be nice to be able to specify file globs to include/exclude, etc.
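If it helps, this is roughly how I'd expect the factory to be registered, assuming it is exported from a local module (the file path below is made up):

// vite.config.js — hypothetical registration of the write-to-disk plugin above,
// assuming the factory is exported (e.g. `export default writeToDisk`) from a local file.
import { defineConfig } from 'vite';
import writeToDisk from './plugins/write-to-disk';

export default defineConfig({
  plugins: [writeToDisk()],
});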

publicPath does not work but __webpack_public_path__ does

I am building a single-spa application and am running into problems with deployment.
I am deploying each app inside a subdirectory (app-bar-mf, products-mf, and so on), and root-config (the main application) at the base.
When working locally, each app is served through its own server (localhost:9000, localhost:9001, and so on), so there are no subdirectories.
When I try to deploy, the publicPath is not used, and assets are requested from "/img/foo.png" instead of "/products-mf/img/foo.png".
If I set __webpack_public_path__ in my main.js, everything works as expected.
Any clues?
vue.config.js
module.exports = {
  publicPath: process.env.NODE_ENV === "production" ? "/products-mf/" : "/",
  configureWebpack: () => {
    const conf = {
      externals: [
        "vue",
        "vuex",
        "vue-router",
        "vue-i18n"
      ]
    };
    if (process.env.NODE_ENV === "production") {
      conf.output = { filename: "products.js" };
    }
    return conf;
  },
  filenameHashing: false,
  transpileDependencies: ["vuetify"]
};
In single-spa projects, we use systemjs-webpack-interop to set __webpack_public_path__ dynamically, based on the URL the microfrontend is hosted at. This has worked well for hundreds of single-spa applications, so you could consider using the same logic. It is included in Vue projects generated by create-single-spa.
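For reference, a sketch of what that typically looks like: a small set-public-path.js imported at the very top of main.js (the module name below is a placeholder for whatever this microfrontend is registered as in your import map):

// set-public-path.js — hypothetical sketch; "@org/products-mf" stands in for the
// name this microfrontend is registered under in the SystemJS import map.
import { setPublicPath } from "systemjs-webpack-interop";

// Sets __webpack_public_path__ at runtime from the URL SystemJS resolved this module from.
setPublicPath("@org/products-mf");

main.js then starts with import "./set-public-path"; before anything else.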

Sapper/Svelte.js - How to specify client-side assets location?

I have a Sapper.js application that I have successfully running on AWS Lambda. Lambda is able to deliver the server-side generated HTML created by Sapper to AWS API Gateway which then serves the app to the user. I am using S3 to host the client side assets (scripts, webpack chunks, etc). The S3 bucket is on a different domain than API Gateway.
The issue I'm having is that I need to set an asset prefix for these scripts so that Sapper can find them. Currently all of my client side scripts include relative links and look like this: <script src="/client/be33a1fe9c8bbaa6fa9d/SCRIPT_NAME.js"></script> I need to have them look like this: <script src="https://AWS_S3_BUCKET_ENDPOINT.com/client/be33a1fe9c8bbaa6fa9d/SCRIPT_NAME.js"></script>
Looking in the Sapper docs, I see that I can specify a base url for the client and server. However, changing this base url breaks my app and causes the Lambda rendering the pages to return 404 errors.
I know that when using, say, Next.js, I can accomplish this by modifying the next.config.js file to include the following:
module.exports = {
  assetPrefix: "https://AWS_S3_BUCKET_ENDPOINT.com/client",
}
But I don't know how to do this in Sapper. Do I need to modify the bundler (using webpack) config? Or is there some other way?
Thank you.
I think I've figured it out.
I had to change two sapper files. First I went into sapper/dist/webpack.js and modified it like so:
'use strict';

var __chunk_3 = require('./chunk3.js');

var webpack = {
  dev: __chunk_3.dev,

  client: {
    entry: () => {
      return {
        main: `${__chunk_3.src}/client`
      };
    },
    output: () => {
      return {
        path: `${__chunk_3.dest}/client`,
        filename: '[hash]/[name].js',
        chunkFilename: '[hash]/[name].[id].js',
        // change this line to point to the s3 bucket client key
        publicPath: "https://AWS_S3_BUCKET_ENDPOINT.com/client"
      };
    }
  },

  server: {
    entry: () => {
      return {
        server: `${__chunk_3.src}/server`
      };
    },
    output: () => {
      return {
        path: `${__chunk_3.dest}/server`,
        filename: '[name].js',
        chunkFilename: '[hash]/[name].[id].js',
        libraryTarget: 'commonjs2'
      };
    }
  },

  serviceworker: {
    entry: () => {
      return {
        'service-worker': `${__chunk_3.src}/service-worker`
      };
    },
    output: () => {
      return {
        path: __chunk_3.dest,
        filename: '[name].js',
        chunkFilename: '[name].[id].[hash].js',
        // change this line to point to the s3 bucket root
        publicPath: "https://AWS_S3_BUCKET_ENDPOINT.com"
      };
    }
  }
};

module.exports = webpack;
//# sourceMappingURL=webpack.js.map
Then I had to modify sapper/runtime/server.mjs so that the main variable points to the bucket like so:
...
const main = `https://AWS_S3_BUCKET_ENDPOINT.com/client/${file}`;
...
Testing with the basic sapper webpack template, I can confirm that the scripts are loading from the s3 bucket successfully. So far this all looks good. I will mess around with the sapper build command next so I can pass these hacks in as command-line arguments and don't have to hardcode them every time.
Now, I'm not sure if this will hold up as the app becomes more complicated. Looking into the sapper/runtime/server.mjs file, I see that the req.baseUrl property is referenced in several different locations and I don't know if my hacks will cause any issues with this. Or anywhere else in sapper for that matter.
If anyone with more experience with the Sapper internals is reading, let me know in the comments if I screwed something up 👍

Do I remove webpack-dev-server and hot-module middleware code during production build, or when I get ready for production?

As the title suggests, when I'm ready to host the code for production, should I remove all usages of webpack-dev-middleware and webpack-hot-middleware from my server code as they are dev-dependencies? What's the best way to set this up so maybe I don't have to worry about this?
This is a snapshot of my server code:
// webpack -> HMR
const webpack = require("webpack");
const webpackConfig = require("../webpack.config");
const compiler = webpack(webpackConfig);

// webpack HMR init
app.use(
  require("webpack-dev-middleware")(compiler, {
    noInfo: false,
    publicPath: webpackConfig.output.publicPath,
  })
);
app.use(require("webpack-hot-middleware")(compiler));

...

app.get("/", async (req, res) => {
  const initialContent = await serverRender();
  res.render("index", {
    ...initialContent,
  });
});

app.listen(port, () => {
  console.log(`Express application listening on port ${port}`);
});
You should wrap your HMR code (or really, any development/environment-specific setting) in its own area. I wouldn't recommend taking it out of your code, as you may come back to the application and want to update something. Having HMR is a pretty nice luxury, so I would just have your code sniff out the environment and, if it's development, run the associated code. Otherwise, don't run it.
How do you detect the environment in an express.js app?
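For example, a minimal sketch that gates the middleware from the snippet above on NODE_ENV (Express also exposes this via app.get('env')):

// Only wire up webpack-dev-middleware / webpack-hot-middleware outside production;
// in production the pre-built bundle is served instead.
if (process.env.NODE_ENV !== "production") {
  const webpack = require("webpack");
  const webpackConfig = require("../webpack.config");
  const compiler = webpack(webpackConfig);

  app.use(
    require("webpack-dev-middleware")(compiler, {
      noInfo: false,
      publicPath: webpackConfig.output.publicPath,
    })
  );
  app.use(require("webpack-hot-middleware")(compiler));
}

Keeping both packages in devDependencies then works fine, since the require calls never run in production.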

cannot GET error 404 on reload?

My React Router setup works fine in the dev environment; this is what I did in the webpack dev server config:
historyApiFallback: {
  index: 'index.html',
}
In production mode I wanted to do the same, so I did it in Express like this:
const indexPath = path.join(__dirname, '../public/index.html')
const publicPath = express.static(path.join(__dirname, '../public'))

app.use('/public', publicPath)
app.use('/graphql', graphQLHTTP(async (req) => {
  let { user } = await getUser(req.headers.authorization);
  if (!user) {
    user = 'guest'
  }
  return {
    schema,
    pretty: true,
    graphiql: true,
    context: {
      user,
    }
  }
}));
app.get('/', function (_, res) { res.sendFile(indexPath) });
I did not change anything with react-router-dom, so I am assuming the error is in my Express config. What's the equivalent of historyApiFallback in production mode? Below is my webpack bundle config:
output: {
  path: path.join(__dirname, 'public'),
  filename: 'bundle.js',
  publicPath: '/public/'
},
In my HTML I reference the bundle like this:
<script type="text/javascript" src="/public/bundle.js"></script>
I think I have the right config, but when I reload I get a "Cannot GET" 404 error.
You should add this line to your app:
app.get('*', function (_, res) { res.sendFile(indexPath) });
Or, better, you could use this package: https://github.com/bripkens/connect-history-api-fallback
You should read more about history mode:
To get rid of the hash, we can use the router's history mode, which leverages the history.pushState API to achieve URL navigation without a page reload:
When using history mode, the URL will look "normal," e.g. http://oursite.com/user/id. Beautiful!
Here comes a problem, though: Since our app is a single page client-side app, without a proper server configuration, the users will get a 404 error if they access http://oursite.com/user/id directly in their browser. Now that's ugly.
Not to worry: To fix the issue, all you need to do is add a simple catch-all fallback route to your server. If the URL doesn't match any static assets, it should serve the same index.html page that your app lives in. Beautiful, again!
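For completeness, here is a minimal, self-contained sketch using connect-history-api-fallback (directory names mirror the question; the port and exact middleware order are assumptions to adapt to your setup):

const path = require('path');
const express = require('express');
const history = require('connect-history-api-fallback');

const app = express();
const publicDir = path.join(__dirname, '../public');

// Mount API routes such as /graphql here, before the fallback, so they are not rewritten.

// Rewrite "deep link" GETs like /user/id to /index.html...
app.use(history({ index: '/index.html' }));
// ...then let the static middleware serve index.html and the bundle.
app.use(express.static(publicDir));
app.use('/public', express.static(publicDir)); // keeps /public/bundle.js working

app.listen(3000);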
