I may have a basic misunderstanding of how things work, so bear with me.
Project
I have built a small website with NPM, Webpack, React, and react-router. The site has six content-only pages generated by React components. Each page is served up through react-router, meaning I can go to localhost:8080/page-2 directly and be shown the page meant for that URL.
Issue
It works well, but the problem is that I can only get it to work while running a server locally, set up using the technique described here.
I tried running the site with http-server to see if I could pass my project off to my client to run on whatever server he uses. This needs to be a pretty hands-off launch, but my fear is that my client's server needs to be of a certain type, or needs to be configured in a specific way.
I understand I may have built this incorrectly. Is there a good way to build React (with react-router) sites in the future so that launching is a simple drag-and-drop process?
Issue
The thing causing your issue is react-router. If you don't use any routing, the server simply gives you index.html, and then with React (JavaScript) you handle whatever static or dynamic content you wish.
When you use react-router for routing, it pushes a new URL into your browser every time a link is clicked. The server has no idea that you are running a client (browser) app, and it tries to serve the URL/directory it was asked for.
That means that whatever form of launch you choose, it must either take care of this, or you must tell the server to serve index.html no matter what the current URL is. There are many ways to do that; for example, in my Node app I have:
const path = require('path');

// Serve index.html for every route so the client-side router can take over
app.get('*', function(req, res) {
  res.sendFile(path.join(__dirname, 'index.html'));
});
Summary
The server doesn't know it's a client-side app. It tries to serve files/URLs that don't exist. You must tell it to serve index.html at all times.
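If the launch really needs to be hands-off, one option is to ship a tiny standalone server next to the build so the client only has to run one command. A minimal sketch, assuming the Webpack output lives in a dist folder next to this file and using Express (the folder name and port are illustrative):

const express = require('express');
const path = require('path');

const app = express();

// Serve the built static assets (bundles, CSS, images)
app.use(express.static(path.join(__dirname, 'dist')));

// Fall back to index.html for every other URL so react-router can take over
app.get('*', function(req, res) {
  res.sendFile(path.join(__dirname, 'dist', 'index.html'));
});

app.listen(8080, function() {
  console.log('Site running at http://localhost:8080');
});

Any other server that can do the same fallback (Apache, nginx, or a static host with rewrite rules) will work just as well.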
Related
I have a Vue.js SPA which I want to deploy on different servers without recompiling the frontend for each deploy. The SPA connects to a backend whose URL is not yet known to the SPA. Is there a way I can dynamically tell the frontend at runtime where the backend is?
Many articles and forum threads suggest just using different config files for different environments and building them in at build time, but that is not an option for me because I simply don't know where the frontend/backend will be deployed when building it.
EDIT: The project is an open source project, so I can't really make assumptions about how people will deploy it. I've always kind of "assumed" it would be deployed on a separate subdomain, with the frontend being reachable at / and the backend behind a proxy at /api, because that's how I set up my server.
However, I've seen people deploying the API at a totally different subdomain (sometimes even with different ports) than the frontend, or on a sub-path, or even a mixture of the two.
Things I've considered so far:
1. Putting the config in a conf.js which would expose the backend URL via window.config.backendUrl or similar, and loading that file in my index.html from a <script> tag.
2. Slightly similar to 1: putting the config in a config.json, making a fetch request to it once the application has loaded, then exposing the result in window.config.backendUrl.
3. Inserting a <script>window.config.backendUrl = 'http://api.example.com'</script> in my index.html.
4. Serving the frontend with a custom-made web server (based on Express or similar) which parses either env variables or a separate config file and then creates the <script> tag from 3 dynamically.
5. Always "assuming" where the backend will be, with some kind of list to work through, like "first look at /api, then at ./api, then at api.current-host.com", etc.
6. Bundling the frontend with the backend - that way I would always "know" where the backend is relative to the frontend.
Yet all of these options seem a bit hacky to me; I think there has to be a better way.
My favourite is the third option, because it is the best trade-off between configurability and performance IMHO.
If I were in the same situation, I would consider the following two approaches:
1. Deploying a JSON file with the same name but different contents for each environment - the frontend can always fetch its configuration by making an AJAX call to /config.json, but the file depends on where you deploy and is generated during the deployment step.
2. Using a single API endpoint with a fixed/constant URL (completely separate from your backends) - the frontend always calls this endpoint at startup to get its configuration, including the actual URL of the corresponding backend for its further operation.
Basically, (2) is just the dynamic version of the static configuration in (1); a sketch of the client side of (1) follows below.
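As an illustration of approach (1) from the client's point of view, here is a minimal sketch, assuming a Vue 3 app and a /config.json that is generated during deployment (the backendUrl key is illustrative):

// main.js - fetch the deploy-specific config before mounting the app
import { createApp } from 'vue';
import App from './App.vue';

fetch('/config.json')
  .then((response) => response.json())
  .then((config) => {
    // e.g. { "backendUrl": "https://api.example.com" }
    window.config = config;
    createApp(App).mount('#app');
  });

Delaying the mount until the config has arrived avoids a race between the first API call and the configuration becoming available.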
I am very used to the approach where SSR means that the page gets a full refresh and receives full HTML from the server, rendered with Razor/Pug/whatever else depending on the backend stack. So every time the user clicks a navigation link, the browser sends a request to the server and the whole page refreshes, receiving new HTML. That is the traditional SSR which I understand.
With an SPA, however, we have for example React or Angular, where we initially receive an almost empty HTML document and then the JS, so that the whole app gets initialized on the client side. We can then use some REST API to get JSON data and render views on the frontend (client-side routing and rendering) without any page refresh. Beyond serving the static files, we don't even need a server, really.
Now, what I have a problem understanding is how SSR (such as Next.js) works with React.
From what I am reading, the first request returns full HTML+CSS (which helps with SEO etc. - I get that), but what happens later? What happens after that first/initial request? Does the whole React app initialize in the browser and then behave EXACTLY as if it were a normal SPA (meaning we have client-side routing and rendering from then on, without any need to make requests to that server)? In other words, does Next.js still make any server requests after the initial one, or does it act like a typical CRA SPA from then on?
I have spent a lot of time reading, but the articles mainly focus on the initial request, SEO, time to first byte, first paint, etc., and I am simply trying to understand why it's called SSR, since it seems to work differently from the traditional SSR I described at the beginning.
does next.js still make any server requests after the initial one, or does it act like a typical SPA with CRA from now on?
You got it right. The first (initial) request is handled by the server and after that the frontend handles the routing (at least in the case of Next.js).
If you want to see an example, OpenCollective is built with Next.js. Try playing around with it and watch the Network tab in the DevTools.
I am simply trying to understand why its called SSR since it seems to work different than the traditional SSR which I described on the beginning.
It is called SSR because the app is effectively rendered on the server. The fact that frontend routing takes over after the initial render doesn't change the fact that the server did the work of rendering the app, as opposed to the user's machine.
That's not all that happens with Next.js, though. With Next.js you can build something called hybrid apps.
In traditional SSR, all of your client requests are handled by the server. Every request goes to the server and gets a response.
In classic CSR with something like React, as you said, everything happens in the browser via client-side JavaScript.
But in Next.js you can define three different approaches (mainly two, according to the docs) for how pages get delivered.
Based on the app's needs and requirements, you can serve some pages in pure traditional SSR mode, some in classic CSR mode, and some in SSR mode with dynamic data that gets fetched and rendered into the pages on the fly.
These features bring a lot of flexibility for designing a web app that behaves well in every scenario you need.
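A minimal sketch of what that per-page choice looks like, assuming the pages/ directory and the data-fetching functions described in the Next.js docs (the endpoint and page contents are illustrative):

// pages/ssr-page.js - rendered on the server for every request (traditional SSR).
// A page exporting getStaticProps instead is pre-rendered once at build time,
// and a page exporting neither behaves like classic CSR once the shell loads.
export async function getServerSideProps() {
  const res = await fetch('https://api.example.com/latest'); // hypothetical API
  return { props: { data: await res.json() } };
}

export default function SsrPage({ data }) {
  return <pre>{JSON.stringify(data, null, 2)}</pre>;
}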
I'm working on a Next.js app that needs to fetch a config file from a remote server before initializing. I want to request the config file just once per call to the server before rendering the app server side. After that, I would like to be able to get the same config in the client without having to make a second request to the remote server from the browser.
I have tried to achieve this by using the getInitialProps function in either the _app or _document file and then using React's Context API to make the configuration visible to every component, but unless I'm wrong, this will run the code that requests the configuration both on the server (on the first call from the browser) and on the client (on every page navigation).
I have also tried to create a server.js file, request the configuration from there, and store it in a variable inside an ES6 module. However, I couldn't make this approach work because apparently the Next.js React app can't access the same modules as server.js, since they are actually two different apps. Again, I could be wrong.
Basically, I would like to know if Next.js offers any kind of "bootstrapping place" where I can perform app initialization tasks that generate data that can be passed on to the React app Next.js will initiate.
You are correct in that anything in getInitialProps in _app.js or _document.js will be run on every server request. Once you get your config, you can pass it down to the components in pages as props. From there, I think you have two choices:
1. Now that the app is bootstrapped, run the rest of the app as a client-side SPA. This would prevent any future SSR from happening. Obviously, you lose the benefits of SSR and the initial page load will likely be longer, but the app would be snappy afterwards.
2. After getting the config in _app.js, send it as a cookie (assuming it's not too big to be a cookie). On future requests to the server, the cookie is sent automatically and you would check it first - if it doesn't exist, get the config; if it does exist, skip the more expensive bootstrapping because the app is already bootstrapped (a sketch of this follows below).
So I think it really depends on whether you want a single-page application bootstrapped on the server but entirely client-side after that (option 1), or server-side rendering per page while minimizing expensive bootstrapping (option 2).
Nothing prevents you from sending multiple cookies from the server if that makes sense for your app and bootstrapping. And remember not to make it an HTTP-only cookie, because you'll want to read that cookie client-side - after all, that's what you're looking for: the client-side configuration generated on the server.
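A minimal sketch of option 2, assuming a Next.js version that polyfills fetch on the server; the cookie name, CONFIG_URL, and the simplified cookie parsing are illustrative, not part of the Next.js API:

// pages/_app.js
import App from 'next/app';

const CONFIG_URL = 'https://config.example.com/config.json'; // hypothetical

function MyApp({ Component, pageProps, config }) {
  return <Component {...pageProps} config={config} />;
}

MyApp.getInitialProps = async (appContext) => {
  const appProps = await App.getInitialProps(appContext);
  const { req, res } = appContext.ctx;

  // req/res only exist on the server; in the browser read document.cookie instead
  const cookieHeader = req ? req.headers.cookie || '' : document.cookie;
  const match = cookieHeader.match(/(?:^|;\s*)app-config=([^;]+)/);

  let config;
  if (match) {
    // Cheap path: the app is already bootstrapped, reuse the cached config
    config = JSON.parse(decodeURIComponent(match[1]));
  } else {
    // Expensive path: fetch the config, then cache it in a (non HTTP-only) cookie
    config = await (await fetch(CONFIG_URL)).json();
    if (res) {
      res.setHeader(
        'Set-Cookie',
        'app-config=' + encodeURIComponent(JSON.stringify(config)) + '; Path=/'
      );
    }
  }

  return { ...appProps, config };
};

export default MyApp;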
Although we ended up leaving Next.js for this and other reasons that made us decide it was not the best option for our use case, we mitigated this problem as follows while we stuck with it; I hope it makes sense to anyone. Also, by now Next.js may provide a better way to do this, so I would read the Next.js docs before using my approach. Final note: I no longer have access to the code since I changed to a different company, so some points may not be 100% exactly as we implemented them.
Here it goes:
1. We created a module responsible for requesting the config file and keeping the result in a variable (sketched below). At the moment of importing this module, we check whether the config is already present in window.__NEXT_DATA__. If it is, we recover it from there; if it's not, we request it from the remote server (this is what helps on the client-side rendering path).
2. We created a server.js file as described in the Next.js docs. In this file we make the call to get the config file and store it in memory.
3. In the body of the function passed to createServer, we add the config file to the req object to make it accessible to the app in the getInitialProps functions on the server side.
4. We made sure that every getInitialProps using the config file returns it, so that it is passed to the components as props and also serialized by Next.js and made available to the client in the __NEXT_DATA__ global JavaScript variable.
5. Given that the config ends up in the __NEXT_DATA__ variable on the server, the trick described in step 1 keeps the app from requesting the config a second time.
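For step 1, a minimal sketch of what that module might look like (this is illustrative, not the original code; the exact location of the config inside __NEXT_DATA__ depends on which getInitialProps returns it and on the Next.js version):

// config.js
const CONFIG_URL = 'https://config.example.com/config.json'; // hypothetical

let config;

export async function getConfig() {
  if (config) return config;

  if (typeof window !== 'undefined' && window.__NEXT_DATA__ &&
      window.__NEXT_DATA__.props && window.__NEXT_DATA__.props.config) {
    // Client side: reuse the config the server already serialized for us
    config = window.__NEXT_DATA__.props.config;
  } else {
    // Server side (or missing data): request it from the remote server
    config = await (await fetch(CONFIG_URL)).json();
  }
  return config;
}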
Hello, I am fairly new to Electron but have been developing web apps using Express. I am building a desktop app and I have an index.html page with a simple login form. I understand that in Express I can do validation and redirect to the correct route depending on the result of the validation. How can I have the same functionality in Electron? Another thing: I don't want to create another browser window, I just want the paths to redirect and render HTML pages in the same browser window. Thanks
Electron is a very flexible combination of Node and Chromium, with some added secret sauce of its own API.
You have a lot of options available to you.
One of the biggest points to realize with Electron is that it's possible to develop offline apps that don't require an online back end like you're probably used to doing. This means you can choose to run Express inside of Electron, handling routing and doing its usual job. This would mean Express is running on the PC or Mac of your end user, instead of running on a hosted server somewhere on the Internet.
As an Express developer, this might be a good way for you to initially get things done quickly. You can install Express into your Electron app (npm install express --save).
This way you can run Express within Electron, allowing you to continue to work in many of the same ways you're already used to. It won't be exactly the same - as you're already seeing, you'll need to learn to manage browser windows and other Electron concepts along the way. There are also some limitations and workarounds, since Express normally runs on a back-end server at a hosting provider.
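A minimal sketch of that setup, assuming the static pages live in a public folder; the port, file names, and the placeholder validation are all illustrative:

// main.js - Electron main process running an Express server internally
const { app: electronApp, BrowserWindow } = require('electron');
const express = require('express');
const path = require('path');

const server = express();
server.use(express.urlencoded({ extended: true }));
server.use(express.static(path.join(__dirname, 'public')));

// Validate the login form and "redirect" within the same window
server.post('/login', (req, res) => {
  const ok = req.body.username === 'admin'; // placeholder validation
  res.redirect(ok ? '/home.html' : '/index.html');
});

electronApp.whenReady().then(() => {
  server.listen(3000, () => {
    const win = new BrowserWindow({ width: 800, height: 600 });
    win.loadURL('http://localhost:3000/index.html');
  });
});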
There are pointers for how to get started here: NodeJS Electron with express or you can Google for "building apps with Electron and Express".
You'll need to start wrapping your head around Electron specific concepts though so plan to do some reading or courses on Electron.
There's a really great list of Electron related learning and other resources here:
https://github.com/sindresorhus/awesome-electron#videos
Update: I realised I didn't address some parts of your question specifically, so:
To validate the form, you can choose to do it the way you are used to (probably by posting the form to Express and having some logic run), or maybe using a script running on the actual page.
To redirect to a specific path in Electron, you have many options. Express routing could still work for you, or you can load a specific file using loadFile on the Electron BrowserWindow (or its webContents) object - you'll probably need to do some reading on the Main and Renderer processes to understand this nicely.
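For the loadFile route, a minimal sketch of swapping the page shown in the current window (the file name is illustrative):

// In the main process, after validation succeeds
const { BrowserWindow } = require('electron');

const win = BrowserWindow.getFocusedWindow();
win.loadFile('pages/dashboard.html'); // replaces the page in the same window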
Enjoy developing with Electron and good luck!
It's more or less about communication between ipcMain and ipcRenderer.
As you have an understanding of web development, you can think of ipcMain as the backend (Express) and ipcRenderer as the client (browser).
However, the difference is that you communicate between ipcMain and ipcRenderer by emitting events, rather than over the network (AJAX calls and so on) as you do between a backend and a frontend.
To validate data on the "Express side", emit an event from ipcRenderer so that ipcMain listens for it and performs the validation. After that, emit an event back to ipcRenderer, and once ipcRenderer receives the validation result, handle it.
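A minimal sketch of that round trip; the channel names and placeholder validation are illustrative, and the renderer side assumes it can access ipcRenderer directly (nodeIntegration enabled or exposed through a preload script):

// main.js (main process)
const { ipcMain } = require('electron');

ipcMain.on('validate-login', (event, credentials) => {
  const valid = credentials.username === 'admin'; // placeholder validation
  event.reply('validate-login-result', { valid });
});

// renderer.js (renderer process)
const { ipcRenderer } = require('electron');

ipcRenderer.send('validate-login', { username: 'admin', password: 'secret' });
ipcRenderer.on('validate-login-result', (event, { valid }) => {
  if (valid) {
    window.location.href = 'home.html'; // swap pages in the same window
  }
});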
The easiest way to handle redirecting/rendering HTML pages in the same browser window is to develop a single-page app inside your Electron app using HTML5 routing. Here is a good boilerplate to kick off from if you are familiar with React and react-router: https://github.com/electron-react-boilerplate/electron-react-boilerplate
I'm building a reactjs based website that others will be deploying. It takes the form of a single page app with URL routing /#like=this and the final websites will be content rich. All of the content needs to be visible to search engine bots. Is there a way to do this (even a hacky one) that doesn't require isomorphic server-side rendering? In particular, I can't expect the end users to be able to serve pages with node/express.
Is there a way to do this (even a hacky one) that doesn't require isomorphic server-side rendering?
No
In particular, I can't expect the end users to be able to serve pages with node/express.
Node.js has nothing to do with this. The client does not know (or care) where the content was rendered; what matters to a search engine bot is that the HTML it receives already contains the content, and that is exactly what server-side rendering provides.