fastify-vite is changing rapidly and is now part of a bigger endeavour dubbed Fastify DX.
Subscribe to this newsletter to follow Fastify DX news.
The 2.x release line has been deprecated and is no longer maintained. Find the legacy documentation here.
Below is the README for the upcoming 3.x release line, currently in beta.
This plugin lets you load a Vite client application and set it up for Server-Side Rendering (SSR) with Fastify.
It is focused on architectural primitives rather than framework-specific features.
It automates a few aspects of the setup, such as:
- Compiling your Vite application's index.html into a templating function for page-level setup.
- Toggling Vite's development server on and off, i.e., running in development or production mode.
- Integrating routing at the client level (History API-based) with Fastify server-side routing.
This README contains all the documentation. Also see the working examples/.
The late 2010s saw the dawn of the age of the SSR framework. Server-side rendering (SSR) is complex and often requires a great deal of preparation to get right — starting from the fact that people (to this date!) still disagree on what SSR actually is[1] — so specialized frameworks started appearing to meet the inevitable demand for tools that spared developers the boilerplate work and let them jump straight into their application code, without worrying about underlying implementation details.
[1] SSR in this context refers to the server-side rendering of client-side JavaScript to produce on the server the same markup that is dynamically rendered by the browser, so client-side JavaScript doesn't have to spend time rendering the same fragment twice.
First came Next.js (React) and Nuxt.js (Vue) back in 2016, and in recent times, SvelteKit (Svelte) and Remix (React). There are many others, but presently these are the ones that have amassed the largest user bases.
Between 2018 and 2020 I was a core contributor to Nuxt.js and acquired a deep understanding of the complexities and challenges involved.
At some point in between debugging server integration and SSR performance issues in my Nuxt.js applications, it occurred to me that for optimal performance, safety and flexibility, frameworks would be better off building on top of Fastify rather than trying to incorporate their own backend mechanics with built-in Express-like servers.
That's when I started working on fastify-vite, a Fastify plugin to integrate with Vite-bundled client applications. At least Nuxt.js and SvelteKit seem to agree that building on top of Vite is a good idea — the Vite ecosystem is a solid base for addressing a lot of core, foundational aspects of frameworks, not only bringing a lot of flexibility to the build process (through Vite plugins), but also providing developer experience features such as hot module reload.
After many iterations, fastify-vite evolved into a highly configurable approach for integrating Vite within Fastify applications. Focusing now on architectural primitives, such as dependency injection and route registration, it's conceivably possible to reimplement any framework with it. To demonstrate this level of flexibility, I reimplemented two essential Next.js features for both React and Vue.
“Simplicity is a great virtue but it requires hard work to achieve it and education to appreciate it. And to make matters worse: complexity sells better.” ― Edsger W. Dijkstra
The one thing fastify-vite doesn't do is provide an out-of-the-box API for route modules to control the HTML shell, rendering and data fetching aspects of an individual web page. It provides you with an API to implement your own. That's an area that will be addressed by the upcoming Fastify DX toolset.
npm i fastify-vite --save
First you need to import the fastify-vite Vite plugin (fastify-vite/plugin) in your vite.config.js file:
import { join, dirname } from 'path'
// Import other plugins
import viteFastify from 'fastify-vite/plugin'
export default {
root: join(dirname(new URL(import.meta.url).pathname), 'client'),
plugins: [
// Register other plugins
viteFastify()
]
}
Note that __dirname isn't available in ES modules, which is why we derive it from import.meta.url.
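If you prefer a more explicit equivalent, you could derive a __dirname of your own with Node's fileURLToPath helper — a minimal sketch, not something fastify-vite requires:
import { join, dirname } from 'path'
import { fileURLToPath } from 'url'
import viteFastify from 'fastify-vite/plugin'
// Derive a CommonJS-style __dirname from the module URL
const __dirname = dirname(fileURLToPath(import.meta.url))
export default {
  root: join(__dirname, 'client'),
  plugins: [viteFastify()]
}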
Next you need to tell fastify-vite whether or not it's supposed to run in development mode, in which case Vite's development server is enabled for hot reload — and also, where to load vite.config.js from (root):
import Fastify from 'fastify'
import FastifyVite from 'fastify-vite'
const server = Fastify()
await server.register(FastifyVite, {
dev: process.argv.includes('--dev'),
root: import.meta.url,
})
await server.vite.ready()
await server.listen(3000)
In this example, we're conditioning development mode on the presence of a --dev CLI argument passed to the Node.js process — it could just as well be an environment variable (see the sketch further below).
fastify-vite's default value for the dev configuration option is actually what you see in the snippet above: a CLI argument check for --dev. That's why you don't see it set in any of the examples/ — they're just following the convention.
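If you'd rather key development mode off an environment variable instead, a minimal sketch could look like this (the NODE_ENV check is just one possible convention, not something fastify-vite imposes):
await server.register(FastifyVite, {
  // Enable Vite's development server unless explicitly running in production
  dev: process.env.NODE_ENV !== 'production',
  root: import.meta.url,
})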
For setting root, fastify-vite is smart enough to recognize file URLs, so it parses and treats them as directory paths. In the snippet above, passing import.meta.url works the same as passing __dirname would in a CJS module.
As for awaiting on server.vite.ready(), this is what triggers the Vite development server to be started (if in development mode) and all client-level code to be loaded. This step is intentionally kept separate from the plugin registration, as you might need to wait on other plugins to be registered for them to be available in fastify-vite's plugin scope.
The project root of your Vite application is treated like a module, so by default, fastify-vite will try to load <project-root>/index.js. If you're coming from the SSR examples from the Vite playground, this is the equivalent of the server entry point.
This is why it's also recommended you keep your client application source code separate from server files. In the vite.config.js previously shown, the project root is set as client.
So in server.js, the root configuration option determines where your vite.config.js is located. But in vite.config.js itself, the root configuration option determines your project root in Vite's context. That's what's treated as a module by fastify-vite.
It's very important to understand those subtleties before getting started.
fastify-vite automatically decorates the Fastify Reply class with two additional methods, reply.render() and reply.html(). Let's talk about reply.render() first, and how to create it.
To understand this fully, let's examine examples/react-vanilla, an educational example demonstrating the absolute minimum glue code for making client-level code available for server-side rendering.
This basic example has the following structure:
├── client
│ ├── base.jsx
│ ├── index.html
│ ├── index.js
│ └── mount.js
├── package.json
├── server.js
└── vite.config.js
The first thing to remember is that fastify-vite treats your Vite project root as a JavaScript module, so it'll automatically look for index.js as the server entry point, that is, the module that gets bundled for production in SSR mode by Vite.
The React component to be server-side rendered is in client/base.jsx:
import React from 'react'
export function createApp () {
return (
<p>Hello world from React and fastify-vite!</p>
)
}
Next we have the client entry point, which is the code that mounts the React instance to the server-side rendered HTML element. It is aptly named client/mount.js:
import { hydrateRoot } from 'react-dom/client'
import { createApp } from './base.jsx'
hydrateRoot(document.querySelector('main'), createApp())
If we were to skip server-side rendering (also possible!) and go straight to client-side rendering, we'd use the createRoot() function from react-dom/client, but in this case, since we expect React to find readily available markup delivered by the server, we use hydrateRoot().
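For comparison, a client-only mount (no SSR) would look roughly like this — a sketch, not part of the react-vanilla example:
import { createRoot } from 'react-dom/client'
import { createApp } from './base.jsx'
// With no server-rendered markup to hydrate, we render from scratch
createRoot(document.querySelector('main')).render(createApp())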
Now, let's see client/index.js:
import { createApp } from './base.jsx'
export default { createApp }
All it does is make the createApp() function available to the server-side code. In order to create reply.render(), fastify-vite expects you to provide a createRenderFunction() function as a plugin option. This function receives as its first parameter the default export from your client module (client/index.js above).
Now the following snippet, server.js, will be easy to follow:
import Fastify from 'fastify'
import FastifyVite from 'fastify-vite'
import { renderToString } from 'react-dom/server'
const server = Fastify()
await server.register(FastifyVite, {
root: import.meta.url,
createRenderFunction ({ createApp }) {
return () => {
return {
element: renderToString(createApp())
}
}
}
})
await server.vite.ready()
await server.listen(3000)
As you can guess, the createApp value collected from the first argument passed to createRenderFunction() is coming from client/index.js. The render function then uses it to create a new instance of your app — in this case, the root React component — and passes it to renderToString() from react-dom/server.
renderToString() produces a string with the server-side rendered HTML fragment for your React component, which is then returned in an object as element. The only thing left to do in this example is manually specifying a route to call reply.render() from — but we also need to call reply.html():
server.get('/', (req, reply) => {
reply.html(reply.render())
})
That's what's required to get an SSR function for your Vite-bundled application and send the generated markup through a route handler — but there's a big question left to answer: how does that HTML fragment end up in index.html?
Let's shift attention to client/index.html now:
<!DOCTYPE html>
<main><!-- element --></main>
<script type="module" src="/mount.js"></script>
As per Vite's documentation, index.html is a special file made part of the module resolution graph. It's how Vite finds all the code that runs client-side.
When you run the vite build command, index.html is what Vite automatically looks for. Given this special nature, you probably want to keep it as simple as possible, using HTML comments to specify content placeholders. That's the pattern used across the official SSR examples from Vite's playground.
Before we dive into reply.html(), you should know fastify-vite packs a helper function that turns an HTML document with placeholders indicated by comments into a precompiled templating function:
import { createHtmlTemplateFunction } from 'fastify-vite'
const template = createHtmlTemplateFunction('<main><!-- foobar --></main>')
const html = template({ foobar: 'This will be inserted' })
By default, that function is used internally by the createHtmlFunction() configuration option, which is responsible for returning the function that is decorated as reply.html().
Here's how createHtmlFunction() is defined by default:
function createHtmlFunction (source, scope, config) {
const indexHtmlTemplate = config.createHtmlTemplateFunction(source)
return function (ctx) {
this.type('text/html')
this.send(indexHtmlTemplate(ctx))
}
}
You can see that default definition (and many others) in fastify-vite's internal config.js file.
Looking at the default createHtmlFunction() above, you can probably guess how the react-vanilla example works now. The result of render() is a simple object with variables to be passed to reply.html(), which uses the precompiled templating function based on index.html.
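In other words, the keys of the object returned by render() map to the comment placeholders in index.html. A sketch of that correspondence for react-vanilla (illustrative values, not literal library output):
const page = reply.render()
// => { element: '<p>Hello world from React and fastify-vite!</p>' }
reply.html(page)
// <main><!-- element --></main> becomes:
// <main><p>Hello world from React and fastify-vite!</p></main>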
In some cases, you'll want to provide your own createHtmlFunction() option through fastify-vite's plugin options. For instance, the vue-streaming example demonstrates a custom implementation that works with a stream instead of a raw string.
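To give an idea of how such an override plugs in, here's a minimal sketch of a custom createHtmlFunction() passed through the plugin options — it keeps the default templating behavior and merely adds a (hypothetical) response header to show where your own logic would go:
await server.register(FastifyVite, {
  root: import.meta.url,
  createHtmlFunction (source, scope, config) {
    // Same precompiled template as the default implementation
    const template = config.createHtmlTemplateFunction(source)
    return function (ctx) {
      // Illustrative extra step — any custom logic could live here
      this.header('x-rendered-by', 'fastify-vite')
      this.type('text/html')
      this.send(template(ctx))
    }
  },
})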
If you try to run any of the examples/ without the --dev flag, you'll be greeted with an error message:
% node server.js
/../node_modules/fastify-vite/mode/production.js:6
throw new Error('No distribution bundle found.')
^
Error: No distribution bundle found.
This means you're trying to run fastify-vite in production mode, in which case a distribution bundle is assumed to exist. To build your client application code in preparation for fastify-vite, you must run two vite build commands: one for the actual client bundle that gets delivered to the browser, and another for the server-side version of it (what fastify-vite sees as the client module, or server entry point).
Assuming you're using the default clientModule resolution (/index.js), these are the scripts needed in package.json:
"build": "npm run build:client && npm run build:server",
"build:client": "vite build --outDir dist/client --ssrManifest",
"build:server": "vite build --outDir dist/server --ssr /index.js",
After running npm run build on react-vanilla, for example, you should see a new client/dist folder.
├── client
+ │ ├── dist
│ ├── base.jsx
│ ├── index.html
│ ├── index.js
│ └── mount.js
├── package.json
├── server.js
└── vite.config.js
That's where the production bundle of your Vite application is located, so this folder needs to exist before you can run a Fastify server with fastify-vite in production mode.
Also note that in production mode, fastify-vite will automatically serve static assets from your Vite application via @fastify/static, but you should consider using a CDN for those files if you can, or serving them through Nginx instead of directly through Node.js. A detailed guide on how to set this up will be added soon.
The essential configuration options are root, dev and createRenderFunction(). Following the conventions covered in the previous section, setting those is enough to get most simple apps working well.
But every step of the setup can be configured individually. Below is an execution flow diagram of all configuration functions:
├─ prepareClient()
│ ├─ createHtmlFunction()
│ ├─ createRenderFunction()
│ ├─ createRouteHandler()
│ └─ createErrorHandler()
└─ createRoute()
If unset, fastify-vite will automatically try to resolve index.js from your Vite project root as your client module. You can override this behavior by setting the clientModule option.
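For instance, if your server entry point lives somewhere other than /index.js, you could point clientModule at it — the file name below is just an example:
await server.register(FastifyVite, {
  root: import.meta.url,
  // Hypothetical custom location for the server entry point,
  // overriding the default /index.js resolution
  clientModule: '/entry-server.js',
})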
As soon as the client module is loaded, it is passed to the prepareClient() configuration function.
See its default definition here. If it finds routes defined, fastify-vite will use it to register an individual Fastify (server-level) route for each of your client-level routes (VueRouter, ReactRouter etc.). That's why prepareClient() is implemented that way by default.
See the react-hydration and vue-hydration examples for how the same routes.js file is used to set up ReactRouter and VueRouter, and the associated Fastify routes.
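As a rough sketch of what such a client module could export — assuming only that each route has a path property (which is what the default createRoute() shown further below relies on); the component field and file names are purely illustrative:
import { createApp } from './base.jsx'
import Home from './views/home.jsx'
import About from './views/about.jsx'

export default {
  createApp,
  // Each entry becomes both a client-level route and a Fastify route
  routes: [
    { path: '/', component: Home },
    { path: '/about', component: About },
  ],
}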
As covered previously, this is the function that creates the reply.html() method.
As covered previously, this is the function that creates the reply.render() method.
This configuration function creates the default route handler used for registering Fastify routes based on the routes array exported by the client module (if available). See its default definition below:
function createRouteHandler (client, scope, options) {
return async function (req, reply) {
const page = await reply.render(scope, req, reply)
reply.html(page)
}
}
This configuration function creates the default error handler for the Fastify routes registered based on the routes array exported by the client module (if available). See its default definition below:
function createErrorHandler (client, scope, config) {
return (error, req, reply) => {
if (config.dev) {
console.error(error)
scope.vite.devServer.ssrFixStacktrace(error)
}
scope.errorHandler(error, req, reply)
}
}
Finally, this configuration function is responsible for actually registering an individual Fastify route for each of your client-level routes. See its default definition below:
function createRoute ({ handler, errorHandler, route }, scope, config) {
scope.route({
url: route.path,
method: 'GET',
handler,
errorHandler,
...route,
})
}
A single configuration object that can be used to set all of the settings above.
You can see it used in the streaming examples/.
You can consider fastify-vite a microframework for building full-stack frameworks.
With configuration functions hooking into every step of the setup process, you can easily implement advanced automation for a number of scenarios — for example, collecting a Next-like getServerSideProps() function from every route component and registering an associated payload API endpoint for every route through createRoute(), as sketched below.
See this blog post for a walkthrough doing just that.
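As a rough sketch of that idea — assuming each client-level route definition carries a getServerSideProps() function, and using an illustrative /-/data URL prefix that is not part of fastify-vite:
function createRoute ({ handler, errorHandler, route }, scope, config) {
  // Pull the hypothetical data-fetching function off the route definition
  const { getServerSideProps, ...routeOptions } = route
  if (getServerSideProps) {
    // Companion JSON endpoint delivering the route's server-side props
    scope.get(`/-/data${route.path}`, async (req, reply) => {
      reply.send(await getServerSideProps({ req, reply }))
    })
  }
  scope.route({
    url: route.path,
    method: 'GET',
    handler,
    errorHandler,
    ...routeOptions,
  })
}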
Created by Jonas Galvez, Open Source Maintainer and Engineering Manager at NearForm.
This project is sponsored by NearForm and maintained with the help of David Meir-Levy.
MIT