How To Set Up React Router V7 For Production In 2025
npx create-react-router@latest my-react-router-app
cd my-react-router-app
npm run dev
ESLint & Prettier
npm i -D prettier eslint
npm i -D prettier-plugin-tailwindcss @eslint/js @eslint/compat typescript-eslint eslint-config-prettier eslint-plugin-prettier eslint-plugin-unicorn eslint-plugin-simple-import-sort @vitest/eslint-plugin eslint-plugin-testing-library eslint-plugin-playwright
Create a `prettier.config.js` file in your project root:
export default {
arrowParens: 'avoid',
bracketSameLine: false,
bracketSpacing: true,
htmlWhitespaceSensitivity: 'css',
insertPragma: false,
jsxSingleQuote: false,
plugins: ['prettier-plugin-tailwindcss'],
printWidth: 80,
proseWrap: 'always',
quoteProps: 'as-needed',
requirePragma: false,
semi: true,
singleQuote: true,
tabWidth: 2,
trailingComma: 'all',
useTabs: false,
};
Then create an `eslint.config.js`:
import path from 'node:path';
import { fileURLToPath } from 'node:url';
import { includeIgnoreFile } from '@eslint/compat';
import eslint from '@eslint/js';
import vitest from '@vitest/eslint-plugin';
import playwright from 'eslint-plugin-playwright';
import eslintPluginPrettierRecommended from 'eslint-plugin-prettier/recommended';
import simpleImportSort from 'eslint-plugin-simple-import-sort';
import testingLibrary from 'eslint-plugin-testing-library';
import eslintPluginUnicorn from 'eslint-plugin-unicorn';
import tseslint from 'typescript-eslint';
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
const gitignorePath = path.resolve(__dirname, '.gitignore');
export default tseslint.config(
includeIgnoreFile(gitignorePath),
eslint.configs.recommended,
tseslint.configs.recommendedTypeChecked,
tseslint.configs.stylisticTypeChecked,
eslintPluginUnicorn.configs['flat/recommended'],
{
languageOptions: {
parserOptions: {
projectService: true,
tsconfigRootDir: import.meta.dirname,
},
},
},
{
files: ['**/*.{js,ts,jsx,tsx}'],
plugins: {
'simple-import-sort': simpleImportSort,
},
rules: {
'@typescript-eslint/consistent-type-definitions': ['error', 'type'],
'simple-import-sort/imports': 'error',
'simple-import-sort/exports': 'error',
'unicorn/better-regex': 'warn',
'unicorn/no-process-exit': 'off',
'unicorn/no-array-reduce': 'off',
'unicorn/prevent-abbreviations': [
'error',
{ replacements: { params: false, props: false, utils: false } },
],
},
},
{
files: ['app/**/*.test.{js,ts,jsx,tsx}'],
plugins: { vitest },
// eslint-disable-next-line @typescript-eslint/no-unsafe-assignment
rules: {
// @ts-expect-error https://github.com/vitest-dev/eslint-plugin-vitest/issues/667
// eslint-disable-next-line @typescript-eslint/no-unsafe-member-access
...vitest.configs.recommended.rules,
},
settings: { vitest: { typecheck: true } },
languageOptions: { globals: { ...vitest.environments.env.globals } },
},
{
files: ['app/**/*.test.{jsx,tsx}'],
...testingLibrary.configs['flat/react'],
},
{
files: ['playwright/**/*.spec.ts'],
...playwright.configs['flat/recommended'],
},
eslintPluginPrettierRecommended,
);
Update your `tsconfig.json` so that the JavaScript config files you just created are type-checked, too:
{
"compilerOptions": {
"allowJs": true,
"checkJs": true,
// ...
}
}
"scripts": {
"format": "prettier --write .",
"lint": "eslint .",
"lint:fix": "eslint . --fix",
}
If you ever get the error "Unsafe call of a(n) `error` type typed value.", just reload your editor.
Commitlint
To enforce consistent commit messages and automate formatting checks in your React Router V7 project, you can integrate Commitlint with Husky. This setup ensures that every commit follows the conventional commit format and that code is automatically formatted before being committed.
Why Use Conventional Commits?
Using conventional commits (e.g., `feat(app): add new feature`, `fix: resolve bug`) provides a structured format for commit messages, which offers several key benefits:
- Automated Changelog Generation: Tools like `standard-version` or `semantic-release` can parse conventional commits to automatically generate a changelog. For example, `feat` commits become "Features," `fix` commits become "Bug Fixes," and so on, saving you time and ensuring consistency when preparing release notes.
- Semantic Versioning: Conventional commits align with semantic versioning (MAJOR.MINOR.PATCH). A `feat` triggers a minor version bump, a `fix` triggers a patch, and breaking changes (marked with a `!`, like `feat!: update API`) trigger a major version bump. This makes versioning predictable and automatable.
- Improved Readability and Collaboration: A standardized format makes it easier for team members to understand the intent and scope of changes at a glance, enhancing collaboration and code review processes.
- Tooling Integration: Many modern tools, such as CI/CD pipelines, linters, and release managers, are built to work seamlessly with conventional commits, streamlining workflows. You're going to learn how to do this later in this article.
By adopting conventional commits, you set your project up for scalability and maintainability, especially as it grows or when preparing for production releases.
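For example, a complete conventional commit message with a scope, a body, and a breaking-change footer could look like this (illustrative only):
feat(auth)!: replace session cookies with JWTs

Sessions are now stateless and stored client-side.

BREAKING CHANGE: existing session cookies are invalidated on deploy.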
Husky
First, install the necessary dependencies:
npm install -D @commitlint/cli @commitlint/config-conventional husky
Next, initialize Husky to manage Git hooks:
npx husky-init
This creates a `.husky` directory in your project and sets up a pre-commit hook. Edit the `.husky/pre-commit` file so that it runs your lint and type checks (the default React Router template already ships a `typecheck` script):
npm run lint && npm run typecheck
Furthermore, the `husky-init` command updates your `package.json` by adding a `prepare` script, which enables Husky. You need to modify this `prepare` script to append `|| true`:
"scripts": {
"prepare": "husky || true"
}
When installing only dependencies (and not `devDependencies`), the `"prepare": "husky"` script would otherwise fail because Husky isn't installed.
Run the following to ensure Husky is set up:
npm run prepare
Now, every time you run `git commit`, Husky will trigger `npm run lint` and `npm run typecheck` to check your code with ESLint and TypeScript before the commit goes through.
Commitizen
To streamline writing conventional commit messages, integrate Commitizen with Commitlint. This allows you to use an interactive CLI to generate commit messages that adhere to the conventional commit standard.
Install Commitizen and the conventional changelog adapter:
npm install -D commitizen cz-conventional-changelog
Configure Commitizen by adding the following to your `package.json`:
"config": {
"commitizen": {
"path": "cz-conventional-changelog"
}
}
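Optionally, add a `commit` script so everyone on the team can run `npm run commit` without installing Commitizen globally:
"scripts": {
"commit": "cz"
}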
Next, configure Commitlint to enforce the conventional commit format. Create a `commitlint.config.js` file in the root of your project:
export default {
extends: ['@commitlint/config-conventional'],
rules: {
'type-enum': [
2,
'always',
[
'feat',
'fix',
'docs',
'style',
'refactor',
'perf',
'test',
'build',
'ci',
'chore',
'revert',
],
],
'scope-empty': [2, 'never'], // Enforces a scope like (app) or (tests)
},
};
To integrate Commitlint with Husky, add a `commit-msg` hook. Create or edit `.husky/commit-msg`:
#!/bin/sh
. "$(dirname "$0")/_/husky.sh"
npx --no -- commitlint --edit "$1"
You might need to make both of your scripts executable:
chmod a+x .husky/pre-commit
chmod a+x .husky/commit-msg
These two Unix commands, `chmod a+x .husky/pre-commit` and `chmod a+x .husky/commit-msg`, make the specified Husky hook scripts executable for all users by adding execute permissions. This allows Git to run the `pre-commit` and `commit-msg` hooks automatically.
Now, instead of using `git commit`, you can run:
git add --all
npx cz
This will launch an interactive prompt like this:
$ npx cz
cz-cli@4.3.0, cz-conventional-changelog@3.3.0
? Select the type of change that you're committing: (Use arrow keys)
❯ feat: A new feature
fix: A bug fix
docs: Documentation only changes
style: Changes that do not affect the meaning of the code (white-space, formatting, missing semi-colons, etc)
refactor: A code change that neither fixes a bug nor adds a feature
perf: A code change that improves performance
test: Adding missing tests or correcting existing tests
Follow the prompts to select the type, add a scope (e.g., `(app)`), and write a commit message. For example:
feat(app): add Commitlint with Husky integration
What is a scope? The scope in a conventional commit (e.g., `(app)` or `(tests)`) specifies the area of the codebase affected by the change. The specification treats it as optional, but the `scope-empty` rule above makes it mandatory for this project. A scope provides context, making it easier to identify which module, component, or feature the commit relates to, especially in larger projects.
Commitizen will format this into a conventional commit message, and Commitlint will validate it via the Husky hook. If the message doesn’t meet the rules (e.g., missing scope or invalid type), the commit will fail with an error message, prompting you to fix it.
Verify the Setup
Test the setup by making a change, staging it, and running:
git add --all
npx cz
Then, try committing directly with an invalid message to see Commitlint in action:
git commit -m "bad commit message"
You should see an error like:
⧗   input: bad commit message
✖   scope may not be empty [scope-empty]
✖   subject may not be empty [subject-empty]
✖   type may not be empty [type-empty]
✖   found 3 problems, 0 warnings
With this configuration, your React Router V7 project now enforces consistent commits and automates formatting checks, keeping your codebase clean and maintainable.
Routing
Install `remix-flat-routes` and the Remix routes option adapter to keep using Remix's flat-routes file conventions:
npm install -D remix-flat-routes @react-router/remix-routes-option-adapter
Then configure them in `app/routes.ts`:
import type { RouteConfig } from '@react-router/dev/routes';
import { remixRoutesOptionAdapter } from '@react-router/remix-routes-option-adapter';
import { flatRoutes } from 'remix-flat-routes';
export default remixRoutesOptionAdapter(defineRoutes => {
return flatRoutes('routes', defineRoutes, {
ignoredRouteFiles: [
'**/.*', // Ignore dotfiles like .DS_Store
'**/*.{test,spec}.{js,jsx,ts,tsx}',
// This is for server-side utilities you want to colocate next to your
// routes without making an additional directory. If you need a route that
// includes "server" or "client" in the filename, use the escape brackets
// like: my-route.[server].tsx.
'**/*.server.*',
'**/*.client.*',
],
});
}) satisfies RouteConfig;
Move the `app/routes/home.tsx` file to `app/routes/_index.tsx`.
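With this adapter in place, remix-flat-routes conventions determine your URLs. A quick illustrative sketch (these files are hypothetical):
app/routes/_index.tsx → /
app/routes/about.tsx → /about
app/routes/users+/_index.tsx → /users
app/routes/users+/$userId.tsx → /users/:userId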
Vitest
npm install -D vitest @faker-js/faker @vitest/coverage-v8
"scripts": {
"test": "vitest --reporter=verbose",
}
Create an example test at `app/example.test.ts`:
import { describe, expect, test } from 'vitest';
describe('Example', () => {
test('should be true', () => {
expect(true).toEqual(true);
});
});
$ npm test
> test
> vitest --reporter=verbose
DEV v3.0.7 /Users/jan/dev/my-react-router-app
✓ app/example.test.ts (1 test) 1ms
✓ Example (1)
✓ should be true
Test Files 1 passed (1)
Tests 1 passed (1)
Start at 21:42:12
Duration 42ms (transform 8ms, setup 0ms, collect 21ms, tests 21ms, environment 0ms, prepare 21ms)
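The install above also pulled in `@faker-js/faker`, which generates realistic random test data. A minimal sketch:
import { faker } from '@faker-js/faker';
import { describe, expect, test } from 'vitest';

describe('faker example', () => {
  test('given a generated email: contains an @ sign', () => {
    const email = faker.internet.email();

    expect(email).toContain('@');
  });
});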
React Testing Library
npm install --save-dev @testing-library/react @testing-library/dom @testing-library/jest-dom @testing-library/user-event happy-dom
Create a setup file at `app/test/setup-test-environment.ts` (the Vitest config below references it):
import '@testing-library/jest-dom/vitest';
import { cleanup } from '@testing-library/react';
import { afterEach } from 'vitest';
afterEach(() => {
cleanup();
});
Then update your `vite.config.ts` to define a Vitest workspace:
import { reactRouter } from '@react-router/dev/vite';
import tailwindcss from '@tailwindcss/vite';
import { defineConfig } from 'vite';
import tsconfigPaths from 'vite-tsconfig-paths';
const rootConfig = defineConfig({
plugins: [
tailwindcss(),
!process.env.VITEST && reactRouter(),
tsconfigPaths(),
],
});
const testConfig = defineConfig({
test: {
workspace: [
{
...rootConfig,
test: { include: ['app/**/*.test.ts'], name: 'unit-tests' },
},
{
...rootConfig,
test: { include: ['app/**/*.spec.ts'], name: 'integration-tests' },
},
{
...rootConfig,
test: {
environment: 'happy-dom',
include: ['app/**/*.test.tsx'],
name: 'react-happy-dom-tests',
setupFiles: ['app/test/setup-test-environment.ts'],
},
},
],
},
});
export default defineConfig({ ...rootConfig, ...testConfig });
Create `app/test/react-test-utils.tsx`, which re-exports React Testing Library with a custom render that can wrap your providers:
import type { RenderOptions } from '@testing-library/react';
import { render } from '@testing-library/react';
import type { ReactElement, ReactNode } from 'react';
const AllTheProviders = ({ children }: { children: ReactNode }) => {
return <>{children}</>;
};
const customRender = (
ui: ReactElement,
options?: Omit<RenderOptions, 'wrapper'>,
) => render(ui, { wrapper: AllTheProviders, ...options });
export * from '@testing-library/react';
export { customRender as render };
export { default as userEvent } from '@testing-library/user-event';
export { createRoutesStub } from 'react-router';
Now you can write component tests with these utilities:
import { describe, expect, test } from 'vitest';
import { render, screen } from '~/test/react-test-utils';
function Greeting() {
return (
<div>
<h1>Hello World</h1>
<p>Some description</p>
</div>
);
}
describe('Greeting component', () => {
test('renders a greeting', () => {
render(<Greeting />);
expect(
screen.getByRole('heading', { name: /hello world/i }),
).toBeInTheDocument();
expect(screen.getByText(/some description/i)).toBeInTheDocument();
});
});
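The utilities also re-export `userEvent` for simulating interactions. Here is a small sketch with a hypothetical `Counter` component:
import { useState } from 'react';
import { describe, expect, test } from 'vitest';
import { render, screen, userEvent } from '~/test/react-test-utils';

// Hypothetical component for illustration.
function Counter() {
  const [count, setCount] = useState(0);

  return <button onClick={() => setCount(count + 1)}>Count: {count}</button>;
}

describe('Counter component', () => {
  test('given a click: increments the count', async () => {
    const user = userEvent.setup();
    render(<Counter />);

    await user.click(screen.getByRole('button', { name: /count: 0/i }));

    expect(
      screen.getByRole('button', { name: /count: 1/i }),
    ).toBeInTheDocument();
  });
});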
Styling
By default, React Router V7 comes with Tailwind V4.
Add shadcn/ui for accessible, prestyled components:
npx shadcn@latest init
npx shadcn@latest add card
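The CLI typically generates components under `app/components/ui` with this template (check your `components.json` aliases). You can then compose them like this (a sketch):
import {
  Card,
  CardContent,
  CardHeader,
  CardTitle,
} from '~/components/ui/card';

export function ExampleCard() {
  return (
    <Card>
      <CardHeader>
        <CardTitle>Hello shadcn/ui</CardTitle>
      </CardHeader>
      <CardContent>This card was generated by the shadcn CLI.</CardContent>
    </Card>
  );
}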
Internationalization
npm install remix-i18next i18next react-i18next i18next-browser-languagedetector i18next-http-backend i18next-fs-backend
Create your default translation file at `public/locales/en/common.json` (the `common` namespace is configured below):
{
"greeting": "Hello"
}
Create `app/utils/i18n.ts` with the shared i18next options:
export default {
// This is the list of languages your application supports
supportedLngs: ['en'],
// This is the language you want to use in case
// if the user language is not in the supportedLngs
fallbackLng: 'en',
// The default namespace of i18next is "translation",
// but you can customize it here
defaultNS: 'common',
};
Create `app/utils/i18next.server.ts`:
import path from 'node:path';
import Backend from 'i18next-fs-backend/cjs';
import { RemixI18Next } from 'remix-i18next/server';
import i18n from './i18n';
const i18next = new RemixI18Next({
detection: {
supportedLanguages: i18n.supportedLngs,
fallbackLanguage: i18n.fallbackLng,
},
i18next: {
...i18n,
backend: {
loadPath: path.resolve('./public/locales/{{lng}}/{{ns}}.json'),
},
},
plugins: [Backend],
});
export default i18next;
Update `app/entry.client.tsx` (if your project doesn't expose the entry files yet, run `npx react-router reveal` first):
import i18next from 'i18next';
import LanguageDetector from 'i18next-browser-languagedetector';
import Backend from 'i18next-http-backend';
import { startTransition, StrictMode } from 'react';
import { hydrateRoot } from 'react-dom/client';
import { I18nextProvider, initReactI18next } from 'react-i18next';
import { HydratedRouter } from 'react-router/dom';
import { getInitialNamespaces } from 'remix-i18next/client';
import i18n from '~/utils/i18n';
async function hydrate() {
await i18next
.use(initReactI18next)
.use(LanguageDetector)
.use(Backend)
.init({
...i18n,
ns: getInitialNamespaces(),
backend: { loadPath: '/locales/{{lng}}/{{ns}}.json' },
detection: {
order: ['htmlTag'],
caches: [],
},
});
startTransition(() => {
hydrateRoot(
document,
<I18nextProvider i18n={i18next}>
<StrictMode>
<HydratedRouter />
</StrictMode>
</I18nextProvider>,
);
});
}
if (globalThis.requestIdleCallback) {
globalThis.requestIdleCallback(() => void hydrate());
} else {
// Safari doesn't support requestIdleCallback
// https://caniuse.com/requestidlecallback
globalThis.setTimeout(() => void hydrate(), 1);
}
And update `app/entry.server.tsx`:
import path from 'node:path';
import { PassThrough } from 'node:stream';
import { createReadableStreamFromReadable } from '@react-router/node';
import { createInstance } from 'i18next';
import Backend from 'i18next-fs-backend';
import { isbot } from 'isbot';
import type { RenderToPipeableStreamOptions } from 'react-dom/server';
import { renderToPipeableStream } from 'react-dom/server';
import { I18nextProvider, initReactI18next } from 'react-i18next';
import type { EntryContext } from 'react-router';
import { ServerRouter } from 'react-router';
import i18n from '~/utils/i18n';
import i18next from '~/utils/i18next.server';
export const streamTimeout = 5000;
export default async function handleRequest(
request: Request,
responseStatusCode: number,
responseHeaders: Headers,
routerContext: EntryContext,
// loadContext: AppLoadContext,
) {
const instance = createInstance();
const lng = await i18next.getLocale(request);
const ns = i18next.getRouteNamespaces(routerContext);
await instance
.use(initReactI18next)
.use(Backend)
.init({
...i18n,
lng,
ns,
backend: {
loadPath: path.resolve('./public/locales/{{lng}}/{{ns}}.json'),
},
});
return new Promise((resolve, reject) => {
let shellRendered = false;
const userAgent = request.headers.get('user-agent');
// Ensure requests from bots and SPA Mode renders wait for all content to load before responding
// https://react.dev/reference/react-dom/server/renderToPipeableStream#waiting-for-all-content-to-load-for-crawlers-and-static-generation
const readyOption: keyof RenderToPipeableStreamOptions =
(userAgent && isbot(userAgent)) || routerContext.isSpaMode
? 'onAllReady'
: 'onShellReady';
const { pipe, abort } = renderToPipeableStream(
<I18nextProvider i18n={instance}>
<ServerRouter context={routerContext} url={request.url} />
</I18nextProvider>,
{
[readyOption]() {
shellRendered = true;
const body = new PassThrough();
const stream = createReadableStreamFromReadable(body);
responseHeaders.set('Content-Type', 'text/html');
resolve(
new Response(stream, {
headers: responseHeaders,
status: responseStatusCode,
}),
);
pipe(body);
},
onShellError(error: unknown) {
reject(error as Error);
},
onError(error: unknown) {
responseStatusCode = 500;
// Log streaming rendering errors from inside the shell. Don't log
// errors encountered during initial shell rendering since they'll
// reject and get logged in handleDocumentRequest.
if (shellRendered) {
console.error(error);
}
},
},
);
// Abort the rendering stream after the `streamTimeout` so it has time to
// flush down the rejected boundaries
setTimeout(abort, streamTimeout + 1000);
});
}
Finally, wire the locale into `app/root.tsx`:
import './app.css';
import { useTranslation } from 'react-i18next';
import type { LoaderFunctionArgs } from 'react-router';
import {
isRouteErrorResponse,
Links,
Meta,
Outlet,
Scripts,
ScrollRestoration,
useLoaderData,
} from 'react-router';
import { useChangeLanguage } from 'remix-i18next/react';
import i18next from '~/utils/i18next.server';
import type { Route } from './+types/root';
export const links: Route.LinksFunction = () => [
{ rel: 'preconnect', href: 'https://fonts.googleapis.com' },
{
rel: 'preconnect',
href: 'https://fonts.gstatic.com',
crossOrigin: 'anonymous',
},
{
rel: 'stylesheet',
href: 'https://fonts.googleapis.com/css2?family=Inter:ital,opsz,wght@0,14..32,100..900;1,14..32,100..900&display=swap',
},
];
export const handle = { i18n: 'common' };
export async function loader({ request }: LoaderFunctionArgs) {
const locale = await i18next.getLocale(request);
return { locale };
}
export function Layout({ children }: { children: React.ReactNode }) {
const { locale } = useLoaderData<typeof loader>();
const { i18n } = useTranslation();
useChangeLanguage(locale);
return (
<html lang={locale} dir={i18n.dir()}>
<head>
<meta charSet="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<Meta />
<Links />
</head>
<body>
{children}
<ScrollRestoration />
<Scripts />
</body>
</html>
);
}
// ... rest of the default code
Then use translations in your components, for example in `app/welcome/welcome.tsx`:
import { useTranslation } from 'react-i18next';
import logoDark from './logo-dark.svg';
import logoLight from './logo-light.svg';
export function Welcome() {
const { t } = useTranslation();
return (
<main className="flex items-center justify-center pt-16 pb-4">
<div className="flex min-h-0 flex-1 flex-col items-center gap-16">
{/* ... Header code ... */}
<div className="w-full max-w-[300px] space-y-6 px-4">
<nav className="space-y-4 rounded-3xl border border-gray-200 p-6 dark:border-gray-700">
<p className="text-center leading-6 text-gray-700 dark:text-gray-200">
{t('greeting')}
</p>
{/* ... resources list code ... */}
</nav>
</div>
</div>
</main>
);
}
// ... your resources ...
Database
Since React Router V7 is a full-stack framework, you can use any database you want. This tutorial uses PostgreSQL because it is battle-tested, and the Prisma ORM to abstract away the database layer.
For local development, you can use Docker to spin up a PostgreSQL 17 database independently, while running your React Router app with `npm run dev` and `npm run build` as usual. This keeps your development workflow lightweight. In production, you can either run both your app and the database with Docker, or you can connect to a serverless database like Supabase.
To run a PostgreSQL container locally, create a minimal `docker-compose.yml` file in your project root:
services:
db:
image: postgres:17-alpine
ports:
- '5432:5432'
environment:
- POSTGRES_USER=user
- POSTGRES_PASSWORD=password
- POSTGRES_DB=myapp
volumes:
- postgres-data:/var/lib/postgresql/data
volumes:
postgres-data:
This configuration:
- Uses the `postgres:17-alpine` image for a lightweight PostgreSQL 17 instance.
- Exposes port `5432` for local access.
- Sets up a database named `myapp` with the credentials `user`/`password`.
- Persists data in a named volume (`postgres-data`).
Start the database with:
docker-compose up -d
The `-d` flag runs it in detached mode, keeping it in the background. To stop it later, use:
docker-compose down
Your app can connect to this database using the connection string `postgresql://user:password@localhost:5432/myapp?schema=public`. You'll configure this in the Prisma setup below.
Prisma
To interact with your database, use Prisma, a modern ORM for Node.js. Here's how to set it up with a simple `UserProfile` model for both local PostgreSQL and Supabase (which also uses PostgreSQL).
Install Prisma and its client:
npm install -D prisma
npm install @prisma/client
Initialize Prisma:
npx prisma init
This creates a `prisma` directory with a `schema.prisma` file and a `.env` file. Add the `.env` file to your `.gitignore` to avoid committing it to your repository:
.DS_Store
/node_modules/
# React Router
/.react-router/
/build/
*.env
Update `.env` with the local database connection string:
DATABASE_URL="postgresql://user:password@localhost:5432/myapp?schema=public"
Define your schema in `prisma/schema.prisma`:
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model UserProfile {
id String @id @default(cuid(2))
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
email String @unique
name String @default("")
acceptedTermsAndConditions Boolean @default(false)
}
This schema works for both local PostgreSQL and Supabase, defining a `UserProfile` with:
- A unique `id` using `cuid(2)`.
- `createdAt` and `updatedAt` timestamps.
- A unique `email`, a `name` that defaults to an empty string, and an `acceptedTermsAndConditions` boolean.
Add Prisma scripts to `package.json`:
"scripts": {
"prisma:deploy": "npx prisma migrate deploy && npx prisma generate",
"prisma:migrate": "npx prisma migrate dev --name",
"prisma:push": "npx prisma db push && npx prisma generate",
"prisma:reset-dev": "npm run prisma:wipe && npm run prisma:seed && npm run dev",
"prisma:setup": "prisma generate && prisma migrate deploy && prisma db push",
"prisma:studio": "npx prisma studio",
"prisma:wipe": "npx prisma migrate reset --force && npx prisma db push",
}
"prisma:deploy"
: Applies pending Prisma migrations to the database and generates the Prisma Client."prisma:migrate"
: Creates a new migration based on changes in the Prisma schema and applies it to the development database, requiring a name for the migration."prisma:push"
: Synchronizes the database schema with the Prisma schema without creating migration files and generates the Prisma Client."prisma:reset-dev"
: Wipes the database, seeds it with initial data, and starts the development server."prisma:setup"
: Generates the Prisma Client, applies migrations, and syncs the database schema with the Prisma schema."prisma:studio"
: Launches the Prisma Studio interface for visually managing the database."prisma:wipe"
: Resets the database by dropping all data and reapplying migrations, then syncs the schema with Prisma.
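Note that `prisma:reset-dev` calls a `prisma:seed` script, which you need to define yourself. A minimal sketch, assuming you install `tsx` (`npm i -D tsx`) to run TypeScript files:
"scripts": {
"prisma:seed": "tsx prisma/seed.ts",
}
And a matching `prisma/seed.ts`:
import { PrismaClient } from '@prisma/client';

const prisma = new PrismaClient();

// Upsert keeps the seed idempotent across repeated runs.
await prisma.userProfile.upsert({
  where: { email: 'user@example.com' },
  update: {},
  create: {
    email: 'user@example.com',
    name: 'Jane Doe',
    acceptedTermsAndConditions: true,
  },
});

await prisma.$disconnect();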
With the PostgreSQL container running and the scripts in place, apply the schema:
npm run prisma:setup
This creates the `UserProfile` table in your local database. For Supabase, you can run the same command in production after updating `DATABASE_URL`, or use Supabase's dashboard to apply migrations.
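For reference, a Supabase connection string typically has this shape (placeholders; copy the exact value from your Supabase project's connection settings):
DATABASE_URL="postgresql://postgres:[YOUR-PASSWORD]@db.[YOUR-PROJECT-REF].supabase.co:5432/postgres"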
Set up a Prisma client in `app/utils/database.server.ts`:
import { PrismaClient } from '@prisma/client';
let prisma: PrismaClient;
declare global {
// eslint-disable-next-line no-var
var __database__: PrismaClient;
}
// This is needed because in development we don't want to restart
// the server with every change, but we want to make sure we don't
// create a new connection to the DB with every change either.
// In production we'll have a single connection to the DB.
if (process.env.NODE_ENV === 'production') {
prisma = new PrismaClient();
} else {
if (!globalThis.__database__) {
globalThis.__database__ = new PrismaClient();
}
prisma = globalThis.__database__;
void prisma.$connect();
}
export { prisma };
Use it in your app like this:
import { prisma } from '~/utils/database.server';

await prisma.userProfile.create({
data: {
email: 'user@example.com',
name: 'Jane Doe',
acceptedTermsAndConditions: true,
},
});
With Prisma configured, your React Router V7 app is now connected to your PostgreSQL database.
Now, run your app locally with:
npm run dev
If you want to run both your app and a PostgreSQL 17 database in production using Docker, your current Dockerfile already builds a container for your React Router V7 app. To include the database, you’ll need to manage multiple containers, and Docker Compose is the ideal tool to orchestrate both your app (using the existing Dockerfile) and a PostgreSQL service.
Simply add the following to your `docker-compose.yml` file:
services:
app:
build:
context: .
dockerfile: Dockerfile
ports:
- '3000:3000'
environment:
- DATABASE_URL=postgresql://user:password@db:5432/myapp?schema=public
depends_on:
- db
db:
# ... same as before
Now running `docker-compose up -d` will start both your app and the database.
Facades
It’s smart to introduce a layer of abstraction between your database logic and the rest of your app. That’s where facades come in - a design pattern that wraps a complex subsystem (like database operations) behind a clean, simplified interface tailored to your app’s needs.
Let's set up a facade for our `UserProfile` model. Create a file at `app/routes/user-profiles+/user-profiles-model.ts` to house all the database interactions related to user profiles.
What does the `+` suffix do? In `remix-flat-routes`, the `+` suffix on a folder (e.g., `user-profiles+`) tells the routing system to treat the files inside as flat files at the parent level, flattening the folder name into a dot-separated path rather than a nested URL segment. For example, `routes/user-profiles+/_index.tsx` maps to `/user-profiles` as if it were `routes/user-profiles._index.tsx`, and `routes/user-profiles+/detail.tsx` maps to `/user-profiles/detail` like `routes/user-profiles.detail.tsx`. This allows you to organize routes in folders while emulating the flat-files convention (dot-separated paths), avoiding extra URL slashes from folder nesting, all while supporting colocation of related files like utilities or layouts within the folder.
import type { UserProfile } from '@prisma/client';
import { prisma } from '~/utils/database.server';
export async function retrieveUserProfileFromDatabaseByEmail(
email: UserProfile['email'],
) {
return await prisma.userProfile.findUnique({ where: { email } });
}
export async function retrieveAllUserProfilesFromDatabase() {
return await prisma.userProfile.findMany();
}
Now, whenever you need to fetch a user by email or grab all user profiles, you can call these functions instead of writing raw Prisma queries throughout your app. For example, in a loader or action:
import { retrieveAllUserProfilesFromDatabase } from './user-profiles-model';
const users = await retrieveAllUserProfilesFromDatabase();
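A fuller sketch of a route loader using the facade (the file path is illustrative):
// app/routes/user-profiles+/_index.tsx
import { retrieveAllUserProfilesFromDatabase } from './user-profiles-model';

export async function loader() {
  const userProfiles = await retrieveAllUserProfilesFromDatabase();

  return { userProfiles };
}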
So, why bother with facades? They bring a handful of compelling benefits to the table:
- Vendor Flexibility: Facades act as a buffer between your app and the underlying database technology. Imagine you decide to swap PostgreSQL for SQLite or migrate from a local setup to a serverless provider like Supabase. Without a facade, you'd be rewriting Prisma calls scattered across your codebase. With a facade, you only tweak the logic inside `user-profiles-model.ts` - say, updating it to hit a new API or ORM - and the rest of your app can stay the same.
- Code Simplicity: By encapsulating database queries into focused, app-specific functions, facades cut down on repetitive boilerplate. Instead of wrestling with Prisma's full API every time you need data, you get a streamlined interface designed for your use cases.
- Readability and Intent: Facades let you name functions in a way that reflects your app's domain, making the code self-documenting. Compare `await prisma.userProfile.findMany()` to `await retrieveAllUserProfilesFromDatabase()` - the latter instantly tells you what's happening without needing to decode the query.
Playwright
$ npm init playwright@latest
> npx
> create-playwright
Getting started with writing end-to-end tests with Playwright:
Initializing project in '.'
✔ Where to put your end-to-end tests? · playwright
✔ Add a GitHub Actions workflow? (y/N) · false
✔ Install Playwright browsers (can be done manually via 'npx playwright install')? (Y/n) · true
If this is your first time with Playwright, you might want to look through the `tests-examples/` folder that the initialization command generates for you, and delete it once you no longer need it.
Modify the `use` key in your `playwright.config.ts` so the base URL points to your app on port 5173, and the `webServer` key so Playwright can start your app:
import { defineConfig, devices } from '@playwright/test';
export default defineConfig({
// ...
use: {
baseURL: process.env.BASE_URL ?? 'http://localhost:5173',
trace: process.env.CI ? 'on-first-retry' : 'retain-on-failure',
},
// ...
webServer: {
command: process.env.CI ? 'npm run build && npm run start' : 'npm run dev',
// Playwright needs `port` or `url` to know when the server is ready.
// In CI, set BASE_URL to wherever `npm run start` serves your app.
url: process.env.BASE_URL ?? 'http://localhost:5173',
reuseExistingServer: !process.env.CI,
},
});
"scripts": {
// ...
"test:e2e": "npx playwright test",
"test:e2e:ui": "npx playwright test --ui",
}
import { expect, test } from '@playwright/test';
test.describe('landing page', () => {
test('given visiting the landing page: shows a greeting', async ({
page,
}) => {
await page.goto('/');
await expect(page.getByText('Hello')).toBeVisible();
});
});
GitHub Actions
Husky shouldn't install Git hooks in CI. Disable it by setting the `HUSKY` environment variable in your workflow:
# https://docs.github.com/en/actions/learn-github-actions/variables
env:
HUSKY: 0
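A minimal workflow sketch that runs your checks (the file name and steps are assumptions; adapt them to your pipeline):
# .github/workflows/ci.yml
name: CI

on: [push]

env:
  # Prevent Husky from installing Git hooks in CI.
  HUSKY: 0

jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 22
          cache: npm
      - run: npm ci
      - run: npm run lint
      - run: npm test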
Supabase
npm install @supabase/ssr @supabase/supabase-js