Hexagonal Architecture in Next.js: Ports, Adapters, and the Mock Problem


You are six months into a typical Next.js project. Your server components are fetching data directly from the database, your API routes are importing your ORM directly, and your business logic is scattered between page.tsx files and route handlers. It looks something like this:

// src/app/approvals/page.tsx — the "typical" approach
import { db } from "@/lib/db";
import { cookies } from "next/headers";
import { redirect } from "next/navigation";
import { FileMakerClient } from "filemaker-odata";
import { ApprovalList } from "@/components/ApprovalList";

export default async function Page() {
  const session = (await cookies()).get("session");
  if (!session) redirect("/login");

  const client = new FileMakerClient({ server: process.env.FM_SERVER! });
  const records = await client.getRecords("Approval", { $top: 10 });
  return <ApprovalList approvals={records} />;
}

It works, and it’s fast to write — until it isn’t.

The real pain shows up when you try to run the app locally and realize you need the backend running. In enterprise work, that backend might be FileMaker — which requires a VPN connection, OAuth credentials, and a running server. Or it might be a Postgres instance that only exists in staging. Either way, your frontend developer can’t work without it. You can’t write tests without it. Every new team member has a multi-hour setup ritual before they can render a single page.

The root cause is coupling. The page above directly imports FileMakerClient. The session check is baked in. The query parameters are inline. To test this, you’d need a real FileMaker server, a valid session cookie, and a running Next.js app. To render this locally without FileMaker, you’d have to comment things out.

So how can we fix that? There are many ways, but one battle-tested approach, and a favorite of mine, is Hexagonal Architecture (also called Ports and Adapters). The core idea is simple: your business logic lives in the center of your application and knows nothing about the infrastructure around it. The database, the FileMaker server, the authentication provider, even the framework — these are all external details that plug in through defined interfaces. Swap the adapters out, and you swap the backend.

In practice, this means you can run the entire app with a mock backend — static in-memory data, no external dependencies, instant startup.


Domain at the Center

In its simplest form, the architecture splits the codebase into three layers (domain, adapters, and the Next.js side), spread across a handful of directories:

src/
  domain/        models, ports, use cases, services
  adapters/      implementations (FileMaker, mock, ORMs, APIs, etc.)
  lib/           Next.js bridge (general utilities and glue code)
  app/           Next.js pages and routes
  components/    React components

The domain layer contains your business logic. It has no knowledge of the database, the HTTP layer, or Next.js. It defines what the application needs (ports), and what it does (use cases). Models are plain TypeScript interfaces — intentionally anemic, with no methods:

// src/domain/models/post.ts
export interface IPost {
  id: number;
  title: string;
  body: string;
  createdAt: Date;
  updatedAt: Date;
}

// src/domain/models/approval.ts
export interface IApproval {
  id: string;
  requestId: string;
  owner: string;
  decision: string;
  elapsedDays: number;
  hasApproval: number;
  hasDenied: number;
  // ...
}

No methods, no getters, no toJSON(). Keeping models as plain data means they’re safe to pass directly from server components to client components — Next.js can serialize them without any special handling.

The dependency rule is strict: domain code can only import from other domain code. Adapters can import from the domain. Pages can import from lib/. Nothing in the domain imports from adapters or app/.

app/ → lib/ → adapters/ → domain/

This one-way dependency is what makes the whole thing work. Because the domain doesn’t know about FileMaker, you can replace FileMaker with in-memory mocks, or another database like Postgres or MongoDB. Because use cases only depend on interfaces (ports), you can inject anything that satisfies the interface — a real database adapter, a mock, or a test double.
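Nothing stops a teammate from accidentally importing an adapter into domain code, so it helps to enforce the dependency rule mechanically. One way is ESLint's built-in no-restricted-imports rule; the sketch below assumes a flat config file (ESLint 9 supports a TypeScript config via jiti) and the "@/" path alias shown throughout this article — adjust both to your setup:

```typescript
// eslint.config.ts — sketch: enforce the one-way dependency rule with lint
// errors. File location, alias, and globs are assumptions about your project.
export default [
  {
    files: ["src/domain/**/*.ts"],
    rules: {
      "no-restricted-imports": [
        "error",
        {
          patterns: [
            {
              // domain code may only import from other domain code
              group: ["@/adapters/*", "@/app/*", "@/lib/*", "next", "next/*"],
              message: "Domain code must not depend on adapters, lib/, or Next.js.",
            },
          ],
        },
      ],
    },
  },
  {
    files: ["src/adapters/**/*.ts"],
    rules: {
      "no-restricted-imports": [
        "error",
        {
          patterns: [
            // adapters may import the domain, but never pages or routes
            { group: ["@/app/*"], message: "Adapters must not import from app/." },
          ],
        },
      ],
    },
  },
];
```

With this in place, a stray `import { db } from "@/lib/db"` inside a use case fails CI instead of quietly re-coupling the layers.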


Ports: Define What You Need, Not How You Get It

A port is a TypeScript interface that describes what the domain needs from the outside world. Nothing more.

// src/domain/ports/approvals-repository.ts
import type { IApproval } from "@/domain/models";

export interface IApprovalRepository {
  findAll(): Promise<IApproval[]>;
  findById(id: string): Promise<IApproval>;
}

// src/domain/ports/get-current-user.ts
import type { IUser } from "@/domain/models";

export interface IGetCurrentUserService {
  getCurrentUser(): Promise<IUser | null>;
}

One file per port. Named for the operation, not the implementation. The domain never imports FileMakerClient or knex — it only imports these interfaces.

Keep ports small. IApprovalRepository only has two methods because only two use cases need it. If a new use case needs a create method, you add it to the interface — don’t prematurely add methods nobody calls. Following the Interface Segregation Principle (ISP) here prevents adapters from being forced to implement methods they don’t support, and keeps the interfaces honest about what the domain actually requires.

In the boilerplate, even a single repository is split into role interfaces:

// src/domain/ports/posts.ts
import type { IPost } from "@/domain/models";

export interface IFindLatestPosts {
  findLatestPosts(): Promise<IPost[]>;
}

export interface ICreatePost {
  createPost(data: { title: string; body: string }): Promise<IPost>;
}

This way a FindLatestPosts use case only depends on IFindLatestPosts, not on the full IPostRepository. This is intentional: the use case declares its minimum dependency, and the adapter implements both interfaces on the same class if it wants to.
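To make that concrete, here is a minimal sketch of one adapter class satisfying both role interfaces at once. The in-memory storage and the class name are illustrative assumptions, not code from the boilerplate:

```typescript
// Sketch: one adapter class implements both role interfaces, while each use
// case depends only on the interface it actually needs.
interface IPost {
  id: number;
  title: string;
  body: string;
}

interface IFindLatestPosts {
  findLatestPosts(): Promise<IPost[]>;
}

interface ICreatePost {
  createPost(data: { title: string; body: string }): Promise<IPost>;
}

class InMemoryPostRepository implements IFindLatestPosts, ICreatePost {
  private posts: IPost[] = [];
  private nextId = 1;

  async findLatestPosts(): Promise<IPost[]> {
    // newest first
    return [...this.posts].reverse();
  }

  async createPost(data: { title: string; body: string }): Promise<IPost> {
    const post: IPost = { id: this.nextId++, ...data };
    this.posts.push(post);
    return post;
  }
}
```

A read-only use case declares `IFindLatestPosts` in its constructor and can never accidentally call `createPost` — the compiler simply doesn't show it the method.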


Use Cases: Where Business Logic Lives

Each use case is a class with a single execute() method. It takes its dependencies through the constructor — never instantiates them itself.

// src/domain/use-cases/approvals/find-all-approvals.ts
import type { IApprovalRepository } from "@/domain/ports";

export class FindAllApprovals {
  private approvals: IApprovalRepository;

  constructor({ approvals }: { approvals: IApprovalRepository }) {
    this.approvals = approvals;
  }

  async execute() {
    return await this.approvals.findAll();
  }
}

Simple cases like this are just orchestration — they call the port and return the result. But use cases are also where authorization, validation, and multi-step logic live:

// src/domain/use-cases/posts/create-post.ts
import type { IPost } from "@/domain/models";
import type { ICreatePost, IGetCurrentUserService } from "@/domain/ports";
// CreateLog is another use case, injected the same way a port would be

export class CreatePost {
  private createPost: ICreatePost;
  private getCurrentUser: IGetCurrentUserService;
  private createLog: CreateLog;

  constructor({
    createPost,
    getCurrentUser,
    createLog,
  }: {
    createPost: ICreatePost;
    getCurrentUser: IGetCurrentUserService;
    createLog: CreateLog;
  }) {
    this.createPost = createPost;
    this.getCurrentUser = getCurrentUser;
    this.createLog = createLog;
  }

  async execute({ title, body }: { title: string; body: string }): Promise<IPost> {
    const currentUser = await this.getCurrentUser.getCurrentUser();
    if (currentUser === null) throw new Error("Unauthorized");
    if (title.length === 0) throw new Error("Empty title");
    if (body.length === 0) throw new Error("Empty body");

    const post = await this.createPost.createPost({ title, body });

    await this.createLog.execute({
      user: currentUser,
      action: "Posts/Create",
      data: { post },
    });

    return post;
  }
}

Notice what’s missing: no req, no res, no cookies(), no ORM imports. This use case knows about its business rules (auth required, fields required, log on create) and nothing else. You can test every branch of this logic without spinning up a Next.js server or a database.

What belongs in a use case:

  • Authorization checks
  • Input validation
  • Orchestration between multiple ports
  • Domain events / logging

What does not belong:

  • HTTP status codes
  • Cookie or session access
  • Framework-specific types
  • Database queries

Adapters: Implementing the Ports

Adapters are the implementations. They live in src/adapters/ organized by driver — not by type. So the folder structure looks like:

src/adapters/
  filemaker/
    approvals-repository.ts
    notes-repository.ts
    requests-repository.ts
    get-current-user.ts
    base.ts
  mock/
    approvals-repository.ts
    notes-repository.ts
    requests-repository.ts
    get-current-user.ts
  persistence/          ← (in the boilerplate: Knex/Postgres)
    knex-post-repository.ts
  containers/
    filemaker.ts
    mock.ts
  domain.ts
For FileMaker, there’s a base class that handles the common scripting pattern — all FileMaker scripts in the app return a JSON envelope with { data, error, message, status }, so the base class handles that plumbing:

// src/adapters/filemaker/base.ts
export abstract class FileMakerRepository {
  protected filemaker: FileMaker;

  constructor({ filemaker }: { filemaker: FileMaker }) {
    this.filemaker = filemaker;
  }

  async script<T>({
    name,
    params,
  }: {
    name: string;
    params: Record<string, unknown>;
  }): Promise<ScriptJsonResponse<T>> {
    const { success, data } = await this.filemaker.script<string>(name, params);
    if (!success) throw new Error(`[FileMaker Repository] Script failed: ${name}`);
    if (data === undefined) throw new Error(`Did not get data back from script: ${name}`);

    const json = JSON.parse(data) as ScriptJsonResponse<T>;
    if (json.error !== 0)
      throw new Error(`Script raised error: '${name}' (${json.error}: ${json.message})`);

    return json;
  }
}

The FileMaker approval repository extends it:

// src/adapters/filemaker/approvals-repository.ts
export class ApprovalRepository
  extends FileMakerRepository
  implements IApprovalRepository
{
  async findAll(): Promise<IApproval[]> {
    const records = await this.filemaker.getRecords<ApprovalRecord>("Approval", {
      $select: ["id", "idRequest", "decision", /* ... */],
      $orderby: ["date", "desc"],
      $top: 10,
    });
    return records.map(this.parse);
  }

  parse(record: ApprovalRecord): IApproval {
    return {
      id: record.id,
      requestId: record.idRequest,
      decision: record.decision,
      // ... field mapping between FileMaker's naming and the domain model
    };
  }
}

The parse method is doing the translation between FileMaker’s field naming conventions and the domain model. This is typical: external systems have their own conventions, and the adapter is the only place that knows about them.


The Mock Problem (and How the Architecture Solves It)

This is the reason the architecture pays for itself.

FileMaker requires:

  • A VPN connection
  • OAuth credentials
  • A running FileMaker server
  • Valid session tokens that expire

If you’re a frontend developer who just wants to style a list of approvals, you need all of that before you can see a single row of data. If you’re writing a test, your CI pipeline needs network access to a specific server.

The mock adapters are plain in-memory implementations of the same interfaces:

// src/adapters/mock/approvals-repository.ts
const MOCK_APPROVALS: IApproval[] = [
  {
    id: "350C26A3-3710-4838-8174-BFC6E1F26A89",
    requestId: "3D5CE683-4920-4F63-AF4D-014111911AEF",
    descriptionRequest: "Leak Investigation — TEST1234",
    owner: "David Rice",
    decision: "",
    elapsedDays: 20,
    // ... all fields from IApproval
  },
  // ... more records
];

export class MockApprovalRepository implements IApprovalRepository {
  async findAll(): Promise<IApproval[]> {
    return MOCK_APPROVALS;
  }

  async findById(id: string): Promise<IApproval> {
    const approval = MOCK_APPROVALS.find((a) => a.id === id);
    if (!approval) throw new Error(`[Mock] Approval not found: ${id}`);
    return approval;
  }
}

No network. No auth. No VPN. The same for MockGetCurrentUserService:

// src/adapters/mock/get-current-user.ts
export class MockGetCurrentUserService implements IGetCurrentUserService {
  async getCurrentUser(): Promise<IUser | null> {
    return {
      name: "Mock User",
      email: "mock.user@apple.mock",
      role: "admin",
    };
  }
}

A developer working on the UI sets APP_MOCK=true in their .env and runs npm run dev. The app starts instantly, data is available immediately, and they can build the entire frontend without a backend credential in sight.
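The flag itself is ordinary environment parsing. A minimal sketch of what `getEnv` might look like — the shape here is an assumption inferred from how `getDomain` uses it, not the app's actual code:

```typescript
// Sketch: env parsing behind APP_MOCK. Variable names beyond APP_MOCK and
// FM_SERVER are assumptions for illustration.
export const getEnv = () => ({
  app: {
    // only the literal string "true" turns mocks on; anything else is off
    mock: process.env.APP_MOCK === "true",
  },
  filemaker: {
    server: process.env.FM_SERVER ?? "",
    database: process.env.FM_DATABASE ?? "",
  },
});
```

Parsing the environment in one place means the rest of the app deals with a typed object, never with raw `process.env` strings.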

The mock data is seeded from real production data (copied from server logs or an export), so the UI looks realistic. When you’re working on an approval status badge, it matters whether your mock data has a mix of '', 'Approved', and 'Denied' decisions — not just '' everywhere.
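That variety requirement can even be made explicit in the mock module. A small sketch — the `Decision` union and the `badgeLabel` helper are illustrative assumptions, not code from the app:

```typescript
// Sketch: keep mock data deliberately varied so every UI state is exercised.
type Decision = "" | "Approved" | "Denied";

// seed data should cover all three states, not just the empty default
const MOCK_DECISIONS: Decision[] = ["", "Approved", "Denied", "Approved", ""];

// a status badge branches on decision; the empty string renders as "Pending"
export const badgeLabel = (decision: Decision): string =>
  decision === "" ? "Pending" : decision;
```

A one-line check in a test (assert that the seed data contains all three states) keeps the mock honest as records get copied in and out.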


Wiring It Together: The DI Container

The container is the only place that knows which adapters are being used. It lives in src/adapters/, which is appropriate: it’s the layer that bridges domain and infrastructure, and it needs to import from both.

For the FileMaker app, there are two containers — one for real, one for mock:

// src/adapters/containers/filemaker.ts
export const filemakerContainer = ({ env, credentials }) => {
  const infrastructure = {
    filemaker() {
      const client = new FileMakerClient({
        server: env.filemaker.server,
        database: env.filemaker.database,
      });
      return client.withOAuth(credentials);
    },
  };

  const adapters = {
    approvalRepository() {
      return new ApprovalRepository({ filemaker: infrastructure.filemaker() });
    },
    // ...
  };

  const useCases = {
    findAllApprovals() {
      return new FindAllApprovals({ approvals: adapters.approvalRepository() });
    },
    findApprovalById() {
      return new FindApprovalById({ approvals: adapters.approvalRepository() });
    },
    // ...
  };

  return { useCases };
};

// src/adapters/containers/mock.ts
export const mockContainer = () => {
  const adapters = {
    approvalRepository: () => new MockApprovalRepository(),
    // ...
  };

  const useCases = {
    findAllApprovals() {
      return new FindAllApprovals({ approvals: adapters.approvalRepository() });
    },
    // ...
  };

  return { useCases };
};

The two containers are structurally identical — same useCases shape, same method names, same return types. The only difference is which classes get instantiated.
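That structural identity doesn't have to rest on discipline alone; the compiler can enforce it. A minimal sketch — the `IDomainContainer` shape and the tiny containers below are illustrative assumptions, not the app's actual types:

```typescript
// Sketch: a shared container type that both factories must satisfy.
interface IUseCase<TArgs extends unknown[], TResult> {
  execute(...args: TArgs): Promise<TResult>;
}

interface IDomainContainer {
  useCases: {
    findAllApprovals(): IUseCase<[], string[]>;
  };
}

// Annotating each factory with the shared type means a missing or
// misspelled use case in either container is a compile-time error.
const mockContainer = (): IDomainContainer => ({
  useCases: {
    findAllApprovals: () => ({ execute: async () => ["mock-1"] }),
  },
});

const realContainer = (): IDomainContainer => ({
  useCases: {
    findAllApprovals: () => ({ execute: async () => ["real-1"] }),
  },
});
```

Any drift between the two containers now fails the build instead of surfacing as a runtime error in mock mode only.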

The single decision point is getDomain():

// src/adapters/domain.ts
export const getDomain = async () => {
  const env = getEnv();
  if (env.app.mock) return mockContainer();

  const session = await getSession();
  if (session.requestId === undefined || session.identifier === undefined)
    throw new UnauthorizedError();

  return filemakerContainer({
    env,
    credentials: { requestId: session.requestId, identifier: session.identifier },
  });
};

One function. One if. Everything else in the system is blind to the distinction. The container also manages the infrastructure layer internally — the FileMakerClient is constructed in infrastructure.filemaker(), and nothing outside the container ever touches it directly.

For the boilerplate (Postgres variant), a single container handles everything, with a singleton for the Knex connection:

// src/adapters/container.ts
const singletons: { knex: Knex | null } = { knex: null };

const container = (env: Environment) => {
  const infrastructure = {
    knex() {
      if (singletons.knex === null)
        singletons.knex = knex({ client: "pg", connection: env.app.databaseUrl });
      return singletons.knex;
    },
  };
  // adapters, services, useCases...
};

export const getDomain = () => container(getEnv());

Bridging to Next.js: The execute() Pattern

The domain is framework-agnostic, but Next.js still needs to call it. The bridge is execute() in src/lib/:

// src/lib/execute.ts
"use server";
import { redirect } from "next/navigation";
import { getDomain } from "@/adapters/domain";
import { UnauthorizedError } from "@/lib/types";

type Domain = Awaited<ReturnType<typeof getDomain>>;
type UseCases = Domain["useCases"];

type UseCaseKey = {
  [K in keyof UseCases]: UseCases[K] extends () => {
    execute(...args: unknown[]): Promise<unknown>;
  }
    ? K
    : never;
}[keyof UseCases];

type ExecuteParams<K extends UseCaseKey> = Parameters<ReturnType<UseCases[K]>["execute"]>;
type ExecuteReturn<K extends UseCaseKey> = ReturnType<ReturnType<UseCases[K]>["execute"]>;

export async function execute<K extends UseCaseKey>(
  name: K,
  ...args: ExecuteParams<K>
): Promise<Awaited<ExecuteReturn<K>>> {
  try {
    const domain = await getDomain();
    const useCase = domain.useCases[name].bind(domain.useCases)();
    return await useCase.execute(...args);
  } catch (error) {
    if (error instanceof UnauthorizedError) redirect("/sessions/create");
    throw error;
  }
}

This looks intimidating but the runtime behavior is simple: resolve the container, find the use case by name, call execute(), and redirect on auth failure.

The TypeScript magic is the important part. Because UseCaseKey is derived from the actual container type, calling execute('findApprovalById', id) is fully type-checked: the name must be a valid use case, the arguments must match execute()'s parameters, and the return type is inferred automatically. If you add a new use case to the container, it becomes callable through execute() with zero extra wiring.

Server components use it directly:

// src/app/page.tsx
export default async function Page() {
  const approvals = await execute("findAllApprovals");
  return <ApprovalListPage approvals={approvals} />;
}

// src/app/approvals/[id]/page.tsx
export default async function Page({ params }) {
  const { id } = await params;
  const approval = await execute("findApprovalById", id);
  const flowchart = await execute("getRequestFlowchart", approval.requestId);
  const notes = await execute("findNotesByRequestId", approval.requestId);
  return <ApprovalDetailPage approval={approval} notes={notes} flowchart={flowchart} />;
}

Client components call it too — execute is marked 'use server', so Next.js treats it as a server action. A client component can call it directly:

// src/components/pages/posts/NewPostPage/NewPostPage.tsx
"use client";
import { useState } from "react";
import { useRouter } from "next/navigation";
import { execute } from "@/lib/execute";

export function NewPostPage() {
  const router = useRouter();
  const [title, setTitle] = useState("");
  const [body, setBody] = useState("");

  const handleNewPostClicked = async () => {
    await execute("createPost", { title, body });
    router.push("/");
  };
  // ...
}

The page-level components (e.g., ApprovalListPage, ApprovalDetailPage) are pure presentational components. They receive data as props and render it. They don’t know about use cases or adapters. This is the container/presentational split: the Next.js page.tsx is the container (data fetching), the component is presentational (rendering).


Testing Use Cases

Because use cases only depend on interfaces, you can test them by injecting whatever implementations you need. For the boilerplate, tests use SQLite in place of Postgres (same Knex queries, different driver), and mocks for things like session/auth that are harder to reproduce:

// tests/domain/container.ts
const testContainer = (_env: Environment) => {
const infrastructure = {
knex() { return getTestDb(); }, // SQLite in-memory
};
const adapters = {
postRepository() { return new KnexPostRepository(infrastructure.knex()); },
logRepository() { return new KnexLogRepository(infrastructure.knex()); },
};
const useCases = {
createPost() {
return new CreatePost({
createPost: adapters.postRepository(),
getCurrentUser: container.mockGetCurrentUser(), // ← mock auth
createLog: services.createLog(),
});
},
};
return { useCases };
};

The tests themselves read like specifications:

describe("CreatePost", () => {
  test("errors when current user is null", async () => {
    mocks.setCurrentUser(null);
    const domain = getTestDomain();
    await expect(
      domain.useCases.createPost().execute({ title: "", body: "" })
    ).rejects.toThrow("Unauthorized");
  });

  test("creates post with title and body", async () => {
    const domain = getTestDomain();
    await domain.useCases.createPost().execute({ title: "Foo", body: "Bar" });
    const post = await getTestDb()("posts").first();
    expect(post.title).toEqual("Foo");
  });

  test("creates log records", async () => {
    const domain = getTestDomain();
    await domain.useCases.createPost().execute({ title: "Foo", body: "Bar" });
    const log = await getTestDb()("logs").first();
    expect(log.action).toEqual("Posts/Create");
  });
});

No HTTP server. No mocking framework. No jest.mock() calls at the module level. You control behavior by swapping what gets injected. The mocks.setCurrentUser(null) call changes what the mock adapter returns; everything else uses the real SQLite implementation.
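How might a switchable mock like that work? One minimal sketch, assuming a module-level state object shared between the setter and the injected adapter (the names here mirror the tests above but are otherwise illustrative):

```typescript
// Sketch: a mock auth adapter whose behavior tests can flip at runtime.
// The IUser shape and the example user are assumptions.
interface IUser {
  name: string;
  email: string;
  role: string;
}

interface IGetCurrentUserService {
  getCurrentUser(): Promise<IUser | null>;
}

// shared mutable state, read on every call so changes apply immediately
const state: { currentUser: IUser | null } = {
  currentUser: { name: "Mock User", email: "mock@example.test", role: "admin" },
};

export const mocks = {
  // tests call this to change what the injected adapter returns
  setCurrentUser(user: IUser | null) {
    state.currentUser = user;
  },
  // the container wires this adapter into use cases under test
  getCurrentUser(): IGetCurrentUserService {
    return { getCurrentUser: async () => state.currentUser };
  },
};
```

Because the adapter reads the shared state on every call rather than capturing a snapshot, a test can set up its scenario with one line and the next call through the use case sees it.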


What You Get

Frontend developers can work without the backend. Set APP_MOCK=true, run the dev server, and you have a fully functional app with realistic data. No VPN, no credentials, no setup.

Use cases are independently testable. Because they only depend on interfaces, you inject whatever you need — real SQLite for data logic, mocks for auth, mocks for external services. The tests are fast, deterministic, and readable.

Adding a new backend is a new adapter, not a rewrite. The domain and use cases stay exactly the same. You write new implementations of the ports and wire them into a new container.

Type safety across the boundary. The execute() function infers argument types and return types from the container. You can’t call a use case with the wrong arguments, and the return type flows back to the component without any manual annotation.

The Tradeoffs

More files. A feature that might live in one route handler in a typical Next.js app is spread across a model, a port, a use case, an adapter, and a container entry. That’s the cost.

It’s worth it when:

  • The backend requires external dependencies (VPN, OAuth, running services)
  • Multiple developers work on the frontend and backend independently
  • The business logic is complex enough to warrant isolation and testing
  • There’s a realistic chance of migrating the backend

It’s probably overkill for a simple app backed by a single Postgres instance where every developer can run docker-compose up and be ready in 30 seconds.

The architecture doesn’t add complexity to be clever. It adds the minimum structure needed to answer one question cleanly: can I run this app without its backend? If the answer matters to your team, hexagonal architecture is how you make it yes.