Streaming, Suspense, and loading.tsx: how we made the Proofly dashboard feel instant
A dashboard that awaits database queries before rendering feels slow even when the queries themselves are fast. Here's how we restructured Proofly's App Router layout to stream the shell immediately and push async data behind Suspense boundaries, so navigation feels instant regardless of query time.
TL;DR
Keep the dashboard layout synchronous. Put async data fetches in Suspense-wrapped child components with loading.tsx fallbacks at each route segment. The shell — header, nav, sidebar — renders and streams instantly. Data slots pop in when queries resolve. Navigation transitions feel instant because the shell is never waiting on data.
When we first built Proofly's dashboard, every page looked like this:
export default async function TestimonialsPage() {
  const user = await requireUser();
  const testimonials = await loadTestimonials(user.id); // 80ms DB query
  const stats = await getStats(user.id); // 45ms DB query
  return <TestimonialsView testimonials={testimonials} stats={stats} />;
}
Both queries ran sequentially: the page waited for loadTestimonials to finish before even starting getStats. The total wait before any HTML was 125ms, plus the network time for the first byte to reach the browser, plus React rendering. On a cold Vercel function, that was a 200–300ms white screen before anything appeared.
The pages weren't slow by database standards. But they felt slow.
The mental model: shell vs. data
The key realization is that a dashboard has two layers with different latency requirements:
The shell: the header, navigation, sidebar, layout structure. This never changes between navigations. It has zero data dependencies. It should render in single-digit milliseconds.
The data: the actual content — testimonials, metrics, wall configs. This requires database queries. It's inherently async and can't be faster than the slowest query.
The App Router lets you separate these cleanly with Suspense. The shell renders and streams immediately. Data slots show skeletons while their queries run, then pop in when the data arrives. The visitor sees chrome instantly and content a few hundred milliseconds later — instead of a blank page for 300ms followed by everything at once.
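The timeline can be sketched with plain promises. This is a standalone simulation, not app code; the 45ms and 80ms delays stand in for the query times used in the example above:

```typescript
// Standalone simulation (not app code): the order in which the shell and
// each data slot reach the browser when the slots stream independently.
const delay = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function streamOrder(): Promise<string[]> {
  const arrived: string[] = [];
  arrived.push("shell"); // synchronous layout: flushed immediately
  await Promise.all([
    delay(45).then(() => arrived.push("stats")), // 45ms query
    delay(80).then(() => arrived.push("testimonials")), // 80ms query
  ]);
  return arrived;
}

streamOrder().then((order) => console.log(order.join(" -> ")));
// shell -> stats -> testimonials
```

The shell is in the array before any timer fires, which is exactly the property the synchronous layout buys you.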
Making the layout synchronous
The dashboard layout is the most important piece. If it has any async work, it blocks the entire shell from streaming.
// app/dashboard/layout.tsx
import Link from "next/link";
import { Suspense } from "react";
// ...imports for ThemeToggle, DashboardNav, DashboardNavForUser,
// UserMenuFallback, and DashboardUserMenu...

export default function DashboardLayout({
  children,
}: {
  children: React.ReactNode;
}) {
  return (
    <div className="relative isolate flex min-h-screen flex-col">
      <header className="sticky top-0 z-30 ...">
        <div className="flex items-center gap-8">
          <Link href="/dashboard">Proofly</Link>
          <Suspense fallback={<DashboardNav plan="sketch" />}>
            <DashboardNavForUser />
          </Suspense>
        </div>
        <div className="flex items-center gap-2">
          <ThemeToggle />
          <Suspense fallback={<UserMenuFallback />}>
            <DashboardUserMenu />
          </Suspense>
        </div>
      </header>
      <main>{children}</main>
    </div>
  );
}
The layout function itself is synchronous — no async, no await. The parts that need user data (DashboardNavForUser shows plan-aware nav items, DashboardUserMenu shows the avatar) are wrapped in <Suspense> with instant fallbacks.
This means the layout shell (the sticky header, the layout structure, the nav skeleton) renders and streams as fast as a static HTML page. The async components inside Suspense stream in afterwards.
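For reference, the async component behind one of those boundaries is a thin wrapper that resolves the user and renders the same chrome with real data. This is a hedged sketch based on how the rest of the post describes it; the requireUser helper and the plan field are assumptions, not Proofly's actual implementation:

```tsx
// Hypothetical sketch of the async nav component. `requireUser` and the
// `plan` field are assumed from context, not the real code.
async function DashboardNavForUser() {
  const user = await requireUser(); // suspends until the user resolves
  return <DashboardNav plan={user.plan} />; // same component as the fallback
}
```

Because the fallback and the resolved output render the same component, the swap is invisible apart from the plan-aware items appearing.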
Route-level loading.tsx
For each dashboard route segment, we add a loading.tsx file:
// app/dashboard/loading.tsx
import { PageSkeleton } from "./_components/ui";

export default function Loading() {
  return <PageSkeleton withMetrics rows={4} />;
}
This file automatically wraps the segment's page.tsx in a Suspense boundary with <Loading /> as the fallback. Next.js handles the wrapping — you don't write the Suspense boundary manually.
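Conceptually, the tree Next.js produces for the segment looks like this (an illustrative sketch, not the framework's actual generated code):

```tsx
// What the loading.tsx convention effectively expands to:
<Suspense fallback={<Loading />}>
  <Page />
</Suspense>
```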
What this means in practice: when you click a link to /dashboard/testimonials, the layout shell paints instantly (it was already cached or streaming), the loading.tsx skeleton appears in the content area immediately, and the actual testimonials load in when the query finishes. The transition feels like client-side navigation even though the content is server-rendered.
Parallel data fetching inside pages
The other half of the problem: serial database queries inside a single page. Even with streaming, if the page runs two queries back-to-back, the data arrives after both finish instead of each arriving as soon as it's ready.
// Before: serial queries, 125ms total wait
const testimonials = await loadTestimonials(user.id); // 80ms
const stats = await getStats(user.id); // 45ms

// After: parallel queries with Promise.all, 80ms total wait
const [testimonials, stats] = await Promise.all([
  loadTestimonials(user.id),
  getStats(user.id),
]);
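The effect is easy to verify outside the app. A minimal standalone simulation (the helper names and timings are stand-ins for the queries above, not Proofly's code):

```typescript
// Standalone simulation (not app code): serial awaits sum the latencies;
// Promise.all is bounded by the slowest one.
const wait = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));
const fakeTestimonials = () => wait(80).then(() => ["Great product!"]); // 80ms stand-in
const fakeStats = () => wait(45).then(() => ({ total: 1 })); // 45ms stand-in

async function serialMs(): Promise<number> {
  const t0 = Date.now();
  await fakeTestimonials();
  await fakeStats();
  return Date.now() - t0; // ~125ms: the second query waits for the first
}

async function parallelMs(): Promise<number> {
  const t0 = Date.now();
  await Promise.all([fakeTestimonials(), fakeStats()]);
  return Date.now() - t0; // ~80ms: both start immediately
}

serialMs().then((s) => parallelMs().then((p) => console.log({ serial: s, parallel: p })));
```

The parallel version's wait is the max of the two latencies rather than their sum, which is why it matters most when a page fans out to several independent queries.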
For truly independent data slots, you can go further and push each into its own async component behind its own Suspense boundary:
// app/dashboard/page.tsx
export default async function DashboardPage() {
const user = await requireUser();
return (
<div>
<Suspense fallback={<MetricsSkeleton />}>
<DashboardMetrics userId={user.id} />
</Suspense>
<Suspense fallback={<RecentSkeleton />}>
<RecentTestimonials userId={user.id} />
</Suspense>
</div>
);
}
async function DashboardMetrics({ userId }: { userId: string }) {
const stats = await getStats(userId); // 45ms, starts immediately
return <MetricsView stats={stats} />;
}
async function RecentTestimonials({ userId }: { userId: string }) {
const testimonials = await loadRecentTestimonials(userId); // 80ms, starts immediately
return <TestimonialsView testimonials={testimonials} />;
}
Now both queries start at the same time, and each section pops in independently as its query resolves. The visitor sees metrics at 45ms and testimonials at 80ms rather than everything at 125ms.
The tool pages: large client components
The marketing tool pages (like our free testimonial request builder) have a different problem. They use large "use client" components (interactive form builders) that ship significant JavaScript bundles. The issue isn't a slow database query; it's JavaScript parse time.
For these, we apply the same pattern at the segment level:
// app/(marketing)/tools/[tool]/loading.tsx
export default function Loading() {
  return <ToolSkeleton className="h-[600px]" />;
}
The skeleton is sized to match the tool's rendered height. In production builds, Next.js automatically prefetches <Link> hrefs as they enter the viewport, so the static shell is ready before the click: the visitor sees the skeleton immediately while the JavaScript bundle loads. Note that this prefetching is disabled in development, so you won't see the behavior in dev.
What we don't put behind Suspense
Not everything needs a Suspense wrapper. The rule we use:
- Does it block the shell from rendering? If yes, move it behind Suspense.
- Does it have a meaningful skeleton state? If no (like a single number that loads in 5ms), wrapping it in Suspense adds code complexity for no visible benefit.
- Is it part of the persistent chrome (header, nav)? Wrap in Suspense with an instant same-size fallback, so the chrome never shifts on load.
The dashboard layout's <DashboardNavForUser> is wrapped because it resolves the user's plan (sketch vs studio) to show different nav items. The fallback is <DashboardNav plan="sketch" /> — the same component rendered without plan data, so there's no layout shift when the real plan resolves.
The result
Navigation between dashboard pages now feels instant. The layout shell never blocks. The content areas show skeletons for the time it takes to resolve their queries — typically 40–100ms on warm Vercel functions — and then pop in. Cold invocations are slower, but even then the shell is painted first and the content follows rather than everything arriving together after a blank period.
The change isn't about making queries faster. It's about getting pixels on screen before queries finish — and then streaming the rest in.
Frequently asked
Does loading.tsx work for client-side navigation or only on page load?
Both, but differently. On the initial page load, loading.tsx creates an HTML streaming boundary — the shell renders first, data streams in. On client-side navigation (clicking a link inside the app), Next.js uses the loading.tsx skeleton as the instant-render placeholder while the page's Server Components fetch their data. The result is that navigating between dashboard pages feels like a client-side transition even though the content is server-rendered.
How do Suspense boundaries interact with the sticky dashboard header?
The header is part of the dashboard layout, which is synchronous. It renders and paints immediately on any navigation — no database queries, no async work. The header never waits. Only the content area below it shows a loading skeleton. This is why the layout shell stays synchronous: it's the persistent chrome that should always be present.
What's the difference between a loading.tsx skeleton and a Suspense fallback?
loading.tsx is Next.js's syntactic sugar for a Suspense boundary at the route-segment level. Placing a loading.tsx in /dashboard automatically wraps that segment's page.tsx in <Suspense fallback={<Loading />}>. Inline Suspense boundaries (inside a page or layout) give you more granular control: you can stream different data slots independently within the same page.