React's concurrent features are the most significant paradigm shift in the library's history—more impactful than hooks, more fundamental than the Context API. Yet most developers still write React as if the main thread is infinite. It isn't.
In this deep dive we'll cover useTransition, useDeferredValue, Suspense boundaries with streaming SSR, and the mental model you need for each. No hand-waving—real code, real flame graphs, real INP numbers.
The Concurrent Mental Model
Before React 18, rendering was synchronous and uninterruptible. A large re-render blocked the browser until React finished. Concurrent React introduces renders that can be interrupted, paused, and prioritised.
Concurrency in React is not about parallelism—JS is still single-threaded. It's about interruptibility: React can stop a low-priority render to handle a high-priority one.
Think of the React scheduler like an airport control tower managing one runway (the main thread). User interactions are emergency aircraft—they cut the queue. Background data fetches are cargo flights—they wait.
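The control-tower idea can be sketched as a toy two-queue scheduler. This is a simplification I'm adding for illustration, not React's real scheduler: the point is only that between units of work there is a check where urgent tasks preempt queued background work.

```javascript
// A toy priority scheduler (NOT React's real implementation): one
// "runway" (the main thread), two queues. Urgent work always runs
// before queued background work, which is the essence of how React
// lets an input event preempt a low-priority re-render.
function createScheduler() {
  const urgentQueue = [];
  const backgroundQueue = [];

  function flush() {
    const order = [];
    while (urgentQueue.length || backgroundQueue.length) {
      // Between every unit of work, re-check the urgent queue:
      // this is the "interruption point" concurrent rendering adds.
      const task = urgentQueue.length
        ? urgentQueue.shift()
        : backgroundQueue.shift();
      order.push(task());
    }
    return order;
  }

  return {
    urgent: (t) => urgentQueue.push(t),
    background: (t) => backgroundQueue.push(t),
    flush,
  };
}

// Background render work is queued first, then an input event arrives;
// flushing runs the input handler before the remaining render work.
const scheduler = createScheduler();
scheduler.background(() => 'render:list');
scheduler.background(() => 'render:chart');
scheduler.urgent(() => 'input:keypress');
```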
useTransition in Practice
useTransition marks a state update as non-urgent. React renders the previous state immediately while preparing the new one in the background.
```jsx
import { useTransition, useState } from 'react';

function SearchResults() {
  const [query, setQuery] = useState('');
  const [results, setResults] = useState([]);
  const [isPending, startTransition] = useTransition();

  function handleChange(e) {
    setQuery(e.target.value); // urgent: keep the input responsive
    startTransition(() => {
      // non-urgent: the re-render this triggers can be interrupted
      setResults(expensiveFilter(allData, e.target.value));
    });
  }

  return (
    <>
      <input value={query} onChange={handleChange} />
      {isPending ? <Spinner /> : <ResultsList items={results} />}
    </>
  );
}
```
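The example above assumes an `expensiveFilter` helper and an `allData` array that aren't shown; a minimal sketch of the helper might look like the following. Note one subtlety: the filtering itself still runs synchronously inside the event handler — what the transition deprioritises is the re-render triggered by `setResults`.

```javascript
// Hypothetical helper assumed by the example above: a case-insensitive
// substring filter. With tens of thousands of rows, the re-render of
// the result list (not this loop) is usually the expensive part.
function expensiveFilter(data, query) {
  const q = query.toLowerCase();
  return data.filter((item) => item.name.toLowerCase().includes(q));
}
```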
When NOT to use useTransition
Don't use it everywhere. Transitions maintain two trees in memory. Use it only when an expensive render causes measurable input jank. A simple counter update does not need it.
useDeferredValue
Where useTransition wraps the setter, useDeferredValue wraps the value—useful when you don't control the state setter (e.g. it comes from a parent).
```jsx
import { useDeferredValue } from 'react';

function Results({ query }) {
  const deferred = useDeferredValue(query);
  const isStale = deferred !== query; // true while the deferred render lags behind

  return (
    <div style={{ opacity: isStale ? 0.5 : 1 }}>
      <HeavyList filter={deferred} />
    </div>
  );
}
```
Suspense & Streaming SSR
In Next.js 15 App Router, async Server Components suspend automatically. The server sends the page shell immediately and streams deferred chunks as data resolves.
```tsx
// app/dashboard/page.tsx
import { Suspense } from 'react';

export default function Page() {
  return (
    <main>
      <Header /> {/* streamed immediately */}
      <Suspense fallback={<MetricsSkel />}>
        <MetricsPanel /> {/* async — streams when ready */}
      </Suspense>
      <Suspense fallback={<ChartSkel />}>
        <RevenueChart /> {/* slower query */}
      </Suspense>
    </main>
  );
}
```
The user sees the header within milliseconds, and content streams in as each Suspense boundary resolves; no single spinner holds the whole page hostage to the slowest query.
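The wire-level behaviour can be simulated in a few lines. This is an illustrative sketch I'm adding, not React's actual protocol (React streams script-driven swap instructions, and component names like `metrics` here are placeholders): the shell with fallbacks flushes first, then each boundary's HTML streams in resolution order, not document order.

```javascript
// Simulate out-of-order streaming: shell first, then a chunk per
// boundary as its data resolves. `write` stands in for writing to
// the HTTP response stream.
function delay(ms, value) {
  return new Promise((resolve) => setTimeout(() => resolve(value), ms));
}

async function streamPage(write) {
  // 1. The shell flushes immediately, with fallbacks in place.
  write('<main><header>Dashboard</header>');
  write('<div id="metrics">loading…</div>');
  write('<div id="chart">loading…</div></main>');

  const boundaries = [
    { id: 'metrics', data: delay(30, '<ul>…metrics…</ul>') },
    { id: 'chart', data: delay(80, '<svg>…chart…</svg>') },
  ];

  // 2. As each boundary resolves, stream a chunk that swaps in its HTML.
  await Promise.all(
    boundaries.map(async ({ id, data }) => {
      const html = await data;
      write(`<template data-for="${id}">${html}</template>`);
    })
  );
}
```

The faster query (`metrics`) reaches the client before the slower one (`chart`), even though both sit after the shell in the document.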
Measuring the Impact
Applied to a real analytics dashboard with 200+ metric cards:
- INP: 380 ms → 12 ms
- LCP: 4.2 s → 0.9 s
- Lighthouse score: 68 → 97
INP dropped from 380ms to 12ms. The page went from "sluggish" to "instant". LCP improved 4.7× because the shell streams before data resolves.
Summary
- Use `useTransition` to wrap expensive state updates that cause input lag
- Use `useDeferredValue` when you don't own the state setter
- Wrap slow async components in `<Suspense>` with skeleton fallbacks
- In the Next.js App Router, async Server Components suspend automatically; lean into it
- Always measure before optimising: Chrome DevTools Performance Insights shows INP directly