Performance Patterns That Actually Matter in Production
We're talking performance today.
And before you scroll away thinking "yeah yeah, lazy load your images, use a CDN" — I promise this is not that blog. We're going deeper. We're talking about the stuff that actually comes up in system design interviews AND the stuff that makes your production app not feel like garbage.
Let's get into it.
Why Performance Even Matters (Beyond the Obvious)
Here's the real talk: a one-second delay in page load correlates with roughly a 7% drop in conversions. Google uses Core Web Vitals as a ranking signal. And your users on mobile in tier-2 cities? They're on 4G with 200ms latency. If your app isn't optimized, they're gone.
But in interviews, performance is also where juniors and seniors get separated. A junior says "use lazy loading." A senior says "here's when to use it, why, and what the tradeoff is." That's what we're building today.
The Four Pillars We're Covering Today
Virtualization
Code Splitting
Lazy Loading
Bundle Size Optimization
Each one has a when to use it, a how it works, and a real example. Let's go.
1. Virtualization — Don't Render What You Can't See
Imagine you have a list of 10,000 rows in a table. If you render all 10,000 DOM nodes at once, your browser is sweating. FPS drops. Scrolling feels janky. Users hate it.
Virtualization (also called windowing) solves this by only rendering the rows currently visible in the viewport — plus a small buffer above and below.
How it works
You maintain a virtual list. As the user scrolls, you calculate which items should be visible based on scroll position and item height, and only render those. Items that scroll out of view get unmounted (or recycled).
```jsx
// Without virtualization — renders all 10,000 rows
function BadList({ items }) {
  return (
    <div>
      {items.map(item => <Row key={item.id} data={item} />)}
    </div>
  );
}
```

```jsx
// With react-window — only renders ~20 visible rows
import { FixedSizeList } from 'react-window';

function GoodList({ items }) {
  return (
    <FixedSizeList
      height={600}
      itemCount={items.length}
      itemSize={50}
      width="100%"
    >
      {({ index, style }) => (
        <div style={style}>
          <Row data={items[index]} />
        </div>
      )}
    </FixedSizeList>
  );
}
```
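Under the hood, the math a library like react-window does is simple. Here's a sketch in plain JS (the function and parameter names are mine, not react-window's internals) of computing which indices to render for fixed-height rows:

```javascript
// Sketch: which row indices should a fixed-height virtualized list render?
function getVisibleRange({ scrollTop, viewportHeight, itemHeight, itemCount, overscan = 3 }) {
  const firstVisible = Math.floor(scrollTop / itemHeight);     // first row in view
  const visibleCount = Math.ceil(viewportHeight / itemHeight); // rows that fit on screen
  const start = Math.max(0, firstVisible - overscan);          // small buffer above
  const end = Math.min(itemCount - 1, firstVisible + visibleCount + overscan); // buffer below
  return { start, end };
}

// 10,000 rows, scrolled 500px down a 600px viewport of 50px rows:
getVisibleRange({ scrollTop: 500, viewportHeight: 600, itemHeight: 50, itemCount: 10000 });
// → { start: 7, end: 25 }: 19 rendered rows instead of 10,000
```

The overscan buffer is what stops blank flashes while the user is mid-scroll.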
When to use it
Lists with 100+ items that users scroll through
Data tables, feeds, chat history, autocomplete dropdowns with many results
The tradeoff
Virtualization adds complexity. For small lists (under 50 items), it's overkill. Don't over-engineer.
2. Code Splitting — Don't Ship What You Don't Need Yet
By default, a React app bundles everything into one giant JS file. User lands on your home page? They're downloading code for your settings page, your admin dashboard, and that feature you shipped last quarter that nobody uses.
Code splitting breaks your bundle into smaller chunks that load on demand.
How it works in React
React has React.lazy and Suspense built-in for this.
```jsx
import React, { Suspense, lazy } from 'react';

// Before — eagerly loaded, always in the main bundle:
// import HeavyDashboard from './HeavyDashboard';

// After — lazily loaded, split into its own chunk:
const HeavyDashboard = lazy(() => import('./HeavyDashboard'));

function App() {
  return (
    <Suspense fallback={<div>Loading...</div>}>
      <HeavyDashboard />
    </Suspense>
  );
}
```
Vite and webpack both handle the actual chunk splitting automatically once you use dynamic imports.
Route-based splitting — the most common pattern
```jsx
import { lazy, Suspense } from 'react';
import { Routes, Route } from 'react-router-dom';

const Home = lazy(() => import('./pages/Home'));
const Dashboard = lazy(() => import('./pages/Dashboard'));
const Settings = lazy(() => import('./pages/Settings'));

function App() {
  return (
    <Suspense fallback={<PageLoader />}>
      <Routes>
        <Route path="/" element={<Home />} />
        <Route path="/dashboard" element={<Dashboard />} />
        <Route path="/settings" element={<Settings />} />
      </Routes>
    </Suspense>
  );
}
```
Now each route is its own chunk. User never visits settings? Settings code never downloads.
When to use it
Route-level splitting: always, almost no downside
Component-level: heavy components like rich text editors, charts, map libraries
Feature flags: code behind a flag shouldn't be in the main bundle
3. Lazy Loading — Images and Components on Demand
We covered component lazy loading above. But images are actually where lazy loading has the biggest impact for most apps.
Native lazy loading (just use this)
```html
<img src="product.jpg" alt="Product" loading="lazy" />
```
That's it. The browser handles it. It defers loading images that are below the fold until the user scrolls near them. Zero JS required.
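One easy win to pair with it: give lazy images explicit dimensions (or a CSS `aspect-ratio`) so the browser reserves the space before the image loads. Otherwise lazy images are a classic source of layout shift. The attribute values here are illustrative:

```html
<!-- width/height let the browser reserve the box up front,
     so the image popping in later doesn't shift the layout -->
<img src="product.jpg" alt="Product" width="400" height="300" loading="lazy" />
```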
Intersection Observer — when you need control
For more custom behavior (like infinite scroll, or lazy loading non-image content), use the Intersection Observer API.
```jsx
import { useEffect, useRef, useState } from 'react';

function LazyImage({ src, alt }) {
  const imgRef = useRef(null);
  const [isVisible, setIsVisible] = useState(false);

  useEffect(() => {
    const observer = new IntersectionObserver(
      ([entry]) => {
        if (entry.isIntersecting) {
          setIsVisible(true);
          observer.disconnect(); // stop observing once loaded
        }
      },
      { threshold: 0.1 }
    );
    if (imgRef.current) observer.observe(imgRef.current);
    return () => observer.disconnect();
  }, []);

  return (
    <div ref={imgRef}>
      {isVisible
        ? <img src={src} alt={alt} />
        : <div className="placeholder" />}
    </div>
  );
}
```
The threshold option
`threshold: 0.1` means "trigger when 10% of the element is visible." Set it higher (e.g. `0.5`) for elements you want mostly in view before loading.
4. Bundle Size — The Invisible Performance Tax
Your bundle size is a silent killer. Every kb of JS has to be:
Downloaded over the network
Parsed by the browser
Compiled and executed
A 500kb bundle vs a 200kb bundle isn't just a download difference — the parse and execution time on a mid-range Android phone can differ by seconds.
Check your bundle first
Don't optimise blindly. Use rollup-plugin-visualizer with Vite:
```js
// vite.config.js
import { visualizer } from 'rollup-plugin-visualizer';

export default {
  plugins: [
    visualizer({ open: true }) // opens a treemap of your bundle after build
  ]
};
```
Run `npm run build` and it opens a visual breakdown. You'll probably find one library eating 40% of your bundle.
Common culprits and fixes
Moment.js — 67kb gzipped just for date formatting. Replace with date-fns (tree-shakeable, you only import what you use) or dayjs (2kb).
```js
// Bad — imports the entire moment library
import moment from 'moment';

// Good — tree-shakeable, only imports format
import { format } from 'date-fns';
```
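And if your needs are basic (formatting a date for display), the built-in `Intl` API may already cover them at zero bundle cost, since it ships with every browser and with Node:

```javascript
// Intl.DateTimeFormat is built in: zero bundle bytes, no dependency at all
const fmt = new Intl.DateTimeFormat('en-US', {
  year: 'numeric',
  month: 'short',
  day: 'numeric',
});

fmt.format(new Date(2024, 0, 15)); // → "Jan 15, 2024" (JS months are 0-indexed)
```

It won't replace a full date library for parsing or arithmetic, but for display formatting it's often enough.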
Lodash — don't import the whole thing.
```js
// Bad — imports all of lodash
import _ from 'lodash';
_.debounce(fn, 300);

// Good — only imports debounce
import debounce from 'lodash/debounce';
```
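For perspective on what that import actually buys you: the core of debounce is about ten lines. A minimal sketch (without the `leading`/`trailing`/`maxWait` options lodash layers on top):

```javascript
// Minimal debounce: collapse a burst of calls into one trailing call
function debounce(fn, wait) {
  let timer = null;
  return function debounced(...args) {
    clearTimeout(timer); // cancel the previously scheduled call
    timer = setTimeout(() => fn.apply(this, args), wait);
  };
}

// Usage: fires once, 300ms after the last resize event in a burst
const onResize = debounce(() => console.log('resized'), 300);
```

Knowing it's this small is also a decent interview talking point: you import the utility for its edge-case handling, not because the idea is complicated.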
Icon libraries — importing from the top of `@mui/icons-material` can pull in thousands of icons if tree shaking isn't working.
```js
// Bad — might pull in the entire icon set, depending on bundler config
import { Delete } from '@mui/icons-material';

// Good — direct path import, guaranteed single icon
import DeleteIcon from '@mui/icons-material/Delete';
```
Tree shaking — make sure it's working
Tree shaking removes unused exports from your bundle. It only works with ES modules (`import`/`export`). If a library ships only CommonJS (`require`), it can't be tree-shaken.
Check your dependency's package.json for a `"module"` or `"exports"` field; that's a sign it ships ESM and can be tree-shaken. A `"sideEffects": false` entry is another good signal, since it tells the bundler it's safe to drop unused files entirely.
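For reference, the relevant fields in a tree-shakeable library's package.json look roughly like this (the package name and file paths are illustrative):

```json
{
  "name": "some-lib",
  "main": "dist/index.cjs.js",
  "module": "dist/index.esm.js",
  "sideEffects": false
}
```

`main` points at the CommonJS build for older tooling, `module` at the ESM build bundlers prefer, and `sideEffects: false` gives the bundler permission to prune aggressively.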
Putting It All Together — Interview Answer Framework
When a system design interviewer asks about performance, don't just list techniques. Structure your answer:
1. Identify the bottleneck first
Is it initial load time? → Code splitting, bundle size
Is it runtime rendering? → Virtualization, memoization
Is it network? → Lazy loading, caching, CDN
2. Propose the solution with the tradeoff
- "I'd use virtualization here because we have 10k rows — the tradeoff is added complexity but the rendering performance gain is worth it."
3. Mention metrics
LCP (Largest Contentful Paint) — loading
INP (Interaction to Next Paint) — interactivity; it replaced FID as a Core Web Vital in 2024
CLS (Cumulative Layout Shift) — visual stability
Saying "I'd measure this with Lighthouse and set a performance budget" puts you ahead of 90% of candidates.
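That performance budget doesn't have to be a vibe, either; Lighthouse can enforce one in CI via a budget.json along these lines (the paths and numbers here are illustrative, budgets are in KB):

```json
[
  {
    "path": "/*",
    "resourceSizes": [
      { "resourceType": "script", "budget": 200 },
      { "resourceType": "total", "budget": 500 }
    ]
  }
]
```

Now a PR that blows past the JS budget fails the check instead of quietly shipping.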
Quick Summary
| Pattern | Problem it solves | When to use |
|---|---|---|
| Virtualization | Too many DOM nodes | Lists 100+ items |
| Code Splitting | Oversized initial bundle | Route/feature level |
| Lazy Loading | Loading unused resources | Images, heavy components |
| Bundle optimization | Bloated dependencies | Always, check with visualizer |
What's Next
Day 5 we're going into state management patterns — when to use local state vs context vs Redux vs Zustand, and how to think about this in a system design interview without just saying "it depends."
If this helped, drop a reaction or share it with someone grinding frontend interviews. And if you disagree with anything here — good, argue with me in the comments. That's how we both learn.
