Core Web Vitals turned from a vague Google promise into a real ranking factor in 2021. By 2025, the signal has matured, the tooling has improved, and most SEOs have figured out what to actually care about. But there is still a lot of noise. This is what actually matters.
INP Replaced FID -- And That Actually Mattered
First Input Delay was a terrible metric. It measured the delay before the browser started processing an interaction, not the full delay users felt. Interaction to Next Paint replaced it in March 2024, and it is a genuinely better signal. INP captures the full latency of an interaction: from click or keypress to the next paint that reflects that input.
The 200ms threshold for "Good" INP is achievable but requires real work. The main culprits: long JavaScript tasks blocking the main thread, synchronous third-party scripts, and render-blocking resources. If your INP score is in the "Needs Improvement" range (200-500ms), start by recording a trace in the Performance panel of Chrome DevTools. Look for long tasks exceeding 50ms. Break them up with scheduler.yield() or defer them.
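Breaking up a long task looks roughly like this. A hedged sketch: the function names and the 50-item chunk size are illustrative, and scheduler.yield() is feature-detected because it is not available in every browser.

```javascript
// Yield to the main thread between chunks of work so the browser can
// handle pending input events (clicks, keypresses), which improves INP.
// Uses scheduler.yield() where supported, setTimeout(0) as a fallback.
function yieldToMain() {
  if (globalThis.scheduler?.yield) {
    return globalThis.scheduler.yield();
  }
  return new Promise((resolve) => setTimeout(resolve, 0));
}

// Process items in bounded chunks instead of one long blocking task.
// processItem and chunkSize are illustrative placeholders.
async function processInChunks(items, processItem, chunkSize = 50) {
  const results = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    for (const item of items.slice(i, i + chunkSize)) {
      results.push(processItem(item));
    }
    await yieldToMain(); // let the browser paint and handle input
  }
  return results;
}
```

The total work is the same; it is just no longer one unbroken task that holds the main thread past the 50ms long-task threshold.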
For most sites, third-party scripts are the primary INP killer. Tag managers, chat widgets, and A/B testing tools all compete for main thread time. Run a quick test: load your page with all third parties disabled and measure INP. The delta tells you your third-party tax.
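The before/after measurements can be turned into a quick report. A minimal sketch using the 200ms / 500ms INP thresholds from above (function and field names are illustrative):

```javascript
// Rate an INP value against the published thresholds:
// <= 200 ms "Good", 200-500 ms "Needs Improvement", > 500 ms "Poor".
function rateINP(ms) {
  if (ms <= 200) return "good";
  if (ms <= 500) return "needs-improvement";
  return "poor";
}

// Compare INP measured with and without third-party scripts loaded.
function thirdPartyTax(inpWithThirdParties, inpWithoutThirdParties) {
  return {
    deltaMs: inpWithThirdParties - inpWithoutThirdParties,
    withThirdParties: rateINP(inpWithThirdParties),
    withoutThirdParties: rateINP(inpWithoutThirdParties),
  };
}
```

If the delta alone moves you across a threshold boundary, the third parties are the problem, not your own code.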
LCP: Still the One That Correlates With Rankings
Of the three CWV metrics, LCP has the strongest documented correlation with ranking positions. The target is 2.5 seconds or better from page load start. For most sites, the LCP element is either a hero image or a large text block.
The biggest LCP win available in 2025: preload the LCP image. A single line in your document's head:

```html
<link rel="preload" as="image" href="/hero.webp" fetchpriority="high">
```

This tells the browser to fetch that image at high priority, before it would otherwise be discovered in the HTML or CSS. Combined with proper image sizing (width/height attributes to prevent layout shifts), serving WebP or AVIF formats, and using a CDN, most sites can get LCP under 2.5 seconds without a full site rebuild.
For server-rendered pages, TTFB is upstream of LCP. If your server takes 600ms to respond, your LCP will suffer regardless of frontend optimization. Cache aggressively. Use edge caching where possible.
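The arithmetic is worth making explicit. A toy helper (the name and target constant are illustrative) showing how TTFB consumes the LCP budget before any frontend work begins:

```javascript
// TTFB spends part of the 2.5 s LCP budget before the browser
// receives a single byte of HTML to work with.
const LCP_TARGET_MS = 2500;

function lcpBudgetAfterTtfb(ttfbMs) {
  return {
    remainingMs: LCP_TARGET_MS - ttfbMs,
    ttfbShareOfBudget: ttfbMs / LCP_TARGET_MS,
  };
}
```

A 600ms TTFB leaves only 1.9 seconds for HTML parsing, resource fetches, and rendering combined.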
CLS: The One Most Sites Actually Fixed
Cumulative Layout Shift is largely a solved problem for sites that went through the exercise of fixing it. The main causes -- images without dimensions, dynamically injected content above existing content, web fonts causing FOUT -- have well-documented fixes.
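For intuition about what the CLS number actually represents: individual layout-shift scores are grouped into session windows (a new window starts after a 1-second gap between shifts, and a window is capped at 5 seconds), shifts caused by recent user input are excluded, and the reported CLS is the largest window total. A rough sketch of that aggregation:

```javascript
// Approximate CLS aggregation over layout-shift entries, each shaped
// like { value, startTime, hadRecentInput } (as in the Layout
// Instability API). Returns the largest session-window total.
function computeCLS(shifts) {
  let max = 0;
  let windowStart = -Infinity;
  let lastTime = -Infinity;
  let windowValue = 0;
  for (const s of shifts) {
    if (s.hadRecentInput) continue; // user-initiated shifts don't count
    // New session window after a 1 s gap, or when the window hits 5 s.
    if (s.startTime - lastTime >= 1000 || s.startTime - windowStart >= 5000) {
      windowStart = s.startTime;
      windowValue = 0;
    }
    windowValue += s.value;
    lastTime = s.startTime;
    if (windowValue > max) max = windowValue;
  }
  return max;
}
```

This is why one big late shift can outweigh several small early ones: windows are scored independently and only the worst one counts.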
What to watch for in 2025: CLS regressions from A/B testing tools. Platforms like Optimizely and VWO inject content asynchronously, which can tank CLS scores. Run CWV audits before and after launching new tests. Some testing platforms have introduced anti-flicker options that reduce CLS impact.
Also worth checking: layout shifts triggered by cookie consent banners. If your consent banner pushes content down on load, that is a direct CLS hit. Use position: fixed, not position: relative, for these elements.
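For illustration (the class name and copy are placeholders), an overlaid banner that cannot shift surrounding content:

```html
<!-- With position: fixed, the banner takes no space in the document
     flow, so showing or hiding it never shifts existing content. -->
<div class="consent-banner" style="position: fixed; bottom: 0; left: 0; right: 0;">
  <p>We use cookies. <button>Accept</button> <button>Decline</button></p>
</div>
```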
What Google Has De-emphasized
Core Web Vitals are a tiebreaker, not a primary ranking signal. A page with exceptional content relevance and strong backlink authority will outrank a page with perfect CWV scores. Google has been consistent about this, and the data backs it up.
The CWV "bonus" in rankings is real but modest. Treat CWV optimization as a floor you need to meet, not a ceiling you are trying to maximize. Once you hit "Good" across all three metrics at the 75th percentile of page loads (the threshold CrUX and PageSpeed Insights assess against), the return on additional CWV work drops sharply.
The Workflow That Works
- Pull your Core Web Vitals data from Google Search Console (the Core Web Vitals report, which is built on real-user field data rather than Lighthouse lab scores).
- Segment by URL group. Find pages in "Poor" status first.
- Use CrUX data to validate -- Chrome User Experience Report gives you real-user data.
- Diagnose per-page with Lighthouse or WebPageTest.
- Fix the highest-traffic pages first. CWV is weighted by traffic in how Google assesses your site.
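The CrUX validation step can be automated against the public CrUX API. A sketch assuming Node 18+ (for global fetch); the API key and origin are placeholders, while the endpoint and metric names follow the CrUX API's queryRecord method:

```javascript
// Query the CrUX API for an origin's real-user CWV field data.
const CRUX_ENDPOINT =
  "https://chromeuxreport.googleapis.com/v1/records:queryRecord";

// Build the request body for an origin-level query.
function buildCruxRequest(origin) {
  return {
    origin,
    metrics: [
      "largest_contentful_paint",
      "interaction_to_next_paint",
      "cumulative_layout_shift",
    ],
  };
}

async function queryCrux(origin, apiKey) {
  const res = await fetch(`${CRUX_ENDPOINT}?key=${apiKey}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildCruxRequest(origin)),
  });
  if (!res.ok) throw new Error(`CrUX API error: ${res.status}`);
  return res.json();
}
```

The response includes per-metric histograms and the 75th-percentile values, which is exactly the number Search Console's status is derived from.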
The sites that have CWV fully handled are running automated Lighthouse audits in CI/CD pipelines, alerting on regressions before they hit production. If your site is large enough, that investment pays for itself quickly.
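A minimal lighthouserc.json sketch for Lighthouse CI (@lhci/cli): the URL and budget values are placeholders, and total-blocking-time stands in for INP because lab runs cannot measure real user interactions.

```json
{
  "ci": {
    "collect": {
      "url": ["https://example.com/"],
      "numberOfRuns": 3
    },
    "assert": {
      "assertions": {
        "largest-contentful-paint": ["error", { "maxNumericValue": 2500 }],
        "cumulative-layout-shift": ["error", { "maxNumericValue": 0.1 }],
        "total-blocking-time": ["error", { "maxNumericValue": 200 }]
      }
    }
  }
}
```

Wired into CI, this fails the build when a regression pushes a lab metric past its budget, before the change ever reaches field data.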
Tools Worth Using
- PageSpeed Insights: Lab data plus field data if your URL has enough traffic
- CrUX Dashboard (Looker Studio): Historical CWV trends for your domain
- WebPageTest: More diagnostic detail than PSI, filmstrip view shows exactly when LCP fires
- Screaming Frog: Crawl and flag pages missing width/height on images (CLS prevention)
Stop obsessing over PSI scores. A 72 on PSI with real-user "Good" CWV data is better than a 95 PSI score with real-user "Poor" CWV. Field data is what Google uses. Lab data is for diagnosis.