
The cost of a bad experience

Core Web Vitals help quantify the impact

Experience is the throwaway phrase of choice for every panelist, keynote, and webinar speaker. It encapsulates the entirety of the engagement, yet carries no accountability for any single moment.


But when Google gets involved in quantifying even a slice of that experience, we sit up straight.


Quick CWV history:
~ Launched in 2021 with Largest Contentful Paint (a measure of page load), Cumulative Layout Shift (stability), and First Input Delay (interactivity) - see the measurement sketch after this list.

~ Touted as a set of page performance measures that contribute to the user experience… and, more critically, have <some> influence over page rank.

~ Updated in March 2024, replacing FID with Interaction to Next Paint (INP).

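For those curious where these numbers come from, here is a minimal field-measurement sketch using Google's open source web-vitals JavaScript library; the reporting endpoint and payload shape are illustrative assumptions, not anything prescribed by the benchmark.

```typescript
// Minimal field collection of the three current CWVs with Google's
// web-vitals library (v3+). The '/analytics/web-vitals' endpoint and the
// payload shape below are illustrative assumptions.
import { onLCP, onCLS, onINP, type Metric } from 'web-vitals';

function report(metric: Metric): void {
  const body = JSON.stringify({
    name: metric.name,     // 'LCP' | 'CLS' | 'INP'
    value: metric.value,   // milliseconds for LCP/INP, unitless score for CLS
    rating: metric.rating, // 'good' | 'needs-improvement' | 'poor'
  });
  // sendBeacon keeps delivering even as the page unloads, which matters
  // because INP and CLS are only final late in the page lifecycle.
  navigator.sendBeacon('/analytics/web-vitals', body);
}

onLCP(report);
onCLS(report);
onINP(report);
```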

Since release, CWVs have gradually gained acceptance. You know you’ve made it when the CWV gang finds a home in platform reporting UIs (including Shopify).


Although CWVs are aimed at the user experience, it was Google’s industry weight, along with the threat of search rank penalties, that helped drive attention (and acceptance).

There’s a lesson in that of course: we’ll do anything for traffic.


So what about that user experience? Does poor CWV performance have an impact on sessions?


Using two of the CWVs, LCP and the newly minted INP, we looked at the impact that ‘poor’ performance has on the shopper journey.


~LCP~ has become the de facto measure of page load speed, thankfully cutting through the mess of competing page load metrics used across ecommerce (DOM complete, onLoad, Time to Interactive, TTFB). Slow page loads are a top frustration factor - and they’re at their worst when they lead to shoppers bouncing. Bounces remain the absolute worst outcome; all that hard work to attract a shopper, only to see them leave. There is, of course, plenty of natural bounce - sometimes one page is “enough.” LCP performance ranges help us tease out the ‘controllable bounce’. And the signal shines through clearly: sites with poor LCP performance see 3.9% more bounces (47.5% vs. 45.7%).
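A quick back-of-the-envelope check of those bounce figures, using only the numbers quoted above: the 3.9% is the relative lift over the good-LCP baseline, which works out to roughly 1.8 percentage points.

```typescript
// Sanity check of the bounce-rate gap quoted above (figures from the post).
const goodLcpBounceRate = 0.457; // sites with good LCP
const poorLcpBounceRate = 0.475; // sites with poor LCP

// Absolute gap: ~1.8 percentage points.
const gapPoints = (poorLcpBounceRate - goodLcpBounceRate) * 100;

// Relative lift: ~3.9% more bounces on poor-LCP sites.
const relativeLift = (poorLcpBounceRate - goodLcpBounceRate) / goodLcpBounceRate;

console.log(gapPoints.toFixed(1));            // "1.8"
console.log((relativeLift * 100).toFixed(1)); // "3.9"
```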


~INP~ is intriguing as a way to measure frustration within the journey. Sites that are slow to respond to user interactions create more friction along the way, likely limiting the potential of deeper sessions. And shoppers on those poor-performing sites (compared with those earning a good INP score) consume one fewer page, a -12.5% difference.

<note - bounced sessions are excluded from that calculation>


Put it all together and poor performance on LCP and INP slices engagement by 15.5% across the journey. Or, applied to retail’s RPV ($3.58), the cost of poor site performance is $0.56 per session.
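As a worked check, the per-session cost is simply the engagement gap applied to RPV; and one plausible way the bounce and page-depth gaps combine into roughly -15.5% is sketched below. The combination method is our assumption, not the benchmark's stated methodology.

```typescript
// Worked check of the headline numbers (input figures from the post).
const rpv = 3.58;            // retail revenue per visit, $
const engagementGap = 0.155; // combined LCP + INP engagement loss

// Cost of poor performance per session: 3.58 * 0.155 ≈ $0.55,
// which the post rounds up to $0.56 (its inputs are presumably less rounded).
console.log((rpv * engagementGap).toFixed(2)); // "0.55"

// One plausible reconstruction of the -15.5% (an assumption, not the
// benchmark's stated method): fewer non-bounce sessions, each with fewer pages.
const goodEngagedShare = 1 - 0.457; // 54.3% of sessions go past one page
const poorEngagedShare = 1 - 0.475; // 52.5%
const pageDepthDrop = 0.125;        // -12.5% pages per session (poor INP)

const combinedGap = (poorEngagedShare / goodEngagedShare) * (1 - pageDepthDrop) - 1;
console.log((combinedGap * 100).toFixed(1)); // "-15.4", close to the quoted -15.5%
```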


At a time when cost per visit is soaring (up 12.4% YoY), allowing frustration to fester inside the journey is sinful - and CWVs help show how much potential is wasted by poor site performance.

Source: Contentsquare Retail Digital Experience Benchmark


