Pages are getting bigger and it still matters


Google’s Gary Illyes and Martin Splitt used a recent episode of the Search Off the Record podcast to discuss whether web pages are becoming too large and what this means for both users and crawlers.

The conversation started with a simple question: Are websites getting bigger? Splitt immediately pushed back on the framing, arguing that website-level size makes no sense. The discussion, he said, belongs at the level of the individual page.

What the data shows

Splitt cited the Web Almanac 2025 from HTTP Archive, which found that the median mobile homepage was 845 KB in 2015. By July 2025, that same median page had grown to 2,362 KB, roughly a threefold increase over a decade.

Both agreed that growth was expected, given the complexity of modern web applications. But the numbers still surprised them.

Splitt highlighted the challenge of defining “page weight” consistently, because different people interpret the term differently depending on whether they’re thinking about raw HTML, transferred bytes, or whatever a browser needs to render a page.
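The gap between those definitions is easy to demonstrate. The sketch below (illustrative only, using a toy HTML string rather than a real page) contrasts the raw HTML size with the bytes that would actually travel over the wire when the server applies gzip compression:

```python
import gzip

# A toy HTML document standing in for a real page (hypothetical content).
html = "<!doctype html><html><body>" + "<p>Hello, web!</p>" * 500 + "</body></html>"

raw_bytes = len(html.encode("utf-8"))                    # "raw HTML" size
transferred = len(gzip.compress(html.encode("utf-8")))   # bytes on the wire with gzip

print(f"raw HTML:    {raw_bytes} bytes")
print(f"transferred: {transferred} bytes (gzip)")
```

Neither number captures everything a browser needs to render the page, since CSS, JavaScript, images, and fonts are fetched separately, which is why the same page can honestly be described with very different "weights."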

How Google’s crawl limits fit in

Illyes discussed a default of 15 MB that applies across Google’s broader crawling infrastructure, where each URL has its own limit and referenced resources such as CSS, JavaScript, and images are fetched separately.

This differs from the numbers in Google’s current Googlebot documentation, which says Googlebot for Google Search crawls the first 2 MB of a supported file type and the first 64 MB of a PDF.

Our previous coverage has broken down the documentation update that clarified these numbers earlier this year. Illyes and Splitt discussed the flexibility of these limits in a previous episode, noting that internal teams can override defaults depending on what is being explored.
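A per-URL byte cap of this kind can be sketched in a few lines. The following is a minimal illustration, not Google’s actual implementation; the function name, chunk size, and limit handling are all assumptions:

```python
import io

MAX_BYTES = 15 * 1024 * 1024  # hypothetical per-URL cap, mirroring the 15 MB default discussed


def fetch_truncated(stream, limit=MAX_BYTES, chunk_size=64 * 1024):
    """Read at most `limit` bytes from a response-like stream, ignoring the rest."""
    buf = bytearray()
    while len(buf) < limit:
        chunk = stream.read(min(chunk_size, limit - len(buf)))
        if not chunk:  # stream ended before hitting the cap
            break
        buf.extend(chunk)
    return bytes(buf)


# Simulate a 20 MB response body; only the first 15 MB is kept.
body = io.BytesIO(b"x" * (20 * 1024 * 1024))
page = fetch_truncated(body)
print(len(page))
```

The point of such a cap is that anything beyond the limit is simply never seen by the crawler, which is why content placed very deep in an enormous HTML document can go unindexed.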

The question of structured data

One of the most interesting moments came when Illyes brought up the topic of structured data and page bloat. He traced this back to a statement by Google co-founder Sergey Brin, who said early in Google’s history that machines should be able to understand everything they need from text alone.

Illyes noted that structured data exists for machines, not users, and that adding the full range of structured data types supported by Google to a page can add weight that visitors never see. He presented it as a tension rather than offering a clear answer as to whether it is a problem.

Is this still important?

Splitt said yes. He acknowledged that his home Internet connection is fast enough that page weight doesn’t matter in his daily experience. But he said the situation changes when traveling to areas with slower connections, and noted that metered satellite internet has made him rethink how much data websites transfer.

He suggested that growth in page sizes may have outpaced improvements in median mobile connection speeds, although he said he would need to verify this against real data.

Illyes referenced previous studies suggesting that faster websites tend to have better retention and conversion rates, although the episode did not cite specific research.

Looking to the future

Splitt said he plans to cover specific techniques for reducing page size in an upcoming episode.

Most pages are still unlikely to reach these crawl limits, with the Web Almanac reporting a median mobile homepage size of 2,362 KB. But the broader trend of increasing page weight affects both performance and accessibility for users on slower or metered connections.
