Google Core Update, Crawl Limits and Gemini Traffic Data – SEO Pulse


Welcome to this week’s Pulse: updates to how Google ranks content, how its crawlers handle page sizes, and where AI referral traffic is going. Here’s what matters for you and your work.

Google rolls out March 2026 core update

Google began rolling out the March core update this week. This is the first general update of the year.

Highlights: The rollout may take up to two weeks. Google describes it as a regular update designed to surface more relevant and satisfying content from all types of sites. It landed two days after the March spam update finished rolling out in under 20 hours.

Why it matters

The most recent general update was the December core update, which finished rolling out on December 29, a gap of three months. The February 2026 update affected only Discover, so Search rankings haven’t been recalibrated since late December.

Ranking changes could continue into early April. Google recommends waiting at least a full week after the rollout completes before assessing Search Console performance. Compare against a reference period that ends before March 27.
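
If you want to script that comparison, the sketch below averages daily clicks and impressions across two Search Console date exports. It’s a minimal example: the file names are hypothetical, and the column names (Clicks, Impressions) match a standard Performance report CSV export but may differ in your locale.

```python
import pandas as pd

# Hypothetical file names: two Search Console "Dates" CSV exports,
# one for a pre-update baseline, one for the post-rollout window.
baseline = pd.read_csv("gsc_before_march_27.csv")
after = pd.read_csv("gsc_after_rollout.csv")

def daily_average(df, label):
    """Average daily clicks and impressions for one export."""
    avg = df[["Clicks", "Impressions"]].mean()
    print(f"{label}: {avg['Clicks']:.0f} clicks/day, "
          f"{avg['Impressions']:.0f} impressions/day")
    return avg

before = daily_average(baseline, "Before March 27")
post = daily_average(after, "After rollout")

# Percentage change between the two periods.
change = (post - before) / before * 100
print(f"Clicks {change['Clicks']:+.1f}%, impressions {change['Impressions']:+.1f}%")
```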

What SEO Professionals Say

John Mueller, a member of the Google Search Relations team, wrote on Bluesky when asked if the two updates overlap:

One is about spam, the other is not. If, with some experience, you’re unsure whether your site is spam, unfortunately it probably is.

Mueller later explained that core updates do not follow a single deployment mechanism. Different teams and systems contribute to changes, and these components may require step-by-step deployments rather than a single release. This is why deployments take weeks and why ranking volatility often appears in waves rather than all at once.

Roger Montti, writing for Search Engine Journal, noted that the spam update’s timing may not have been a coincidence: cleaning up spam fits logically into the broader quality reassessment a core update performs.

Read our full coverage: Google begins rolling out March 2026 core update

Read Roger Montti’s coverage: Google explains why major updates can be rolled out in stages

Illyes explains Googlebot’s crawling architecture and byte limits

Google’s Gary Illyes, an analyst on the search team, published a blog post explaining how Googlebot fits into Google’s broader crawling systems. The post adds new technical detail to the 2 MB crawl limit Google disclosed earlier this year.

Highlights: Illyes described Googlebot as one of the clients of a centralized crawling platform. Google Shopping, AdSense, and other products all route requests through the same system under different crawler names. HTTP request headers count towards the 2MB limit. External resources like CSS and JavaScript have their own separate byte counters.

Why it matters

When Googlebot hits the 2 MB mark, it does not reject the page: it stops fetching and passes the truncated content on for indexing as if it were the complete file. Anything beyond 2 MB is simply never seen. That matters for pages with large inline base64 images, heavy inline CSS or JavaScript, or oversized navigation menus.
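
As a rough sanity check, a script along these lines flags pages whose raw HTML approaches the cap. The 2 MB figure and the detail that headers count toward it come from Illyes’ post as reported above; everything else (the function, the example URL) is illustrative.

```python
import requests

FETCH_LIMIT = 2 * 1024 * 1024  # Googlebot's 2 MB cap for Search

def check_page_size(url):
    resp = requests.get(url, timeout=30)
    body = len(resp.content)  # raw HTML only; external CSS/JS have separate counters
    # Per the article, HTTP headers count toward the budget too;
    # approximate their size from the request that was sent.
    headers = sum(len(k) + len(v) + 4 for k, v in resp.request.headers.items())
    total = body + headers
    print(f"{url}: {total / 1024:.0f} KB ({total / FETCH_LIMIT:.0%} of the limit)")
    if total > FETCH_LIMIT:
        print("  WARNING: content beyond 2 MB would be truncated before indexing")

check_page_size("https://example.com/")
```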

The centralized-platform details also explain why different Google crawlers behave differently in server logs: each client sets its own configuration, including byte limits. Googlebot’s 2 MB is a Search-specific override of the platform’s 15 MB default.
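
The pattern Illyes describes looks roughly like the sketch below: a shared platform default that each client can override. This is purely illustrative and in no way Google’s actual code.

```python
from dataclasses import dataclass

PLATFORM_DEFAULT = 15 * 1024 * 1024  # the crawling platform's 15 MB default

@dataclass
class CrawlClient:
    user_agent: str
    fetch_limit: int = PLATFORM_DEFAULT  # inherited unless the client overrides it

googlebot = CrawlClient("Googlebot", fetch_limit=2 * 1024 * 1024)  # Search override
storebot = CrawlClient("Storebot-Google")  # keeps the platform default

def fetch(client: CrawlClient, raw: bytes) -> bytes:
    # Retrieval stops at the client's limit; the truncated content
    # is passed on for indexing as if it were the complete file.
    return raw[: client.fetch_limit]
```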

Google has now covered these limits three times in two months: in documentation updates, a podcast episode, and this blog post. Illyes noted that the 2 MB limit is not permanent and may change as the web evolves.

What SEO Professionals Say

Cyrus Shepard, founder of Zyppy SEO, wrote on LinkedIn:

That said, as SEOs we often deal with edge cases. If you notice that some content on VERY LARGE PAGES is not being indexed, you will probably want to check your page size.

Read our full coverage: Google explains Googlebot’s byte limits and crawling architecture

Illyes and Splitt from Google: pages are getting bigger and it still matters

Gary Illyes and Martin Splitt, Developer Advocate at Google, discussed page weight growth and crawling in a recent Search Off the Record podcast episode.

Highlights: Web page sizes have nearly tripled over the last decade. The 15 MB default applies across Google’s broader crawling platform, with individual clients like Googlebot for Search overriding it to 2 MB. Illyes asked whether the structured data Google asks sites to add is itself contributing to page bloat.

Why it matters

The Web Almanac 2025 reports a median mobile home page weight of 2,362 KB. Pages are clearly getting larger, and a median like that can no longer be safely assumed to sit below Googlebot’s 2 MB fetch limit. Illyes’ question about structured data contributing to bloat is also worth watching: Google encourages sites to add schema markup for rich results, and that markup adds weight to every page.

Splitt said he plans to cover specific techniques for reducing page size in an upcoming episode. In the meantime, pages with heavy inline content should verify that their critical elements appear within the first 2 MB of the response.
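
Until that episode airs, a rough breakdown like the sketch below shows where a page’s HTML budget goes. It assumes BeautifulSoup; the categories mirror the culprits named above (inline scripts and styles, base64 images) plus JSON-LD, and none of it is an official measurement method.

```python
import requests
from bs4 import BeautifulSoup

def inline_weight_report(url):
    """Rough breakdown of what consumes a page's HTML byte budget."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")

    # Scripts without a src attribute are inline (this also catches
    # JSON-LD, which is broken out separately below).
    inline_js = sum(len(s.get_text()) for s in soup.find_all("script", src=False))
    inline_css = sum(len(s.get_text()) for s in soup.find_all("style"))
    jsonld = sum(len(s.get_text())
                 for s in soup.find_all("script", type="application/ld+json"))
    # Base64 data URIs in img tags inflate HTML very quickly.
    base64_imgs = sum(len(img.get("src", "")) for img in soup.find_all("img")
                      if img.get("src", "").startswith("data:"))

    total = len(html.encode("utf-8"))
    print(f"Total HTML:      {total / 1024:7.0f} KB")
    print(f"  inline scripts {inline_js / 1024:7.0f} KB")
    print(f"  inline styles  {inline_css / 1024:7.0f} KB")
    print(f"  JSON-LD schema {jsonld / 1024:7.0f} KB")
    print(f"  base64 images  {base64_imgs / 1024:7.0f} KB")

inline_weight_report("https://example.com/")
```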

Read our full coverage: Google: pages are getting bigger and it still matters

Gemini Referral Traffic More Than Doubles and Surpasses Perplexity

Google Gemini more than doubled its referral traffic to websites between November 2025 and January 2026. The data comes from SE Ranking’s analysis of more than 101,000 sites with Google Analytics installed.

Highlights: SE Ranking measured a 115% increase over the two months, with the jump starting around the time Google rolled out Gemini 3. In January, Gemini sent 29% more referral traffic than Perplexity globally and 41% more in the US. ChatGPT still generates around 80% of all AI referral traffic. In the interest of transparency: SE Ranking sells AI visibility tracking tools.

Why it matters

As of August 2025, Perplexity was sending approximately 2.9 times more referral traffic than Gemini. Gemini’s surge in December and January reversed that position by January 2026. ChatGPT’s lead over Gemini also narrowed, from around 22x in October to around 8x in January.

All AI platforms combined still account for only about 0.24% of global internet traffic, up from 0.15% in 2025. That is measurable growth, but still a small share next to organic search. Gemini’s two months of growth coincide with a known product launch, but it is too early to call it a lasting trend.

Gemini is now worth looking at alongside ChatGPT and Perplexity in your referral reports.
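
If you work from raw server logs rather than GA4’s channel groups, a simple referrer classifier covers the basics. The hostname list below reflects commonly observed AI referrer domains and is an assumption; check it against your own logs, since these change as products evolve.

```python
from collections import Counter
from urllib.parse import urlparse

# Assumed AI referrer hostnames; verify against your own log data.
AI_SOURCES = {
    "gemini.google.com": "Gemini",
    "chatgpt.com": "ChatGPT",
    "chat.openai.com": "ChatGPT",
    "perplexity.ai": "Perplexity",
    "www.perplexity.ai": "Perplexity",
}

def classify_referrer(referrer):
    """Map a raw Referer header to an AI platform name, or None."""
    host = urlparse(referrer).netloc.lower()
    return AI_SOURCES.get(host)

# Example: tally AI referrals from a list of Referer values.
referrers = [
    "https://gemini.google.com/",
    "https://www.perplexity.ai/search?q=...",
    "https://www.google.com/",  # organic search, not AI
]
counts = Counter(filter(None, map(classify_referrer, referrers)))
print(counts)  # Counter({'Gemini': 1, 'Perplexity': 1})
```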

Read our full coverage: Google Gemini sends more traffic to sites than Perplexity: report


Theme of the week: Google explains its own systems

In three of the four articles this week, Google explains how its systems work. Illyes published a blog post detailing the architecture of Googlebot. That same week, the Search Off the Record podcast covered page weight and crawl thresholds. Mueller explained why major updates are rolled out in waves rather than all at once. Each fills a gap that the documentation alone has left open.

The Gemini traffic data offers a counterpoint. Google is open about how its crawlers and ranking systems work, yet the traffic flowing through its own AI services, now growing fast enough to show up in third-party data, is the one part it doesn’t explain.



