RealDaddy
- Jun 30, 2018
Source - Search Engine Land
Google has updated its help document on Googlebot to specify that Googlebot will crawl up to the first 15MB of the page and then stop. So if you want to ensure that Google ranks your page appropriately, make sure Googlebot can crawl and index that part of the page within the first 15MB.
What is new. In the Googlebot help document (https://developers.google.com/search/docs/advanced/crawling/googlebot), Google added a section that reads:
Googlebot can crawl the first 15MB of content in an HTML file or supported text-based file (https://support.google.com/webmasters/answer/35287). After the first 15MB of the file, Googlebot stops crawling and only considers the first 15MB of content for indexing.
- In general, you probably want to keep your pages light for both users and search engine crawlers, but here Google is being explicit about how much of your page Googlebot will consume.
- A good way to test this is to use the URL Inspection tool in Google Search Console and see what parts of the page Google renders and sees within the debugging tool.
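Alongside the URL Inspection tool, a quick programmatic sanity check is to compare a page's raw HTML payload against the documented 15MB cutoff. Below is a minimal sketch; the function name and the sample payloads are illustrative, not part of any Google API.

```python
# Sanity check: does a page's raw HTML fit within Googlebot's
# documented 15MB crawl limit? (Payloads below are made-up examples.)

GOOGLEBOT_HTML_LIMIT = 15 * 1024 * 1024  # 15 MB in bytes

def within_googlebot_limit(html: bytes) -> bool:
    """Return True if the raw HTML payload is small enough to be
    crawled in full; content past the limit is ignored for indexing."""
    return len(html) <= GOOGLEBOT_HTML_LIMIT

# A small page easily fits; a 20MB blob would be truncated at 15MB.
small_page = b"<html><body>hello</body></html>"
huge_page = b"x" * (20 * 1024 * 1024)

print(within_googlebot_limit(small_page))  # True
print(within_googlebot_limit(huge_page))   # False
```

In practice you would run this against the bytes returned by fetching your own URL; note the limit applies to the HTML file itself, so externally referenced resources such as images and scripts are fetched separately.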