Why has Google made indexing hard for webmasters?

seomaster5
What does Google achieve by this? They should be adding as many results as possible to give users options. Instead, they are making it hard for webmasters to even get their articles indexed. Ranking isn't even in the picture.
 
That's not entirely true... junk copied PAA (People Also Ask) sites seem to index quite well.

My guess is that Google is saving resources, but I might be wrong.
Those two statements contradict each other. Or do you mean that Google wants to save resources by avoiding junk sites but has failed to do so?
 
Those two statements contradict each other. Or do you mean that Google wants to save resources by avoiding junk sites but has failed to do so?
Kinda yeah, that's what I think... it failed miserably.
 
A better title for this topic would be "Why have AI webmasters made it harder for Google to index". With all the AI-based content coming up, I am sure Googlebot's indexing requirements have multiplied manifold, and Google is trying to catch up with an impossible situation.
 
Indexing issues were largely fixed with the last core update. Everything goes quickly now.
 
Trying to cut costs and their carbon footprint.
 
Just apply for Google News even if you don't have a news website and see the indexing benefits almost instantaneously. However, if you really want those instantaneous results, you need to submit a news sitemap in GSC.
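
For anyone who hasn't built one before, here's roughly what that looks like. Below is a minimal sketch that generates a Google News sitemap with Python's standard library; the site URL, publication name, and article data are all placeholders, not anything from this thread:

```python
# Minimal Google News sitemap generator (sketch).
# "Example Site", example.com and the article data are placeholders.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
NEWS_NS = "http://www.google.com/schemas/sitemap-news/0.9"

ET.register_namespace("", SITEMAP_NS)
ET.register_namespace("news", NEWS_NS)

def build_news_sitemap(articles):
    """articles: iterable of dicts with 'loc', 'title' and 'published' keys."""
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for a in articles:
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = a["loc"]
        news = ET.SubElement(url, f"{{{NEWS_NS}}}news")
        pub = ET.SubElement(news, f"{{{NEWS_NS}}}publication")
        ET.SubElement(pub, f"{{{NEWS_NS}}}name").text = "Example Site"  # placeholder
        ET.SubElement(pub, f"{{{NEWS_NS}}}language").text = "en"
        ET.SubElement(news, f"{{{NEWS_NS}}}publication_date").text = a["published"]
        ET.SubElement(news, f"{{{NEWS_NS}}}title").text = a["title"]
    # encoding="UTF-8" adds the XML declaration the sitemap spec expects
    return ET.tostring(urlset, encoding="UTF-8").decode("utf-8")

if __name__ == "__main__":
    print(build_news_sitemap([{
        "loc": "https://example.com/some-article",  # placeholder URL
        "title": "Some Article",
        "published": datetime.now(timezone.utc).isoformat(),
    }]))
```

You'd then upload the resulting file somewhere on your site and submit it under the Sitemaps report in GSC.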
 
They should be adding as many results as possible to give users options.
They actually shouldn't. They already have millions of results for several keywords. Adding new sites increases their work, which in turn increases costs.


making it hard for webmasters to even get their articles indexed
They are making it harder for webmasters to get articles on their new site indexed*

Large and existing websites have no problem with indexation. This thread was created some hours ago and it's already indexed on Google.

Pretty sure it works on a proof-of-work basis. It seems you need a good number of pages before Googlebot takes crawling you seriously.
 
They actually shouldn't. They already have millions of results for several keywords. Adding new sites increases their work, which in turn increases costs.



They are making it harder for webmasters to get articles on their new site indexed*

Large and existing websites have no problem with indexation. This thread was created some hours ago and it's already indexed on Google.

Pretty sure it works on a proof-of-work basis. It seems you need a good number of pages before Googlebot takes crawling you seriously.

That's actually not true at all, mate.

A site I created just last week can get articles indexed within hours (fresh domain) with a GSC index request.

A site I've been running for 2.5 years has struggled to get articles indexed since last August. Hence I'm using some indexing services from here.
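
For context on what an index request involves: the manual route is the Request Indexing button in GSC's URL Inspection tool. The closest programmatic equivalent is Google's Indexing API, which Google officially supports only for job-posting and livestream pages, so treat the sketch below as illustrative; the service-account key file path and the article URL are placeholders:

```python
# Sketch: one URL_UPDATED ping to Google's Indexing API.
# Requires the google-auth package and a service account verified
# as an owner of the property in Search Console.
# service-account.json and the article URL below are placeholders.
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/indexing"]
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # placeholder key file
)
session = AuthorizedSession(credentials)

response = session.post(ENDPOINT, json={
    "url": "https://example.com/new-article",  # placeholder URL
    "type": "URL_UPDATED",  # or URL_DELETED for removals
})
print(response.status_code, response.json())
```

Third-party indexing services presumably automate something along these lines at scale, though their exact methods vary.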
 
That's actually not true at all, mate.

A site I created just last week can get articles indexed within hours (fresh domain) with a GSC index request.

A site I've been running for 2.5 years has struggled to get articles indexed since last August. Hence I'm using some indexing services from here.
Your old website likely got affected by a penalty.

New sites do come with quite a few variables. I guess Google indexes some small enough sites quickly because they require a very small crawl budget. Not sure why some new sites get affected and don't get crawled even after requests, though.
 
Your old website likely got affected by a penalty.

New sites do come with quite a few variables. I guess Google indexes some small enough sites quickly because they require a very small crawl budget. Not sure why some new sites get affected and don't get crawled even after requests, though.

The site isn't affected by any penalty; it's functioning as normal, with high rankings for most of its keywords.
 