Google Indexing API + Python = maximum 200 URLs/day: how to increase the limit?

CryptoCapitalista
Regular Member · Joined: Dec 3, 2022 · Messages: 321 · Reaction score: 128
Hello marketeers,

I have an ecommerce website with 4,000+ products, each with a long product description generated with GPT-4.

However, it is hard to get the product pages indexed.

I already tried the Google Indexing API with Python, but only 200 URLs a day are allowed :(. Is there any way to work around this?
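For context, my setup is basically the standard Indexing API publish loop, something like the sketch below (the key file and URL list file names are just placeholders, and 200/day is the default publish quota per project):

```python
# Minimal sketch: push product URLs through the Indexing API until the
# daily publish quota (200 requests per project by default) runs out.
# Assumes a service-account key "service_account.json" whose service
# account has been added as an owner of the property in Search Console.
from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.errors import HttpError

SCOPES = ["https://www.googleapis.com/auth/indexing"]

creds = service_account.Credentials.from_service_account_file(
    "service_account.json", scopes=SCOPES)
service = build("indexing", "v3", credentials=creds)

urls = [line.strip() for line in open("product_urls.txt") if line.strip()]

for url in urls:
    try:
        service.urlNotifications().publish(
            body={"url": url, "type": "URL_UPDATED"}).execute()
        print("submitted:", url)
    except HttpError as e:
        if e.resp.status == 429:  # daily quota exhausted, stop for today
            print("quota hit at:", url)
            break
        raise
```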

I have been struggling with this issue for months, maybe a year, and I am getting desperate.

All advice and help is highly appreciated!
 
In my personal experience, it's not good to abuse the Google Indexing API; let Google do its thing.
 
You ideally want to use Google's API to refresh indexing of your main categories and let Googlebot follow on to your products via proper internal linking. Trying to force it to index every individual product page via the API will probably result in a lot of short-lived indexing. Strengthen your main 'hubs' through good internal linking and even some external backlinks, and the crawls are more likely to come.
 
You can even go further: get a few hundred API accounts, add the website to one account, do all your stuff (sitemaps, Indexing API, etc.), remove the website, add it to the next account, and repeat.
The accounts are a bit painful to create automatically, because they need OAuth rather than just an API key, but it's possible.
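Roughly, the rotation looks something like this sketch (the key file names and the 200-per-key limit are just for illustration; each service account still has to be verified as an owner of the property before its submissions count):

```python
# Hedged sketch of the multi-account idea: rotate through several
# service-account key files (one per Google Cloud project), spending each
# project's daily publish quota before moving on to the next key.
from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.errors import HttpError

SCOPES = ["https://www.googleapis.com/auth/indexing"]
KEY_FILES = ["account_01.json", "account_02.json", "account_03.json"]
DAILY_LIMIT = 200  # default publish quota per project

def publish_batch(urls):
    """Submit URLs, switching key files whenever a quota is exhausted."""
    remaining = list(urls)
    for key_file in KEY_FILES:
        creds = service_account.Credentials.from_service_account_file(
            key_file, scopes=SCOPES)
        service = build("indexing", "v3", credentials=creds)
        sent = 0
        while remaining and sent < DAILY_LIMIT:
            url = remaining[0]
            try:
                service.urlNotifications().publish(
                    body={"url": url, "type": "URL_UPDATED"}).execute()
                remaining.pop(0)
                sent += 1
            except HttpError as e:
                if e.resp.status == 429:  # this key is spent, try the next
                    break
                raise
        if not remaining:
            break
    return remaining  # whatever is left over for the next day

urls = [line.strip() for line in open("product_urls.txt") if line.strip()]
leftover = publish_batch(urls)
print(len(leftover), "URLs still queued")
```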
 
There was a Russian Python script that lets you enter multiple Google Indexing API keys and request indexing for more than 200 URLs a day.
I forget where the script was found. I will try searching for it and reply with the link here.
 