Beagle Scraper - a low-resource e-commerce category scraper

Chris.Roark

This is my little creation; I started learning to code only 3 months ago.

The goal is to develop Beagle Scraper into the largest plug-and-play open-source e-commerce scraper. At the moment, it supports only Amazon, BestBuy, and HomeDepot, but I will add new websites weekly.

Give it a go and let me know what you think of my pet project.

Here's the source code:
https://github.com/ChrisRoark/beagle_scraper

If you have a basic understanding of Python, you can run Beagle Scraper within minutes. But if you need help, there's also a tutorial here:
https://www.bestproxyproviders.com/...e-e-commerce-websites-and-modify-the-scraper/


PS: I don't know if this is the right section; please let me know if I need to move it.
 
So the main goal is to search across all the e-commerce websites and get the lowest possible price for the same product...
And you should add Walmart too, since Walmart and Amazon are the big ones in the US...
 
The goal is for anyone who wants to use it to do whatever they see fit: a price comparison website, a data mining or analytics project, etc.

Regarding Walmart, I will look into adding it in the future. walmart.com has a weird JavaScript implementation, but hopefully I will get around it.
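For anyone curious, the usual workaround for JavaScript-rendered category pages is a headless browser instead of plain HTTP requests. A rough sketch with Selenium; the URL and CSS selector are placeholders for illustration, not Walmart's real markup or anything in the repo:

```python
# Sketch: render a JS-heavy category page with headless Chrome via Selenium.
# The URL and selector below are placeholders, not Walmart's actual markup.
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.common.by import By

options = Options()
options.add_argument("--headless")
driver = webdriver.Chrome(options=options)

driver.get("https://www.walmart.com/example-category-page")
driver.implicitly_wait(10)  # give client-side rendering time to finish

for tile in driver.find_elements(By.CSS_SELECTOR, "div.product-tile"):
    print(tile.text)

driver.quit()
```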

And for other e-commerce websites, anyone can create their own scraper function; it just takes modifying one of the current functions.
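To give you an idea, a site-specific function is mostly just the HTML selectors. A minimal sketch with requests and BeautifulSoup; the function name and selectors are made up for illustration, not the project's actual code:

```python
# Hypothetical sketch of a site-specific category scraper; the selectors
# are placeholders that would need to match the target site's real markup.
import requests
from bs4 import BeautifulSoup

def example_store_scraper(category_url):
    response = requests.get(category_url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    products = []
    for tile in soup.select("li.product"):      # one listing per tile
        title = tile.select_one("h2.title")
        price = tile.select_one("span.price")
        if title and price:
            products.append({"title": title.get_text(strip=True),
                             "price": price.get_text(strip=True)})
    return products
```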

If there are other websites that you would like to scrape, leave them here and I will add them.
 
Interesting, I've been looking for something similar. May give it a try once I have the time.

I've been interested in learning to code bots; I know old programming languages like Pascal and Visual Basic.

Might start learning sometime soon when I have the time.
 
If you want to scrape Amazon or BestBuy, the scraper is ready. All you have to do is provide some product category links, and you'll get data you can later use for marketing purposes.
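Something along these lines is all the input it needs; the file name and loop here are illustrative, not the repo's exact interface:

```python
# Illustrative only: the input is just plain category URLs, one per line;
# the file name and loop are hypothetical, not the repo's exact interface.
with open("category_links.txt") as f:
    category_urls = [line.strip() for line in f if line.strip()]

for url in category_urls:
    print("would scrape:", url)  # hand each URL to the matching scraper
```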
 
Not sure how it works yet, but I would be using it to scrape products. If you could also remove duplicate products, that would be great, since there are always a few resellers selling the same product.
 
Try following the tutorial. You need to work with the terminal or command line, but I am looking at creating a GUI in the future.

I don't think Amazon has duplicate products, and Beagle Scraper scrapes straight from product category pages, where Amazon lists a product only once; it's on the product page itself that there are different vendors and prices.
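That said, if you do end up with duplicates from other sites, dropping them after the scrape is simple. A sketch, assuming each scraped item is a dict with a title and a price (those field names are my assumption, not the scraper's actual output format):

```python
# Sketch: drop duplicate listings after scraping, keeping the cheapest.
# Assumes each item is a dict with "title" and "price" keys (hypothetical).
def deduplicate(items):
    best = {}
    for item in items:
        key = item["title"].lower().strip()
        if key not in best or item["price"] < best[key]["price"]:
            best[key] = item
    return list(best.values())

items = [
    {"title": "Widget 3000", "price": 19.99},
    {"title": "widget 3000", "price": 17.49},  # same product, other reseller
]
print(deduplicate(items))  # keeps only the 17.49 listing
```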
 
I'm trying this scraper out for Amazon. I set it all up according to the tutorial, but I'm getting the error message "No scraper for domain: amazon" for each category, and then it finishes with zero results.

Any idea why?

I tried with both a VPN and rotating proxies.
 
Please paste a couple of the Amazon category URLs you want to scrape here so I can check.
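For context, that error message suggests the lookup that maps a URL's domain to a scraper function is failing for your URLs. A guess at what that kind of dispatch might look like; this is a sketch with hypothetical names, not the project's actual code:

```python
# Sketch of domain-based dispatch; hypothetical, not the repo's code.
# "No scraper for domain: ..." would come from the lookup missing its key.
from urllib.parse import urlparse

def amazon_scraper(url):                 # placeholder handler
    print("scraping", url)

SCRAPERS = {"amazon": amazon_scraper}

def dispatch(url):
    # "https://www.amazon.com/x" -> netloc "www.amazon.com" -> "amazon"
    domain = urlparse(url).netloc.split(".")[-2]
    scraper = SCRAPERS.get(domain)
    if scraper is None:
        print("No scraper for domain:", domain)
    else:
        scraper(url)

dispatch("https://www.amazon.com/example-category")
```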
 
Thank you, I would really appreciate it if you could fix it! I even tried with my own residential IP to see if it was a blocking issue, but it's not.

Also, I checked the ini files but didn't see any configuration for setting threads. Got any plans to add that? I'll be using Storm Proxies; they just give one gateway IP that I connect to for 40 different connections, so I think it would be rather slow with the current setup.
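For reference, this is the kind of threading I mean: several fetches in parallel, all going out through the one gateway IP. A sketch with ThreadPoolExecutor; the gateway address, URLs, and worker count are placeholders, not config that exists in the current ini files:

```python
# Sketch of the kind of threading support being asked about; the gateway
# address, URLs, and worker count are placeholders, not existing config.
from concurrent.futures import ThreadPoolExecutor
import requests

PROXIES = {"http": "http://GATEWAY_IP:PORT", "https": "http://GATEWAY_IP:PORT"}

def fetch(url):
    # Every worker goes out through the same gateway IP.
    return requests.get(url, proxies=PROXIES, timeout=15).status_code

urls = ["https://www.amazon.com/example-category"] * 5

with ThreadPoolExecutor(max_workers=40) as pool:
    for url, status in zip(urls, pool.map(fetch, urls)):
        print(url, status)
```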
 
Any ETA on the fix? Really eager to start using this! If it's going to take a long time, please say so and I'll find another way. By the way, thanks a lot for sharing this, very kind of you!
 
I even set up an Ubuntu Linux VM to check whether it was a Windows issue; I'm getting the same error.
 