
Need a little help from a Programmer

Discussion in 'Black Hat SEO' started by acotut, Dec 12, 2011.

  1. acotut

    acotut Elite Member

    Joined:
    Dec 1, 2010
    Messages:
    2,294
    Likes Received:
    1,040
    Gender:
    Male
    Home Page:
    Hey guys, my new method of making money involves yellowpages and a lot of repetitive tasks which eat up most of my time, plus it's boring.

    Currently I'm trying to create a bot in AutoIt to do my job, but I hit a big wall.
    Before I start anything else, I need to scrape URLs of yellowpages.com listings and then do my thing.

    The problem is, I don't know how to create an advanced bot, and I don't know any programming languages... that's why I use AutoIt.
    So, if someone can help me out with a script or a little tool that can scrape URLs, I would be very grateful.
    Let's say I search for Restaurant in New Jersey. When I hit enter there will be lots of results, and all I need is the link of each business out there.

    Thanks for taking the time to read this, now I hope you'll take the time to help me too :D
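[Editor's note: a minimal sketch of the link-scraping step being asked for, using only Python's standard library. The "business-name" class and the sample markup are hypothetical stand-ins; the real yellowpages.com markup may differ and should be checked in a browser first.]

```python
from html.parser import HTMLParser

class ListingLinkParser(HTMLParser):
    """Collect the href of every anchor tag marked as a listing link."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # "business-name" is an assumed marker class for a listing link.
        if tag == "a" and "business-name" in attrs.get("class", ""):
            self.links.append(attrs.get("href"))

# Stand-in snippet of a search-results page, for demonstration only.
sample = """
<div class="result"><a class="business-name" href="/nj/rest-1">Rest 1</a></div>
<div class="result"><a class="business-name" href="/nj/rest-2">Rest 2</a></div>
"""

parser = ListingLinkParser()
parser.feed(sample)
print(parser.links)  # relative URLs of each business listing
```

Feeding the real page source (fetched however you like) through the same parser would give you the list of business URLs to work from.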
     
  2. Jrim_Software

    Jrim_Software Power Member

    Joined:
    Aug 1, 2011
    Messages:
    772
    Likes Received:
    179
    Home Page:
    I would use C# (or VB.NET) and the Microsoft mshtml library to grab the DOM of a web browser and pick it apart.

    If you want somebody to code it for you, the Hire a Freelancer section is that way ----> :p
     
  3. acotut

    acotut Elite Member

    Joined:
    Dec 1, 2010
    Messages:
    2,294
    Likes Received:
    1,040
    Gender:
    Male
    Home Page:

    I think what I'm asking for is very simple, and wouldn't take more than 5 minutes to create...
     
  4. Jrim_Software

    Jrim_Software Power Member

    Joined:
    Aug 1, 2011
    Messages:
    772
    Likes Received:
    179
    Home Page:
    Right, but I just told you exactly how to do it!

    Do you use linux? Did you know linux can get all of the URLs on a page with a simple command? http://tips.webdesign10.com/scrape-web-pages-gnu-linux-shell

    Edit: MORE STUFF! Scrape URLs using PHP:

    http://www.merchantos.com/2007/08/scraping-links-with-php/

    You can set up PHP on even a Windows machine, but if you have a website you could just make a script to scrape the links. I'm sure if you google around for 'URL Scraper' you could find something like what you need.

    Edit2: I've done it now. I have given you my 5 minutes as requested, and found an AUTOIT library to scrape HTML and manipulate the individual variables. Enjoy ;) http://lmgtfy.com/?q=autoit+html+dom
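[Editor's note: the linked shell and PHP tricks both boil down to "download the page, then grep for URLs". The same idea in a few lines of Python, run here over an inline sample instead of a live fetch:]

```python
import re

# Stand-in page source; in practice this would be the downloaded HTML.
html = '<a href="http://example.com/a">A</a> <a href="http://example.com/b">B</a>'

# Pull out every absolute URL that appears as an href attribute.
urls = re.findall(r'href="(http[^"]+)"', html)
print(urls)
```

A regex like this is fragile against unusual markup, which is why the DOM-based approaches mentioned above are the more robust option.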
     
    • Thanks Thanks x 1
  5. Cnotey

    Cnotey Power Member

    Joined:
    Jun 25, 2010
    Messages:
    707
    Likes Received:
    912
    Location:
    Seattle
    Home Page:
    Why not just use scrapebox for this?
     
  6. Busrunner

    Busrunner Junior Member

    Joined:
    Nov 26, 2011
    Messages:
    130
    Likes Received:
    27
    Don't forget to use proxies if you need a lot of pages. If they have some firewall set up, they can block you.
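[Editor's note: a sketch of routing requests through a proxy with Python's standard library. The proxy address is a hypothetical placeholder; rotating through a list of such proxies between requests keeps any single IP's request count low.]

```python
import urllib.request

# Hypothetical proxy address -- substitute a working proxy of your own.
proxy = urllib.request.ProxyHandler({"http": "http://127.0.0.1:8080"})
opener = urllib.request.build_opener(proxy)

# Fetches made through this opener go via the proxy, e.g.:
# opener.open("http://www.yellowpages.com/...")
print(proxy.proxies)
```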