
[GET] Semi-Automatic Google URL/Blog scraping Script

Discussion in 'Black Hat SEO Tools' started by StiflersMom, May 27, 2011.

    Hey,

First off ... if you've got Scrapebox or anything like that, this is totally useless for you.
It's just a stupid simple script to extract URLs from Google, meant to speed up the workflow for anyone who's brand new to blog commenting etc. and still does everything manually.

    Anyways ... that was pretty much my first way of building blog/guestbook backlinks in combination with Roboform :D

    I used it with WAMP so I wouldn't always have to upload the serp.txt to my hosting.

    Usage:
0) rename extract.txt to extract.php
1) Adjust your Google search settings: "turn off Google Instant" + "show 100 results per page"
2) type your keyphrase/footprint into Google
3) view source code -> copy everything -> paste into a plain text file
4) click through to the second SERP page on Google and repeat
4.1) save the text file as serp.txt
5) visit extract.php and wait 1-2 seconds for it to output the clean URLs
6) do whatever you want with your 1,000 (or 10,000 or more) clean URLs, or just start visiting and posting with Roboform or typing your stuff manually
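The attached extract.php itself isn't shown in the post, so here's a rough Python sketch of what the steps above imply it does. The href regex and the "skip google.* links" filter are assumptions on my part, not the author's actual 700-byte PHP:

```python
import os
import re

def extract_urls(html: str) -> list[str]:
    # Grab every absolute URL found inside an href attribute of the
    # pasted SERP source. (Assumption: result links are plain
    # href="http://..." attributes, as in 2011-era Google markup.)
    urls = re.findall(r'href="(https?://[^"]+)"', html)
    seen, clean = set(), []
    for url in urls:
        # Skip Google's own navigation/cache links and dedupe.
        if "google." in url or url in seen:
            continue
        seen.add(url)
        clean.append(url)
    return clean

if __name__ == "__main__" and os.path.exists("serp.txt"):
    # serp.txt = the pasted "view source" text from steps 3-4.1
    with open("serp.txt", encoding="utf-8", errors="replace") as f:
        for url in extract_urls(f.read()):
            print(url)
```

With two 100-result SERP pages pasted into serp.txt, this spits out up to ~200 deduped URLs per keyphrase.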

Note: the output list is hyperlinked, uses a meta refresh to hide the referrer, and the links open in a new window/tab
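One common way to pull off that referrer-hiding trick (an assumption — the attached script may do it differently) is to route each click through a tiny redirect page whose only job is a meta refresh, so the target blog never sees where you came from. A Python sketch, where go.php is a hypothetical redirect endpoint name:

```python
from urllib.parse import quote

def link_row(url: str) -> str:
    # One line of the output list: the click goes to the redirect page,
    # not the target directly; target="_blank" opens a new window/tab.
    return f'<a href="go.php?u={quote(url, safe="")}" target="_blank">{url}</a><br>'

def redirect_page(url: str) -> str:
    # What go.php would serve for ?u=<url>: an instant (0-second)
    # meta refresh, which drops the HTTP referrer on the hop.
    return ('<html><head>'
            f'<meta http-equiv="refresh" content="0;url={url}">'
            '</head><body></body></html>')
```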


I really don't know if it's of any use to anyone, but it's a simple way to extract like 1k URLs with maybe a minute of work :D And no magic to be expected here ... geez, it's 700 BYTES of code :D

Sidenote to any fellow coders: don't tell me about the poor formatting of my code *lol* really :D
     

    Attached Files:

    Last edited: May 27, 2011