
Twitter Mass E-Mail Grabbing Script

Discussion in 'BlackHat Lounge' started by dor@tehexploa, May 18, 2009.

  1. dor@tehexploa

    dor@tehexploa Registered Member

    Joined:
    Apr 25, 2009
    Messages:
    95
    Likes Received:
    25
    I saw a Twitter mass e-mail script mentioned in a thread tonight and decided I would try to find it / make my own. This is pretty much my first PHP script, so don't laugh at the superfluous code... I am a huge newb and I couldn't figure out how to increase the page # in a loop. Hope this helps somebody, because I read that there is one here on the board, but it is kept in a senior section.

    Code:
    
    <?php
    for($i=1;$i<=100;$i++)
    {
    $url = "";
    switch($i)
    {
    case 1:
        $url= "http://search.twitter.com/search?q=gmail.com+OR+hotmail.com++OR+%22email+me%22";
        break;
    case 2:
        $url= "http://search.twitter.com/search?max_id=XXXXXXXXXX&page=2&q=gmail.com+OR+hotmail.com++OR+%22email+me%22";
        break;
    case 3:
        $url= "http://search.twitter.com/search?max_id=XXXXXXXXXX&page=3&q=gmail.com+OR+hotmail.com++OR+%22email+me%22";
        break;
    
    // ......... cases 4 through 98 follow the same pattern ........ i kno i kno... i suck lol
    
    case 99:
        $url= "http://search.twitter.com/search?max_id=XXXXXXXXXX&page=99&q=gmail.com+OR+hotmail.com++OR+%22email+me%22";
        break;
    case 100:
        $url= "http://search.twitter.com/search?max_id=XXXXXXXXXX&page=100&q=gmail.com+OR+hotmail.com++OR+%22email+me%22";
        break;
    }
    
    $file = file_get_contents($url);
    $file = strip_tags($file);
     
    preg_match_all(
        "([a-z0-9!#$%&'*+/=?^_`{|}~-]+(?:\.[a-z0-9!#$%&'*+/=?^_`{|}~-]+)*@(?:[a-z0-9](?:[a-z0-9-]*[a-z0-9])?\.)+(?:[A-Z]{2}|com|org|net|gov|mil|biz|info|mobi|name|aero|jobs|museum)\b)siU",
        $file,
        $matches);
    
    ob_start();
    
    print_r($matches);
    
    $output = ob_get_clean();
    
    file_put_contents( 'emails2.txt', file_get_contents('emails2.txt') . $output );
    
    $file = array();
    $matches = array();
    $output = array();
    // end of for loop - back to top!!
    } 
    ?>
    
    Used http://www.fromzerotoseo.com/twitter-email-grabber/ as a framework.

    Directions:
    1. Perform this search in your own browser: http://search.twitter.com/search?q=gmail.com+OR+hotmail.com++OR+"email+me"
    2. Go to page 2 of the search results.
    3. Copy and paste the 10-digit ID (the max_id) to replace every XXXXXXXXXX in the switch above.
    (this can be done pretty easily with Notepad's replace function - or see the sketch after these directions for a way to grab it automatically)
    4. Upload this script and an empty 'emails2.txt' (or change the name) to your server.
    5. Browse to the script.
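
    If you would rather not copy the ID by hand, something like the untested sketch below should pull it out of the first results page - assuming the old search.twitter.com page links to page 2 with a max_id=... parameter, as the directions describe. Treat it as a starting point, not a drop-in.

    Code:
    
    <?php
    // Hypothetical helper: fetch page 1 of the search and grab the max_id from
    // the pagination link, instead of copying it from the browser (step 3).
    $query = "gmail.com+OR+hotmail.com++OR+%22email+me%22";
    $page1 = file_get_contents("http://search.twitter.com/search?q=" . $query);
    
    if (preg_match('/max_id=(\d+)/', $page1, $m)) {
        $id = $m[1];        // use this value in place of XXXXXXXXXX
        echo "Found max_id: " . $id . "\n";
    } else {
        echo "No max_id link found - grab it manually as in step 3.\n";
    }
    ?>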

    Takes about a minute to complete, but it definitely works nicely.

    any help on how to condense this would be greatly appreciated!
     
    • Thanks Thanks x 2
    Last edited: May 18, 2009
  2. crisis23

    crisis23 BANNED

    Joined:
    Jan 6, 2009
    Messages:
    253
    Likes Received:
    79
    Do you mean it grabbed the emails of the users, or just random emails?
     
  3. dor@tehexploa

    dor@tehexploa Registered Member

    Joined:
    Apr 25, 2009
    Messages:
    95
    Likes Received:
    25
    Emails of the users...

    It's basically banking on people who actually tweet something like, "hey, I'm dumb, e-mail me at A@B.com" lol

    Now that I think about it... is this the right section?
     
    Last edited: May 18, 2009
  4. dor@tehexploa

    dor@tehexploa Registered Member

    Joined:
    Apr 25, 2009
    Messages:
    95
    Likes Received:
    25
    :::SORRY FOR SO MANY POSTS BUT I FIXED THE CODE UP CONSIDERABLY:::

    Code:
    
    <?php
    
    for($i=1;$i<=XXX;$i++)            // CHANGE XXX TO NUMBER OF PAGES TO BE SCANNED
    {
    
    $id = "XXXXXXXXXX";              // CHANGE THIS LINE
    
    $url= "http://search.twitter.com/search?max_id=" . $id . "&page=" . $i . "&q=gmail.com+OR+hotmail.com++OR+%22email+me%22";
    
    
    $file = file_get_contents($url);
    $file = strip_tags($file);
     
    preg_match_all(
        "([a-z0-9!#$%&'*+/=?^_`{|}~-]+(?:\.[a-z0-9!#$%&'*+/=?^_`{|}~-]+)*@(?:[a-z0-9](?:[a-z0-9-]*[a-z0-9])?\.)+(?:[A-Z]{2}|com|org|net|gov|mil|biz|info|mobi|name|aero|jobs|museum)\b)siU",
        $file,
        $matches);
    
    ob_start();
    
    print_r($matches);
    
    $output = ob_get_clean();
    
    file_put_contents( 'emails.txt', file_get_contents('emails.txt') . $output );
    
    //$file = array();
    //$matches = array();
    //$output = array();
    
    // end of for loop - back to top!!
    
    } 
    
    ?>
    
    
    This is much nicer looking. :)
     
  5. ukescuba

    ukescuba Jr. VIP Premium Member

    Joined:
    Feb 24, 2008
    Messages:
    994
    Likes Received:
    634
    Occupation:
    Mobile Marketer & QR Code Junkie
    Location:
    San Antonio, TX
    Add yahoo.com and aol.com to your $url line to collect all the Yahoo and AOL emails too...

    hth

    Edit:

    Actually, use this URL structure instead... it removes the need for your Twitter ID and returns 100 results per page too :)

    Code:
    http://search.twitter.com/search?page=1&q=gmail.com+OR+hotmail.com+OR+yahoo.com+OR+aol.com&rpp=100
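
    If more providers need to be added later, the q parameter can also be built from a list instead of being edited by hand. A rough sketch (the parameter names q, rpp and page are the ones in the URL above; everything else is just illustration):

    Code:
    
    <?php
    // build the same search URL from a list of email domains
    $domains = array("gmail.com", "hotmail.com", "yahoo.com", "aol.com");
    $query   = implode("+OR+", $domains);   // "+" stands for a space in the query
    
    $page = 1;
    $url  = "http://search.twitter.com/search?page=" . $page . "&q=" . $query . "&rpp=100";
    
    echo $url . "\n";
    ?>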
     
    • Thanks Thanks x 1
    Last edited: May 18, 2009
  6. ukescuba

    ukescuba Jr. VIP Premium Member

    Joined:
    Feb 24, 2008
    Messages:
    994
    Likes Received:
    634
    Occupation:
    Mobile Marketer & QR Code Junkie
    Location:
    San Antonio, TX
    lol OK so I merged my code with dor@tehexploa's... hope you don't mind, dor@tehexploa!

    This code removes the need for your Twitter ID and returns more results per page... you will need to keep refreshing it to gather more emails... some of the emails may need cleaning too (see the cleanup sketch at the end of this post)... but hey, it's a free scrape script to get you rolling! :)

    I know this is a double post from the Jr. VIP section, but since dor@tehexploa is not a Jr. VIP and contributed to the code, it's only fair it's shared outside Jr. VIP too...

    PS: the code is showing an extra space right before yahoo.com that I can't delete.

    Disclaimer: this should be used for educational purposes only, guys... remember, spamming is bad and in some places punishable by fines! ;)
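
    For the cleaning mentioned above, a minimal pass over the harvested file could look like the sketch below - it assumes the script has already dumped addresses into emails.txt, and the filenames are only examples:

    Code:
    
    <?php
    // re-extract addresses from the raw dump, validate them and drop duplicates
    $raw = file_get_contents('emails.txt');
    preg_match_all('/[a-z0-9._%+-]+@[a-z0-9.-]+\.[a-z]{2,6}/i', $raw, $m);
    
    $clean = array();
    foreach ($m[0] as $email) {
        $email = strtolower(trim($email));
        // keep only addresses PHP itself considers valid, once each
        if (filter_var($email, FILTER_VALIDATE_EMAIL) && !in_array($email, $clean)) {
            $clean[] = $email;
        }
    }
    
    file_put_contents('emails_clean.txt', implode("\n", $clean) . "\n");
    echo count($clean) . " unique emails kept\n";
    ?>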
     
    • Thanks Thanks x 1
    Last edited: May 18, 2009
  7. dor@tehexploa

    dor@tehexploa Registered Member

    Joined:
    Apr 25, 2009
    Messages:
    95
    Likes Received:
    25
    I don't mind at all! That's awesome that you can get more than the standard ~10-15 results on a page. Just updated my code and couldn't be happier. :)
     
    • Thanks Thanks x 1
  8. ukescuba

    ukescuba Jr. VIP Premium Member

    Joined:
    Feb 24, 2008
    Messages:
    994
    Likes Received:
    634
    Occupation:
    Mobile Marketer & QR Code Junkie
    Location:
    San Antonio, TX
    I also posted the code back at the original blog and took the liberty of crediting you with helping put this together... ;)

    best regards

    ukescuba
     
    • Thanks Thanks x 1
  9. kingbrend

    kingbrend Regular Member Premium Member

    Joined:
    Feb 12, 2008
    Messages:
    427
    Likes Received:
    113
    That was fast... hxxp://www.webpronews.com/topnews/2009/05/11/spammers-may-have-another-trick-in-twitter
     
  10. dor@tehexploa

    dor@tehexploa Registered Member

    Joined:
    Apr 25, 2009
    Messages:
    95
    Likes Received:
    25
    Yeah, this was posted in another thread here in the Lounge... that's what made me investigate how to do it. I didn't even know this was possible and already available in the Junior VIP section. :p

    -----

    Thanks so much, ukescuba!
    Btw how does one get into the Jr. VIP section? Sounds like it has some pretty useful information - no reason to reinvent the wheel :p

    Also, I edited the code a bit to output a comma-separated list. A quick addition for those who aren't familiar with PHP and can't do it themselves.

    Code:
    <?php
    //set # of pages to scrape - you can increase this...
    //I found there were a max of 15 when searching manually
    for($i=1; $i<=15; $i++)
    {
    
    //fetch the search results page to scrape
    //if you can think of any other popular email domains let us know!
    $url = file_get_contents("http://search.twitter.com/search?&page=".$i."&q=gmail.com+OR+hotmail.com+OR+yahoo.com+OR+aol.com&rpp=100");
    
    
    //clean content
    $content = strip_tags($url);
    
    
    // extract emails
    preg_match_all("([a-z0-9!#$%&'*+/=?^_`{|}~-]+(?:\.[a-z0-9!#$%&'*+/=?^_`{|}~-]+)*@(?:[a-z0-9](?:[a-z0-9-]*[a-z0-9])?\.)+(?:[A-Z]{2}|com|org|net|biz|info)\b)siU",$content,$emails);
    
    ob_start();
    
    for($x=0; $x < count($emails,1)-1; $x++)
    {
    	print $emails[0][$x] . ", ";
    }
    // next line creates a space between pages
    print "\n";print "\n";
    
    $output = ob_get_clean();
    
    // write emails to a file
    file_put_contents( 'emails.txt', file_get_contents('emails.txt') . $output );
    
    }
    ?>
    
     
  11. ukescuba

    ukescuba Jr. VIP Premium Member

    Joined:
    Feb 24, 2008
    Messages:
    994
    Likes Received:
    634
    Occupation:
    Mobile Marketer & QR Code Junkie
    Location:
    San Antonio, TX
    @dortehexploa

    You can get Jr. VIP status and access to the Jr. VIP area either by donating to BHW or by continuing to contribute like you are doing - I think after a certain time/valid post count you get upgraded automatically.

    For Exec status I think you need to be voted in, as well as being an all-round helpful guy/gal :)

    hth
     
    • Thanks Thanks x 1
  12. arvo

    arvo Junior Member

    Joined:
    Aug 1, 2008
    Messages:
    111
    Likes Received:
    47
    I've made a few enhancements to this email harvester. You can now watch it harvest the emails. It will not time out. It's configurable.

    Please give me feedback so I can enhance it more... :D
     

    Attached Files:
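
    (The attachment itself isn't reproduced in the thread, so the sketch below is only a guess at what "watch it harvest", "no timeout" and "configurable" might look like in plain PHP - every name and value in it is illustrative, not arvo's actual code.)

    Code:
    
    <?php
    set_time_limit(0);                // stop PHP from killing the script mid-harvest
    
    // simple configuration block
    $pages   = 15;                    // pages to scan
    $perPage = 100;                   // results per page (rpp)
    $outFile = 'emails.txt';
    $query   = 'gmail.com+OR+hotmail.com+OR+yahoo.com+OR+aol.com';
    
    for ($i = 1; $i <= $pages; $i++) {
        $html = file_get_contents("http://search.twitter.com/search?page=$i&q=$query&rpp=$perPage");
        preg_match_all('/[a-z0-9._%+-]+@[a-z0-9.-]+\.[a-z]{2,6}/i', strip_tags($html), $m);
    
        // print progress as it goes so the harvest can be watched live
        echo "page $i: " . count($m[0]) . " emails\n";
        flush();
    
        file_put_contents($outFile, implode("\n", $m[0]) . "\n", FILE_APPEND);
    }
    ?>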

    • Thanks Thanks x 4
  13. NgocChinh

    NgocChinh Newbie

    Joined:
    Jul 7, 2009
    Messages:
    32
    Likes Received:
    84
    GREAT!!!

    How can I use it via the command line? I can't wait around for it to finish loading in the browser.

    Another question: can I use it for another website?
    Example: I have websites like this:
    http://domainname.com/contact.php?id=$number
    And $number ranges from 1 to 10000.
    Each page has one email, so can you modify your code?

    Thanks.
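
    (For the second question, a rough adaptation might look like the sketch below - domainname.com is the placeholder from the post, and the one-email-per-page assumption comes straight from it. Save it as, say, contact_grab.php and run it from the command line with "php contact_grab.php"; any machine with the PHP CLI installed will do, no browser needed.)

    Code:
    
    <?php
    // walk contact.php?id=1 ... contact.php?id=10000 and pull one email per page
    $out = fopen('contact_emails.txt', 'a');
    
    for ($id = 1; $id <= 10000; $id++) {
        $html = @file_get_contents("http://domainname.com/contact.php?id=$id");
        if ($html === false) {
            continue;                 // skip pages that fail to load
        }
    
        if (preg_match('/[a-z0-9._%+-]+@[a-z0-9.-]+\.[a-z]{2,6}/i', strip_tags($html), $m)) {
            fwrite($out, $m[0] . "\n");
            echo "id $id: " . $m[0] . "\n";   // progress shows in the terminal
        }
    }
    
    fclose($out);
    ?>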
     
    Last edited: Dec 18, 2009
  14. anonymust

    anonymust Newbie

    Joined:
    Apr 3, 2013
    Messages:
    16
    Likes Received:
    0
    Occupation:
    Bored
    Location:
    Twitter
    So how does someone like me get this to actually work? Do I need to install a virtual server? XAMPP? Or something like it?