
Scraping facebook emails like a Pro ( Real Advanced method )

Discussion in 'FaceBook' started by emptyzero, Aug 14, 2014.

  1. emptyzero

    emptyzero Regular Member

    Joined:
    Aug 13, 2012
    Messages:
    297
    Likes Received:
    160
    Hello, I've seen people trying to gather Facebook UIDs, emails, etc. to use in marketing.
    So before this gets patched, I'll share this cool method to extract emails from Facebook groups.

    - You'd better be a member of the group so this works.

    First, go to
    Code:
    https://developers.facebook.com/tools/explorer/
    Hit Generate Access Token, and make sure you check the groups permission.

    Second, get the group ID by entering /me/groups into the Graph Explorer's command line. This will show a list of all the groups you are in, with their names and IDs.

    Once you have the group ID you want..

    Code:
    https://graph.facebook.com/v1.0/(Group ID)/members?fields=username&limit=5000&access_token=(your access token)
    Replace (Group ID) and (your access token), then browse to the link. You will get around 5,000 results, and if the group has more members, you will find a link below the results to continue browsing from where you left off!
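    The fetch-and-follow-the-next-link loop described above can be sketched in Python. This is only a sketch: the `fetch_all_members` name and the injectable `get_json` helper are my own illustration (not part of the original method), and it assumes the Graph API keeps returning a `paging.next` link until the last page, as described in this post.

```python
import json
import urllib.request

def fetch_all_members(first_url, get_json=None):
    """Follow the Graph API's paging.next links, collecting every
    entry from each page's "data" list, until no next link remains."""
    if get_json is None:
        # Default fetcher; replaceable so the loop can be tested offline.
        def get_json(url):
            with urllib.request.urlopen(url) as resp:
                return json.load(resp)
    members = []
    url = first_url
    while url:
        page = get_json(url)
        members.extend(page.get("data", []))
        url = page.get("paging", {}).get("next")  # absent on the last page
    return members

# Usage (hypothetical placeholders for the group ID and token):
# members = fetch_all_members(
#     "https://graph.facebook.com/v1.0/GROUP_ID/members"
#     "?fields=username&limit=5000&access_token=TOKEN")
```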


    Now for cleaning the results. They come back like this:
    Code:
      {
         "username": "wafa.zagal.1",
         "id": "100004550220996"
      },
      {
         "username": "layal.mattar.1",
         "id": "100002974932215"
      },
      {
         "id": "100005475917204"
      },
      {
         "username": "ahmad.shaaban.507",
         "id": "634233856"
      },
      {
         "username": "fatima.diab.338",
         "id": "100000486916699"
      },
      {
         "username": "alaa.azab.96",
         "id": "100001977767819"
      },
      {
         "username": "rawand.aldeeb",
         "id": "1339392086"
      }
    Save all the results into a txt file, let's say (results.txt).

    Now upload this to your Linux VPS or dedicated server, or leave it where it is if you are already on Linux.

    In a terminal or SSH session, use this command to eliminate the bad results:

    Code:
    grep username results.txt > clean.txt
    Now all lines that contain (username) will be copied to a new file called clean.txt.

    Download this file again :)
    In Notepad, hit CTRL+H, then:

    Replace "username" with nothing
    Replace ", with @facebook.com
    Replace " with nothing
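    For anyone who would rather skip the grep + Notepad steps, the same filter-and-replace can be sketched as one small Python script. The `extract_emails` name and the regex are my own illustration, not part of the original post: like the grep step, it keeps only entries that have a username, and like the replacements, it appends @facebook.com.

```python
import re

def extract_emails(raw_text):
    """Equivalent of grep username results.txt plus the three
    search-and-replace steps: returns username@facebook.com addresses.
    Entries without a "username" field are dropped automatically."""
    usernames = re.findall(r'"username":\s*"([^"]+)"', raw_text)
    return [name + "@facebook.com" for name in usernames]

# Usage: read results.txt and write the cleaned addresses out.
# with open("results.txt") as f:
#     emails = extract_emails(f.read())
# with open("clean.txt", "w") as f:
#     f.write("\n".join(emails))
```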

    I can scrape 100,000 results in only 5 minutes (that's even faster than tools lol)
    That's all folks ;)

    emptyzero
     
    • Thanks Thanks x 13
  2. neweaver

    neweaver Regular Member

    Joined:
    Feb 12, 2013
    Messages:
    313
    Likes Received:
    167
    I love you buddy :)
     
    • Thanks Thanks x 1
  3. shahd

    shahd Newbie

    Joined:
    Apr 21, 2014
    Messages:
    8
    Likes Received:
    1
    Any idea about scraping UIDs from targeted pages?
     
  4. emptyzero

    emptyzero Regular Member

    Joined:
    Aug 13, 2012
    Messages:
    297
    Likes Received:
    160
    Not yet, trying though
     
  5. aky004

    aky004 Newbie

    Joined:
    Aug 6, 2014
    Messages:
    5
    Likes Received:
    0
    Code:
    grep username results.txt > clean.txt
    This is not working for me, can you help me?
    Anyway, GREAT SHARE!
     
  6. YouTubez

    YouTubez Regular Member

    Joined:
    Feb 3, 2013
    Messages:
    327
    Likes Received:
    140
    Occupation:
    YouTube Marketing
    This method will be patched in 3,2,1....
     
  7. Casio

    Casio Registered Member

    Joined:
    Apr 28, 2010
    Messages:
    74
    Likes Received:
    26
    Does this work on a targeted page?
     
  8. gondar00

    gondar00 Newbie

    Joined:
    Sep 21, 2013
    Messages:
    41
    Likes Received:
    4
    How do you send bulk emails???
     
  9. sashablack

    sashablack Elite Member

    Joined:
    Jan 8, 2010
    Messages:
    3,697
    Likes Received:
    2,050
    Gender:
    Male
    Looks like this is being patched as we speak; I could not get it to work :(
     
  10. SevenMathew

    SevenMathew Newbie

    Joined:
    May 3, 2014
    Messages:
    15
    Likes Received:
    2
    Maybe you're using Windows.
    Save your text file to C:\users\welcome

    Now open cmd and use the below command to clean the file.

    Code:
    findstr username results.txt > clean.txt
    The clean.txt file will be at C:\users\welcome
     
  11. hlcreativemedia

    hlcreativemedia Newbie

    Joined:
    Jul 4, 2014
    Messages:
    2
    Likes Received:
    0
    Thank you for this smooth method.
    What can I use these Email addresses for?
    Any productive ideas?
     
  12. E=MC²

    E=MC² Junior Member

    Joined:
    Apr 12, 2013
    Messages:
    176
    Likes Received:
    183
    It's working just fine.
     
  13. blackhaterzz

    blackhaterzz Newbie

    Joined:
    Aug 14, 2014
    Messages:
    26
    Likes Received:
    4
    Location:
    Somewhere
    Thanks for this method,

    I can't browse past the 5,000 limit though; when I copy the "next" link, the result page shows only a "previous" link without any data. Any ideas?
     
  14. nicoms91

    nicoms91 Junior Member

    Joined:
    Mar 9, 2014
    Messages:
    102
    Likes Received:
    4
    Any idea why it's impossible to scrape a group if you haven't joined it?
     
  15. emptyzero

    emptyzero Regular Member

    Joined:
    Aug 13, 2012
    Messages:
    297
    Likes Received:
    160
    This method is for Linux users, or guys who have a dedicated server / VPS (Linux) and use SSH (PuTTY) to control it.
     
  16. prosper109

    prosper109 Junior Member

    Joined:
    Mar 3, 2013
    Messages:
    165
    Likes Received:
    12
    I'm pretty sure you can add a &filter=id to only get the IDs. Something like that.
     
  17. nicoms91

    nicoms91 Junior Member

    Joined:
    Mar 9, 2014
    Messages:
    102
    Likes Received:
    4
    I noticed that I can only scrape groups with this URL structure "groups/199109493482902", but not ones like "https://www.facebook.com/groups/199109493482902/?ref_br or something like that".
     
  18. emmag92

    emmag92 Newbie

    Joined:
    Sep 12, 2014
    Messages:
    4
    Likes Received:
    0
    Yes, I am also having this problem. Could you please advise? Or has Facebook patched this?

    I can get the 1st 5000 with no problems.

    I would appreciate a response, thanks a lot :)
     
  19. lord1027

    lord1027 Elite Member

    Joined:
    Sep 20, 2013
    Messages:
    3,174
    Likes Received:
    2,222
    This will probably get patched any time now; you shouldn't have shared it publicly. FB forbids scraping.
     
  20. emmag92

    emmag92 Newbie

    Joined:
    Sep 12, 2014
    Messages:
    4
    Likes Received:
    0
    Actually, Facebook does not mind, because they benefit from it. Facebook gives you an option to upload UIDs to use for an ad campaign on a Like page.