How do I scrape all the content from a list of URLs?

Discussion in 'Black Hat SEO' started by tacopalypse, Jan 20, 2010.

  1. tacopalypse (Executive VIP, Jr. VIP, Premium Member)

    Joined: Nov 30, 2009 · Messages: 980 · Likes Received: 2,485

    OK, so I have a list of around 1,000 URLs, and I need to scrape the title and source code of every page in the list and save the data to a text file or something.

    How would I do this?

    Is there already a program out there that does this automatically?

    If not, how would I program one myself?

    :)
     
  2. FamousMassacre1 (Newbie)

    Joined: Apr 23, 2009 · Messages: 12 · Likes Received: 3

    I can make one really fast, but I will charge because there are so many URLs. Just tell me a fair price, like $2, or a domain :D
     
  3. essares1 (Junior Member)

    Joined: Apr 27, 2009 · Messages: 163 · Likes Received: 27

    That's very easy to do, man. Just look up 'cURL' on Google and find some sample code; you can create one yourself in minutes. If you can't get it working, I will do it for you for a small price. PM me.
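    Something like this quick sketch would do it (Python here, with pycurl standing in for PHP's cURL or the curl command line; "urls.txt" and "scraped.txt" are just placeholder filenames, nothing specific):

[CODE]
import re
import pycurl
from io import BytesIO

def fetch(url):
    """Download one page with libcurl and return its source as a string."""
    buf = BytesIO()
    c = pycurl.Curl()
    c.setopt(pycurl.URL, url)
    c.setopt(pycurl.WRITEDATA, buf)        # collect the response body
    c.setopt(pycurl.FOLLOWLOCATION, True)  # follow redirects
    c.setopt(pycurl.TIMEOUT, 30)           # don't hang on dead URLs
    c.perform()
    c.close()
    return buf.getvalue().decode("utf-8", errors="replace")

# read the URL list, one URL per line
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# fetch each page, pull out the <title>, and dump everything to one text file
with open("scraped.txt", "w", encoding="utf-8") as out:
    for url in urls:
        try:
            html = fetch(url)
        except pycurl.error as e:
            out.write("URL: %s\nERROR: %s\n\n" % (url, e))
            continue
        m = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
        title = m.group(1).strip() if m else "(no title found)"
        out.write("URL: %s\nTITLE: %s\nSOURCE:\n%s\n\n" % (url, title, html))
[/CODE]

    Swap in whatever HTTP client you like; the loop is the same either way.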
     
  4. alan50 (Registered Member)

    Joined: Jan 3, 2009 · Messages: 90 · Likes Received: 15

    Can anyone point us in the direction of an open source/free way to do this?

    I need this solution as well. Sort of like an automated WinHTTrack. Any ideas?
     
  5. Spaceman (Regular Member)

    Joined: Aug 8, 2009 · Messages: 435 · Likes Received: 53

    I will do it for you today. All I want is 5 links to 5 of my sites, from 5 domains of yours that are indexed in Google.

    What do you say?