Got 30,000 URLs to download images from, how?

Discussion in 'General Scripting Chat' started by ghettogong, Jan 3, 2010.

  1. ghettogong

    ghettogong Regular Member

    Joined:
    Oct 7, 2009
    Messages:
    299
    Likes Received:
    28
    I just joined a company where each of their product images is located at a URL.

    The URLs look something like this:
    mysuppliercompany.com/images/1/1/24324532543.jpg

    and the list ends with
    mysuppliercompany.com/images/9/9/76564565465.jpg

    I tried running this list of 30,000 URLs through HTTrack, but when it finished it had only downloaded 12,000 images, not 30,000.

    Does anyone know of software where I can run my URL list through it and it will save all the images to my computer?
     
  2. borgvall

    borgvall Registered Member

    Joined:
    Jul 25, 2009
    Messages:
    62
    Likes Received:
    2
    I think there is a plugin for Firefox called "DownThemAll" - try that plugin!
     
    • Thanks Thanks x 1
  3. ghettogong

    ghettogong Regular Member

    Joined:
    Oct 7, 2009
    Messages:
    299
    Likes Received:
    28
    Thanks for that, but I have stuck with Firefox 2, so I can't install this add-on. P.S. I love my Firefox 2!

    Any other programs where I can drop my big list of URLs and have it download from each URL and save the files to my computer?
     
  4. appleman

    appleman Regular Member

    Joined:
    Oct 30, 2009
    Messages:
    358
    Likes Received:
    97
    http://club.myce.com/f3/how-download-multiple-files-73860/

    Google will find you something with a little looking, if you know how to use it.
    As for that thread: they sound like they are on the right track with FlashGet - you just need to turn the notepad file of links into a webpage and then click "download all" in the program, or something along those lines.
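
    In case anyone wants to script that step instead of doing it by hand, here is a minimal Python sketch of the "turn the list into a webpage" idea - urls.txt and links.html are just assumed file names:

    # rough sketch: wrap each URL from urls.txt in an <a> tag so a download
    # manager (FlashGet, DownThemAll, etc.) can grab everything from one page;
    # urls.txt and links.html are assumed file names
    links = []
    for line in open("urls.txt"):
        url = line.strip()
        if url:
            links.append('<a href="%s">%s</a><br>' % (url, url))

    open("links.html", "w").write("<html><body>\n" + "\n".join(links) + "\n</body></html>")

    Then open links.html in the browser and let the download manager grab every link on the page.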
     
    • Thanks Thanks x 1
  5. AgentOrange_MkUltra

    AgentOrange_MkUltra Junior Member

    Joined:
    May 29, 2009
    Messages:
    180
    Likes Received:
    44
    Occupation:
    Hijacking Your System
    Location:
    Many steps ahead & Behind 4 Proxies + double VPN!
    • Thanks Thanks x 1
  6. bevardis

    bevardis Registered Member

    Joined:
    Jul 22, 2008
    Messages:
    89
    Likes Received:
    37
    If you have direct links to all of them, you could try something in Python.

    import os, urllib

    # download each URL (one per line) and save it under its original file name
    for item in open("urllist.txt", "r").readlines():
        url = item.strip()
        urllib.urlretrieve(url, os.path.basename(url))

    This should work if you have the direct links in a txt file and all the files have different file names.
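
    With 30,000 URLs a few of them will almost certainly time out or fail, so here is a slightly more defensive variant of the same sketch (urllist.txt is still the assumed file name) that skips files it already has and keeps going on errors:

    import os, urllib

    for item in open("urllist.txt", "r").readlines():
        url = item.strip()
        if not url:
            continue                      # skip blank lines
        name = os.path.basename(url)
        if os.path.exists(name):
            continue                      # already downloaded, skip it
        try:
            urllib.urlretrieve(url, name)
        except IOError:
            print "failed:", url          # note the failure and move on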
     
  7. ghettogong

    ghettogong Regular Member

    Joined:
    Oct 7, 2009
    Messages:
    299
    Likes Received:
    28
    Yup, I have a text file with all the different URLs. The only Python I know is the snake, so could you tell me how to create and run that script, please?
    Posted via Mobile Device
     
  8. n2zen

    n2zen Regular Member

    Joined:
    Sep 27, 2009
    Messages:
    269
    Likes Received:
    71
    Lazy way? Download wget, and then run the following in a command prompt or batch file:

    wget -i urls.txt

    The wget ini/config file can be tweaked to adjust timeouts and tons of other options.
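
    For reference, those settings live in wget's config file (.wgetrc on Unix; the Windows builds usually ship it as the ini file mentioned above). A couple of the relevant lines look roughly like this - the values are just example numbers:

    timeout = 30   # give up on a stalled connection after 30 seconds
    tries = 3      # retry each URL up to 3 times
    wait = 1       # pause 1 second between downloads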
     
  9. ghettogong

    ghettogong Regular Member

    Joined:
    Oct 7, 2009
    Messages:
    299
    Likes Received:
    28
    Ah OK... so my txt file has to be named urls.txt, right? And then I enter the following command:
    wget -i file_with_urls.txt right?
    Posted via Mobile Device
     
  10. n2zen

    n2zen Regular Member

    Joined:
    Sep 27, 2009
    Messages:
    269
    Likes Received:
    71
    wget -i urls.txt
     
    Last edited: Feb 6, 2010