Is there a tool or script or book for cloning websites?

Discussion in 'Black Hat SEO Tools' started by NeedTips, Feb 26, 2009.

  1. NeedTips

    NeedTips Registered Member

    Feb 26, 2009
    Hi, all:

    Let's say I have a reseller hosting account and I've created a website structure I like for a domain, say DomainName01. Then I want to create identical duplicates of that structure for additional domains, and so on.

    Obviously, it would be foolish to build each subsequent website structure manually, as that takes time, so some sort of automated script is needed. Is such software or a script available?

    I am no Linux person, but I do know MS-DOS commands and DOS batch files quite well.

    Should I learn Linux, or am I better off using Windows hosting (assuming that in the Windows web-hosting world one can use DOS commands and batch files to automate creating folders and structures, and to upload, move, delete, and rename files)? As you can see, I've never had any experience with Windows hosting. I do have a getting-by, rudimentary knowledge of cPanel in a Linux hosting environment.

    What's the easiest and quickest approach? I have about 10-20 hours to learn to do such things, if they don't take more time than that.

    After I've created these groups of websites, I need to upload different files to each. Once again, I'd like to do this in batch mode. What would be the best FTP client that lets me automate, using a command-line interface and batch files, daily uploads to say 1,000 different websites (assuming I need to post new information and new files daily for each site, and each website requires different files with pre-determined names)?

    Any tips, advice and experience sharing are much appreciated.
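
A minimal sketch of the kind of batch upload loop asked about above. The hostnames, usernames, and file names are made-up placeholders, and it uses lftp (a scriptable command-line FTP client) as one possible choice; the loop only prints each command (a dry run) rather than connecting anywhere.

```shell
#!/bin/sh
# sites.txt format: host,user,localfile -- one site per line.
# These entries are hypothetical sample data for the sketch.
printf '%s\n' \
  "site1.example.com,user1,content/site1-today.html" \
  "site2.example.com,user2,content/site2-today.html" > sites.txt

# Dry run: print the lftp command each site would get instead of running it.
# A real run would also need a password (lftp -u user,pass).
while IFS=, read -r host user file; do
  echo "lftp -u $user -e 'put $file; bye' ftp://$host"
done < sites.txt > commands.log

cat commands.log
```

Scaling this to 1,000 sites is just a longer sites.txt; the loop itself does not change.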
  2. cooooookies

    cooooookies Senior Member

    Oct 6, 2008
    You can simply copy a web page with several Linux tools (I don't remember exactly which one I used once; maybe 'htdig'). But you can only copy the delivered HTML content, which is then static. Any PHP or other server-side dynamics (for instance, database queries) beyond JavaScript cannot be copied. If you want that too, you would have to hack the site.

    If you still want to do that and have no idea of Linux, 20 hours is not enough. However, in our business Linux knowledge is a must for many people, although it is possible to make money without it. But you will often end up in situations (like this one :) ) where you feel a little helpless, so I recommend giving yourself some basic Linux knowledge.
  3. misterajc

    misterajc Registered Member

    Aug 19, 2008
    Learn Linux. It's trivial to recursively copy a directory structure, preserving file permissions etc.: "cp -a <old directory> <new directory>" should do it. When designing your website, make sure all the links are relative rather than absolute, so they will still work on a different website.

    If you want to clone someone else's web site, wget does a fine job for static web pages, and will grab all the images and so on.
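
A small sketch of the "cp -a" approach described above, with made-up directory names. It builds a toy site with a relative link, then clones the whole tree in one command:

```shell
#!/bin/sh
# Build a toy site structure (hypothetical names, for illustration only).
mkdir -p site1/images site1/css
# A relative link like this keeps working after the tree is copied.
printf '%s\n' '<a href="css/style.css">stylesheet</a>' > site1/index.html

# Recursive copy preserving permissions, timestamps, etc.
cp -a site1 site2
```

For the other half of the post, the usual wget invocation for a static mirror is along the lines of "wget --mirror --convert-links --page-requisites http://example.com" (check your wget's man page; option sets vary slightly between versions).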
  4. BTOffensiveSecurity

    BTOffensiveSecurity Newbie

    Feb 13, 2013
    Try using HTTrack.
    And yes, you need to know something about Linux to do this; for example, figure out how to install HTTrack from the Ubuntu Software Center (if you are using an Ubuntu distribution).
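
As an alternative to the Software Center, the same install is a one-liner on Debian/Ubuntu systems (assuming the httrack package is in your configured repositories):

```shell
# Install HTTrack from the distribution's package repositories
sudo apt-get install httrack
```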
  5. olystyle

    olystyle Regular Member

    Jan 6, 2012
    Just a side note: HTTrack can easily be used from the command line, and I am pretty sure you can pass a .txt file containing all the websites you wish to download as an argument...
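
If memory serves, the switch for that is "-%L" (add all URLs listed in a text file); confirm against `httrack --help` on your install. A dry-run sketch of the invocation, with placeholder URLs:

```shell
#!/bin/sh
# sites.txt: one URL per line (placeholder sample data).
printf '%s\n' "http://example1.com" "http://example2.com" > sites.txt

# Dry run: print the command instead of hitting the network.
echo "httrack -%L sites.txt -O ./mirrors" > cmd.log
cat cmd.log
```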

    cheers olystyle
  6. agriya

    agriya Newbie

    Oct 6, 2012
    I am using grabber bots :)