I'm mirroring a website with HTTrack. The whole point of the project is to grab the site's /website/users subdirectory, but (I assume) permission restrictions or disabled directory listing prevent me from browsing or copying that subdirectory in one go. HTTrack will fetch individual files (/website/users/someuser) one at a time when it follows a link from elsewhere on the site, but I need ALL the users in one batch, and only the users. I've already set HTTrack to ignore robots.txt and have tried various other settings, with no effect. Can anyone help? Thanks in advance.
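For reference, the kind of command line I'd expect to do this looks roughly like the following (example.com is just a placeholder for the real site, and ./mirror is an arbitrary output folder -- I may well have the scan rules wrong, which is part of what I'm asking about):

```shell
# Sketch only -- example.com stands in for the real site.
# -O    output directory for the mirror
# -s0   never obey robots.txt
# Scan rules are evaluated in order and the last matching rule wins:
#   "-*" excludes everything, then the "+" rule re-allows /website/users
httrack "http://example.com/website/users/" \
    -O ./mirror \
    -s0 \
    "-*" \
    "+http://example.com/website/users/*"
```

Even with filters like these, HTTrack can only discover pages it finds links to, so if the server never emits a listing of /website/users I suspect no filter setting will enumerate the unlinked files -- but I'd be glad to be corrected on that.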