
Scrapebox: Don't understand it

Discussion in 'Black Hat SEO Tools' started by verticall, May 1, 2010.

  1. verticall

    verticall Regular Member

    Joined:
    Aug 10, 2009
    Messages:
    309
    Likes Received:
    16
    I've read a lot about ScrapeBox and watched the videos, but I don't understand the importing and exporting of links. No matter what I do, I have no clue how to keep track of links so I don't re-comment on them. I've watched all the videos multiple times, but I'm still not sure how to do it.

    Can someone explain to me how it's supposed to work, or link me to a good video? If you have AIM and can help, PM me.
     
  2. nixnash

    nixnash Power Member

    Joined:
    Oct 26, 2009
    Messages:
    581
    Likes Received:
    204
    Occupation:
    Student
    Location:
    BHW
    When you open ScrapeBox, on the bottom right-hand side you will see the options Import/Export URL List and Import/Export URL List with PR.

    Select one of these and you'll be asked which file format you want your links in: TXT, CSV, or XLS. If you're not sure which you'd prefer, test all the formats; I'd choose the XLS sheet and save it to your desktop. Now open the file... voila, you have all your URLs on your desktop.
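
    If you'd rather poke at the exported file outside ScrapeBox, the TXT and CSV exports are easy to read from a script. Here's a minimal Python sketch, assuming the TXT export is one URL per line and the CSV has the URL in its first column (the file names are just placeholders):

    Code:
    import csv

    # TXT export: assumed one URL per line.
    with open("harvested.txt", encoding="utf-8") as f:
        urls_txt = [line.strip() for line in f if line.strip()]

    # CSV export: URL assumed to be in the first column.
    with open("harvested_pr.csv", newline="", encoding="utf-8") as f:
        urls_csv = [row[0] for row in csv.reader(f) if row]

    print(len(urls_txt), "urls from the txt export")
    print(len(urls_csv), "rows from the csv export")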

    Hope it helps.
    Hit thanks if it does.
     
    • Thanks x 2
  3. SpeakToTman

    SpeakToTman Regular Member

    Joined:
    May 31, 2009
    Messages:
    249
    Likes Received:
    158
    Location:
    BHW
    Home Page:
    Thanks, but I don't think you fully understood the OP's question. If I understand it correctly, he wants to compare his harvested URLs with an imported list of URLs he has already commented on, so that he doesn't comment on the same blog twice.

    @OP: When you have the harvested URLs in the list, click "Import" on the right, then the option that says "URL Lists to Compare", then select the .txt document of the URLs that you have already commented on. This will remove any duplicates from your harvested list.
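
    For anyone curious, that compare step is basically a set difference between two plain-text lists, so you can also do it outside SB. A rough Python sketch, assuming one URL per line in each file (file names are made up):

    Code:
    def load_urls(path):
        # One URL per line, skipping blanks (assumed export format).
        with open(path, encoding="utf-8") as f:
            return {line.strip() for line in f if line.strip()}

    harvested = load_urls("harvested.txt")
    commented = load_urls("already_commented.txt")

    # Drop every harvested URL that is already in the commented list.
    fresh = harvested - commented

    with open("fresh.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(sorted(fresh)) + "\n")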
     
    • Thanks x 1
  4. verticall

    verticall Regular Member

    Joined:
    Aug 10, 2009
    Messages:
    309
    Likes Received:
    16
    Yeah, thanks. But now my question is: instead of having multiple documents, how do I combine them all into one? Do I always need to hit "Export URL List" -> "Add to existing list"?
     
  5. The Captain

    The Captain Regular Member

    Joined:
    Sep 12, 2009
    Messages:
    343
    Likes Received:
    188
    Why don't you just use the "blacklist" option?
     
  6. Matt123

    Matt123 Junior Member

    Joined:
    Mar 3, 2010
    Messages:
    120
    Likes Received:
    46
    Occupation:
    Being the Man all day long :)
    Location:
    The edge of genius
    Home Page:
    Use the blacklist option, or just open the .txt file and add them to the list yourself. There's currently no feature in there that lets you add to the list, so you'll have to do one of the options above.
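
    If you go the manual route, a few lines of script will fold a new batch into a master .txt list without creating duplicates. A sketch, again assuming one URL per line (file names are hypothetical):

    Code:
    def load_urls(path):
        try:
            with open(path, encoding="utf-8") as f:
                return {line.strip() for line in f if line.strip()}
        except FileNotFoundError:
            return set()  # first run: no master list yet

    master = load_urls("master_commented.txt")
    batch = load_urls("todays_posted.txt")

    # Union the two sets and rewrite the master list once.
    with open("master_commented.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(sorted(master | batch)) + "\n")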
     
  7. verticall

    verticall Regular Member

    Joined:
    Aug 10, 2009
    Messages:
    309
    Likes Received:
    16
    I just tried the blacklist option and it's not working. I went to the Local Black List in ScrapeBox, added all 500 sites I had already scrapebox'd, ran the harvester, and it still scrapebox'd all of them. So then I went to Blacklist.txt, added them manually, saved it, and it's still doing it.
     
  8. gregstereo

    gregstereo Elite Member

    Joined:
    Oct 5, 2009
    Messages:
    1,833
    Likes Received:
    1,027
    Occupation:
    I'm known to locate certain things from time to ti
    Location:
    Moose Factory, ON
    Is "Use Blacklists" ticked? (under the Black List menu)

    And when you say you "ran the harvester and it still scrapebox'd all of them" - what does that mean?
     
  9. verticall

    verticall Regular Member

    Joined:
    Aug 10, 2009
    Messages:
    309
    Likes Received:
    16
    Yes, it's checked.

    That means I added all the URLs (about 500 of them) to the text file, then ran the same URLs that I had harvested, and they all went through when ScrapeBox should have blocked them.
     
  10. Yukinari84

    Yukinari84 Elite Member

    Joined:
    Dec 12, 2007
    Messages:
    2,474
    Likes Received:
    4,665
    Occupation:
    I'm retired ;p
    Location:
    Somewhere in space...
    I just save the URLs after every harvest and comment to a text file for each niche/promotion I'm doing and label it something like "commented on + date of update".

    The next time I harvest URLs for the same niche/promotion, I just use SB to add the list of URLs from my "commented on" text file, then use SB to remove duplicates.

    This way I always have a list of fresh URLs after each harvest and it takes me about 30 secs to do it.
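
    Taken literally, that routine is: stick the old "commented on" list onto the fresh harvest, then remove duplicates. In script form it would look something like this (file names are invented):

    Code:
    def load_urls(path):
        with open(path, encoding="utf-8") as f:
            return [line.strip() for line in f if line.strip()]

    # Old "commented on" list first, fresh harvest after it.
    combined = load_urls("commented_on_2010-05-01.txt") + load_urls("harvest.txt")

    # Plain dedup: keeps the FIRST copy of each URL it sees.
    seen, deduped = set(), []
    for url in combined:
        if url not in seen:
            seen.add(url)
            deduped.append(url)

    with open("fresh_list.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(deduped) + "\n")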
     
  11. The Captain

    The Captain Regular Member

    Joined:
    Sep 12, 2009
    Messages:
    343
    Likes Received:
    188
    Umm, I'm pretty sure it will only delete ONE of the duplicates, not both.
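
    That's the catch with concatenate-then-dedup: plain duplicate removal keeps one copy of each repeated URL, while what you actually want is to drop both copies so only never-commented URLs remain. A toy example of the difference (made-up URLs):

    Code:
    old = ["http://a.com/post1", "http://b.com/post2"]    # already commented
    fresh = ["http://a.com/post1", "http://c.com/post3"]  # new harvest

    # Dedup keeps ONE copy of each URL, so a.com/post1 survives:
    deduped = list(dict.fromkeys(old + fresh))

    # Subtraction drops BOTH copies, leaving only truly new URLs:
    old_set = set(old)
    truly_new = [u for u in fresh if u not in old_set]

    print(deduped)    # a.com/post1 is still in there
    print(truly_new)  # ['http://c.com/post3']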
     
  12. broker

    broker BANNED

    Joined:
    Feb 22, 2009
    Messages:
    135
    Likes Received:
    41
    This is the best solution, imho.
     
  13. SpeakToTman

    SpeakToTman Regular Member

    Joined:
    May 31, 2009
    Messages:
    249
    Likes Received:
    158
    Location:
    BHW
    Home Page:
    I do something similar to Yukinari, except I don't import the URLs that I have already commented on into my harvested list... unless I'm finding it difficult to get my comments approved. Then I'll import an old list that has successfully been commented on before (within the same niche), add it to my harvested list, remove the duplicates, and start posting, because that way a large share of the URLs will accept my comment again.

    If I don't want to post on the same blog twice, I save the lists I have already commented on in the past and compare them to my harvested list as I mentioned above. This removes all the blogs I have already commented on from my harvested list.
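
    One wrinkle with comparing exact URLs: the same blog can turn up under a different post URL, so an exact match won't always catch it. If you mean "never post on the same blog twice" literally, comparing by hostname is safer. A sketch, assuming full http:// URLs and made-up file names:

    Code:
    from urllib.parse import urlparse

    def load_urls(path):
        with open(path, encoding="utf-8") as f:
            return [line.strip() for line in f if line.strip()]

    def blog_host(url):
        # Treat www.example.com and example.com as the same blog.
        host = urlparse(url).netloc.lower()
        return host[4:] if host.startswith("www.") else host

    commented_hosts = {blog_host(u) for u in load_urls("already_commented.txt")}

    fresh = [u for u in load_urls("harvested.txt")
             if blog_host(u) not in commented_hosts]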
     
  14. Yukinari84

    Yukinari84 Elite Member

    Joined:
    Dec 12, 2007
    Messages:
    2,474
    Likes Received:
    4,665
    Occupation:
    I'm retired ;p
    Location:
    Somewhere in space...
    The point is that my list is made up of places I have already commented on.

    Therefore, by adding that list into a fresh harvested list, SB will remove all the URLs that match my old list, leaving me with a list of fresh URLs.
     
  15. proxygo

    proxygo Jr. VIP Premium Member

    Joined:
    Nov 2, 2008
    Messages:
    10,221
    Likes Received:
    8,692
    Just thought I'd drop in the thread and say
    hi to greg / yukinari.
    I see you guys are still tutoring the noobies.
    Good on ya.

     
  16. irie08

    irie08 Junior Member

    Joined:
    Aug 23, 2009
    Messages:
    138
    Likes Received:
    52
    I do the same as Yukinari pretty much.

    After running an SB blast, I save the "posted" URLs to a folder inside my comment project directory called "Allposted". I use these so I can ping them later and check which links stuck.

    I make another folder in my project called "Blacklist" or something like that, where I export ALL the URLs that I harvested and sent comments to. Next time, I can easily load those lists and filter out any URLs that I've already hit. You can load as many lists as you like, all at once, and SB will merge them, so you don't have to check each list individually...
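
    The same trick works outside SB too: if you keep one list per project in a Blacklist folder, a script can union every file in it in one pass and filter the new harvest against the lot. A sketch (the folder layout is made up):

    Code:
    import glob

    blacklist = set()
    # Union every saved list in the Blacklist folder at once.
    for path in glob.glob("Blacklist/*.txt"):
        with open(path, encoding="utf-8") as f:
            blacklist.update(line.strip() for line in f if line.strip())

    with open("harvested.txt", encoding="utf-8") as f:
        harvested = [line.strip() for line in f if line.strip()]

    fresh = [u for u in harvested if u not in blacklist]
    print(len(fresh), "urls left after filtering out", len(blacklist), "known urls")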