There are too many footprints in this thread; I don't have time to use all of them.
My Scrapebox runs 24/7.
Thanks for sharing!
Can you post footprints for zf sites?
How many platforms footprints do you have?
You're a god, thank you! <3
First BHW visit and I'm already starting to love this place.
Anyone care to share what footprints are? Sorry if this is a noob question.
Sure no problem ..... just click this.
Enjoy ..... .....
Wow, this is an amazing list. I have a scraper and poster that will post to any anonymous platform, i.e. one where you don't have to make an account. How can I go about sorting these footprints into anonymous-only? I'm a little new to this.
You're going to download a program called Sick Platform Reader, a free tool that analyzes your URL list and splits it into platforms. Depending on the size of your list, it can take around a day to complete, again running on your server.
With the list now split into sections, you will have multiple platforms you can submit links to.
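To picture what that splitting step does, here is a minimal Python sketch of the idea, assuming a plain scraped_urls.txt and a handful of illustrative platform markers; this is just the concept, not how Sick Platform Reader actually works internally.

Code:
# Minimal sketch: bucket scraped URLs by platform using URL markers.
# The markers and file names below are illustrative assumptions only.
platform_markers = {
    "vanilla": ["/entry/register", "/index.php?p=/profile/"],
    "moodle": ["/blog/index.php?postid=", "/mod/forum/"],
    "vbulletin_blog": ["/blogs/"],
}

def split_by_platform(urls):
    buckets = {name: [] for name in platform_markers}
    buckets["unsorted"] = []
    for url in urls:
        for name, markers in platform_markers.items():
            if any(marker in url for marker in markers):
                buckets[name].append(url)
                break
        else:  # no marker matched this URL
            buckets["unsorted"].append(url)
    return buckets

with open("scraped_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for name, bucket in split_by_platform(urls).items():
    with open("platform_" + name + ".txt", "w") as out:
        out.write("\n".join(bucket))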
Coming back to Article Kevo, I'm going to choose all the platforms this software supports and import the list into it; it will then check for supported sites and add each URL into the correct section. Other SEO software you use should have this function as well.
From here you want to grab your test domain, set up the software and a project to submit to all the links you just scraped, and then it's time to rock and roll.
Hopefully you will end up with a nice big success list of new sites you just submitted to.
Grab the verified URLs of all the submitted sites, then download a free tool called Scrapebox Do Follow Tester.
Feed your success list into it; it will scan each link, find the dofollow backlinks, and let you export them to a .txt file.
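If you are curious what a dofollow test boils down to, here is a rough Python sketch of that step; it assumes the requests and beautifulsoup4 packages are installed and uses yourdomain.com as a placeholder, so it is an approximation of the idea rather than what Scrapebox Do Follow Tester actually does.

Code:
# Rough sketch: keep a verified page only if it links to your domain
# without rel="nofollow". MY_DOMAIN and the file names are placeholders.
import requests
from bs4 import BeautifulSoup

MY_DOMAIN = "yourdomain.com"  # placeholder for your own site

def has_dofollow_link(page_url):
    try:
        html = requests.get(page_url, timeout=10).text
    except requests.RequestException:
        return False
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        if MY_DOMAIN in a["href"]:
            rel = [r.lower() for r in (a.get("rel") or [])]
            if "nofollow" not in rel:
                return True
    return False

with open("success_list.txt") as f, open("dofollow.txt", "w") as out:
    for url in (line.strip() for line in f):
        if url and has_dofollow_link(url):
            out.write(url + "\n")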
So in 7 days you have created your own private dofollow URL list, which can now be used on your main website safe in the knowledge that you are getting quality dofollow backlinks.
Found on another site. Thought I would share because it was very helpful to me.
Hello gtreeoutsourcing, can you please share the footprints below? I checked the whole thread but didn't find them; they will be useful for everyone.
Thanks for your awesome share, man.
http://www.blackhatworld.com/blackhat-seo/black-hat-seo/491042-get-huge-search-engine-optimization-footprints-collections-7.html#post5234097

PHP Nuke FootPrints
Code:
"Powered by PHP Nuke" inurl:modules.php?name=Forums&file

DedeEIMS FootPrints
Code:
"powered by DedeEIMS" inurl:guestbook.php

Vanilla FootPrints
Code:
"Powered by Vanilla" inurl:/profile/
"Powered by Vanilla" inurl:/index.php?p=/profile/
"Powered by Vanilla" inurl:/entry/register
Can you please share the Article-Moodle footprint?
Not all of them work well.
Wow, this is really awesome. Good work!
The best of everything you can use for SEO. Thanks a ton!
I apologise for my lack of understanding; I'm very new to this. Could someone please explain what these footprints are and what I can do with all these lists?
Code:
http://www.blackhatworld.com/blackhat-seo/black-hat-seo/491042-get-huge-search-engine-optimization-footprints-collections-3.html#post4850890
http://www.blackhatworld.com/blackhat-seo/black-hat-seo/491042-get-huge-search-engine-optimization-footprints-collections-3.html#post4835002
http://www.blackhatworld.com/blackhat-seo/black-hat-seo/491042-get-huge-search-engine-optimization-footprints-collections-2.html#post4780910
Hi gtreeoutsourcing. This thread helps newbies like me a lot... could you find these types of footprints?
Article Friendly Ultimate
pHp Link Article
php link article-login
press release script
article directory pro
Thanks for your help =)
XRumer is hard to learn at first, but if you need any further assistance just PM me and I can organise a tutorial.
Thank you so much for these footprints!
Do you have any footprints for PDF sharing sites?
This is amazing..... I seriously have no idea how much this is going to help, but it's going to HELP, haha.
Is this useful for GSA SER? I need niche blog comments or something similar in the make-money-online niche.
Can you recommend footprints for MediaWiki, Wikka Wiki, and MacOS wiki?
I don't know what this is or what it's used for.
Edu Footprints - to find .edu sites which accept registrations, comments, etc.
Example: Google site:.edu "forums register" and it will give you a list of .edu sites where you can register.
Guestbook Footprints - to find guestbooks you can sign with a comment and a link.
Just Google the footprints and follow the results; you'll figure it out.
*You need to insert your keyword by replacing the word "keyword" in the search query.
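To make that keyword-replacement point concrete, here is a tiny Python sketch that merges a footprint list with niche keywords to build the actual search queries; the file name and example keywords are assumptions for illustration.

Code:
# Tiny sketch: swap niche keywords into footprints to build queries.
footprints = [
    'site:.edu "forums register" keyword',
    'inurl:guestbook.php keyword',
]
keywords = ["dog training", "make money online"]  # example niche words

queries = [fp.replace("keyword", kw) for fp in footprints for kw in keywords]

with open("queries.txt", "w") as out:
    out.write("\n".join(queries))
# 2 footprints x 2 keywords = 4 queries; 1,000 of each would already
# give you a million unique searches.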
Would it make much sense to use these footprints now that they are posted publicly?
Since everybody and their mom will be whoring these footprints out, there will most likely be tons of spam links already on all of the pages that these footprints bring up.
Am I right?
Any footprints for website builders, like "Powered By Website Tonight" and "powered by godaddy"?
Very very nice share. As some have already mentioned, this is perfect for outsourcing. And I don't agree they'll be spammed to death. Just think about how many different keyword combinations you can have.
I think I'd have to spend my whole lifetime on it.
How do I use this? I am such a noobie. I need a good lesson from you guys. Please help me!
Sure no problem ..... just click this.
Enjoy ..... .....
Wow, what a great list; I copied and pasted them all. This should keep me busy for the next 1000 years, thanks =;0)
Does anybody have Dutch Scrapebox footprints?
Article Beach FootPrints
Code:
"inurl:index.php?page=submitarticle"
"Articles with any spelling or grammar errors will be deleted"
inurl:index.php?pagedb=Submission Guildlines
"Here are the most popular 100 articles on"
"upload your articles and keep updated about new articles."

Article Friendly Ultimate FootPrints
Code:
"Powered By: Article Friendly Ultimate" "This page took Micro Seconds to load."
"Powered By: Article Friendly Ultimate" "Newest Authors"
"Powered By: Article Friendly Ultimate" "Our New Articles"
"Powered By: Article Friendly Ultimate" "inurl:/submitarticle.php"
"Powered By: Article Friendly Ultimate" "inurl:/signup.php"

Press Release Script FootPrints
Code:
"Powered by Press Release Script" "Sign-Up"
"Powered by Press Release Script" "Most Rated Press Releases"
"Powered by Press Release Script" "Expert Authors"
"Powered by Press Release Script" "Recently Approved"
"Powered by Press Release Script" "Hot Press Releases"
"Powered by Press Release Script" "Editors Picks"

vBulletin Blog FootPrints
Code:
"Powered by vBulletin" inurl:/blogs/

YAD FootPrints
Code:
"Recently Approved Articles" "Free Article Submission"
"Recently Approved Articles" + "Sign-Up"
"Recently Approved Articles" "Expert Authors"
"Most Popular Articles" "Hot Articles" "Editors Picks"

Article Friendly FootPrints
Code:
"Powered By: Article Friendly" "total articles"
"Powered By: Article Friendly" "Keywords"

Moodle FootPrints
Code:
moodle "public profile"
inurl:"/blog/index.php?postid=" moodle

PublicBookmark FootPrints
Code:
"melden Sie sich zuerst kostenlos an!"

UCenter FootPrints
Code:
"Powered by UCenter Home"
Sorry to be a noob, but can anyone tell me which of these footprints will work in XRumer?
I have been looking for an XRumer platform list but have yet to find one.
Can I ask something? What are these all about? Can someone explain it to me, please?
Basically, it's for spamming.
Big thanks to the OP. This is HUGE.
If you want to scrape websites for XRumer, try any footprints from this thread for guestbooks or forums:
Open the folder where your Hrefer is installed -> open the "Templates" folder -> create 2 .txt files: for example, forum.txt and forum_addwords.txt.
Open forum_addwords.txt -> add some footprints from this thread.
Open forum.txt -> add footprints like /forum/ or /register/ (Hrefer will check whether any of these pieces appear in each scraped URL to sift out trash; see the sketch after these steps).
Open the "Words" folder -> create a .txt file and put a few thousand words associated with your niche in it.
Open Hrefer -> harvest some proxies (I prefer paid proxies) -> select your file with your niche words -> choose "forum" in "search engines options and filter" (the addwords file will be chosen automatically) -> choose search engines -> choose a .txt file to save your freshly scraped forums to -> scrape a few thousand forums -> copy the file where they are saved and paste it into the "Links" folder in XRumer -> submit.
Something like that)
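The sieve-filter step is the easiest part to picture in code. Here is a minimal Python sketch of the idea, assuming plain text files like the ones described above; it is not Hrefer's actual implementation.

Code:
# Minimal sketch of a sieve filter: keep a scraped URL only if it
# contains one of the pieces from forum.txt (e.g. /forum/, /register/).
with open("forum.txt") as f:
    sieve_pieces = [line.strip() for line in f if line.strip()]

def passes_sieve(url):
    return any(piece in url for piece in sieve_pieces)

with open("scraped.txt") as f, open("filtered.txt", "w") as out:
    for url in (line.strip() for line in f):
        if url and passes_sieve(url):
            out.write(url + "\n")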
I have been lurking here and there for information... as I have increased my knowledge, I can definitely say that this is one of the most helpful threads out there... among the top.