
Google aint so smart

Discussion in 'Black Hat SEO' started by MDSOperandi, Jul 18, 2011.

  1. MDSOperandi

    MDSOperandi Regular Member

    Joined:
    Jul 10, 2009
    Messages:
    224
    Likes Received:
    192
    Occupation:
    Senior Software Engineer / Internet Marketing
    Location:
    Australia
    Home Page:
    With all this talk of the Panda updates, spammy links and "the end is nigh for blackhat", I decided there was just one thing to do: run a little test!

    It was to be a combination of the following:

    Intelligent cloaking (More about this later)
    Leveraging an existing domain purchased on the domain aftermarket
    Backlinks via
    * A series of open link wheels
    * PhpLink Directory submission (Xrumer mod to do this)
    * Forum profiles / Forum posting (Xrumer again)
    * Scrapebox / BacklinkBrigand blog / guestbook posts
    * Manual backlinks via a VA

    I won't go into too much detail about link pyramids etc. as all that information is covered on this forum in detail, but here are some specifics that some people may not know about:

    Cloaking:
    My cloaking mechanism is really quite simple: I have two pages, one for the bots and one for the humans. Obviously, when a bot arrives it sees the bot page and when a human arrives they see the human page.

    The challenge here is to ensure that the bot identification database is *always* up to date. Forget about using useragents, they can be faked easily, and the bot IP databases that are updated every n hours can let new bots through until the database is updated. If a single bot gets through and Google compares what one bot sees to what another bot sees and they are different (checksum, people), then you are in deep $hit. RedFlag. Which leads me to my next point.
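    (For the curious, here is a minimal sketch of that dispatch in PHP. It is an illustration only: the file names, the flat-file IP list and the is_known_bot() helper are assumptions, not the actual code behind this setup.)

    Code:
    <?php
    // Minimal sketch of the two-page dispatch: a visitor identified as a bot
    // gets the bot page, everyone else gets the human page.
    // bot.html, human.html and bot_ips.txt are placeholder names.

    function is_known_bot(string $ip): bool
    {
        // Look the visitor up in a locally maintained list of known bot IPs
        // (a real setup would use a database; a flat file keeps this short).
        $known = file('bot_ips.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) ?: [];
        return in_array($ip, $known, true);
    }

    $ip = $_SERVER['REMOTE_ADDR'];
    readfile(is_known_bot($ip) ? 'bot.html' : 'human.html');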

    Checksum:
    You have two pages, one for the bots and one for the humans. It's no secret that computers think in numbers, so what would be the fastest way for a computer to check that two pages (the bot page and, heaven forbid, a human page "seen" by a bot) are different? Page size.

    If you have two pages, one for the human and one for the bot, and they are way different sizes (a few k is no biggie), then you are gone. Think about it: why would one page (which is supposed to be the same page) be a different size for a bot than for a human??? RedFlag. So the key is to ensure that your human page and bot page are the same size: build your cloaked page first and then append a shim.gif, a bunch of <td>'s or whatever to make the two of them the same size (I also ensure that there are the same number of images etc., but you get the idea).
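    A rough sketch of that size-matching step in PHP (the file names and the HTML-comment filler are assumptions for illustration; empty <td>'s or a shim.gif do the same job):

    Code:
    <?php
    // Pad whichever page is smaller so bot.html and human.html end up
    // the exact same number of bytes. File names are placeholders.

    $bot   = file_get_contents('bot.html');
    $human = file_get_contents('human.html');

    // Filler that renders as nothing: an HTML comment of the exact missing length.
    $pad = function (int $bytes): string {
        // '<!--' plus '-->' take 7 bytes; the rest is padding characters.
        return $bytes > 7 ? '<!--' . str_repeat('x', $bytes - 7) . '-->'
                          : str_repeat(' ', $bytes);
    };

    $diff = strlen($bot) - strlen($human);
    if ($diff > 0) {
        $human .= $pad($diff);      // human page is smaller: pad it
    } elseif ($diff < 0) {
        $bot .= $pad(-$diff);       // bot page is smaller: pad it
    }

    file_put_contents('bot.html', $bot);
    file_put_contents('human.html', $human);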

    Screening incoming visitors:
    So, to ensure that every visitor is who they say they are, I look up my IP database to see if the visitor is a GoogleBot. If they are the GoogleBot then they get the "cloaked page" and life is good.

    If the IP is not in my database then the "GateKeeper" steps in and does both a forward and reverse DNS lookup to see who the IP actually is. If they are the GoogleBot then they are added to my database and shown the "cloaked page". On occasion an IP does not come back clean, meaning the forward / reverse does not add up. In that instance I actually make a webrequest and scrape the WhoIS info. Reverse / forward DNS hardly ever fails, but when it does I am prepared.
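    Here is roughly what that GateKeeper check looks like in PHP. This is a sketch only: the function name and the googlebot.com / google.com suffix test are assumptions based on the description above, and the WhoIS fallback is left out.

    Code:
    <?php
    // Verify a claimed Googlebot visit: reverse-resolve the IP, check the
    // hostname belongs to Google, then forward-resolve that hostname and
    // confirm it points back at the same IP.

    function is_googlebot(string $ip): bool
    {
        // Reverse DNS: IP -> hostname
        $host = gethostbyaddr($ip);
        if ($host === false || $host === $ip) {
            return false;                         // no usable PTR record
        }

        // Hostname must end in googlebot.com or google.com
        if (!preg_match('/\.(googlebot|google)\.com$/i', $host)) {
            return false;
        }

        // Forward DNS: hostname -> IP must match the original IP
        return gethostbyname($host) === $ip;
    }

    // A verified IP would then be added to the local bot-IP database.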

    So, that is the basic premise of my cloaking: dynamic identification of bots and a checksum between the two pages. Works really well and has kept crappy cloaked .infos alive and inside the Google index for coming up to two years, so like I said, Google aint so smart

    Domain name on the domain aftermarket:
    The second main step of my test was to get a domain with some age to it (it's no secret that Google likes aged domains, so I cheated here), so I got one off the GoDaddy aftermarket (a pretty generic domain which did not contain the keyword). It did have some incoming links and a PR of 2 (which I may / may not lose in the next PR update; it does not bother me).


    LinkBuilding / Niche:
    I then chose my niche (a simple Clickbank product which paid $45 per sale), wrote a unique article on the product and then created the keyword-rich cloaked page, ensuring that the checksum between the two matched.

    I then set about link building using the usual suspects: Xrumer / SENuke / BacklinkBrigand and got my VA to do some manual link building.

    I then used pingler to ping my website and continued to gradually build links over a period of weeks (I think long term).

    Specifically I used SENuke to build "feeder" pages to my main page and then used Xrumer / Senuke / BacklinkBrigand to link to those feeder pages. This is all very old news and something that people have been doing for a long time so like I said Google aint so smart

    I then used Xrumer to submit my new website to phpLink directories (I have a mod to do this) and chose *the relevant category* when submitting the site (A relevant category is very important if you want the phpLink directory submissions to stick).

    A note about Xrumer: I use the rgen.exe util a lot. It works and it works well and enables me to link to the main page without too much drama. The old "Hi, I'm new here" Xrumer posts are rubbish; learn Xrumer and learn it well.

    I also paid for Decaptcher / CaptchaBot (for Xrumer) to ensure that links were getting submitted correctly.

    The final results:
    I then watched my website climb up the SERPs (there was a small Google Dance, but during this time I continued to build links *gradually*) and it has been sitting between spots 3 and 5 for the last three months.

    Total time to build the website was just under an hour and a half. Total spent on domain and captcha: $9 for the domain and around $10 on Decaptcher / CaptchaBot, plus I paid my VA $22 to do the manual submissions.

    Earnings so far have been averaging four sales per week for the last three months @ $45 per sale. Total profit returned: just under $2000.

    Summary:
    I purchased a domain on the aftermarket to get around the new domain penalty Google aint so smart

    I used intelligent cloaking to feed the bots with keyword "rich" pages Google aint so smart

    I used a tried and tested procedure (LinkWheel) to build backlinks, which, contrary to what Matt Cutts would have you believe, still worked well Google aint so smart

    I used tools that were freely available to purchase on the Internet (do you honestly believe that Google does not have a copy of Xrumer / SENuke et al?) and used them intelligently (I'll say that again: INTELLIGENTLY) to get backlinks to my website in order to push it higher in the SERPs Google aint so smart

    I turned an investment of just over $40 and an hour and a half of my time into just under $2000 (so far) using everything that I really shouldn't have Google aint so smart

    Remember, Google also killed a ton of beautiful whitehat sites with their Panda update, meaning they just blanket-applied a computer algorithm that could not tell the difference between some crappy auto-blogs and some people's "livelihood" websites which had been around for years and had SEO done properly Google aint so smart

    Finally, Google even kicked its own website out of its own index

    I'll finish as I started: Google aint so smart

    Any questions shoot :D
     
    • Thanks x 31
  2. jrbobdobbs

    jrbobdobbs Junior Member

    Joined:
    Jul 7, 2011
    Messages:
    116
    Likes Received:
    18
    Interesting stuff - I agree, people think Giggle is all-knowing, like some sentient supercomputer, and it just ain't.

    What's the rationale behind the cloaking though? Why not just make the 'human' page as keyword-rich as the cloaked version?
     
  3. MDSOperandi

    MDSOperandi Regular Member

    Joined:
    Jul 10, 2009
    Messages:
    224
    Likes Received:
    192
    Occupation:
    Senior Software Engineer / Internet Marketing
    Location:
    Australia
    Home Page:
    :) I'm telling you, this cloaked page is so keyword stuffed it's almost unreadable; I just place tons of keywords inside <h1>, <b>, <i>, <h2> etc. tags. It still works and works well.
    Cheers
    MDS
     
  4. gsy159

    gsy159 Power Member

    Joined:
    Apr 29, 2011
    Messages:
    654
    Likes Received:
    158
    Thanks for this test. I'll PM you a few questions; maybe you can help me.
     
  5. mojstermiha

    mojstermiha Regular Member

    Joined:
    Jul 27, 2010
    Messages:
    447
    Likes Received:
    1,061
    Hey man! I won your BacklinkBrigand in Brad's competition :) Still haven't found the time to set it up, .NET Framework is messing with me :) Nice to finally meet you on BHW.
     
  6. raybes

    raybes Junior Member

    Joined:
    Jul 19, 2010
    Messages:
    120
    Likes Received:
    17
    Location:
    USA
    Feeding the search-engine spiders pages with blatant keyword stuffing is honestly working well? I was under the impression they started penalizing for that. Perhaps not. How long have you had things running successfully under that setup?

    I loved your post though, MDSOperandi. You're quite right. I am just curious about that one part.
     
  7. alman

    alman Jr. VIP Jr. VIP Premium Member

    Joined:
    Feb 23, 2011
    Messages:
    1,322
    Likes Received:
    389
    Occupation:
    Crorkservice.com
    Location:
    Crorkservice.com
    Nobody says Google is the smartest.
    All black hat methods still work today, but every year getting to the top gets harder and harder. And today, when you get there, you have to be the luckiest man in the world to keep good positions.
     
  8. Luxury

    Luxury Junior Member

    Joined:
    Jul 17, 2011
    Messages:
    137
    Likes Received:
    16
    Occupation:
    Brainstormer
    Location:
    The Great North, North America
    Google seems to be doing pretty well; it's the #1 ranking website in the world.
     
  9. xbox360gurl70s

    xbox360gurl70s Elite Member

    Joined:
    Sep 28, 2008
    Messages:
    1,532
    Likes Received:
    349
    Location:
    In your wet dreams
    Google is top because they have seasoned spammers in their lineup. Spammers think the same way, that is, to funnel traffic and humans to view their ads or stuff.

    The only way to beat a spammer when you are a spammer is not going to be simple; it's a matter of skill. If you don't bump heads with other spam folks like here on BHW, then the Google group of spammers [now anti-spammers] will always be one step ahead of all of us, but if we work together to employ and not saturate some secret things [Jr. VIP and exclusive VIPs] then we can be miles ahead of their technology before they even dare stop us.
     
  10. leonm

    leonm Newbie

    Joined:
    Jul 18, 2011
    Messages:
    4
    Likes Received:
    0
    Occupation:
    my job is looking for my job of love
    Location:
    south africa paarl
    Thanks OP. I got hit very hard by big G. This valuable info gave me hope again. All the best.
     
  11. zebrahat

    zebrahat Elite Member

    Joined:
    Aug 6, 2008
    Messages:
    2,349
    Likes Received:
    2,890
    Google is like Jason from the Friday the 13th movies---it's slow and bulky, and you can run miles ahead of it, but through some mystical or demonic stealth power it somehow catches up to its prey and later strikes out of nowhere. We may be looking at how to maneuver around cyberspace from the vantage point of people inside its universe, but Google is complex enough to see the entire universe from some place outside it. That's really far away, which explains why there's frequently a delay in it getting around to slapping people.
     
  12. bertbaby

    bertbaby Elite Member

    Joined:
    Apr 15, 2009
    Messages:
    2,019
    Likes Received:
    1,496
    Occupation:
    Product marketing
    Location:
    USA
    Home Page:
    Thanks OP! I just keep reminding myself that it's Eric Schmidt as CEO of Googliath and remember his somewhat problematic turn as CEO of Novell. The article reminds us that as Google gets larger its behavior is resembling Microsoft and its bad old ways. Image is by Manu Cornet; hat tip Flowing Data.

     
  13. FuryKyle

    FuryKyle Jr. VIP Jr. VIP Premium Member

    Joined:
    Nov 19, 2010
    Messages:
    2,395
    Likes Received:
    1,369
  14. pleasenukeme

    pleasenukeme Junior Member

    Joined:
    Apr 19, 2011
    Messages:
    102
    Likes Received:
    11
    Well, the fact that it's getting harder all the time is probably not because Google is getting any better.

    It's because more websites and better internet marketers are coming online every day - more competitors. But the number of highly wanted keywords remains almost the same.

    Inevitably it will become harder each day from this day forward, so just exploit it all while you can :cool:
     
  15. seoperson

    seoperson Registered Member

    Joined:
    Mar 23, 2011
    Messages:
    87
    Likes Received:
    37
    So it looks like you didn't blast your site directly with that many crap links, and mostly did the link wheel thing instead? Did you even bother with high PR homepage links to your site?

    I agree Google is still really stupid in so many ways. Also, yeah, they did ruin a lot of white hat sites with the latest stupid updates.

    Also, their own domain (google.com) went from a PR10, to a PR9, while facebook is still a PR10 LOL!

    eBay.com went from a PR8 to a PR7... that is just CRAZY.

    I mean seriously, eBay.com? That is one of the most popular websites on the internet.

    So much for too much spammy onpage SEO hurting a site eh? That is very interesting.

    Still though, it seems that blasting your site directly with crappy links definitely ruins it nowadays. I have TONS of sites that I killed like that :(

    And it wasn't just link velocity, because the penalty caught up with me and ruined the sites months after the blasting, weeks after I had stopped hitting them with crap links.

    Thoughts?
     
  16. friedman

    friedman Registered Member

    Joined:
    Apr 15, 2011
    Messages:
    70
    Likes Received:
    9
    This system will not work forever.

    I know for a FACT from my logs that Google (not their bot) fakes referrer information, has access to non-Google domains that are listed as private companies, etc., for spot checks of cloaking.

    As soon as a human from google visually checks the site, then looks at the googlebot version of the site, the game is over.

    This technique will not last long if you're running adwords.

    No longer than six months to deindexing, for sure.

    And you WON'T come back from a deindexing...
     
  17. MDSOperandi

    MDSOperandi Regular Member

    Joined:
    Jul 10, 2009
    Messages:
    224
    Likes Received:
    192
    Occupation:
    Senior Software Engineer / Internet Marketing
    Location:
    Australia
    Home Page:
    I'll respond to the posts below

    Hey buddy, good to see you too; funny how the BH community is smaller than people think. Fire that software up, it works well.

    Yes, it's called on-page SEO (LOL). The old <h1> etc. tags still work well and yes, Matt Cutts would love you to be under that impression. Re time? Over two years.

    Some are / some are not. Like I said in my post, specify the correct category when making a submission, and if it does not exist then do not submit it. Use Xrumer's xprior.txt to set up the correct categories prior to posting.

    Sure, you are right, nothing will work forever

    I don't check by referrer; I check by IP in real time.

    Sure, but you need to trigger a redflag before you get a manual review. If you fly below the radar (checksum / dynamic IP inspection of *all* incoming visitors etc.) then it's possible to stay well below the radar and not trip a redflag. Google simply does not have the resources to manually check every cloaked page against every non-cloaked page; they do it via code, hence the checksum routine with the same number of images / same title. But yes, if you did trigger a manual review and got reviewed manually you would be gone.

    I don't run adwords / google analytics / adsense on my cloaked websites. I don't even check them with browsers that have the Google toolbar installed

    From my original post



    I couldn't care less; I turned a $41 investment into $2000. Show me another investment that provides that type of return and I'm in.

    Finally, I have been getting asked about what cloaking tool I used and where I am getting the IP database from.

    I wrote my own code to do this; it was not hard to do from code. Searching for

    Code:
    reverse forward dns php
    Code:
    reverse forward dns C#
    gives you plenty of code samples.

    What I am trying to tell everyone here is: do not rely on what you hear from people like Matt Cutts as being gospel. BH still works, but like other people have mentioned in this thread, there are many more IMs coming online daily. Google has an uphill struggle to find and kill the BH sites and is taking out nice WH sites along the way with its algorithmic changes, so don't feel bad when your sh!tty autoblog that took ten minutes to set up gets nuked. Learn and move forward, and most of all don't trip the redflag.

    Cheers
    MDS
     
  18. dragonrage01

    dragonrage01 Power Member

    Joined:
    May 19, 2011
    Messages:
    674
    Likes Received:
    155
    This is so funny.
    Google ain't smart?
    It's smart enough to penalize your backlinkbrigand domain.
    Next time you are going to sell a service, make sure you can actually rank high for the keywords that you are targeting while not getting penalized by Google.
     
  19. MDSOperandi

    MDSOperandi Regular Member

    Joined:
    Jul 10, 2009
    Messages:
    224
    Likes Received:
    192
    Occupation:
    Senior Software Engineer / Internet Marketing
    Location:
    Australia
    Home Page:
    :rolleyes:
    Just out of curiosity, what keywords am I targeting and how do you know my domain has been penalized... extra credit for finding all the backlinks I have pointed at the domain doing SEO to get it to rank (Hint: it's between 0 and 0).
     
  20. spasovski

    spasovski Regular Member

    Joined:
    Mar 21, 2011
    Messages:
    394
    Likes Received:
    240
    Occupation:
    Web designer and Internet marketeer.
    I don't know why some members here attack the OP saying all sorts of stuff.

    Please keep in mind that he opened this thread in your favor. He wants you to know what he knows. I know that you'll post bs here but later you'll use this knowledge given by him.

    And why are you people still in the box? This forum was supposed to get you out in no time... Google is not so smart, because the OP's website has been below the radar for 2 years now, and after all the updates it is still making a fool out of Google's algorithms. And he's not the only example I know! There are some other websites I'm familiar with, and they are built and work on this same principle. It just takes some active brain usage and a perfect scheme.

    99% of you will never manage to accomplish something like this flawlessly and outsmart Google, so please have some respect for the OP.

    P.S. Google's employees are human you know. They are not super intelligent aliens with 500 IQ...
     
    • Thanks x 1