
No penalty on cloaking

Discussion in 'Cloaking and Content Generators' started by Andrea0519, Dec 11, 2016.

  1. Andrea0519 (Newbie)

    Pearly White - On SEOmoz, we have PRO content like our Q+A pages, link directory, PRO Guides, etc. These are available only to PRO members, so we show a snippet to search engines and non-PRO members, and the full version to folks who are logged into a PRO account. Technically, it's showing search engines and some users different things, but it's based on the cookie and it's done in exactly the way engines would want. Conceptually, we could participate in Google's First Click Free program and get all of that content into the engines, but haven't done so to date.
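
    If it helps to see the mechanics, here's a minimal sketch of that kind of cookie-based gating - hypothetical Flask route and cookie names, not SEOmoz's actual implementation:

    Code:
    from flask import Flask, request

    app = Flask(__name__)

    FULL_GUIDE = "...the complete PRO guide text..."
    SNIPPET = FULL_GUIDE[:80] + " ... (log in to a PRO account to read the rest)"

    def is_pro_member():
        # The only signal is the visitor's login cookie; search engine bots and
        # logged-out users both fall through to the snippet.
        return request.cookies.get("pro_session") == "valid"  # placeholder check

    @app.route("/guides/<slug>")
    def guide(slug):
        return FULL_GUIDE if is_pro_member() else SNIPPET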

    Near White - Craigslist.org does some automatic geo-targeting to help determine where a visitor is coming from and which city's page they'd want to see. Google reps have said publicly that they're OK with this, so long as Craigslist treats search engine bots the same way. But, of course, they don't. Bots get redirected (as am I, if I switch my user agent). It makes sense, though - the engines shouldn't be dropped onto a geo-targeted page; they should be treated like a user coming from everywhere (or nowhere, depending on your philosophical interpretation of Zen and the art of IP geo-location). Despite going against a guideline, it's so extremely close to white hat, particularly from an intention and functionality point-of-view, that there's almost no risk of problems.
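
    A rough sketch of how a geo-redirect with a crawler exemption like that could be wired up - hypothetical hostnames and a placeholder GeoIP lookup, not Craigslist's actual code:

    Code:
    from flask import Flask, request, redirect

    app = Flask(__name__)

    BOT_TOKENS = ("googlebot", "bingbot", "slurp")  # crude user-agent check, for illustration only

    def city_for_ip(ip):
        # Stand-in for a real IP geo-location lookup (e.g. a GeoIP database).
        return "sfbay"

    @app.route("/")
    def home():
        ua = (request.headers.get("User-Agent") or "").lower()
        if any(token in ua for token in BOT_TOKENS):
            # Crawlers are treated as coming from "everywhere": no geo redirect,
            # just the generic city-picker page.
            return "Pick your city: ..."
        city = city_for_ip(request.remote_addr)
        return redirect(f"https://{city}.example.org/", code=302)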

    Light Gray - I don't particularly want to "out" anyone who's doing this now, so let me instead offer an example of when and where light gray would happen (if you're really diligent, you can see a couple of the sites above engaging in this type of behavior). Imagine you've got a site with lots of paginated articles on it. The articles are long - thousands of words - and even from a user experience point-of-view, the break-up into pages is valuable. But each page is getting linked to separately, and there's a "view on one page" URL, a "print version" URL, and an "email a friend" URL that are all getting indexed. Often, when an article's interesting, folks will pick it up on services like Reddit and link to the print-only version, or to an interior page of the paginated version. The engines are dealing with duplicate content out the wazoo, so the site detects engines and 301s all the different versions of the article back to the original "view on one page" source, but drops visitors who click that result in the SERPs onto the first page of the paginated version.
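
    And a minimal sketch of that bot-only 301 consolidation - hypothetical URL layout and user-agent check, not any particular publisher's code:

    Code:
    from flask import Flask, request, redirect

    app = Flask(__name__)

    BOT_TOKENS = ("googlebot", "bingbot", "slurp")

    def is_bot():
        ua = (request.headers.get("User-Agent") or "").lower()
        return any(token in ua for token in BOT_TOKENS)

    @app.route("/article/<slug>/print")
    @app.route("/article/<slug>/page/<int:page>")
    def alternate_version(slug, page=None):
        if is_bot():
            # Engines get a permanent redirect to the single canonical URL, so the
            # print, paginated and emailed copies all consolidate in the index.
            return redirect(f"/article/{slug}/full", code=301)
        # Human visitors still get the paginated or print-friendly rendering.
        return f"Rendering {slug}, page {page if page is not None else 'print'}"

    @app.route("/article/<slug>/full")
    def full_article(slug):
        return f"Full single-page version of {slug}"

    The human-visible pagination stays exactly as it was; only crawlers ever see the 301s, which is what pushes this into gray territory.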


    Source: Rand Fishkin