Webs.com blocked robots.txt file

Discussion in 'Black Hat SEO' started by myfault, Jun 29, 2014.

  1. myfault

    myfault Power Member

    Joined:
    Sep 21, 2012
    Messages:
    636
    Likes Received:
    121
    I created some web 2.0 sites, but a few of them have Disallow: / in their robots.txt. Google still indexed those links, but the result shows "A description for this result is not available because of this site's robots.txt". Does this mean Google is not giving me credit for those web 2.0 links in the rankings?
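
    You can check this yourself with Python's standard-library robots.txt parser. A minimal sketch, assuming a site whose robots.txt simply disallows everything (the URL below is made up for illustration):

    ```python
    from urllib.robotparser import RobotFileParser

    # Simulate a robots.txt that blocks all crawlers from the whole site,
    # the situation described in this thread ("Disallow: /").
    rp = RobotFileParser()
    rp.parse([
        "User-agent: *",
        "Disallow: /",
    ])

    # Googlebot obeys this rule: it cannot fetch the page content, so the
    # URL can still appear in the index but without a snippet/description.
    print(rp.can_fetch("Googlebot", "https://example.webs.com/my-page"))
    ```

    If can_fetch returns False for Googlebot, that matches the "description not available" message you are seeing in the SERPs.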
     
  2. myfault

    myfault Power Member

    Joined:
    Sep 21, 2012
    Messages:
    636
    Likes Received:
    121
    ohh... 90 views and no reply ?
     
  3. djw1606

    djw1606 Regular Member

    Joined:
    Jan 24, 2014
    Messages:
    381
    Likes Received:
    187
    Yes, you're not getting credit for the link because Googlebot can't crawl it. Certain web 2.0 sites do this; LiveJournal is another. It only seems to affect new accounts. People who have had a LiveJournal account for some time don't have this problem. I think I read somewhere that if you post regularly, LiveJournal will remove the block.
     
  4. LiquidKnight

    LiquidKnight Newbie

    Joined:
    Jun 30, 2014
    Messages:
    17
    Likes Received:
    1
    It's a block that more and more web 2.0 websites are setting up to prevent people from just creating accounts to generate backlinks.
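
    For reference, the kind of rule these sites add looks something like this (an illustrative fragment, not Webs.com's actual file):

    ```
    # Block all crawlers from every page on this subdomain
    User-agent: *
    Disallow: /
    ```

    Anything under a Disallow: / rule can still be indexed if other sites link to it, but the crawler never reads the page, so no snippet is shown and any outbound links on it go uncrawled.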
     