
[Private Article] How to cheat next Google Penguin 2.1+ updates, be safe and Rank top10 easily? SEO Method

Discussion in 'White Hat SEO' started by jaksana03, Oct 10, 2013.

How to cheat the next Google Penguin 2.1+ updates, be safe and rank top10 easily? SEO Method

    This is a "do not try this at blackhat SEO home" theory, aka do it at your own risk ;) That is why I'm hiding the post from the public: I'm not trying to make the public do this. This is just an idea of how it could be done and why it would work. It is a loophole that might be fixed after some updates, and there are already many people using it now, but in quiet circles.


    How Do Google's Penguin Updates Work
    As we know, Panda was about onsite factors, while Penguin is purely offsite: speaking shortly, Penguin is about links. Since Penguin started, the era of negative SEO has arrived. Currently, after the last 2.0 and 2.1 tweaks, it's literally super easy to penalize most domains. It is a big advantage to SEOs because competitors can now hurt others really easily, and I could actually imagine that in some niches people are ranking sites by constantly deranking others. I've seen some of that, but it's not part of this post; it's just food for thought, especially for Google to change the way they treat things. What matters here is that they are tweaking algorithms around the links pointing to the websites in question.
    The important thing is that Penguin doesn't run 24/7; it is refreshed and run once in a while. That means they do some tests and findings, then run Penguin to kind of "rerank" stuff the way it likes, of course based on the backlinks coming to each website, their anchors, content quality, trust rank, etc.


    The Perfect Penguin Algorithm

    If it existed, it would: know if content is readable and makes sense as a whole, see if a link is hand placed or not, see if a link was bought, see if a site is spammy or not. Shortly: it would have to be human. Since Google cannot determine all of that within the algo, and they are aware of it, they put LOADS of human moderators to check top niches, manually curate sites there and of course dig into how they ranked. It is still not enough. There are now so many tools and so many spammers/SEO people that it's simply impossible for them to determine what's going on with their humans, and they cannot tweak the algo too hard because they are aware it would crush too many fully legit sites. So they came up with an idea: the Disavow tool. You are their free moderators this way, actually telling them which links to your site you think are bad. And they are not stupid: they know that what you will put in there is links you created automatically [or bought somewhere], because you have no other way of telling them what's good or not. Speaking shortly: you are doing their job on a massive scale if you are disavowing. You are the Google moderator now, and if you lost rankings in this update, you are partially responsible for it.
    Am I clear enough? Penguin = Backlinks = Disavow Data. This stuff is that simple, and since so many of you work for them for free, they are capable of doing updates so often now. This reminds me of one movie quote: "I loved a girl, I could have given my hand to be cut off for her. And you know what now? Now I wouldn't have a fuckin hand."
    Not sure if this was funny for you or if you know the original [it's a non-English movie though..], so let's get back to topic. Anyway, I can bet my hand that 90% of Penguin updates from now on are simple filters made of the Disavow data that you share. Just like Xrumer finds patterns in links within their tool, Google does the same but on a larger scale. When they think they nailed it, they add it to a Penguin update.
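    Obviously nobody outside Google knows their actual pipeline, so take this as a toy illustration only: a little Python sketch of what I mean by "finding patterns" in disavow data. Feed it a pile of disavowed URLs and it counts recurring domain/path footprints that a filter could be built from. The input file name and the threshold are made up.

    Code:
        from collections import Counter
        from urllib.parse import urlparse

        def footprints(url):
            """Yield crude footprint tokens: the domain plus each path segment."""
            parsed = urlparse(url)
            yield "domain:" + parsed.netloc
            for segment in parsed.path.split("/"):
                if segment:
                    yield "path:" + segment.lower()

        def mine(disavowed_urls, min_count=50):
            """Return footprints that recur suspiciously often across submissions."""
            counts = Counter(t for url in disavowed_urls for t in footprints(url))
            return [(tok, n) for tok, n in counts.most_common() if n >= min_count]

        # hypothetical aggregate of every URL webmasters disavowed
        with open("all_disavowed_urls.txt") as f:
            urls = [line.strip() for line in f if line.strip()]
        for tok, n in mine(urls):
            print(n, tok)  # frequent tokens = candidate spam-platform filters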


    Manipulating Disavow Data
    Yes, I bet you feel it now. Imagine a hypothetical situation: 43534634646s of webmasters, instead of submitting their auto-generated/spammy links, submit fully legit links and domains, mostly of currently top10 ranking sites. After some time Google would have to almost hand-moderate everything the algo finds from disavowed links, because they would never know if a link was actually legit or not. I'm sure they are aware of it and have lists of trust-ranked domains that disavows against don't count, etc., but still they cannot have that for everything. They surely cannot distinguish WAC articles from real ones.
    In the early stages what this would do is simply neg-SEO competitors, but after some time Google would have to take the filters down and make SEO like it was before, with less possibility to use negativity, aka to fuck up rankings by overdoing something. That's why I'm not sharing this publicly: imagine some of you decided to do that [I'm repeating: this is just a loophole in their idea, I'm not telling you to try it!], but if 3245235 people did it, there could be a massive war of placing links, and that would fuck up the whole user experience in the end.
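    For reference, the disavow file itself is dead simple: plain text, one URL or one domain per line [the format is straight from Google's own docs; the domains below are made up]:

    Code:
        # lines starting with "#" are comments and get ignored
        # disavow one specific page:
        http://spam.example.com/stuff/comments.html
        # disavow an entire domain and everything on it:
        domain:shadyseo.example.com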


    So How Would This Be Done

    For a start you need to have sites with a manual penalty [yes, if you don't have a manual penalty you can't submit links; they protected that part]. It's not hard to get one if you are doing lots of SEO.
    1. We add penalized sites that lost ranks to Google Webmaster Tools, or add new sites that we will try to rank within 48h with zillions of links, or get them penalized in the process.
    2. We scrape the backlinks of competitors using Ahrefs, in our niche where we didn't rank / got penalized, and/or in some other niches that we know are interesting.
    I'm not speaking here of choosing .gov/.edu or Wikipedia sites, or niches with superb rotation and manual checks [payday, pharmacy, casino], but some more stable ones.
    3. We run our automated tool, like GSA SER, Magic Submitter or whatever; it just needs to be able to auto-post to many platforms, and we try to post to as many of the scraped targets as possible.
    4. We take all the links we created + the ones we scraped, add them all to the disavow tool and file a reconsideration request [a rough sketch of building that file is just below].
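    Purely to illustrate step 4, here is a rough Python sketch of stitching the two lists into one disavow file [the file names are hypothetical; the output format is Google's documented one]:

    Code:
        def load(path):
            with open(path) as f:
                return {line.strip() for line in f if line.strip()}

        created = load("links_we_created.txt")      # export from the posting tool
        scraped = load("links_scraped_ahrefs.txt")  # competitor backlinks from Ahrefs

        # merge, dedupe and write in disavow format, then upload it in
        # Webmaster Tools and file the reconsideration request
        with open("disavow.txt", "w") as out:
            out.write("# combined disavow list\n")
            for url in sorted(created | scraped):
                out.write(url + "\n")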
    Imagine a big number of such actions across big or small niches and the impact it would have. The next Penguin could fully derank white sites and leave the spam.