Discussion in 'Black Hat SEO' started by maxlinks, May 3, 2012.
Yes, we are providing Google with all the info they need to hang us...
Well, the board is public, and if this one wasn't, another one would be.
Completely pointless thread so far. Would you like to elaborate some more?
that's what jr. vip is for
lol, every time they do a Panda update or whatever, we feed them all the info they need on how it's working for them.
oh I'm junior VIP, didn't even notice
I can't really understand the conclusion of your sentence. Are you saying that it's a bad thing that this forum is public? Even if it were private, there would be no difference; they would still be able to find info.
The only way for G00gle not to know what's happening in forums would be to not create forums = no info for us either. I agree with codo that it's a bit pointless; please do elaborate some more.
He's saying that when we come up with new ideas to fool Google into ranking our sites, they will create an update to penalize what we're doing and knock us out of the SERPs.
It's kind of obvious that Google is going to react to BH SEO (a complete exploit of their system). If you don't know this and aren't able to adapt and be prepared for it, you shouldn't be in the game. It's fairly simple. No use in complaining about it.
The community is necessary. Nobody can solve the google algo alone. We need each other to beat the system.
The bright side is that it's much easier to exploit a loophole than it is to fix a system that has a loophole. We have a significant advantage.
I agree... Google always has to react to what we are doing. Googlebot is an algorithm, and all math problems have a solution; now we need to find it...
Yes, that's the important thing to remember: Google is an algorithm, not a personal vendetta against us and our sites.
The internet is just too huge for it to ever be done manually.
Yes Google will probably monitor this forum every now and then, but remember Scrapebox was released years ago. The Penguin update is only really a response to that NOW.
They just had to do something about it. Scrapebox is pretty mainstream now.
I could launch a WP blog now and once it's indexed and fairly high in the SERPS it'll get 10-100 Scrapebox comments a day.
QFT. They have administrative inertia. They also have more to lose if the results are not what Users expect, due to the centrally-controlled structure of their organization.
There is a paper I read a while back about the advantage of (in that case) genetic diversity. When major environmental changes occur, having a diverse population allows those that are able to adapt to a changing environment to survive, even while others die-off.
When Google makes a change, some will die and others will not. Here's a list of false beliefs that come about as a reaction to that change:
1) Skill will prevail. The "skills" that gave past success are the very same that created present failures. In the long term, you are just as likely to fail as a direct result of your skill as you are to succeed. To borrow from Taktical, the skill of exploiting a loophole puts a big target on your backside, with the label "Kill this Loophole Exploiter" on it.
2) Blind luck. God doesn't roll dice with the universe, and neither does Google.
3) "Wait it out". There's no guarantee that the recent change in the algorithm isn't only the first step, and those that dodged the first bullet will not dodge the second.
4) "Keep doing what you've been doing." Same as above.
5) Google screwed me, and now they are going to fail. Most Users are glad to see a spammer die. The stupidity behind that belief is the very reason why everyone hates them.
6) I'm going to start spamming Bing, Yahoo and Baidu and get rich that way. If that were such a good idea, you would have done it years ago.
I've been reading a lot since Penguin. I could probably add more to the list, but while constructing it I realized the point is larger than the sum total of the items on the list. The point is that most of what is being written now is wrong, and it's being written by people who were wrong before Penguin.
I think some of what Google has done isn't difficult to understand. First and foremost they are focused on what they call relevance. Having a clear understanding of that word is critical. Giving spammers a good income is not one of the components of the definition of the word "relevance", and anyone that thinks otherwise should be immediately and permanently ignored. If they can't see Google from Google's perspective, then they can't see Google at all.
Second, what are the tools at Google's disposal, and how would you use them if you were Google? If your goal is to remove spammy search engine results, which types of results would be the safest and easiest to diminish, with the lowest risk of diminishing the Users' search relevance? Describe a list of qualities that make a site spammy, and how you would use Google's tools to identify them.
I think bounce rate is the most obvious choice. If a #1 site has a 90% bounce rate and "bounce" is defined as back-button or window close within 10 seconds of the click, and the #2 site has a bounce of 34%, what would you do if you were Google, DUH? But I read these posts by these dumb f-ing spammers and they are all like "Oh pish-post with the "bounce rate". So a few little Users get their panties in a wad because they clicked into a site promising a large erection when their searched keyword was "long underwear"." Although, a small voice just told me I should encourage them to think this way, for obvious reasons.
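The bounce definition in that post (back-button or window close within 10 seconds of the click) can be sketched in a few lines. This is purely an illustration of the poster's definition; the 10-second threshold and the function names are assumptions, not anything Google has published:

```python
# Hypothetical sketch of the bounce definition from the post above:
# a visit counts as a bounce if the user leaves within 10 seconds.
BOUNCE_THRESHOLD_SECONDS = 10  # assumed cutoff from the post, not a known Google value

def is_bounce(dwell_seconds: float) -> bool:
    """Classify a single visit: True if the user left within the threshold."""
    return dwell_seconds < BOUNCE_THRESHOLD_SECONDS

def bounce_rate(dwell_times: list[float]) -> float:
    """Fraction of visits that bounced (0.0 if there are no visits)."""
    if not dwell_times:
        return 0.0
    return sum(is_bounce(t) for t in dwell_times) / len(dwell_times)

# Made-up dwell times echoing the #1 vs #2 comparison in the post:
site1 = [2, 3, 5, 8, 4, 6, 3, 2, 9, 40]              # mostly quick exits
site2 = [120, 45, 3, 300, 60, 5, 90, 30, 15, 200]    # mostly engaged visits
print(bounce_rate(site1))  # 0.9
print(bounce_rate(site2))  # 0.2
```

With that definition, the comparison is just two fractions; whatever signal Google actually uses, the asymmetry between a 90% and a 34% bounce rate is easy to compute.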
Another "gem" I'll post again, because the Washington Times failed to call me the next morning after I posted it the first time, is that this mysterious "Google Dance" that causes an uncontrollable urge among the natives to gather in the center of the village and start gyrating wildly in their grass skirts and chicken-bone pierced through bared breasts, may actually be testing bounce rates when sites A, B and C are ordered 1, 2 & 3, then 3, 2 and 1, etc... (See mathematically pseudo-scientific chart below. Grass skirts with optional coconut protector and other safety equipment are recommended before reading.)
Case A: 1, 2, 3
Case B: 1, 3, 2
Case C: 2, 3, 1
Case D: 2, 1, 3
Case E: 3, 1, 2
Case F: 3, 2, 1
I'm still developing the idea, but consider that in each case (A through F) the total volume of searches in the first 3 results might change. For example, during testing period Case A (1, 2, 3) had 34,000 searches, but during a different testing period Case F (3, 2, 1) had 98,000 searches. What would that mean?
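The six cases above can be sketched as a simple permutation test: serve each ordering of the top 3 results for a while, record engagement, and keep the winner. Everything here is hypothetical; the engagement numbers are made up to match the 34,000 vs 98,000 example, and none of this is a known Google mechanism:

```python
# Hypothetical sketch of the "Google Dance" idea: show each ordering of the
# top 3 results to a slice of traffic, then keep the best-performing ordering.
from itertools import permutations

sites = ["A", "B", "C"]

# Imagined measurements: searches observed while each ordering was served,
# echoing the 34,000 (Case A) vs 98,000 (Case F) example above.
observed = {
    ("A", "B", "C"): 34_000,   # Case A: 1, 2, 3
    ("A", "C", "B"): 41_000,   # Case B: 1, 3, 2
    ("B", "C", "A"): 52_000,   # Case C: 2, 3, 1
    ("B", "A", "C"): 47_000,   # Case D: 2, 1, 3
    ("C", "A", "B"): 73_000,   # Case E: 3, 1, 2
    ("C", "B", "A"): 98_000,   # Case F: 3, 2, 1
}

# All 3! = 6 orderings, matching Cases A-F.
cases = list(permutations(sites))

# The "sweet spot": the ordering with the best observed engagement.
best = max(cases, key=lambda order: observed[order])
print(best)  # ('C', 'B', 'A') -- i.e. 3, 2, 1 wins in this made-up data
```

The design choice worth noticing: with only 3 slots there are just 6 arms to test, so the experiment is cheap per keyword; the cost problem discussed below only appears when you multiply across candidates and keywords.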
If I were Google, I'd start looking for the "sweet spot", and that's where the results would move towards.
Now. Most people might read this, and that's the extent of their universe, and God love them for it. Thanks for the gumball Mickey.
But we (me, you and Google) are all smarter than that, and we know that if 3, 2, 1 is good, then maybe 3, 2, 4 is better.
And that's where this furball completely explodes. Who could possibly have enough computing power to make continuous tests of bounce rate (and who knows what-all else) of all the millions of sites on the internet, raised to the exponential power of all the possible keywords?
Where Google's computing power = CP,
the number of webpages on the entire internet = W, and
the number of possible variations of English words (up to and beyond 12 characters) = KW, then
CP = W^KW (W to the KW power).
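Just to put toy numbers on that back-of-the-envelope formula: even absurdly small placeholder values for W and KW make W^KW astronomically large, while the space Google would actually have to test per keyword (orderings of a few top candidates) stays small. Both inputs below are made-up placeholders, not real estimates:

```python
# Toy evaluation of the back-of-the-envelope formula above, CP = W ** KW.
# The inputs are tiny placeholders, not real estimates of the web's size.
import math

W = 10**6   # pretend only a million webpages exist
KW = 100    # pretend only a hundred possible keywords exist

cp_digits = KW * math.log10(W)  # number of decimal digits in W ** KW
print(f"W^KW has about {cp_digits:.0f} digits")  # ~600 digits even for toy inputs

# A much tighter bound on the actual test space: for one keyword with n
# candidate pages, there are n * (n-1) * (n-2) orderings of the top 3
# (cf. Cases A-F, where n = 3 gives 6).
n = 10
top3_orderings = math.perm(n, 3)
print(top3_orderings)  # 720
```

The gap between the two numbers is the whole point: exhaustively testing every page against every keyword is impossible for anyone, but sampling a handful of orderings per keyword is trivially cheap at Google's scale.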
I'm tired of reading a bunch of the mathematical equivalent of illiterates going on about what Google cannot do. Their attempt at limiting Google's abilities is an expression of their own limitations and nothing else. The small fish live in a very small pond, as it's limited by the size of their imagination.
I think the pond is big. REALLY fucking big. So fucking big I KNOW I don't know how big it is. And I also know that those that think they do are stupid, and paying too much attention to them could cost you money. Big fishes live in big ponds. It's a big fucking pond, is my point.
Well, I think some Google employees or slaves here have authority accounts, so Jr. VIP etc. won't work.
Nigel, your logic rests upon the assumption that all sites ranked by blackhat techniques are inherently spammy. The population of obvious spam sites has been greatly reduced over the past few years as Google's ability to analyze on-site content has grown.
Today, many of our sites are decent, and some are actually quite well designed and useful. The assumption that a website is not what a user is looking for just because it is for-profit is axiomatically false.
Exactly. This is Matt's biggest mistake. Some people here build awesome sites and are cleverer than those who just spend on AdWords.
We are Google's testers and developers and they don't even pay us! Shame on you Google!
Yes, when "spammy" has an evolving definition, and only Google knows what it is for certain. What's that quote?
Oh yeah, Bob Dylan.
We all have to take the risk of G00gle getting info from us... otherwise how would we share knowledge/methods?
Yeah, he's right, Google might have dozens of spy users here sending them reports about what we are doing here, ha ha...