Discussion in 'White Hat SEO' started by almir012, Jan 26, 2011.
Yearly Google public relations scare. It's going to be an update that cleans up some of the crap out there, then we find ways around it and the cycle starts over. The good thing is, a lot of the newbies are going to be weeded out, so it's good for us.
What I suspect is that they'll go after grammar, and those keywords that the noobs have screwed up with SB.
This is also why you don't cheap out when doing SEO. Buy some well written articles and get some good tools. Also, spend more time working on your campaigns than asking why your site isn't getting indexed.
I was just doing some KW research tonight on a couple of KWs I'm targeting. I wanted to see if I could hop on some of the pages my competitors are linking from. Well, I discovered a site sitting at #3, less than a year old, and 95% of its links were PR0 forum post links; the other 5% were from 2.0 sites, bookmarks, and a blogroll or two. There were about 1,000 links total, so nothing special. The keyword gets about 30k searches a day, so this site is getting some nice traffic all from spammy BH methods. Also consider that the competition isn't too hard, but it is tough, with some high-PR sites that are Fortune 500 companies. This goes to show that Google doesn't go after link farms like they should. I really don't believe too much of what I've read about how Google handles links, after what I saw earlier today.
This is nothing new really when you think about it. Ever wondered why an Ezine article will rank a lot easier than a GoArticles one? Google knows Ezine content is generally much better quality compared to a spam sink like GoArticles.
Lol, here we go again.
All Google is going to do is devalue the power of links from the article directories, but it can't punish the site that's linked to unless the link is from a seriously suspect site. Think about how easy it would be to sabotage competitors if you could just create a ton of shit links with poor content and spam the web with it.
Which means that article sites with a high PR will probably become even more valuable.
My gut feeling at the moment is as follows, but maybe you guys can add to this or tell me if you are experiencing different.
1 - Onsite SEO is becoming more important again, and internal linking structure says a lot about a site. Some basic tweaks I have made to my sites have made a world of difference to my rankings, not to mention adding more links to my own pages within the content itself
2 - Blog comments are showing little value these days, while blog posts are still worth their weight in gold when you get them. Seems as if all the blog spam really does add little value. I am no longer going to point blog comments at my money sites, bar a few hundred just to provide some sort of link variety
3 - Forums, both profiles and posts, are as important as they ever were
4 - Directories have become a waste of time
5 - Outbound links to high PR sites are actually beneficial and add some level of credibility to your site as an information resource.
6 - More weight to credible Web2.0 and social networking sites
It's simple really, folks. If you are going to submit articles, take the time to make sure that even when you spin them they are good enough to clear manual checks.
It's important to understand that most websites, if they engage in any SEO at all, are normally paying a fortune for manual work: manual blog comments, crappy profiles, the odd article here and there. Have a look at the average SEO company and the advice being offered and you will see the same thing.
Lame-ass white hat techniques that DO work. They just don't work as well as our rampant spam does.
Well-researched niches should not have much competition in the first place, and if it's there, it should not be spamming the web to the degree the average user here is.
The web is a big place bro
To be fair to Google, they are extremely effective at fighting spam. They do try their best, but an anti-spam measure can't have a measurable negative effect on SERP quality across the board so a lot of stuff that seems easy to filter falls through the cracks. It's not nearly as much as it used to be though. It gets significantly harder every year.
I am sort of glad to see it. You used to be able to google something and get meaningful results; nowadays it is pure crap.
I don't believe a single word from that search company; it's all propaganda.
I don't care about this much. As long as you choose the keywords right, any type of link can be effective.
Google wants... what people searching with google want..... solid relevant information.
If you have a bullsh!t autoblog.... with no solid unique relevant content... and its main focus is to push your products... then sooner or later... google will change something that will hurt you.
if you have unique solid content... then google actually wants you to rank high.
same thing with backlinks..... if your site is a real site... you can do BH backlinking to your heart's content and not get sandboxed...
but if your site is just a CLONE autoblog... you will get sandboxed.
Google has real people that visit the sites....
and when a site offers no real value... google tries to find a fingerprint or
pattern that best describes this site.... and then adjust their algorithm to drop
the ranking of that particular site and any site that matches that same fingerprint.
google makes money by bringing relevant results to users...
when we use BLACK hat methods....
it fuks with their ability to bring the best relevant results.....
so they adjust their algorithms to circumvent us circumventing them.
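The fingerprint idea described above can be sketched very roughly in code. To be clear, this is a hypothetical illustration — the feature names, thresholds, and scoring are all made up, not anything Google has confirmed:

```python
# Hypothetical sketch of fingerprint-style site scoring.
# All features and cutoffs are invented for illustration only.

def site_fingerprint(site):
    """Reduce a site's stats to a few spam-signal ratios."""
    return {
        "dup_ratio": site["duplicate_pages"] / max(site["total_pages"], 1),
        "unique_word_share": site["unique_words"] / max(site["total_words"], 1),
    }

def looks_like_autoblog(site, dup_cutoff=0.8, unique_cutoff=0.2):
    fp = site_fingerprint(site)
    # A site that is mostly duplicates with little unique wording
    # matches the "no real value" pattern described above.
    return fp["dup_ratio"] > dup_cutoff and fp["unique_word_share"] < unique_cutoff

clone = {"duplicate_pages": 95, "total_pages": 100,
         "unique_words": 500, "total_words": 10000}
print(looks_like_autoblog(clone))  # True
```

The point of the "fingerprint" is that once one bad site is characterized this way, every other site matching the same pattern can be demoted in bulk — no manual review of each site needed.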
Looks like this is also the death of link exchanges.
After checking the new algorithm in detail for 3 months, I came out with an amazing result; I have been testing it and it works 100%. I'm building a website little by little until I finish with all the written material.
What I can tell you is that I have copied articles from wikip*dia onto some of my sites, with all the links intact, and it has boosted my SERPs for many more long-tail keywords.
It makes a lot of sense, every time I try to do research for an article that I am writing I will have to weed through all of the affiliate review sites, and because of SEO, these sites are usually in the top 10. It's hard to find an honest review of a product or service anymore.
Well they still can't detect duplicate content so clone autoblogs are still safe. They do know exactly what's going on with black hat but it's rare that they actually identify something that they can deal with by tweaking the algorithm. The stuff that's used today, blog comments and forum profiles, these things have been around for many years. They seem like they should be detected, but implementing an algorithm change to do away with this sort of thing just isn't feasible.
And as far as BH link building on a legit site and not getting sandboxed, possible if you've also got quality links, but it's not a matter of a unique site.
lol.... you are severely underestimating google...
there are thousands of sites that have been deranked and deindexed for using methods not in line with google.
google can detect duplicate content in a heartbeat.
google has almost unlimited resources..... keep that in mind.
in real estate... there are 3 rules.... location... location.. location..
for google the rules are.... unique content... unique content.. unique content....
Believe what you want. I've had hundreds of domains deindexed, but for major, major stuff.
The thing about duplicate content is that they know there are multiple copies, but they don't know which is the original. In this case, the stronger page all around gets the good ranking. Happens all the time.
And there's no such thing as unlimited resources, especially in computing. To algorithmically combat some black hat techniques without adversely affecting the overall quality of SERPs would be an engineering nightmare.
Wayne, what do you do for onsite SEO to make a world of difference like you mentioned?
Well... they are going to start deindexing for minor stuff now.. you can believe that
yes... this is right... but from now on... the plagiarizer is not going to receive any ranking "points" for having the article.
Well, comparing google's resources to blackhatters', you might as well label them "unlimited" for the sake of this discussion.
and its not that hard to do....
this is all google has to do to wipe out most autoblogs...
simple steps as follows:
1) find blogs with a lot of traffic -google knows which ones already
2) cross match articles to find duplicate content
3) if a site has more than 20 (or some mathematically calculated threshold) articles whose ownership is questionable... then it's fair to assume that site is hosting duplicate content as part of its business model
4) lower the rank of that site or deindex it until further investigation, or until they complain about rankings and an actual google employee verifies it
or whatever google feels is best to go about in fixing this problem.
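The steps above can be sketched as a toy script. The set names and the threshold are made up here — this is just the shape of the idea, not Google's actual process:

```python
# Toy sketch of the cross-match-and-threshold steps above.
# Names and numbers are illustrative only.

SEEN_ELSEWHERE = {"hash-a", "hash-b", "hash-c"}  # content hashes already seen on other sites

def questionable_count(blog_article_hashes, seen=SEEN_ELSEWHERE):
    # Step 2: cross-match each article against content seen elsewhere.
    return sum(1 for h in blog_article_hashes if h in seen)

def should_demote(blog_article_hashes, threshold=2):
    # Step 3: past some threshold, assume duplication is the business model.
    # Step 4 (the actual demotion or deindexing) would follow from a True here.
    return questionable_count(blog_article_hashes) > threshold

print(should_demote(["hash-a", "hash-b", "hash-c", "hash-x"]))  # True
```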
quite simple.... no nightmare.
no one is saying they are going to do it overnight... that would just be dumb. But it can be done.
This reminds me heavily of the days when everyone and their mothers used to hack DIRECTV....
I told them that directv could easily wipe all the hackers out....
people told me the same thing... it would be an engineering/financial/etc nightmare... they don't have the resources etc.. etc...
well they did it... they stopped the pirating of their signal cold.
this was because it was their territory... their playground .... their rules.
with google its the same thing....
this is google's playground... its their search engine... their rules.
don't underestimate them.
from your posts... it seems that you make money from autoblogs or whatever.... and me saying google can wipe out these sites is the worst thing you could read about...
so ... i have said what i needed to say...
I will not be answering nor posting to this thread anymore.
That's actually a pretty serious nightmare. Any algorithm that compared every page against every other page would have a big-O runtime of O(n^2), meaning that as the data set grows, the amount of work required grows quadratically. Put very simply, to check a set of 10 pages for duplicates, on the order of 100 comparisons would have to be made. To check a set of 100 pages, on the order of 10,000. This sort of thing gets out of hand very quickly.
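For what it's worth, the pairwise O(n^2) cost only applies to the naive approach. Hashing each page's content and bucketing by digest finds exact copies in a single linear pass — a minimal sketch below. The genuinely hard part, which this toy skips, is catching *near*-duplicates (spun articles, partial copies), where real systems use techniques like shingling or SimHash:

```python
import hashlib
from collections import defaultdict

def find_duplicate_groups(pages):
    """Group pages by content hash in one pass — O(n), not O(n^2).
    Only catches byte-for-byte copies; near-duplicate detection
    is the part that actually gets expensive."""
    groups = defaultdict(list)
    for url, text in pages:
        digest = hashlib.sha1(text.encode("utf-8")).hexdigest()
        groups[digest].append(url)
    # Any bucket with more than one URL is a duplicate cluster.
    return [urls for urls in groups.values() if len(urls) > 1]

pages = [("a.com/post", "original article"),
         ("b.com/post", "original article"),
         ("c.com/post", "something else")]
print(find_duplicate_groups(pages))  # [['a.com/post', 'b.com/post']]
```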
I also don't do BH these days other than for research, but I know why some things still work after ten years or so. They've done a great job overall, but there are some things that remain just out of their reach.
With all of this crap around the Internet I personally don't know what to believe. I just believe myself. Screw what others have to say and make your own judgements.