Discussion in 'Black Hat SEO Tools' started by Mattiy, Dec 7, 2012.
How does fully automated linkbuilding work for you?
It fully automates linkbuilding, that's how it works for me.
It depends on your strategy. If you're using out-of-the-box shit then probably not; if you scrape/scrub your own site lists and work on your strategy, of course it can be effective.
The level of automation is entirely inconsequential.
Google doesn't know if a piece of content is placed there by a bot or a human. UNLESS you use crap content and crap working practices when you automate.
If you do it properly, automated and human links are 100% identical.
NB - Most people don't seem to do it properly - so this may be where the question comes from.
Sorry if this is a newbie question, but are there any tools that you are using? If so, what are they, and are they free?
I use very little software for link building, and none of it is free. The majority of my link building is done manually. This way I know the 1st tier links are high quality. I only use software to build 2nd and 3rd tier links, but again, none of it is free.
If you don't mind investing some money I would suggest SEnuke.
For me, it's better to do it manually, but if you really want it automated you can try SEnuke.
If by "automated" you mean "press a button and leave it" then I would say it doesn't work anymore (at least for me!).
Nowadays I always want to have control over my backlinks. I would spend maybe a whole day setting up the content, carefully picking what goes where and when, choosing the properties on the different Tiers etc. then I would press a button and leave it.... for several days.
In a week or two, depending on how I schedule everything, I would revisit my campaign and make further adjustments like extracting live links for other Tiers and setting up other campaigns.
I miss the times when you could simply do a couple of AMR blasts to rank well....
As you can see, it's not 100% automated, but being a geek to a certain extent :nerd: I love spending a lot of time setting up the system and then leaving it to do what it's supposed to do.
I'm fairly successful with it, but it's because I automate things that are mundane or that I'm normally just spinning, copying, and pasting for.
I think Google can tell if there is an editorial process at a website, versus a situation where links are easily added by others. I think Auto Submitters are a really bad idea and the #1 reason for the Unnatural Linking Penalty and Notices people are receiving.
For the links to the money site I have manual links built... I have my own software that I load those links into, and it builds giant pyramids around the manual web 2.0 links... I drip-feed them over up to 60 days and can reverse all the links if I wanted to... It has worked pretty well...
It's as good as it gets. Like scritty said, you can automate your link building 100% if you know what you're doing. Just make sure to use generated content only for tier 2+. Hire a writer to create spun, readable articles and stick to web 2.0s for T1.
And I prefer to do it manually.
You are wrong. Completely and nonsensically wrong.
If you take care with a tool and, more importantly, the content, THERE IS NO WAY THAT ANY SEARCH ENGINE, PERSON OR DEVICE CAN EVER KNOW HOW IT WAS POSTED!
If you use crappily spun content from a tool's "auto spin" option, fail to select appropriate linking targets, post the same spun content too often, don't set up your content with links in the appropriate places (titles, tags, resource boxes, bios, formatting, etc.), and then don't set up sundries like captchas and proxies, then you will get very poor results in terms of posted links, and your links WILL look automated.
BUT THAT'S DOWN TO THE USER - NOT THE TOOL
Stop spreading nonsensical, irrational, fear mongering CRAP.
Absolutely a search engine like Google can tell which sites have a mass of links being added all the time with no editorial process. For one, there are more external links. Second, there is greater variance in the content of those links (e.g., off-topic). It isn't hard at all, and an automated tool can most certainly lead to an unnatural link penalty.
If you are sure of yourself, why not give us an example site where you built links with a tool, and let's do an analysis on the success?
Using a tool to help you "discover" links, I get that. Using a tool to post links automatically in the same places many other people are posting them? That is a failure from the start.
If you believe it is a failure then don't do it, but it is not. What Scritty is saying is that if you take the time and set up your campaign correctly in the beginning, nobody will be able to tell. You can run your own footprints and also check how many links are on a page. You can also check whether that page is indexed or not. Just because you (or anybody) lack the necessary skills to do it doesn't mean that it does not work...
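The "check how many links are on a page" step above is easy to automate yourself. Here's a minimal sketch in Python, using only the standard library; the domain names and the sample page are made up for illustration, and in practice you'd feed it HTML fetched from the candidate target page:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Collects href targets from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def external_link_count(html, own_domain):
    """Count links pointing away from own_domain -- a rough footprint
    check: a target page already stuffed with external links is one
    you probably want to skip."""
    parser = LinkCounter()
    parser.feed(html)
    return sum(
        1 for href in parser.links
        if urlparse(href).netloc not in ("", own_domain)
    )

# Hypothetical page with two external links and one internal link.
page = ('<a href="http://spam1.example">x</a>'
        '<a href="/about">y</a>'
        '<a href="http://spam2.example">z</a>')
print(external_link_count(page, "mysite.example"))  # 2
```

Filtering scraped site lists on a threshold like this is exactly the kind of mundane work the posters above are saying is worth automating.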
As I said, if you have an example, by all means, show us. I've had several PR 6 and 7 sites, millions of links pointing to the sites I manage or own, ranked in the top 5 on Google for "shoes", and been doing this for 12 years. If you are doing it right, there is no need for article spinning, pinging, profile links, automated blog comments, or working off of a list of "do follow" links. If you are doing it right, people in your niche love your content, and traffic naturally grows. Even this forum is a great example of that. People love to hang out here because the content is managed well. If you are having success, I'm not going to judge you, but this thread was for offering an opinion, which I have offered. Make of it what you like.
Scritty is correct. I am just going to chalk this up as you not really understanding the way the web works technically, because if you did, you would know this is not possible. Let's look at the facts here; no opinions, just facts.
GoogleBot is just a web scraper. It goes to a page, scrapes all the text and links on that page, follows the links, and does the same thing over and over again. GoogleBot CANNOT tell if a human posted the content or if it was a bot. Being a developer myself going on 15 years, this is just not possible. What Google DOES do is look at the page text and links posted and examine those using its algorithm, just like it has been doing for years.
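The fetch-and-follow loop described above can be sketched in a few lines. This is a toy breadth-first crawl over an in-memory "web" (a dict standing in for real HTTP fetches), just to show the point being made: the crawler only ever sees text and links, and nothing in that data says whether a human or a bot put it there.

```python
from collections import deque

def crawl(site, start):
    """Breadth-first crawl of an in-memory web: each page name maps to
    (text, [linked page names]). Fetch a page, record its text, queue
    its unseen links, repeat -- the same loop any scraper runs."""
    seen = {start}
    order = []
    queue = deque([start])
    while queue:
        page = queue.popleft()
        text, links = site[page]
        order.append((page, text))
        for link in links:
            if link in seen or link not in site:
                continue  # already visited, or a dead link
            seen.add(link)
            queue.append(link)
    return order

# Hypothetical three-page site.
site = {
    "home":  ("welcome", ["about", "blog"]),
    "about": ("who we are", ["home"]),
    "blog":  ("posts", ["about"]),
}
for page, text in crawl(site, "home"):
    print(page, "->", text)
```

A real crawler adds politeness delays, robots.txt handling, and parsing, but the information available to it is the same: page text and the link graph.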
Tons of people use automated link posting: top SEO companies, and a lot of people on these very forums. Between SEnuke, MS, UD, and other automated link building software, we are talking tens of thousands of people using these tools. Now, if Google could detect auto-posting, why are these companies still in business? Why are people still buying the software? Why do you not see tons of posts on this very forum saying SEnuke, MS, or UD killed my site? It's simple really: Google cannot detect whether it was a human or a bot that posted information. Hell, the site that you are posting on can't even tell, if you do it right.
People using the tools improperly is what is causing them to get caught: using unreadable junk spun content, not spreading links over a period of time, etc. Point is, most of these tools have options, and you can set them in a way that no one could possibly tell automation from manual.
The fact that tens of thousands of people are still using these tools and no one is complaining (I mean, who would pay monthly for a tool that does not work?) is proof enough that Google can't tell whether content was posted with a bot or manually. People give Google way too much credit and assume it can do anything, which is a shame. More money for me, I guess; the more people that are scared, the less competition I will have.
I think the real test is would it pass a manual review by a google employee? If you were to look at all the links from the site and scan through them, would you find a pattern that seems unnatural?
There are many reputable sites in google's eyes that regularly produce quality content including external links on many occasions. Links from these sites are very powerful.
Then there are those types of links that are easily generated without an editorial review by the site owner. I don't know what you think, but I can tell you absolutely that sites like this are VERY easily identified by Google, and the links don't pass any juice, or not for long. Worse still, a large number of such links pointing to a site is easily identified too, and it is flat-out unnatural.
I live in a building full of Google employees and have talked shop at the jacuzzi many times. They do lots of manual reviews, and then build algorithms that account for unnatural links. I will emphasize that sites that allow external links with little or no editorial policy, or have a large number of external links that have no relationship, are especially the ones that don't count for much. In addition, new sites and new links have less value.
Why work so hard trying to do something automated when there is so much more benefit in producing your own good content over a period of time, making friends with site owners that have reputable sites, and working on the links that have a lot more juice from webmasters that actually care about their site and where they link? The good links are more difficult to obtain but are worth 1000 or more automated links, or links posted on sites where they can be added without the approval of the owner.
Have you guys forgotten about Google Analytics and what information it provides to Google? I hope you guys are blocking it when link building.
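One simple way to do the blocking mentioned above is a hosts-file entry on the machine running the tools, so the Analytics beacons never leave the box. This is a sketch of the idea; the domains listed are the well-known Google Analytics hosts, but check what your own pages actually load before relying on it:

```
# /etc/hosts (or C:\Windows\System32\drivers\etc\hosts on Windows)
# Null-route Google Analytics so tracking requests resolve locally
127.0.0.1  google-analytics.com
127.0.0.1  www.google-analytics.com
127.0.0.1  ssl.google-analytics.com
```

A browser extension or firewall rule achieves the same thing per-machine; the point is just that the tracking request never reaches Google.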