SEO Software - need advice

howdoyou

Regular Member · Joined: Nov 17, 2008 · Messages: 289 · Reaction score: 58
I need some suggestions. I'm programming software right now that searches Google for a few keywords (can't tell you the keywords... sorry :confused:). This pulls up a listing of 5,755,000 web pages with no captcha or anything else to keep you from posting your link/comment/article, so you can generate 5,755,000 backlinks.

It then goes through the web pages and posts your message/name/website etc. on popular blogs, forums, and so on.
You can also specify keywords to narrow down and hit niche sites.
Can anyone think of any features to add to this bot to make it better?

I'm also looking for a good GUI designer. PM me if you can fill this spot.
Anyone who gives me a good idea for the software - if I use the idea, they will get a copy of the bot at no charge.

I won't be selling this bot for a few months, as I'm going to be using it myself first. :p

Thanks,
 
Randomizing of link anchor text is a must; you don't want thousands of links with the same keyword popping up in a day.

So you need a box to paste a list of anchor texts the bot randomizes from. Doing the same for URLs would also be good, so you could paste, say, 10 subpage URLs and it would alternate between them.
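A minimal sketch of that randomization, assuming the user-supplied lists are just pasted in line by line (the example anchors and URLs here are made up, not from the bot):

```python
import random

# Example user-supplied lists (illustrative values only).
anchor_texts = ["best widgets", "cheap widgets", "widget reviews"]
urls = ["http://example.com/page1", "http://example.com/page2"]

def pick_link():
    """Pair a random anchor text with a random target URL."""
    return random.choice(anchor_texts), random.choice(urls)
```

Picking independently at random each time means no single anchor/URL pair dominates the link profile.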
 
Placeholders, got it.

example:
{URL}

would be replaced with a random URL from a list of URLs.

{URL} - inserts a random URL from your list of URLs
{NAME} - inserts a random name from your list of names
{WEBPAGE} - inserts a random webpage from your list of webpages
{WEBPAGE_NOW} - inserts the name of the webpage you're posting on
{DATE} - you know...
{TIME} - "
{KEYWORD1}
{KEYWORD2}
{KEYWORD3}

Any other ones?
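A sketch of how that placeholder substitution could work. The token names follow the list above; the replacement lists and the `page_name` parameter are illustrative assumptions:

```python
import random
import re

# Example user-supplied lists keyed by token name (illustrative values only).
lists = {
    "URL": ["http://example.com/a", "http://example.com/b"],
    "NAME": ["Alice", "Bob"],
}

def fill_template(template, page_name="example-blog"):
    """Replace each {TOKEN} with a random pick from the matching list."""
    def sub(match):
        token = match.group(1)
        if token == "WEBPAGE_NOW":
            return page_name
        if token in lists:
            return random.choice(lists[token])
        return match.group(0)  # leave unknown tokens untouched
    return re.sub(r"\{(\w+)\}", sub, template)

print(fill_template("Posted by {NAME} on {WEBPAGE_NOW}: {URL}"))
```

Leaving unknown tokens untouched keeps a typo in the template visible instead of silently dropping it.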
 
Confirmation of successful posting; storing of successful URLs so you can build niche databases to post to again without harvesting; ordering of the list by PageRank.

Also, the number of outbound links on a target page: if these sites have no spam protection, you don't really want to be dropping your URL on pages with 500 other spam comments.
 
I think you should hire sweetfunny. I like her ideas ;-) She seems like she knows how to blog farm her way into daisy duke profiteering.
 
Thanks for the tip sweethoney, I had heard this before but didn't know it was that important. Most of my keywords on one of my sites are the same, and I'm Google and Yahoo #1 for my main keyword and 6 others!
 
Confirmation of successful posting; storing of successful URLs so you can build niche databases to post to again without harvesting; ordering of the list by PageRank.

Also, the number of outbound links on a target page: if these sites have no spam protection, you don't really want to be dropping your URL on pages with 500 other spam comments.

This is good,
so the bot should have a user option where the user can set a maximum number of outbound links, and if the number of outbound links on a page exceeds that maximum, it skips the website?
 
What about checking whether the keyword is in the title of the page, across the 5 million results?
 
Proxy support would be a must for this, so your IP doesn't get banned and leave you unable to post again.

Also, splitting the sites by PR would be good for some of the bigger SEO guys out there who need to get their sites' PR up and so would only be posting in a select few places.
 
Thanks for the tip sweethoney, I had heard this before but didn't know it was that important. Most of my keywords on one of my sites are the same, and I'm Google and Yahoo #1 for my main keyword and 6 others!

Google has a certain tolerance; getting 5k links in a day, all with the anchor text "Consolidation Loans", is not natural and will raise more red flags than a Spanish bullfight.

Your 6 keywords are probably not competitive; the links with identical anchors were probably obtained over weeks, not hours, and we are probably talking links in the dozens, not tens of thousands. In other words, not enough to cause a blip on the link graph radar.

This is good,
so the bot should have a user option where the user can set a maximum number of outbound links, and if the number of outbound links on a page exceeds that maximum, it skips the website?

Yes - detection of the number of outbound links already on the page, and an option to skip posting to it if that count exceeds a certain value.

Example:
Code:
http://www.gocc.gov/Members/sjwillis/weblog_storage/blog_76624

A once-good .gov page, now annihilated with over 600 comments and so many links dropped that it takes a minute to load. Everything from WoW gold and gay porn to Atkins diets.

You really don't want your links dropped in places like that; it's of no benefit and may actually be detrimental.
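A rough sketch of that outbound-link check. The threshold is arbitrary, and a real version would also need to ignore the site's own internal links:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a href="..."> tags with absolute (http...) targets."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href.startswith("http"):
                self.count += 1

def too_spammy(html, max_outbound=50):
    """Return True if the page already carries more than max_outbound links."""
    parser = LinkCounter()
    parser.feed(html)
    return parser.count > max_outbound
```

A page like the .gov example above, with 600+ comment links, would trip this check long before the bot wasted a post on it.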
 
Similar to the above post, I would have negative keyword matching in your bot. Also, if you could implement some sort of module that generates common misspellings, common keyboard typos, synonyms, etc., I think that would be very helpful. You might even use Google to your advantage by using Google AdWords to generate the similar terms for you. *sneaky* =) ........ I would also implement an IP-hopping mechanism, such as having your bot parse a proxy list from a site like http://www.samair.ru/proxy/ and rotate your connection on every post, cycling through the list at random.
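One way the keyboard-typo module could work: swap each letter for one of its QWERTY neighbours. The adjacency map below is deliberately abbreviated, not a complete keyboard:

```python
# Abbreviated QWERTY adjacency map; a full map would cover every key.
ADJACENT = {
    "a": "qwsz", "s": "awedxz", "e": "wsdr",
    "o": "iklp", "l": "kop", "n": "bhjm",
}

def keyboard_typos(word):
    """Yield variants of `word` with one letter swapped for a keyboard neighbour."""
    for i, ch in enumerate(word):
        for nb in ADJACENT.get(ch, ""):
            yield word[:i] + nb + word[i + 1:]

print(list(keyboard_typos("seo")))
```

Each output differs from the input by exactly one key-slip, which is what real mistyped searches look like.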
 
Your bot should recognize which sites accept HTML and which accept BBCode, and always post anchored URLs.
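A trivial sketch of emitting the same anchored link in either markup (the function and parameter names here are made up for illustration):

```python
def render_link(url, anchor, markup):
    """Format an anchored link as HTML or BBCode, plain URL as a fallback."""
    if markup == "html":
        return f'<a href="{url}">{anchor}</a>'
    if markup == "bbcode":
        return f"[url={url}]{anchor}[/url]"
    return url  # plain fallback when the site supports neither markup

print(render_link("http://example.com", "widgets", "bbcode"))
# → [url=http://example.com]widgets[/url]
```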
 