Free and open source rank checker

pierz

Hi,

Let me introduce serposcope, a free and open source rank checker. You can install this software on your local machine, on a dedicated server (best), or on shared hosting.

(Screenshots: index, highcharts, table, proxies, options, logs)

This rank checker can check your positions on Google and supports local search, so you can run searches as if from a specific city.

It also supports SOCKS proxies, HTTP proxies, or just multiple network interfaces in case you have a dedicated server with multiple IPs. It can also handle Google captchas (you need a DeathByCaptcha account for this).

There is also an embedded calendar to track your SEO actions and see their direct impact on your rankings.

I released this software a few months ago, so it has been well tested and it's bug free (I hope :D).

Feel free to give feedback or suggest new features here or on the support forum so I can improve the software. Also, if you are a coder, you can push your own modifications to the GitHub repo.
 
Checking it on localhost. What does this mean: "Warning, run should be done from COMMAND LINE or via cron, continue ?"
 

Well, you are launching the run via your web browser; it's not executed the same way as it would be from the command line (starting cmd.exe on Windows or /bin/bash on Linux/Mac and executing php cron.php).

When you execute it from the web browser, the script can be interrupted by the max execution time on some shared hosts that don't allow you to change that setting. But if you are running it on localhost, edited php.ini correctly (see the documentation), and restarted PHP/Apache, then it should not be a problem.
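
For reference, a minimal sketch of both ways to avoid the timeout (the path and the 300-second value are just assumptions, adjust them to your install):

In php.ini, for browser-triggered runs (the PHP CLI has no time limit by default):

    max_execution_time = 300

From a terminal on Linux/Mac (on Windows, open cmd.exe instead), assuming serposcope is installed in /var/www/serposcope:

    cd /var/www/serposcope
    php cron.php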
 
Great share. It's using one website for the proxy check (revo...), is it under your control? Is it possible to check more than 100 results?
 

Yes, I did edit php.ini and it works fine now. You did a great job! One more question: what happens if I add another group of keywords while a run is in progress?
 
Great share. It's using one website for the proxy check (revo...), is it under your control?

Yes, I own revolt.vu.cx, why?

Is it possible to check more than 100 results?

This number is hardcoded, but it's definitely a great idea to be able to check more than 100 results, or to specify the number of results per page you want (currently limited to 10). So I'm going to add an option for this in the next version.
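
Just to illustrate the idea (this is not serposcope's actual code; the variable names and URL construction below are only a sketch): if the results-per-page count became an option, it could be passed to Google through its num URL parameter, something like:

    <?php
    // Illustrative sketch only, not serposcope's code.
    // Build a Google search URL with a configurable number of results per page.
    $keyword        = 'example keyword';  // hypothetical input
    $resultsPerPage = 100;                // would come from an option instead of being hardcoded
    $url = 'http://www.google.com/search?q=' . urlencode($keyword)
         . '&num=' . (int) $resultsPerPage;
    echo $url . PHP_EOL;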
 
Yes, I did edit php.ini and it works fine now. You did a great job! One more question: what happens if I add another group of keywords while a run is in progress?

It should not be a problem; however, the group won't be checked on the current run, so you'll have to launch the job one more time (you can launch a run/job for a specific group only from the group page).
 
Yes, I own revolt.vu.cx, why?

Just curious, do you mind if I use this site to check approx 5k proxies/day?

Although it's a SERP checker, you know best that it's only one step to make it a Google scraper too (it's already scraping Google + complex proxy support), what do you think about it?
 
Just curious, do you mind if I use this site to check approx 5k proxies/day?
Yes, you can use it; it's a dedicated server with nginx tweaked for massive numbers of simultaneous connections. I set it up specifically for proxy checking.

Although it's a SERP checker, you know best that it's only one step to make it a Google scraper too (it's already scraping Google + complex proxy support), what do you think about it?
Yes, I'm already working on a scraper, but I won't integrate it into serposcope, because serposcope's code is designed to be extended only for soft rank checking (take a look at the extend part of the documentation). I will make a separate piece of software for this task.
 
This looks hot... will check this out soon.

It looks very easy to use and gives a good overview.
 
Awesome work, thanks a lot. Taking a look at its functions.
 
Yes, you can use it; it's a dedicated server with nginx tweaked for massive numbers of simultaneous connections. I set it up specifically for proxy checking.
Great, thanks.

Checking proxies against Google would be useful.
 

Serposcope supports unreliable public proxies (even lists where only about 1% of the proxies are valid); however, you have to tweak the serposcope configuration for this. Take a look at "Tweaking configuration for public or private proxies" and at the FAQ, there is probably something interesting on this too.

You can set several proxy lists from URLs in proxy.php; serposcope will fetch these URLs before each run (useful if they are updated daily), and it will automatically eject bad proxies. This is extremely efficient if you have DeathByCaptcha configured.
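
As a rough sketch of what that looks like (the exact option names in your proxy.php may differ, so check the shipped file and the documentation; the URLs below are hypothetical):

    <?php
    // Illustrative sketch only – check proxy.php and the documentation
    // for the real option names used by serposcope.
    // Idea: list URLs of plain-text proxy lists; they get fetched before
    // each run, and dead proxies are ejected automatically.
    $proxyListUrls = array(
        'http://example.com/daily-http-proxies.txt',   // hypothetical URL
        'http://example.org/daily-socks-proxies.txt',  // hypothetical URL
    );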
 