

April 14, 2006


Greg Linden

Hi, Matthew. The data used for that Findory graph does exclude robots. It is hits, but the page view numbers show similar growth in Q1 2006.

Yes, I think the problem is that Alexa is generally unreliable. Alexa stats only come from people who have installed the Alexa toolbar, a tiny sample that is heavily biased.

As you can see from the chart you cite, Alexa data is extremely noisy. The Findory website does not see wild daily fluctuations that appear in that Alexa data. The most likely explanation for those fluctuations is a small and biased sample.

Alexa is also trivially easy to manipulate. There are many techniques and tools available for manipulating Alexa ranking. Simply asking a couple people working at your startup to install the Alexa toolbar seems to be sufficient to yield a large jump in your website's Alexa rank.

Alexa probably is accurate for comparing sites with an order of magnitude difference in traffic, but I would be careful about using it to draw conclusions from small differences.
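The small-sample point can be made concrete. A toolbar panel estimates a site's reach from the fraction of panelists who visit it, and for a rare site the relative error of that estimate is large. A minimal sketch, assuming a hypothetical panel size (Alexa's actual panel size is not public):

```python
import math

def relative_std_error(p, n):
    """Relative standard error of a binomial reach estimate:
    std(p_hat) / p = sqrt(p * (1 - p) / n) / p = sqrt((1 - p) / (p * n))."""
    return math.sqrt((1 - p) / (p * n))

panel = 500_000  # hypothetical toolbar panel size

# A popular site (1% reach) vs. a small site (0.01% reach):
for p in (0.01, 0.0001):
    print(f"true reach {p:.4%}: relative error ~ {relative_std_error(p, panel):.1%}")
```

With these assumed numbers, the popular site's daily reach estimate is good to about ±1.4%, while the small site's swings by about ±14% from sampling noise alone, which is consistent with the wild daily fluctuations seen in the Alexa chart but not in Findory's own logs.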

Max Kalehof

Matt -
Alexa is a great free tool -- it's better than nothing, but it is the furthest thing from a real measurement projection. It's accurate about one thing: people who have installed the Alexa toolbar. Who are those people anyway?

Ron Kass

Alexa is not an accurate tool, but it does reflect real-life changes to some degree.
It is also important to realize that robot traffic is not a constant but a changing factor: usually a rapidly growing one in the early days of a site, and a growing one as the number of pages on a site grows.
It is also important to remember that, unless logging uses image verification (which is often not the case with internal stats), self-analyzed log trends are not accurate and cannot keep out all robot activity, since some of it is unreported (through the user-agent).
It is also true that Alexa fluctuates heavily, but that is true mostly for smaller sites, since, as with any sample, accuracy is higher for high-frequency numbers.
Manipulation of Alexa is possible, but again, only to a degree.

That being said, it is important to realize that Alexa is a statistical projection rather than a true measurement and comparison tool. One should not trust Alexa when estimating traffic pattern changes.
But one can certainly assume longer-term trends are correct.

Things that might be considered when judging Alexa's graph against internal stats:

1. Robot traffic IS growing.
2. Many robots are stealthy (including Google's verification checks for cloaking); this traffic might be mistakenly counted as surfer traffic in internal stats. A suggestion would be to try measuring statistics with image verification (which most robots don't follow).
3. Views per user may grow as a site matures. The average may go from 2.5 pages per session to 3.5, for example. That is a very big change that might change reach greatly in Alexa -- something that doesn't show in internal stats.
4. Your visitors' demographics matter, and so do other characteristics of your visitors. Some populations differ in their likelihood of installing the Alexa toolbar.
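The user-agent point above can be sketched in code. This is a minimal illustration (the bot substrings and log lines are hypothetical, not a real crawler list) of why self-reported log analysis misses stealth robots: it can only exclude crawlers that declare themselves in the User-Agent header.

```python
# Hypothetical substrings that declared crawlers commonly include;
# not an exhaustive or authoritative bot list.
KNOWN_BOT_SUBSTRINGS = ("bot", "crawler", "spider", "slurp")

def is_declared_robot(user_agent: str) -> bool:
    """True only for robots that identify themselves in the User-Agent
    header. A stealth crawler sending a browser-like string passes
    through undetected -- the blind spot this comment describes."""
    ua = user_agent.lower()
    return any(s in ua for s in KNOWN_BOT_SUBSTRINGS)

# Illustrative log entries (made up for the example):
hits = [
    "Mozilla/5.0 (Windows NT 10.0) Firefox/45.0",
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
    "Mozilla/5.0 (compatible; YandexBot/3.0)",
    "Mozilla/5.0 (Macintosh) Safari/601.5",  # could be a stealth crawler
]
human_hits = [ua for ua in hits if not is_declared_robot(ua)]
```

Here `human_hits` keeps the two browser-like entries, including any stealth crawler hiding behind a browser string, which is why image-verification-based counting (which most robots won't trigger) gives a cleaner human-traffic signal.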

John Cass

I would not trust the Alexa results for absolute numbers, but I think the trend might be used as a source of information.

Account Deleted

You can also try Estimix – a free tool that provides a nice summary of website performance. The estimate provided by Estimix is the result of a complex analysis based on factors like the age of the website, the demographic structure of the traffic, the countries where the website is popular, and the sources of the traffic.

