Blogging for Business (and Pleasure) | October 26, 2006
I had the pleasure of speaking at the Blogging for Business seminar yesterday, organised by e-consultancy. My talk was on the History and Culture of Blogging and the slides are now online if you’re interested. I’m going for more minimal slides these days, so they probably won’t mean much unless you were there. However I essentially talked about how blogging has taken off over the last 5 years, what the current ecosystem looks like and how businesses could benefit from joining the conversation.
I’ve given a modified version of this talk a few times, and each time I go and check out the number of blogs indexed by Technorati to give an indication of the size of the blogosphere. The first time I gave the talk in September 2005, Technorati was tracking 16.5 million blogs. Now they are up to 57.4 million! That’s a huge increase in the space of a year. I’m not sure if this is an indication of how fast the blogosphere is growing or just how far Technorati’s reach is extending (a bit of both I imagine), but it’s a pretty amazing figure.
Talking of figures, the highlight of the day for me was a presentation by Heather Hopkins on blogging statistics. Heather made the interesting point that while most people reach blogs through search engines and social media sites, where they go afterwards is much more evenly distributed. So the search engines act as a funnel to blogs, but from there people go to news sites, shopping sites, photo sites and other blogs. Heather pointed out that blogs account for only about 1% of Amazon’s traffic. However she also pointed out that Yahoo! only accounts for about 2%, making blogs a pretty significant source of traffic.
Heather also posted a list of the top 10 UK bloggers as measured by direct traffic to their site (i.e. not including RSS). This was interesting as there was another UK bloggers list released yesterday, this time the top 100 UK bloggers. This second list was created using data from Technorati who use links to measure popularity rather than traffic. The interesting thing is that while there is some correlation between both lists, there is also significant deviation. This brings into question the best metrics to use when judging the authority of a site.
Traffic seems like an obvious choice to me, although with more and more people reading sites through RSS feeds, this will start to get skewed. Sites with a technical focus are likely to have more people subscribed using RSS and hence fare less well on the traffic test. On the other hand, sites that appeal to an audience of bloggers are likely to generate more links just because their audience have the ability to link to them.
When using traffic as a metric you need to be wary of sudden spikes that could seriously skew the overall results. To get round this you probably need to track traffic over several months and then calculate the interquartile mean to get a fair representation. When using links as a metric, you are likely to get even more skewed results, as links are a product of time. Simply put, the longer you’ve been around, the more links you’ll have. As such, if you’re using links as a measurement of popularity, you really need to average them out over the last 6-12 months. That way you know that the sites in question are authoritative now and not five years ago.
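To make the interquartile mean idea concrete, here’s a minimal Python sketch. It sorts the observations, discards the top and bottom quartiles (which is where a one-off traffic spike ends up), and averages the middle half. This simplified version drops whole observations, so it’s only exact when the number of samples divides by four; the sample figures are made up for illustration.

```python
import statistics

def interquartile_mean(values):
    """Mean of the middle 50% of values, discarding the top and
    bottom quartiles to damp the effect of sudden traffic spikes.
    Simplified: drops whole observations, so it is exact only
    when len(values) is divisible by 4."""
    data = sorted(values)
    q = len(data) // 4
    middle = data[q:len(data) - q]
    return statistics.mean(middle)

# Daily visits for a site that had one viral spike (invented figures):
visits = [900, 950, 1000, 1020, 980, 1010, 25000, 990]
print(interquartile_mean(visits))  # the 25,000 spike is discarded
```

A plain mean of those figures would be dragged up to around 4,000 by the single spike; the interquartile mean stays close to the site’s typical daily traffic.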
The best option would be to create an algorithm that measured both links and traffic over time. Mix this with some clever textual analysis, site theming and result clustering and you’d have yourself a pretty cool search engine to rival Technorati or even Google. Any takers?
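As a rough sketch of what such a blended metric might look like: average each signal over the recent months (per the argument above), normalise traffic and links against reference figures so they’re comparable, then take a weighted sum. Every constant here, the reference values, the weights, and the function names, is an illustrative guess, not something any real engine uses.

```python
def normalise(monthly_values, reference):
    """Average the recent monthly figures, then scale against a
    reference value so traffic and link counts become comparable."""
    return sum(monthly_values) / len(monthly_values) / reference

def authority_score(monthly_traffic, monthly_links,
                    traffic_ref=10_000, links_ref=100,
                    w_traffic=0.5, w_links=0.5):
    """Hypothetical blended authority score: weighted sum of
    normalised recent traffic and normalised recent inbound links.
    Reference values and weights are illustrative only."""
    return (w_traffic * normalise(monthly_traffic, traffic_ref)
            + w_links * normalise(monthly_links, links_ref))

# A site with steady traffic and links over the last six months:
print(authority_score([8000] * 6, [120] * 6))
```

The weights let you tune the trade-off discussed above: lean on links and you favour sites with blogger-heavy audiences; lean on traffic and you penalise sites whose readers live in RSS.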
Posted at October 26, 2006 10:51 PM