The new State of the News Media 2006 report is out, and it’s generally useful reading. Perhaps the finding that will get the most attention in the blog-o-sphere, however, is this: Blogs don’t break news. Here is the relevant snippet:
We found little of what would be considered journalistic reporting done by these bloggers, as in examining public documents, conducting interviews, or acting as a direct witness to events. In more than three quarters of all the posts (79%, 88 posts) the highest level of reporting offered was a commentary from the blogger. Just 5% (5 posts in all) involved some original research.
Fair enough, and also damning stuff. But then again, few blogs would claim to break news. This one doesn't do it very often, and that's by design. (Okay, to be fair, when I have had a chance to break news, like when I had the Google analyst day PPTs, I also missed the opportunity.)
But there is a deeper problem with the above analysis: plenty of bloggers do break news — Om Malik does, Rafat does, TechCrunch does, Read/Write does, TheStalwart does, Footnoted does, Jeff Matthews does, etc. — yet they all get lumped into the same bucket. Granted, it isn't always investigative journalism, and news-breaking blogs are a small percentage of the total, but at the very least the authors of the report might have stratified their results, separating pure opinion/link blogs (which are not designed to break news) from ones that occasionally or more frequently break news.
And there is a still more basic problem: the sample in the report is vanishingly small. A total of seven political blogs. That's it. Seven blogs, over one day. How is that in any way representative of anything?
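To put a rough number on that complaint, here is a back-of-the-envelope sketch using only the figures quoted above (88 posts as 79% of the sample) and the standard normal-approximation margin of error for a proportion — my own illustration, not anything the report computed:

```python
# Rough check on the report's sampling. Numbers come from the quoted
# snippet above; the margin-of-error formula is the standard normal
# approximation for a proportion (an illustrative assumption of mine,
# not a method the report describes).
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of an approximate 95% confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# The report says 88 posts were 79% of the sample, so roughly:
total_posts = 88 / 0.79
print(round(total_posts))  # ~111 posts

# Treating those ~111 posts as independent draws gives a modest margin:
print(round(margin_of_error(0.79, 111) * 100, 1))  # about +/- 7.6 points

# But the posts came from just 7 blogs on a single day, so the effective
# sample is closer to 7 clusters. At worst-case p = 0.5:
print(round(margin_of_error(0.5, 7) * 100))  # about +/- 37 points
```

In other words, even on generous assumptions the uncertainty around any "blogs don't break news" percentage is enormous once you account for the fact that all the posts came from seven sites on one day.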
The more legitimate criticism, then, is one directed back at the authors of this report. How are we to take seriously an industry that continues to ignore the basics of sampling? Instead it's the same old, same old: collect a few data points, and then call it a trend. Q.E.D.