
Monthly Archives: September 2006

Wikipedia v Britannica

The Wall Street Journal Online edition presented an interesting exchange this week between Wikipedia founder Jimmy Wales and Britannica’s editor-in-chief, Dale Hoiberg (http://tinyurl.com/jxe33). Since Nature published a study indicating that the accuracy of entries on Wikipedia was comparable to Britannica’s, traditionalists have been quick to find fault with the study or point to clear errors in Wikipedia, but this is no simple argument. The reported discussion certainly paints Wikipedia as the brave new entrant come to break the monopoly on ‘facts’, and some of Hoiberg’s comments are a little defensive, but I think most of us agree we don’t want a mass of inaccurate and biased entries passed off as reliable (we get enough of that on TV).

The Nature study asked experts to judge various entries without knowing from which source they came, and the results indicated an average of 4 errors per entry in Wikipedia to 3 errors per entry in Britannica. The main point is that these rates are so comparable, though without looking at the types of errors found you might be forgiven for wondering why Britannica is so respected if each entry has that many errors. Nature released data showing the types of errors found, and these include judgements of ‘overstatement’ or ‘too short’ as well as lack of clarity, failure to include certain works in a bibliography, and a misspelling of a place name (an ‘e’ instead of an ‘a’). In fact, Nature concluded there were only 8 serious errors reported, and these were equally divided: four each in Wikipedia and Britannica.

A key argument made by Wales is that Wikipedia really builds on the openness principle: anyone can contribute entries or corrections, and the result is likely to be more representative than the invitation-only contributions of Britannica. Since contributors to Wikipedia have to be motivated to take part (have you?), there is certainly potential for mischief, but Wales talks of moving to semi-protected and even editable ‘nonvandalized’ versions to improve quality. And the proof is in the pudding, as they say. He likens publishing entries that are still being edited to Britannica revealing the in-draft versions of new entries, which of course they won’t do (though there’s probably a collector somewhere who would pay for those).

In sum, arguments that authority must be maintained or chaos will ensue in the information world seem to be far less convincing than they once were. Maybe a little authority goes a long way. Stay tuned.

Future of academic libraries symposium

I attended a closed-shop symposium at UT this week on the future of the academic library (http://www.utexas.edu/president/symposium/index.html). The two opening addresses, by James Duderstadt (former President of the University of Michigan) and Clifford Lynch (of CNI), were models of insightful, PowerPoint-free talks that took us through a range of future scenarios (definitely plural!) suggesting major challenges ahead. Duderstadt pointed to the growing need for libraries as learning spaces, not as repositories, and made a case for a world of life-long learners who would engage with universities remotely and repeatedly. While I was a little concerned that dramatic scenarios for a new cyberinfrastructure of open access were presented without clear examples of real human activities for us to consider, the talk certainly raised the collective sights of the attendees. Clifford Lynch noted specifically that the humanities have thoroughly embraced digital technologies, with new research enabled through text mining, remote access to collections and e-publishing, but he argued convincingly that easy predictions of what lies ahead for scholarship in the digital realm are inevitably wrong.

With only 60 attendees present it was easy to engage, and lively discussions were common. I chaired a panel consisting of Dan Connolly (of W3C), Kevin Guthrie (Ithaka) and Alice Prochaska (Yale) on the future of access and preservation, which got the crowd going when Dan stated there was no real preservation problem since 95% of clicks on links produced the desired result, and Kevin argued that access suffers greater impermanence than preservation in the digital realm. Much depends on how you interpret these points, and we spent a good deal of time trying to clarify just what Dan was measuring, but he argued strongly that his figure is not the same as claiming 95% of sites are permanent, and indeed on the web there is good reason why we might want and expect some sites to be very transient. The facts need to be established more clearly here, and there is certainly a study waiting to happen.
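The distinction matters: a click that succeeds today says nothing about whether the same content will still be there next year. Purely as a thought experiment (nothing like this was proposed at the symposium, and the function names below are my own invention), that waiting-to-happen study might take repeated snapshots of a URL sample, recording both reachability and a hash of the returned content, then compare snapshots to separate ‘still reachable’ from ‘still unchanged’. A minimal sketch in Python:

    import hashlib
    import urllib.request
    from datetime import datetime, timezone

    def snapshot(urls, timeout=10):
        # Fetch each URL once, recording whether the request succeeded
        # and a hash of the returned content (hypothetical study harness).
        results = {}
        for url in urls:
            entry = {"checked": datetime.now(timezone.utc).isoformat()}
            try:
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    body = resp.read()
                    entry["status"] = resp.status
                    entry["sha256"] = hashlib.sha256(body).hexdigest()
            except Exception as exc:
                entry["error"] = str(exc)  # unreachable URLs count too
            results[url] = entry
        return results

    def compare(old, new):
        # Separate 'the click worked' from 'the content persisted':
        # a URL can keep answering while its content changes,
        # or vanish entirely between snapshots.
        report = {"reachable": 0, "unchanged": 0, "changed": 0, "gone": 0}
        for url, then in old.items():
            now = new.get(url, {})
            if "sha256" in now:
                report["reachable"] += 1
                same = now["sha256"] == then.get("sha256")
                report["unchanged" if same else "changed"] += 1
            else:
                report["gone"] += 1
        return report

Run against the same sample some months apart, the compare counts would show how much of a headline figure like Dan’s 95% reflects genuinely stable content rather than addresses that merely still answer.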

The final session after 1.5 days was an open discussion which led to some interesting summary statements. While it is clear that no university or publisher has the answers, there is real concern that the world is changing and we are not ready. Personally, I think the missing piece is a better understanding of human behavior, since scholarship, learning, education etc. reside at the human level, not the artifact or collection level. The media will always change, but the human need to communicate, share and engage with data can be understood better and designed for accordingly.

One interesting side-discussion involved the fate of LIS education for this new world of open-access, networked and aggregated personal digital spaces. Jim Neal (Columbia) suggested that the current master’s programs in LIS were not really meeting the needs of academic libraries, and this was interpreted by one attendee from another program as deeming them irrelevant (a charge Jim denied). Oddly, nobody here mentioned ‘crisis’ or a failure to teach cataloging as the problem, but the feeling seemed to be that the futures facing academic libraries will not be shaped by graduates of many current LIS programs. No comment from me required!

Update: audio files (of mixed quality) from the symposium are available at http://www.lib.utexas.edu/symposium/