We’ve written a bit in this blog about discovery, and specifically about Blacklight, which will be implemented as a new interface for IUCAT next summer. One particularly interesting possibility opened up by discovery interfaces like Blacklight is the ability to create customized search views based on a variety of criteria – by format (like music, or film & video), or by location (a specific campus), to name just two.
You can see an example of this in action at UVA – they have both a music view and a video view. In the music search, users can limit results using both the standard facets, like language, and facets specific to this subset of materials, like instrument – as illustrated below.
As you might imagine, with the ability to provide customized search interfaces come many important decisions about what data to index and display, and how best to do so to fully optimize discovery. Our colleagues in the Cook Music Library & the Digital Library Program have been engaging with these questions for some time through the Variations/FRBR project, one deliverable of which – the Scherzo search interface – is powered by Blacklight.
Last week, a subgroup of the Emerging Technologies committee of the Music Library Association released a draft document on Music Discovery Requirements, which can be found at http://musicdiscoveryrequirements.blogspot.com/ – they are inviting public comment on the document through December 5th, and anticipate releasing a second draft early in the new year.
I hope those of you with expertise in this area will review and comment on the draft document; we also invite comments and questions here on the topic of optimizing discovery of music records.