Reflections on Evaluating Digital Projects

It took a little over two months to finally devote a seminar session to the topic of evaluating digital projects. If you consult our workshop schedule, you will notice it was a topic slotted for coverage on the first day we convened as a group, November 19, 2013. It attempted to resurface as part of various other seminar sessions, and finally, on January 14, 2014, we gave it the time it deserved.

The seminar started with a brief introduction to a holistic approach to reviewing digital projects with respect to project planning and peer review. We concluded the seminar portion of the session with an introduction to Digital History, with a shout-out to Nebraska because they are rocking the Digital History. After the overview, we demonstrated two projects focused on institutional histories similar to ours, and then it was time for the rest of us to do what we do best: research, review, and critique digital projects.

Evaluation as Part of Project Planning

Before jumping into most research projects, we often conduct literature reviews, environmental scans, or needs assessments (the latter two especially relevant when seeking grant funding) in order to present a compelling argument, hypothesis, or justification for a given research project.

As we survey the landscape of related digital projects, we will be able to:

  • Distinguish unique components of our collection/project
  • Recognize gaps and ways our digital archive could be strengthened
  • Articulate (innovative) features and functionality, especially in ways that leverage our unique content

But above all, uncovering relevant and intriguing projects serves as a much-needed form of inspiration as we collectively embark on building a digital archive. This inspiration will help us begin connecting the various parts of the process, from collection processing (we are still in the throes of that) to publishing.

We discussed reviewing digital projects based on a fairly simple, but wide-reaching model:

  • Purpose/Goals of Site
    • Teaching, research, etc.
  • Organization
    • Clear navigational paths, intuitive ordering of content, etc.
  • Content
    • Kind of content, presentation of content, etc.
  • Functionality
    • Discovery mechanisms (e.g., browse/search), image viewing (e.g., zooming, annotation), etc.

As we segued into the lab portion of our workshop, which entailed finding and critiquing digital projects according to a set of unruly heuristics (an elaboration of the simple model above), we considered important characteristics for online storytelling:

  • Flow
    • How will our history of the IU Libraries unfold?  Thematically? Chronologically? Both?
  • Narration
    • How do we critically tell our story?  How do we integrate multiple modes of narration: visual, aural, textual?
  • Interaction
    • How do we balance static and dynamic content?  Can we or should we alter views of events as we collect oral histories and anecdotes and juxtapose those stories with documentary evidence?

The Critique Session

After 20-30 minutes of scouring the Web for exemplary or inspirational digital projects, a few of us took the lead in demonstrating and critiquing a particular project of interest.

I was struck by the qualities and characteristics that we seemed to value as a group:

  • Crowdsourcing or user-contributed metadata
  • Robust metadata
  • Scholarly narratives

I was especially tickled (as an author of such content) by the importance given to the customary project information pages, which we, or I, anyway, feel are often overlooked when exploring projects online. As we grapple with eight million things as part of our cross-training initiative, project documentation is certainly one of those things, and an important one, no less. I look forward to learning from my colleagues effective ways to present the often blah-blah project information section for any given digital project I manage as part of my day job, and I certainly look forward to a compelling project information section for our own digital archive.

Finally, I felt that the level of engagement in this seminar session was super awesome and contagious. I look forward to more sessions like these.

Looking Ahead

I must admit that, though I know there is a method to the madness of our 18+ month workshop schedule, it has felt a bit choppy up until now. This session on evaluating digital projects came at a perfect time, despite the many false starts, as we enter the next chunk of our training, which will include exploration of the research lifecycle, an introduction to project management, and metadata creation (a hot topic of interest; see above). We are building blocks, finally!