Back to School: A Week in Numbers

Fall semester began last Monday, August 22, so I thought I’d share some numbers from the website.

Eight days a week – if it’s good enough for the Fab Four, it’s good enough for me.

Monday August 22 – Monday August 29

Overall: 79,836 pageviews

Google Analytics dashboard data - Aug 2016

That dip you see? Saturday. And, you might be able to just make out the top browser: Chrome.

Top 10 pages
  1. Homepage – 16,909 hits (21% traffic)
  2. Fall 2016 Ensemble Information – 4091 (5%)
  3. Music Library – 3210 (4%)
  4. PED (Performing Ensembles Division, Music) – 3066 (3.8%)
  5. A-Z List of Resources – 1710 (2%)
  6. Student Jobs at the Libraries – 1631 (2%)
  7. Herman B Wells Library – 1313 (1.6%)
  8. Hours – 990 (1.2%)
  9. Education Library – 942 (1%)
  10. Business/SPEA Information Commons – 841 (1%)

No other page logged more than one percent of overall hits for the site in that time frame – that’s pretty typical behavior for us. What does that mean? Well, we have a lot of pages, and we have a lot of people entering somewhere other than the home page.
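If you’re curious how those share-of-traffic figures come together, the arithmetic is just a page’s hits divided by total pageviews. Here’s a minimal sketch in Python using a few of the numbers from this post; it’s an illustration rather than a report.

```python
# Illustrative only: reproduce the share-of-traffic percentages above.
# Page names and counts are copied from the table in this post.
total_pageviews = 79_836

top_pages = {
    "Homepage": 16_909,
    "Fall 2016 Ensemble Information": 4_091,
    "Music Library": 3_210,
}

for page, hits in top_pages.items():
    share = hits / total_pageviews * 100
    print(f"{page}: {hits:,} hits ({share:.1f}% of traffic)")
```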

What was our most used resource? Google Scholar, with 330 hits, followed by the New York Times with 316.

About 5,000 sessions came via smartphones – that’s 15% of our overall traffic, or roughly one and a half times our previous average of about 10%. Only 2% of our users reached us via a tablet.
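The same kind of arithmetic works for device share. The session counts below are illustrative placeholders chosen to match the rounded percentages reported above, not exact figures from our Analytics view.

```python
# Illustrative placeholders consistent with the rounded percentages above,
# not exact session counts from our Analytics view.
sessions_by_device = {"desktop": 27_600, "mobile": 5_000, "tablet": 660}
total_sessions = sum(sessions_by_device.values())

for device, sessions in sessions_by_device.items():
    print(f"{device}: {sessions:,} sessions ({sessions / total_sessions:.1%})")
```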

Where did the desktop users click? Have a look!

Heatmap: Indiana University Libraries - Desktop August 2016

Small changes to make a big difference

In her Weave UX article, “Improving the Library Homepage through User Research – without a Total Redesign”, Amy Deschenes writes about the value of making continual, small changes to a library’s website based on patron feedback and the results of user testing. User testing, feedback, site statistics, and heat maps are essential when completely redesigning a site, but website managers can keep testing and analyzing after the redesign launches. That follow-up work shows how patrons actually use the new features and whether those features help them find the information they want. With that understanding, site managers can make small adjustments that help users find information more efficiently, without disorienting them with sweeping changes – often without users even noticing that the site has changed.

Since the Summer 2014 Drupal migration, DRS has been making changes to the new Libraries’ site. A heat map revealed where users clicked on the homepage. Feedback from emails and reference desk questions pointed out links and labels that were useful or needed to change. Google Analytics showed how long users stayed on each page and how they navigated through the site. With this information, we were able to shorten the homepage and prioritize its links so that the useful information in the footer, such as recommended databases and hours, is faster to find.

In December, the ‘Start Your Research’ section in the top left had four subordinate categories to list links by subject. A heat map revealed that the links under the ‘Featured Collections’ and ‘Faculty & Graduate Students’ categories were underused.

The December 2014 homepage shows a large amount of content under the 'Start your Research' section.
The homepage on December 8th, 2014

Therefore, we removed the four categories and trimmed the ‘Start Your Research’ links to those that are most widely used. We also changed the ‘Resources’ category in the navigation bar to ‘Research Resources’ to indicate that subject guides and databases can be found within that category.

The January 2015 homepage shows that the content under 'Start your Research' has been condensed so that users scroll less to find the footer.
The homepage on January 22nd, 2015

By focusing on a few site features, we are able to improve the site’s usability without creating new obstacles for users. As users navigate the updated site, we can use statistics, feedback, and testing to continually improve the site in small ways that are barely noticeable, but helpful.

You say “iPad,” Google Analytics says “iOS”

Given that we’ve held a couple of workshops on using Google Analytics over the past few weeks, I thought I’d share an article about a recent change in how Google Analytics tracks Apple devices – logging is no longer broken out by device (iPhone, iPad, iPod Touch) but is instead lumped under a catch-all category of “iOS.” The change took place on May 29th.

What does this mean for the average Google Analytics user? Be aware that data sets spanning that date (May 1 – May 31, for example) are going to have numbers and displays that look a little funky as traffic “bumps over” from one measure to the other. Going forward, it means we will all have a sense of overall Apple-device traffic to our sites, and if you want to figure out which Apple devices those are, the best way to do that will be to look at screen resolution reporting (logged under Audience -> Technology -> Browser & OS).
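If you’d rather pull that breakdown programmatically than through the dashboard, here’s a rough sketch using the Universal Analytics Reporting API (v4) and its Python client. The view ID and service-account key file are placeholders, and the particular dimensions chosen are my own assumptions; adjust to taste.

```python
# Sketch: iOS sessions broken out by screen resolution via the (legacy)
# Universal Analytics Reporting API v4. The view ID and key file below are
# placeholders, not real values from our site.
from google.oauth2 import service_account
from googleapiclient.discovery import build

KEY_FILE = "service-account.json"  # placeholder credential file
VIEW_ID = "12345678"               # placeholder Analytics view ID

credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/analytics.readonly"]
)
analytics = build("analyticsreporting", "v4", credentials=credentials)

response = analytics.reports().batchGet(
    body={
        "reportRequests": [
            {
                "viewId": VIEW_ID,
                "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
                "metrics": [{"expression": "ga:sessions"}],
                "dimensions": [{"name": "ga:screenResolution"}],
                # Limit the report to Apple's catch-all "iOS" operating system
                "dimensionFilterClauses": [
                    {
                        "filters": [
                            {
                                "dimensionName": "ga:operatingSystem",
                                "operator": "EXACT",
                                "expressions": ["iOS"],
                            }
                        ]
                    }
                ],
            }
        ]
    }
).execute()

for row in response["reports"][0]["data"].get("rows", []):
    resolution = row["dimensions"][0]
    sessions = row["metrics"][0]["values"][0]
    print(f"{resolution}: {sessions} sessions")
```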

AND – if you’re really geeking out on Google Analytics right now, you might take a gander at these two case studies on how PBS and the San Francisco Museum of Modern Art have been using Google Analytics. We’re always interested to see discussions of GA usage specific to nonprofits.

Thanks, Anne, for the tips on the articles.

Wrap-up: Google Analytics webinar series

We certainly enjoyed the recent webinar series on Google Analytics, Library Analytics: Inspiring Positive Action through Web User Data (an ALA TechSource webinar/workshop), and we hope that you did too. If you missed the sessions the first time around, we do have access to the archives, so give us a yell if you’d like to see them.

We also wanted to collect some information here, for easy access. Enjoy!

Session 1: The Basics of Turning Numbers into Action
Continuing the Conversation: ALA TechSource blog post with slides, additional resource links, and content

Session 2: How Libraries Analyze and Act
Continuing the Conversation: ALA TechSource blog post with slides, additional resource links, and content

The presenters provided the following list of recommended readings:
Wikipedia Entry: Web Analytics
“About Us” Page, Web Analytics Association
Measuring Website Usage with Google Analytics, Part I
Measuring Website Usage (from http://coi.gov.uk/guidance.php?page=229)
Library Analytics (Part 1)

Arendt, Julie and Wagner, Cassie. 2010. “Beyond Description: Converting Web Site Usage Statistics into Concrete Site Improvement Ideas”, Journal of Web Librarianship, 4:1, 37–54
Black, Elizabeth L. 2009. “Web Analytics: A Picture of the Academic Library Web Site User”, Journal of Web Librarianship, 3:1, 3–14
Waisberg, Daniel and Kaushik, Avinash. 2009. “Web Analytics 2.0: Empowering Customer Centricity”, SEMJ.org, Volume 2, Issue 1

You may also be interested in this recent interview with the presenters, “Paul Signorelli and Char Booth Discuss the Role of Web Analytics in the Library.”

Google Analytics and the Library

As an undergraduate and, more recently, a graduate student, I have noticed that many students no longer want to go to the library to conduct research. In fact, when students do go to the library, they are often there to meet with a study group or rehearse a presentation because of the availability of study rooms. Many resources are in digital format and only require a computer with an internet connection for access. This means that more and more students are using the library’s website to find information and conduct their own research.

One of my main projects as a graduate assistant for the Digital User Experience has been working with Google Analytics to understand what information it can provide to increase usability and accessibility of Indiana University-Bloomington’s library website.  The first few months were spent mining data and exploring the wide range of information that Analytics can provide.

On a monthly basis, I have been mining four separate statistics about page usage: the number of visits a page receives, the average time spent on the page, the percentage of visitors who leave the site from that page (exit %), and the percentage of visitors who enter and exit the site on that same page without viewing anything else (bounce rate). They are presented in charts that can be graphed by month, week, or day. This is incredibly useful for noticing trends in a page’s usage and can provide great information about the page. For example, a page may see little use overall but have several days where usage spikes. This could mean the page is being used in instruction, such as a seminar, which would account for many people visiting it on the same day.
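As a rough illustration of that monthly pull, here is what the processing might look like once the page-level report has been exported to CSV. The file name and column labels are assumptions based on the four statistics listed above, not an exact match for any particular Google Analytics export.

```python
# Sketch of the monthly page-level pull, assuming the report has been
# exported from Google Analytics to CSV. File name and column labels are
# assumptions based on the four statistics described in this post.
import pandas as pd

df = pd.read_csv("page-stats-monthly.csv")  # placeholder export file

# The four statistics tracked each month for every page
columns = ["Page", "Pageviews", "Avg. Time on Page", "Exit %", "Bounce Rate"]
report = df[columns].sort_values("Pageviews", ascending=False)

# Flag pages that draw visits but hold visitors only briefly: candidates
# for a closer look at how their information is displayed
short_stays = report[(report["Pageviews"] > 500) & (report["Avg. Time on Page"] < 30)]
print(short_stays.head(10))
```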

These statistics can provide great feedback on how a page is being used. A page full of information should have a higher “average time on page” than other pages. If it is low, there may be a problem with how the information is displayed: is it too difficult to read, is it presented in a logical order, and so on. By knowing the date a page was last modified, one can also see on the graph whether those modifications have had an impact on its use (whether that means more visits or people spending more time on the page).
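To make that before-and-after comparison concrete, here is a small sketch that splits a daily export at a page’s modification date and compares the average time on page on either side. The file name, column labels, and date are all placeholders.

```python
# Sketch of the before-and-after check described above: split a daily export
# at the page's modification date and compare average time on page.
# File name, column labels, and the date are placeholders.
import pandas as pd

daily = pd.read_csv("page-daily-stats.csv", parse_dates=["Date"])
modified_on = pd.Timestamp("2013-03-15")  # placeholder: date the page was last modified

before = daily.loc[daily["Date"] < modified_on, "Avg. Time on Page"].mean()
after = daily.loc[daily["Date"] >= modified_on, "Avg. Time on Page"].mean()

print(f"Average time on page before the edit: {before:.0f} seconds")
print(f"Average time on page after the edit:  {after:.0f} seconds")
```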

Another tool that I’ve found useful is In-Page Analytics. This allows you to view any of your tracked pages with an overlay of information about visits, average time, and so on. Through this overlay you can see which links are used the most (ranked by percentage of total page clicks, along with the raw number of clicks) and get a more visual sense of the navigational flow of the site. The page can be browsed in the same view a visitor sees, but with information layered on top about how users pass through the site. If an information page has a low “average time on page” statistic, you could open that page with the overlay and see which links people are clicking to leave it. Maybe they don’t readily see the information they are seeking and think another page may be of more use.

Google Analytics is also great for providing information about the demographics of the site’s visitors. It allows you to see where your users are from, right down to the number of users per town or city. It can provide information on how people are viewing the site: operating system, browser, screen resolution, screen colors, and what version of Java or Flash they have. This can help optimize page design for the specifications of the site’s actual visitors.

Mining the data is fairly easy; the difficulty is knowing what questions to ask of the data and what it can answer. Analytics provides a huge array of information, but that information is useless without a way to interpret it. We have been in the process of looking more closely at the data for subsets of the library (services and departments) and other libraries. With these groups we have been asking cursory questions such as: what trends are occurring, is this page still needed, who uses these pages, are people delving deep into the site’s hierarchy or only looking at top-level information, and is this page being properly utilized? This has helped us understand how to use the data, but I think there is still much more to be gained from the information Google Analytics can provide.