Guest post by Selena Killick, co-author of the forthcoming book Putting Library Assessment Data to Work
With the move to increasingly online provision within the education sector, and by association academic libraries, we have seen an explosion in big data sets. The initial COUNTER-compliant statistics containing electronic journal title downloads seem quite quaint now, as the granularity and scale of data harvesting have continued to expand. It is not uncommon for an academic library to hold usage data for its electronic resources at item level for individual customers. The University of Huddersfield's Library Impact Data Project was the first exploration of how library analytics could be used to identify the impact the library was having on student success. The researchers successfully identified a correlation between library usage and student success (Stone & Ramsden 2013). Coincidentally, this research was being replicated in Australia and the USA at the same time, all with similar results (Cox & Jantti 2012; Soria et al. 2013). Five years on, where are we now? And, if data really is the new oil, where should we drill next…?
The Ethical Debate
Before we look at where we could drill next, the question of whether we should drill at all needs to be considered. The library profession is divided on this issue, with strong views on both sides. Quantitative data is not the only source of information, and it should not be used in isolation to evaluate library performance. That being said, it is a source of insight available to us, and we should consider carefully if and how we use it. Personally, I think that if we are able to use library analytics anonymously, ethically, transparently, legally and with the goal of improving learner success, we should exploit the data to benefit our students. I know some will disagree with me, and I'm happy to debate the subject. As data analytics becomes an increasingly embedded part of the academic institutional infrastructure, libraries need to identify the role they will play in this arena or risk becoming obsolete and ultimately redundant.
The Next Drilling Expedition
Until now, library analytics research has focussed on student satisfaction, library usage and student success. The Jisc Library Data Labs project has worked with librarians to combine and visualise various data sets. SCONUL Statistics and National Student Survey (NSS) scores have been combined to see if students who are the most satisfied with library provision are studying at the universities with the largest library budgets (Baylis & Burke 2017). LibQUAL+ satisfaction scores have been combined with Association of College and Research Libraries (ACRL) statistics to see if there is a correlation between satisfaction, usage and expenditure in academic libraries (Hunter & Perret 2011).
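In the spirit of the Data Labs work, a sketch of joining two institution-level data sets and testing for a relationship might look like the following; the institutions, spend figures and satisfaction scores are invented for illustration, not SCONUL or NSS data.

```python
import pandas as pd

# Illustrative figures only: per-institution library spend (SCONUL-style)
# and satisfaction with the library (NSS-style, % agree). Invented data.
spend = pd.DataFrame({
    "institution": ["A", "B", "C", "D"],
    "library_spend_per_student": [310, 180, 420, 250],
})
satisfaction = pd.DataFrame({
    "institution": ["A", "B", "C", "D"],
    "nss_library_satisfaction": [88, 79, 91, 84],
})

# Join the two data sets on institution, then test the relationship.
combined = spend.merge(satisfaction, on="institution")
r = combined["library_spend_per_student"].corr(
    combined["nss_library_satisfaction"]
)
print(f"r = {r:.2f}")
```

A correlation at this aggregate level says nothing about causation, of course, which is precisely why the published studies treat such results with caution.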
As an increasing number of library services move online, the ability to harvest and analyse user data in the areas of enquiry handling and information literacy training is growing. Libraries are starting to use customer relationship management systems to manage enquiries received at help points (Killick 2017). Webchat transcripts between customers and librarians can be analysed to identify common enquiries with a view to improving the customer experience (Mungin 2017). Within the area of information literacy training provision, live online tuition and webinars are now being used by academic libraries. Delivery tools such as Adobe Connect can provide the library with data at the individual student level for live attendance and subsequent video views. With regard to data drilling, services are the new content, and it is only a matter of time before someone breaks ground.
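As a toy illustration of the transcript-mining idea, the sketch below counts keyword matches against a hand-made category list to surface common enquiry themes. The transcripts, categories and keywords are all hypothetical; real services would use a proper taxonomy and more robust text analysis.

```python
from collections import Counter

# Hypothetical webchat excerpts; invented for illustration.
transcripts = [
    "Hi, how do I renew my books online?",
    "I can't access the e-journal from off campus.",
    "Where can I find past exam papers?",
    "My book renewal failed, can you help?",
    "Off-campus access to databases isn't working.",
]

# An illustrative mini-taxonomy of enquiry themes and trigger keywords.
categories = {
    "renewals": ["renew", "renewal"],
    "e-access": ["access", "e-journal", "database"],
    "exam papers": ["exam"],
}

# Tally each transcript against every theme it mentions.
counts = Counter()
for text in transcripts:
    lowered = text.lower()
    for category, keywords in categories.items():
        if any(k in lowered for k in keywords):
            counts[category] += 1

# Most common enquiry themes, largest first.
for category, n in counts.most_common():
    print(category, n)
```

Even a crude tally like this can point a service team at the FAQ entries or training sessions most worth improving.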
Within the wider academic sphere the field of learning analytics has emerged, using big data to understand the characteristics of successful students with a view to optimising the learning environment (Rienties et al. 2017). As an important stakeholder in the learning environment, libraries are considering their role in supporting the learning analytics agenda (Oakleaf et al. 2017). The Library Integration in Institutional Learning Analytics (LIILA) Project is currently reviewing how libraries can support learning analytics. This one-year Institute of Museum and Library Services National Forum grant is working with a variety of international stakeholders, including librarians, system vendors and policy makers. The project team hopes to reach a position where libraries are culturally ready and technically able to engage in this arena. This is our opportunity to shape the use of library data in this field, ensuring its use is anonymous, legal, ethical and transparent, with the goal of improving learner success. If we fail to engage in the debate, I suspect our publishers will bypass the library and pipe their usage data to the learning analytics community directly.
Selena Killick is co-author of the forthcoming book Putting Library Assessment Data to Work alongside Frankie Wilson. She has presented, published, and provided consultancy services to academic libraries internationally on library assessment for over 15 years. She is currently an editorial board member of the International Conference on Performance Measurement in Libraries. In 2003 she was part of the team that introduced LibQUAL+ to the UK in partnership with the Association of Research Libraries. Previously she has worked on the SCONUL Value & Impact Measurement Programme (VAMP) and the SCONUL Statistics e-measures pilot.
You can follow Selena on Twitter @selenakillick
Baylis, L. & Burke, S., 2017. Insights from Jisc & HESA Analytics Labs: An Agile, cross-institutional approach. In 12th International Conference on Performance Measurement in Libraries. Oxford.
Cox, B.L. & Jantti, M., 2012. Capturing business intelligence required for targeted marketing, demonstrating value, and driving process improvement. Library and Information Science Research, 34(4), pp.308–316.
Hunter, B. & Perret, R., 2011. Can Money Buy Happiness? A Statistical Analysis of Predictors for User Satisfaction. The Journal of Academic Librarianship, 37(5), pp.402–408.
Killick, S., 2017. Exploiting customer relationship management analytics to improve the student experience. In 12th International Conference on Performance Measurement in Libraries. Oxford.
Mungin, M., 2017. Stats Don’t Tell the Whole Story: Using Qualitative Data Analysis of Chat Reference Transcripts to Assess and Improve Services. Journal of Library and Information Services in Distance Learning, 11(1–2), pp.25–36.
Oakleaf, M. et al., 2017. Academic Libraries & Institutional Learning Analytics: One Path to Integration. Journal of Academic Librarianship, 43(5), pp.454–461.
Rienties, B. et al., 2017. A review of ten years of implementation and research in aligning learning design with learning analytics at the Open University UK. Interaction Design and Architecture(s) Journal, 33, pp.134–154.
Soria, M.K., Fransen, J. & Nackerud, S., 2013. Library Use and Undergraduate Student Outcomes: New Evidence for Students’ Retention and Academic Success. portal: Libraries and the Academy, 13, pp.147–164.
Stone, G. & Ramsden, B., 2013. Library Impact Data Project: looking for the link between library usage and student attainment. College and Research Libraries, 74(6), pp.546–559.