
The Role of Libraries in Assessment of Student Learning

By Beth Ten Have, Director Library Academic Partnerships, Drexel University Libraries

Assessment in academic libraries has traditionally focused on collecting measurements of community engagement with the Libraries’ services and staff. We measure the number of books checked out, articles downloaded, reference consultations held, and instructional activities hosted. Over time, we’ve added context to these counts to increase our understanding of engagement. We now analyze the subject distribution of checked-out books, track the journals getting the most and least downloads, classify reference consultations by depth of subject knowledge, and track instructional activities by discipline and curriculum. For example, a look at our Reference Desk data for the last few years showed that over 75% of the questions being asked were about routine services and were very similar to the questions being asked at the Circulation Desk. In response, we launched the Library Assistance Service at the Circulation Desks, removed Reference Desks, and reformulated Reference Consultations to focus on in-depth subject and information needs that require the expertise of a librarian.

What we don't learn from these measures is the impact of the Libraries' collections and instructional activities on teaching, learning, and research at Drexel. Are students better consumers of information if they have had a reference consultation? Are students' information-seeking behaviors more explorative and creative because they participated in a scaffolded sequence of instructional activities embedded in a first-year writing course? Do students with strong academic records show a propensity for regular interactions with librarians?

Anecdotally, based on what we see in the Libraries and on feedback from faculty, we do believe that engagement with the Libraries makes for a better, more information-savvy Drexel graduate. However, anecdotes don't suffice in the assessment-driven environment of higher education, so we have a number of initiatives underway to help us understand the Libraries’ influence on teaching, learning, and research at Drexel.

Last spring, the Libraries conducted a small pilot study of the iSkills assessment. Developed by Educational Testing Service (ETS), iSkills is a 75-minute proctored exam in which students work through computer simulations. The test measures how well a student uses technology and information to solve problems. Results from the iSkills pilot emphasized the value of performance-based assessment, but we are not convinced that iSkills is the right tool for us. We are experimenting with in-class assessments that use basic cell phone texting to “check in” with students in the course of an instructional activity. We have also partnered with the First-Year Writing program and Freshman Engineering Design to develop a series of activities and assessments in Drexel Learn. Delivering content through the University’s learning management system provides new methods for analyzing student engagement. As the Libraries expands its partnerships with faculty to design assignments and courses that foster better information literacy in Drexel students, we’ll also explore the impact of these instructional activities on student learning.

Later this academic year, we’ll be introducing a new Manager of Learning Engagement. This managerial position has responsibility for the Libraries’ program of information literacy instruction, of which assessment is a key piece. The addition of this position will strengthen our current assessment work as we seek to better understand the impact of our efforts.
