Friday, August 15, 2008

Library Assessment Conference 08 – Overview

This overview ended up being a synthesis of what I learned. The conference Web site includes presenters' PowerPoints where available: http://libraryassessment.org/

Assessment defined
Since WMU currently defines assessment as mainly about student learning outcomes, it is hard for libraries to come up with appropriate assessments, as we usually do not teach semester-long courses. If that were all there was to assessment, then three quarters of this conference would have been irrelevant, and even LibQual would not be considered an assessment tool. (One contentious commentator called it just an evaluation tool.) Assessment is a complex concept, and the goal is a Culture of Assessment, where people automatically assess to improve what they do. One person said that instead of having a Culture of Complaint, turn it into a Culture of Assessment; even better, someone called it a Culture of Improvement. I understand assessment in libraries as a way of gathering data on our collections, services, and space, and on the needs and activities of our users, in order to improve the learning, teaching, and research experience of the whole Western Michigan University community.

Organizational culture

  • Assessment goes hand in hand with strategic planning
    • “No action without a plan, no plan without data” – Rick Luce
  • Most research libraries have an assessment person, sometimes combined with marketing or communications, sometimes just someone with a designated 15% of their time spent on assessment
  • Most research libraries have an assessment committee, as do we, though it has not been very active and has focused mostly on LibQual
  • Employee satisfaction and input
    • Employees are users too and good judges of quality
    • Employee satisfaction important in good service
    • ClimateQual, a tool developed by UofMD to measure staff perceptions about their libraries
    • Interview or survey individual employees
    • Focus groups and retreats
  • Gather data in a timely manner and pass it to appropriate people for implementation
  • One place for data, so all can access it – Penn has DataFarm
  • Important to present data visually – see Penn Library Facts http://metrics.library.upenn.edu/FACTS07.pdf (a small charting sketch follows this list)
  • Master Blog Communication System
    • One blog per group (committee, task force, project)
    • Put up agendas before meetings
    • Minutes of meetings right after meetings
    • Allows comments
    • People can set up alerts
  • What will the scholar in 2050 expect us to have saved? (Question from Betsy Wilson, UW)
  • Need frameworks and models that reflect our values (Stephen Town, Univ. of York, UK)
  • Rick Luce had many good guidelines on being a successful organization
  • It is great if you can work with consultants – one option is that Jim Self (UofVA), Steve Hiller (UofWA), and Martha Kyrillidou (ARL) will come to libraries and help them set up better assessment programs (known as the Jim, Steve and Martha show)
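
Not from any presentation, but to make the data-presentation point concrete: here is a minimal sketch, in Python, of turning a handful of annual statistics into a one-page chart in the spirit of the Penn facts sheet. All figures and labels are made up for illustration.

    # Minimal sketch: chart a few (hypothetical) annual library statistics.
    import matplotlib.pyplot as plt

    stats = {
        "Reference questions": 48210,
        "Instruction sessions": 612,
        "Gate count (thousands)": 1350,
        "ILL requests": 22480,
    }

    fig, ax = plt.subplots(figsize=(7, 4))
    ax.barh(list(stats.keys()), list(stats.values()))
    ax.set_xlabel("FY2008 totals")
    ax.set_title("Library facts at a glance")
    fig.tight_layout()
    fig.savefig("library_facts.png", dpi=150)

Even a single page of simple charts like this carries more meaning at a glance than the same numbers in a table.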

Assessment of Place

  • Mine LibQual comments to inform planning of space
  • Space consultants can be useful
  • Surveys – ask users what they do in the library, how often they come, how long they stay, and when
  • Observation – visit different areas of the libraries and record how users use the space: alone or in groups (2-3, 4+), what furniture they are using, whether they have personal or library materials, and whether they are on their own laptop or a library computer (a small recording sketch follows this list)
  • Focus groups to get at details of issues found by other methods
  • Involve staff in findings and planning
  • Identify affordable changes
  • U of Chicago repeated their wayfinding study – they give novice library users 3 books to find and follow them around to see if they do find them – after the first study they redid signage and maps
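
To make the observation rounds easier to analyze later, here is a minimal sketch, in Python, of recording each observed user or group as a row in a CSV file. The field names and category values are my own assumptions, not a standard instrument.

    # Minimal sketch: log one space observation per row in a CSV file.
    # Field names and categories are assumptions for illustration.
    import csv
    from datetime import datetime

    FIELDS = ["timestamp", "zone", "group_size", "furniture",
              "materials", "computer"]

    def record_observation(path, zone, group_size, furniture,
                           materials, computer):
        """Append one observed user (or group) to the tally file."""
        with open(path, "a", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=FIELDS)
            if f.tell() == 0:  # brand-new file: write the header row first
                writer.writeheader()
            writer.writerow({
                "timestamp": datetime.now().isoformat(timespec="minutes"),
                "zone": zone,              # e.g. "2nd floor quiet"
                "group_size": group_size,  # "alone", "2-3", "4+"
                "furniture": furniture,    # "carrel", "table", "soft seating"
                "materials": materials,    # "personal", "library", "both"
                "computer": computer,      # "laptop", "library PC", "none"
            })

    # Example: one sweep of a (hypothetical) second-floor quiet area
    record_observation("observations.csv", "2nd floor quiet",
                       "alone", "carrel", "personal", "laptop")

A plain spreadsheet works just as well; the point is to settle the categories before the first sweep so counts are comparable across observers.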

Information Literacy / Instruction

  • The Educational Testing Service (ETS) test is now called iSkills
    • Sounds like quite a few institutions are using this, but some of the preliminary results did not sound promising
  • SAILS mentioned briefly
  • Megan Oakleaf (Syracuse) and Lisa Hinchliffe (U of IL Urbana-Champaign) did a study of 437 instruction librarians and asked if they assess their instruction sessions, if they have data, and if they have used the data (228 respondents actually use the data). Some reasons for not doing this, besides lack of time and resources:
    • Questions about whether the results actually measured IL
    • Lack of knowledge and skills
    • No centralized support or commitment to gather this data
    • Lack of a conceptual framework
  • One school developed a self-assessment tool
    • Worked closely with teaching faculty
    • Based on ACRL standards
    • Students reflect on own research and learning process
  • Some are moving away from instruction on demand as they develop Info Lit programs within the curriculum with faculty
  • When Info Lit is embedded in the general education program, e-portfolio systems (like our iWebfolio) can keep track of papers over the years

Reference

  • READ (Reference Effort Assessment Data) Scale – with little effort gives insight into reference work (a logging sketch follows this list)
    • Six point scale given to each reference question answered
    • 1: typical directional question that takes less than a minute
    • 6: working with PhD student or faculty over hours or days
    • 2–5 in between – have to train and calibrate across those who answer questions
    • Keep track of questions on and off the desk
    • Include in person (also WRAP), phone, e-mail, chat
    • Good for scheduling staff
    • Can be used in online system like Desk Tracker
  • Cornell did a systematic quantitative and qualitative analysis of its reference questions
    • Transaction type, duration, mode, question content, date, time
    • As a result they have closed the reference desk during summer hours in the undergrad study library
    • They thought of reference work as research assistant, information central & problem solver
  • University of Pennsylvania reference consultation form
    • Even more elaborate than at Cornell, but worth looking at for ideas
  • Heard a few instances of reduced reference collections
    • Idea – let’s keep track of what is used in our reference collection (even by us), probably by call number, so we can start weeding (see the tally sketch after this list)
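
For the READ Scale above, here is a minimal sketch, in Python, of what logging each transaction with its score might look like. The function, file format, and validation are my own assumptions; in practice an online system like Desk Tracker would handle this.

    # Minimal sketch: tag each reference transaction with a READ score (1-6).
    import csv
    from datetime import datetime

    MODES = {"in person", "phone", "e-mail", "chat"}

    def log_transaction(path, read_score, mode, on_desk=True, note=""):
        """Append one reference transaction with its READ score."""
        if not 1 <= read_score <= 6:
            raise ValueError("READ scale runs from 1 (directional) "
                             "to 6 (extended research consultation)")
        if mode not in MODES:
            raise ValueError("mode must be one of %s" % sorted(MODES))
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow([
                datetime.now().isoformat(timespec="minutes"),
                read_score, mode,
                "on desk" if on_desk else "off desk", note,
            ])

    # Example: a quick known-item lookup at the desk
    log_transaction("read_log.csv", 2, "in person",
                    note="find journal by title")

The calibration step matters more than the tooling: everyone answering questions has to agree on what a 3 versus a 4 looks like before the counts mean anything.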
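And for the weeding idea just above, a sketch of tallying recorded uses by Library of Congress class letters, assuming we jot one call number per observed use into a plain text file. The file name and format are hypothetical.

    # Minimal sketch: count reference-collection uses per LC class letters
    # (e.g. "QA", "Z") so little-used ranges stand out as weeding candidates.
    import re
    from collections import Counter

    def tally_by_class(path):
        """Count recorded uses per leading LC class letters."""
        counts = Counter()
        with open(path) as f:
            for line in f:
                m = re.match(r"\s*([A-Z]{1,3})", line)
                if m:
                    counts[m.group(1)] += 1
        return counts

    # Example: least-used classes first
    for cls, n in sorted(tally_by_class("ref_use.txt").items(),
                         key=lambda kv: kv[1]):
        print("%s: %d uses" % (cls, n))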

Other Assessment Tools and Methods

  • Univ. of Rochester anthropological studies
    • Learn from Rochester, but we need to find out how our own students function
    • Ask students to map out where they go during a typical day and when
    • From their studies, as we well know, everyone has a cell phone, so we should make our phone number(s) more prominent – on the home page and in the stacks

2 comments:

  1. Interesting, Maira. I like the blog post format for conference reports. I may have to try it.
    Obviously we have a lot of work to do on assessment. I think in addition to the surveys and evaluation instruments we need to be better about assessing actual user behavior somehow. I'm thinking, for instance, of those services that track TV viewing – when you ask people what they watch, they sometimes don't tell the truth, but when the boxes go on the TV it turns out people are watching more crap and less PBS. So I'm wondering, when we survey people on LibQual or after a BI session, do they really tell us the truth about what they've learned and what they like or don't like about us?

  2. I think one of the methods I really liked was observing students in the library – how they really use our space and resources.
