Monday, August 18, 2008

Library Assessment Conference (LAC)

Library Assessment Conference (LAC), August 4-6, 2008, Seattle, Washington. Sponsored by the Association of Research Libraries, the University of Washington Libraries, and the University of Virginia Library.

Mike Crumpton and I (Kathy Crowe) attended this conference and also presented “Using Evidence for Space Planning,” which discussed the in-house survey, observational studies, and focus group research we’ve been doing for the past year. It was an excellent conference in both content and organization. They provided great parties and good food, too! I encourage you to visit the web site, which has the PowerPoint presentations.

This is the 2nd LAC; the first was held at the University of Virginia in 2006. The proceedings for that conference are in our collection and also available as an e-book. The organizers included Jim Self and Steve Hiller, who visited us last fall. Our wonderful colleague at Wake Forest, Wanda Brown, has done a fine job of summarizing the plenary sessions on the WF blog, and her summary has been posted on the Library Assessment blog as well, so there’s no need to reinvent the wheel.

Mike & I attended different concurrent sessions except for a couple, so I’ll summarize the ones I went to and he can do the same. There were good poster sessions as well, and the abstracts for those are on the conference web site.

Information Literacy

iSkills at Cal State and U of Central Florida

iSkills is the info lit test developed by ETS. They’ve used it at UCF and across the entire Cal State system. I was particularly interested in the fact that at UCF they used information literacy for the SACS Quality Enhancement Plan (QEP). I’d love to do that here. At Cal State San Marcos they used iSkills with their Gen Ed assessment.

Qualitative Research

Personas and a User-centered Visioning Process / Cornell

They used anthropological methods for their web redesign. Personas were composite sketches of their target user groups. They did 36 interviews and developed 10 personas to assess users’ research patterns.

Patterns of Culture: Re-aligning Library Culture to Meet User Needs

With support from a Mellon Foundation grant, they used ethnographic methods to interview faculty and students and also did observational studies to learn how they obtain information, do research, and prepare for teaching. They also asked students to keep photographic diaries of their study habits. The research helped them develop a plan to align library resources and services more closely with user needs. They also produced a research model for other libraries to use.

Mixing Methods, Bridging Gaps: An Ethnographic Approach to Understanding Students

The presenter is an anthropologist. His study examined how doctoral students do research and use libraries. He found they like tried-and-true resources (e.g., Web of Science). They are not expert researchers: they’re unaware of many software programs such as EndNote, and they don’t use librarians much unless the librarians are embedded.

Data into Outcomes

What if we Don’t Provide the Computers? Assessment for Reduction

At the undergrad library, they received funding to develop a pilot that reduced the number of PCs and created an area for laptops. They did an in-house survey and focus groups: they asked students to write what they’d want in a laptop room on Post-it notes and put them on a wall, and they also gave them blank floor plans to fill in. They learned that environment was very important, and so were peripherals (mice, keyboards). They reduced the number of PCs from 100 to 30 and designed a much more attractive area. I think it’s just starting this fall, so I don’t know how successful it will be.

Reference

Using the READ Scale (Reference Effort Assessment Data): Qualitative Statistics for Meaningful Reference Assessment

Librarians from several institutions that have used READ reported on their methods and applications. READ is a 6-point scale: librarians assign each question a number based on its level, where 1 is directional and 6 is a very in-depth consultation. The scale has been tested at several institutions, and more will be invited to use it. I think it looks like a fairly easy way to assess the level of questions and help determine staffing patterns. There’s more info at this web site and in the ARL Reference Assessment SPEC Kit (#268); I have not yet looked at either of these documents. SUNY Albany has used it with DeskTracker.
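To make the scale concrete, here is a minimal sketch of how a desk log scored with READ might be tallied to inform staffing decisions. The field names are hypothetical, and only levels 1 and 6 are labeled from the session description; the intermediate labels are placeholders, not the official READ definitions.

```python
# Minimal sketch of tallying READ scores from a reference desk log.
# Field names are hypothetical; 1 = directional, 6 = in-depth consultation.
from collections import Counter

READ_LABELS = {
    1: "directional",
    2: "level 2",
    3: "level 3",
    4: "level 4",
    5: "level 5",
    6: "in-depth consultation",
}

def tally_read_scores(transactions):
    """Count how many logged questions fall at each READ level (1-6)."""
    counts = Counter(t["read_score"] for t in transactions)
    return {level: counts.get(level, 0) for level in range(1, 7)}

# Hypothetical desk log: one dict per reference transaction.
log = [
    {"desk": "reference", "read_score": 1},
    {"desk": "reference", "read_score": 3},
    {"desk": "reference", "read_score": 6},
]

for level, n in tally_read_scores(log).items():
    print(f"READ {level} ({READ_LABELS[level]}): {n} questions")
```

A tally like this could show, for instance, whether most questions at a given desk are low-level directional ones that student staff could handle.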

Systematic Quantitative and Qualitative Reference Transaction Assessment: An Approach for Service Improvements / Cornell

The presenters described a reference transaction tracking system they developed at Cornell. They collaborated with Computer Science students to create the Reference Statistics Reporting System. In the first year the dataset included 70,000 transactions from 24 service points which were analyzed quantitatively and qualitatively. This system sounded rather complex to me.

Information Competence Assessment Using First-Year and Upper-Division Writing Samples / Cal State Channel Islands (it IS in California, not England!)

They tested iSkills along with the rest of Cal State (see above) and didn’t feel it worked for them; it did not really show the value of the library in the system. Cal State gives info lit grants (wouldn’t that be wonderful?), so they developed rubrics, along with the Composition Program, to assess information skills. They applied the rubrics to research projects, which they felt placed value on the students’ actual output.

Library Instruction Made Easy: Practical Tips to Get You Started with Little Training, Money or Time / U of Nebraska Omaha

They worked with faculty to develop pre- and post-instruction assessment instruments, which they administered via Blackboard. Sounds a lot like what Amy & Lynda have done! I’m thinking we might be able to tie this into WEAVE as well.

Assessment Plans

Mike & I both attended this session since we want to develop a plan here. Four institutions presented on how they developed their plans. Their plans are on this web site.

The above summary covers Monday through Wednesday. On Thursday I attended two half-day workshops. The first was Getting Started with Learning Outcomes Assessment: Purposes, Practical Options and Impact with Megan Oakleaf of the Syracuse Library School. Megan covered several different methods for assessing information literacy, including standardized tests (SAILS, iSkills, ILT), pre- and post-tests, and applying rubrics to an assignment such as a worksheet or paper. I could tell she was primarily pushing the last method. I got a lot of good handouts that I need to read more thoroughly. This workshop was a bit rushed. Fortunately, I’ve been accepted to Info Lit Assessment Immersion in December, and Megan will be a faculty member there, so I’m looking forward to hearing more from her.

Mike & I both attended a 2nd half-day workshop, Successfully Implementing the Balanced Scorecard, presented by Jim Self and Donna Tolson from UVA. The BSC comes from the business world and emphasizes developing measurable goals, or metrics, from perspectives such as finance, customer service, and internal process. Indicators are then identified for each goal, along with methods and targets. An example of this process:

Goal in the User perspective: Develop high-quality collections that reflect the needs of the Library’s users and support the University’s mission.

Metric: Circulation of new monographs

Target 1: 60% of all newly cataloged print monographs should circulate within two years.

Target 2: 50% should circulate within two years. (In UVA’s scorecard, Target 1 represents full success and Target 2 a minimally acceptable result.)

Method: A program will extract data from the SIRSI records documenting circulation of print monographs over a 2-year cycle. Only items circulated to users (not binding, etc.) will be counted.
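For illustration only, here is a minimal sketch of the arithmetic behind these targets. This is not UVA’s actual extraction program; the record format and field names are my own assumptions.

```python
# Sketch of the two-year circulation metric (not UVA's actual program).
# Assumes a simple export of newly cataloged print monographs, one dict
# per item, with hypothetical field names.
from datetime import date

def pct_circulated_within(items, years=2):
    """Percentage of items first checked out by a user within `years`
    of cataloging. Items never checked out count as misses."""
    window_days = years * 365
    hits = 0
    for item in items:
        first_checkout = item.get("first_user_checkout")  # None if never circulated
        if first_checkout is not None:
            if (first_checkout - item["cataloged"]).days <= window_days:
                hits += 1
    return 100.0 * hits / len(items) if items else 0.0

records = [
    {"cataloged": date(2006, 9, 1), "first_user_checkout": date(2007, 3, 15)},
    {"cataloged": date(2006, 9, 1), "first_user_checkout": None},
    {"cataloged": date(2006, 10, 5), "first_user_checkout": date(2008, 11, 1)},  # outside the window
]

print(f"{pct_circulated_within(records):.0f}% circulated within two years")
# Compare the result against the 60% (Target 1) and 50% (Target 2) thresholds.
```

Note that non-user charges (binding, etc.) would have to be filtered out before this calculation, per the method above.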
