
For many libraries, it's summertime and there's an opportunity to devote attention to longer-term projects. In this post I want to talk about making plans for information literacy assessment.

As you think about your information literacy program you may have questions like these:

  • What can I tell my faculty colleagues about information literacy outcomes on our campus? I want to have focused conversations with them that lead to common priorities and collaborations.
  • What information literacy data can we contribute to our institution's accreditation self study?
  • How can we demonstrate the value of the library to our campus administrators?
  • At what point are students capable of critically assessing the information they encounter?
  • How does student information literacy differ at lower and upper division levels?
  • Are there tools that will help us know whether we are meeting our institutional learning outcome goals for information literacy?
  • How can I guide my students in gaining a deeper understanding of their IL strengths and weaknesses? Can I guide their exploration of what information literacy is and why they need it, as well as get feedback about where they can improve?

Carrick Enterprises offers a suite of valid and reliable information literacy assessments to help answer these questions and achieve these goals. Supported by a team of information literacy and measurement experts, these assessment tools produce valuable insights that librarians are using to inform their information literacy efforts. Whether it's identifying areas for growth, looking for evidence of improvement over the course of a student's college career, or bringing formalized assessment to accreditation efforts, the Carrick Enterprises assessments deliver what you need with pricing that respects your budget.

...continue reading "Planning to Plan: InfoLit Assessment Projects"

By Nicole Eva, Rumi Graham, and Sandra Cowan
University of Lethbridge, Lethbridge, Alberta, Canada

Have you ever wondered what the benefits and drawbacks of using a SAILS Cohort Test vs a Build-Your-Own Test (BYOT) might be? Wonder no more – we’re here to share our experiences in using both kinds of tests to help you make an informed decision that fits your own institution’s needs and purposes.


When we first looked into using a SAILS information literacy (IL) test in 2015, the only one available to us as a Canadian institution was the International Cohort Test. We had hoped to gather reliable, objective data on the IL levels of first-year undergrads before and after librarian-created instruction. Our key questions were: What level of IL do incoming first-year students possess? And is there a significant improvement in students' IL abilities after receiving IL instruction? Our aim was to explore potential answers by looking for possible correlations between students' IL attainment levels and their year of study, as well as the amount or format of IL instruction. ...continue reading "Guest Post: SAILS Cohort Test vs BYOT, A Canadian Perspective"

Liz Kavanaugh, Information Literacy and Assessment Librarian at Misericordia University

Liz Kavanaugh is a founding member of the Advisory Board of the Threshold Achievement Test for Information Literacy. A long-time user of the Project SAILS information literacy assessment tool and an advocate for effective assessment, Liz was the perfect match for the fledgling project to create a new tool based on the ACRL Framework.

In this interview, you will see how Liz's commitment to assessment and to information literacy is woven throughout her professional life.

Question: What do you like about your job?

Liz: I am very fortunate to be in the position of Information Literacy and Assessment Librarian at Misericordia University in Dallas, Pennsylvania. When I took the position about five years ago, we were just heading into an accreditation year with the Middle States Commission on Higher Education (MSCHE). It was an exciting time that launched me right into the thick of gathering data, writing reports, and meeting with stakeholders across campus. I really loved the active sense of how important assessment was at that time, and I love how it has grown into a more full-fledged body of data for the library today. Much of that data comes from our long-term use of SAILS, and now we're on the road to our 2024 review, which brings the excitement full circle.

Q: Please tell us about a project you are currently working on. What are you trying to accomplish? ...continue reading "Meet the TATIL Advisory Board: Liz Kavanaugh"

At Carrick Enterprises, we talk with librarians about their information literacy goals and their need for assessments that provide specific, immediate, and actionable results. Our customers have questions like these:

  • What information literacy data can we contribute to our institution's accreditation self study?
  • How can we demonstrate the value of the library to our campus administrators?
  • What role do dispositions have in information literacy? How can I understand my students' information literacy dispositions and encourage them?
  • At what point are students capable of critically assessing the information they encounter?
  • How does student information literacy differ at lower and upper division levels?
  • I want a tool that helps us know whether we are meeting our institutional learning outcome goals for information literacy.
  • I would like to guide my students in gaining a deeper understanding of their IL strengths and weaknesses. At the beginning of our IL course, I want them to explore what information literacy is and why they need it, as well as get feedback about where they can improve.
  • What can I tell my faculty colleagues about information literacy outcomes on our campus? I want to have focused conversations with them that lead to common priorities and collaborations.

...continue reading "Get Ready for Fall 2018: Planning for Information Literacy Assessment"

Carolyn Caffrey Gardner, Information Literacy Coordinator at Cal State Dominguez Hills in Carson, California, USA

It can be a challenge to navigate accrediting bodies and their expectations for information literacy instruction and assessment. This is a snapshot of how folks at one campus tackled the self-study for WSCUC accreditation, including some takeaways that may help you on your own accreditation journey.

I joined California State University Dominguez Hills in May of 2016, in the midst of an accreditation preparation frenzy. As the new information literacy coordinator, I jumped right into the ongoing process of preparing for reaccreditation, which had started years in advance. In Fall 2015, as we geared up for our 2018 site visit, our campus created Core Task Forces, each charged with analyzing a WSCUC core competency on our campus. These competencies are expected of every graduating student and include Information Literacy (IL).

Led by Library Dean Stephanie Brasley, the IL Task Force began with extensive discussions about how information literacy is defined and where these skills are being taught on our campus. The committee was made up of a diverse cross-section of faculty and administrators, each with a different understanding of what information literacy is and how we can measure competency. While I wasn't yet on campus for these discussions, the committee minutes and other documentation describe the task force's adoption of the ACRL Framework definition of information literacy and the recommendation that we distribute that definition widely.

The IL Task Force then began identifying where IL competencies were taught on our campus. Ultimately, the task force felt that retroactive assessment of assignments not intended to teach or measure information literacy outcomes wouldn't provide an authentic understanding of our students' learning. For that reason, it opted not to conduct a one-time assessment project, such as applying an existing rubric (e.g., AAC&U) to collected student work, and instead chose to draw on existing evidence. The committee recruited students to participate in IL testing using Project SAILS, used existing NSSE data (from the general questions, not the information literacy module add-on), and explored program-level student learning outcomes assessment data. ...continue reading "CSU Dominguez Hills and the WASC Senior College and University Commission"