
By Nicole Eva, Rumi Graham, and Sandra Cowan
University of Lethbridge, Lethbridge, Alberta, Canada

Have you ever wondered what the benefits and drawbacks of using a SAILS Cohort Test vs a Build-Your-Own Test (BYOT) might be? Wonder no more – we’re here to share our experiences in using both kinds of tests to help you make an informed decision that fits your own institution’s needs and purposes.


When we first looked into using a SAILS information literacy (IL) test in 2015, the only one available to us as a Canadian institution was the International Cohort Test. We had hoped to gather reliable, objective data on the IL levels of first-year undergrads before and after librarian-created instruction. Our key questions were: What level of IL do incoming first-year students possess? And is there a significant improvement in students’ IL abilities after receiving IL instruction? Our aim was to explore potential answers by looking for possible correlations between students’ IL attainment levels and their year of study as well as the amount or format of IL instruction. ...continue reading "Guest Post: SAILS Cohort Test vs BYOT, A Canadian Perspective"

Liz Kavanaugh, Information Literacy and Assessment Librarian at Misericordia University

Liz Kavanaugh is a founding member of the Advisory Board of the Threshold Achievement Test for Information Literacy. A long-time user of the Project SAILS information literacy assessment tool and an advocate for effective assessment, Liz was the perfect match for the fledgling project to create a new tool based on the ACRL Framework.

In this interview, you will see how Liz's commitment to assessment and to information literacy is woven throughout her professional life.

Question: What do you like about your job?

Liz: I am very fortunate to be in the position of Information Literacy and Assessment Librarian at Misericordia University in Dallas, Pennsylvania. When I took the position about five years ago, we were just heading into an accreditation year with the Middle States Commission on Higher Education (MSCHE). It was an exciting time that launched me right into the thick of gathering data, writing reports, and meeting with stakeholders across campus. I really loved the active sense of how important assessment was at that time, and I love how it has grown into a more full-fledged body of data for the library today. Much of it is based on information gathered through our long-term use of SAILS, and now we’re on the road to our 2024 review, which brings the excitement full circle.

Q: Please tell us about a project you are currently working on. What are you trying to accomplish? ...continue reading "Meet the TATIL Advisory Board: Liz Kavanaugh"

At Carrick Enterprises, we talk with librarians about their information literacy goals and their need for assessments that provide specific, immediate, and actionable results. Our customers have questions like these:

  • What information literacy data can we contribute to our institution's accreditation self study?
  • How can we demonstrate the value of the library to our campus administrators?
  • What role do dispositions have in information literacy? How can I understand my students' information literacy dispositions and encourage them?
  • At what point are students capable of critically assessing the information they encounter?
  • How does student information literacy differ at lower and upper division levels?
  • I want a tool that helps us know whether we are meeting our institutional learning outcome goals for information literacy.
  • I would like to guide my students in gaining a deeper understanding of their IL strengths and weaknesses. At the beginning of our IL course, I want them to explore what information literacy is and why they need it, as well as get feedback about where they can improve.
  • What can I tell my faculty colleagues about information literacy outcomes on our campus? I want to have focused conversations with them that lead to common priorities and collaborations.

...continue reading "Get Ready for Fall 2018: Planning for Information Literacy Assessment"

Carolyn Caffrey Gardner, Information Literacy Coordinator at Cal State Dominguez Hills in Carson, California, USA

It can be a challenge to navigate accrediting bodies and their expectations for information literacy instruction and assessment. This is a snapshot of how folks at one campus tackled the self-study for WSCUC accreditation, including some takeaways that may help you on your own accreditation journey.

I joined California State University Dominguez Hills in May of 2016, in the midst of an accreditation preparation frenzy. As the new information literacy coordinator, I jumped right into the ongoing process of preparing for reaccreditation, which had started years in advance.

In Fall 2015, as we geared up for our 2018 site visit, our campus created Core Task Forces. Each core task force was charged with analyzing a WSCUC core competency on our campus. These competencies are expected of every graduating student and include Information Literacy (IL). Led by Library Dean Stephanie Brasley, the IL Task Force began with extensive discussions about how information literacy is defined and where we can identify these skills being taught on our campus. The committee was made up of a diverse cross-section of faculty and administrators, each with different understandings of what information literacy is and how we can measure competency. While I wasn’t yet on campus for these discussions, the committee minutes and other documentation describe the task force’s adoption of the ACRL Framework definition of information literacy and the recommendation that we distribute that definition widely.

The IL Task Force then began identifying where IL competencies were taught on our campus. Ultimately, the task force felt that retroactive assessment of assignments not intended to teach or measure information literacy outcomes wouldn’t provide an authentic understanding of our students’ learning. For those reasons, they opted not to conduct a one-time assessment project, such as applying an existing rubric (e.g., AAC&U) to collect student work, and instead opted to find existing evidence. The committee recruited students to participate in IL testing using Project SAILS, used existing NSSE data (from the general questions and not the information literacy module add-on), and explored program-level student learning outcomes assessment data.
...continue reading "CSU Dominguez Hills and the WASC Senior College and University Commission"

Sign reading Good Cheap Fast
Credit: cea+, www.flickr.com/photos/centralasian/4534292595, CC BY 2.0

It can be a challenge to decide which SAILS or TATIL test is the best one for your needs. Here I will take a few minutes to explain why we offer so many test options and how to determine which one is right for you.

The construct of information literacy is very broad. If you think about it as a light spectrum, it includes everything from infrared to ultraviolet. Many important concepts such as authority, intellectual property, search strategies, scholarship, and research are included. There is a lot to cover if you are going to assess your students’ information literacy capabilities. In order to make testing of these concepts manageable, we have grouped them in various ways.

Project SAILS has eight skill sets that we developed using the ACRL Information Literacy Competency Standards for Higher Education as a source for our learning objectives. There are 162 test questions across the eight skill sets. The skill sets allow for in-depth scoring.

Threshold Achievement Test for Information Literacy (TATIL) has four modules. Using the ACRL Framework for Information Literacy as a guide, our advisory board created performance indicators for the entire IL construct that we then combined into modules. There are a total of 101 test questions across the four modules. These modules allow for in-depth scoring.

We think it's important to make tests that can be administered in a standard class hour. This means we cannot ask a student to answer every SAILS question or every TATIL question. Instead students answer a subset of the full test question bank.
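To make the idea of subsetting concrete, here is a minimal sketch of drawing a timed subset from a question bank. Everything in it is hypothetical: the function name, the skill-set labels, and the per-question time estimate are illustrative assumptions, not the actual SAILS or TATIL selection algorithm.

```python
import random

def draw_timed_subset(bank, seconds_per_question, time_limit_seconds, rng=None):
    """Draw a random subset of questions, capped so the estimated
    testing time fits within the limit.

    `bank` maps skill-set names to lists of question IDs.
    All names here are illustrative, not the real implementation.
    """
    rng = rng or random.Random()
    max_questions = time_limit_seconds // seconds_per_question
    # Shuffle within each skill set so each administration gets a fresh draw.
    pools = {skill: rng.sample(qs, len(qs)) for skill, qs in bank.items()}
    subset = []
    while len(subset) < max_questions and any(pools.values()):
        # Round-robin across skill sets keeps the subset balanced.
        for skill, qs in pools.items():
            if qs and len(subset) < max_questions:
                subset.append((skill, qs.pop()))
    return subset

# Example: a toy bank with two skill sets and a 50-minute class hour,
# assuming roughly one minute per question.
bank = {
    "Developing a Research Strategy": [f"DRS-{i}" for i in range(20)],
    "Evaluating Sources": [f"ES-{i}" for i in range(20)],
}
subset = draw_timed_subset(bank, seconds_per_question=60,
                           time_limit_seconds=50 * 60)
print(len(subset))
```

The round-robin draw is one simple way to keep a shortened test balanced across skill sets; a real instrument would also weigh item difficulty and coverage of learning objectives.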

We would also like to be able to give each student an individual score when possible. For many institutions, receiving individual student scores is necessary in order to achieve their goals. Having individual scores also means we can generate a custom report for each student highlighting their strengths and making recommendations.

So far I have touched on three aspects of information literacy testing, which we call Breadth, Depth, and Individualization. Breadth indicates how much of the IL construct is covered, from partial to complete. Depth indicates how granular the reporting is, from shallow to deep. And Individualization indicates whether an individual student receives a score.

When having someone do a job for you, the old saying goes: Good, cheap, fast -- pick two. When deciding on a testing option you have a similar choice: Breadth, Depth, Individualization -- pick two. Here’s why:

...continue reading "SAILS and TATIL: Why Are There So Many Test Options?"