
Carolyn Caffrey Gardner
Carolyn Caffrey Gardner, Information Literacy Coordinator at Cal State Dominguez Hills in Carson, California, USA

It can be a challenge to navigate accrediting bodies and their expectations for information literacy instruction and assessment. This is a snapshot of how folks at one campus tackled the self-study for WSCUC accreditation, including some takeaways that may help you on your own accreditation journey.

I joined California State University Dominguez Hills in May 2016, in the midst of an accreditation preparation frenzy. As the new information literacy coordinator, I jumped right into the ongoing process of preparing for reaccreditation, which had started years in advance. In Fall 2015, as we geared up for our 2018 site visit, our campus created Core Task Forces, each charged with analyzing a WSCUC core competency on our campus. These competencies are expected of every graduating student and include Information Literacy (IL).

Led by Library Dean Stephanie Brasley, the IL Task Force began with extensive discussions about how information literacy is defined and where those skills are taught on our campus. The committee was made up of a diverse cross-section of faculty and administrators, each with a different understanding of what information literacy is and how competency can be measured. While I wasn't yet on campus for these discussions, the committee minutes and other documentation describe the task force's adoption of the ACRL Framework definition of information literacy and its recommendation that we distribute that definition widely.

The IL Task Force then began identifying where IL competencies were taught on our campus. Ultimately, the task force felt that retroactive assessment of assignments not intended to teach or measure information literacy outcomes wouldn't provide an authentic understanding of our students' learning. For that reason, it opted not to conduct a one-time assessment project, such as applying an existing rubric (e.g., AAC&U) to collected student work, and instead chose to find existing evidence. The committee recruited students to participate in IL testing using Project SAILS, used existing NSSE data (from the general questions, not the information literacy module add-on), and explored program-level student learning outcomes assessment data. ...continue reading "CSU Dominguez Hills and the WASC Senior College and University Commission"

Lyda Fontes McCartin
Lyda Fontes McCartin, Professor, Head of Information Literacy & Undergraduate Support, University of Northern Colorado, Greeley, Colorado, USA

In 2014, my library Curriculum Committee started work on developing new student learning outcomes for our 100-level LIB courses. We teach five distinct credit courses; four are 100-level courses and one is a 200-level course. The learning outcomes had not been revisited in years, and we had added new courses since they were written. With the debut of the Framework, we took the opportunity to update our learning outcomes, and it was at this time that we began considering all of our 100-level courses as one "program." An overview of the process we used to create the outcomes is provided in a C&RL News article titled "Be critical, but be flexible: Using the Framework to facilitate student learning outcome development." The 100-level student learning outcomes are:

  1. Students will be able to develop a research process
  2. Students will be able to implement effective search strategies
  3. Students will be able to evaluate information
  4. Students will be able to develop an argument supported by evidence

Since 2015, I've been guiding the library Curriculum Committee through the creation of signature assignments to assess our credit courses so that we can look at student learning across 100-level sections. A signature assignment is a course-embedded assignment, activity, project, or exam that is collaboratively created by faculty to collect evidence for a specific learning outcome. Most of the time you hear about signature assignments in relation to program-level assessment, but they can also be used to assess at the course level, and they are especially useful if you want to assess a course that has many sections taught by multiple instructors (hint: this model can be used for one-shot instruction as well).

I like signature assignments because ...continue reading "Assessing Credit Courses with Signature Assignments"

This semester I provided two workshops for the part-time librarians I work with, who do most of the teaching in our one-shot library/research instruction program. Although I see them every day, it's rare that we carve out time to meet as a group, and getting together often means some librarians coming in on their time off. But we get so much out of sharing our experiences with each other that we're all willing to give a little extra to make it work. At these meetings I had a chance to facilitate discussion about the Framework, which might seem a little late since it was first adopted nearly three years ago. But it was good timing for us: we recently got support from our college administrators to purchase the Credo InfoLit Modules, and they're helping us think about the scope of our instruction in new ways.

In particular, we've been thinking about how to reach students beyond our one-shots. The information literacy lessons from Credo are one way to reach students before or after we see them in the library. With a little coordination between the librarian and the professor who's requesting instruction, students can be introduced to concepts like the value of information or the role of iteration in planning a search strategy before coming to the library. Or they can get step-by-step, self-paced practice with MLA citations to follow up on our in-class discussions about how they should expect to use various types of sources in their analysis or argument.

...continue reading "Resources for One-Shots"

Sometime around 1996 I attended a conference on communication studies. I was working on a master's degree in Comm Studies, and this was my first conference in an area outside of librarianship. I was happy to discover a presentation on research related to libraries, specifically the nonverbal behaviors of reference librarians. As the researcher described her findings and quoted from student statements about their interactions with librarians, I experienced a range of emotions. Interest and pride soon gave way to embarrassment and frustration. The way I remember it now, there was a host of examples of poor interactions. "The librarian looked at me like I was from Mars," that sort of thing. Most memorable to me was a comment/question from an audience member: "Librarians need to fix this. What are they going to do about it?" It was as though this study had uncovered a heretofore invisible problem that we should urgently address. (Did I mention feeling defensive, too?) I didn't dispute the findings. What I struggled with was the sense that the people in the room thought that we librarians didn't already know about the importance of effective communication and weren't working on it. Was there room for improvement? For sure! But it wasn't news to us.

I thought about that presentation again recently after viewing a webinar by Lisa Hinchliffe about her research project, Predictable Misunderstandings in Information Literacy: Anticipating Student Misconceptions To Improve Instruction. Using data from a survey of librarians who provide information literacy instruction to first-year students, Lisa and her team provisionally identified nine misconceptions that lead to errors in information literacy practice. For example, first-year students "believe ...continue reading "We're Working On It: Taking Pride in Continuous Instructional Improvement"

The cornerstone of the Threshold Achievement Test for Information Literacy is the set of outcomes and performance indicators we wrote, inspired by the ACRL Framework for Information Literacy for Higher Education.

Working with members of our Advisory Board, we first defined the information literacy skills, knowledge, dispositions, and misconceptions that students commonly demonstrate at key points in their education: entering college, completing their lower-division or general education requirements, and preparing for graduation. These definitions laid the groundwork for analyzing the knowledge practices and dispositions in the Framework in order to define the core components that would become the focus of the test. Once we decided to combine frames into four test modules, the performance indicators guided item writing for each module. Further structural analysis of the Framework dispositions led us to identify and define information literacy dispositions for each module.

...continue reading "From Framework to Outcomes to Performance Indicators, Plus Dispositions!"