
Library Instruction Assessment Improvement Strategy for SACS: A Case Study

This post is based on a presentation that Kory Paulus gave at the North Carolina Community College Library Association Conference in March of this year.

Kory Paulus, Reference and Instruction Librarian, Wingate University

Wingate University recently had its Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) accreditation review. During the review, the reviewers suggested we strengthen our assessments. In response, WU moved Mitch Cottenoir into the position of Institutional Effectiveness and SACSCOC Liaison. Mitch approached the library and asked how we would like to improve assessment. As a library, we determined that most of our direct interactions with students come through classes taught by the reference and instruction librarians, Isaac Meadows and me, Kory Paulus. So began our adventure into updating our assessment for instruction and information literacy.

As a first step, knowing only a little about assessment, I checked out our library’s collection of assessment books. I am a librarian, after all, and love to research! After considering several books, I chose to begin my reading with Classroom Assessment Techniques for Librarians, written by Melissa Bowles-Terry and Cassandra Kvenild and published by the Association of College and Research Libraries. This book gave us important guidance for our assessment.

We began the process of redesigning our instruction survey by creating learning outcomes, which I knew were essential to the assessment process. The most important thing I know about learning outcomes comes from another librarian, Amy Burns, who always says, “Think about what you want them to be able to do and how you know they can do it.” The learning outcome is “what you want them to be able to do,” and the assessment is “how you know they can do it.”

Classroom Assessment Techniques for Librarians gives many examples of learning outcomes. I entered them in a Google Doc and identified the ones we thought we were already addressing. From there we edited the learning outcomes to fit our institution’s mission and goals, using sample verbs from Bloom’s taxonomy as shown in the book. Once Mitch approved them, we began to create lesson plans based on the learning outcomes. Early in the lesson planning we realized that two of the learning outcomes were almost identical, so I combined them into one.

We were fortunate to have many lesson plans already created because I make a lesson plan for every class I teach, and we used my template to create additional lessons. Once we had lesson plans and learning outcomes, we chose or wrote questions that would allow students to show us they could complete a task. The idea behind the questions was to gather solid data showing that library instruction classes are beneficial for our students. Classroom Assessment Techniques for Librarians advises that the questions used for pre-testing be different from those used for post-testing, because students anticipate what the questions will be and will be looking for the answers; giving students the same questions can lead to the false conclusion that they learned something new.

The process of creating pre- and post-test questions was challenging. In particular, we had some difficulty designing Boolean operator questions that were not too leading. In the end we used or adapted questions from other sources and created several of our own. We were able to try out the questions with approximately 25 student assistants, who are a great resource and a nice sample set. We are still concerned about the Boolean questions because of the results we received. We believe the NOT questions may have some kind of issue, because students apparently unlearned NOT between their pre- and post-assessments. Either that, or they guessed correctly on the pre-test and did not really know how to apply the operator.
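For readers who want a quick refresher on what we were testing, the sketch below (not one of our actual assessment questions) models AND, OR, and NOT as set operations on two invented result lists. It is the behavior of NOT, removing records rather than adding them, that seems to trip students up.

```python
# A minimal sketch modeling Boolean search operators as Python set
# operations. The "result sets" and titles are invented for illustration
# and are not drawn from our assessment data.

cat_results = {"Cat Behavior", "Cats and Dogs", "Feline Nutrition"}
dog_results = {"Dog Training", "Cats and Dogs", "Canine Health"}

print(cat_results & dog_results)  # cats AND dogs: only records matching both terms
print(cat_results | dog_results)  # cats OR dogs: records matching either term
print(cat_results - dog_results)  # cats NOT dogs: cat records with dog matches removed
```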

Charts of our preliminary results are below. This set is based on our freshman Biology 150 class. We were able to include these classes in the pilot because we have a good working relationship with the instructors. It is a unique sample in that we see every BIO 150 Lab class taught at Wingate University.

As you can see here, students improved on their OR and AND skills, but their post-assessment scores for NOT were lower than their pre-assessment scores. The second chart shows the average increase in use of Boolean operators, including NOT: a four percent increase in knowledge for the BIO 150 students. The third chart shows the increase with NOT excluded, and there knowledge increased by 11%! An improvement like that is a wonderful accomplishment for us. We are looking forward to new data as we conclude the spring semester and will be able to add research paper grades for some of the students.
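For anyone who wants to run a similar comparison, the sketch below shows one straightforward way to compute an average increase between pre- and post-assessments, with and without the NOT questions included. The per-question correct rates in the code are placeholders, not our BIO 150 results, and the exact formula behind our charts may differ.

```python
# One way to summarize pre/post assessment results: average the change in
# the share of correct answers across a chosen set of questions.
# All rates below are placeholders, not actual BIO 150 data.

pre_correct  = {"AND": 0.60, "OR": 0.55, "NOT": 0.50}  # share correct before instruction
post_correct = {"AND": 0.72, "OR": 0.70, "NOT": 0.45}  # share correct after instruction

def average_increase(questions):
    """Average percentage-point change across the listed questions."""
    changes = [post_correct[q] - pre_correct[q] for q in questions]
    return 100 * sum(changes) / len(changes)

print(f"All operators: {average_increase(['AND', 'OR', 'NOT']):.1f} points")
print(f"Without NOT:   {average_increase(['AND', 'OR']):.1f} points")
```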

In addition, we want students to understand that Fetch!, our discovery service, is a great resource. On occasion, however, it is more effective to go straight to a specific database, because the results will be more accurate, with fewer extraneous hits. This is a big-picture idea, but the ideal learning outcome is simply for students to identify that Fetch! is a search engine and that we also have science-specific databases they can search.

The pre-assessment, as shown in the chart above (red), clearly illustrates the students’ preference for searching in Fetch!. In the post-assessment (orange), the students demonstrated that they had built on their skills and expanded their understanding of how to identify specific scientific databases.

Initially, we also created some rubrics to help grade some of the assessments. We were hoping our student assistants could help with the grading, but we believe we may need to do the grading ourselves. We also quickly realized that our rubrics were misaligned with the number of words we expected students to identify. We plan to edit and update the rubrics after analyzing all of the data so that we have an accurate picture for next semester.

This entire process has reinforced what we originally thought, but now we have definitive assessment data to use as a baseline against which to analyze future data. Additionally, faculty can include the data in their own student assessment models and use it toward their SACS accreditation requirements. Assessment is never a one-time process; it is something that will continue to be refined and updated. With these new initiatives, our assessment process is now in line with SACS criteria.