By Nicole Eva, Rumi Graham, and Sandra Cowan
University of Lethbridge, Lethbridge, Alberta, Canada

Have you ever wondered what the benefits and drawbacks of using a SAILS Cohort Test vs a Build-Your-Own Test (BYOT) might be? Wonder no more – we’re here to share our experiences in using both kinds of tests to help you make an informed decision that fits your own institution’s needs and purposes.

When we first looked into using a SAILS information literacy (IL) test in 2015, the only one available to us as a Canadian institution was the International Cohort Test. We had hoped to gather reliable, objective data on the IL levels of first-year undergraduates before and after librarian-created instruction. Our key questions were: What level of IL do incoming first-year students possess? And do students’ IL abilities significantly improve after receiving IL instruction? Our aim was to explore potential answers by looking for possible correlations between students’ IL attainment levels and their year of study, as well as the amount or format of IL instruction they received.

...continue reading "Guest Post: SAILS Cohort Test vs BYOT, A Canadian Perspective"

Download sample student report

This semester Carolyn Radcliff and I had the opportunity to discuss the test and the students’ results reports with our own classes or with students in our colleagues’ classes. You can see an example of a student’s personalized results report by clicking the thumbnail to the right. These reports are currently available for the field-testing versions of modules 1 and 2 and will be available for the field-testing versions of modules 3 and 4 in 2017.

Students’ Responses to their Personalized Results

Our conversations with students gave us a new perspective on the test. As with any test results, some students were disappointed by their scores and others disagreed with the evaluation of their performance, but overall students found value in the reports. Here is a sample of students’ reflective responses:

  • I felt most engaged when the results said that I ‘have the habit of challenging (my) own assumptions.’ That’s something I definitely do and I was surprised that the test was able to detect that.
  • I was most surprised that the report said that I defer to particular kinds of authority a bit more than others; I will be sure to keep the recommendations in mind.
  • It was surprising that I wasn’t as proficient as I thought but I felt most engaged by the results when I learned that most college students are also at my level.
  • It was surprising that the results reminded me to seek out additional perspectives and not only ones that support my claim or topic.
  • The chart of my score was interesting.
  • I felt most engaged at the beginning [of the results report] when they analyzed my results directly by using [the pronoun] ‘you.’
  • The test was beneficial by making me think about the use of different sources.
  • Nothing was surprising, but I did agree with the recommendations to strengthen my writing/reading abilities, which I found very helpful.

Students appreciate receiving their results immediately. In one class where we had promised students their results, an error on my part during test set-up delayed their reports; students expressed disappointment and were relieved to learn that they would still get their personalized reports later. Nevertheless, we know that not every testing situation is intended to provide direct feedback to students, so the student reports are an optional feature that you can turn on or off each time you set up the test.

...continue reading "December Update: How Students Experience the Test"