
By Lauri J. Vaughan
Library Director at The Harker School in San Jose, California, USA
We know how lucky we are at The Harker School, an independent, PK-12, college preparatory school in San Jose. We enjoy an oasis of library programming and teaching featuring five full-time librarians, two part-time librarians, and me, the library director. My team spends hundreds of hours teaching at all levels, in all disciplines, infusing information literacy into lessons and units collaboratively designed by subject-area experts and librarians.
We have a sense that our work puts our students ahead of the curve, especially in California, where the ratio of school librarians to students has been dismal for many years. We see our students’ success in classrooms. We hear about it from alumni. But we also perceive weaknesses. When a test came along to quantify our students’ skills, like any good library team, we did our research. The Threshold Achievement Test for Information Literacy (TATIL), offered by Carrick Enterprises, seemed promising. Inspired by ACRL’s threshold concepts, which inform much of our information literacy instruction at Harker, TATIL might provide a faithful assessment of how our students are doing.
TATIL and High School Seniors
Because high school students are riddled with standardized exams by the time they are seniors, we first sought approval from our administration. Since we will not repeat TATIL annually, and because we could administer it within regular class meetings (forgoing the need for a special schedule), we were given a green light.
We administered TATIL to eight groups of second semester seniors in February 2019 in their English classes. Four of the groups were students enrolled in AP Literature and four more were students enrolled in semester-long seminar classes.
We elected to make use of all four TATIL modules: Evaluating Process & Authority, Strategic Searching, Research & Scholarship, Value of Information. Each module was taken by two groups of students: one AP Literature class and one senior seminar class. In all, 122 seniors – about two thirds of the graduating class – took a TATIL module. Because we teach in 85-minute blocks, librarians could easily start and finish administration within a single block. All testing was completed in three days.
Students took the assessment on their own laptop computers. We supplied each student with a unique, numerical access key. We did not keep track of which student had which key, thereby anonymizing the results. Technical setup was seamless, and most students were up and running within a few minutes. Testing time varied among modules, but most students completed their assessment in 30 to 50 minutes.
Once the testing was complete, we refrained from submitting our students’ answers to Carrick Enterprises immediately. We understood that TATIL was a relatively new assessment, and we hoped to be able to measure ourselves against the largest pool of peer institutions possible. Waiting until the end of the school year allowed for the maximum amount of choice in selecting comparative samples. As it happened, Harker was the only high school to use TATIL in the 2019 school year, so we chose to compare our results to those of schools that administered the test to college freshmen – near peers to our high school seniors. In May we submitted our students’ work and held our breath.
What We Learned
Mercifully, results appeared within 24 hours of submission. Carrick Enterprises sent back eight detailed, 20-page reports – one for each group of students assessed. TATIL measured our students’ information literacy knowledge and dispositions in the four module categories. We were given mean scores for each group as well as for each anonymous student. Knowledge skills were further broken down by several performance indicators, presented from our strongest to weakest mean scores. Disposition results included example behaviors of students who demonstrate such habits and a description of a typical Harker student’s behavior based on our students’ mean score. We were also provided a comparison against our peer institutions’ scores, as well as that of all institutions that administered the test.
Naturally, we were delighted to discover our students demonstrated many strong skills and dispositions. The real gold in these extensive reports, however, was the identification of areas we need to target. For example, while our students scored slightly above peer institutions in their ability to be persistent when searching, they “are not likely to try unfamiliar tools and advanced strategies if they do not receive direct guidance.” Can I just say, that rang true to Harker teachers?
Qi Huang, our electronic resources librarian, created scatter graphs of our students’ scores. These graphs gave us insight into how much our students’ skills varied from one another. A glance at the figures below reveals that our students are relatively similar in persistence. Their abilities to compare search strategies, however, varied widely. Such variability prompts us to ask why. In which classes do we teach these strategies? How often do we teach multiple strategies and give students greater opportunity to experiment? How early, and how fast, can we add skills to our instruction?


We also discovered patterns indicating that our students’ weaknesses often showed up in areas that required strong metacognitive thinking. Consequently, we are targeting development of these skills in our middle and upper schools and recently subscribed to ReflEQ (Refleq.com), an online tool to help our students think more about what and how they are thinking, as opposed to going through the motions – or, as we like to say, “doing school.”
Sharing Our Results
Librarians on our lower, middle and upper school campuses are lucky to get an audience with their respective faculties during returning teacher orientation every August. This year I presented our TATIL results to teachers in each setting. While our seniors were targeted for this assessment, we know these skills and dispositions are developed over a long period of time. Harker enjoys high rates of re-enrollment and students benefit from an information literacy program that begins with our youngest learners. Teachers across all grade levels need to share in our success and acknowledge areas of need.
Meredith Cranston, our upper school campus librarian, created slides that helped crystallize some of the most important take-aways from the pages of results. See an example slide below. We also took the time to define threshold skills. Meredith used the example of hypothetical thinking in science. Though a relatively simple concept, posing and then testing a hypothesis creates a new pattern of thinking and information behavior in a budding scientist. So too, we explained, do ACRL’s six information literacy frames in the effective researcher.

Because our assistant head of school for academic affairs is a former math teacher, we sent her copies of the numerous graphs and charts that came directly from Carrick Enterprises, as well as the scatter graphs that Qi created. Let’s face it: everybody likes numbers that nail down even the most expert professional opinion of a subjective assessment. Thankfully, Carrick Enterprises gave us both. The detailed results, both the good and the less-than-good, celebrate a strong program. More importantly, they champion support for a well-developed information literacy program at all grade levels.