
Silvia Vong, University of Toronto

Silvia Vong joined the Advisory Board for the Threshold Achievement Test for Information Literacy in 2015. She was a key contributor to the creation of TATIL: she wrote test items, conducted cognitive interviews with students, and advised on other aspects of the project. In this interview she describes her work at the John M. Kelly Library at the University of St. Michael’s College, how her teaching approach has evolved, her project to introduce scholarly communication to undergraduates, and more!

Question: Please tell us about your job. 

I am currently Head of Public Services at the John M. Kelly Library at the University of St. Michael’s College in the University of Toronto. This is a recent change; until a couple of years ago, I was the Collaborative Learning Librarian. In that role, I was the liaison for the Book and Media Studies program and taught an undergraduate course that introduced students to library and archival concepts and topics as well as research skills. Eventually I was given the opportunity to become Head of Public Services, which I saw as a chance to learn more about access services, including overcoming the daily challenges that come with working the front line. In this role, I oversee the various services we provide for faculty, students, and staff, and I act as a liaison between our department and other library departments as well as various departments across campus.

...continue reading "Meet the TATIL Advisory Board: Silvia Vong"

This semester I provided two workshops for the part-time librarians I work with, who do most of the teaching in our one-shot library/research instruction program.  Although I see them every day, it’s rare that we carve out time to meet as a group, and getting together even depends on some librarians coming in on their time off.  But we get so much out of sharing our experiences with each other that we’re all willing to give a little extra to make it work.  At these meetings I had a chance to facilitate discussion about the Framework, which might seem a little late since it was first adopted nearly three years ago, but it was good timing for us because we recently got support from our college administrators to purchase the Credo InfoLit Modules, and they are helping us think about the scope of our instruction in new ways.

In particular, we’ve been thinking about how to reach beyond our one-shots in new ways.  The information literacy lessons from Credo are one way to reach students before or after we see them in the library.  With a little coordination between the librarian and the professor who’s requesting instruction, students can be introduced to concepts like the value of information or the role of iteration in planning a search strategy before coming to the library.  Or they can get step-by-step, self-paced practice with MLA citations to follow up on our in-class discussions about how they should expect to use various types of sources in their analysis or argument.

...continue reading "Resources for One-Shots"

The cornerstone of the Threshold Achievement Test for Information Literacy is the set of outcomes and performance indicators we wrote, inspired by the ACRL Framework for Information Literacy for Higher Education.

Working with members of our Advisory Board, we first defined the information literacy skills, knowledge, dispositions, and misconceptions that students commonly demonstrate at key points in their education: entering college, completing their lower division or general education requirements, and preparing for graduation. These definitions laid the groundwork for analyzing the knowledge practices and dispositions in the Framework in order to define the core components that would become the focus of the test. Once we decided to combine frames into four test modules, we used the performance indicators to guide item writing for each of the four modules. Further investigation of the Framework dispositions through a structural analysis led us to identify and define information literacy dispositions for each module.

...continue reading "From Framework to Outcomes to Performance Indicators, Plus Dispositions!"

Last week I was fortunate to get to attend and present at LOEX 2017, in Lexington, KY.  I’m excited to have joined the LOEX Board of Trustees this year and it was great to see familiar faces and meet new, energized librarians, too.

I presented a one-hour workshop where I walked participants through a comparison of two common types of results reports from large-scale assessments.  We looked at an example of a rubric-based assessment report and a report from the Evaluating Process and Authority module of the Threshold Achievement Test.  We compared them on the criteria of timeliness, specificity, and actionability, and found that rubric results reports from large-scale assessments often lack the specificity that makes it possible to use assessment results to make plans for instructional improvement.  The TATIL results report, on the other hand, offered many ways to identify areas for improvement and to inform conversations about next steps.  Several librarians from institutions that are committed to using rubrics for large-scale assessment said at the end of the session that the decision between rubrics and tests now seemed more complicated than it had before.  Another librarian commented that rubrics seem like a good fit for assessing outcomes in a course, but perhaps are less useful for assessing outcomes across a program or a whole institution.  It was a rich conversation that also highlighted some confusing elements in the TATIL results report that we are looking forward to addressing in the next revision.

Overall, I came away from LOEX feeling excited about the future of instruction in the IL Framework era.  While the Framework remains an enigma for some of us, presenters at LOEX this year found many ways to make practical, useful connections between their work and the five frames. ...continue reading "May Update: Report from LOEX"

April Cunningham and Carolyn Radcliff at Library Assessment Conference 2016

We were honored to sponsor the 2016 Library Assessment Conference (LAC), held October 31–November 2. As sponsors we gave a lunchtime talk about the test, and we also attended the conference. Although Carolyn has been to this conference several times, most often presenting about the Standardized Assessment of Information Literacy Skills (SAILS), this was April’s first time attending LAC. The conference is a wonderful opportunity to gather with librarians from around the country and, increasingly, from around the world to learn about assessment methods and results that we can apply in our own settings. It was also a rich environment for engaging in conversations about the value of assessment data and what makes assessments meaningful.

Here are a few of the findings that stuck with us:

  • Representatives from ACRL’s Assessment in Action program shared the results of their interviews with leaders from throughout higher education including the Lumina Foundation, Achieving the Dream, and the Association of American Colleges and Universities. They learned from those conversations that as a profession, academic librarians already have strong data about how we affect students’ learning and which models have the most impact. The higher education leaders advised ACRL to encourage deans, directors, and front line librarians to make better use of the data we already have by telling our stories more effectively. You can read about the assessment results and instructional models they were referring to by visiting the Assessment in Action site.
  • Alan Carbery, founding advisory board member for the Threshold Achievement Test for Information Literacy (TATIL) and incoming chair of ACRL’s Value of Academic Libraries committee, co-presented with Lynn Connaway from OCLC. They announced the results of a study to identify an updated research agenda for librarians interested in demonstrating library value. Connaway and her research assistants analyzed nearly two hundred research articles from the past five years about effects on students’ success and the role of libraries. Her key takeaway was that future research in our field should make more use of mixed methods as a way of deepening our understanding and triangulating our results to strengthen their reliability and add to their validity. The report is available on the project site.

...continue reading "November Update: Library Assessment Conference Debrief"