We’re pleased to hear that on February 1, the ACRL executive board voted to file the Framework for Information Literacy for Higher Education. This decision confirms the value of our efforts to envision a new approach to assessing students’ information literacy. The richness of the Framework, and its alignment with librarians’ ongoing efforts to engage information literacy’s deep questions and troublesome knowledge, serves as inspiration to our advisory board as we advance to the next stages of test development. At the same time, we are glad that a final decision about the future of the Standards will be made at a later date, once librarians in the field have had ample opportunity to apply the new Frames and fully explore their strengths and limitations. We see our test development efforts as one facet of this wide-ranging project of investigating the affordances of the Framework.
In January, the advisory board studied the final version of the Framework. We found that the performance indicators and outcomes that we drafted in the fall closely matched many of the concepts described in the Framework. Reviewing the frames again inspired us to organize our extensive lists of indicators into core components of our key IL concepts. After further refining our lists of information literacy practices and dispositions, we defined four modules for development.
A task force of advisory board members has formed to write test items that take advantage of HTML5 capabilities, allowing us to experiment with a variety of innovative item types. If you’re looking for more information on innovative item types, Christine Harmes, Cynthia Parshall, and Kathleen Scalise have developed helpful taxonomies. The item-writing task force is now engaged in the iterative process of writing items and performance indicators. Based on what we learn from writing our first module and conducting preliminary usability studies with our early item prototypes, we will begin writing items for the three remaining modules in the spring.
We're really excited about our latest progress and we're looking forward to updating you again soon!
Happy New Year!
In December, despite the hectic pace of the end of the semester, the Advisory Board reviewed lists of knowledge practices and dispositions that we created throughout the fall. This review was the next step in our process of defining the elements of information literacy that this new test will assess.
At our meeting on December 19, the Board discussed how the draft Framework for Information Literacy for Higher Education can guide our decisions about the behaviors, knowledge, and values we’ll ask students to demonstrate on the test. Advisory Board members are now working in small groups to further refine our definitions and prepare the guidelines we need to have in place in order to start writing test items in February.
This month we also took a major step forward in the process of selecting a name for the new test. Watch this space for exciting announcements!
At ALA Midwinter in Chicago we expect to hear that the Framework for IL in Higher Education has been finalized, and our Advisory Board members attending the conference will meet face to face (some of us for the first time).
We’re looking forward to having a fantastic 2015! We hope you are, too.
In November we made significant headway in our test development. One essential component of designing a test is defining the concepts and skills we’ll assess and identifying the levels of performance we expect students to demonstrate. Since this will be an assessment of undergraduates’ IL, we’re drafting plans for a test that will differentiate among students at a beginning level (at or near entry), an intermediate level (at or near completion of lower-division coursework), and an emerging-expert level (at or near completion of a bachelor’s degree).
The Advisory Board generated lists of the IL skills, concepts, and dispositions that we have observed among college students throughout their education. We organized those observations to fit the six frames of the latest draft of the ACRL Framework for IL in Higher Education. This gave us our first model of how students’ IL changes as they encounter the threshold concepts at the heart of the six frames. We will continue to refine the model and use it to guide our development of test items beginning in early 2015.
We have also added details to our plans for using innovative item types that take advantage of the flexibility of computer-based testing. Innovative items can include images and offer alternative response modes, increasing the questions’ fidelity to students’ research experiences and letting us gauge knowledge practices and dispositions in ways that have not been possible with traditional multiple-choice questions. For example, an innovative item might present an image of typical search results and ask students to select a set of appropriate sources given a specific information need. Determining our full range of item types is the next phase before we begin writing items.
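To make the idea of a select-all-appropriate-sources item concrete, here is a minimal scoring sketch in JavaScript. It is purely illustrative: the `scoreSelectionItem` function, its answer key, and the partial-credit rule are our own assumptions for demonstration, not the project’s actual item format or scoring model.

```javascript
// Hypothetical scoring sketch for a "select all appropriate sources" item.
// key: IDs of the sources that fit the information need;
// selected: IDs of the sources the student actually chose.
function scoreSelectionItem(key, selected) {
  const keySet = new Set(key);
  const chosen = new Set(selected);
  let hits = 0;        // appropriate sources the student selected
  let falseAlarms = 0; // inappropriate sources the student selected
  for (const id of chosen) {
    if (keySet.has(id)) hits += 1;
    else falseAlarms += 1;
  }
  // Simple partial-credit rule: proportion of appropriate sources found,
  // minus a penalty for each inappropriate selection, floored at zero.
  const raw = hits / keySet.size - falseAlarms / keySet.size;
  return Math.max(0, raw);
}
```

A rule like this rewards recognizing appropriate sources while penalizing indiscriminate selection; any real scoring model would be calibrated during the usability and pilot phases described above.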
Finally, in November, we began discussions with an expert in educational adaptive technology. We are using resources like Web Accessibility In Mind (WebAIM) to incorporate elements of universal design at this early stage of planning for test item types and response systems.
We’re looking forward to a busy winter as we conclude our planning process and begin test development. I’ll keep you up-to-date with another post soon.
As you may be aware, Project SAILS has been operated by Carrick Enterprises, Inc. since 2012. Two of the original SAILS team members formed the company in order to continue providing the SAILS tests to institutions throughout the United States and, starting this year, around the world.
Project SAILS is based on ACRL’s 2000 Information Literacy Competency Standards for Higher Education. With the upcoming move to the new ACRL Framework, Carrick Enterprises will be developing an entirely new assessment instrument. This is a big job, and we plan to share more information about the instrument at the ACRL conference in March in Portland.
We are extremely happy to announce that Dr. April Cunningham has taken on the job of coordinating the design of this new instrument. April is the Instruction/Information Literacy Librarian at Palomar College, a comprehensive community college in northern San Diego County. She is active on the Learning Outcomes Council, which coordinates institutional student learning outcomes assessments (including assessment of Palomar's general education information literacy outcome). She is also one of the curriculum developers and facilitators for ACRL's Assessment in Action project. We could not have found a more qualified person to lead this effort. Please help us welcome April to the project!
The emerging ACRL framework for information literacy will affect libraries and librarians in many ways. The framework will offer new approaches to conceptualizing information literacy instruction and creating assessments. At Project SAILS, we are challenged to create a whole new standardized assessment that, like SAILS, can be used across institutions and that can span a college career.
To assist us with this process, in March we brought together a panel of knowledgeable and insightful librarians and measurement experts. During a two-day workshop, we discussed the existing draft of the framework, including metacognition and threshold concepts. We reviewed types of assessments and discussed the desired characteristics of assessments and reports. We conceptualized what an assessment based on the new framework would look like. Work continues and we are avidly following the continuing development of the framework.
Members of the panel include:
- Joanna Burkhardt, Professor; Head of CCE Library; Chair of Technical Services, University of Rhode Island
- April Cunningham, Instruction/Information Literacy Librarian, Palomar College, San Marcos, California
- Jessame Ferguson, Director of Hoover Library, McDaniel College, Westminster, Maryland
- Wendy Holliday, Head, Academic Programs and Course Support, Cline Library, Northern Arizona University
- Penny Beile O’Neill, Associate Director, Information Services and Scholarly Communication, University of Central Florida
- Dominique Turnbow, Instructional Design Coordinator, UC San Diego
- Steve Wise, Senior Research Fellow at Northwest Evaluation Association, Portland, Oregon
We are grateful to the members of the panel and energized by the conversation, debate, and exchange of ideas.
As the framework moves toward completion and approval by the ACRL Board, we are continuing our work. We will soon have a beta version of our brand-new assessment ready for testing, and we will reveal its name shortly.
What will happen with the existing SAILS assessments?