Evidence Summary
Information Literacy (IL) Intervention Workshop Has Positive, but
Limited, Effects on Undergraduate Students’ IL Skills
A Review of:
Gross, M., & Latham, D. (2013). Addressing below proficient information
literacy skills: Evaluating the efficacy of an evidence-based educational
intervention. Library & Information Science Research, 35(3), 181-190.
http://dx.doi.org/10.1016/j.lisr.2013.03.001
Reviewed by:
Lisa Shen
Business Reference Librarian
Sam Houston State University
Huntsville, Texas, United States of America
Email: [email protected]
Received: 9 Mar. 2014    Accepted: 30 Apr. 2014
© 2014 Shen.
This is an Open Access article distributed under the terms of the Creative
Commons Attribution-Noncommercial-Share Alike License 2.5 Canada (http://creativecommons.org/licenses/by-nc-sa/2.5/ca/),
which permits unrestricted use, distribution, and reproduction in any medium,
provided the original work is properly attributed, not used for commercial
purposes, and, if transformed, the resulting work is redistributed under the
same or similar license to this one.
Abstract
Objective – To evaluate the impact of an educational
intervention workshop on students’ information literacy (IL) skills and
self-perception of their own IL knowledge.
Design – Quasi-experimental design with control groups and
semi-structured interviews.
Setting – Two community colleges in the United States of
America, one in a rural setting and one in an urban setting.
Subjects – Ninety-two students enrolled in an entry-level
English course, who scored below proficiency (65%) on the Information Literacy
Test (ILT).
Methods – One hundred students from each college took the
pre-session ILT and an IL self-assessment survey at the beginning of the Spring
2011 semester. The ILT used was developed and validated by James Madison
University (Wise, Cameron, Yang, & Davis, n.d.) and measures understanding
of all the Association of College and Research Libraries (ACRL) Information
Literacy Competency Standards (ACRL, 2000, pp. 2-3) except Standard 4. As an
incentive, each student received $20, and students were told that those who
scored in the top 15% would be entered into a draw for one of two additional
$50 prizes. Those who scored below the ILT proficiency level of 65% were
invited to participate in the quasi-experiment.
Forty-nine
participants were assigned to the workshop group and 43 to the control group.
The two groups were comparable in demographic characteristics, prior IL
learning, and ILT scores. Those in the workshop group were asked to attend one of
five workshops designed around the Analyze, Search, Evaluate (ASE) process
model for IL interventions (Gross, Armstrong, & Latham, 2012). The
workshops were offered on both campuses and taught by the same instruction
librarian.
The workshop
participants completed questionnaires, which included a second ILT,
self-assessment, and ASE-based questions, before and after the IL workshops.
Each workshop participant received $30. The control group participants took the
same post-session questionnaire after the workshops were completed and received
$20. The same $50 incentive was offered to both groups. Two weeks after the
workshops, semi-structured individual interviews were conducted with 30
participants to explore their learning experiences.
Results – Participants’ self-assessments of IL skills dropped significantly
after they took the ILT for the first time. This downward calibration held
true for both the control (t(41) = 4.077, p < 0.004) and the workshop
(t(45) = 4.149, p < 0.001) groups. Subsequent self-ratings from the control
group showed that this downward recalibration of self-assessment was
sustained over time.
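For readers unfamiliar with the statistic being reported, the comparisons
above are paired (repeated-measures) t-tests on the same participants’
ratings before and after taking the ILT. The following is a minimal sketch
of how such a test could be computed; the ratings shown are hypothetical
values on the study’s 5-point scale, not the authors’ data.

    from scipy import stats

    # Hypothetical self-assessment ratings (1-5 scale) for the same eight
    # participants, before and after taking the ILT for the first time.
    # Illustrative values only, not data from the study.
    pre_ratings = [4.0, 3.5, 4.5, 3.0, 4.0, 3.5, 4.5, 4.0]
    post_ratings = [3.0, 3.0, 3.5, 2.5, 3.5, 3.0, 3.5, 3.0]

    # ttest_rel pairs each participant's two ratings; a positive t with a
    # small p indicates a significant downward recalibration.
    t_stat, p_value = stats.ttest_rel(pre_ratings, post_ratings)
    print(f"t({len(pre_ratings) - 1}) = {t_stat:.3f}, p = {p_value:.4f}")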
For participants in the workshop group, the average self-rating of IL ability
rose from 2.79 before the ASE workshop to 3.83 afterward, on a 5-point scale.
However, the same participants’ post-workshop ILT scores did not show any
significant improvement: attending the ASE workshop did not help participants
achieve the “proficient” IL skill level (an ILT score of 65% or higher).
Nonetheless, the workshop group’s performance on the ASE-focused questions,
also administered pre- and post-session, indicated that participants did gain
some IL skills during the workshop. On the ASE questions, which had a maximum
score of 25 points, the workshop group’s average score increased from 10.62
pre-session to 13.40 post-session, while the control group’s average was
10.91 pre-session and 10.77 post-session.
In the follow-up interviews, most participants viewed the workshop positively
and felt that their peers would benefit from attending. However, the skills
participants reported learning focused primarily on the Search stage of the
ASE model, such as exact phrase searching, truncation, and the advanced
search options in Google.
Conclusion – This quasi-experiment examined the impact of a
one-hour ASE model-based workshop on first-year English students with
below-proficiency IL skill levels. Self-assessment ratings indicated that
workshop attendance increased students’ confidence in their skill level,
although this upward recalibration of self-view significantly overestimated
participants’ actual skill gain. Pre- and post-test questionnaires indicated
that, while students did gain some new IL knowledge, attending the workshop was
insufficient to raise their IL skills to the proficient level.
Commentary
The design of this
study appears sound. The authors also provided either copies of, or citations
for, all the assessment instruments. Nonetheless, the study scored an overall
validity of 70%, slightly below the acceptable validity measure of 75%, on the
Evidence-Based Librarianship (EBL) Critical Appraisal Checklist (Glynn, 2006).
The overall rating was
negatively affected by some missing procedural information, such as ethics
approval and participant consent, and minor study design flaws, such as asking
the control group questions about the ASE workshop in the post-session
questionnaire. While it is important to keep all conditions comparable between
the control and workshop groups, including three survey questions that control
participants could not answer may have been unnecessary. For instance, the
researchers noted that 35% of the control group participants “failed to
respond” (p. 186) to the final self-assessment question, likely because it was
about the workshop. However, it would be more interesting to know how the
other 65% of participants responded to a question requesting comments on a
workshop they did not attend.
One omission of concern was the workshop group’s post-session ILT scores. The
authors stated there was “no evidence of improved scores… for any of the
participants” (p. 187), yet presented no data to support this observation.
The ILT is a validated 65-question instrument with four subscales, so
participants’ performance on this test warrants some elaboration, especially
since the same group’s performance on the post-session ASE questionnaire was
discussed extensively.
Some of these issues may be attributable to editorial decisions. The authors
included extensive information about the ASE-based IL workshop, which took up
over 10% of the article, leaving less space for other study details. The
workshop development was an extensive project and deserves due attention;
however, this information had already been presented in another article by the
authors (Gross, Armstrong, & Latham, 2012). It would therefore have been more
effective to refer readers to that ASE publication than to attempt to describe
two complex studies in one article.
Despite these issues, this study demonstrates the limited impact of a one-hour
workshop on students’ actual IL skills and the falsely positive
self-assessments such workshops can generate. The study makes a timely and
valuable contribution to current IL research, and its findings have strong
practical implications for ongoing efforts to reconsider the usefulness of
student self-perception reports and the effectiveness of one-time IL
workshops.
References
Association of College and Research Libraries. (2000). Information literacy
competency standards for higher education. Retrieved from
http://www.ala.org/ala/acrl/acrlstandards/informationliteracycompetency.htm
Glynn, L. (2006). A critical appraisal tool
for library and information research. Library
Hi Tech, 24(3), 387-399. doi:10.1108/07378830610692154
Gross, M., Armstrong, B., & Latham, D. (2012). The analyze, search, evaluate
(ASE) process model: Three steps toward information literacy. Community &
Junior College Libraries, 18(3-4), 103-118.
http://dx.doi.org/10.1080/02763915.2012.780488
Wise, S. L., Cameron, L., Yang, S.-T., & Davis, S. (n.d.). Information
Literacy Test: Test development and administration manual. Harrisonburg, VA:
Institute for Computer-Based Assessment, Center for Assessment and Research
Studies, James Madison University.