Research Article
Academic E-book Usability from the Student’s Perspective
Esta Tovstiadi
Sr. Assistant Librarian, eResources Librarian
College Libraries
State University of New York College at Potsdam
Potsdam, New York, United States of America
Email: [email protected]
Natalia Tingle
Assistant Professor, Business Collections & Reference Librarian
University Libraries
University of Colorado Boulder
Boulder, Colorado, United States of America
Email: [email protected]
Gabrielle Wiersma
Associate Professor, Head of Collection Development
University Libraries
University of Colorado Boulder
Boulder, Colorado, United States of America
Email: [email protected]
Received: 14 June 2017 Accepted: 17 July 2018
© 2018 Tovstiadi, Tingle, and Wiersma. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike License 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.
DOI: 10.18438/eblip29457
Abstract
Objective – This article describes how librarians systematically compared different e-book platforms to identify which features and design elements impact usability and user satisfaction.
Methods – This study employed task-based usability testing, including the “think-aloud protocol.” Students at the University of Colorado Boulder completed a series of typical tasks to compare the usability of and measure user satisfaction with academic e-books. For each title, five students completed the tasks on three e-book platforms: the publisher platform and two aggregators. In total, thirty-five students evaluated seven titles on nine academic e-book platforms.
Results – This study identified each platform’s strengths and weaknesses based on students’ experiences and preferences. The usability tests indicated that students preferred Ebook Central over EBSCO and strongly preferred the aggregators over publisher platforms.
Conclusions – Librarians can use student expectations and preferences to guide e-book purchasing decisions. Preferences may vary by institution, but variations in e-book layout and functionality impact students’ ability to successfully complete tasks and influence their affinity for or satisfaction with any given platform. Usability testing is a useful tool for gauging user expectations and identifying preferences for features, functionality, and layout.
Introduction
Librarians select materials based on a variety of criteria such as content, format, availability, and cost (Anson & Connell, 2009; Roncevic, 2013). It is assumed and expected that the content of a book does not vary because it is published in cloth or paperback. Electronic books challenge this expectation. While some e-books replicate the content and presentation of their print equivalents, others transform the initial work into something that hardly resembles the print version (Kichuk, 2015; Wiersma & Tovstiadi, 2017). Technical limitations and design choices on different e-book platforms create variations in presentation, layout, and even content. These variations often go unnoticed because they are not obvious without direct comparison and evaluation.
In 2015, librarians at the University of Colorado Boulder tested thirty-four elements that are important to usability and the end-user experience and identified inconsistencies between e-book platforms, such as layout, navigation, metadata, and search results (Wiersma & Tovstiadi, 2017). This study builds on that previous research by exploring some of those elements from the user’s perspective. Students examined the same title on three different platforms and completed a series of tasks to compare the usability of and measure user satisfaction with each platform. Through this study we gained a greater understanding of student expectations and preferences that can be used to guide e-book purchasing decisions.
Literature Review
E-Book Usability Issues Identified by Librarians
There is no shortage of articles in the literature decrying the poor usability of academic e-book platforms. Bivens-Tatum (2014) described the various platforms as a “vast array of substandard choices,” noting that restrictions on use often cause patrons to give up on using e-books (para. 3). Digital rights management (DRM) and restrictions on downloading, printing, and saving e-books for offline use or for future reference are frequently cited explanations for the low acceptance of the medium (Slater, 2010; Thomas & Chilton, 2016).
Library Journal’s E-book Usage in U.S. Academic Libraries (2012) provided the librarian’s standpoint on e-book usability. The survey results revealed issues such as a “complex downloading process” and “difficult to read on screen/online” as some of the top barriers to e-book usage (p. 8). These issues persisted in the 2016 survey, alongside problems such as “platform not user friendly” and “can’t read offline or download” (Library Journal, 2016, p. 47).
Mune and Agee (2015) developed a template for evaluating different platforms by function, including navigation, offline availability, and full-text searching. While their study focused on accessibility as related to users with print disabilities, they found that “Single publisher platforms (such as Gale, Palgrave, and Springer) appear to offer more features and have more flexibility overall compared to aggregators (such as ProQuest and ACLS Humanities) that include books from a variety of publishers in their collections” (p. 222).
Cataldo and Leonard (2015) compared 14 e-book platforms and studied seven common features: format; user accounts; personal bookshelves; mobile accessibility; and the ability to annotate, download, and print. In addition to variation among platforms, they also found variation within aggregator platforms due to publisher restrictions. While user preferences may vary, they concluded that “It is crucial to understand the needs of your patrons, and more specifically on how the features, functionality and accessibility of the e-books meet those needs” (Conclusion section, para. 2).
One of the largest studies, the JISC national e-books observatory project, analyzed e-book use in more than 120 universities in the United Kingdom and concluded that there was a strong need for e-book platforms designed with usability principles in mind (JISC, 2009). The call for consistent design is echoed throughout the literature (Hobbs & Klare, 2016; Muir & Hawes, 2013), as is the call for improved usability (Slater, 2010).
E-Book Usability Issues Identified by Users
A number of usability studies have been conducted on academic e-book platforms. Carter et al. (2013) used a survey to identify engineering students’ attitudes towards and experiences with e-books, finding that students expressed a number of concerns, including issues with navigation, format, printing, and downloading.
Using the think-aloud method, Berg, Hoffmann, and Dawson (2010) compared a set of e-books with their print counterparts. The researchers instructed 20 undergraduate participants to complete information retrieval tasks using both print and e-books. Students used different navigation and search strategies depending on format, and their expectations for e-book functionality were unmet.
A mixed-methods study by Zhang, Niu, and Promann (2017) included a task-based usability test with 12 participants, including undergraduates, graduate students, and faculty members. The user tests and follow-up survey point to a need for improved consistency among e-book platforms, since platforms that do not follow general web conventions appear to require more effort from the user.
O’Neill (2009) compared the usability of ebrary, EBL, and MyiLibrary using a task-based methodology with 10 undergraduate and graduate students. The study identified a number of common usability issues with e-books, including functionality such as printing and navigation. Muir and Hawes (2013) observed 14 undergraduate physics students interacting with two e-books on the NetLibrary and MyiLibrary platforms. Their findings support previous studies by highlighting issues with navigation and searching. In addition, the researchers developed a set of desired e-book features based on user needs.
This study builds on previous work by examining a greater number and additional types of platforms and by suggesting that librarians can use test results as evidence to inform selection and purchase decisions. Further, this study goes beyond determining whether e-book platform features exist; it evaluates how usable they are from the student perspective.
The aims of this study are to:
● Identify specific functionality and features that students prefer on e-book platforms
● Understand how differences in e-book platforms impact the user experience
● Describe how librarians can factor user experience into the selection of e-books
Methodology
Usability testing is a method of evaluating a product or service by testing it with a representative group of users. “The goal is to identify any usability problems, collect qualitative and quantitative data and determine the participant's satisfaction with the product” (U.S. Department of Health & Human Services, 2018). In this study, the authors observed students as a representative group of academic e-book users. After receiving IRB approval for human subjects research, they posted information about the study in an online campus newsletter and the library’s social media channels and offered students a $10 Amazon gift card for completing the study. They recruited one doctoral student, one master’s student, and three undergraduates to test each title. Although five students is a small sample of the entire student population, according to the Nielsen Norman Group, “test[ing] 5 users lets you find almost as many usability problems as you'd find using many more test participants” (Nielsen, 2012). The convenience sample was further limited to students whose majors related to the subject of the sample title, in order to replicate an authentic experience that an individual student might have with an e-book.
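For context, Nielsen’s five-user guidance derives from a simple problem-discovery model (Nielsen and Landauer’s formula, as summarized in Nielsen, 2012). A brief sketch, assuming the commonly cited average rate of about 31% of problems found per user:

$$\text{problems found} = N\left(1 - (1 - L)^{n}\right)$$

where $N$ is the total number of usability problems in the design, $L$ is the proportion a single user uncovers ($\approx 0.31$ on average), and $n$ is the number of testers. With $n = 5$, $1 - (1 - 0.31)^{5} \approx 0.84$, so five users can be expected to surface roughly 85% of the problems.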
This study used “task-based usability” testing, a technique where users complete typical tasks on a website while an observer records if and how they are able to accomplish each task. During the test, students completed tasks on three e-book platforms. The tasks mimicked behaviour that students might naturally exhibit while using an e-book, such as printing, downloading, searching within the book, and navigating to a specific page. For the tasks, see the Appendix. We observed the students’ actions and noted whether, as well as how, they completed the tasks. Using the “think-aloud protocol,” students were asked to verbalize their thoughts and expectations. This enabled the researchers to compare the actual results with students’ expectations in order to measure user satisfaction with the product. The authors took notes about each test and recorded the audio and on-screen navigation.
The authors used a convenience sample of e-books that were available on Ebook Central, EBSCO, and a publisher platform. The library acquired access to the sample titles on additional platforms as needed.
For each title, five students completed the tasks on three platforms: the publisher platform and two aggregators. The order in which the platforms were tested was randomized in order to temper the potential bias of consistently testing one platform first or last, and students completed all of the tasks on one platform before moving on to the second and third.
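A randomized presentation order like this is simple to generate. The sketch below is purely illustrative (not the authors’ actual procedure): a Fisher–Yates shuffle assigns each participant one of the six possible orderings with equal probability, counteracting first- and last-position effects.

```typescript
// Illustrative sketch: random platform order per participant to guard
// against order effects. Platform names are placeholders.
const platforms = ["Publisher", "Ebook Central", "EBSCO"];

// Fisher-Yates shuffle: every one of the 3! = 6 orderings is equally likely.
function shuffle<T>(items: T[]): T[] {
  const a = [...items];
  for (let i = a.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [a[i], a[j]] = [a[j], a[i]];
  }
  return a;
}

// One ordering per participant (five testers per title in this study).
for (let participant = 1; participant <= 5; participant++) {
  console.log(`Participant ${participant}: ${shuffle(platforms).join(" -> ")}`);
}
```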
After the tasks were completed on each platform, students ranked the platforms in order of preference, using physical printouts of the e-book landing pages as references. This gave students an opportunity to compare the platforms and provide additional feedback.
Results
In each usability test, the student completed a series of tasks on each of three platforms. Results from each of the tasks are described below. Altogether, 35 students (21 undergraduates, 7 masters, 7 doctoral) tested sample titles on seven publisher platforms (Brill, Cambridge, Duke, Oxford Scholarship Online, Springer, Science Direct, and Wiley) and two aggregator platforms (EBSCO and Ebook Central). At the end of the test, students were asked to rank the three platforms in order of preference. Sixty percent of students rated Ebook Central as their preferred platform, followed by EBSCO (26%) and individual publisher platforms (14%).
Task 1: Evaluating the Landing Page
The usability tests started on a landing page, which is typically the first page that a user sees when they click on a link to an e-book from a search engine, the library catalog, or a discovery layer. We asked students what they expected to see on this type of page and what information was most useful. Most students expected to see the basic bibliographic elements needed to cite a book (e.g., title, author, and publication information) as well as a brief summary or abstract. Some students expected an ISBN or DOI, which they indicated was helpful for citing a book or figuring out which edition they were using. While most platforms provide all of this information, the placement on the page and the order in which it was presented varied.
Task 2: Evaluating the Bibliographic Information
When students were asked how they interpreted the bibliographic information (title/subtitle, authors, dates, keywords/subject headings) on the landing page, it seemed likely that they would accept the information presented at face value and not question its accuracy. As they navigated between platforms, students understood each of these components individually but were puzzled when the information for the same book varied on different platforms. Some students pointed out discrepancies in the metadata for publication dates, subtitles, and author information when it varied by platform. These discrepancies would likely have gone unnoticed but became obvious when students were asked to identify and interpret this information on each platform.
Figure 1
Availability on ProQuest Ebook Central.
Figure 2
Concurrent user level on EBSCO.
Some of the students noticed the subject terms but were not impressed unless the terms were hyperlinked and more descriptive than a repetition of words from the book’s title. Most students were unfamiliar with library jargon like “LC Subject Headings,” but they generally understood that keywords and categories were meant to describe the book. Some users indicated that they would skim a summary or book description to help them decide if the book was appropriate for their research, but most students assumed that if they ended up on this page, it was because they already knew that they needed this book.
Students wanted information about the availability of the book, but almost all of them misinterpreted the information that described the permitted number of simultaneous users or the number of copies. For example, Ebook Central and EBSCO included information about availability based on the type of license that the library purchased (see Figures 1 and 2).

While a librarian might interpret “access to 3 copies of this book” as referring to a license that allows three users to access the e-book simultaneously, many students misinterpreted the number of available online copies as the number of print copies they could find in the library. This was particularly misleading because our library typically does not purchase books in multiple formats, so it is very unlikely that we would have both an e-book and a print copy, much less multiple print copies.
Task 3: Finding and Using Citation Tools
Next, we asked students to interact with some of the information on the landing page. We asked them how to cite the book using any available tools or information, using the prompt: “You need to cite this book for your paper. How would you use this page to do that?” A native citation-generating tool is frequently referenced as a benefit or expected feature of e-books (Cassidy, Martinez, & Shen, 2012). Many students noticed the native citation generator immediately. Some of the students were introduced to the native citation generator while interacting with the first of three platforms, then learned to look for a similar tool on subsequent platforms. Although a majority of the students were able to find the citation tools easily on the aggregator platforms, consistent naming and use of icons between platforms would improve usability.
We encountered two kinds of native citation-generating tools, which may account for some of the inconsistency in labeling. In one type (cite), the citation is displayed on demand in one of several citation styles for the user to copy and paste as needed. The other variety (export) is a downloadable file for use with a bibliographic management tool such as EndNote, Zotero, BibTeX, or RefWorks. Some of the platforms included both kinds of tools. Platforms that offered only one kind generally offered the export version, although some provided a plain text download option, which could be opened in a tool like Notepad and then copied and pasted. Students reacted much more positively to the versions that showed the citation without requiring them to open another program or download any files, even when the style they commonly used was not on the list.
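To make the distinction concrete, here is a hypothetical sketch (not any vendor’s actual implementation) of the two tool types: one renders a copyable citation string on demand, the other serializes a standard RIS record that reference managers such as EndNote, Zotero, or RefWorks can import.

```typescript
// Illustrative only: minimal "cite" vs. "export" citation tools.
interface BookMeta {
  authors: string[]; // "Last, F." form
  title: string;
  year: number;
  publisher: string;
}

// "Cite" type: display a ready-to-copy citation string on demand.
function cite(b: BookMeta): string {
  return `${b.authors.join(", ")} (${b.year}). ${b.title}. ${b.publisher}.`;
}

// "Export" type: build an RIS record for download and import into a
// bibliographic management tool.
function exportRis(b: BookMeta): string {
  return [
    "TY  - BOOK",
    ...b.authors.map((a) => `AU  - ${a}`),
    `TI  - ${b.title}`,
    `PY  - ${b.year}`,
    `PB  - ${b.publisher}`,
    "ER  - ",
  ].join("\n");
}

const book: BookMeta = { authors: ["Doe, J."], title: "Sample Title", year: 2017, publisher: "Sample Press" };
console.log(cite(book));      // on-screen citation, ready to copy
console.log(exportRis(book)); // contents of the downloadable file
```

Students’ reactions suggest the on-demand variant carries less friction, since it avoids the download-then-import round trip.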
Table 1
Finding Citation Tools

Platform | Found Citation Tools Easily | Found After Some Time/With Some Difficulty | Did Not Find
Brill (n=5) | 20% (1) | 20% (1) | 60% (3)
Cambridge (n=5) | 60% (3) | 20% (1) | 20% (1)
Duke (n=5) | 40% (2) | - | 60% (3)
Ebook Central (n=33)* | 58% (19) | 9% (3) | 33% (11)
EBSCO (n=32)* | 56% (18) | 3% (1) | 41% (13)
Oxford Scholarship Online (n=4)* | 25% (1) | 25% (1) | 50% (2)
Science Direct (n=5) | 20% (1) | 40% (2) | 40% (2)
Springer (n=5) | - | - | 100% (5)
Wiley (n=5) | - | 40% (2) | 60% (3)
*Data was not available for some students.
In completing this task, students mentioned a variety of other tools for creating or managing citations. Popular programs included Easybib, Citation Machine, Knight Cite, Mendeley, and Zotero. In several cases, students explained that they would Google either the title or the ISBN along with “cite.” Whether they would use the native tool or another means, many students in this study also mentioned the importance of the landing page including all the elements needed to cite the e-book manually.
Task 4: Navigating to a Specific Chapter or Page Number
Next, the students were asked to navigate to a specific chapter and page number. This task was designed to observe how students preferred to navigate through an e-book (e.g., using a linked table of contents, searching, or scrolling) and whether they were able to find the correct page. The majority of students (69%) found the correct page easily. Most students used the linked table of contents from the landing page or in the navigation pane of an e-book reader, but they were frustrated when it was not linked in the PDF versions. Many students also used the page number box to “jump” to a specific page when that feature was available.
Table 2
Success Navigating to a Specific Page

Platform | Found Appropriate Page Easily | Found with Some Difficulty | Did Not Find
Brill (n=5) | - | 100% (5) | -
Cambridge (n=5) | 40% (2) | 60% (3) | -
Duke (n=5) | 80% (4) | 20% (1) | -
Ebook Central (n=32)* | 72% (23) | 22% (7) | 6% (2)
EBSCO (n=31)* | 74% (23) | 23% (7) | 3% (1)
Oxford Scholarship Online (n=4)* | 75% (3) | 25% (1) | -
Science Direct (n=5) | 100% (5) | - | -
Springer (n=5) | 60% (3) | 40% (2) | -
Wiley (n=4)* | 75% (3) | 25% (1) | -
*Data was not available for some students.
Some students scrolled within the reader to find a specific page. Sometimes scrolling was preferable to jumping from page to page, and other times it was necessary because the e-book did not include page numbers. Some students downloaded the entire book and navigated within the downloaded file. On some platforms, it was necessary for students to go back to the landing page and then open the PDF of the correct chapter. A few students noted that the page numbers displayed in the PDF reader did not always match the page numbers printed on the pages. Students who struggled to find the correct page often used creative workarounds. Although frustrated, students seemed to tolerate these inconsistencies if they had experienced them before.
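The reader-versus-printed page mismatch usually comes down to front matter: physical PDF pages are counted from the cover, while printed pagination starts after the roman-numbered preliminaries. A hypothetical sketch of the translation a “jump to page” box has to perform (the offset value is illustrative):

```typescript
// Illustrative sketch: translating a printed page number into a physical
// PDF page index when front matter precedes the body.
function physicalIndex(printedPage: number, frontMatterPages: number): number {
  // printed p. 1 sits at physical page (frontMatterPages + 1)
  return frontMatterPages + printedPage;
}

// With 12 roman-numbered front-matter pages, printed p. 45 is the 57th
// physical page -- which is why "page 45" in a reader can land elsewhere.
console.log(physicalIndex(45, 12)); // -> 57
```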
Students also struggled to find the appropriate page when the platform did not display page numbers, such as in the EPUB versions of e-books in Ebook Central and EBSCO. If students are unable to locate a specific page in an e-book, this might prevent them from successfully completing an assigned reading, locating a cited reference, or creating a citation. Platform design played a major role in students’ ability to complete this task, and they preferred platforms that included linked tables of contents, the ability to “jump” from one page to another, and clearly displayed page numbers.
Task 5: Finding and Using Annotation Tools
Next, we invited students to interact with the text by employing available annotation features. We asked students, “As you are reading, you want to take notes for your class. Would you do that here in the e-book? If so, how?” At the time of testing, annotation tools such as highlighting and note taking were only available on the aggregator platforms.
We rated the ease with which our testers found the annotation tools on a scale of 1 to 3, with 3 being “found easily” and 1 being “did not find.” Students had an easier time finding annotation tools in Ebook Central than EBSCO, with 63% (23) finding the tools easily in Ebook Central (n=32) compared to 11% (4) in EBSCO (n=32). This may be due to the placement of the “My Notes” link in EBSCO or to the multiple ways of accessing the tools in Ebook Central (see Figures 3 and 4).
The annotation tools within the Ebook Central platform were easy for our users to find and use, especially the highlighting tool. Ebook Central situated its annotation tool icons (highlight, add note, and bookmark) in the toolbar at the top of the reading pane (see Figure 4). Additionally, when a user selected a section of text within the reading pane, a popup emerged with options for copying or highlighting the selection as well as an option to add a note.
The annotation tools in the aggregators led to popups about creating an account. Due to the way the task scenarios were ordered, this may have been the user’s first encounter with the need to create an account on the platform in order to fully use the available tools. Students were quick to close the popup window, often without pausing to read it closely. One student said about creating accounts in Ebook Central, “Unless I was really desperate for what they had in here, I would probably see if I could find it on any other source that didn’t require me to sign up.” This sentiment was echoed in various ways across many of the tests.
Students had pre-existing habits and strong preferences for note-taking that influenced their response to this question. Many students said that they would prefer to download the PDF and highlight or annotate within the PDF file or on a printed copy of the file. Some of the alternative note-taking options students noted included a physical notebook, Evernote, Google Keep, Mendeley, a Word document, or text files. They expressed some interest in the tools, particularly the highlight tool found in Ebook Central; however, very few of the students we tested (17%) affirmed that they would likely use the annotation tools, expressing concern about the long-term availability of the notes they take or having to create an account to take or keep notes. Although annotation is commonly indicated as an important feature, some studies report that students may not take advantage of these tools (JISC, 2009; Muir & Hawes, 2013).
Figure 3
“My Notes” link in EBSCO.
Figure 4
Annotation tools in Ebook Central.
Task 6: Searching and Evaluating Results
Next, we asked students to find information on a specific topic within the e-book. We did not want to lead them to use the search tool; rather, we wanted to understand how they naturally looked for content in e-books. Previous studies found that students use multiple navigation strategies to locate information in e-books, such as searching, navigating the table of contents or index, or scrolling through pages (Muir & Hawes, 2013). In our study, most students employed multiple search strategies throughout the test but generally used more on the first platform and fewer on the last. On the first platform they tested, 63% of students responded to this question by searching for the topic within the book, either using CTRL+F (12 students) or the e-reader search tools (10 students). Thirteen students started with either the index or the table of contents (TOC); all 13 switched to searching on subsequent platforms instead of using the index or TOC. All students were asked to use the search feature even if that was not their first choice for finding information within the e-book.
A surprising number of students tried keyboard shortcuts instead of the platform’s search box. Over half (54%) of students used CTRL+F to search for a term in the book at least once during the testing, but they experienced varying levels of success. Generally, students employed different search strategies on subsequent platforms based on the results from the first platform. If CTRL+F did not work on the first platform, students might continue to try it on subsequent platforms, but they also tried other search mechanisms.
By the third platform, the majority of students (63%) used only one strategy: searching through the e-book reader’s search tool or using CTRL+F. They may have done this because they realized that the librarians would ask them to search as part of the test, or because they realized that searching was more efficient and effective than other strategies.
The fact that students modified their information-seeking strategy from the first platform to the last suggests that they learned as they interacted with e-books. On the first platform, students exhibited a wider range of information-seeking behaviour and often took more time to look for tools or complete tasks than on subsequent platforms. For example, many students did not consider searching for a term in the first e-book, but by the last platform, nearly every student opted to search within the e-book rather than try a different strategy, such as using the table of contents or the index.
Table 3
How Students Would Find Specific Information in a Book

Strategy | Number of Students/Sample Size
CTRL+F (keyboard shortcut) | 19/35
Looking in the Index | 11/35
Looking in other parts of the book (chapter titles, preface, etc.) | 3/35
Looking in the Table of Contents | 11/35
Searching using the platform or reader’s search tool* | 35/35
*Librarians prompted students to search if the students did not search on their own.
How many search results did you expect?
Along with the task of searching within the e-book using the platform’s search feature, we asked students to evaluate the following: how many search results they expected, how the results were ranked, and what they expected to see if they clicked on a result. Many students guessed or formed an idea of how many results to expect based on the number of results from previously tested platforms. Some students (16) expected the number of results to be similar across all platforms, anticipating on the second or third platform a number similar to what they had encountered on the previous platform(s).
It was difficult for some students to estimate how many search results to expect if the subject matter of the book was outside of their field. The authors attempted to have students test a sample title related to their academic discipline, but it was difficult to offer an exact match for each major. Of the 11 students who mentioned being unsure of how many results they expected or said that they had no expectation, five attributed this to their lack of subject knowledge.
How do you think these search results are ranked? Why is this one (point to top one) first?
A majority (74%) of students expressed uncertainty or confusion about how the results were displayed or ranked on at least one of the platforms. This is understandable considering the wide range of search results they encountered during the tests. E-book platforms tend to display search results at the chapter, page, or keyword level. Students seemed to understand that keyword-level search results list each occurrence of the keyword in the text. Keyword results were overwhelming when the search term appeared more than a few times in the text and the students had to scroll through dozens of results.
Chapter-level results were confusing because the search term was not always highlighted or included in the search results, so it was difficult to understand why each result was a good match. Most students appreciated when the search results displayed a snippet of text surrounding their search term and, ideally, also highlighted or bolded the term. This helped them quickly identify the keyword and provided helpful context to determine the best match(es). Students were also frustrated if, after navigating to the appropriate chapter, their search term was not indicated within the text.
Students were also confused by some of the default sorting options. Most students (63%) expected or believed that search results would be displayed in the order in which they appear in the book. Some students (26%) were able to figure out relevance ranking, but many were confused when results were not displayed in “chronological” or “page number” order. Students were confused when a platform displayed a list of pages or chapters out of sequence in order to represent relevance.
Ebook Central had the most intuitive display because search results were grouped within each chapter and relevance was indicated by a bar graph that clearly represented term frequency. At the time of testing, EBSCO's platform did not provide an overall number of keyword results, nor did it give another option for sorting results, which made it very difficult for users to interpret which results were most relevant.
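The chapter-grouped display students favoured amounts to tallying hits per chapter and scaling a bar to term frequency. A hypothetical sketch of that logic (types and sample data are illustrative, not Ebook Central’s implementation):

```typescript
// Illustrative sketch: group keyword hits by chapter and render a bar
// proportional to term frequency, most relevant chapter first.
interface Hit {
  chapter: string;
  page: number;
}

function summarize(hits: Hit[]): string[] {
  const counts = new Map<string, number>();
  for (const h of hits) {
    counts.set(h.chapter, (counts.get(h.chapter) ?? 0) + 1);
  }
  const max = Math.max(...counts.values());
  return [...counts.entries()]
    .sort((a, b) => b[1] - a[1]) // highest term frequency first
    .map(([chapter, n]) => `${chapter.padEnd(12)} ${"#".repeat(Math.round((n / max) * 10))} (${n})`);
}

console.log(summarize([
  { chapter: "Chapter 2", page: 31 },
  { chapter: "Chapter 2", page: 35 },
  { chapter: "Chapter 2", page: 40 },
  { chapter: "Chapter 5", page: 117 },
]).join("\n"));
// Chapter 2    ########## (3)
// Chapter 5    ### (1)
```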
Where do you expect to go when you click on this search result?
Most students expected that their search term(s) would be highlighted in the results (74%) and that clicking on a result would link them to the part of the page, or at least the page in general, where that term appeared (60%).
Task 7: Printing, Saving, and Downloading
A surprising number of students (40%) tried to use keyboard or mouse shortcuts, such as Ctrl+P or right-clicking, to print, save, or download. The majority of these students tried these strategies on the first platform and abandoned them after they did not work. Unfortunately, these types of commands do not work on the majority of platforms. Based on our findings, e-book platforms should consider making their sites responsive to these commands. At the very least, the platform could respond to a keyboard shortcut by moving the user’s cursor to the appropriate icon or link on the website. This would not only improve the user experience for the many users who prefer shortcuts, but it may also improve usability for students using screen readers or other assistive technology.
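A minimal sketch of that recommendation, assuming a browser-based reader page with a print control whose element id ("print-button") is invented for illustration:

```typescript
// Illustrative sketch: intercept Ctrl+P / Cmd+P and steer the user to the
// platform's own print tool rather than letting the shortcut dead-end.
document.addEventListener("keydown", (e: KeyboardEvent) => {
  if ((e.ctrlKey || e.metaKey) && e.key.toLowerCase() === "p") {
    const printButton = document.getElementById("print-button");
    if (printButton) {
      e.preventDefault();  // suppress the browser's default print dialog
      printButton.focus(); // move focus to the platform's print control,
                           // which also helps screen reader users find it
    }
  }
});
```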
Eventually, almost all students (97%) utilized the e-book reader or PDF printing icons. When asked what they expected to see when downloading the book, most students expected a PDF. Many students remarked positively when the PDF contained a citation, and they particularly appreciated the ability to select the citation style before downloading the PDF.
It was helpful when actions such as printing, downloading, and saving functioned similarly across the platform or, ideally, mimicked functionality on other websites. If a student figured out one process, then it was easier for them to master other processes. Although the outcomes are similar, students were confused when platforms used unclear terminology, such as an option to print a “section” rather than a chapter, and they were frustrated by warnings about “exceeding your print allowance” when attempting to download a chapter. Our observations indicated that it is important to present information about printing or download limits at the point of need instead of just listing that information on the landing page. Users need this information within the context of performing an action or when it limits their ability to take action, but otherwise this information is superfluous and confusing.
Both aggregators required students to log in or create an account in order to download the entire book, whereas the publisher platforms did not. When prompted to log in by the aggregator sites, most students summarily ignored the pop-up notification and tried to find other ways to print or download the book. A few students tried to subvert the DRM on the aggregators by taking screenshots or said that they would turn to “other sites” or other ways of accessing the e-book, even though they understood that doing so was unethical and illegal.
Table 4
Students Expressing Difficulty Finding Print Options by Platform

Platform | Percent Expressing Difficulty
Brill (n=4)* | 25% (1)
Cambridge (n=5) | 20% (1)
Duke (n=5) | 0% (0)
Ebook Central (n=31)* | 6.4% (2)
EBSCO (n=33)* | 12% (4)
Oxford Scholarship Online (n=5) | 40% (2)
Science Direct (n=5) | 60% (3)
Springer (n=5) | 20% (1)
Wiley (n=4)* | 25% (1)
*Data was not available for some students.
Table 5
Students Expressing Difficulty Finding Download/Offline Reading Options by Platform

Platform | Percent Expressing Difficulty
Brill (n=5) | 20% (1)
Cambridge (n=3)* | 33% (1)
Duke (n=5) | 80% (4)
Ebook Central (n=31)* | 45% (14)
EBSCO (n=32)* | 31% (10)
Oxford Scholarship Online (n=5) | 20% (1)
Science Direct (n=5) | 0% (0)
Springer (n=5) | 0% (0)
Wiley (n=4)* | 25% (1)
*Data was not available for some students.
Table 6
Of the Three Tested, Which Platform is Your First Preference?

Platform | Number of Students
Brill (n=5) | 0
Cambridge (n=5) | 1
Duke (n=5) | 0
Ebook Central (n=35) | 22
EBSCO (n=35) | 8
Oxford Scholarship Online (n=5) | 1
Science Direct (n=5) | 1
Springer (n=5) | 1
Wiley (n=5) | 1
Task 8: Which Platform Would You Prefer to Use?
The final task asked students to rank the platforms in order of preference, using physical printouts of the e-book landing pages as a reference. Of the students studied, 60% rated Ebook Central as their preferred platform, followed by EBSCO (26%) and individual publisher platforms (14%). Some of the characteristics that distinguished Ebook Central were the intuitive layout of the search results, including the bar graph that indicated where and how many search results were included in each chapter, and the clearly visible icons and menus that made it easy to accomplish tasks such as printing and downloading.
In general, students preferred the platforms that offered full-text searching, identified the number of search results, highlighted search terms within the results, and presented search results in an intuitive order (either relevancy or the order in which they appear in the book). They also preferred platforms that allowed them to easily highlight in multiple colors. Students had mixed opinions about the reading pane layout of most aggregators but seemed to prefer the toolbars in both aggregators because the icons clearly identified the tools that students needed most (e.g., printing and downloading). They also expressed varying opinions about platforms that prompted them to log in or create an account in order to print, download, or save content.
In addition to learning about students’ preferences for features and functionality, we also learned that they are quick to blame themselves when things do not work as expected. Regardless of whether it was a system error or user error, many students assumed that a lack of functionality was somehow related to their limited knowledge about the subject of the book or unfamiliarity with the platform. On the other hand, when a platform was more intuitive to use, students were happy to demonstrate how to use the site and seemed more assertive in their comments. This was perhaps the strongest evidence that platform design impacts user experience and that librarians need to understand how platforms vary in order to purchase content on platforms that optimize user experience.
Study Limitations and Recommendations for Further Research
Although the usability testing revealed local user preferences, the results are not generalizable to all students. The convenience sample of students who participated in the study may not represent our larger student population in terms of previous e-book experience or fields of study. The majority of students in the convenience sample were from STEM (science, technology, engineering, or math) majors, and we recognize that these students might use e-books differently than students in other disciplines. We also know that many of the participants had some experience with e-books prior to the study. We collected information regarding prior e-book use in a pre-screening survey but were unable to use this information because the pre-screening survey did not include a consent form. Both areas would be interesting avenues for further research.
There are many ways to identify usability problems and measure user satisfaction. This study used the think-aloud protocol and task-based usability testing. While these techniques are designed to produce qualitative and quantitative data, there are limitations and room for error in their application. For example, task-based usability testing is predicated on an observer leading the user through a script of predefined tasks, but it is difficult not to deviate from the script in order to follow the subject’s flow of information-seeking behaviour. The study was also limited by the tasks that we asked students to complete. We tried to create tasks that mimicked what we thought was typical student behaviour, and that may have skewed our results. For example, we did not ask students to download the entire e-book, although if we had, we hypothesize that fewer students would have preferred the aggregator platforms. Likewise, the testing environment was based on a false need for information. Students may have behaved or answered differently if the need was real and attached to an outcome that mattered to them, such as a grade on an assignment.
The think-aloud protocol also has certain limitations. Some students respond more naturally to verbal communication; others might have given us different answers if they had been asked to respond in writing. Some students may also have been influenced by the perceived “power dynamics” of a faculty librarian and student relationship. This was mitigated by the consent form and the script that assured students that this wasn’t “a test of your knowledge, and there are no right or wrong answers.” However, their responses might have been skewed towards what they thought the librarian wanted to hear.
Finally, the students’ ranking of e-book platforms was limited to the platforms that they examined. Because more students examined the aggregators than the publisher platforms, the results regarding the aggregators are arguably more valid. If students had been asked to test a different publisher platform, it may have changed their opinion relative to the aggregators that they tested. However, since most books are available on only one publisher platform and one or more aggregator platforms, rather than on multiple publisher platforms, the comparison between publisher and aggregator platforms remains the most relevant finding, more so than a comparison of one publisher platform to another.
Conclusion

There are relatively small differences between major e-book aggregators in terms of cost, content, and coverage. As such, user feedback about their preferred platform was critical to selecting a default option for the library’s approval plan and demand-driven e-book programs. This study identified strengths and weaknesses of academic e-book platforms based on students’ experiences and preferences. These characteristics can be used alongside other factors, such as pricing and accessibility, when selecting a title that is available on more than one platform.
The results of the usability tests in this study indicated a preference for Ebook Central over EBSCO and a strong preference for the aggregators over publisher platforms. We expected that students would prefer the publisher platforms because those platforms rarely impose limits on printing and downloading. Students in this study, however, struggled to navigate the publisher platforms, and the overall perception was that they are not as easy to use as aggregators with clearly defined menus and icons. This suggests that students value usability and are even willing to accept some printing and downloading restrictions in exchange for an intuitive, user-friendly platform. Although students will find a way to access the materials they need, all e-book providers should follow usability design principles that serve the needs of students.
This study explored students’ information-seeking behaviour on e-book platforms and identified features and functionality that students prefer on these platforms. It confirmed the results of many previous studies that found that usability issues influence user perceptions and success rates with e-books. Until we are able to build completely intuitive resources, having a better understanding of user expectations will help us select books on the platforms that best meet them.
References

Anson, C., & Connell, R. (2009). E-book collections. Washington, DC: Association of Research Libraries.
Berg, S. A., Hoffmann, K., & Dawson, D. (2010). Not on the same page: Undergraduates’ information retrieval in electronic and print books. The Journal of Academic Librarianship, 36(6), 518–525. https://doi.org/10.1016/j.acalib.2010.08.008
Bivens-Tatum, W. (2014). The mess of ebooks. Peer to Peer Review. Retrieved from http://lj.libraryjournal.com/2014/10/opinion/peer-to-peer-review/the-mess-of-ebooks-peer-to-peer-review/
Carter, D. S., Grochowski, P. F., Lalwani, L. N., Nicholls, N. H., & Samuel, S. M. (2013). Students, vendor platforms, and e-textbooks: Using e-books as e-textbooks. ASEE Annual Conference. Retrieved from http://www.asee.org/public/conferences/20/papers/7427/view
Cassidy, E. D., Martinez, M., & Shen, L. (2012). Not in love, or not in the know? Graduate student and faculty use (and non-use) of e-books. The Journal of Academic Librarianship, 38(6), 326–332. https://doi.org/10.1016/j.acalib.2012.08.005
Cataldo, T. T., & Leonard, M. (2015, Spring). E-STEM: Comparing aggregator and publisher e-book platforms. Issues in Science and Technology Librarianship, (80). https://doi.org/10.5062/F4FJ2DSP
Hobbs, K., & Klare, D. (2016). Are we there yet?: A longitudinal look at e-books through students’ eyes. Journal of Electronic Resources Librarianship, 28(1), 9–24. https://doi.org/10.1080/1941126X.2016.1130451
JISC. (2009). JISC national e-books observatory project: Key findings and recommendations. Retrieved from http://observatory.jiscebooks.org/reports/jisc-national-e-books-observatory-project-key-findings-and-recommendations/
Kichuk, D. (2015). Loose, falling characters and sentences: The persistence of the OCR problem in digital repository e-books. portal: Libraries and the Academy, 15(1), 59–91. https://doi.org/10.1353/pla.2015.0005
Library Journal. (2012). Ebook usage in U.S. academic libraries. The Digital Shift. Retrieved from http://www.thedigitalshift.com/research/ebook-usage-reports/academic/
Library Journal. (2016). Ebook usage in U.S. academic libraries. Retrieved from https://s3.amazonaws.com/WebVault/research/LJ_2016_EbookUsage_AcademicLibraries.pdf
Muir, L., & Hawes, G. (2013). The case for e-book literacy: Undergraduate students’ experience with e-books for course work. The Journal of Academic Librarianship, 39(3), 260–274. https://doi.org/10.1016/j.acalib.2013.01.002
Mune, C., & Agee, A. (2015, March). Ebook showdown: Evaluating academic ebook platforms from a user perspective. Paper session presented at the Association of College and Research Libraries 2015 Conference, Portland, OR. Retrieved from https://works.bepress.com/ann_agee/19/
Nielsen, J. (2012). How many test users in a usability study? Nielsen Norman Group: Evidence-Based User Experience Research, Training, and Consulting. Retrieved from https://www.nngroup.com/articles/how-many-test-users/
O’Neill, L. C. (2009). A usability study of e-book platforms. Retrieved from https://cdr.lib.unc.edu/record/uuid:9a109741-0d0e-4a11-b02b-21893c32f8b9
Roncevic, M. (2013). Criteria for purchasing e-book platforms. Library Technology Reports, 49(3), 10–13. Retrieved from https://journals.ala.org/index.php/ltr/article/view/4305
Slater, R. (2010). Why aren’t e-books gaining more ground in academic libraries? E-book use and perceptions: A review of published literature and research. Journal of Web Librarianship, 4(4), 305–331. https://doi.org/10.1080/19322909.2010.525419
Thomas, J., & Chilton, G. (2016). Library e-book platforms are broken: Let’s fix them. In S. M. Ward, R. S. Freeman, & J. M. Nixon (Eds.), Academic e-books: Publishers, librarians, and users (pp. 249–262). West Lafayette, IN: Purdue University Press.
U.S. Department of Health & Human Services. (2018). What & why of usability. Retrieved from https://www.usability.gov/how-to-and-tools/methods/usability-testing.html
Wiersma, G., & Tovstiadi, E. (2017). Inconsistencies between academic e-book platforms: A comparison of metadata and search results. portal: Libraries and the Academy, 17(3), 617–648. https://doi.org/10.1353/pla.2017.0037
Zhang, T., Niu, X., & Promann, M. (2017). Assessing the user experience of e-books in academic libraries. College & Research Libraries, 78(5). https://doi.org/10.5860/crl.78.5.578
Appendix
Usability Tasks

1) Task 1: Evaluating the E-Book Landing Page (landing page, title bibliographic info, native reader)
   a) Have you ever seen this webpage before?
   b) What information do you expect to see here?
      i) What information is most helpful?
2) Task 2: Evaluating the Bibliographic Information
   a) Did you notice this date? What does this date mean to you? (any date on landing page; publication or otherwise)
   b) Why do you think these names are here (point to authors, editors, etc. names)?
   c) Did you notice these? (point to subject terms). What do these mean to you?
3) Task 3: Finding and Using Citation Tools
   a) You need to cite this book for your paper. How would you use this page to do that?
4) Task 4: Navigating to a Specific Chapter or Page Number
   a) How would you start reading the e-book from this page?
   b) What do you expect to see when you click on this (read e-book, open e-book, etc.)?
   c) Your professor told you to start reading at chapter ##. It starts on page ##. Starting from this page, how would you do that?
5) Task 5: Finding and Using Annotation Tools
   a) (after they navigate) As you’re reading, you want to take notes for your class. Would you do that here in the e-book? If so, how?
6) Task 6: Searching and Evaluating Results
   a) You need to find information on _______ in this book. How would you do that? [If searching is not their first response, prompt them to search within the book]
   b) [Note how many results]
   c) How many search results did you expect?
   d) How do you think these search results are ranked? Why is this one (point to top one) first?
   e) Which search results are more useful?
   f) Where do you expect to go when you click on this search result?
7) Task 7: Printing, Saving, and Downloading
   a) You want to print this page to read later. How would you do that?
   b) You want to save this chapter to read later. How would you do that?
   c) What would you expect to see if you downloaded the book?
8) Task 8: Which Platform Would You Prefer to Use?
   a) Now that you have seen 3 different versions of this book, which would you prefer to use? [The student will be given three print-outs, one showing a screenshot of each landing page of the e-books that they used during the testing.]
   b) Rank the versions in order of preference.