Research Article

 

Measuring Scholarly Productivity of Long Island Educational Institutions: Using Web of Science and Scopus as a Tool

 

Clara Tran

Science Librarian

Science and Engineering Library

Stony Brook University

Stony Brook, New York, United States of America

Email: [email protected]

 

Selenay Aytac

Associate Professor

B. Davis Schwartz Memorial Library

Long Island University

Brookville, New York, United States of America

Email: [email protected]

 

Received: 1 Jan. 2016       Accepted: 25 May 2016  

 

 

© 2016 Tran and Aytac. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike 4.0 International License (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

 

 

Abstract

 

Objective – This paper explores how to utilize two well-known library databases, Thomson Reuters’ Web of Science and Elsevier’s Scopus, to quantify the scholarly productivity of Long Island educational institutions.

 

Methods – Institutions located in the Long Island region, within Nassau and Suffolk counties, including State University of New York (SUNY) colleges, private institutions, and technical schools, were examined over a 14-year period (2000–2013). Eight Long Island institutions were represented in both databases and were included in the study.

 

Results – Of the eight institutions, Stony Brook University produced the most publications indexed in Web of Science and Scopus during 2000–2013. Cold Spring Harbor Laboratory yielded the second most publications in both databases during the same period, but it produced the highest quality publications among the remaining institutions (Stony Brook University was excluded from that comparison). Although the annual growth rates of Farmingdale State College and New York Institute of Technology increased dramatically in both Web of Science and Scopus, these large proportional increases did not represent large increases in total output. Additionally, some institutions had a higher number of publications indexed in Web of Science than in Scopus, and others had a higher number of publications indexed in Scopus than in Web of Science.

 

Conclusions – Because data were collected from Long Island institutions of various sizes, the number of faculty members employed may have affected the number of publications. Thus, publication data in this study cannot be used to compare institutional rankings. Institutions of a similar type and with similarly sized faculty bodies should be selected for comparison. Due to the different coverage and scope of Web of Science and Scopus, institutions should use both databases to examine their scholarly output. Furthermore, institutions should consider using altmetrics to capture the varied impacts of scholarly output and complement the traditional metrics.

 

Introduction

 

For decades, the traditional assessment of institutions’ scholarly or research productivity has relied on scholarly publishing (Slutsky & Aytac, 2014). As research is needed for institutions to remain relevant and sustain their reputation for knowledge discovery, monitoring scholarly productivity assessment data is useful for individual, departmental, and university level evaluations. University administrators can use scholarly productivity data for institutional productivity assessment and annual budget decisions. Additionally, they can provide information on the overall performance of their institutions to obtain government funding and support accreditation decisions (Amara, Landry, & Halilem, 2015).

 

Although the first statistical analysis of scientific literature was conducted by Alfred J. Lotka in 1926, the term “bibliometrics” was coined separately by Pritchard and by Nalimov and Mulchenko in 1969 (Glanzel, 2003). Roemer and Borchardt (2015) defined bibliometrics as a quantitative tool to measure and analyze the research impact of print-based scholarly output, which can be obtained by using proprietary databases or free online ranking resources. Bibliometric analysis is not only a useful method for studying scholarly productivity and institutions’ citation impact (Wang, Fu, & Ho, 2011) but also one of the most used quantitative data collection methods for investigating publication patterns within a given field (Aytac, 2010). Common bibliometric indicators include the number of publications, number of citations, and journal impact factors (Wang et al., 2011). According to Roemer and Borchardt (2015), bibliometric measures include Times Cited (at the individual contribution level); Impact Factor, Immediacy Index, Cited Half-Life, Eigenfactor and Article Influence Score, SCImago Journal Rankings, H5-Index, and H5-Median (at the journal level); h-index and i10-index (at the author level); and Essential Science Indicators Rankings, SCImago Institutions Rankings, and Snowball Metrics (at the institutional level).

 

In recent years, interest in institutional scholarly productivity ratings has increased. These ratings are generally created using bibliometric research tools, such as Thomson Reuters’ Web of Science (WoS), Elsevier’s Scopus, and Google Scholar.

 

For over forty years, WoS was the only database that tracked citation references (Meho & Yang, 2007; Li, Burnham, Lemley, & Britton, 2010) and produced large-scale bibliometric statistics (Archambault, Campbell, Gingras, & Larivière, 2009). Launched in 2004, Scopus became a good alternative to WoS (Manafy, 2005; Dess, 2006; Vieira & Gomes, 2009). Likewise, Google Scholar, also created in 2004 (Adriaanse & Rensleigh, 2013), can be utilized for scholarly productivity data collection (Orduna-Malea & Aytac, 2015).

 

Moreover, WoS and Scopus provide scholarly productivity data for institutions, departments, and individual faculty members. They also deliver reliable and comparable trend data that can be used to compare the research strength of institutions. The authors of this study used WoS and Scopus to explore the institutional scholarly productivity of Nassau and Suffolk counties in Long Island. These bibliometric tools served the following purposes in this study: (1) obtaining scholarly productivity data for each institution, (2) collecting the citation data and h-index for each institution, and (3) benchmarking these institutions for annual trend data.

 

Data for Long Island institutions for the period of 2000–2013 were collected in January 2015. Eight Long Island institutions were represented in both databases and were included in this study. Due to the varying sizes of the institutions examined, this study was limited to analyzing the growth of each institution instead of comparing rankings among the institutions.

 

Literature Review

 

WoS and Scopus have been widely used for bibliometric analysis. In 2011, Sicilia, Sánchez-Alonso, and García-Barriocanal compared computer science-related journals and found that journal impact factors included in WoS and Scopus ranking lists were highly correlated and comparable. Sarkozy, Slyman, and Wu (2015) studied the publication and citation activity of individual researchers in three health sciences departments and suggested that faculty and administrators should not rely completely on citation counts as a measure of productivity due to name ambiguities and database limitations. In 2012, Bergman studied the citations of social work literature and found that WoS provided the fewest citation counts while Scopus provided the highest, even though both databases had a similar coverage pattern. Archambault et al. (2009) compared science, natural sciences, and engineering data based on “the number of papers and citations received by country” and analyzed the correlation between a country’s production and its ranking among the countries examined in their study (p. 1320). They concluded that WoS and Scopus are “robust tools for measuring science at the country level” and suggested the study be repeated at the institutional level (p. 1325). To understand how differently WoS and Scopus index publications at the institutional level, this study examined the scholarly output of Long Island educational institutions for the period 2000–2013. WoS was further used to collect institutional citation data and h-indexes for measuring research quality.

 

Bibliometric analysis has limitations. Roemer and Borchardt (2015) suggested scholars check other sources for times-cited counts because the content overlap among WoS, Scopus, and Google Scholar varies by discipline. Furthermore, the impact factor does not apply well to disciplines that are not centered on journals and journal articles, and it excludes essays and extended opinion works that have scholarly value. These limitations can inflate or deflate the impact factor (Roemer & Borchardt, 2015). Levine-Clark and Gil (2009) found that WoS does not fully measure a scholar’s actual impact since it does not index all peer-reviewed journals and “other types of resources” (p. 45). In his study of journals indexed in Google Scholar, PubMed, and Scopus, Chen (2013) found that Scopus does not index Green OA (open access), which “refers to self-archived articles hosted on OA Web sites such as institutional repositories” (p. 244). Because no single metric can fully measure true impact, librarians should advise researchers, faculty, and graduate students to use both traditional and nontraditional measures for a better reflection of the impact of their scholarly work (Roemer & Borchardt, 2015).

 

To understand the coverage of WoS and Scopus, the authors retrieved information from the respective products’ websites. Thomson Reuters (2016a) indicates that WoS Core Collection includes five indexes and two chemistry databases:

 

 

Elsevier (2016b) also indicates Scopus’s coverage of various types of materials, including the following:

 

 

Several studies have compared the coverage, scope, and methodology of WoS and Scopus. López-Illescas, Moya-Anegón, and Moed (2008) agreed that the two databases differ in scope, data volume, and coverage. Levine-Clark and Gil (2009) stated that in addition to covering mostly journals, Scopus also “includes conference proceedings, book series, and trade publications” (p. 33). Gavel and Iselid (2008) also found that Scopus covers a larger number of serial publications than WoS. In 2009, Levine-Clark and Gil studied citations for business and economics journals and reported that Scopus retrieved slightly more citations than WoS because Scopus includes 8,000 more journals than WoS. Dess (2006) and Li et al. (2010) found that, due to different coverage, WoS allows for a longer period of citation tracking than Scopus; Scopus only covers citation tracking from 1996 onward (Li et al., 2010). In their study of content verification and quality of South African environmental sciences journals, Adriaanse and Rensleigh (2013) found that Scopus provides the most comprehensive coverage of title, author, and volume number compared with WoS and Google Scholar.

 

The scope of disciplines covered by the two databases also varies. Elsevier (2016b) shows that Scopus’s subject areas include the Life Sciences (15%), Health Sciences (32%), Physical Sciences (29%), and Social Sciences (24%). Dess's study in 2006 showed that Scopus is heavily focused on the health and life sciences with less emphasis on physical science, mathematics, psychology, and social sciences and even less emphasis on business and marketing. Li et al. (2010) agreed that Scopus provides strong coverage in health sciences and physical sciences but not the other disciplines.

 

WoS provides two categories of searches: bibliographic search and cited reference search (Li et al., 2010). Dess (2006) and Li et al. (2010) stated that bibliographic information can be found using the basic, advanced, and author searches. The basic or advanced search allows users to obtain specific information from search results, such as the “numbers of articles in subject areas, document type, authors, source titles, publication years, institutions, funding agencies, languages, and countries” (Li et al., 2010, p. 198). Furthermore, users can obtain a citation report that includes “the search results found, sum of the times cited, average citations per item, and h-index number” from search results (p. 198). WoS also includes unique features, such as Distinct Author Set and Citation Map. Additionally, WoS provides a useful statistical tool, Journal Citation Reports, which measures journal impact factor (Levine-Clark & Gil, 2009).

 

Likewise, Li et al. (2010) described Scopus’s Author Identifier, which retrieves matches from “their affiliation, address, subject area, source title, dates of publication citations, and co-authors,” as the strength of the database (p. 201). Similar to WoS, Scopus provides a cited reference list when searching for an author. Citation Analysis allows users to “view the articles that cited the original articles” and the h-index provides graphs that display publication record strength (p. 201). Furthermore, Scopus provides a journal analyzer that allows users to compare journals in terms of “number of citations, articles published, and percentage not cited” (p. 202). Gavel and Iselid (2008) observed that it is more difficult to study the overlapping coverage at an article level than at the journal level because overlapping coverage at the article level requires users to identify “the bibliographic subfields of individual articles cited” (p. 9).

 

Although WoS and Scopus provide reliable bibliographic data for institutions, they have limitations. The two databases have different criteria for indexing publications. Goodwin (2014) stated that the “Organization-Enhanced” option does not include all organizations that are indexed in WoS. In Scopus, a document that does not have sufficient citation information may not be correctly assigned to the affiliation from which the publication originates (Elsevier, 2016a). The two databases also include different document types, disciplines, languages, and time periods (Zhang, 2014). Both databases have further issues related to journal coverage. First, they have limited scholarly journal coverage based on the information provided on the products’ sites. Second, they have limited coverage of open access journals: although WoS includes over 12,000 high-impact journals, including open access journals (Thomson Reuters, 2016a), and Scopus indexes 4,200 open access journals (Elsevier, 2016a), neither database includes all the journal titles in the Directory of Open Access Journals (DOAJ, 2016). Third, they have limited coverage of non-periodical resources, such as monographs and dissertations. Further, Scopus covers patents (Elsevier, 2016b) but WoS does not. Additionally, limited coverage of non-Western publications and the language bias of these indexes may affect publication counts.

 

Although bibliometric research methods, particularly citation indexes, have received considerable attention in the literature, researchers have noted some limitations of these indexes. Okubo and Miquel (1990) pointed out that in some cases authors’ affiliations are not a true indicator of the origin of the corresponding research. Since co-authorships are primarily identified through, and can only be tracked by, authors’ affiliation data, the scope of co-authorship studies based on WoS’s indexes may be limited.

 

Schoepflin (1990) underlined the limitations of the SSCI, particularly its “representativity” problem, which concerns the equal representation of each country’s research publications. The main problem with representativity is largely related to the publication language of journals: journal articles published in non-mainstream languages are unlikely to appear in either WoS index. Similarly, Braun, Glanzel, and Schubert (2000) evaluated the representativeness of the SCI’s journal coverage at the country level. This is a very valid issue, especially for non-Western or non-English-speaking countries. For instance, only a few Turkish journals are listed in Journal Citation Reports, and both the SCI and SSCI are lacking in their representation of most developing countries due to language bias. As English is the lingua franca of science, research done in other languages is often lost. The language bias of the WoS database has been repeatedly discussed in the literature: Cole and Phelan (1999), Osareh and Wilson (1997), and Barrios, Borrego, Ollé, Vilaginés, and Somoza (2008) have all pointed to this limitation in reaching non-English scientific journals. In the same vein, Mongeon and Paul-Hus (2016) reported that Scopus has similar limitations despite its much larger coverage. The authors conclude that both WoS and Scopus share these limitations.

 

Methods

 

There are numerous ways to quantify an institution’s research or scholarly productivity. One way is to count the number of scholarly outputs produced by the institution. These data generally consist of the number of publications by faculty, students, and staff affiliated with the institution. In January 2015, the period of 2000–2013 was chosen (instead of 2000–2014) for data collection because publications from 2014 might not yet have been fully indexed in WoS and Scopus. Document types including articles, reviews, proceedings, books, and book chapters were included in this study. In addition to the publication counts collected from the two databases for measuring research productivity, citation counts were collected for measuring research quality on April 19, 2016, using only WoS; Scopus was not able to provide a large dataset for many of the institutions for the selected period of 2000–2013 in a single query.

 

Samples

 

Eighteen Long Island institutions were identified for data collection covering fourteen years (2000–2013) of scholarly productivity. Cold Spring Harbor Laboratory was included in the study because of its PhD program in biological sciences. Brookhaven National Laboratory was not considered for this study because it is not an academic institution.

 

The bibliometric analysis revealed that only eight of the eighteen institutions were represented in both the WoS and Scopus databases. Data for these eight institutions, which range from private to public colleges and universities, were used for further analysis. The eight institutions are listed below:

 

  1. Adelphi University
  2. Cold Spring Harbor Laboratory (Watson School of Biological Sciences)
  3. Farmingdale State College
  4. Hofstra University
  5. Long Island University
  6. New York Institute of Technology
  7. Stony Brook University
  8. SUNY Old Westbury

 

Procedures

 

The data of the eighteen institutions were collected from the WoS and Scopus databases at the end of January 2015. Data collection involved several steps. The annual number of scholarly publications per institution was extracted from WoS and Scopus and exported to an Excel spreadsheet for analysis and calculations. Then, because the eight institutions were represented in both databases, their data were filtered for further analysis. Finally, the annual growth rate of these eight institutions’ scholarly productivity was calculated for each year using this formula:

Annual growth rate (%) = ((Number of publications in a given year − Number of publications in the base year) / Number of publications in the base year) × 100

where the base year is 2000, the first year of the study period.

For Farmingdale State College, the calculations in WoS used 2003 as the base year because no publications were recorded from 2000 to 2002.
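To illustrate the calculation, the short Python sketch below (not part of the original study; the counts are taken from Tables 1, 1a, 2, and 2a) reproduces two of the 2013 growth rates reported in Figures 1 and 2.

def growth_rate(counts, year, base_year):
    """Percent growth of the publication count in `year` relative to `base_year`."""
    return round((counts[year] - counts[base_year]) / counts[base_year] * 100)

# Farmingdale State College in WoS (base year 2003; counts from Tables 1 and 1a).
farmingdale_wos = {2003: 3, 2013: 13}
print(growth_rate(farmingdale_wos, 2013, 2003))   # 333, i.e. +333% as in Figure 1

# New York Institute of Technology in Scopus (base year 2000; Tables 2 and 2a).
nyit_scopus = {2000: 22, 2013: 155}
print(growth_rate(nyit_scopus, 2013, 2000))       # 605, i.e. +605% as in Figure 2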

 

During the data collection process, the “Organization-Enhanced” option in WoS and the “Affiliation Search” option in Scopus were used. In WoS, the “Organization-Enhanced” option allows users to find publications from institutions with name variants. Users can either enter the organization name in the search field or click the “Select from Index” link to search for the organization. This “Select from Index” link provides users with options to either select the organization name from the “Organization-Enhanced” list or enter the name in the “Find” field.  Additionally, selecting the preferred name from the list or entering the preferred name in the “Find” field yields a more accurate result because the result is retrieved from the addresses linked to that organization (Thomson Reuters, 2015). In this study, preferred names were mainly collected from the “Organization-Enhanced” list without expanding the “View Details” option to add or exclude any affiliations. Data for Farmingdale State College and the New York Institute of Technology, however, were collected using the search field.

 

Like WoS, Scopus also allows users to search for an organization using the “Affiliation Search” option. When a list of affiliations is generated, the affiliations of the institutions can be selected from the list. This list of affiliations provides links to documents and any available information about the affiliations, such as affiliation ID, name variations, and address information (Elsevier, 2016a). Data for Farmingdale State College, however, were collected using the “Document Search,” followed by “Affiliation Name.”

 

Results

 

Scholarly Productivity Data 2000–2013 for Each Institution

 

Based on WoS data from 2000 to 2013, Stony Brook University produced the most scholarly publications (33,406), as shown in Tables 1 and 1a. Stony Brook University was followed by Cold Spring Harbor Laboratory (2,935), Hofstra University (2,507), Adelphi University (1,446), Long Island University (1,237), SUNY Old Westbury (424), Farmingdale State College (61), and New York Institute of Technology (16).

 

In Scopus, for the same period of 2000 to 2013, Stony Brook University also produced the most scholarly publications (30,759), as shown in Tables 2 and 2a. Stony Brook University was followed by Cold Spring Harbor Laboratory (2,834), Long Island University (2,369), Hofstra University (2,229), Adelphi University (1,415), New York Institute of Technology (1,040), SUNY Old Westbury (320), and Farmingdale State College (96).

 

Times Cited and H-index of Each Institution

 

Data collected in April 2016 revealed that Cold Spring Harbor Laboratory produced the highest quality research papers, followed by Hofstra University, Long Island University, Adelphi University, SUNY Old Westbury, Farmingdale State College, and New York Institute of Technology, as shown in Table 3. Stony Brook University was not included in this comparison. According to Thomson Reuters (2016b), the Citation Report feature in WoS only allows a search of citation activity for up to 10,000 records. Because Stony Brook University’s scholarly output from 2000 to 2013 comprised 33,790 records, multiple searches for citation activity were required (see Table 4). Additionally, some institutions showed slightly higher scholarly output counts than in the data collected in January 2015; this did not affect the results of the analysis.

 

 

Table 1

Web of Science – Institutional Scholarly Productivity from 2000–2006

Name of Institution | 2000 | 2001 | 2002 | 2003 | 2004 | 2005 | 2006
Adelphi University | 71 | 79 | 57 | 52 | 92 | 82 | 94
Cold Spring Harbor Laboratory | 165 | 174 | 210 | 208 | 219 | 213 | 208
Farmingdale State College | 0 | 0 | 0 | 3 | 2 | 1 | 3
Hofstra University | 122 | 139 | 135 | 143 | 136 | 173 | 168
Long Island University | 67 | 91 | 90 | 87 | 80 | 80 | 77
New York Institute of Technology | 1 | 0 | 1 | 1 | 2 | 0 | 0
Stony Brook University | 2102 | 2127 | 2059 | 2087 | 2274 | 2258 | 2359
SUNY Old Westbury | 43 | 26 | 30 | 22 | 27 | 45 | 40

 

 

Table 1a

Web of Science – Institutional Scholarly Productivity from 2007–2013

Name of Institution | 2007 | 2008 | 2009 | 2010 | 2011 | 2012 | 2013 | Total
Adelphi University | 112 | 123 | 105 | 132 | 133 | 154 | 160 | 1446
Cold Spring Harbor Laboratory | 221 | 231 | 208 | 201 | 216 | 218 | 243 | 2935
Farmingdale State College | 5 | 6 | 6 | 8 | 5 | 9 | 13 | 61
Hofstra University | 140 | 184 | 178 | 192 | 263 | 275 | 259 | 2507
Long Island University | 79 | 109 | 105 | 96 | 89 | 102 | 85 | 1237
New York Institute of Technology | 1 | 1 | 0 | 2 | 3 | 0 | 4 | 16
Stony Brook University | 2418 | 2487 | 2558 | 2466 | 2586 | 2766 | 2859 | 33406
SUNY Old Westbury | 21 | 34 | 37 | 30 | 22 | 26 | 21 | 424

 

 

Table 2

Scopus – Institutional Scholarly Productivity from 2000–2006

Name of Institution | 2000 | 2001 | 2002 | 2003 | 2004 | 2005 | 2006
Adelphi University | 43 | 43 | 41 | 55 | 81 | 83 | 99
Cold Spring Harbor Laboratory | 152 | 149 | 152 | 210 | 215 | 236 | 197
Farmingdale State College | 1 | 2 | 1 | 0 | 1 | 3 | 3
Hofstra University | 56 | 81 | 86 | 131 | 131 | 193 | 168
Long Island University | 109 | 114 | 129 | 123 | 147 | 165 | 185
New York Institute of Technology | 22 | 23 | 24 | 39 | 46 | 36 | 71
Stony Brook University | 1754 | 1643 | 1653 | 1846 | 2045 | 2188 | 2334
SUNY Old Westbury | 35 | 18 | 23 | 21 | 25 | 33 | 23

 

 

Table 2a

Scopus – Institutional Scholarly Productivity from 2007–2013

Name of Institution | 2007 | 2008 | 2009 | 2010 | 2011 | 2012 | 2013 | Total
Adelphi University | 135 | 105 | 128 | 131 | 155 | 157 | 159 | 1415
Cold Spring Harbor Laboratory | 221 | 217 | 203 | 189 | 206 | 230 | 257 | 2834
Farmingdale State College | 7 | 8 | 13 | 23 | 11 | 13 | 10 | 96
Hofstra University | 156 | 182 | 183 | 192 | 231 | 237 | 202 | 2229
Long Island University | 179 | 202 | 193 | 217 | 185 | 213 | 208 | 2369
New York Institute of Technology | 81 | 95 | 98 | 106 | 112 | 132 | 155 | 1040
Stony Brook University | 2333 | 2334 | 2368 | 2349 | 2502 | 2611 | 2799 | 30759
SUNY Old Westbury | 14 | 22 | 27 | 23 | 19 | 19 | 18 | 320

 

 

Table 3

Web of Science – Institutional Scholarly Output, Times Cited, and h-index from 2000–2013

Name of Institution | Scholarly Output | Times Cited | h-index
Adelphi University | 1452 | 11387 | 48
Cold Spring Harbor Laboratory | 2960 | 313392 | 255
Farmingdale State College | 62 | 319 | 11
Hofstra University | 3270 | 26042 | 63
Long Island University | 1242 | 13517 | 51
New York Institute of Technology | 17 | 75 | 5
Stony Brook University* | 33790 | * | *
SUNY Old Westbury | 424 | 5427 | 37

*Please see Table 4.

 

 

Table 4

Web of Science – Stony Brook University Scholarly Output, Times Cited, and h-index from 2000–2013

Year | Scholarly Output | Times Cited | h-index
2000–2003 | 8462 | 310600 | 211
2004–2007 | 9389 | 278826 | 197
2008–2010 | 7578 | 153301 | 144
2011–2013 | 8361 | 99894 | 103
Total | 33790 | 842621 |

 

 

 

Benchmark Institutions for Annual Trend Data

 

Each institution’s publication growth was measured as the percentage increase in annual publications indexed in WoS and Scopus relative to the base year. The following figures benchmark all the institutions and give a clear view of each institution’s productivity growth between 2000 and 2013.

 

In 2013, Farmingdale State College had the highest annual growth rate in WoS (+333%), followed by New York Institute of Technology (+300%), Adelphi University (+125%), Hofstra University (+112%), Cold Spring Harbor Laboratory (+47%), Stony Brook University (+36%), Long Island University (+27%), and SUNY Old Westbury (-51%). These growth rates are shown in Figure 1.
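As a worked example using the counts in Tables 1 and 1a, Hofstra University had 122 WoS-indexed publications in 2000 and 259 in 2013, giving a 2013 growth rate of (259 − 122) / 122 × 100 ≈ +112%, the figure reported above.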

 

 

Figure 1

Web of Science – Institutional annual growth rate from 2000–2013.

 

 

Figure 2

Scopus – Institutional annual growth rate from 2000–2013.

 

 

In 2013, as shown in Figure 2, Farmingdale State College also had the highest annual growth rate in Scopus (+900%), followed by New York Institute of Technology (+605%), Adelphi University (+270%), Hofstra University (+261%), Long Island University (+91%), Cold Spring Harbor Laboratory (+69%), Stony Brook University (+60%), and SUNY Old Westbury (-49%).

 

Comparison of the Two Databases on Scholarly Productivity of Each Long Island Institution for 14 Years from 2000–2013

 

Table 5 provides an annual comparison between WoS and Scopus for each institution. For Adelphi University, the number of years in which more publications were indexed in WoS (7) equaled the number of years in which more were indexed in Scopus (7).

 

 

Table 5

Institutional Annual Comparison between Web of Science and Scopus from 2000–2013

 

Name of Institution | Years with more publications indexed in Web of Science | Years with more publications indexed in Scopus | Years with the same number of publications indexed in Web of Science and Scopus
Adelphi University | 7 | 7 | 0
Cold Spring Harbor Laboratory | 9 | 4 | 1
Hofstra University | 9 | 3 | 2
Farmingdale State College | 3 | 10 | 1
Long Island University | 0 | 14 | 0
New York Institute of Technology | 0 | 14 | 0
Stony Brook University | 14 | 0 | 0
SUNY Old Westbury | 14 | 0 | 0

 

 

Cold Spring Harbor Laboratory had more years in which WoS indexed a higher number of its publications (9) than years in which Scopus did (4), with one year in which both databases indexed the same number. Similarly, Hofstra University had more years in which WoS indexed a higher number (9) than Scopus (3), with two years of equal counts. On the other hand, Farmingdale State College had more years in which Scopus indexed a higher number (10) than WoS (3), with one year of equal counts.

 

Data for Long Island University and New York Institute of Technology showed that Scopus indexed more of their publications than WoS in every single year from 2000 to 2013 (14 years versus 0). In contrast, for Stony Brook University and SUNY Old Westbury, WoS indexed more publications than Scopus in every single year from 2000 to 2013 (14 years versus 0).

 

Discussion

 

In terms of publications, the data showed that Stony Brook University produced the most publications during 2000–2013 in both WoS and Scopus. The Carnegie Classification of Institutions of Higher Education (n.d.) classifies Stony Brook University as a research university with very high research activity. Additionally, Stony Brook University employed 2,471 faculty members in the fall of 2013 (Stony Brook University, 2015); the number of faculty members employed may have an impact on the number of publications. Although Cold Spring Harbor Laboratory produced the second most scholarly output during 2000–2013 in both WoS and Scopus among the eight institutions, it produced the highest quality publications compared with the other six institutions (Stony Brook University was excluded from the citation comparison).

 

Regarding the institutional annual growth rate, Figures 1 and 2 revealed that the annual growth rates of Farmingdale State College and New York Institute of Technology increased dramatically in both WoS and Scopus. Slutsky and Aytac (2014) explained that a large proportional increase in annual productivity does not represent a large increase in total value; the data presented should be viewed as trend data and no conclusion should be made from these observations. However, the presented data can be useful to see the general trend in scholarly growth among the Long Island institutions. 

 

Additionally, the graphs provided background information regarding annual scholarly productivity per institution for this investigation. For instance, Adelphi University, Cold Spring Harbor Laboratory, Hofstra University, Stony Brook University, and SUNY Old Westbury, all of which had the same or a higher number of publications indexed in WoS than in Scopus during 2000–2013, are either affiliated with medical schools or heavily involved in scientific research; hence, their scholarly productivity in WoS is higher. Goodwin (2014) also stated that the publications WoS indexes are heavily weighted towards the sciences, particularly the life sciences. On the other hand, Farmingdale State College, Long Island University, and New York Institute of Technology had a higher number of publications indexed in Scopus due to publications of materials such as dissertations and theses in the humanities. Archambault et al. (2009) observed that WoS and Scopus do not have the same system of categorizing documents; the two databases may “label the same documents differently” (p. 1321). To see what the spread of document types was in each database, the documents from Long Island University indexed in WoS and Scopus were identified for further analysis, as shown in Table 6. For this specific case, the data were taken from WoS and Scopus in June 2015.

 

 

Table 6

Document Types in WoS and Scopus for Long Island University in 2013

Long Island University
Document Type | WoS | Scopus
Article | 62 | 168
Book | 0 | 4
Book Chapter | 0 | 12
Book Review | 3 | 0
Conference Paper/Proceeding Paper | 1 | 8
Editorial/Editorial Material | 2 | 3
Meeting Abstract | 14 | 0
Note | 0 | 5
Review | 3 | 9
Total | 85 | 209

 

 

Figure 3

Number of documents in WoS and Scopus as well as the overlapping citations in both databases for Long Island University in 2013.

 

 

Long Island University had 61 items indexed in both WoS and Scopus. See Figure 3. Among these 61 items, the document types included article, review, editorial, book chapter, and conference proceeding as displayed in Table 7. In this subset, Scopus indexed 54 articles while WoS indexed 55 articles. Scopus indexed 2 conference papers while WoS indexed only one. In this case, the insignificant difference in number was not sufficient to show that either Scopus or WoS labeled journal articles and conference papers very differently. However, institutions should use both WoS and Scopus to examine their scholarly output as the databases’ coverage and scope are different.
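The overlap itself can be identified in several ways; the source does not describe the exact matching procedure used. The minimal Python sketch below is one hedged possibility, assuming the 2013 Long Island University records are exported from each database as CSV files and matched on DOI (the file names and DOI column names are assumptions, not verified export formats). Records without DOIs, such as some book chapters and meeting abstracts, would still require manual matching.

import csv

def dois(path, column):
    # Collect the non-empty, lower-cased DOIs found in one exported CSV file.
    with open(path, newline="", encoding="utf-8") as f:
        return {row[column].strip().lower()
                for row in csv.DictReader(f)
                if row.get(column, "").strip()}

# Assumed file and column names: "DI" for the WoS export, "DOI" for the Scopus export.
wos_dois = dois("liu_2013_wos.csv", "DI")
scopus_dois = dois("liu_2013_scopus.csv", "DOI")
print("Indexed in both databases:", len(wos_dois & scopus_dois))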

 

Google Scholar complements scholarly productivity findings from traditional approaches. Google Scholar, an academic search engine launched in November 2004, indexes and retrieves academic content throughout the Internet. Google Scholar recently released an automatic institutional affiliation tool that gathers all authors belonging to one institution. Google Scholar, however, cannot directly retrieve the number of documents published by one university as WoS or Scopus can. Instead, specific queries can be performed to retrieve the number of documents stored on the official university website. Considering the role of institutional repositories, this procedure might serve as a proxy (Orduna-Malea, Ayllón, Martín-Martín, & López-Cózar, 2015; Orduna-Malea & López-Cózar, 2014; Orduna-Malea, Serrano-Cobos, & Lloret-Romero, 2009). Table 8 displays the results from Google Scholar for each institution included in the study. The hit count estimates (the number of documents stored within each official university website) were retrieved from google.com with the “site” search command. The data were collected on April 1, 2016.

 

 

Table 7

Overlapping Document Type in WoS and Scopus

Overlapping Document Type | WoS | Scopus
Article | 55 | 54
Review | 3 | 3
Editorial Material; Book Chapter/Editorial | 1 | 1
Article; Book Chapter/Book Chapter | 1 | 1
Proceeding Paper/Conference Paper | 1 | 2
Total | 61 | 61

 

 

Table 8

Google Scholar – Institutional Scholarly Productivity from 2000–2013

Name of Institutions | Domain Names | Google Scholar (2000–2013) | Google Scholar (all years)
Adelphi University | adelphi.edu | 41 | 52
Cold Spring Harbor Laboratory (Watson School of Biological Sciences) | cshl.edu | 182 | 242
Farmingdale State College | farmingdale.edu | 12 | 19
Hofstra University | hofstra.edu | 660 | 1140
Long Island University | liu.edu | 36 | 63
New York Institute of Technology | nyit.edu | 77 | 105
Stony Brook University | stonybrook.edu | 2040 | 2040
SUNY Old Westbury | oldwestbury.edu | 1 | 1
Total | | 3049 | 3662

 

 

When interpreting data from Google Scholar, a small count does not necessarily mean low productivity. An institution may publish a large quantity of papers, but if these materials are not deposited on its website (especially in an institutional repository), the number of items indexed in Google Scholar will be low. Conversely, a high count suggests, with some confidence, strong performance and good online visibility. If citation data are needed, citation counts must be gathered manually for every item retrieved with the “site:url” query. Additionally, having information management strategies, particularly institutional repositories, may help universities be better represented in Google Scholar.
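As an illustration of the kind of query described above (a hedged sketch, not the authors’ exact procedure), the following Python snippet builds “site:” queries restricted to 2000–2013 for two of the domains listed in Table 8; the hit count estimate is then read manually from the results page.

from urllib.parse import urlencode

# Domains taken from Table 8; only two are shown for brevity.
domains = {
    "Adelphi University": "adelphi.edu",
    "Stony Brook University": "stonybrook.edu",
}

for institution, domain in domains.items():
    # as_ylo/as_yhi are Google Scholar's URL parameters for a custom year range.
    params = {"q": "site:" + domain, "as_ylo": 2000, "as_yhi": 2013}
    print(institution + ": https://scholar.google.com/scholar?" + urlencode(params))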

 

Conclusion

 

One of the well-accepted goals of institutions is to increase institutional research. Educational institutions would find it beneficial to use WoS and Scopus more systematically to obtain scholarly productivity data on student, faculty, and staff engagement in research activities. These data can also be used to shape institutions’ decisions on strategic planning, research allocations, and research funding.

 

As data were collected from Long Island institutions of various types, missions, and sizes, publication data in this study cannot be used to compare their rankings. For instance, both Hofstra and Stony Brook have medical schools, but Hofstra established its medical school only recently, and the two universities have different sizes of faculty bodies, making them very difficult to compare. This study should be repeated for another cluster of New York institutions, such as SUNY campuses or CUNY colleges, with similarly sized faculty bodies. Additionally, similar types of institutions should be examined, such as two-year community colleges or four-year research universities.

 

Additionally, our findings suggest that the publication indicator alone does not cover the full research profile of the Long Island institutions selected as a sample; nevertheless, the data do provide a sense of research growth for each institution. In order to gain a comprehensive awareness of the research activity of each institution, a future study may involve analyses of the scholarly productivity data in relation to clusters of strength in different research disciplines in science, engineering, social sciences, humanities, and medicine, with a focus on WoS or Scopus.

 

Another scholarly productivity data source, Google Scholar, provided trend data for the Long Island institutions in this paper. It is important to note that materials deposited on the Internet or an institutional repository may yield a higher number of items indexed in Google Scholar. Additionally, altmetrics should be considered to capture various impacts of the scholarly output to complement the traditional metrics.

 

As outlined previously, the limitations of these two databases do not allow a full examination of the scholarly productivity of each institution. However, the detailed data collection procedures provided here should be helpful for obtaining institutional scholarly output. Future projects can build on these findings to extend the knowledge and understanding of the scholarly productivity of Long Island scholars.

 

References

 

Adriaanse, L. S., & Rensleigh, C. (2013). Web of Science, Scopus and Google Scholar: A content comprehensiveness comparison. The Electronic Library, 31(6), 727-744. http://doi.org/10.1108/EL-12-2011-0174

 

Amara, N., Landry, R., & Halilem, N. (2015). What can university administrators do to increase the publication and citation scores of their faculty members? Scientometrics, 103(2), 489-530. http://doi.org/10.1007/s11192-015-1537-2

 

Archambault, É., Campbell, D., Gingras, Y., & Larivière, V. (2009). Comparing bibliometric statistics obtained from the Web of Science and Scopus. Journal of the American Society for Information Science and Technology, 60(7), 1320-1326. http://doi.org/10.1002/asi.21062

 

Aytac, S. (2010). Scientific international collaboration of Turkey, Greece, Poland, and Portugal: A bibliometric analysis. Proceedings of the American Society for Information Science and Technology, 47(1), 1-3. http://doi.org/10.1002/meet.14504701305

 

Barrios, M., Borrego, A., Ollé, C., Vilaginés, A., & Somoza, M. (2008). A bibliometric study of psychological research on tourism. Scientometrics, 77(3), 453-467. http://doi.org/10.1007/s11192-007-1952-0

 

Bergman, E. M. L. (2012). Finding citations to social work literature: The relative benefits of using Web of Science, Scopus, or Google Scholar. The Journal of Academic Librarianship, 38(6), 370-379. http://doi.org/10.1016/j.acalib.2012.08.002

 

Braun, T., Glanzel, W., & Schubert, A. (2000). Collaboration networks in science. In B. Cronin & H. B. Atkins (Eds.), The web of knowledge (pp. 251-277). Medford, NJ: Information Today.

 

Chen, X. (2013). Journal article retrieval in an age of open access: How journal indexes indicate open access articles. Journal of Web Librarianship, 7, 243-254. http://dx.doi.org/10.1080/19322909.2013.795426

 

Cole, S., & Phelan, T. J. (1999). The scientific productivity of nations. Minerva, 37(1), 1-23.

 

Dess, H. M. (2006). Database reviews and reports: Scopus. Issues in Science and Technology Librarianship, 46. http://doi.org/10.5062/F4X0650T

 

DOAJ. (2016). Directory of Open Access Journals. Retrieved April 21, 2016, from https://doaj.org/

 

Elsevier. (2016a). Scopus: Affiliation search. Retrieved April 20, 2016, from http://help.elsevier.com/app/answers/detail/a_id/2312/p/8150/kw/affiliation%20search/search/1

 

Elsevier. (2016b). Scopus: Content. Retrieved April 20, 2016, from http://www.elsevier.com/solutions/scopus/content

 

Gavel, Y., & Iselid, L. (2008). Web of Science and Scopus: A journal title overlap study. Online Information Review, 32(1), 8-21. http://dx.doi.org/10.1108/14684520810865958

 

Glanzel, W. (2003). Bibliometrics as a research field: A course on theory and application of bibliometric indicators. National Science Digital Library. Retrieved from http://nsdl.niscair.res.in/jspui/handle/123456789/968

 

Goodwin, C. (2014). Advisor reviews-standard review: Web of Science. The Charleston Advisor, 16(2), 55-61. http://dx.doi.org/10.5260/chara.16.2.55

 

Levine-Clark, M., & Gil, E. L. (2009). A comparative citation of Web of Science, Scopus, and Google Scholar. Journal of Business & Finance Librarianship, 14, 32-46. http://dx.doi.org/10.1080/08963560802176348

 

Li, J., Burnham, J. F., Lemley, T., & Britton, R. M. (2010). Citation analysis: Comparison of Web of Science, Scopus, SciFinder, and Google Scholar. Journal of Electronic Resources in Medical Libraries, 7, 196-217. http://dx.doi.org/10.1080/15424065.2010.505518

 

López-Illescas, C., Moya-Anegón, F. D., & Moed, H. F. (2008). Coverage and citation impact of oncological journals in the Web of Science and Scopus. Journal of Informetrics, 2, 304-316. http://dx.doi.org/10.1016/j.joi.2008.08.001

 

Manafy, M. (2005). Scopus of influence: Content selection committee announced. Econtent, 28(10), 12.

 

Meho, L. I., & Yang K. (2007). Impact of data sources on citation counts and rankings of LIS faculty: Web of Science vs Scopus and Google Scholar. Journal of American Society for Information Science and Technology, 58(13), 2105-2125. http://dx.doi.org/10.1002/asi.20677

 

Mongeon, P., & Paul-Hus, A. (2016). The journal coverage of Web of Science and Scopus: A comparative analysis. Scientometrics, 106(1), 213-228. http://dx.doi.org/10.1007/s11192-015-1765-5

 

Okubo, Y., & Miquel, J. F. (1990). International cooperation in basic science. In P. Weingart, R. Sehringer, & M. Winterhage (Eds.), Representations of science and technology: Proceedings of the international conference on science and technology indicators (pp. 124-143). Leiden: DSWO Press.

 

Orduna-Malea, E., Ayllón, J. M., Martín-Martín, A., & López-Cózar, E. D. (2015). Methods for estimating the size of Google Scholar. Scientometrics, 104(3), 931-949. http://dx.doi.org/10.1007/s11192-015-1614-6

 

Orduna-Malea, E., & Aytac, S. (2015). Revealing the online network between university and industry: The case of Turkey. Scientometrics, 105(3), 1849-1866. http://dx.doi.org/10.1007/s11192-015-1596-4

 

Orduna-Malea, E., & Delgado López-Cózar, E. (2014). Google Scholar Metrics evolution: An analysis according to languages. Scientometrics, 98(3), 2353-2367. http://dx.doi.org/10.1007/s11192-013-1164-8

 

Orduna-Malea, E., Serrano-Cobos, J., & Lloret-Romero, N. (2009). Spanish public universities in Google Scholar: Presence, evolution and coverage of their scientific output. El Profesional De La Información, 18(5), 493-500. http://dx.doi.org/10.3145/epi.2009.sep.02

 

Osareh, F., & Wilson, C. (1997). Third World Countries (TWC) research publications by disciplines: A country-by-country citation analysis. Scientometrics, 39(3), 253-266. http://dx.doi.org/10.1007/BF02458529

 

Roemer, R. C., & Borchardt, R. (2015). Meaningful metrics: A 21st-century librarian’s guide to bibliometrics, altmetrics, and research impact. Chicago, IL: Association of College and Research Libraries.

 

Sarkozy, A., Slyman, A., & Wu, W. (2015). Capturing citation activity in three health sciences departments: A comparison study of Scopus and Web of Science. Medical Reference Services Quarterly, 34(2), 190-201.  http://dx.doi.org/10.1080/02763869.2015.1019747

 

Sicilia, M.-A., Sánchez-Alonso, S., & García-Barriocanal, E. (2011).  Comparing impact factors from two different citation databases: The case of computer science. Journal of Informetrics, 5, 698-704. http://dx.doi.org/10.1016/j.joi.2011.01.007

 

Slutsky, B., & Aytac, S. (2014). Publication patterns of science, technology, and medical librarians: Review of the 2008-2012 published research. Science & Technology Libraries, 33(4), 369-382. http://dx.doi.org/10.1080/0194262X.2014.952486

 

Schoepflin, U. (1990). Problems of representativity in the Social Sciences Citation Index. In P. Weingart, R. Sehringer, & M. Winterhager (Eds.), Representations of science and technology: Proceedings of the international conference on science and technology indicators (pp. 177-188). Leiden: DSWO Press.

 

Stony Brook University. (2015). Institutional research, planning & effectiveness. Human Resources (Faculty & Staff): Faculty. Retrieved from http://www.stonybrook.edu/commcms/irpe/data/hr/index.html

 

The Carnegie Classification of Institutions of Higher Education. (n.d.). Institution profile: Stony Brook University. Retrieved April 26, 2015, from http://carnegieclassifications.iu.edu/lookup/view_institution.php?unit_id=196097&start_page=lookup.php&clq=%7B%22first_letter%22%3A%22S%22%7D

 

Thomson Reuters. (2015). Web of Science core collection help: Searching the organizations-enhanced field. Retrieved June 15, 2015, from https://images.webofknowledge.com/WOKRS518B4/help/WOS/hs_organizations_enhanced.html

 

Thomson Reuters. (2016a). The world’s most trusted citation index: Web of Science core collection covering the leading scholarly literature. Retrieved April 21, 2016, from http://wokinfo.com/products_tools/multidisciplinary/webofscience/

 

Thomson Reuters. (2016b). Web of Science: Analysis the knowledge to decide. Retrieved April 21, 2016, from http://wokinfo.com/benefits/whywok/ouranalysistools/

 

Vieira, E. S., & Gomes, J.A.N.F. (2009). A comparison of Scopus and Web of Science for a typical university. Scientometrics, 81(2), 587-600. http://dx.doi.org/10.1007/s11192-009-2178-0

 

Wang, M.-H., Fu, H.-Z., & Ho, Y.-S. (2011). Comparison of universities’ scientific performance using bibliometric indicators. Malaysian Journal of Library & Information Science, 16(2), 1-19.

 

Zhang, L. (2014). The impact of data source on the ranking of computer scientists based on citation indicators: A comparison of Web of Science and Scopus. Issues in Science and Technology Librarianship, 75. http://doi.org/10.5062/F4D798CW