
Bibliometrics: An IT perspective

Roger Atkinson

My last column, in HERDSA News 29(3), concluded with a worry, "...I'm a bit jittery about the way the RQF is ticking!" [1] Well, relax, it stopped ticking. As announced tersely on the DEST/DEEWR websites:

A new Government, led by the Leader of the Australian Labor Party, the Hon Kevin Rudd MP, was sworn in by the Governor-General on 3 December 2007. [2]
...
The Australian Government announced on 21 December 2007 that it would not be proceeding with the former Government's Research Quality Framework (RQF) project. [3]
Given that the principal recent stimulus for an interest in bibliometrics, the RQF, will "not be proceeding", why revisit bibliometrics, and why an IT perspective? To begin with, there are the "I'll be back" [4] and "Be prepared" [5] concerns. Here is a succinct potential early warning from a medical researcher in the UK:
From 2008, state funding for academic research in the UK will be calculated differently. The research assessment exercise, which is based substantially on (high intensity, high cost) peer review, will transfer to bibliometric scoring. Such metrics include journal impact factors ... [6]
So bibliometrics should remain on the agenda, though this brief column will concentrate on highlighting and awareness raising rather than systematic exposition, recognising that deep interest in the topic will come mainly from librarians, editors [7], publishers and specialist researchers, for example in cybermetrics and webometrics [8]. An IT perspective is especially relevant because information technology could aptly be described as the "engine room" for modern bibliometrics, in particular for the two main topics I wish to explore. Firstly, among recent developments in citation analysis services, Scopus [9] and Google Scholar [10] are challenging the ISI Web of Knowledge and related products, including Web of Science, from Thomson Scientific, owners of the well known ISI journal Impact Factor [11]. Secondly, many publishers have developed website services which automatically deliver links to articles that cite the article or abstract you are viewing on screen, and to related articles. These two areas are linked in the usual titillating ways that you might expect, namely money, because Scopus and Web of Knowledge cost you (or your poor library) serious amounts whilst Google Scholar is free, and 'academic ego', because most of us feel it's nice to know who out there has (or has not) cited our papers.

To begin at a straightforward level, a growing number of papers in the library and information sciences literature are investigating comparisons between the 'majors' in bibliometrics, and other topics in citation reporting and analysis. Staying with my 'highlighting and awareness' purpose, here is a very, very small sample of titles of articles from recent browsings:

  Trends in the usage of ISI bibliometric data: Uses, abuses, and implications [12]
  Deflated, inflated and phantom citation counts [13]
  Sources of Google Scholar citations outside the Science Citation Index: A comparison between four science disciplines [14]
  Google challenges for academic libraries [15]
  Impact of data sources on citation counts and rankings of LIS Faculty: Web of Science versus Scopus and Google Scholar [16]
  Comparing alternatives to the Web of Science for coverage of the social sciences' literature [17]
  Pointing users toward citation searching: Using Google Scholar and Web of Science [18]
  For your citations only? Hot topics in bibliometric analysis [19]
  Citation analysis: A comparison of Google Scholar, Scopus, and Web of Science [20]
  The agony and the ecstasy - the history and meaning of the journal impact factor [21]
  A solution in search of a problem: Bibliometrics and libraries [22]
  Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods [23]
  Research Performance Measurement is revving up [24]
  Bibliometric profiles for British academic institutions: An experiment to develop research output indicators [25]

To digress just a little, the last five are listed mainly because the titles are prepended with a nice catchy phrase (or are just catchy, like "RPM is revving up"), or, in the case of the last title (dated 1988!), because a historian may wish to lay claim to an article title with just a subtle suggestion of cynicism: "The RQF: An experiment to develop research output indicators".

Going back to the main topic, one impression is that a 'bibliometric skirmish' could be brewing. For example:

The combination of the inflated citation count values dispensed by Google Scholar (GS) with the ignorance and shallowness of some GS enthusiasts can be a real mix for real scholars. [13, Jacso, 2006]

Because Google Scholar is freely accessible from the Google site, students and faculty are finding and using it. They are beginning to ask librarians for their professional opinions of its efficacy. [18, Schroeder, 2007]

...Scopus offers the best coverage from amongst these databases and could be used as an alternative to the Web of Science as a tool to evaluate the research impact in the social sciences. [17, Norris & Oppenheim, 2007]

...Google Scholar's wider coverage of Open Access (OA) web documents is likely to give a boost to the impact of OA research and the OA movement. [14, Kousha & Thelwall, 2008]

...[Google Scholar] contains all of the elements of the sort of search service which we in our libraries are trying to provide by purchasing federated search tools. ...for known item searching - for that paper by this author on this topic for instance - it is often as good as any of the abstracting and indexing services we take, and better in that it is Google - easy and free and used by everyone. [15, MacColl, 2006]
Whether this kind of "skirmishing" becomes a "war" remains to be seen. Some may hope that it does develop into the only kind of war I like to see, namely a price war. After the bibliometric heat has been on authors, it could be time for publishers to have a turn. An executive with Elsevier's Scopus made a sympathetic comment about authors, when writing on new alternatives to the Thomson ISI Impact Factor [11]:
Originally it [the Impact Factor] was intended as a collection management tool, but has since evolved into a metric used for evaluation of science at all levels as well as evaluation of authors. This can have far-reaching consequences for an author's grant applications, promotion and tenure since the metric is directly influenced by the performance of specific journals and is thus for a large part beyond the author's control. [24, de Mooij, 2007]
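For readers who have not met its actual definition, the metric is a simple journal level ratio; this is the standard, widely documented Thomson definition, though the symbols below are mine. For a journal in year Y:

    \mathrm{IF}_Y = \frac{C_Y(Y-1) + C_Y(Y-2)}{N_{Y-1} + N_{Y-2}}

where C_Y(y) is the number of citations received during year Y by items the journal published in year y, and N_y is the number of citable items the journal published in year y. An impact factor of 2.0 for 2007 thus means that the journal's 2005 and 2006 articles were cited, on average, twice each during 2007. It is a journal wide average, which is precisely why the quoted passage notes that the consequences for an individual author lie largely beyond the author's control.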
However, the publishing and bibliometric scenes are also displaying some interesting trends towards collaboration. This takes us to the second of my main topics, publishers developing web page links to a citation service. To begin with, here is a simple but important illustration from SAGE Journals Online, publishing the AERA's Review of Educational Research. At this point we really need screen delivery for this column, but let's try anyway. For an example I selected Hattie and Timperley (2007) [26], partly because many members of HERDSA could be, or perhaps should be, interested in its content. The website display of the article's abstract [26] includes a menu comprising about 22 items, among them four links to Google Scholar. For example, click upon "Articles by Hattie, J." and the underlying HTML enables the HTTP call:
http://scholar.google.com/scholar?q=%22author%3AJ.+author%3Ahattie%22
and the reader receives Google Scholar's output from a search for articles by "author:J. author:Hattie" (of course it works best with uncommon names). Click upon "Citing Articles via Google Scholar" and the call returns "Results 1 - 10 of 10 citing Hattie: The Power of Feedback" (by the time you read this, John Hattie's citation count could be larger!). A third Google Scholar link is to "Search for Related Content". Of course you could go off to Google Scholar and type in the search strings for yourself, but the "one click" thing is rather nice, and it does suggest that SAGE Journals Online has developed some respect for Google Scholar. However, they have an each-way bet, also having a link to "Citing Articles via ISI Web of Science" (alas, it won't work for me).
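For the technically curious, there is nothing exotic under the hood: the link simply percent-encodes the search string and appends it to Google Scholar's query URL. A minimal Python sketch, mine rather than SAGE's actual code, reproduces the encoding:

    from urllib.parse import quote_plus, unquote_plus

    # The search string behind the "Articles by Hattie, J." link.
    query = '"author:J. author:hattie"'

    # Percent-encode the string and append it to Google Scholar's query URL.
    url = "http://scholar.google.com/scholar?q=" + quote_plus(query)
    print(url)
    # http://scholar.google.com/scholar?q=%22author%3AJ.+author%3Ahattie%22

    # Decoding the link published on the SAGE page recovers the search string.
    print(unquote_plus("%22author%3AJ.+author%3Ahattie%22"))
    # "author:J. author:hattie"

The same pattern, a canned query URL behind an anchor tag, presumably accounts for all four of the Google Scholar links in the menu.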

For another example, consider van Raan (2005b), the "Fatal attraction..." article in Scientometrics [23]. The menu accompanying the display of the abstract [23] includes the information, "Referenced by 19 newer articles" (the count may be higher by the time you look), each of these listed in a conventional way (first author, year, title, journal) with a hypertext link to an abstract for the citing article. For example, one of the links is to Yang & Meho (2007) [20] (though the date is given there, incorrectly, as 2006), published by Wiley Interscience in the Proceedings of the American Society for Information Science and Technology. That's a rather nice "one click" thing for the reader, delivered in this case via CrossRef [27], a collaborative service created by numerous publishers. As in the previous example, citation counts and links to citing articles are created and updated automatically by IT "engine room" processes.
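As an aside on those "engine room" processes: every article registered with CrossRef carries a DOI, and the metadata behind such links can be fetched programmatically. Here is a minimal sketch using CrossRef's present day public REST API (a service which postdates this column) and the Hattie and Timperley DOI from reference [26]:

    import json
    import urllib.request

    # Hattie & Timperley (2007); the DOI is given in reference [26].
    doi = "10.3102/003465430298487"

    # One HTTP GET against CrossRef's public metadata service.
    with urllib.request.urlopen("https://api.crossref.org/works/" + doi) as resp:
        record = json.load(resp)["message"]

    print(record["title"][0])                # The power of feedback
    print(record["container-title"][0])      # Review of Educational Research
    print(record["is-referenced-by-count"])  # count of citing works known to CrossRef

The last field is the same kind of automatically maintained citation tally that drives displays like "Referenced by 19 newer articles".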

To conclude, I should make a confession about the subtitle, "An IT perspective". This column is really "A search engine perspective", mainly Google and Google Scholar, and occasionally a publisher's website search engine. Without IT, and only very rarely visiting a university library, how else could I have found such interesting and relevant reading on a topic that I knew little about?

References

  1. Atkinson, R. (2007). Can we trust web based surveys? HERDSA News, 29(3). http://www.roger-atkinson.id.au/pubs/herdsa-news/29-3.html
  2. http://www.dest.gov.au/ and http://www.deewr.gov.au/ [viewed 29 Feb 2008]
  3. DEST/DEEWR. Research quality. [viewed 29 Feb 2008] http://www.dest.gov.au/sectors/research_sector/policies_issues_reviews/key_issues/research_quality_framework/
  4. Wikipedia. I'll be back. http://en.wikipedia.org/wiki/I'll_be_back [viewed 29 Feb 2008]
  5. Wikipedia. Scout motto. http://en.wikipedia.org/wiki/Scout_Motto [viewed 29 Feb 2008]
  6. Hobbs, R. (2007). Should we ditch impact factors? [viewed 29 Feb 2008] http://www.bmj.com/cgi/content/extract/334/7593/569
  7. For example, myself. AJET Editorial 24(2). http://www.ascilite.org.au/ajet/ajet24/editorial24-2.html
  8. For example, Statistical Cybermetrics Research Group, University of Wolverhampton, [viewed 29 Feb 2008] http://cybermetrics.wlv.ac.uk/
  9. Scopus. Scopus Info. http://info.scopus.com/
  10. Google Scholar. http://scholar.google.com/
  11. ISI Web of Knowledge. [an aggregation of Thomson products which includes Web of Science, Journal Citation Reports, etc] http://isiwebofknowledge.com/
  12. Cameron, B. D. (2005). Trends in the usage of ISI bibliometric data: Uses, abuses, and implications. portal: Libraries and the Academy, 5(1), 105-125. https://muse.jhu.edu/demo/portal_libraries_and_the_academy/v005/5.1cameron.html
  13. Jacso, P. (2006). Deflated, inflated and phantom citation counts. Online Information Review, 30(3), 297-309. http://www.emeraldinsight.com/10.1108/14684520610675816
  14. Kousha, K. & Thelwall, M. (2008). Sources of Google Scholar citations outside the Science Citation Index: A comparison between four science disciplines. Scientometrics, 74(2), 273-294. http://www.springerlink.com/content/j263493h5687828n/
  15. MacColl, J. (2006). Google challenges for academic libraries. Ariadne, 46. http://www.ariadne.ac.uk/issue46/maccoll/
  16. Meho, L. I. & Yang, K. (2007). Impact of data sources on citation counts and rankings of LIS Faculty: Web of Science versus Scopus and Google Scholar. Journal of the American Society for Information Science and Technology, 58(13), 2105-2125. http://www.slis.indiana.edu/faculty/meho/meho-yang.pdf
  17. Norris, M. & Oppenheim, C. (2007). Comparing alternatives to the Web of Science for coverage of the social sciences' literature. Journal of Informetrics, 1(2), 161-169. http://www.info.scopus.com/docs/Infometrics_Scopus_vs_WoS.pdf
  18. Schroeder, R. (2007). Pointing users toward citation searching: Using Google Scholar and Web of Science. portal: Libraries and the Academy, 7(2), 243-248. http://muse.jhu.edu/journals/portal_libraries_and_the_academy/v007/7.2schroeder.pdf
  19. van Raan, A. F. J. (2005a). For your citations only? Hot topics in bibliometric analysis. Measurement: Interdisciplinary Research and Perspectives, 3(1), 50-62. http://www.leaonline.com/doi/abs/10.1207/s15366359mea0301_7
  20. Yang, K. & Meho, L. I. (2007). Citation analysis: A comparison of Google Scholar, Scopus, and Web of Science. Proceedings of the American Society for Information Science and Technology, 43(1), 185-185. http://www3.interscience.wiley.com/cgi-bin/abstract/116328907/ABSTRACT
  21. Garfield, E. (2005). The agony and the ecstasy - the history and meaning of the journal impact factor. Paper presented at the International Congress on Peer Review and Biomedical Publication, Chicago, 16 September. http://garfield.library.upenn.edu/papers/jifchicago2005.pdf
  22. Wallace, D. P. (1987). A solution in search of a problem: Bibliometrics and libraries. Library Journal, 112(8), 43-47. [abstract only] http://www.eric.ed.gov/ERICWebPortal/recordDetail?accno=EJ355774
  23. van Raan, A. F. J. (2005b). Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, 62(1), 133-143. [viewed 2 Mar 2008] http://www.akademiai.com/content/n514x05067512413/
  24. de Mooij, H. (2007). Research Performance Measurement is revving up. Elsevier Library Connect Newsletter, 5(3). http://libraryconnect.elsevier.com/lcn/0503/lcn050302.html
  25. Carpenter, M. P., Gibb, F., Harris, M., Irvine, J., Martin, B. R. & Narin, F. (1988). Bibliometric profiles for British academic institutions: An experiment to develop research output indicators. Scientometrics, 14(3-4), 213-233. [DOI: 10.1007/BF02020076] http://www.akademiai.com/content/m4814h2220t53nh4/
  26. Hattie, J. & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112. [DOI: 10.3102/003465430298487; viewed 2 Mar 2008] http://rer.sagepub.com/cgi/content/abstract/77/1/81
  27. CrossRef. crossref.org: Fast Facts. http://www.crossref.org/01company/16fastfacts.html
Author: Roger Atkinson retired from Murdoch University's Teaching and Learning Centre in June 2001. His current activities include publishing AJET and honorary work supporting TL Forum, ascilite Melbourne 2008 and other academic conferences and publications.
Website (including this article in html format): http://www.roger-atkinson.id.au/
Contact: rjatkinson@bigpond.com

Please cite as: Atkinson, R. J. (2008). Bibliometrics: An IT perspective. HERDSA News, 30(1). http://www.roger-atkinson.id.au/pubs/herdsa-news/30-1.html

