Saturday, 13 January 2018

Maximized Research Impact: Effective Strategies for Increasing Citations | International SERIES on Information Systems and Management in Creative eMedia (CreMedia)


Maximized Research Impact: Effective Strategies for Increasing Citations

  • Nader Ale Ebrahim

  • Hossein Gholizadeh

  • Artur Lugmayr


The highly competitive environment has forced higher-education
authorities to set strategies for improving their university's ranking.
Citations of published papers are among the most widely used inputs for
measuring national and global university rankings (citations account for
20% of the QS ranking, 30% of the THE ranking, and so on). Therefore, on
the one hand, improving the citation impact of research is one of the
university managers' strategies. On the other hand, researchers are also
looking for helpful techniques to increase their citation records. By
reviewing the relevant literature, this chapter covers 48 different
strategies for maximizing research impact and visibility. The results
show that certain features of an article can help predict its number of
views and citation counts. The findings presented in this chapter could
be used by university authorities, authors, reviewers, and editors to
maximize the impact of articles in the scientific community.


How to Cite
EBRAHIM, Nader Ale; GHOLIZADEH, Hossein; LUGMAYR, Artur.
Maximized Research Impact: Effective Strategies for Increasing Citations.
International SERIES on Information Systems and Management in Creative eMedia (CreMedia), [S.l.], n. 2017/1, p. 29-52, dec. 2017.
ISSN 2341-5576.
Available at: <>. Date accessed: 13 jan. 2018.




Tuesday, 14 November 2017

Thing 23: Altmetrics – 23 Research Things (2017)


Thing 23: Altmetrics

Image: “altmetrics” by AJC1 via Flickr (CC BY-SA 2.0)

The rise of Web 2.0 technologies is
linked to non-traditional scholarly publishing formats such as reports,
data sets, blogs, and outputs on other social media platforms. But how
do you track impact when traditional measures such as citation counts
don’t apply? Altmetrics to the rescue!

Getting Started

The term “altmetrics” (= alternative metrics) was coined in a tweet in 2010, and its development since then has gone from strength to strength, resulting in a manifesto. With no absolute definition, the term can refer to

  • impact
    measured on the basis of online activity, mined or gathered from online
    tools and social media (e.g. tweets, mentions, shares, links, downloads, clicks, views,  comments, ratings, followers and so on);

  • metrics for alternative research outputs, for example citations to datasets;
  • alternative ways of measuring research impact. 
Benefits of altmetrics include the fact that they can

  • accumulate faster than the more traditional citation-based metrics; 
  • complement traditional citation-based metrics by capturing a more diverse range of research impact and engagement; 
  • offer an opportunity to track the increasing availability of data,
    presentations, software, policy documents and other research outputs in
    the online environment.  
Below we introduce some of the tools available to University of Melbourne staff and students to start collecting altmetrics.

Altmetric Explorer

Altmetric Explorer
has been monitoring online sources since 2012, collating data from Web
of Science, Scopus, Mendeley, Facebook, Twitter, Google+, LinkedIn,
Wikipedia (English language), YouTube, public policy documents, blogs
and more. Altmetric Explorer uses a donut graphic to visually identify the type and quantity of attention a research output has received:

The University of Melbourne Library’s subscription to Altmetric Explorer provides access to institutional data, as well as data for individual researchers and their outputs. Consider installing the Altmetric bookmarklet
on your toolbar to view Altmetric metrics for your publications. (Note:
this is only available for PubMed, arXiv or pages containing a DOI with
Google Scholar friendly citation metadata.)

PlumX Metrics

PlumX brings together research metrics for all types of scholarly research output, categorised as follows:  

  • Usage: clicks, downloads, views, library holdings, video plays… 
  • Captures: bookmarks, code forks, favorites, readers, watchers… 
  • Mentions: blog posts, comments, reviews, Wikipedia links, news media… 
  • Social Media: tweets, Facebook likes, shares… 
  • Citations: citation indexes (CrossRef/Scopus/Pubmed Central etc), patent citations, clinical citations, policy citations…
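The five PlumX categories above can be sketched as a simple lookup table. This is an illustration only: the category and metric names mirror the list above, and the `categorise` helper is hypothetical, not part of the PlumX API.

```python
# Illustration only: categories and example metrics mirror the PlumX list
# above; `categorise` is a hypothetical helper, not a real PlumX API call.
PLUMX_CATEGORIES = {
    "usage": {"clicks", "downloads", "views", "library holdings", "video plays"},
    "captures": {"bookmarks", "code forks", "favorites", "readers", "watchers"},
    "mentions": {"blog posts", "comments", "reviews", "wikipedia links", "news media"},
    "social media": {"tweets", "facebook likes", "shares"},
    "citations": {"citation indexes", "patent citations",
                  "clinical citations", "policy citations"},
}

def categorise(metric_name: str) -> str:
    """Return the PlumX-style category for a raw metric name, or 'unknown'."""
    name = metric_name.strip().lower()
    for category, members in PLUMX_CATEGORIES.items():
        if name in members:
            return category
    return "unknown"
```

A report can then group arbitrary raw counts, e.g. `categorise("Tweets")` falls under social media while `categorise("downloads")` falls under usage.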
PlumX Metrics for individual articles, conference proceedings, book
chapters, and other resources, are available via the University of
Melbourne Library’s Discovery
search service – look for the Plum Print in the results list for your
search, and hover your cursor over it to expand details of the metrics:

Example of PlumX Metrics for an article retrieved through Discovery

Scopus also displays PlumX Metrics for articles where available, offering an
interesting opportunity to view altmetrics alongside “traditional”
citation metrics and the Scopus field-weighted citation impact.


Impactstory

Impactstory is an
open source, web-based researcher profile that provides altmetrics to
help researchers measure and share the impact of their research outputs,
covering traditional outputs (e.g. journal articles) as well as alternative
research outputs such as blog posts, datasets and software. Researchers
can register an Impactstory profile for free via their Twitter account,
then link other profiles such as ORCID and Google Scholar, as well as PubMed IDs, DOIs, webpage URLs, Slideshare and GitHub usernames. Impactstory
then provides an overview of the attention these connected collections
have received. Information from Impactstory can also be exported.

Have a look at this Impactstory example profile to find out more.

Public Library of Science – Article Level Metrics (PLOS ALMs)

If you publish research in the life sciences you can use PLOS ALMs to help guide understanding of the influence and impact of work before the accrual of academic citations.  

  • All PLOS
    journal articles display PLOS ALMs – quantifiable measures that
    incorporate both academic and social metrics to document the many ways
    in which both scientists and the general public engage with published research.
  • PLOS ALMs are presented on the metrics tab on every published article. 
  • Use ALM reports to guide you to the most important and influential work published. 

Minerva Access

The University of Melbourne’s institutional repository, Minerva Access,
allows research higher degree students and research staff to safely
promote and self-publish their research. There are a number of
incentives for including research in the repository: 

  • Minerva Access is harvested by Google Scholar, which in turn provides exposure and potential follow-on citations.
  • Minerva
    Access provides usage statistics for each item in the repository, as
    well as for each collection and sub-collection. At the left-hand foot of
    each page, click the Statistics icon/link to see data on the
    number of times each record has been viewed (and from which countries),
    and – if applicable – the number of times any associated PDF has been
    downloaded. Data is available by month and by year. 


  • Altmetrics have several advantages over traditional citation counts:
    they are quicker to accumulate, they document non-scholarly attention
    and influence, and they can be used to track attention to
    non-traditional research outputs. However, they cannot tell you anything
    about the quality of the research. You need both types of metrics –
    traditional and alternative – to get the full picture of research impact.
  • Manual work is needed to assess the underlying qualitative data that makes up the metrics (who is saying what about research). 
  • While
    altmetrics are good at identifying ‘trending’ research, they have not
    yet been proven to be a good indicator of lasting, long-term impact. 
  • Researchers
    seeking to evaluate non-English-language sources will find that
    altmetrics coverage is currently limited for these outputs. 

Learn More

  • For guidance around the tools, including useful summaries and tips, have a look at the University of Melbourne Library’s Altmetrics Subject Research Guide.

  • The Altmetric Explorer website provides a range of case studies of how researchers and institutions have tracked the societal attention to their research.
  • In this blog post,
    Prof. Jonathan A. Eisen at the University of California, Davis,
    describes how he used Impactstory to look at the altmetrics of his
    research papers and data. 
  • Dip into the readings of the PLOS Altmetrics Collection
    and gather understanding on the statistical analysis of altmetrics data
    sources, validation of models of scientific discovery and qualitative
    research describing discovery based on altmetrics. 
  • The London School of Economics Impact Blog regularly runs features on Altmetrics.
This post was written by Georgina Binns (Faculty Librarian, VCA and MCM) and Fransie Naudé (Liaison Librarian, Education).


Thursday, 9 November 2017

4 New things about Google Scholar - UI, recommendations, and citation networks


I'm actually a pretty big fan of Google Scholar, which in some ways is better than our library discovery service, but even if you aren't a fan, given its popularity it's important for librarians to keep up with the latest developments.

In any case, I'm happy to see that Google continues to enhance Google
Scholar with new features. These are some of the new features and things
I've learnt about Google Scholar lately.

1. Google Scholar's new UI 

The new interface is a lot cleaner, particularly on mobile, and most
of the changes aren't really major (e.g. replacing the text of "save" and
"cite" with icons), but I miss the easy access to advanced search the old
interface had. 
Click the down arrow button to get access to advanced search in the old Google Scholar

In the new Google Scholar interface, it is now tucked under the "hamburger" menu, where more people might notice it. 
On the plus side, very few people knew Google Scholar had an advanced
search or even how to access it, so overall it might be an OK trade-off,
though it takes two clicks instead of one to access the advanced search.
Also, the change to Google Scholar doesn't seem to have affected link
resolvers or the various extensions that rely on Google Scholar via scraping,
such as Publish or Perish and the Google Scholar button, so this is still a relatively minor layout change. 

2.  Get recommendations of related works of other scholars' works

Official change announcement. 

For a long time Google Scholar had an odd gap. As arguably the largest
scholarly index in the world, with perhaps the largest number of users
of any scholarly search engine, it was well placed to use all this data
to create a fantastic recommendation system. Add Google's famed machine
learning and it looked like a no-brainer.

But it was only in 2012, nearly eight years after launch, that Google Scholar added a recommendation system.
And as you might expect, the recommendations are excellent. While other
recommendation systems for scholarly material exist (e.g. BX
recommender, Mendeley's, various publisher-based ones), none in my
opinion are as broad-ranging or timely as Google Scholar's, for the reasons
already mentioned.

Google Scholar recommended articles

Still, there was a curious gap: the recommendation system only gave recommendations based on the works already in your Google Scholar profile.

The flaw here is obvious: what if you were working in a new area you
haven't published in formally yet? Arguably this is exactly when you most
need the help of a recommendation system.

I wanted a feature where I could put a set of articles into Google
Scholar and it would give recommended articles over time. One crazy idea
I had back then was to create a brand new fake Google Scholar
profile, load it up with articles I'm interested in, keep the
profile private, and leverage the recommendations provided.

Unfortunately this doesn't work, because a Google Scholar profile has to be public before recommendations appear.

Another way that probably works is to exploit the fact that papers
deposited in ResearchGate or preprint servers appear in Google
Scholar and hence can be added to your profile fairly quickly. So you
could, for example, create a quick working paper (with citations to works you
know of) and deposit it in an institutional repository or preprint
server that is indexed by Google Scholar. Add those to your Google
Scholar profile and wait for recommendations to appear. But this still
seems really forced, and do you really want to mess up your profile just
to get a few recommendations?

So the new feature added by Google is much appreciated. While you still
can't add an arbitrary set of articles, you can go to any Scholar
profile and choose to follow the author's new works, citations and, most
importantly, articles related to the author's research.

Follow Harzing profile to get recommended articles similar to her research publications in Google Scholar

It's not super clear to me whether it just
sends new articles via email or whether it updates the recommended
articles list you get within Google Scholar. I suspect the former, and
technically articles shown this way are alerts rather than recommendations,
but it's still useful. 

3. Google citation profile improvements - allows one-time export to ORCID 

This isn't a new feature in Google Scholar but a fairly new feature in ORCID.
I often find many researchers have their Google Scholar profile fully
filled with their works (no doubt partly because Google makes it so
easy, particularly with auto or semi-auto updates, and partly because
they reason the profile increases their visibility), but are reluctant
to spend the time getting their ORCID profile populated.  
Exporting works via BibTeX
This of course only works as a one-time upload, and you would have to
continue to add future works yourself, hopefully via other automated routes
(e.g. via journal Crossref links, from CRIS/RIMs etc.). 
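The one-time upload works by exporting your Google Scholar profile as a BibTeX file and importing that file into ORCID. As a rough illustration of what such an export contains, here is a minimal sketch that pulls the titles out of a BibTeX dump. The `bibtex_titles` helper and the sample entries are hypothetical; a dedicated BibTeX library is the right tool for real exports.

```python
import re

# Rough sketch, not a full BibTeX parser: pull the title out of each entry
# in a Google Scholar BibTeX export before a one-time upload to ORCID.
# `bibtex_titles` is a hypothetical helper; real exports have nested braces
# and escaped characters that need a proper BibTeX library.
def bibtex_titles(bibtex: str) -> list:
    """Return the title field of each entry in a BibTeX string."""
    # \b stops "booktitle" from matching; [^{}]+ assumes flat title values.
    return re.findall(r"\btitle\s*=\s*\{([^{}]+)\}", bibtex)

# A toy two-entry export (hypothetical contents):
sample = """
@article{smith1981,
  title={Citation analysis},
  author={Smith, Linda C},
  year={1981}
}
@article{eysenbach2006,
  title={Citation advantage of open access articles},
  author={Eysenbach, Gunther},
  year={2006}
}
"""
```

Running `bibtex_titles(sample)` lists the two titles, which is a quick way to sanity-check an export before uploading it.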
Another fairly new feature in Google Scholar Citations is that it now
tries to group authors by institution. So, for example, when you
search for the name of an institution in Google Scholar, you get
something like this.
Searching by institutions in Google Scholar
Clicking on the link gets you this.
Top cited profiles from the institution
There's a study on how accurate this institution matching is, but what are the practical implications for normal librarians who aren't doing advanced bibliometrics?
For one, it allows you to fairly easily get the top 10, 20, etc. most cited
authors at your institution, to complement the lists you get from Web of
Science/InCites or Scopus/SciVal. 
You can't jump to the end of the results to gauge how many authors your institution has on this platform. 

It's unfortunate that for this set of results Google doesn't list the
number of results; neither can you gauge it by looking at the number
of pages in the results list, and you can only go forward page by
page (see above). 
I don't know of a way around it; even if you alter the URL parameters "&after_author" or "&astart=30", it doesn't work.

4. Scraping Google Scholar to create network diagrams / bibliometric networks

It basically works as follows. First, the system allows you to search Google Scholar for papers to add.
Below, I searched for the term "open access" and then added some of the papers
into the system. You can of course search for specific papers by title.
Once you are done with a set of papers, you can click on "Check
Citations" and it will use Google Scholar's "search within citing
articles" feature to see whether the articles in your set of papers cite one another.
It took me a while to understand how it worked, but here's a specific example.
Say you have the following two papers:
"Eysenbach, G. (2006). Citation advantage of open access articles. PLoS Biology, 4(5), e157."
"Harnad, S., & Brody, T. (2004). Comparing the impact of open access
(OA) vs. non-OA articles in the same journals. D-Lib Magazine, 10(6)."
The system will automatically go to, say, the Google Scholar list of citations for Harnad and Brody (2004) and, using the "search within citing articles" feature, check whether G. Eysenbach is included.
It will do this automatically for all pairs of papers in your set of
reference articles. All these searches are done in a popup window;
if the volume is too big, Google Scholar will throw up a CAPTCHA for
you to solve, after which it will continue.
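The pairwise checking the tool performs can be sketched in code. This is a toy model: the `cites` predicate here stands in for the real "search within citing articles" lookup, which I've replaced with a hard-coded dictionary (illustrative values only), since actually scraping Google Scholar is rate-limited:

```python
from itertools import permutations

# Toy stand-in for Google Scholar's "search within citing articles" data:
# maps each paper to the set of papers in our collection that cite it.
# (Illustrative values only.)
CITED_BY = {
    "Harnad & Brody (2004)": {"Eysenbach (2006)"},
    "Eysenbach (2006)": set(),
}

def cites(citing, cited):
    """Return True if `citing` appears among the recorded citers of `cited`."""
    return citing in CITED_BY.get(cited, set())

def build_citation_edges(papers):
    """Check every ordered pair of papers and record directed citation edges."""
    edges = []
    for a, b in permutations(papers, 2):
        if cites(a, b):
            edges.append((a, b))  # edge meaning: a cites b
    return edges

papers = list(CITED_BY)
print(build_citation_edges(papers))
# One directed edge: Eysenbach (2006) cites Harnad & Brody (2004)
```

Note that the number of checks grows quadratically with the size of your paper set, which is why the tool hits CAPTCHAs on larger collections.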
You can then export a basic visualization of the author network, which
shows coauthorships and citations. Here's my first toy example, using
papers I cited in a recent working paper.
It isn't too impressive, probably because I don't have enough papers, but
it does show the structure I expected, with two main clusters: one around
L.C. Smith (1981), an old paper on citation analysis for library collection
evaluation, and one around Eysenbach (2006), a well-cited early paper
on citation advantage. I would have thought they would not be connected
at all (particularly since I removed Eugene Garfield's seminal
publications), but they still seem to be linked indirectly through an
author who cited both.
You can export the network into the open-source Gephi network visualization
tool for further study, and I have spent some time doing so, playing with
more complicated networks such as publication and author networks and
using modularity to identify clusters of works. I'll probably cover this
in a separate blog post, but for now I'm very intrigued.
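If you want to produce such an export yourself, the format Gephi reads most readily is GEXF, which is just XML. Below is a minimal hand-rolled writer, stdlib only; the node and edge values are made up for illustration, and in practice you would more likely use a library (networkx, for instance, has a `write_gexf` function):

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def to_gexf(nodes, edges):
    """Serialize a directed citation network into a minimal GEXF document.

    nodes: list of (id, label) pairs; edges: list of (source_id, target_id).
    """
    gexf = Element("gexf", xmlns="http://www.gexf.net/1.2draft", version="1.2")
    graph = SubElement(gexf, "graph", defaultedgetype="directed")
    nodes_el = SubElement(graph, "nodes")
    for node_id, label in nodes:
        SubElement(nodes_el, "node", id=node_id, label=label)
    edges_el = SubElement(graph, "edges")
    for i, (source, target) in enumerate(edges):
        SubElement(edges_el, "edge", id=str(i), source=source, target=target)
    return tostring(gexf, encoding="unicode")

# Illustrative toy network: three papers, one citation link.
nodes = [
    ("n0", "Smith (1981)"),
    ("n1", "Eysenbach (2006)"),
    ("n2", "Harnad & Brody (2004)"),
]
edges = [("n1", "n2")]  # Eysenbach cites Harnad & Brody
print(to_gexf(nodes, edges))
```

Saving that output as a `.gexf` file gives you something Gephi can open directly, ready for layout and modularity analysis.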

How useful are such networks for researchers? 

Could doing such network graphs be useful for researchers, particularly
those new to a field, to help them see how their research fits into
existing research and spot connections they wouldn't otherwise have noticed?
Could this be autogenerated from the references of existing papers, to help
the reader get a sense of where the current article sits?
Can such network graphs provide improved recommendations (or do the
recommendations from Google Scholar etc. already implicitly do that)?
How big a network (or set of articles) is needed before this becomes
useful? For example, is this useful only for a thesis with over 50 references (or,
better yet, everything in your reference manager, not just what
you cited)? Would most researchers find that these network graphs merely
reproduce clusters they already intuitively know, or would they provide some
unexpected insights?
In a future post I will talk about my experiments with these three scenarios:
a) Visualizing networks between publications that cite my old 2009 article
b) Visualizing networks between publications cited in my old 2009 article and my newest paper
c) Visualizing networks between publications cited in an article not in
my field (to see if it helps orient me better in an area I'm not
familiar with).
Would I learn anything from doing such visualizations?
Of course, this idea isn't new; I'm guessing there is already research out there on this.
Existing tools like Web of Science have limited citation-map
capabilities, and the newer InCites and SciVal also provide mapping
capabilities, though often at a higher level meant for research evaluation.
On the free side, VOSviewer also provides the ability to visualize networks of citations. The newest version, 1.6.6, actually adds the ability to generate networks not just from Scopus and Web of Science citations but also from Crossref.

VOSviewer 1.6.6 supports Web of Science, Scopus, PubMed and Crossref 
So one can generate similar networks using DOIs via VOSviewer.
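Crossref's role here is worth a quick look: its REST API returns, for many DOIs, the list of references a work cites, which is the raw material for this kind of network. A sketch of extracting cited DOIs from a Crossref-style record follows; the record below is a trimmed, illustrative sample of the JSON shape returned by `https://api.crossref.org/works/{doi}`, not a live API call:

```python
def cited_dois(crossref_record):
    """Pull the DOIs of cited works out of a Crossref works record.

    References Crossref could not match to a DOI (only "unstructured"
    text) are skipped.
    """
    references = crossref_record.get("message", {}).get("reference", [])
    return [ref["DOI"] for ref in references if "DOI" in ref]

# Trimmed, illustrative sample of Crossref's response shape.
sample_record = {
    "message": {
        "DOI": "10.1371/journal.pbio.0040157",
        "reference": [
            {"key": "ref1", "DOI": "10.1045/june2004-harnad"},
            {"key": "ref2", "unstructured": "A reference Crossref could not match to a DOI"},
        ],
    }
}

print(cited_dois(sample_record))  # ['10.1045/june2004-harnad']
```

Chaining such lookups over a set of DOIs yields exactly the kind of citation edges VOSviewer builds its Crossref-based maps from, though coverage depends on publishers having deposited their reference lists.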
Still, I suspect scraping Google Scholar might give richer results,
due to Google Scholar's much larger scale compared to, say, Scopus.
Also, given the popularity of Google Scholar as a discovery tool, one
might find that relying on other tools such as Scopus to create networks
risks missing too many works found via Google Scholar.


Hope you found some of this useful.

It's good to see Google continue to improve Google Scholar. While we may
not know when Google might decide to abandon Google Scholar, the
recent spate of improvements is a good sign it won't be anytime soon.

4 New things about Google Scholar - UI, recommendations, and citation networks

Tuesday, 7 November 2017

Thing 20: Avoiding Deceptive, Unethical, Predatory and Vanity Publishing – 23 Research Things (2017)



Image: “Banner header attention caution” by geralt via Pixabay (CC0)

Some argue that the breakdown of
trusted information sources is one of the major challenges faced in the
21st century (Gray, 2017). This view is influenced by the growth in
deceptive, unethical and predatory publishing practices occurring
online. As victims, academics and their institutions often experience
financial and reputational damage from unethical scholarly publishing.

Getting Started

When the time comes to consider suitable scholarly publishing outlets
for your research, we highly recommend undertaking due diligence to
select quality sources. Becoming vigilant and regularly updating your
knowledge of scholarly publishing outlets to assess their quality is a
means to avoid publishing traps and pitfalls.

A predatory publisher has been defined as a type of scholarly
publishing company established primarily to collect Article Processing
Charges (APC) and provide very fast publishing without peer review or
even checking grammar or spelling. They often spam academics with
requests for submissions and reviews and requests to join their
editorial boards (Shen & Björk, 2015).

Warning Signs

Key characteristics of deceptive, unethical, predatory and vanity publishing practices can comprise:

  • spelling and grammar errors, along with distorted images, on the website;
  • advertising fake metrics, e.g. Global Impact Factor (GIF);
  • a journal website with an overly broad scope;
  • language that targets authors rather than readers;
  • promises of rapid publication;
  • a lack of information about retraction policies, manuscript handling or digital preservation;
  • manuscript submissions by e-mail;
  • taking copyright ownership of material (usually theses);
  • hijacking journal titles, establishing duplicate websites and using business names that resemble those of respected publishers;
  • organizing conferences to collect funds from presenters and participants without peer review or a formal program;
  • promoting non-existent conferences;
  • adding academics to editorial boards without permission; and
  • unexpected fees after accepting submissions.
The following site monitors problematic publishers: Distraction Watch.

[Image: “…of predatory publishing practices” – original graphic by Tanja
Ivacic-Ramljak (Liaison Librarian (Learning & Teaching), Veterinary
& Agricultural Sciences).]


There is no single and absolute authority to determine the best or worst scholarly publishing outlet.

There are many useful resources to help evaluate suitable publishing
outlets for your scholarly research. The usefulness of serials
directories such as UlrichsWeb, Scimago and SHERPA/RoMEO will depend upon your scholarly publishing requirements and field.

The Committee on Publication Ethics (COPE)
has members worldwide from all academic fields. Membership is open to
editors of academic journals and others interested in publication
ethics. If you find the COPE logo on a journal’s website, it is an
indication that the journal has been critiqued by COPE as a prerequisite
for membership. Together with the Open Access Scholarly Publishers Association (OASPA), the Directory of Open Access Journals (DOAJ), and the World Association of Medical Editors (WAME),
COPE has set minimum criteria against which journals are assessed
when they apply for membership of the respective organisations; here is a
link to the full criteria on principles of transparency and best practice.

Try This

We recommend the following sources for critiquing any publisher that
has approached you with an invitation to publish your research:

  • Cabell’s Scholarly Analytics includes a whitelist of over 11,000 journals and a blacklist of “likely deceptive or fraudulent academic journals” for selected disciplines.

  • PubsHub is
    a database of submission criteria for peer-reviewed medical journals
    and congresses. The database contains information on 6,000 medical journals and congresses.
  • The openly available resource Think Check Submit. Follow this checklist to make sure you choose trusted journals for your research.
Other useful criteria are available from:

We encourage you to contact your Liaison Librarian for further advice.

Learn More

Author Mills

Stromberg, J. (2014). I sold my undergraduate thesis to a print content farm: A trip through the shadowy, surreal world of an academic book mill. Slate.

Growth in Predatory Publishing

Clark, J., & Smith, R. (2015). Firm action needed on predatory journals. BMJ, 350, h210.

Beall, J. (2012). Predatory publishers are corrupting open access. Nature, 489(7415), 179-180.

Shen, C., & Björk, B.-C. (2015). ‘Predatory’ open access: A longitudinal study of article volumes and market characteristics. BMC Medicine.

Xia, J. (2015). Predatory journals and their article publishing charges. Learned Publishing, 28(1), 69-74.

Predatory Conferences

Pai, M., & Franco, E. (2016, updated 2017). Predatory conferences undermine science and scam academics. Huffington Post Blog.

Byard, R. W. (2016). The forensic implications of predatory publishing. Forensic Science, Medicine and Pathology, 12(4), 391-393.


Dadkhah, M., Maliszewski, T., & Teixeira da Silva, J. A. (2016). Hijacked
journals, hijacked websites, journal phishing, misleading metrics, and
predatory publishing: Actual and potential threats to academic integrity
and publishing ethics. Forensic Science, Medicine, and Pathology, 12(3), 353-362.

Fake News

Gray, R. (2017). Lies, propaganda and fake news: A challenge for our age. [online]

This post was written by Lisa Kruesi (Faculty Librarian, Health
& Life Sciences), Satu Alakangas (Liaison Librarian (Research), Law)
and Sarah Charing (Liaison Librarian (Research), Architecture, Building
& Planning).
Original graphic by Tanja Ivacic-Ramljak (Liaison Librarian (Learning & Teaching), Veterinary & Agricultural Sciences).


Thing 19: Open Access and Your Thesis – 23 Research Things (2017)



Image: “Open Access (storefront)” by Gideon Burton via Flickr (CC BY-SA 2.0)

“By making my PhD thesis Open Access, I hope to inspire people around the
world to look up at the stars and not down at their feet; to wonder
about our place in the universe and to try and make sense of the
cosmos.” (Stephen Hawking, on the release of his 1966 PhD thesis)

Making your thesis publicly accessible
requires consideration of a number of concepts: institutional policy;
attitudes of prospective publishers; third-party copyright; and indexing
in search engines, including the effect on citation and impact of your
work. This installment of 23 Research Things aims to shed some light on
these considerations. 

Institutional Policies

All universities have policies and procedures for enabling public
access to higher degree theses. The specifics of the University of
Melbourne policy are laid out at My Thesis in the Library and Preparation of Graduate Research Theses Rules. Advice elsewhere will reflect particular institutional requirements.  

Over the last 20 years universities have supplemented public access
to print copies of theses on library shelves with online access via
institutional repositories, such as Minerva Access. 

How Are Theses Discovered?

Theses are a link in the scholarly communications chain and the
provision of public access to them is long-standing university practice.
Discovery of print theses in university libraries has been facilitated
by discovery services like Google Books, Google Scholar and Trove. Online open access extends this discovery and access, but comes with particular challenges and opportunities.  

Why Make Your Thesis OA – Impact, Engagement and Profile

Why is OA important?  An online thesis is one way you can establish your profile in a subject area, bringing you to the attention of potential collaborators, colleagues and employers. Theses indexed in Google Scholar will include citation data if referred to in other publications. Repositories provide counts for downloads and views, indicating both volume and location of your readership. Some institutions
have begun to track “alt-metric” counts for theses, providing further
indications of impact and engagement. You may never reach the dizzying
heights of Stephen Hawking’s thesis download or altmetric count but you can always dream!  

Should You Embargo Your Thesis?

So, if OA theses are both personally rewarding and a social good, why
would you choose an embargo? In fact, embargoes are a legitimate
response to institutional and individual concerns around immediate OA. 

While there can be commercial and legal issues, or issues of cultural
sensitivity which demand embargo, most often the concern is around the
perceived threat to subsequent publication from the thesis. Is such
concern warranted? A 2014 survey of science publishers
found that over 80% would, with some qualification, accept article
submissions from work based on OA theses. A 2017 in-house survey of 50 key business and economics journals
found none would outright exclude a publication stemming from an OA
thesis. Two of the 50 commented that they would reserve the right to refuse
if there was considerable duplication between the thesis and the
submitted journal article; however, it was also noted that it would be
rare for a thesis to simply be repurposed as an article without
substantial changes!

What about books? Some publishers, based on their public statements,
see OA theses as advantageous, allowing for the early identification of
viable new publications. Another large-scale publisher survey
found that 50% of university presses in social sciences and humanities
would accept submissions based on OA theses. However, a substantial
minority would not or would do so on a case-by-case basis.  

Copyright and Your Thesis

In most parts of the world, including Australia, an OA thesis is
considered to be “published” which means that using copyright material
created by other people – “third party copyright” such as text, images
and graphs etc – requires not only explicit acknowledgment but may also
require permission from the copyright owner. However, there are some
circumstances where permission may not be required, for example for
purposes of criticism or review [or for satire or parody]. For more
information about these circumstances, see here.

Reusing your own, already-published work in your online
“thesis-by/with-publication” may also require permission from other
copyright owners. Publisher author rights policies generally support the
use of the author-accepted-manuscript (see Sherpa/Romeo). However, for OA papers published with a Creative Commons Licence, and subscription papers from some publishers like Elsevier, there is no problem using the final published version. 

Learn More

This post was written by Stephen Cramond (Manager, Institutional Repository) and Jenny McKnight (Research Consultant (Open Access)).
