Measuring the Impact of an Academic Research Paper

    Mendeley Stats

    Knowing the impact of your research can be invaluable when you’re applying for funding, seeking a new position or working towards a promotion. Mendeley Stats gives you publication readership data within days of publication, helping you understand quickly and in detail how your publications are being read, shared and cited.


    Citations

    Citations are a well-established measure of research impact; a citation can mean recognition or validation of one's research by others. Elsevier's weekly CiteAlert service helps authors keep track of where their work is being cited.
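
    CiteAlert arrives by email, but citation counts can also be pulled programmatically. Below is a minimal sketch that uses the public Crossref REST API rather than an Elsevier service; the DOI is just an example, and Crossref’s count may differ from Scopus or Web of Science figures.

```python
import requests

# Minimal sketch: fetch a citation count from the public Crossref REST API.
# Crossref reports the number of citing works it knows about in the
# "is-referenced-by-count" field of a work record.
def crossref_citation_count(doi: str) -> int:
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
    resp.raise_for_status()
    return resp.json()["message"]["is-referenced-by-count"]

if __name__ == "__main__":
    # Example DOI; substitute your own article's DOI.
    print(crossref_citation_count("10.1007/s11192-011-0352-7"))
```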


    Usage

    A more immediate way to track the reach of a paper is to look into how the article is being viewed and downloaded online. Elsevier's Article Usage Alerts send corresponding authors a quarterly email linking to a dashboard of ScienceDirect usage data for the first year after publication of their article.
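
    The usage dashboard itself is interactive, but if you export the figures you can summarize them yourself. The sketch below is hypothetical: the CSV layout (date, views and downloads columns) is invented for illustration and does not reflect a documented ScienceDirect export format.

```python
import csv
from collections import defaultdict

# Hypothetical sketch: roll a per-day usage export up into monthly totals.
# The column names ("date", "views", "downloads") are assumptions, not a
# documented ScienceDirect export schema.
def monthly_usage(path: str) -> dict:
    totals = defaultdict(lambda: {"views": 0, "downloads": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            month = row["date"][:7]  # "YYYY-MM-DD" -> "YYYY-MM"
            totals[month]["views"] += int(row["views"])
            totals[month]["downloads"] += int(row["downloads"])
    return dict(totals)

print(monthly_usage("usage_export.csv"))  # path is a placeholder
```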


    Mendeley Readership

    Mendeley is the leading source of readership information, drawing on its global community of millions of researchers. The Scopus Mendeley Readership app shows the total number of readers who have added a paper to their Mendeley libraries, along with the top three countries, subject areas and career statuses of those readers.
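
    Readership figures can also be queried directly. The sketch below assumes an OAuth2 access token for the Mendeley API and uses the catalog endpoint’s "stats" view; the field names follow Mendeley’s API documentation, but treat this as a sketch rather than production code.

```python
import requests

ACCESS_TOKEN = "YOUR_OAUTH_TOKEN"  # placeholder; obtain via Mendeley OAuth2

# Sketch: look up readership statistics for a DOI via the Mendeley catalog
# API. The "stats" view includes reader_count and per-country breakdowns,
# mirroring what the Scopus Mendeley Readership app displays.
def mendeley_readership(doi: str) -> dict:
    resp = requests.get(
        "https://api.mendeley.com/catalog",
        params={"doi": doi, "view": "stats"},
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Accept": "application/vnd.mendeley-document.1+json",
        },
        timeout=10,
    )
    resp.raise_for_status()
    docs = resp.json()
    return docs[0] if docs else {}

stats = mendeley_readership("10.1007/s11192-011-0352-7")  # example DOI
print(stats.get("reader_count"), stats.get("reader_count_by_country"))
```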


    Alternative Metrics

    Plum Analytics’ metrics are incorporated into Elsevier’s world-leading research products (Scopus, ScienceDirect and Pure) as well as Elsevier’s journal and society partner sites. PlumX Metrics provide insight into the ways people interact with individual pieces of research output (articles, conference proceedings, book chapters and more) online. PlumX groups these metrics into five categories: Usage, Captures, Mentions, Social Media, and Citations; a sketch of that rollup follows below. Many journal homepages display the ten most popular articles according to PlumX social media mentions.
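
    The five-category scheme is straightforward to mirror when aggregating metrics yourself. In the sketch below the category map follows PlumX’s published scheme, but the raw metric names and sample counts are invented for illustration.

```python
from collections import defaultdict

# Hypothetical sketch: roll raw metric names up into PlumX's five
# categories. The mapping keys are invented labels, not PlumX API fields.
PLUMX_CATEGORIES = {
    "abstract_views": "Usage", "link_outs": "Usage",
    "readers": "Captures", "exports_saves": "Captures",
    "news_mentions": "Mentions", "blog_mentions": "Mentions",
    "tweets": "Social Media", "facebook_shares": "Social Media",
    "scopus_citations": "Citations", "crossref_citations": "Citations",
}

def rollup(raw_counts: dict) -> dict:
    totals = defaultdict(int)
    for metric, count in raw_counts.items():
        totals[PLUMX_CATEGORIES.get(metric, "Other")] += count
    return dict(totals)

# Invented sample counts for a single article.
print(rollup({"abstract_views": 420, "readers": 37,
              "tweets": 12, "scopus_citations": 5}))
```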
