
Link-based approach to study scientific software usage: the case of VOSviewer

RiuNet: Institutional Repository of the Universitat Politècnica de València


dc.contributor.author Orduña Malea, Enrique es_ES
dc.contributor.author Costas, Rodrigo es_ES
dc.date.accessioned 2023-04-28T18:00:52Z
dc.date.available 2023-04-28T18:00:52Z
dc.date.issued 2021-09 es_ES
dc.identifier.issn 0138-9130 es_ES
dc.identifier.uri http://hdl.handle.net/10251/193017
dc.description.abstract [EN] Scientific software is a fundamental player in modern science, participating in all stages of scientific knowledge production. Software occasionally supports trivial tasks, while in other instances it determines procedures, methods, protocols, results, or conclusions related to the scientific work. The growing relevance of scientific software as a research product with a value of its own has triggered the development of quantitative science studies of scientific software. The main objective of this study is to illustrate a link-based webometric approach to characterize online mentions of scientific software across different analytical frameworks. To do this, the bibliometric software VOSviewer is used as a case study. Taking VOSviewer's official website as a baseline, online mentions of this website were counted in three different analytical frameworks: academic literature via Google Scholar (988 mentioning publications), webpages via Majestic (1,330 mentioning websites), and tweets via Twitter (267 mentioning tweets). Google Scholar mentions show how VOSviewer is used as a research resource, whilst mentions in webpages and tweets show interest in VOSviewer's website from an informational and a conversational point of view. The results show that URL mentions can be used to capture a wide range of online impacts related to non-traditional research objects, such as software, thus expanding the analytical scientometric toolset by incorporating a novel digital dimension. es_ES
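
The abstract describes a link-based approach in which online mentions of VOSviewer's official website are counted across three analytical frameworks (Google Scholar publications, webpages indexed by Majestic, and tweets). A minimal Python sketch of that URL-mention counting idea follows; it is not the authors' data-collection pipeline, and the example texts, source labels, and the count_url_mentions helper are hypothetical, assuming documents from each framework are already available as plain text.

import re
from collections import Counter

# Hypothetical example texts standing in for the three analytical frameworks
# described in the abstract (scholarly publications, webpages, and tweets).
# The study itself queried Google Scholar, Majestic, and Twitter; this sketch
# only illustrates the underlying URL-mention counting idea.
documents = {
    "scholarly_publications": [
        "Maps were produced with VOSviewer (https://www.vosviewer.com).",
        "We performed a co-word analysis without mapping software.",
    ],
    "webpages": [
        "Download VOSviewer at http://www.vosviewer.com/download.",
    ],
    "tweets": [
        "New tutorial on bibliometric maps! www.vosviewer.com #scientometrics",
    ],
}

# Match mentions of the official VOSviewer website, with or without the
# scheme, the 'www.' prefix, and an optional path.
URL_PATTERN = re.compile(r"(?:https?://)?(?:www\.)?vosviewer\.com(?:/\S*)?", re.IGNORECASE)

def count_url_mentions(docs_by_source):
    """Return, per source, the number of documents mentioning the target URL."""
    counts = Counter()
    for source, texts in docs_by_source.items():
        counts[source] = sum(1 for text in texts if URL_PATTERN.search(text))
    return counts

if __name__ == "__main__":
    for source, n in count_url_mentions(documents).items():
        print(f"{source}: {n} mentioning document(s)")
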
dc.description.sponsorship RC was partially funded by the South African DST-NRF Center of Excellence in Scientometrics and Science, Technology, and Innovation Policy (SciSTIP). es_ES
dc.language English es_ES
dc.publisher Springer-Verlag es_ES
dc.relation.ispartof Scientometrics es_ES
dc.rights Attribution (by) es_ES
dc.subject Scientific software es_ES
dc.subject Link analysis es_ES
dc.subject Informetrics es_ES
dc.subject Webometrics es_ES
dc.subject Scholarly communication es_ES
dc.subject Social media metrics es_ES
dc.subject VOSviewer es_ES
dc.subject.classification BIBLIOTECONOMIA Y DOCUMENTACION es_ES
dc.title Link-based approach to study scientific software usage: the case of VOSviewer es_ES
dc.type Article es_ES
dc.identifier.doi 10.1007/s11192-021-04082-y es_ES
dc.rights.accessRights Open access es_ES
dc.contributor.affiliation Universitat Politècnica de València. Facultad de Bellas Artes - Facultat de Belles Arts es_ES
dc.description.bibliographicCitation Orduña Malea, E.; Costas, R. (2021). Link-based approach to study scientific software usage: the case of VOSviewer. Scientometrics. 126(9):8153-8186. https://doi.org/10.1007/s11192-021-04082-y es_ES
dc.description.accrualMethod S es_ES
dc.relation.publisherversion https://doi.org/10.1007/s11192-021-04082-y es_ES
dc.description.upvformatpinicio 8153 es_ES
dc.description.upvformatpfin 8186 es_ES
dc.type.version info:eu-repo/semantics/publishedVersion es_ES
dc.description.volume 126 es_ES
dc.description.issue 9 es_ES
dc.relation.pasarela S\443474 es_ES
dc.contributor.funder Centre of Excellence in Scientometrics and Science, Technology and Innovation Policy, South Africa es_ES
dc.description.references Bruns, A., Weller, K., Zimmer, M., & Proferes, N. J. (2014). A topology of Twitter research: Disciplines, methods, and ethics. Aslib Journal of Information Management, 66(3), 250–261. es_ES
dc.description.references Cronin, B., Snyder, H. W., Rosenbaum, H., Martinson, A., & Callahan, E. (1998). Invoked on the Web. Journal of the American Society for Information Science, 49(14), 1319–1328. es_ES
dc.description.references Delgado López-Cózar, E., Orduna-Malea, E., & Martín-Martín, A. (2019). Google Scholar as a data source for research assessment. In W. Glänzel, H. Moed, U. Schmoch, & M. Thelwall (Eds.), Springer handbook of science and technology indicators (pp. 95–127). Springer. es_ES
dc.description.references Delgado López-Cózar, E., Orduna-Malea, E., Martín-Martín, A., & Ayllón, J. M. (2017). Google Scholar: The big data bibliographic tool. In F. J. Cantú-Ortiz (Ed.), Research analytics: Boosting university productivity and competitiveness through scientometrics (pp. 59–80). Taylor and Francis. es_ES
dc.description.references Díaz-Faes, A., Bowman, T. D., & Costas, R. (2019). Towards a second generation of ‘social media metrics’: Characterizing Twitter communities of attention around science. PLoS ONE, 14(5), e0216408. https://doi.org/10.1371/journal.pone.0216408 es_ES
dc.description.references Du, C., Cohoon, J., Lopez, P., & Howison, J. (2021). Softcite dataset: A dataset of software mentions in biomedical and economic research publications. Journal of the Association for Information Science and Technology. https://doi.org/10.1002/asi.24454 es_ES
dc.description.references Gusenbauer, M. (2019). Google Scholar to overshadow them all? Comparing the sizes of 12 academic search engines and bibliographic databases. Scientometrics, 118(1), 177–214. es_ES
dc.description.references Hafer, L., & Kirkpatrick, A. E. (2009). Assessing open source software as a scholarly contribution. Communications of the ACM, 52(12), 126–129. es_ES
dc.description.references Halavais, A. (2008). The hyperlink as organizing principle. In J. Turow & L. Lokman (Eds.), The hyperlinked Society: Questioning connections in the digital age (pp. 39–55). The University of Michigan Press. es_ES
dc.description.references Hannay, J. E., MacLeod, C., Singer, J., Langtangen, H. P., Pfahl, D., & Wilson, G. (2009). How do scientists develop and use scientific software? Proceedings of the 2009 ICSE workshop on software engineering for computational science and engineering, SECSE 2009, 1–8. https://ieeexplore.ieee.org/abstract/document/5069155. es_ES
dc.description.references Haustein, S., Bowman, T. D., & Costas, R. (2016). Interpreting “altmetrics”: Viewing acts on social media through the lens of citation and social theories. In C. Sugimoto (Ed.), Theories of informetrics and scholarly communication (pp. 372–406). De Gruyter Saur. es_ES
dc.description.references Hey, T., Tansley, S., & Tolle, K. M. (Eds.) (2009). The fourth paradigm: Data-intensive scientific discovery. Redmond, WA: Microsoft Research. https://www.microsoft.com/en-us/research/wp-content/uploads/2009/10/Fourth_Paradigm.pdf. es_ES
dc.description.references Howison, J., & Bullard, J. (2016). Software in the scientific literature: Problems with seeing, finding, and using software mentioned in the biology literature. Journal of the Association for Information Science and Technology, 67(9), 2137–2155. es_ES
dc.description.references Howison, J., & Herbsleb, J. D. (2011). Scientific software production: incentives and collaboration. Proceedings of the ACM 2011 conference on computer supported cooperative work –CSCW ’11, 513–522. https://doi.org/10.1145/1958824.1958904 es_ES
dc.description.references Howison, J., Deelman, E., McLennan, M. J. M., Da Silva, R. F., & Herbsleb, J. D. (2015). Understanding the scientific software ecosystem and its impact: Current and future measures. Research Evaluation, 24(4), 454–470. es_ES
dc.description.references Jansen, B. J., Jung, S.G., & Salminen, J. (2020). Data Quality in Website Traffic Metrics: A Comparison of 86 Websites Using Two Popular Analytics Services. http://www.bernardjjansen.com/uploads/2/4/1/8/24188166/traffic_analytics_comparison.pdf. es_ES
dc.description.references Jones, D. (2012). Flow Metrics™ will change the way you look at links. Majestic Blog. https://blog.majestic.com/development/flow-metrics. es_ES
dc.description.references Katz D. S., Choi S-. C. T., Niemeyer, K. E. et al. (2016). Report on the third workshop on sustainable software for science: practice and experiences (WSSSPE3). https://arxiv.org/abs/1602.02296. es_ES
dc.description.references Li, K., Chen, P. Y., & Yan, E. (2019). Challenges of measuring software impact through citations: An examination of the lme4 R package. Journal of Informetrics, 13(1), 449–461. es_ES
dc.description.references Li, K., & Yan, E. (2018). Co-mention network of R packages: Scientific impact and clustering structure. Journal of Informetrics, 12(1), 87–100. es_ES
dc.description.references Li, K., Yan, E., & Feng, Y. (2017). How is R cited in research outputs? Structure, impacts, and citation standard. Journal of Informetrics, 11(4), 989–1002. es_ES
dc.description.references Lepori, B., Aguillo, I. F., & Seeber, M. (2014). Size of web domains and interlinking behavior of higher education institutions in Europe. Scientometrics, 100(2), 497–518. es_ES
dc.description.references Niemeyer, K. E., Smith, A. M., & Katz, D. S. (2016). The challenge and promise of software citation for credit, identification, discovery, and reuse. Journal of Data and Information Quality, 7(4), 1–5. es_ES
dc.description.references Orduna-Malea, E. (2021). Dot-Science Top Level Domain: Academic websites or dumpsites? Scientometrics, 126(4), 3565–3591. https://doi.org/10.1007/s11192-020-03832-8 es_ES
dc.description.references Orduna-Malea, E. (2020). Investigando con Twitter: una mirada según el Reglamento General de Protección de Datos. In Francisca Ramón-Fernández (Ed.). Marco jurídico de la ciencia de datos (pp. 331–378). Valencia: Tirant lo Blanch. es_ES
dc.description.references Orduna-Malea, E., & Alonso-Arroyo, A. (2017). Cybermetric techniques to evaluate organizations using web-based data. Chandos Publishing. es_ES
dc.description.references Orduna-Malea, E., Ayllón, J. M., Martín-Martín, A., & Delgado López-Cózar, E. (2015). Methods for estimating the size of Google Scholar. Scientometrics, 104(3), 931–949. es_ES
dc.description.references Orduna Malea, E., Martín-Martín, A., & Delgado-López-Cózar, E. (2017). Google Scholar as a source for scholarly evaluation: A bibliographic review of database errors. Revista Española De Documentación Científica, 40(4), 1–33. es_ES
dc.description.references Orduna-Malea, E., & Regazzi, J. J. (2014). US academic libraries: Understanding their web presence and their relationship with economic indicators. Scientometrics, 98(1), 315–336. es_ES
dc.description.references Ortega, J. L. (2014). Academic search engines: A quantitative outlook. Elsevier. es_ES
dc.description.references Ovadia, S. (2009). Exploring the potential of Twitter as a research tool. Behavioral & Social Sciences Librarian, 28(4), 202–205. es_ES
dc.description.references Pan, X., Cui, M., Yu, X., & Hua, W. (2017). How is CiteSpace used and cited in the literature? An analysis of the articles published in English and Chinese core journals. ISSI 2017–16th International conference on Scientometrics and Informetrics. http://issi-society.org/proceedings/issi_2017/2017ISSI%20Conference%20Proceedings.pdf. es_ES
dc.description.references Pan, X., Yan, E., & Hua, W. (2016). Disciplinary differences of software use and impact in scientific literature. Scientometrics, 109(3), 1–18. es_ES
dc.description.references Pan, X., Yan, E., Cui, M., & Hua, W. (2018). Examining the usage, citation, and diffusion patterns of bibliometric mapping software: A comparative study of three tools. Journal of Informetrics, 12(2), 481–493. es_ES
dc.description.references Pan, X., Yan, E., Cui, M., & Hua, W. (2019). How important is software to library and information science research? A content analysis of full-text publications. Journal of Informetrics, 13(1), 397–406. es_ES
dc.description.references Pan, X., Yan, E., Wang, Q., & Hua, W. (2015). Assessing the impact of software on science: A bootstrapped learning of software entities in full-text papers. Journal of Informetrics, 9(4), 860–871. es_ES
dc.description.references Park, H. W., & Thelwall, M. (2003). Hyperlink analyses of the World Wide Web: A review. Journal of computer-mediated communication. https://doi.org/10.1111/j.1083-6101.2003.tb00223.x es_ES
dc.description.references Park, H., & Wolfram, D. (2019). Research software citation in the Data Citation Index: Current practices and implications for research software sharing and reuse. Journal of Informetrics, 13(2), 574–582. es_ES
dc.description.references Pia, M. G., Basaglia, T., Bell, Z. W., & Dressendorfer, P. V. (2009). Geant4 in scientific literature. IEEE Nuclear Science Symposium Conference Record, 189–194. https://ieeexplore.ieee.org/document/5401810. es_ES
dc.description.references Piwowar, H. A. (2013). Value all research products. Nature, 493, 159. es_ES
dc.description.references Pradal, C., Varoquaux, G., & Langtangen, H. P. (2013). Publishing scientific software matters. Journal of Computational Science, 4(5), 311–312. es_ES
dc.description.references Smith, K. (2020). 58 Incredible and Interesting Twitter Stats and Statistics. Brandwatch. https://www.brandwatch.com/blog/twitter-stats-and-statistics. es_ES
dc.description.references Smith, A. M., Katz, D. S., & Niemeyer, K. E. (2016). Software citation principles. PeerJ Computer Science, 2, e86. https://peerj.com/articles/cs-86/. es_ES
dc.description.references Soito, L., & Hwang, L. J. (2016). Citations for Software: Providing identification, access and recognition for research software. IJDC, 11(2), 48–63. es_ES
dc.description.references Stewart, B. (2017). Twitter as method: Using Twitter as a tool to conduct research. In L. Sloan & A. Quan-Haase (Eds.), Social media research methods (pp. 251–266). es_ES
dc.description.references Thelwall, M. (2004). Link Analysis: An information science approach. Elsevier. es_ES
dc.description.references Thelwall, M. (2006). Interpreting social science link analysis research: A theoretical framework. Journal of the American Society for Information Science and Technology, 57(1), 60–68. es_ES
dc.description.references Thelwall, M., & Kousha, K. (2016). Academic software downloads from google code. Information Research, 21(1). http://informationr.net/ir/21-1/paper709.html#.XzelJ-gzbIU. es_ES
dc.description.references Van Eck, N., & Waltman, L. (2010). Software survey: VOSviewer, a computer program for bibliometric mapping. Scientometrics, 84(2), 523–538. es_ES
dc.description.references Williams, S. A., Terras, M. M., & Warwick, C. (2013). What do people study when they study Twitter? Classifying Twitter related academic papers. Journal of Documentation, 69(3), 384–410. es_ES
dc.description.references Wouters, P., Zahedi, Z., & Costas, R. (2019). Social media metrics for new research evaluation. In W. Glänze, H. F. Moed, U. Schmoch, & M. Thelwall (Eds.), Springer handbook of science and technology indicators (pp. 687–713). Springer. es_ES
dc.description.references Yang, B., Rousseau, R., Wang, X., & Huang, S. (2018). How important is scientific software in bioinformatics research? A comparative study between international and Chinese research communities. Journal of the Association for Information Science and Technology, 69(9), 1122–1133. es_ES

