Excellence or Misconduct: How the Visibility of Team Leaders Impacts the Research Project Competition in the Republic of Moldova?

Authors

DOI:

https://doi.org/10.15407/scine19.02.003

Keywords:

peer review, bibliometric indicators, correlation, research evaluation, project selection

Abstract

Introduction. Distributing public funds to the “best” researchers is a key element of science policy, and evaluation is fundamental to the allocation of competitive funding. The flaws of peer review have led to increased interest in the use of bibliometric indicators for the evaluation of research project proposals.
Problem Statement. The advantages and advance of bibliometrics have stimulated interest in the correlation between peer review and applicants’ bibliometric indicators. The results of such studies are varied and heterogeneous, and studies of this kind remain scarce in Eastern Europe.
Purpose. To establish the correlation between peer review and the bibliometric indicators of project team leaders within the call for research projects in Moldova financed from public funds for 2020—2023.
Material and Methods. Statistical correlation of the results of the national competition of R&D proposals (evaluation and funding) with the bibliometric indicators of project team leaders (publications and patents); analysis of the contextual factors influencing this correlation.
Results. The analysis has shown a positive, albeit weak, correlation between the scores assigned by experts and the previous performance of team leaders. The strongest relation is between the call results and the Hirsch index in the Web of Science and Scopus databases. However, the projects proposed by the most cited researchers in WoS and Scopus or by the founders of scientific schools did not receive funding.
Conclusions. The analysis of the national R&D competition has shown that the previous scientific performance of team leaders influenced the evaluation results and the funding of project proposals. However, these dependencies are not linear and seem to be affected by conflicts of interest and “old boys” schemes. This calls for significant changes to the process: ensuring transparency, involving foreign experts, and using bibliometric indicators in evaluation.
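The rank correlation described in Material and Methods can be illustrated with a minimal sketch. The data below are hypothetical, not the study’s dataset; Spearman’s rho is implemented directly so the computation is transparent (in practice a library routine such as SciPy’s `spearmanr` would be used):

```python
# Illustrative sketch: Spearman rank correlation between hypothetical
# expert scores for 8 proposals and the team leaders' h-index values.

def ranks(values):
    """Assign 1-based ranks; tied values share the average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: expert scores and leaders' h-indices.
scores = [78, 85, 62, 90, 70, 88, 55, 81]
h_index = [5, 9, 3, 7, 4, 12, 2, 6]
print(round(spearman_rho(scores, h_index), 3))  # → 0.929
```

A rho near +1 would mean expert scores closely follow the leaders’ bibliometric standing; the weak positive correlation reported in the Results corresponds to a rho well below that.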


References

Cabezas-Clavijo, Á., Robinson-García, N., Escabias, M., Jiménez-Contreras, E. (2013). Reviewers’ Ratings and Bibliometric Indicators: Hand in Hand When Assessing Over Research Proposals. PloS One, 8(6), e68258. https://doi.org/10.1371/journal.pone.0068258.

van den Besselaar, P., Leydesdorff, L. (2009). Past performance, peer review, and project selection: A case study in the social and behavioral sciences. Research Evaluation, 18(4), 273—288. https://doi.org/10.3152/095820209X475360.

Lešková, A. (2018). The success of peer review evaluation in university research funding — the case study from Slovakia. In Knowledge Based Sustainable Economic Development: Proceedings of the 4th International Conference ERAZ 2018 (Eds. A. Myrtaj (Rexhepi), I. Malollari, L. Pinguli). Sofia: AEMB (pp. 372—382). https://doi.org/10.31410/eraz.2018.372.

Južnič, P., Pečlin, S., Žaucer, M., Mandelj, T. (2010). Scientometric indicators: peer-review, bibliometric methods and conflict of interests. Scientometrics, 85(2), 429—441. https://doi.org/10.1007/s11192-010-0230-8.

Abramo, G., D’Angelo, C. A., Reale, E. (2019). Peer review vs bibliometrics: which method better predicts the scholarly impact of publications? Scientometrics, 121, 537—554. https://doi.org/10.1007/s11192-019-03184-y.

Geuna, A., Martin, B. R. (2003). University research evaluation and funding: an International comparison. Minerva, 41, 277—304. https://doi.org/10.1023/B:MINE.0000005155.70870.bd.

Abramo, G., D’Angelo, C. A. (2011). Evaluating research: from informed peer review to bibliometrics. Scientometrics, 87(3), 499—514. https://doi.org/10.1007/s11192-011-0352-7.

Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., ..., Johnson, B. (2015). The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management. Bristol: HEFCE. https://doi.org/10.13140/RG.2.1.4929.1363.

Aksnes, D. W., Taxt, R. E. (2004). Peer reviews and bibliometric indicators: a comparative study at a Norwegian university. Research Evaluation, 13(1), 33—41. https://doi.org/10.3152/147154404781776563.

Bornmann, L., Wallon, G., Ledin, A. (2008). Does the Committee Peer Review Select the Best Applicants for Funding? An Investigation of the Selection Process for Two European Molecular Biology Organization Programmes. PloS One, 3(10), e3480. https://doi.org/10.1371/journal.pone.0003480.

Kulczycki, E., Korzeń, M., Korytkowski, P. (2017). Toward an excellence-based research funding system: Evidence from Poland. Journal of Informetrics, 11(1), 282—298. https://doi.org/10.1016/j.joi.2017.01.001.

Li, J., Sanderson, M., Willett, P., Norris, M., Oppenheim, C. (2010). Ranking of library and information science researchers: Comparison of data sources for correlating citation data, and expert judgments. Journal of Informetrics, 4(4), 554—563. https://doi.org/10.1016/j.joi.2010.06.005.

Allen, L., Jones, C., Dolby, K., Lynn, D., Walport, M. (2009). Looking for landmarks: The role of expert review and bibliometric analysis in evaluating scientific publication outputs. PloS One, 4(6), e5910. https://doi.org/10.1371/journal.pone.0005910.

Moed, H. F. (2008). UK Research Assessment Exercises: Informed judgments on research quality or quantity? Scientometrics, 74(1), 153—161. https://doi.org/10.1007/s11192-008-0108-1.

Bertocchi, G., Gambardella, A., Jappelli, T., Nappi, C. A., Peracchi, F. (2015). Bibliometric evaluation vs informed peer review: Evidence from Italy. Research Policy, 44(2), 451—466. https://doi.org/10.1016/j.respol.2014.08.004.

van Raan, A. F. J. (2006). Comparison of the Hirsch-index with standard bibliometric indicators and with peer judgment for 147 chemistry research groups. Scientometrics, 67(3), 491—502. https://doi.org/10.1556/Scient.67.2006.3.10.

Rinia, E. J., van Leeuwen, T. N., van Vuren, H. G., van Raan, A. F. J. (1998). Comparative analysis of a set of bibliometric indicators and central peer review criteria: Evaluation of condensed matter physics in the Netherlands. Research Policy, 27(1), 95—107. https://doi.org/10.1016/S0048-7333(98)00026-2.

Lovegrove, B. G., Johnson, S. D. (2008). Assessment of Research Performance in Biology: How Well Do Peer Review and Bibliometry Correlate? BioScience, 58(2), 160—164. https://doi.org/10.1641/B580210.

Reale, E., Barbara, A., Costantini, A. (2007). Peer review for the evaluation of academic research: lessons from the Italian experience. Research Evaluation, 16(3), 216—228. https://doi.org/10.3152/095820207X227501.

Maier, G. (2006). Impact factors and peer judgment: the case of regional science journals. Scientometrics, 69(3), 651—667. https://doi.org/10.1007/s11192-006-0175-0.

Thomas, R., Watkins, D. (1998). Institutional research rankings via bibliometric analysis and direct peer-review: A comparative case study with policy implications. Scientometrics, 41(3), 335—355. https://doi.org/10.1007/BF02459050.

Baccini, A., De Nicolao, G. (2016). Do they agree? Bibliometric evaluation versus informed peer review in the Italian research assessment exercise. Scientometrics, 108(3), 1651—1671. https://doi.org/10.1007/s11192-016-1929-y.

Rahman, J. A. I. M., Guns, R., Rousseau, R., Engels, T. C. E. (2017). Cognitive Distances between Evaluators and Evaluees in Research Evaluation: A Comparison between Three Informetric Methods at the Journal and Subject Category Aggregation Level. Front. Res. Metr. Anal., 2(6), 1—13. https://doi.org/10.3389/frma.2017.00006.

Bornmann, L., Daniel, H-D. (2006). Selecting scientific excellence through committee peer review — A citation analysis of publications previously published to approval or rejection of post-doctoral research fellowship applicants. Scientometrics, 68(3), 427—440. https://doi.org/10.1007/s11192-006-0121-1.

Meho, L. I., Sonnenwald, D. H. (2000). Citation ranking versus peer evaluation of senior faculty research performance: a case study of Kurdish scholarship. JASIS, 51(2), 123—138. https://doi.org/10.1002/(SICI)1097-4571(2000)51:2<123::AID-ASI4>3.0.CO;2-N.

Vieira, E. S., Gomes, J. F. (2018). The peer-review process: The most valued dimensions according to the researcher’s scientific career. Research Evaluation, 27(3), 246—261. https://doi.org/10.1093/reseval/rvy009.

Derrick, G. E., Haynes, A., Chapman, S., Hall, W. D. (2011). The association between four citation metrics and peer rankings of research influence of Australian researchers in six fields of public health. PLoS One, 6(4), e18521. https://doi.org/10.1371/journal.pone.0018521.

Nicol, M., Henadeera, K., Butler, L. (2007). NHMRC grant applications: a comparison of “track record” scores allocated by grant assessors with bibliometric analysis of publications. Medical Journal of Australia, 187(6), 348—352. https://doi.org/10.5694/j.1326-5377.2007.tb01279.x.

Hornbostel, S., Böhmer, S., Klingsporn, B., Neufeld, J., von Ins, M. (2009). Funding of young scientist and scientific excellence. Scientometrics, 79(1), 171—190. https://doi.org/10.1007/s11192-009-0411-5.

Melin, G., Danell, R. (2006). The top eight percent: development of approved and rejected applicants for a prestigious grant in Sweden. Science and Public Policy, 33(10), 702—712. https://doi.org/10.3152/147154306781778579.

Matcharashvili, T., Tsveraidze, Z., Sborshchikovi, A., Matcharashvili, T. (2014). The importance of bibliometric indicators for the analysis of research performance in Georgia. Trames Journal of the Humanities and Social Sciences, 18(4), 345—356. https://doi.org/10.3176/tr.2014.4.03.

NARD. (2019). “State Program (2020—2023)” competition. URL: https://ancd.gov.md/ro/content/concurs-deschisprogram-de-stat-2020-2023 (Last accessed: 04.09.2022) [in Romanian].

NARD. (2020). Order regarding the approval of the projects selected for financing and the volume of budget allocations for the year 2020 of the projects within the “State Program (2020—2023)” competition. (Order No. 01-PC, January 10). URL: https://ancd.gov.md/ro/content/ordine-de-finan%C8%9Bare-0 (Last accessed: 04.09.2022) [in Romanian].

Cuciureanu, G., Cojocaru, I., Minciună, V., Manic, S., Manic, L. (2020). Competition of Project Proposals “State Program 2020—2023” — A New Step Towards the Dissolution of Science in the Republic of Moldova? Intellectus, 1—2, 116—126. URL: https://ibn.idsi.md/ro/vizualizare_articol/108702 (Last accessed: 04.09.2022) [in Romanian].

Langfeldt, L. (2004). Expert panels evaluating research: decision-making and sources of bias. Research Evaluation, 13(1), 51—62. https://doi.org/10.3152/147154404781776536.

Moldoveanu, B., Cuciureanu, G. (2020). Publishing as an Indicator of Scientific Research Quality and Ethics: The Case of Law Journals from Moldova. Sci Eng. Ethics, 26(2), 1039—1052. https://doi.org/10.1007/s11948-020-00189-2.

Jargin, S. V. (2011). Pathology in the former Soviet Union: scientific misconduct and related phenomena. Dermatology Practical & Conceptual, 1(1), 75—81. https://doi.org/10.5826/dpc.0101a16.

van Leeuwen, T. N., Moed, H. F. (2012). Funding decisions, peer review, and scientific excellence in physical sciences, chemistry, and geosciences. Research Evaluation, 21(3), 189—198. https://doi.org/10.1093/reseval/rvs009.

Published

2023-03-18

How to Cite

Cuciureanu, G., Turcan, N., Cojocaru, I., & Cojocaru, I. (2023). Excellence or Misconduct: How the Visibility of Team Leaders Impacts the Research Project Competition in the Republic of Moldova? Science and Innovation, 19(2), 3–16. https://doi.org/10.15407/scine19.02.003

Issue

Section

General Questions on Modern Scientific, Technical and Innovation Policy