
Journal of Library and Information Science in Agriculture ›› 2023, Vol. 35 ›› Issue (2): 95-104. DOI: 10.13998/j.cnki.issn1002-1248.23-0090


Applications of Crowdsourcing in Evidence Synthesis: A Case Study of Cochrane Crowd

LI Xiao1,2, QU Jiansheng1,2,3,*, KOU Leilei4   

  1. Northwest Institute of Eco-Environment and Resources, Chinese Academy of Sciences, Lanzhou 730000;
    2. Department of Library Information and Archives Management, School of Economics and Management, University of Chinese Academy of Sciences, Beijing 100049;
    3. Chengdu Library and Information Center, Chinese Academy of Sciences, Chengdu 610041;
    4. Qinghai-Tibet Plateau Human Environment Research Center, Lanzhou University, Lanzhou 730000
  • Received: 2023-01-21    Published: 2023-04-17

Abstract: [Purpose/Significance] Evidence-informed decision-making is a means of bridging the gap between research and policy, and evidence synthesis has become an important tool for evidence-based decision-making in many fields. However, evidence synthesis is resource-intensive, especially when it concerns scientific knowledge on complex issues, and its current efficiency cannot meet the needs of decision makers. Crowdsourcing is seen as a potential way to improve the productivity of evidence synthesis, but research and practice on its application are still in their infancy. This study takes the Cochrane Crowd citizen science project as a case to summarize the practical applications of crowdsourcing in evidence synthesis. A comprehensive analysis of the crowdsourcing mechanism of the Cochrane Crowd project can provide a reference for the use of crowdsourcing in evidence synthesis, improve the production efficiency of evidence synthesis, and supply timely and reliable scientific information for evidence-based decision-making. [Method/Process] Using literature research, network investigation, and case analysis, the application mechanism of crowdsourcing in the Cochrane Crowd project was analyzed along five dimensions: the crowdsourcer, the volunteers, the crowdsourcing task, the Cochrane Crowd platform, and effectiveness evaluation. Cochrane Crowd provides an easy-to-use interface for engaging volunteers, and it designs task-focused learning activities, diverse ways of accessing tasks, interactive online training modules, and feedback mechanisms to improve the likelihood that volunteers perform tasks correctly. At the platform level, an agreement algorithm aggregates the crowd's classification results, which further improves the probability that records are classified correctly. In addition, the platform has used the records identified by the crowd to build a machine learning model, the RCT classifier, which predicts how likely a new citation is to describe a randomized controlled trial (RCT) and thereby reduces the manual screening burden. [Results/Conclusions] Crowdsourcing is an effective method for improving the efficiency of evidence synthesis and shortening its production cycle. With comprehensive participant training and appropriate quality control mechanisms, crowdsourcing can produce high-quality results that meet the "gold standard" of evidence synthesis. To motivate volunteers and sustain their engagement, participants should be given clear goals, well-defined tasks, and timely feedback or rewards. Interest and activity in introducing crowdsourcing into evidence synthesis are growing rapidly, and new tools and platforms that facilitate crowdsourcing need further development as researchers from different disciplines apply it to evidence synthesis projects. Future research should examine the application of crowdsourcing in different fields and at different stages of evidence synthesis.
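The two platform-level mechanisms described in the abstract, an agreement algorithm that aggregates volunteer classifications and a machine learning RCT classifier trained on crowd-labelled records, can be illustrated with a minimal Python sketch. This is not Cochrane Crowd's actual implementation: the consecutive-agreement threshold, the vote labels, the escalation step, and the bag-of-words model are all illustrative assumptions.

# A minimal sketch (not Cochrane Crowd's actual implementation) of the
# two mechanisms described above. Thresholds, labels, and features are
# illustrative assumptions only.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def aggregate_votes(labels, required_agreement=3):
    """Agreement rule: resolve a record once `required_agreement`
    consecutive identical votes are cast; otherwise escalate the
    record to an expert resolver (assumed workflow and threshold)."""
    streak, count = None, 0
    for label in labels:
        count = count + 1 if label == streak else 1
        streak = label
        if count >= required_agreement:
            return streak
    return "needs-expert"

print(aggregate_votes(["RCT", "RCT", "RCT"]))         # -> RCT
print(aggregate_votes(["RCT", "Not RCT", "Unsure"]))  # -> needs-expert

# Toy crowd-labelled corpus: citation texts with aggregated labels
# (1 = describes an RCT, 0 = does not). Real training data would be
# the large set of records already classified by the crowd.
texts = [
    "A randomized controlled trial of vitamin D in children",
    "Participants were randomly assigned to intervention or placebo",
    "A retrospective cohort study of hospital admissions",
    "Case report of a rare adverse drug event",
]
labels = [1, 1, 0, 0]

# Bag-of-words classifier trained on the crowd's decisions.
vectorizer = TfidfVectorizer()
model = LogisticRegression().fit(vectorizer.fit_transform(texts), labels)

# Score a new citation: estimated probability that it describes an RCT.
new = vectorizer.transform(["A double-blind randomized trial of aspirin"])
print(model.predict_proba(new)[0][1])

Under these assumptions, records that such a classifier scores as very unlikely to describe RCTs could be filtered out before human screening, which is how a model of this kind reduces the manual burden.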

Key words: evidence synthesis, crowdsourcing, Cochrane Crowd, evidence-based research

CLC Number: G254
[1] LITTELL J H. Conceptual and practical classification of research reviews and other evidence synthesis products[J]. Campbell systematic reviews, 2018, 14(1): 1-21.
[2] DONNELLY C A, BOYD I, CAMPBELL P, et al. Four principles to make evidence synthesis more useful for policy[Z]. Nature Publishing Group, 2018.
[3] BORNMANN L, MUTZ R. Growth rates of modern science: A bibliometric analysis based on the number of publications and cited references[J]. Journal of the association for information science and technology, 2015, 66(11): 2215-2222.
[4] HADDAWAY N R, WESTGATE M J. Predicting the time needed for environmental systematic reviews and systematic maps[J]. Conservation biology, 2019, 33(2): 434-443.
[5] CHALMERS I, BRACKEN M B, DJULBEGOVIC B, et al. How to increase value and reduce waste when research priorities are set[J]. The lancet, 2014, 383(9912): 156-165.
[6] TSAFNAT G, GLASZIOU P, CHOONG M K, et al. Systematic review automation technologies[J]. Systematic reviews, 2014, 3(1): 1-15.
[7] BROWN A W, ALLISON D B. Using crowdsourcing to evaluate published scientific literature: Methods and example[J]. PLoS one, 2014, 9(7): e100647.
[8] SUN Y, CHENG P, WANG S, et al. Crowdsourcing information extraction for biomedical systematic reviews[J]. arXiv preprint arXiv:1609.01017, 2016.
[9] NAMA N, ILIRIANI K, XIA M Y, et al. A pilot validation study of crowdsourcing systematic reviews: Update of a searchable database of pediatric clinical trials of high-dose vitamin D[J]. Translational pediatrics, 2017, 6(1): 18.
[10] NAMA N, SAMPSON M, BARROWMAN N, et al. Crowdsourcing the citation screening process for systematic reviews: Validation study[J]. Journal of medical Internet research, 2019, 21(4): e12953.
[11] MORTENSEN M L, ADAM G P, TRIKALINOS T A, et al. An exploration of crowdsourcing citation screening for systematic reviews[J]. Research synthesis methods, 2017, 8(3): 366-386.
[12] PIANTA M J, MAKRAI E, VERSPOOR K M, et al. Crowdsourcing critical appraisal of research evidence (CrowdCARE) was found to be a valid approach to assessing clinical research quality[J]. Journal of clinical epidemiology, 2018, 104: 8-14.
[13] BUJOLD M, GRANIKOV V, SHERIF R E, et al. Crowdsourcing a mixed systematic review on a complex topic and a heterogeneous population: Lessons learned[J]. Education for information, 2018, 34(4): 293-300.
[14] Cochrane. About us[EB/OL]. [2022-09-04]. https://www.cochrane.org/about-us.
[15] Cochrane Collaboration. Cochrane Crowd[EB/OL]. [2022-06-18]. https://community.cochrane.org/help/tools-and-software/cochrane-crowd.
[16] SHANG H L, ZHANG S J, WEI Z P, et al. Evidence integration framework of evidence-based digital humanities[J]. Journal of library and information science in agriculture, 2022, 34(11): 38-47. (in Chinese)
[17] PHAM B, BAGHERI E, RIOS P, et al. Improving the conduct of systematic reviews: A process mining perspective[J]. Journal of clinical epidemiology, 2018, 103: 101-111.
[18] NOEL-STORR A, DOOLEY G, ELLIOTT J, et al. An evaluation of Cochrane Crowd found that crowdsourcing produced accurate results in identifying randomized trials[J]. Journal of clinical epidemiology, 2021, 133: 130-139.
[19] NOEL-STORR A, DOOLEY G, AFFENGRUBER L, et al. Citation screening using crowdsourcing and machine learning produced accurate results: Evaluation of Cochrane's modified Screen4Me service[J]. Journal of clinical epidemiology, 2021, 130: 23-31.
[20] NOEL-STORR A, GARTLEHNER G, DOOLEY G, et al. Crowdsourcing the identification of studies for COVID-19-related Cochrane Rapid Reviews[J]. Research synthesis methods, 2022, 13(5): 585-594.
[21] NOEL-STORR A H, REDMOND P, LAMÉ G, et al. Crowdsourcing citation-screening in a mixed-studies systematic review: A feasibility study[J]. BMC medical research methodology, 2021, 21(1): 1-10.
[22] THOMAS J, NOEL-STORR A, MARSHALL I, et al. Living systematic reviews: 2. Combining human and machine effort[J]. Journal of clinical epidemiology, 2017, 91: 31-37.
[23] NAKATSU R T, GROSSMAN E B, IACOVOU C L. A taxonomy of crowdsourcing based on task complexity[J]. Journal of information science, 2014, 40(6): 823-834.
[24] PRPIC J, SHUKLA P P, KIETZMANN J H, et al. How to work a crowd: Developing crowd capital through crowdsourcing[J]. Business horizons, 2015, 58(1): 77-85.
[25] ESTELLÉS-AROLAS E, GONZÁLEZ-LADRÓN-DE-GUEVARA F. Towards an integrated crowdsourcing definition[J]. Journal of information science, 2012, 38(2): 189-200.
[26] HOSSEINI M, PHALP K, TAYLOR J, et al. The four pillars of crowdsourcing: A reference model[C]//2014 IEEE Eighth International Conference on Research Challenges in Information Science (RCIS). Piscataway, NJ, USA: IEEE, 2014: 1-12.
[27] ZHAO Y X, ZHU Q H. Evaluation on crowdsourcing research: Current status and future direction[J]. Information systems frontiers, 2014, 16(3): 417-434.
[28] PEDERSEN J, KOCSIS D, TRIPATHI A, et al. Conceptual foundations of crowdsourcing: A review of IS research[C]//2013 46th Hawaii International Conference on System Sciences. Piscataway, NJ, USA: IEEE, 2013: 579-588.
[29] The Cochrane Library. Cochrane COVID-19 register of studies[EB/OL]. [2022-09-05]. https://covid-19.cochrane.org/.
[30] WALLACE B C, NOEL-STORR A, MARSHALL I J, et al. Identifying reports of randomized controlled trials (RCTs) via a hybrid machine learning and crowdsourcing approach[J]. Journal of the American medical informatics association, 2017, 24(6): 1165-1168.