
Journal of library and information science in agriculture, 2025, 37(3): 92-105. doi: 10.13998/j.cnki.issn1002-1248.25-0067


Influencing Factors of User Participation Intention of Crowdsourcing in Evidence Synthesis

LI Xiao1, QU Jiansheng2

  1. Shanxi Police College, Taiyuan 030000, China
  2. Chengdu Library and Information Center, Chinese Academy of Sciences, Chengdu 610041, China
  • Received: 2025-02-14; Online: 2025-03-05; Published: 2025-06-10
  • Contact: QU Jiansheng, E-mail: jsqu@lzb.ac.cn

Abstract:

[Purpose/Significance] This paper aims to help initiators of crowdsourced evidence synthesis understand user participation behavior, propose and implement relevant behavioral intervention strategies, and guide and promote user participation. [Method/Process] A model of the factors influencing user participation intention was constructed based on the theory of planned behavior (TPB), the technology acceptance model (TAM) and motivation theory, and corresponding hypotheses were proposed. Data were collected through a questionnaire survey, and the hypotheses were tested with a structural equation model. [Results/Conclusions] Crowdsourced evidence synthesis is an academic, scientific and non-commercial task. Attitude, self-efficacy and trust significantly affect users' willingness to participate, whereas disciplinary climate has no significant effect. Monetary rewards, recognition and skill improvement have a significant positive effect on attitude, perceived effort has a significant negative effect on attitude, and enjoyment has no significant effect on attitude. Attitude, self-efficacy and trust influence participation intention directly, while monetary rewards, recognition, skill improvement and perceived effort influence it indirectly through attitude. Ranked by the size of their influence on participation intention, the individual motivational factors are, from greatest to least: skill improvement, recognition and monetary rewards. The results show that, compared with monetary rewards, non-material factors are the more important driving force for users to participate in crowdsourced evidence synthesis. Among the perceived behavioral control factors, self-efficacy is one of the main cognitive forces guiding users' willingness to participate, and the degree of trust directly determines whether users are willing to learn about and join a project. Therefore, when organizing crowdsourcing activities, recruiting participants and designing incentives, the crowdsourcing party should consider the following aspects. Potential target groups can be offered online training, related learning resources, interactive feedback mechanisms and an online community for timely communication. Participants can be given named credit, certificates, emails expressing appreciation and encouragement, or other forms of recognition. Organizers can provide resources and support that reduce users' perceived burden, for example by organizing training and building communities. Depending on the circumstances, appropriate monetary rewards can be offered to participants. Organizers should adopt strategies that improve users' self-efficacy, convey relevant information and provide assistance. To gain users' trust, organizers are encouraged to provide detailed information about themselves, including the team's affiliation, research experience and academic achievements, and to make appropriate commitments to protect participants' legitimate rights and interests.
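For readers who wish to reproduce this kind of analysis, the hypothesized model (Fig. 1, Table 7) can be specified as a covariance-based structural equation model. The sketch below is illustrative only: the semopy package, the file name responses.csv and maximum-likelihood estimation are assumptions for illustration, not the authors' reported toolchain.

```python
# Illustrative sketch only (not the authors' reported toolchain): specifying the
# hypothesized measurement and structural model with the semopy package.
# Assumes a hypothetical file "responses.csv" with one column per questionnaire item
# (MON1-MON3, REC1-REC3, SKL1-SKL3, ENJ1-ENJ3, PE1-PE4, ATT1-ATT3, SE1-SE3,
#  TRU1-TRU3, DC1-DC4, PI1-PI3) scored on Likert scales.
import pandas as pd
from semopy import Model, calc_stats

MODEL_DESC = """
MON =~ MON1 + MON2 + MON3
REC =~ REC1 + REC2 + REC3
SKL =~ SKL1 + SKL2 + SKL3
ENJ =~ ENJ1 + ENJ2 + ENJ3
PE =~ PE1 + PE2 + PE3 + PE4
ATT =~ ATT1 + ATT2 + ATT3
SE =~ SE1 + SE2 + SE3
TRU =~ TRU1 + TRU2 + TRU3
DC =~ DC1 + DC2 + DC3 + DC4
PI =~ PI1 + PI2 + PI3
ATT ~ MON + REC + SKL + ENJ + PE
PI ~ ATT + SE + TRU + DC
"""

data = pd.read_csv("responses.csv")   # hypothetical item-level survey data
model = Model(MODEL_DESC)             # measurement part (=~) and structural part (~), H1-H9
model.fit(data)                       # maximum-likelihood estimation by default
print(model.inspect())                # path estimates, standard errors, p-values
print(calc_stats(model).T)            # fit indices such as chi-square, df, CFI, TLI, RMSEA
```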

Key words: evidence synthesis, crowdsourcing, participation intention, influencing factors, information behavior

CLC Number: G350

Fig. 1  Theoretical model of factors influencing user participation intention of crowdsourcing in evidence synthesis

Table 1  Questionnaire measurement items and source references

Variable | Measurement item | Source references
Monetary reward (MON) | I hope to receive a monetary reward in return for participating in the activity | ZHENG et al. [36]; KE et al. [43]
 | I care about the monetary reward obtained from participating in evidence synthesis crowdsourcing |
 | Being able to earn money from this activity would strongly motivate me to participate |
Recognition (REC) | I hope that participating in evidence synthesis crowdsourcing will earn me supporting documentation from the organizer | ACAR [44]
 | I hope that participating in evidence synthesis crowdsourcing will earn me recognition from others |
 | I care about whether participating in evidence synthesis crowdsourcing tasks offers the chance to be credited by name in the resulting publication |
Skill improvement (SKL) | Participating in evidence synthesis crowdsourcing gives me the opportunity to learn about its representative method, the systematic review | OREG et al. [45]
 | Participating in evidence synthesis crowdsourcing provides me with a way to develop my skills |
 | Participating in evidence synthesis crowdsourcing will strengthen my relevant knowledge and make me more proficient in systematic review or meta-analysis methods |
Enjoyment (ENJ) | I find evidence synthesis crowdsourcing tasks interesting | NOV et al. [37]
 | Participating in a systematic review on a topic that interests me is enjoyable |
 | I am curious about evidence synthesis crowdsourcing tasks and willing to try them |
Perceived effort (PE) | I need to work hard to understand the task requirements of evidence synthesis crowdsourcing | GARBARINO et al. [46]; WANG [47]
 | I need to invest a great deal of time and energy to carry out evidence synthesis crowdsourcing tasks |
 | I need to put in a lot of effort to complete evidence synthesis crowdsourcing tasks |
 | Evidence synthesis crowdsourcing tasks are relatively complex |
Attitude (ATT) | Participating in evidence synthesis crowdsourcing tasks is worthwhile | AJZEN et al. [48]; TOHIDINIA et al. [49]
 | Participating in evidence synthesis crowdsourcing tasks is necessary |
 | Participating in evidence synthesis crowdsourcing tasks is a wise choice |
Self-efficacy (SE) | I am capable of completing evidence synthesis crowdsourcing tasks | KANKANHALLI et al. [50]
 | I meet the basic requirements for participating in evidence synthesis crowdsourcing activities |
 | I am confident that I can complete evidence synthesis crowdsourcing tasks |
Trust (TRU) | The organizers of evidence synthesis crowdsourcing are trustworthy | ZHENG et al. [36]
 | The organizers of evidence synthesis crowdsourcing keep their promises |
 | The organizers of evidence synthesis crowdsourcing safeguard participants' legitimate rights and interests |
Disciplinary climate (DC) | In my disciplinary environment, I have heard of systematic reviews or meta-analyses | BOCK et al. [51]; TOHIDINIA et al. [49]
 | My discipline publishes systematic review or meta-analysis studies |
 | My discipline uses systematic review or meta-analysis methods |
 | My discipline generally recognizes systematic review or meta-analysis research |
Participation intention (PI) | I am likely to participate in evidence synthesis crowdsourcing activities | AJZEN et al. [48]; TOHIDINIA et al. [49]
 | I intend to participate in evidence synthesis crowdsourcing activities |
 | I will try to participate in evidence synthesis crowdsourcing activities in the future |

Table 2  Demographics of respondents

Attribute | Category | Frequency | Proportion/%
Gender |  | 119 | 43.0
 |  | 158 | 57.0
Age | 18-24 years | 116 | 41.9
 | 25-34 years | 129 | 46.6
 | 35-44 years | 25 | 9.0
 | 45-54 years | 4 | 1.4
 | 55 years and above | 3 | 1.1
Current status | Undergraduate student | 18 | 6.5
 | Master's student | 144 | 52.0
 | Doctoral student | 14 | 5.1
 | Employed (bachelor's degree) | 7 | 2.5
 | Employed (graduate degree) | 92 | 33.2
 | Other | 2 | 0.7
Discipline | Medicine | 55 | 19.9
 | Natural sciences | 42 | 15.2
 | Engineering | 63 | 22.7
 | Humanities | 39 | 14.1
 | Social sciences | 71 | 25.6
 | Other | 7 | 2.5
Familiarity with systematic reviews or meta-analyses (representative evidence synthesis methods) | Not aware of them | 64 | 23.1
 | Have heard of them | 132 | 47.7
 | Familiar but have never conducted one | 76 | 27.4
 | Familiar and have conducted at least one | 5 | 1.8

Table 3  Rotated factor matrix

Item | Factor | Loading
PE3 | PE | 0.900
PE4 | PE | 0.868
PE2 | PE | 0.866
PE1 | PE | 0.826
DC4 | DC | 0.842
DC2 | DC | 0.800
DC1 | DC | 0.772
DC3 | DC | 0.770
ENJ1 | ENJ | 0.892
ENJ2 | ENJ | 0.875
ENJ3 | ENJ | 0.867
SE2 | SE | 0.883
SE3 | SE | 0.839
SE1 | SE | 0.809
SKL1 | SKL | 0.861
SKL3 | SKL | 0.812
SKL2 | SKL | 0.776
TRU2 | TRU | 0.887
TRU3 | TRU | 0.878
TRU1 | TRU | 0.783
REC2 | REC | 0.826
REC1 | REC | 0.801
REC3 | REC | 0.714
ATT3 | ATT | 0.790
ATT1 | ATT | 0.785
ATT2 | ATT | 0.745
MON2 | MON | 0.843
MON3 | MON | 0.797
MON1 | MON | 0.744
PI1 | PI | 0.781
PI3 | PI | 0.759
PI2 | PI | 0.734
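Table 3 groups each item with its loading on a single factor, as in a rotated exploratory factor matrix with small cross-loadings suppressed. A matrix of this kind could be produced with a sketch like the following; the factor_analyzer package, varimax rotation with principal-component extraction, the 0.5 display threshold and the responses.csv file are all illustrative assumptions, not details reported by the authors.

```python
# Illustrative sketch only: an exploratory factor analysis with varimax rotation that
# would yield a rotated loading matrix like Table 3. The factor_analyzer package,
# principal-component extraction, the 0.5 display threshold, and "responses.csv"
# are assumptions for illustration.
import pandas as pd
from factor_analyzer import FactorAnalyzer

items = pd.read_csv("responses.csv")                 # hypothetical 32 item columns
fa = FactorAnalyzer(n_factors=10, rotation="varimax", method="principal")
fa.fit(items)

loadings = pd.DataFrame(fa.loadings_, index=items.columns,
                        columns=[f"F{i}" for i in range(1, 11)])
# Hide small cross-loadings so each item shows only its dominant factor, as in Table 3.
print(loadings.where(loadings.abs() >= 0.5).round(3))
```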

Table 4  Model fit indices

Fit index | Recommended value | Value | Fit
Chi-square (CMIN) |  | 620.556 |
Degrees of freedom (DF) |  | 427.000 |
CMIN/DF | < 3 | 1.453 | Good
Goodness-of-fit index (GFI) | > 0.8 | 0.880 | Good
Tucker-Lewis index (TLI) | > 0.8 | 0.943 | Good
Comparative fit index (CFI) | > 0.8 | 0.951 | Good
Root mean square error of approximation (RMSEA) | < 0.08 | 0.041 | Good
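As a consistency check on Table 4 (taking the reported values at face value and the sample size N = 277 from Table 2), the normed chi-square and one common formulation of RMSEA can be recomputed as:

$$ \mathrm{CMIN/DF} = \frac{620.556}{427} \approx 1.453 < 3 $$

$$ \mathrm{RMSEA} = \sqrt{\frac{\max(\chi^{2} - df,\ 0)}{df\,(N-1)}} = \sqrt{\frac{620.556 - 427}{427 \times 276}} \approx 0.041 < 0.08 $$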

Table 5  Reliability analysis results

Variable | Item | Factor loading | Cronbach's α | CR | AVE
Monetary reward (MON) | MON1 | 0.706 | 0.772 | 0.775 | 0.535
 | MON2 | 0.784 | | |
 | MON3 | 0.701 | | |
Recognition (REC) | REC1 | 0.700 | 0.766 | 0.768 | 0.526
 | REC2 | 0.776 | | |
 | REC3 | 0.696 | | |
Skill improvement (SKL) | SKL1 | 0.851 | 0.799 | 0.806 | 0.584
 | SKL2 | 0.646 | | |
 | SKL3 | 0.782 | | |
Enjoyment (ENJ) | ENJ1 | 0.866 | 0.872 | 0.874 | 0.699
 | ENJ2 | 0.826 | | |
 | ENJ3 | 0.815 | | |
Perceived effort (PE) | PE1 | 0.815 | 0.914 | 0.915 | 0.729
 | PE2 | 0.872 | | |
 | PE3 | 0.866 | | |
 | PE4 | 0.861 | | |
Attitude (ATT) | ATT1 | 0.747 | 0.812 | 0.806 | 0.581
 | ATT2 | 0.765 | | |
 | ATT3 | 0.775 | | |
Self-efficacy (SE) | SE1 | 0.758 | 0.833 | 0.834 | 0.626
 | SE2 | 0.797 | | |
 | SE3 | 0.817 | | |
Trust (TRU) | TRU1 | 0.724 | 0.834 | 0.838 | 0.634
 | TRU2 | 0.834 | | |
 | TRU3 | 0.826 | | |
Disciplinary climate (DC) | DC1 | 0.639 | 0.821 | 0.823 | 0.539
 | DC2 | 0.739 | | |
 | DC3 | 0.724 | | |
 | DC4 | 0.824 | | |
Participation intention (PI) | PI1 | 0.730 | 0.772 | 0.776 | 0.540
 | PI2 | 0.621 | | |
 | PI3 | 0.837 | | |
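The CR and AVE values in Table 5 are consistent with the standard composite reliability and average variance extracted formulas applied to the standardized loadings. As a worked example for monetary reward (MON), with loadings 0.706, 0.784 and 0.701:

$$ \mathrm{AVE} = \frac{\sum_{i}\lambda_{i}^{2}}{n} = \frac{0.706^{2} + 0.784^{2} + 0.701^{2}}{3} \approx 0.535 $$

$$ \mathrm{CR} = \frac{\left(\sum_{i}\lambda_{i}\right)^{2}}{\left(\sum_{i}\lambda_{i}\right)^{2} + \sum_{i}\left(1-\lambda_{i}^{2}\right)} = \frac{2.191^{2}}{2.191^{2} + 1.395} \approx 0.775 $$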

Table 6  Validity analysis results

Construct | TRU | SE | DC | SKL | REC | MON | ENJ | PE | ATT | PI
Trust (TRU) | 0.796 | | | | | | | | |
Self-efficacy (SE) | 0.241 | 0.791 | | | | | | | |
Disciplinary climate (DC) | 0.141 | 0.080 | 0.734 | | | | | | |
Skill improvement (SKL) | 0.015 | -0.029 | -0.239 | 0.764 | | | | | |
Recognition (REC) | 0.142 | 0.254 | 0.204 | 0.046 | 0.725 | | | | |
Monetary reward (MON) | 0.179 | 0.291 | 0.200 | -0.235 | 0.389 | 0.731 | | | |
Enjoyment (ENJ) | -0.019 | -0.045 | -0.008 | -0.078 | 0.127 | 0.022 | 0.836 | | |
Perceived effort (PE) | -0.115 | -0.134 | -0.110 | -0.121 | -0.310 | -0.157 | -0.235 | 0.854 | |
Attitude (ATT) | 0.107 | 0.146 | 0.038 | 0.355 | 0.468 | 0.243 | 0.174 | -0.470 | 0.762 |
Participation intention (PI) | 0.273 | 0.317 | 0.107 | 0.183 | 0.341 | 0.229 | 0.084 | -0.310 | 0.602 | 0.735
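The diagonal entries of Table 6 equal the square root of each construct's AVE from Table 5, i.e., the Fornell-Larcker criterion [54]: discriminant validity holds when each diagonal value exceeds the correlations in its row and column. For monetary reward (MON), for example:

$$ \sqrt{\mathrm{AVE}_{\mathrm{MON}}} = \sqrt{0.535} \approx 0.731 > 0.389 = \max_{j \neq \mathrm{MON}} \lvert r_{\mathrm{MON},\,j} \rvert $$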

Fig. 2  Hypothesis testing results for the model

Table 7  Summary of hypothesis testing results

Hypothesis | Path | Coefficient | T | P | Result
H1 | Monetary reward → Attitude | 0.167 | 2.185 | 0.029 | Supported
H2 | Recognition → Attitude | 0.284 | 3.510 | 0.000 | Supported
H3 | Skill improvement → Attitude | 0.353 | 4.896 | 0.000 | Supported
H4 | Enjoyment → Attitude | 0.094 | 1.504 | 0.133 | Not supported
H5 | Perceived effort → Attitude | -0.291 | -4.304 | 0.000 | Supported
H6 | Attitude → Participation intention | 0.555 | 7.037 | 0.000 | Supported
H7 | Self-efficacy → Participation intention | 0.194 | 2.876 | 0.004 | Supported
H8 | Trust → Participation intention | 0.161 | 2.431 | 0.015 | Supported
H9 | Disciplinary climate → Participation intention | 0.047 | 0.748 | 0.455 | Not supported
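The ranking of the motivational factors reported in the abstract (skill improvement > recognition > monetary rewards) is consistent with the indirect effects on participation intention implied by Table 7, obtained by multiplying each factor's path to attitude by the attitude → participation intention path (0.555):

$$ \mathrm{SKL}: 0.353 \times 0.555 \approx 0.196, \quad \mathrm{REC}: 0.284 \times 0.555 \approx 0.158, \quad \mathrm{MON}: 0.167 \times 0.555 \approx 0.093, \quad \mathrm{PE}: -0.291 \times 0.555 \approx -0.162 $$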
1 李晓, 曲建升, 寇蕾蕾. 众包在证据合成中的实践应用研究: 以Cochrane Crowd公民科学项目中的众包应用为例[J]. 农业图书情报学报, 2023, 35(2): 95-104.
LI X, QU J S, KOU L L. Applications of crowdsourcing in evidence synthesis: A case study of cochrane crowd[J]. Journal of library and information science in agriculture, 2023, 35(2): 95-104.
2 BROWN A W, ALLISON D B. Using crowdsourcing to evaluate published scientific literature: Methods and example[J]. PLoS One, 2014, 9(7): e100647.
3 SUN Y, CHENG P, WANG S, et al. Crowdsourcing information extraction for biomedical systematic reviews[J/OL]. arXiv preprint arXiv:, 2016.
4 MORTENSEN M L, ADAM G P, TRIKALINOS T A, et al. An exploration of crowdsourcing citation screening for systematic reviews[J]. Research synthesis methods, 2017, 8(3): 366-386.
5 NAMA N, SAMPSON M, BARROWMAN N, et al. Crowdsourcing the citation screening process for systematic reviews: Validation study[J]. Journal of medical Internet research, 2019, 21(4): e12953.
6 NAMA N, ILIRIANI K, XIA M Y, et al. A pilot validation study of crowdsourcing systematic reviews: Update of a searchable database of pediatric clinical trials of high-dose vitamin D[J]. Translational pediatrics, 2017, 6(1): 18-26.
7 NOEL-STORR A H, REDMOND P, LAMÉ G, et al. Crowdsourcing citation-screening in a mixed-studies systematic review: A feasibility study[J]. BMC medical research methodology, 2021, 21(1): 88.
8 NOEL-STORR A, GARTLEHNER G, DOOLEY G, et al. Crowdsourcing the identification of studies for COVID-19-related cochrane rapid reviews[J]. Research synthesis methods, 2022, 13(5): 585-594.
9 THOMAS J, NOEL-STORR A, MARSHALL I, et al. Living systematic reviews: Combining human and machine effort[J]. Journal of clinical epidemiology, 2017, 91: 31-37.
10 STRANG L, SIMMONS R K. Citizen science: Crowdsourcing for systematic reviews[M]. Cambridge: THIS Institute, 2018.
11 FELIZARDO K R, DE SOUZA E F, LOPES R, et al. Crowdsourcing in systematic reviews: A systematic mapping and survey[C]//2020 46th Euromicro Conference on Software Engineering and Advanced Applications (SEAA). August 26-28, 2020. Portoroz, Slovenia. IEEE, 2020: 404-412.
12 MOREAU D, GAMBLE B. Conducting a meta-analysis in the age of open science: Tools, tips, and practical recommendations[J]. Psychological methods, 2022, 27(3): 426.
13 WEISS M, ABUALHAOL I, AMIN M. A leader-driven open collaboration platform for exploring new domains[C]//Proceedings of the 12th International Symposium on Open Collaboration. Berlin Germany. ACM, 2016: 1-4.
14 WEISS M. Crowdsourcing literature reviews in new domains[J]. Technology innovation management review, 2016, 6(2): 5-14.
15 NAMA N, BARROWMAN N, O’HEARN K, et al. Quality control for crowdsourcing citation screening: The importance of assessment number and qualification set size[J]. Journal of clinical epidemiology, 2020, 122: 160-162.
16 KRIVOSHEEV E, CASATI F, CAFORIO V, et al. Crowdsourcing paper screening in systematic literature reviews[J]. Proceedings of the AAAI conference on human computation and crowdsourcing, 2017, 5: 108-117.
17 KRIVOSHEEV E, CASATI F, BENATALLAH B. Crowd-based multi-predicate screening of papers in literature reviews[C]//Proceedings of the 2018 World Wide Web Conference on World Wide Web - WWW '18. Lyon, France: ACM, 2018: 55-64.
18 SAMPSON M, NAMA N, O'HEARN K, et al. Creating enriched training sets of eligible studies for large systematic reviews: The utility of PubMed's Best Match algorithm[J]. International journal of technology assessment in health care, 2020, 37: e7.
19 RAMIREZ J, KRIVOSHEEV E, BAEZ M, et al. Crowdrev: A platform for crowd-based screening of literature reviews[J/OL]. arXiv preprint arXiv:, 2018.
20 SANTOS V, IWAZAKI A, SOUZA É, et al. CrowdSLR: A tool to support the use of crowdsourcing in systematic literature reviews[C]//Brazilian Symposium on Software Engineering. Joinville Brazil. ACM, 2021: 341-346.
21 NOEL-STORR A, DOOLEY G, ELLIOTT J, et al. An evaluation of Cochrane Crowd found that crowdsourcing produced accurate results in identifying randomized trials[J]. Journal of clinical epidemiology, 2021, 133: 130-139.
22 PIANTA M J, MAKRAI E, VERSPOOR K M, et al. Crowdsourcing critical appraisal of research evidence (CrowdCARE) was found to be a valid approach to assessing clinical research quality[J]. Journal of clinical epidemiology, 2018, 104: 8-14.
23 ASHKANASE J, NAMA N, SANDARAGE R V, et al. Identification and evaluation of controlled trials in pediatric cardiology: Crowdsourced scoping review and creation of accessible searchable database[J]. Canadian journal of cardiology, 2020, 36(11): 1795-1804.
24 SHAH N, GUO Y J, WENDELSDORF K V, et al. A crowdsourcing approach for reusing and meta-analyzing gene expression data[J]. Nature biotechnology, 2016, 34(8): 803-806.
25 荷兰心理统计联盟. 共建管理学元分析数据库[EB/OL]. [2022-03-22]. .
Netherlands Society for Psychometrics and Statistics. Co-constructing a meta-analysis database for management research[EB/OL]. [2022-03-22]. .
26 BOSCO F A, UGGERSLEV K L, STEEL P. MetaBUS as a vehicle for facilitating meta-analysis[J]. Human resource management review, 2017, 27(1): 237-254.
27 LEBEL E P, MCCARTHY R J, EARP B D, et al. A unified framework to quantify the credibility of scientific findings[J]. Advances in methods and practices in psychological science, 2018, 1(3): 389-402.
28 SHACKELFORD G E, KEMP L, RHODES C, et al. Accumulating evidence using crowdsourcing and machine learning: A living bibliography about existential risk and global catastrophic risk[J]. Futures, 2020, 116: 102508.
29 HONG Q N, BOUIX-PICASSO J, RUCHON C. Creation of an online inventory for choosing critical appraisal tools[J]. Education for information, 2022, 38(2): 205-210.
30 AJZEN I. From intentions to actions: A theory of planned behavior[M]//Action Control: From Cognition to Behavior. New York: Springer, 1985: 11-39.
31 AJZEN I. The theory of planned behavior[J]. Organizational behavior & human decision processes, 1991, 50(2): 179–211.
32 AJZEN I. Perceived behavioral control, self-efficacy, locus of control, and the theory of planned behavior[J]. Journal of applied social psychology, 2002, 32(4): 665-683.
33 DAVIS F D. Perceived usefulness, perceived ease of use, and user acceptance of information technology[J]. MIS quarterly, 1989, 13(3): 319.
34 PETRI H L, GOVERN J M. Motivation: Theory, research, and application[M]. Boston: Cengage Learning, 2012.
35 MILLER K A, DECI E L, RYAN R M. Intrinsic motivation and self-determination in human behavior[J]. Contemporary sociology, 1988, 17(2): 253.
36 ZHENG H C, LI D H, HOU W H. Task design, motivation, and participation in crowdsourcing contests[J]. International journal of electronic commerce, 2011, 15(4): 57-88.
37 NOV O, NAAMAN M, YE C. Analysis of participation in an online photo-sharing community: A multidimensional perspective[J]. Journal of the American society for information science and technology, 2010, 61(3): 555-566.
38 GOH D H, PE-THAN E P P, LEE C S. Perceptions of virtual reward systems in crowdsourcing games[J]. Computers in human behavior, 2017, 70: 365-374.
39 LAKHANI K R, JEPPESEN L B, LOHSE P A, et al. The value of openness in scientific problem solving[M]//Division of Research. Boston, MA: Harvard Business School, 2007.
40 KATZ D. The functional approach to the study of attitudes[J]. Public opinion quarterly, 1960, 24(2): 163-204.
41 ARMITAGE C J, CONNER M. Efficacy of the theory of planned behaviour: A meta-analytic review[J]. British journal of social psychology, 2001, 40(4): 471-499.
42 Cochrane Collaboration. Cochrane crowd[EB/OL]. [2022-06-18]. .
43 KE W L, ZHANG P. Motivations in open source software communities: The mediating role of effort intensity and goal commitment[J]. International journal of electronic commerce, 2009, 13(4): 39-66.
44 ACAR O A. Motivations and solution appropriateness in crowdsourcing challenges for innovation[J]. Research policy, 2019, 48(8): 103716.
45 OREG S, NOV O. Exploring motivations for contributing to open source initiatives: The roles of contribution context and personal values[J]. Computers in human behavior, 2008, 24(5): 2055-2073.
46 GARBARINO E C, EDELL J A. Cognitive effort, affect, and choice[J]. Journal of consumer research, 1997, 24(2): 147-158.
47 WANG M M. Encouraging solvers to sustain participation intention on crowdsourcing platforms: An investigation of social beliefs[J]. Information technology and management, 2022, 23(1): 39-50.
48 AJZEN I, FISHBEIN M. The influence of attitudes on behavior[M]//The Handbook of Attitudes. London, England: Psychology Press, 2014: 187-236.
49 TOHIDINIA Z, MOSAKHANI M. Knowledge sharing behaviour and its predictors[J]. Industrial management & data systems, 2010, 110(4): 611-631.
50 KANKANHALLI A, TAN B C Y, WEI K K. Contributing knowledge to electronic knowledge repositories: An empirical investigation[J]. MIS quarterly, 2005, 29(1): 113.
51 BOCK G W, ZMUD R W, KIM Y G, et al. Behavioral intention formation in knowledge sharing: Examining the roles of extrinsic motivators, social-psychological forces, and organizational climate[J]. MIS quarterly, 2005, 29(1): 87.
52 HAIR JR J F, HULT G T M, RINGLE C M, et al. A primer on partial least squares structural equation modeling(PLS-SEM)[M]. Thousand Oaks: Sage Publications, 2021.
53 STRAUB D, GEFEN D. Validation guidelines for IS positivist research[J]. Communications of the association for information systems, 2004, 13(1): 24.
54 FORNELL C, LARCKER D F. Evaluating structural equation models with unobservable variables and measurement error[J]. Journal of marketing research, 1981, 18(1): 39-50.
55 CASRAI. 14 contributor roles[EB/OL]. [2022-08-22]. .
[1] GOU Ruike, LUO Wei. Influencing Factors of Continuous Use Intention of "Generation Z" Users of an AIGC Platform [J]. Journal of library and information science in agriculture, 2025, 37(3): 66-80.
[2] SHI Qin, XIE Jing, WU Shang. Influencing Factors and Correlations of User Satisfaction with Mobile Health Applications [J]. Journal of library and information science in agriculture, 2025, 37(1): 33-46.
[3] YOU Ge, LI Jielin, ZHANG Fangshun. Generating Mechanism of Online Public Opinion Heat in Public Emergencies from the Perspective of Information Ecology: Fuzzy Set Qualitative Comparative Analysis Based on 50 Cases [J]. Journal of library and information science in agriculture, 2025, 37(1): 86-99.
[4] Guowei GAO, Shanshan ZHANG, Jialan YU. A Review of Health Information Behaviors of Older People from the Perspective of Topic Differentiation [J]. Journal of library and information science in agriculture, 2024, 36(7): 34-49.
[5] Liqin YAO, Hai ZHANG. Model Construction and Empirical Research on the Influencing Factors of AIGC User Dropout Behavior [J]. Journal of library and information science in agriculture, 2024, 36(5): 79-92.
[6] Chunling GAO, Liyuan JIANG. Elderly People's Online Health Information Seeking Behavior Based on Evolutionary Dynamics [J]. Journal of library and information science in agriculture, 2024, 36(5): 65-78.
[7] LIU Yang, LYU Shuyue, LI Ruojun. Concept, Task, and Application of Social Robots in Information Behavior Research [J]. Journal of library and information science in agriculture, 2024, 36(3): 4-20.
[8] ZHOU Xin. Machine Functionalism and the Digital-Intelligence Divide: Evolutionary Pathways, Generative Logic and Regulatory Strategies [J]. Journal of library and information science in agriculture, 2024, 36(3): 59-71.
[9] SHI Yanqing, LI Lu, SHI Qin. Impact of User Heterogeneity on Knowledge Collaboration Effectiveness from a Network Structure Perspective [J]. Journal of library and information science in agriculture, 2024, 36(3): 72-82.
[10] WANG Yueying. Exploring the Causes of Low Health Information Literacy Among Rural Middle-Aged and Elderly Adults and its Improvement Strategies [J]. Journal of library and information science in agriculture, 2024, 36(2): 81-93.
[11] WANG Weizheng, QIAO Hong, LI Xiaojun, WANG Jingjing. User Willingness to Use Generative Artificial Intelligence Based on AIDUA Framework [J]. Journal of library and information science in agriculture, 2024, 36(2): 36-50.
[12] HAN Xi, LIAO Ke. Factors Influencing Misinformation Propagation: A Systemic Review [J]. Journal of library and information science in agriculture, 2024, 36(12): 45-63.
[13] Zheng WANG, Miao ZHUANG, Yudi ZHANG, Yaqi ZHANG. Factors Influencing Online Health Information Acquisition Behavior of Rural Elderly Groups in Western China: A Field Study from the Guanzhong Region of Shaanxi Province [J]. Journal of library and information science in agriculture, 2024, 36(10): 23-37.
[14] Yijia WAN, Liping GU. Behavioral Motivation and Influencing Factors of Graduate Students Using AIGC Tool: An Empirical Analysis Based on Questionnaire Survey [J]. Journal of library and information science in agriculture, 2024, 36(10): 4-22.
[15] SUN Lili, WANG WeiJie, SHENG Jiefei. Influencing Factors of Scientific Data Value Increment Based on System Dynamics [J]. Journal of library and information science in agriculture, 2023, 35(9): 28-42.