TY - GEN
T1 - Does exposure to shared solutions lead to better outcomes? An empirical investigation in online crowdsourcing contests
AU - Hou, Jingbo
AU - Chen, Pei Yu
AU - Gu, Bin
N1 - Publisher Copyright:
© 2021 IEEE Computer Society. All rights reserved.
PY - 2021
Y1 - 2021
N2 - Crowdsourcing contests provide an effective way to elicit novel ideas and creative solutions from collective intelligence. A key design feature of crowdsourcing contests is competition between participants to complete a specific task, with financial awards to the winner(s). In recent years, some crowdsourcing contest platforms have given participants the option to share solutions during the competition. This study evaluates the influence of exposure to shared solutions on different stakeholders, including teams and requesters. Our study employs multi-level panel data from a large online crowdsourcing platform, Kaggle.com, to examine these effects. For teams, exposure to shared solutions helps new entrant teams jump-start and helps teams achieve better performance in subsequent submissions, and a team's skill level negatively moderates these positive effects. For requesters, allowing solution sharing has both benefits and costs in terms of improving the best performance of the crowd. We highlight the theoretical implications of the study and provide practical suggestions to help crowdsourcing contest platforms decide whether to allow solution sharing during the competition.
AB - Crowdsourcing contests provide an effective way to elicit novel ideas and creative solutions from collective intelligence. A key design feature of crowdsourcing contests is competition between participants to complete a specific task, with financial awards to the winner(s). In recent years, some crowdsourcing contest platforms have given participants the option to share solutions during the competition. This study evaluates the influence of exposure to shared solutions on different stakeholders, including teams and requesters. Our study employs multi-level panel data from a large online crowdsourcing platform, Kaggle.com, to examine these effects. For teams, exposure to shared solutions helps new entrant teams jump-start and helps teams achieve better performance in subsequent submissions, and a team's skill level negatively moderates these positive effects. For requesters, allowing solution sharing has both benefits and costs in terms of improving the best performance of the crowd. We highlight the theoretical implications of the study and provide practical suggestions to help crowdsourcing contest platforms decide whether to allow solution sharing during the competition.
UR - http://www.scopus.com/inward/record.url?scp=85108333678&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85108333678&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85108333678
T3 - Proceedings of the Annual Hawaii International Conference on System Sciences
SP - 6553
EP - 6562
BT - Proceedings of the 54th Annual Hawaii International Conference on System Sciences, HICSS 2021
A2 - Bui, Tung X.
PB - IEEE Computer Society
T2 - 54th Annual Hawaii International Conference on System Sciences, HICSS 2021
Y2 - 4 January 2021 through 8 January 2021
ER -