{"title":"Agency Conflict in Internal Corporate Innovation Contests","authors":"S. Ransbotham, G. Westerman","doi":"10.2139/ssrn.2887679","DOIUrl":null,"url":null,"abstract":"Crowdsourcing innovation aims to solicit a large volume of diverse ideas but inherently increases demands on resources to assess those contributions. As a result, organizations may now crowdsource the assessment of the ideas as well. However, crowd assessment of crowd generated ideas may diverge from organizational objectives. We investigate crowd versus expert assessment in the context of a recurring innovation contest at a global technology company. Textual analysis of 14,697 submitted ideas reveals agency conflict between the two assessments. Experts focus on stated corporate objectives, while the preferences of the employee crowd negatively relate to corporate direction. Topic popularity and social concerns influence crowds of employees. While experts exhibit less agency conflict than employees relative to stated corporate objectives, they are far less numerous and potentially more expensive than employee resources. We identify hybrid mechanisms that balance use of constrained expert resources with the potential assessment biases of the crowd.","PeriodicalId":11062,"journal":{"name":"Development of Innovation eJournal","volume":"86 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2016-12-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Development of Innovation eJournal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2139/ssrn.2887679","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
Crowdsourcing innovation aims to solicit a large volume of diverse ideas but inherently increases the demands on resources to assess those contributions. As a result, organizations may now crowdsource the assessment of ideas as well. However, crowd assessment of crowd-generated ideas may diverge from organizational objectives. We investigate crowd versus expert assessment in the context of a recurring innovation contest at a global technology company. Textual analysis of 14,697 submitted ideas reveals agency conflict between the two assessments. Experts focus on stated corporate objectives, while the preferences of the employee crowd relate negatively to corporate direction. Topic popularity and social concerns influence the employee crowd's assessments. While experts exhibit less agency conflict than employees relative to stated corporate objectives, they are far less numerous and potentially more expensive than employee resources. We identify hybrid mechanisms that balance the use of constrained expert resources against the potential assessment biases of the crowd.
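To make the idea of a hybrid mechanism concrete, the sketch below shows one possible arrangement: the cheap, numerous crowd votes pre-screen submissions, and scarce expert time is spent only on the resulting shortlist. This is a hypothetical illustration, not the mechanism the paper evaluates; the `Idea` class, the `expert_review` callable, and the capacity numbers are all assumptions introduced here for clarity.

```python
# Hypothetical sketch of a hybrid assessment mechanism:
# crowd pre-screening followed by expert review of a shortlist.
# All names and numbers are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Idea:
    idea_id: int
    text: str
    crowd_votes: int           # votes from the employee crowd (cheap, plentiful)
    expert_score: float = 0.0  # filled in only for shortlisted ideas


def hybrid_assess(ideas: List[Idea],
                  expert_review: Callable[[Idea], float],
                  expert_capacity: int) -> List[Idea]:
    """Shortlist by crowd votes, then spend constrained expert time
    only on the top candidates."""
    shortlist = sorted(ideas, key=lambda i: i.crowd_votes, reverse=True)
    shortlist = shortlist[:expert_capacity]
    for idea in shortlist:
        idea.expert_score = expert_review(idea)  # the expensive step
    # Final ranking by expert judgment, which the abstract suggests is
    # better aligned with stated corporate objectives.
    return sorted(shortlist, key=lambda i: i.expert_score, reverse=True)


if __name__ == "__main__":
    # Toy data and a stand-in expert function, purely for demonstration.
    toy_ideas = [Idea(i, f"idea {i}", crowd_votes=i % 7) for i in range(20)]
    ranked = hybrid_assess(toy_ideas,
                           expert_review=lambda i: len(i.text) / 10,
                           expert_capacity=5)
    print([idea.idea_id for idea in ranked])
```

The design choice this sketch highlights is the trade-off named in the abstract: crowd screening economizes on expert capacity but imports the crowd's biases into which ideas experts ever see.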