T. van Gelder, Ariel Kruger, Sujai Thomman, Richard de Rozario, Elizabeth Silver, Morgan Saletta, Ashley Barnett, R. Sinnott, G. Jayaputera, M. Burgman
Journal of Cognitive Engineering and Decision Making, vol. 14, no. 1, pp. 195–217. Published 2020-08-17. DOI: 10.1177/1555343420926287
Improving Analytic Reasoning via Crowdsourcing and Structured Analytic Techniques
How might analytic reasoning in intelligence reports be substantially improved? One conjecture is that this can be achieved through a combination of crowdsourcing and structured analytic techniques (SATs). To explore this conjecture, we developed a new crowdsourcing platform that supports groups in collaborative reasoning and intelligence report drafting, using a novel SAT we call “Contending Analyses.” In this paper we present findings from a large study designed to assess whether groups of professional analysts working on the platform produce better-reasoned reports than those analysts produce when using the methods and tools normally used in their organizations. Secondary questions were whether professional analysts working on the platform produce better reasoning than members of the general public working on the platform, and how usable the platform is. Our main finding is a large effect size (Cohen’s d = 1.37) in favor of working on the platform, which provides early support for the general conjecture. We discuss limitations of our study, implications for intelligence organizations, and future directions for the work as a whole.
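The headline result is reported as Cohen’s d, the standardized mean difference between two conditions. As a minimal sketch of how such an effect size is computed, the snippet below uses the standard pooled-standard-deviation formula; the scores shown are entirely hypothetical illustration data, not values from the study.

```python
import statistics

def cohens_d(group_a, group_b):
    """Cohen's d: difference in means divided by the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
    # statistics.variance computes the sample (n-1) variance
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (mean_a - mean_b) / pooled_sd

# Hypothetical report-quality scores for two conditions (illustration only):
on_platform = [7.1, 6.8, 7.5, 6.9, 7.3]
off_platform = [5.2, 5.6, 5.0, 5.8, 5.4]
print(round(cohens_d(on_platform, off_platform), 2))
```

By convention, d ≈ 0.2 is considered a small effect, 0.5 medium, and 0.8 large, so the reported d = 1.37 is a very large effect by that rule of thumb.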