Further Analysis of Advanced Quantitative Methods and Supplemental Interpretative Aids with Single-Case Experimental Designs.
Pub Date: 2021-10-22; eCollection Date: 2022-03-01; DOI: 10.1007/s40614-021-00313-y
John Michael Falligant, Michael P Kranak, Louis P Hagopian
Reliable and accurate visual analysis of graphically depicted behavioral data acquired using single-case experimental designs (SCEDs) is integral to behavior-analytic research and practice. Researchers have developed a range of techniques to make visual inspection of SCED data more reliable and objective, including visual interpretive guides, statistical techniques, and nonstatistical quantitative methods that objectify the interpretation of data, guide clinicians, and ensure a replicable data-interpretation process in research. These structured data-analytic practices are now used more frequently by behavior analysts and are the subject of considerable research within quantitative methods and behavior analysis. Some contemporary analytic methods have preliminary support with simulated datasets but have not been thoroughly examined with nonsimulated clinical datasets; other relatively new techniques (e.g., fail-safe k) likewise have preliminary support but require additional research. Still other methods (e.g., the dual-criteria and conservative dual-criteria methods) have more extensive support but have infrequently been compared against other analytic methods. Across three studies, we examine how these methods correspond to clinical outcomes (and to one another) in order to replicate and extend the extant literature in this area. Implications and recommendations for practitioners and researchers are discussed.
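The dual-criteria (DC) and conservative dual-criteria (CDC) methods mentioned above lend themselves to a short illustration. The Python sketch below is not the authors' implementation; it follows the commonly described DC/CDC logic, assuming an ordinary-least-squares baseline trend line, a 0.25-baseline-SD shift for CDC, and a one-tailed binomial criterion at alpha = .05. The session data at the bottom are hypothetical and only for illustration.

```python
import numpy as np
from scipy.stats import binom

def dual_criteria(baseline, treatment, direction="decrease",
                  conservative=False, alpha=0.05):
    """Dual-criteria (DC) / conservative dual-criteria (CDC) check.

    Projects the baseline mean line and an OLS baseline trend line into
    the treatment phase; for CDC, both lines are shifted by 0.25 baseline
    standard deviations in the therapeutic direction. The count of
    treatment points beyond both lines is compared to a one-tailed
    binomial criterion (p = .5). Returns (points_beyond, points_required,
    systematic_change_detected).
    """
    baseline = np.asarray(baseline, dtype=float)
    treatment = np.asarray(treatment, dtype=float)
    nb, nt = len(baseline), len(treatment)

    # Fit the baseline trend line and project it (and the mean line)
    # across the treatment-phase session numbers.
    slope, intercept = np.polyfit(np.arange(nb), baseline, 1)
    t_treat = np.arange(nb, nb + nt)
    mean_line = np.full(nt, baseline.mean())
    trend_line = slope * t_treat + intercept

    # CDC shifts both criterion lines 0.25 baseline SDs toward the
    # therapeutic direction (down for expected decreases, up for increases).
    shift = 0.25 * baseline.std(ddof=1) if conservative else 0.0
    if direction == "decrease":
        mean_line, trend_line = mean_line - shift, trend_line - shift
        beyond = (treatment < mean_line) & (treatment < trend_line)
    else:
        mean_line, trend_line = mean_line + shift, trend_line + shift
        beyond = (treatment > mean_line) & (treatment > trend_line)

    # Smallest count that a one-tailed binomial test (p = .5) would flag
    # as unlikely by chance at the chosen alpha.
    required = int(binom.ppf(1 - alpha, nt, 0.5)) + 1
    return int(beyond.sum()), required, int(beyond.sum()) >= required

# Hypothetical data: responses per session during baseline and treatment.
baseline = [8, 9, 7, 10, 9, 8]
treatment = [6, 5, 4, 3, 4, 2, 3, 2]
print(dual_criteria(baseline, treatment, direction="decrease", conservative=True))
```

With these hypothetical values, all eight treatment points fall below both shifted criterion lines, which exceeds the seven required for an eight-point phase, so the CDC check would flag a systematic decrease.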
{"title":"Further Analysis of Advanced Quantitative Methods and Supplemental Interpretative Aids with Single-Case Experimental Designs.","authors":"John Michael Falligant, Michael P Kranak, Louis P Hagopian","doi":"10.1007/s40614-021-00313-y","DOIUrl":"https://doi.org/10.1007/s40614-021-00313-y","url":null,"abstract":"<p><p>Reliable and accurate visual analysis of graphically depicted behavioral data acquired using single-case experimental designs (SCEDs) is integral to behavior-analytic research and practice. Researchers have developed a range of techniques to increase reliable and objective visual inspection of SCED data including visual interpretive guides, statistical techniques, and nonstatistical quantitative methods to objectify the visual-analytic interpretation of data to guide clinicians, and ensure a replicable data interpretation process in research. These structured data analytic practices are now more frequently used by behavior analysts and the subject of considerable research within the field of quantitative methods and behavior analysis. First, there are contemporaneous analytic methods that have preliminary support with simulated datasets, but have not been thoroughly examined with nonsimulated clinical datasets. There are a number of relatively new techniques that have preliminary support (e.g., fail-safe <i>k</i>), but require additional research. Other analytic methods (e.g., dual-criteria and conservative dual criteria) have more extensive support, but have infrequently been compared against other analytic methods. Across three studies, we examine how these methods corresponded to clinical outcomes (and one another) for the purpose of replicating and extending extant literature in this area. Implications and recommendations for practitioners and researchers are discussed.</p>","PeriodicalId":44993,"journal":{"name":"Perspectives on Behavior Science","volume":null,"pages":null},"PeriodicalIF":2.5,"publicationDate":"2021-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8894533/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142044179","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Some Characteristics and Arguments in Favor of a Science of Machine Behavior Analysis
Pub Date: 2021-10-21; DOI: 10.1007/s40614-022-00332-3
M. Lanovaz
{"title":"Some Characteristics and Arguments in Favor of a Science of Machine Behavior Analysis","authors":"M. Lanovaz","doi":"10.1007/s40614-022-00332-3","DOIUrl":"https://doi.org/10.1007/s40614-022-00332-3","url":null,"abstract":"","PeriodicalId":44993,"journal":{"name":"Perspectives on Behavior Science","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2021-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42474353","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Correction to: How to Be RAD: Repeated Acquisition Design Features that Enhance Internal and External Validity.
Pub Date: 2021-10-19; eCollection Date: 2021-12-01; DOI: 10.1007/s40614-021-00320-z
Megan S Kirby, Trina D Spencer, John Ferron
[This corrects the article DOI: 10.1007/s40614-021-00301-2.].
{"title":"Correction to: How to Be RAD: Repeated Acquisition Design Features that Enhance Internal and External Validity.","authors":"Megan S Kirby, Trina D Spencer, John Ferron","doi":"10.1007/s40614-021-00320-z","DOIUrl":"https://doi.org/10.1007/s40614-021-00320-z","url":null,"abstract":"<p><p>[This corrects the article DOI: 10.1007/s40614-021-00301-2.].</p>","PeriodicalId":44993,"journal":{"name":"Perspectives on Behavior Science","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2021-10-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8738835/pdf/40614_2021_Article_320.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39750447","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Correction to: Cochran's Q Test of Stimulus Overselectivity within the Verbal Repertoire of Children with Autism.
Pub Date: 2021-10-18; eCollection Date: 2022-03-01; DOI: 10.1007/s40614-021-00319-6
Lee Mason, Maria Otero, Alonzo Andrews
[This corrects the article DOI: 10.1007/s40614-021-00315-w.].
{"title":"Correction to: Cochran's Q Test of Stimulus Overselectivity within the Verbal Repertoire of Children with Autism.","authors":"Lee Mason, Maria Otero, Alonzo Andrews","doi":"10.1007/s40614-021-00319-6","DOIUrl":"https://doi.org/10.1007/s40614-021-00319-6","url":null,"abstract":"<p><p>[This corrects the article DOI: 10.1007/s40614-021-00315-w.].</p>","PeriodicalId":44993,"journal":{"name":"Perspectives on Behavior Science","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2021-10-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8894519/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139940900","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Cochran's Q Test of Stimulus Overselectivity within the Verbal Repertoire of Children with Autism.
Pub Date: 2021-09-15; eCollection Date: 2022-03-01; DOI: 10.1007/s40614-021-00315-w
Lee Mason, Maria Otero, Alonzo Andrews
Stimulus overselectivity remains an ill-defined concept within behavior analysis, because it can be difficult to distinguish truly restrictive stimulus control from random variation. Quantitative models of bias are useful, though perhaps limited in application. Over the last 50 years, research on stimulus overselectivity has developed a pattern of assessment and intervention repeatedly marred by methodological flaws. Here we argue that a molecular view of overselectivity, under which restricted stimulus control has heretofore been examined, is fundamentally insufficient for analyzing this phenomenon. Instead, we propose the use of the term "overselectivity" to define temporally extended patterns of restrictive stimulus control that have resulted in disproportionate populations of responding that cannot be attributed to chance alone, and highlight examples of overselectivity within the verbal behavior of children with autism spectrum disorder. Viewed as such, stimulus overselectivity lends itself to direct observation and measurement through the statistical analysis of single-subject data. In particular, we demonstrate the use of the Cochran Q test as a means of precisely quantifying stimulus overselectivity. We provide a tutorial on calculation, a model for interpretation, and a discussion of the implications for the use of Cochran's Q by clinicians and researchers.
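Cochran's Q itself is straightforward to compute from a stimuli-by-conditions matrix of dichotomous scores. The sketch below is not the authors' tutorial code; it implements the standard formula with numpy and scipy, applied to hypothetical data in which each of ten stimuli is scored correct (1) or incorrect (0) under four conditions (e.g., four verbal operants). Under the null hypothesis of equal correct-response probability across conditions, Q is approximately chi-square distributed with k - 1 degrees of freedom.

```python
import numpy as np
from scipy.stats import chi2

def cochrans_q(scores):
    """Cochran's Q for k related dichotomous samples.

    scores: (n_blocks, k_conditions) array of 0/1 outcomes, e.g., stimuli
    scored correct/incorrect under k conditions. Returns (Q, df, p).
    """
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    col_totals = x.sum(axis=0)   # correct responses per condition
    row_totals = x.sum(axis=1)   # correct responses per stimulus/block
    grand_total = x.sum()

    denominator = k * grand_total - np.sum(row_totals ** 2)
    if denominator == 0:
        # Every block is uniformly 0 or uniformly 1; Q is undefined.
        raise ValueError("no within-block variability")
    q = (k - 1) * (k * np.sum(col_totals ** 2) - grand_total ** 2) / denominator
    df = k - 1
    return q, df, chi2.sf(q, df)

# Hypothetical data: 10 stimuli x 4 conditions, 1 = correct, 0 = incorrect.
rng = np.random.default_rng(0)
data = np.column_stack(
    [rng.binomial(1, p, size=10) for p in (0.9, 0.8, 0.3, 0.2)]
)
q, df, p = cochrans_q(data)
print(f"Q = {q:.2f}, df = {df}, p = {p:.4f}")
```

A significant Q in this framing indicates that correct responding is distributed disproportionately across the conditions, the temporally extended pattern the authors propose to label overselectivity; pairwise follow-ups (e.g., McNemar's test) would be needed to localize which conditions differ.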
{"title":"Cochran's Q Test of Stimulus Overselectivity within the Verbal Repertoire of Children with Autism.","authors":"Lee Mason, Maria Otero, Alonzo Andrews","doi":"10.1007/s40614-021-00315-w","DOIUrl":"10.1007/s40614-021-00315-w","url":null,"abstract":"<p><p>Stimulus overselectivity remains an ill-defined concept within behavior analysis, because it can be difficult to distinguish truly restrictive stimulus control from random variation. Quantitative models of bias are useful, though perhaps limited in application. Over the last 50 years, research on stimulus overselectivity has developed a pattern of assessment and intervention repeatedly marred by methodological flaws. Here we argue that a molecular view of overselectivity, under which restricted stimulus control has heretofore been examined, is fundamentally insufficient for analyzing this phenomenon. Instead, we propose the use of the term \"overselectivity\" to define temporally extended patterns of restrictive stimulus control that have resulted in disproportionate populations of responding that cannot be attributed to chance alone, and highlight examples of overselectivity within the verbal behavior of children with autism spectrum disorder. Viewed as such, stimulus overselectivity lends itself to direct observation and measurement through the statistical analysis of single-subject data. In particular, we demonstrate the use of the Cochran <i>Q</i> test as a means of precisely quantifying stimulus overselectivity. We provide a tutorial on calculation, a model for interpretation, and a discussion of the implications for the use of Cochran's <i>Q</i> by clinicians and researchers.</p>","PeriodicalId":44993,"journal":{"name":"Perspectives on Behavior Science","volume":null,"pages":null},"PeriodicalIF":2.5,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8894513/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"52671301","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Real Problem with Hypothetical Constructs.
Pub Date: 2021-09-13; eCollection Date: 2021-12-01; DOI: 10.1007/s40614-021-00311-0
José E Burgos
A recent discussion in this journal revolved around the issue of whether postulating internal clocks is harmful or beneficial to scientific psychology, and how. I argue that this and other discussions on the topic have yet to address the real problem: the concept of a hypothetical construct is unintelligible. Psychologists agree that all entities that constitute hypothetical constructs are unobservable and thus importantly different from observable entities, including overt behavior and its environment. The root issue here, then, is the observable-unobservable distinction. Psychologists have implicitly but erroneously taken it for granted as sufficiently unproblematic to warrant meaningful discussions based on it, when in fact it is a pernicious, untenable remnant of logical positivism. All previous discussions of hypothetical constructs in psychology have overlooked arguments against this view in the philosophy of science. These arguments are sufficiently compelling to at least question, if not cease altogether, talk of observability, unobservability, and hypothetical constructs in psychology as useless, even harmful.
{"title":"The Real Problem with Hypothetical Constructs.","authors":"José E Burgos","doi":"10.1007/s40614-021-00311-0","DOIUrl":"https://doi.org/10.1007/s40614-021-00311-0","url":null,"abstract":"<p><p>A recent discussion in this journal revolved around the issue of whether postulating internal clocks is harmful or beneficial to scientific psychology, and how. I argue that this and other discussions on the topic have yet to address the real problem: The concept of a hypothetical construct is unintelligible. Psychologists agree that all entities that constitute hypothetical constructs are unobservable, importantly different from observable entities, including overt behavior and its environment. The root issue at hand here, then, is the observable-unobservable distinction. Psychologists have implicitly but erroneously taken it for granted as sufficiently unproblematic to warrant meaningful discussions based on it, when in fact it is a pernicious untenable remnant of logical positivism. All previous discussions of hypothetical constructs in psychology have overlooked arguments against this view in the philosophy of science. These arguments are sufficiently compelling to at least question, if not cease altogether, talk of observability, unobservability, and HCs in psychology as useless, even harmful.</p>","PeriodicalId":44993,"journal":{"name":"Perspectives on Behavior Science","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2021-09-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8738803/pdf/40614_2021_Article_311.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39750446","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Evidence is in the Design.
Pub Date: 2021-09-03; eCollection Date: 2021-09-01; DOI: 10.1007/s40614-021-00309-8
Janet S Twyman
To better understand the effectiveness of Direct Instruction (DI), the empirical base related to DI's instructional design components (explicit teaching, judicious selection and sequencing of examples) and principles (identifying big ideas, teaching generalizable strategies, providing mediated instruction, integrating skills and concepts, priming background knowledge, and providing ample review) is analyzed. Attention is given to the converging evidence supporting the design characteristics of DI, which has broad applicability across different disciplines, teaching methodologies, and perspectives.
{"title":"The Evidence is in the Design.","authors":"Janet S Twyman","doi":"10.1007/s40614-021-00309-8","DOIUrl":"https://doi.org/10.1007/s40614-021-00309-8","url":null,"abstract":"<p><p>To better understand the effectiveness of Direct Instruction (DI), the empirical base related to DI's instructional design components (explicit teaching, judicious selection and sequencing of examples) and principles (identifying big ideas, teaching generalizable strategies, providing mediated instruction, integrating skills and concepts, priming background knowledge, and providing ample review) are analyzed. Attention is given to the converging evidence supporting the design characteristics of DI, which has broad applicability across different disciplines, teaching methodologies, and perspectives.</p>","PeriodicalId":44993,"journal":{"name":"Perspectives on Behavior Science","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2021-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8476665/pdf/40614_2021_Article_309.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39504028","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Whatever the Kid Does Is the Truth: Introduction to the Special Section on Direct Instruction.
Pub Date: 2021-08-30; eCollection Date: 2021-09-01; DOI: 10.1007/s40614-021-00314-x
William L Heward, Janet S Twyman
{"title":"Whatever the Kid Does Is the Truth: Introduction to the Special Section on Direct Instruction.","authors":"William L Heward, Janet S Twyman","doi":"10.1007/s40614-021-00314-x","DOIUrl":"10.1007/s40614-021-00314-x","url":null,"abstract":"","PeriodicalId":44993,"journal":{"name":"Perspectives on Behavior Science","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2021-08-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8476699/pdf/40614_2021_Article_314.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39505535","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Correction to: Dissemination of Direct Instruction: Ponder These while Pursuing That.
Pub Date: 2021-08-19; eCollection Date: 2021-09-01; DOI: 10.1007/s40614-021-00312-z
Patrick C Friman
[This corrects the article DOI: 10.1007/s40614-021-00285-z.].
{"title":"Correction to: Dissemination of Direct Instruction: Ponder These while Pursuing That.","authors":"Patrick C Friman","doi":"10.1007/s40614-021-00312-z","DOIUrl":"https://doi.org/10.1007/s40614-021-00312-z","url":null,"abstract":"<p><p>[This corrects the article DOI: 10.1007/s40614-021-00285-z.].</p>","PeriodicalId":44993,"journal":{"name":"Perspectives on Behavior Science","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2021-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1007/s40614-021-00312-z","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39504430","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Faultless Communication: The Heart and Soul of DI.
Pub Date: 2021-08-18; eCollection Date: 2021-09-01; DOI: 10.1007/s40614-021-00310-1
Janet S Twyman
We are in the midst of a global learning crisis. The National Center for Education Statistics (2019) reports that 65% of fourth- and 66% of eighth-grade students in the United States did not meet the proficient standard in reading. A 2017 UNESCO report indicates that 6 out of 10 children worldwide do not achieve minimum proficiency in reading and mathematics. For far too many learners, instruction is riddled with confusion and ambiguity. Engelmann and Carnine's (1991) approach to improving learning is to design instruction that admits one (and only one) logical interpretation by the learner. Called "faultless communication," this method can be used to teach a wide variety of concepts or skills and underpins all Direct Instruction programs. By reducing errors and misinterpretation, it maximizes learning for all students. To ensure effectiveness, the learner's performance is observed and, if necessary, the communication is continually redesigned until it is faultless (i.e., the learner learns). This "Theory of Instruction" is harmonious with behavior analysis and beneficial to anyone concerned with improving student learning: the heart and soul of good instruction.
{"title":"Faultless Communication: The Heart and Soul of DI.","authors":"Janet S Twyman","doi":"10.1007/s40614-021-00310-1","DOIUrl":"https://doi.org/10.1007/s40614-021-00310-1","url":null,"abstract":"<p><p>We are in the midst of a global learning crisis. The National Center for Education Statistics (2019) reports that 65% of fourth- and 66% of eighth-grade students in the United States did <i>not</i> meet proficient standards for reading. A 2017 report from UNESCO reports that 6 out of 10 children worldwide do not achieve minimum proficiency in reading and mathematics. For far too many learners, instruction is riddled with confusion and ambiguity. Engelmann and Carnine's (1991) approach to improving learning is to design instruction that communicates one (and only one) logical interpretation by the learner. Called \"faultless communication\" this method can be used to teach learners a wide variety of concepts or skills and underpins all Direct Instruction programs. By reducing errors and misinterpretation, it maximizes learning for all students. To ensure effectiveness, the learner's performance is observed, and if necessary, the communication is continually redesigned until faultless (i.e., the learner learns). This \"Theory of Instruction\" is harmonious with behavior analysis and beneficial to anyone concerned with improving student learning-the heart and soul of good instruction.</p>","PeriodicalId":44993,"journal":{"name":"Perspectives on Behavior Science","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2021-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1007/s40614-021-00310-1","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39504027","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}