Scientific accompaniment: a new model for integrating program development, evidence and evaluation
Patricia Lannen, L. Jones
Journal of Children's Services, 16 August 2022. DOI: 10.1108/jcs-09-2021-0037
Purpose
Calls for the development and dissemination of evidence-based programs to support children and families have been increasing for decades, but progress has been slow. This paper argues that a singular focus on evaluation has limited the ways in which science and research are incorporated into program development, and advocates instead for a new concept, “scientific accompaniment,” to expand and guide program development and testing.
Design/methodology/approach
A heuristic is provided to guide research–practice teams in assessing the program’s developmental stage and level of evidence.
Findings
In an idealized pathway, scientific accompaniment begins early in program development, with ongoing input from both practitioners and researchers, resulting in programs that are both effective and scalable. The heuristic also provides guidance on how to “catch up” on evidence when program development and science utilization are out of sync.
Originality/value
While implementation models offer ideas for improving the use of evidence-based practices, social service programs suffer from a significant lack of research and evaluation. Evaluation resources are typically not used by social service program developers, and collaboration with researchers happens late in program development, if at all. Few resources or models exist that encourage and guide the use of science and evaluation throughout program development.