Title: A quasi-experimental evaluation of a skills capacity workshop in the South African public service
Authors: P. Jonck, R. D. Coning
Journal: African Evaluation Journal
DOI: 10.4102/aej.v8i1.421
Publication date: 2020-03-31
Publication type: Journal Article
Citations: 0
Abstract
A quasi-experimental evaluation of a skills capacity workshop in the South African public service
Background: Few evaluation studies have investigated the impact of training. This research gap should be viewed in light of austerity measures and the inability to measure the return on investment of training expenditure, which is substantial year on year, especially in the context of the public service. Objectives: This article reports on an impact evaluation of a research methodology skills capacity workshop. Method: A quasi-experimental evaluation design was used in which comparison groups were utilised to evaluate the impact of a research methodology skills development intervention. A paired-sample t-test measured the increase in knowledge, whilst the influence of the comparison groups was controlled for by means of an analysis of variance. A hierarchical multiple regression analysis was performed to determine how much of the variance in research methodology knowledge could be attributed to the intervention whilst controlling for facilitator effect. Results: The intervention had a statistically significant impact on research methodology knowledge. Furthermore, the intervention group differed statistically significantly from the control and comparison groups with respect to research methodology knowledge. Facilitator effect was found to be a moderating variable. A hierarchical regression analysis performed to isolate the impact of the intervention in the absence of facilitator effect yielded a statistically significant result. Conclusion: The study augments the corpus of knowledge by providing evidence of training impact within the South African public service, particularly by utilising a quasi-experimental pre-test–post-test research design and by isolating facilitator effect from the intervention itself.
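The analysis pipeline described in the abstract can be illustrated with a short sketch. This is not the authors' code or data: the group sizes, score distributions, and the `facilitator` covariate below are hypothetical, and the sketch only shows the sequence of tests (paired-sample t-test on pre/post scores, one-way ANOVA across the intervention, control, and comparison groups, then a two-step hierarchical regression whose change in R² isolates the intervention's contribution after the facilitator effect is entered).

```python
# Illustrative sketch of the abstract's analysis steps on synthetic data
# (not the study's actual scores, group sizes, or model specification).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical pre- and post-workshop knowledge scores, intervention group (n=30)
pre = rng.normal(50, 8, 30)
post = pre + rng.normal(6, 4, 30)  # assumed training gain

# Step 1: paired-sample t-test — did knowledge increase after the workshop?
t_paired, p_paired = stats.ttest_rel(post, pre)

# Step 2: one-way ANOVA — intervention vs. control vs. comparison post-scores
control = rng.normal(50, 8, 30)
comparison = rng.normal(51, 8, 30)
f_stat, p_anova = stats.f_oneway(post, control, comparison)

# Step 3: hierarchical regression — enter the facilitator covariate first,
# then the intervention dummy; the change in R-squared between the two
# nested models is the variance attributable to the intervention alone.
y = np.concatenate([post, control, comparison])
facilitator = rng.integers(0, 2, y.size).astype(float)  # hypothetical covariate
intervention = np.r_[np.ones(30), np.zeros(60)]

def r_squared(predictors, y):
    """R-squared of an OLS fit with an intercept column."""
    X = np.column_stack([np.ones(y.size), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_step1 = r_squared([facilitator], y)                # facilitator only
r2_step2 = r_squared([facilitator, intervention], y)  # + intervention
delta_r2 = r2_step2 - r2_step1  # intervention's unique share of variance
```

Comparing the R² of the two nested models is the standard way a hierarchical regression separates a covariate (here, facilitator) from the variable of interest; the actual study would report the significance of this R² change rather than the raw difference alone.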
About the journal:
The journal publishes high-quality, peer-reviewed articles of merit on any subject related to evaluation, and provides targeted information of professional interest to members of AfrEA and its national associations. Aims of the African Evaluation Journal (AEJ):
- AEJ aims to be a high-quality, peer-reviewed journal that builds evaluation-related knowledge and practice in support of effective developmental policies on the African continent.
- AEJ aims to provide a communication platform for scholars and practitioners of evaluation to share and debate ideas about evaluation theory and practice in Africa.
- AEJ aims to promote cross-fertilisation of ideas and methodologies between countries and between evaluation scholars and practitioners in the developed and developing world.
- AEJ aims to promote evaluation scholarship and authorship, and a culture of peer review in the African evaluation community.