A quasi-experimental evaluation of a skills capacity workshop in the South African public service

African Evaluation Journal (Q2, Social Sciences) · Pub Date: 2020-03-31 · DOI: 10.4102/aej.v8i1.421
P. Jonck, R. D. Coning
{"title":"对南非公共服务部门技能能力讲习班的准实验性评估","authors":"P. Jonck, R. D. Coning","doi":"10.4102/aej.v8i1.421","DOIUrl":null,"url":null,"abstract":"Background: A paucity of evaluation studies could be identified that investigated the impact of training. The lacuna of research should be viewed in light of austerity measures as well as inability to measure return of investment on training expenditure, which is substantial year on year, especially in the context of public service. Objectives: This article reports on an impact evaluation of a research methodology skills capacity workshop. Method: A quasi-experimental evaluation design in which comparison groups were utilised to evaluate the impact of a research methodology skills development intervention. A paired-sample t -test was used to measure the knowledge increase whilst controlling for the influence of comparison groups by means of an analysis of variance. A hierarchical multiple regression analysis was performed to determine how much of the variance in research methodology knowledge could be contributed to the intervention whilst controlling for facilitator effect. Results: Results indicated that the intervention had a statistically significant impact on research methodology knowledge. Furthermore, the intervention group significantly differed statistically from the control and comparison groups with respect to research methodology knowledge. Facilitator effect was found to be a moderating variable. Hierarchical regression analysis performed to isolate the impact of intervention in the absence of facilitator effect revealed a statistically significant result. Conclusion: The study augments the corpus of knowledge by providing evidence of training impact within the South African public service, especially utilising a quasi-experimental pre-test–post-test research design and isolating the impact of facilitator effect from the intervention itself.","PeriodicalId":37531,"journal":{"name":"African Evaluation Journal","volume":" 17","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2020-03-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A quasi-experimental evaluation of a skills capacity workshop in the South African public service\",\"authors\":\"P. Jonck, R. D. Coning\",\"doi\":\"10.4102/aej.v8i1.421\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Background: A paucity of evaluation studies could be identified that investigated the impact of training. The lacuna of research should be viewed in light of austerity measures as well as inability to measure return of investment on training expenditure, which is substantial year on year, especially in the context of public service. Objectives: This article reports on an impact evaluation of a research methodology skills capacity workshop. Method: A quasi-experimental evaluation design in which comparison groups were utilised to evaluate the impact of a research methodology skills development intervention. A paired-sample t -test was used to measure the knowledge increase whilst controlling for the influence of comparison groups by means of an analysis of variance. A hierarchical multiple regression analysis was performed to determine how much of the variance in research methodology knowledge could be contributed to the intervention whilst controlling for facilitator effect. Results: Results indicated that the intervention had a statistically significant impact on research methodology knowledge. 
Furthermore, the intervention group significantly differed statistically from the control and comparison groups with respect to research methodology knowledge. Facilitator effect was found to be a moderating variable. Hierarchical regression analysis performed to isolate the impact of intervention in the absence of facilitator effect revealed a statistically significant result. Conclusion: The study augments the corpus of knowledge by providing evidence of training impact within the South African public service, especially utilising a quasi-experimental pre-test–post-test research design and isolating the impact of facilitator effect from the intervention itself.\",\"PeriodicalId\":37531,\"journal\":{\"name\":\"African Evaluation Journal\",\"volume\":\" 17\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-03-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"African Evaluation Journal\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.4102/aej.v8i1.421\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"Social Sciences\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"African Evaluation Journal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4102/aej.v8i1.421","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"Social Sciences","Score":null,"Total":0}
Citations: 0

Abstract

Background: Few evaluation studies have investigated the impact of training. This research gap should be viewed in light of austerity measures and the inability to measure the return on investment of training expenditure, which is substantial year on year, especially in the public service. Objectives: This article reports on an impact evaluation of a research methodology skills capacity workshop. Method: A quasi-experimental evaluation design with comparison groups was used to evaluate the impact of a research methodology skills development intervention. A paired-sample t-test measured the increase in knowledge, whilst the influence of the comparison groups was controlled by means of an analysis of variance. A hierarchical multiple regression analysis determined how much of the variance in research methodology knowledge could be attributed to the intervention whilst controlling for facilitator effect. Results: The intervention had a statistically significant impact on research methodology knowledge. Furthermore, the intervention group differed statistically significantly from the control and comparison groups with respect to research methodology knowledge. Facilitator effect was found to be a moderating variable. A hierarchical regression analysis performed to isolate the impact of the intervention in the absence of facilitator effect revealed a statistically significant result. Conclusion: The study augments the corpus of knowledge by providing evidence of training impact within the South African public service, particularly by utilising a quasi-experimental pre-test–post-test research design and isolating the facilitator effect from the intervention itself.
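The analysis described in the Method section can be illustrated with a minimal sketch. The code below is not the authors' code: it simulates hypothetical pre-test and post-test scores and uses scipy and statsmodels (assumed tooling) to run a paired-sample t-test, a one-way ANOVA across the three groups, and a two-step hierarchical regression whose change in R-squared isolates the intervention's contribution while controlling for a facilitator effect. All variable names (pre_score, post_score, facilitator, treated) are hypothetical.

```python
# Illustrative sketch only, with simulated data; not the study's actual analysis code.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 90
df = pd.DataFrame({
    "group": np.repeat(["intervention", "comparison", "control"], n // 3),
    "facilitator": rng.integers(1, 4, size=n),   # three hypothetical facilitators
    "pre_score": rng.normal(50, 10, size=n),     # pre-test knowledge score
})
# Simulate a post-test gain for the intervention group only.
df["post_score"] = df["pre_score"] + np.where(
    df["group"] == "intervention", rng.normal(8, 3, size=n), rng.normal(0, 3, size=n)
)

# 1. Paired-sample t-test: pre- vs. post-test knowledge within the intervention group.
iv = df[df["group"] == "intervention"]
t_stat, p_val = stats.ttest_rel(iv["post_score"], iv["pre_score"])
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_val:.4f}")

# 2. One-way ANOVA on post-test scores across the three groups.
anova_model = smf.ols("post_score ~ C(group)", data=df).fit()
print(sm.stats.anova_lm(anova_model, typ=2))

# 3. Hierarchical regression: enter pre-test score and facilitator first, then add an
#    intervention dummy; the change in R-squared isolates the intervention's
#    contribution while controlling for facilitator effect.
df["treated"] = (df["group"] == "intervention").astype(int)
step1 = smf.ols("post_score ~ pre_score + C(facilitator)", data=df).fit()
step2 = smf.ols("post_score ~ pre_score + C(facilitator) + treated", data=df).fit()
print(f"R2 step 1 = {step1.rsquared:.3f}, R2 step 2 = {step2.rsquared:.3f}, "
      f"delta R2 = {step2.rsquared - step1.rsquared:.3f}")
```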
Source journal
African Evaluation Journal
Subject area: Social Sciences (Sociology and Political Science)
CiteScore: 1.50
Self-citation rate: 0.00%
Articles published per year: 16
Review time: 20 weeks
Journal description: The journal publishes high-quality, peer-reviewed articles of merit on any subject related to evaluation, and provides targeted information of professional interest to members of AfrEA and its national associations. Aims of the African Evaluation Journal (AEJ):
- AEJ aims to be a high-quality, peer-reviewed journal that builds evaluation-related knowledge and practice in support of effective developmental policies on the African continent.
- AEJ aims to provide a communication platform for scholars and practitioners of evaluation to share and debate ideas about evaluation theory and practice in Africa.
- AEJ aims to promote cross-fertilisation of ideas and methodologies between countries and between evaluation scholars and practitioners in the developed and developing world.
- AEJ aims to promote evaluation scholarship and authorship, and a culture of peer review in the African evaluation community.
Latest articles from this journal:
- Erratum: Review of Goldman and Pabari's book through the lens of the work of Sulley Gariba
- Table of Contents Vol 11, No 1 (2023)
- Improving citizen-based monitoring in South Africa: A social media model
- A results-based monitoring and evaluation system for the Namibian Child Support Grant programme
- Lessons learned from an occupational therapy programme needs assessment