Pub Date: 2020-12-04 | DOI: 10.1177/1035719X20977522
Leanne M. Kelly, C. Reid
Monitoring is largely overlooked in its capacity to make a distinct contribution to evaluation. It is often thought of as a process of collecting data to feed into an evaluation, rather than valued for its own powerful transformative potential. Evaluation is considered a mechanism for producing findings that enable learning, improvement and decision-making; but what if monitoring could produce these same outcomes with, in some cases, greater alignment to the quality characteristics of utility, timeliness, feasibility, propriety, accuracy, completeness and monitoring accountability? This article examines the utilisation and value of monitoring through a case study of a government-funded 12-month rural health project in Victoria, Australia. The project initially commissioned a baseline study against which to assess post-project outcomes. However, adopting a utilisation-focused perspective to prepare for use and support stakeholder engagement enabled implementation of a multipurpose monitoring framework. The case study provides examples of monitoring in action, with timely learning, decision-making and improvements resulting in incremental system and behaviour changes, rather than reliance on periodic outcome recommendations at evaluation completion. This article adds to evaluation theory and practice by highlighting monitoring as a significant mechanism for enabling learning, decision-making and improvement.
Baselines and monitoring: More than a means to measure the end. Evaluation Journal of Australasia, 21(1), 40–53.
Pub Date: 2020-12-01 | DOI: 10.1177/1035719X20971853
P. Graves
This article examines what impeded programme evaluation from becoming embedded in the Australian Public Service (APS), a question relevant to the Australian Government’s current priority of embedding evaluation in the APS. It draws on a case study of evaluation as the major element of the 1980s APS ‘Managing for Results’ (MfR) reform and examines the reasons for evaluation’s later demise. During MfR, evaluation was intended to demonstrate the effectiveness of APS programmes. Although evaluation was incorporated into APS practice by 1992, after 1997 it was no longer required. Currently, agencies must demonstrate their annual non-financial performance over 4 years under the Public Governance, Performance and Accountability Act 2013, with evaluation recommended to support this requirement. This history is pertinent to the Government’s current consideration of a National Indigenous Evaluation Strategy, which supports the creation of an independent Evaluator-General to embed evaluation practice in the APS.
Evaluation in the Australian Public Service: Formerly practised – Not yet embedded. Evaluation Journal of Australasia, 20(1), 229–243.
Pub Date: 2020-12-01 | DOI: 10.1177/1035719x20972217
Bronwyn Rossingh, Carol Quadrelli, Jeff Adams, Kylie L. Kingston
Editorial. Evaluation Journal of Australasia, 20(1), 193–196.
Pub Date: 2020-11-09 | DOI: 10.1177/1035719X20971854
K. Adusei-Asante, E. Bennett, W. Simpson, Sharlene Hindmarsh, Beth Harvey, Cherilyn McMeekin
Evaluability assessment focuses on the readiness of organisations to carry out evaluations. Scholars argue that evaluability assessment needs to focus on internal evaluation systems and tools and their capability to measure programmes and services reliably and credibly. Even so, literature offering best-practice guidelines for evaluability assessment in the not-for-profit sector appears to be rare. We seek to begin filling this gap by presenting lessons learned from Ngala, Western Australia, where we reviewed the organisation’s evaluation practice and culture in 2018/2019. The Service Model and Outcomes Measurement Audit project assessed the extent to which service models within Ngala aligned with the organisation’s standardised service model and individual service contracts, as well as the consistency of outcomes, data collection and reporting practices. Insights from the project and their implications for evaluability assessment practice are discussed.
Evaluating our evaluability: Lessons from Ngala, Western Australia. Evaluation Journal of Australasia, 20(1), 212–228.
Pub Date: 2020-11-05 | DOI: 10.1177/1035719X20969986
Tania Buck
The COVID-19 pandemic has forced Australia, and our global counterparts, onto a steep learning curve and a journey of rapid adaptation. Understanding the real impact of pandemic response interventions will be crucial to our collective success in this ongoing endeavour. This article proposes and outlines an integrated, sequential, mixed-methods approach to analysing the impact of Australia’s emergency response to the COVID-19 pandemic across its states and territories. We begin with a brief description of the Australian Health Sector Emergency Response Plan for Novel Coronavirus (COVID-19). The second section defines the nature of impact analysis and evaluation and when to use each. The third section outlines considerations for selecting an impact analysis approach as it pertains to infectious disease pandemic response and to virus transmission, morbidity and mortality outcomes. The fourth section presents and discusses a conceptual framework for an integrated impact analysis approach, clearly delineating the links between the proposed approach, the impact analysis questions and the outlined pandemic response considerations.
Analysing the impact of the Australian health sector emergency response plan for Novel Coronavirus (COVID-19): A proposed approach. Evaluation Journal of Australasia, 20(1), 197–211.
Pub Date: 2020-10-30 | DOI: 10.1177/1035719X20969441
Amy Lawton
Book Review: Practical mapping for applied research and program evaluation. Evaluation Journal of Australasia, 20(1), 248–250.
Pub Date: 2020-09-01 | DOI: 10.1177/1035719X20944010
P. Harris, M. Barry, Lyndal Sleep, Jess Griffiths, L. Briggs
Partners in Recovery (PIR) was an Australian government initiative designed to provide support and service linkage for individuals with complex needs living with severe and persistent mental illness. This article reports the process and approach of the external evaluation of the Gold Coast Partners in Recovery initiative, undertaken between September and December 2015, which examined the achievement of PIR outcomes. The evaluation of this consortia-based initiative was framed using principles of realistic evaluation and recovery-oriented practice. Numerous evaluations of similar initiatives have recently been undertaken, each adopting different approaches and methods in accordance with local needs and expectations. The incorporation of realistic evaluation with recovery-oriented principles in this mixed-methods research design, however, offers a unique perspective. It can inform future developments in evaluative practice, particularly for recovery-oriented services and partnership-focused, capacity-building initiatives.
Integrating recovery-oriented and realistic evaluation principles into an evaluation of a Partners in Recovery programme. Evaluation Journal of Australasia, 20(1), 140–156.