Pub Date: 2021-06-01 | DOI: 10.1177/1035719X211014397 | Pages: 120–123
Evaluator Perspective
R. Cummings
Rick Cummings has a long history of outstanding contributions to the field of evaluation. With over 40 years of working in the field, Rick is still actively involved with the Australian Evaluation Society (AES) through committees, workshops, mentoring and other supporting roles from which the AES and the field of evaluation greatly benefit. He has served several terms on the AES Board, including as Vice President and President in the early 2000s, and was inducted as an AES Fellow in 2013. Rick’s responses and stories below are an inspiring read, commencing with how he got started in evaluation and how his experience grew from there, including employment in evaluation and research positions in several state government agencies, where he oversaw and conducted evaluation studies in health, crime prevention, education and training. In 1995, Rick took up an academic position at Murdoch University where, among other activities, he taught policy and research at postgraduate level. Rick has also conducted a range of evaluation studies through his own small consultancy business. Currently, Rick balances work as an Emeritus Professor at Murdoch University with his roles on the AES Awards and Recognition Working Group, as Convenor of the AES Fellows and as one of four Fellows involved in the newly commenced Mentoring Pilot Program. He also provides workshops in evaluation for the AES and the Institute of Public Administration Australia in Western Australia. On the lighter side of this phenomenal load, Rick and his wife Kathy enjoy walking the Bibbulmun Track, a beautiful walking trail spanning 1,000 km from the hills of Kalamunda on the outskirts of Perth all the way down to Albany in the south west of Western Australia. Rick’s interests include evaluation use, teaching evaluation, organisational learning and evaluative thinking.
{"title":"Evaluator Perspective","authors":"R. Cummings","doi":"10.1177/1035719X211014397","DOIUrl":"https://doi.org/10.1177/1035719X211014397","url":null,"abstract":"Rick Cummings has a long history of outstanding contributions to the field of evaluation. With over 40 years of working in the field, Rick is still active in working closely with the Australian Evaluation Society (AES) through his involvement in committees, workshops, mentoring and other supporting roles that the AES and the field of evaluation greatly benefit from. He has served several terms on the AES Board, including as Vice President and President in the early 2000s, and was inducted as an AES Fellow in 2013. Rick’s responses and stories below are an inspiring read, commencing with how he got started in evaluation and how his experience grew from there, including being employed in evaluation and research positions in several state government agencies where he oversaw and conducted evaluation studies in the areas of health, crime prevention, education and training. In 1995, Rick took up an academic position at Murdoch University where, among other activities, he taught policy and research at postgraduate level. Rick has also conducted a range of evaluation studies through his own small consultancy business. Currently, Rick balances work as an Emeritus Professor at Murdoch University with his role on the AES Awards and Recognition Working Group, Convenor of the AES Fellows and one of four Fellows involved in the newly commenced Mentoring Pilot Program. Not to mention, providing workshops in evaluation for the AES and the Institute for Public Administration of Australia in Western Australia. On the lighter side of this phenomenal load, Rick and his wife Kathy enjoy walking the Bibbulmun Track, a beautiful walking trail spanning 1,000 km from the hills of Kalamunda on the outskirts of Perth, all the way down to Albany in the south west of Western Australia. Rick’s interests include evaluation use, teaching evaluation, organisational learning and evaluative thinking.","PeriodicalId":37231,"journal":{"name":"Evaluation Journal of Australasia","volume":"21 1","pages":"120 - 123"},"PeriodicalIF":0.0,"publicationDate":"2021-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1177/1035719X211014397","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45002716","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-04-30 | DOI: 10.1177/1035719X211010823 | Pages: 182–184
Book review: Doing qualitative research in a digital world
Colin Sharp
{"title":"Book review: Doing qualitative research in a digital world","authors":"Colin Sharp","doi":"10.1177/1035719X211010823","DOIUrl":"https://doi.org/10.1177/1035719X211010823","url":null,"abstract":"","PeriodicalId":37231,"journal":{"name":"Evaluation Journal of Australasia","volume":"21 1","pages":"182 - 184"},"PeriodicalIF":0.0,"publicationDate":"2021-04-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1177/1035719X211010823","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43455671","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-04-19 | DOI: 10.1177/1035719X211008263 | Pages: 146–162
Thinking with complexity in evaluation: A case study review
C. Roche, Graham Brown, Samantha Clune, N. Shields, V. Lewis
Adopting complexity thinking in the design, implementation and evaluation of health and social development programmes is of increasing interest. Understanding the institutional contexts in which these programmes are located directly influences the shaping and eventual uptake of evaluations and their findings. A nuanced appreciation of the relationship between complexity, institutional arrangements and evaluation theory and practice provides an opportunity to optimise both programme design and eventual success. However, the application of complexity and systems thinking within programme design and evaluation is variously understood. Some understand complexity as the multiple constituent aspects within a system, while others take a more sociological approach, understanding interactions between beliefs, ideas and systems as mechanisms of change. This article adopts an exploratory approach to examine complexity thinking in the relational, recursive interactions between context and project design, implementation and evaluation. In doing so, common terms are used to demonstrate the nature of shared aspects of complexity across apparently different projects.
{"title":"Thinking with complexity in evaluation: A case study review","authors":"C. Roche, Graham Brown, Samantha Clune, N. Shields, V. Lewis","doi":"10.1177/1035719X211008263","DOIUrl":"https://doi.org/10.1177/1035719X211008263","url":null,"abstract":"Adopting complexity thinking in the design, implementation and evaluation of health and social development programmes is of increasing interest. Understanding institutional contexts in which these programmes are located directly influences shaping and eventual uptake of evaluations and relevant findings. A nuanced appreciation of the relationship between complexity, institutional arrangements and evaluation theory and practice provides an opportunity to optimise both programme design and eventual success. However, the application of complexity and systems thinking within programme design and evaluation is variously understood. Some understand complexity as the multiple constituent aspects within a system, while others take a more sociological approach, understanding interactions between beliefs, ideas and systems as mechanisms of change. This article adopts an exploratory approach to examine complexity thinking in the relational, recursive interactions between context and project design, implementation and evaluation. In doing so, common terms will be used to demonstrate the nature of shared aspects of complexity across apparently different projects.","PeriodicalId":37231,"journal":{"name":"Evaluation Journal of Australasia","volume":"21 1","pages":"146 - 162"},"PeriodicalIF":0.0,"publicationDate":"2021-04-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1177/1035719X211008263","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46283252","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-04-05 | DOI: 10.1177/1035719X211000154 | Pages: 124–126
Book review: Changing bureaucracies: Adapting to uncertainty, and how evaluation can help
R. Cummings
{"title":"Book review: Changing bureaucracies: Adapting to uncertainty, and how evaluation can help","authors":"R. Cummings","doi":"10.1177/1035719X211000154","DOIUrl":"https://doi.org/10.1177/1035719X211000154","url":null,"abstract":"","PeriodicalId":37231,"journal":{"name":"Evaluation Journal of Australasia","volume":"21 1","pages":"124 - 126"},"PeriodicalIF":0.0,"publicationDate":"2021-04-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1177/1035719X211000154","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42903421","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-03-24 | DOI: 10.1177/1035719X211000880 | Pages: 101–119
Performance measurement, evaluation and accountability in public philanthropic foundations
Alexandra Kate Williamson, Kylie L. Kingston
Philanthropic foundations routinely evaluate and measure the performance of the nonprofit organisations to which they distribute funds, as well as the programmes that are funded. Another aspect of philanthropic foundations’ evaluation processes, which receives comparatively little attention within academic or practitioner literature, concerns evaluations of grant applications. While the focus of philanthropic evaluation literature is mainly on ‘how evaluation is done’, the focus of this article is on ‘how evaluation is understood’. This article details perspectives from interviews with 28 managers and trustees of Public Ancillary Funds, conducted as part of a wider study on the accountability of foundations. These foundations must fundraise from the public, and donations to them are deductible against the taxable income of the donor, resulting in significant public accountability for their effectiveness and for how they evaluate the distribution of their funds. Four main themes emerged through the exploration of how evaluation is understood from the perspective of these senior foundation leaders: motivations, values, criteria and processes of evaluation.
{"title":"Performance measurement, evaluation and accountability in public philanthropic foundations","authors":"Alexandra Kate Williamson, Kylie L. Kingston","doi":"10.1177/1035719X211000880","DOIUrl":"https://doi.org/10.1177/1035719X211000880","url":null,"abstract":"Philanthropic foundations routinely evaluate and measure the performance of nonprofit organisations to which they distribute funds, as well as the programmes that are funded. Another aspect of philanthropic foundations’ evaluation processes, which receives comparatively little attention within academic or practitioner literature, concerns evaluations of grant applications. While the focus of philanthropic evaluation literature is mainly on ‘how evaluation is done’, the focus of this article is on ‘how evaluation is understood’. This article details perspectives from interviews with 28 managers and trustees of Public Ancillary Funds as part of a wider study on the accountability of foundations. These public foundations must fundraise from the public, and donations to them are deductible against the taxable income of the donor, resulting in a significant element of accountability to the public for their effectiveness and evaluation of the distribution of their funds. Four main themes emerged through the exploration of how evaluation is understood from the perspective of these senior foundation leaders: motivations, values, criteria and processes of evaluation.","PeriodicalId":37231,"journal":{"name":"Evaluation Journal of Australasia","volume":"21 1","pages":"101 - 119"},"PeriodicalIF":0.0,"publicationDate":"2021-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1177/1035719X211000880","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42257432","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-03-01 | DOI: 10.1177/1035719X21993938 | Pages: 7–23
Generalising from qualitative evaluation
J. Guenther, I. Falk
Evaluations are often focused on assessing the merit, value, outcome or some other feature of a programme, project, policy or other object. Evaluation research is then more concerned with the particular rather than the general, even more so when qualitative methods are used. But does this mean that evaluations should not be used to generalise? If it is possible to generalise from evaluations, under what circumstances can this be legitimately achieved? The authors of this article have previously argued for generalising from qualitative research (GQR), and in this article they extrapolate the discussion to the field of evaluation. First, the article begins with a discussion of the definitions of generalisability in research, recapping briefly on our arguments for GQR. Second, the differentiation between research and evaluation is explored, with consideration of what literature there is to justify generalisation from qualitative evaluation (GQE). Third, a typology derived from the literature is developed to sort 54 evaluation projects. Fourth, material from a suite of evaluation projects is used to demonstrate how the typology of generalisation applies in the context of evaluations conducted in several fields of study. Finally, we suggest a model for GQE.
{"title":"Generalising from qualitative evaluation","authors":"J. Guenther, I. Falk","doi":"10.1177/1035719X21993938","DOIUrl":"https://doi.org/10.1177/1035719X21993938","url":null,"abstract":"Evaluations are often focused on assessing merit, value, outcome or some other feature of a programme, project, policy or some other object. Evaluation research is then more concerned with the particular rather than the general – even more so, when qualitative methods are used. But does this mean that evaluations should not be used to generalise? If it is possible to generalise from evaluations, under what circumstances can this be legitimately achieved? The authors of this article have previously argued for generalising from qualitative research (GQR), and in this article, they extrapolate the discussion to the field of evaluation. First, the article begins with a discussion of the definitions of generalisability in research, recapping briefly on our arguments for GQR. Second, the differentiation between research and evaluation is explored with consideration of what literature there is to justify generalisation from qualitative evaluation (GQE). Third, a typology derived from the literature is developed, to sort 54 evaluation projects. Fourth, material from a suite of evaluation projects is drawn from to demonstrate how the typology of generalisation applies in the context of evaluations conducted in several fields of study. Finally, we suggest a model for GQE.","PeriodicalId":37231,"journal":{"name":"Evaluation Journal of Australasia","volume":"21 1","pages":"7 - 23"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1177/1035719X21993938","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45046118","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-03-01 | DOI: 10.1177/1035719x21999573 | Pages: 54–57
Evaluator Perspective
A. Rutter
Anthea Rutter has an incredible background in evaluation, working over many years with many other like-minded and high-profile evaluators to help shape the evaluation landscape in Australia. Anthea is a Fellow of the AES and a committed and dedicated person who works tirelessly and is always willing to help others. She is an Investigator in the Centre for Programme Evaluation at the University of Melbourne and has extensive experience working with a wide range of community and national organisations. She is particularly knowledgeable and experienced in social research, including education, mental health, police and military, emergency management and social welfare projects. Anthea’s current projects include an evaluation of Club Respect for the Victorian Women’s Trust, the evaluation of the Community Grants Programme for the Melbourne Disability Institute and the evaluation of VicHealth’s Arts Strategy.
{"title":"Evaluator Perspective","authors":"A. Rutter","doi":"10.1177/1035719x21999573","DOIUrl":"https://doi.org/10.1177/1035719x21999573","url":null,"abstract":"Anthea Rutter has an incredible background in evaluation, working with many other like-minded and highprofile evaluators over many years helping to shape the evaluation landscape in Australia. Anthea is a fellow of the AES and is a committed and dedicated person who works tirelessly and is always willing to help others. She is an Investigator in the Centre for Programme Evaluation at the University of Melbourne and has extensive experience working with a wide range of community and national organisations. She is particularly knowledgeable and experienced in social research including education, mental health, police and military, emergency management and social welfare projects. Anthea’s current projects include an evaluation of Club Respect for Victorian Women’s Trust, the evaluation of Community Grants Programme for Melbourne Disability Institute and the evaluation of VicHealth’s Arts Strategy.","PeriodicalId":37231,"journal":{"name":"Evaluation Journal of Australasia","volume":"21 1","pages":"54 - 57"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1177/1035719x21999573","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42571061","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-03-01 | DOI: 10.1177/1035719X21990117 | Pages: 3–6
Editorial
Bronwyn Rossingh
Welcome to the first issue of the EJA for 2021. The EJA, like any business or organisation, is in a cycle of continuous improvement, with a keen focus on developing strategies and opportunities to increase the number of submissions. We endeavour to inspire and attract articles of professional merit on any subject connected with evaluation, including information of professional interest. The EJA aims to position itself as a quality journal that enjoys a growing national and international readership, to ensure that the practice and theory of evaluation continue to be stimulated, expanded and strengthened while providing a platform of professional learning. This March 2021 issue encapsulates the work, history and thinking of a diverse range of evaluators who demonstrate and bring to life exactly what the EJA strives to achieve in keeping with its vision and purpose.

A further platform of professional development and learning is the annual International Evaluation Conference conducted by the Australian Evaluation Society (AES), to be held in late September this year in Brisbane. The theme for the 2021 conference is Through the Lens. This theme invites prospective attendees to consider the way we reflect on the wisdom of our elders; to refract our knowledge and learning in ways that enable us to embrace diversity in our communities; and to refocus on the future and create shared visions that resonate. This issue provides an early introduction to some of the subthemes of the conference. Its articles offer alternative thinking under the Refract subtheme, continually evolving and diversifying our evaluative thinking and practice to remain relevant. They also do justice to the Refocus subtheme, pushing the boundaries and practising on the edge, thereby expanding and strengthening knowledge within the evaluation field. It is hoped that these articles will inspire readers to submit their own articles to the EJA as early preparation for conference presentations. The peer-review process supports authors to improve their submissions for eventual publication.

In this issue, we have three unique articles offering different ways of thinking in the evaluation field, as the authors seek to strengthen and inform evaluation theory and practice by extending evaluative thinking through tools and knowledge sharing. In the first article for this issue, John Guenther and Ian Falk formulate a model to enable an approach to generalise from qualitative evaluation (GQE). The article, titled ‘Generalising from qualitative evaluation’, commences with a discussion of the definitions of generalisability in a research context, then seeks to find the differentiation …
{"title":"Editorial","authors":"Bronwyn Rossingh","doi":"10.1177/1035719X21990117","DOIUrl":"https://doi.org/10.1177/1035719X21990117","url":null,"abstract":"Welcome to the first issue of the EJA for 2021. The EJA like any business or organisation is in a cycle of continuous improvement with a keen focus on developing strategies and opportunities to increase the number of submissions. We endeavour to inspire and attract articles of professional merit on any subject connected with evaluation including information of professional interest. The EJA aims to position itself as a quality journal that enjoys a growing national and international readership to ensure that the practice and theory of evaluation continue to be stimulated, expanded and strengthened while providing a platform of professional learning. This March 2021 issue encapsulates a representation of the work, history and thinking of a diverse range of evaluators who demonstrate and bring to life exactly what the EJA strives to achieve in keeping with its vision and purpose. A further platform of professional development and learning is the annual International Evaluation Conference conducted by the Australian Evaluation Society (AES) to be held in late September this year in Brisbane. The theme for the 2021 conference is Through the Lens. This theme invites prospective attendees to consider the way we reflect on the wisdom of our elders; to refract our knowledge and learning in ways that enable us to embrace diversity in our communities: as well as to refocus on the future and create shared visions that resonate. This issue provides an early introduction to some of the subthemes of the conference by offering an alternative thinking under the Refract subtheme to continually evolve and diversify our evaluative thinking and practice to be relevant. These articles also do justice to the Refocus subtheme to push the boundaries, practising on the edge and therefore expanding and strengthening knowledge within the evaluation field. It is hoped that these articles will inspire readers to submit their own articles to the EJA as early preparation for conference presentations. The peer-reviewing process supports authors to improve their submissions for eventual publication. In this issue, we have three very unique articles offering different ways of thinking in the evaluation field as the authors seek to strengthen and inform evaluation theory and practice by extending evaluative thinking through tools and knowledge sharing. In the first article for this issue, John Guenther and Ian Falk formulate a model to enable an approach to generalise from qualitative evaluation (GQE). 
The article titled ‘Generalising from qualitative evaluation’ commences with discussion of the definitions of generalisability in a research context, then seeks to find the differentiation 990117 EVJ0010.1177/1035719X21990117Evaluation Journal of AustralasiaEditorial editorial2021","PeriodicalId":37231,"journal":{"name":"Evaluation Journal of Australasia","volume":"21 1","pages":"3 - 6"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1177/1035719X21990117","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41905250","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-03-01 | DOI: 10.1177/1035719X20986251 | Pages: 24–39
Practical application of cost-utility analysis in summative evaluation
Mardi Trompf, F. Kotvojs
Donors prioritise evaluation of Value for Money (VfM) in development interventions; however, the theory and practice of doing so are still developing and are applied inconsistently. Theory found in donor government guides and textbooks is often high-level, and economic evaluation theory can be difficult to apply in practice. This is compounded when there are multiple stakeholder groups, patchy data quality and short time horizons for decision making. This article demonstrates how Cost-Utility Analysis (CUA) can be used as a programme evaluation tool that brings practice together with theory to meet donor needs, suit development environments and provide the evaluative robustness needed for defensible VfM conclusions. The example described here is the evaluation of a programme in Samoa, where almost AU$10 million was donated by the Governments of New Zealand and Australia to assist the tourism industry’s recovery from Tropical Cyclone Evan (TCE) in 2012. The programme design had six delivery modalities, and its subsequent evaluation included an analysis of the cost utility of each modality, feeding into a VfM conclusion. This practical application of CUA theory demonstrates an effective approach to evaluating VfM.
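To make the mechanics concrete, the sketch below shows the core arithmetic behind a cost-utility comparison: each delivery modality is scored as cost per unit of utility, and the resulting ranking feeds a VfM judgement. This is a minimal illustration only; the modality names, costs and utility scores are hypothetical and are not drawn from the Samoa evaluation.

```python
# Minimal cost-utility analysis (CUA) sketch. All modality names and
# figures below are hypothetical, not taken from the Samoa programme.
from dataclasses import dataclass


@dataclass
class Modality:
    name: str
    cost: float     # total cost attributed to the delivery modality (AUD)
    utility: float  # aggregate utility score, e.g. from stakeholder ratings


def cost_utility_ratio(m: Modality) -> float:
    """Cost per unit of utility; a lower ratio indicates better value for money."""
    return m.cost / m.utility


modalities = [
    Modality("Direct grants to operators", 2_500_000, 70.0),
    Modality("Training and advisory support", 1_200_000, 45.0),
    Modality("Destination marketing", 800_000, 20.0),
]

# Rank modalities from best to worst value for money.
for m in sorted(modalities, key=cost_utility_ratio):
    print(f"{m.name}: A${cost_utility_ratio(m):,.0f} per utility point")
```

In an actual CUA, the utility scores would be elicited from stakeholders against agreed criteria, and the scores would normally be sensitivity-tested before any VfM conclusion is drawn.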
{"title":"Practical application of cost-utility analysis in summative evaluation","authors":"Mardi Trompf, F. Kotvojs","doi":"10.1177/1035719X20986251","DOIUrl":"https://doi.org/10.1177/1035719X20986251","url":null,"abstract":"Donors prioritise evaluation of Value for Money (VfM) in development interventions; however, the theory and practice of doing so is still developing and applied inconsistently. Theory found in donor government guides and textbooks is often high-level and economic evaluation theory can be difficult to apply in practice. This is compounded when there are multiple stakeholder groups, patchy data quality and short time horizons for decision making. This article demonstrates how Cost-Utility Analysis (CUA) can be used as a programme evaluation tool to bring practice together with theory to meet donor needs, suit development environments and provide evaluation robustness in defensible VfM conclusions. The example described here is in the evaluation of a programme in Samoa, where almost AU$10 million was donated by the Governments of New Zealand and Australia for tourism industry assistance in recovery from the 2012 Tropical Cyclone Evan (TCE). The programme design had six delivery modalities and its subsequent evaluation included an analysis of the cost utility of each modality, feeding into a VfM conclusion. This practical application of CUA theory demonstrates an effective approach to evaluating VfM.","PeriodicalId":37231,"journal":{"name":"Evaluation Journal of Australasia","volume":"21 1","pages":"24 - 39"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1177/1035719X20986251","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44246734","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2021-02-05 | DOI: 10.1177/1035719X21999110 | Pages: 69–84
Valuing beneficiary voice: Involving children living in out-of-home care in programme evaluation
Ruth L. Knight, Kylie L. Kingston
This article charts the innovative evaluation journey an Australian nonprofit organisation, The Pyjama Foundation (PJF), took in designing an evaluation instrument to gain feedback from programme beneficiaries. PJF sought to develop a formal, targeted approach to hearing the perspectives of children living in out-of-home care who are involved with its Love of Learning educational programme. The design process included two focus group discussions with foster carers, programme volunteers and child development experts. From this, an evaluation survey for children to use was developed. The survey’s underpinning conceptual framework, based on key protective factors influencing educational outcomes for children in out-of-home care, is a key contribution of this research. In addition, the design and implementation issues PJF encountered offer insights for other nonprofit organisations and evaluators, and contribute to academic knowledge on evaluations involving children and vulnerable stakeholders. Hearing children’s views on the programmes they are involved in is vital in helping to develop safe spaces for children to engage, where their thoughts are valued and their opinions matter. As such, the processes detailed within this article support the development of evaluation practices that value children’s voices.
{"title":"Valuing beneficiary voice: Involving children living in out-of-home care in programme evaluation","authors":"Ruth L. Knight, Kylie L. Kingston","doi":"10.1177/1035719X21999110","DOIUrl":"https://doi.org/10.1177/1035719X21999110","url":null,"abstract":"This article charts the innovative evaluation journey an Australian nonprofit organisation, The Pyjama Foundation (PJF), has taken when designing an evaluation instrument to gain feedback from programme beneficiaries. PJF sought to develop a formal, targeted approach to hear the perspectives of children living in out-of-home care, who are involved with their Love of Learning educational programme. The design process included two focus group discussions with foster carers, programme volunteers, and child development experts. From this, an evaluation survey for children to use was developed. The survey’s underpinning conceptual framework, based on key protective factors influencing educational outcomes for children in out-of-home care, is a key contribution of this research. In addition, the design and implementation issues PJF encountered contribute insights for other nonprofit organisations and evaluators and academic knowledge towards evaluations involving children and vulnerable stakeholders. Hearing children’s views on programmes they are involved in is vital in helping to develop safe spaces for children to engage, where their thoughts are valued and opinions matter. As such, the processes detailed within this article support the development of evaluation practices that value children’s voices.","PeriodicalId":37231,"journal":{"name":"Evaluation Journal of Australasia","volume":"21 1","pages":"69 - 84"},"PeriodicalIF":0.0,"publicationDate":"2021-02-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1177/1035719X21999110","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45227172","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}