{"title":"对权力说真话:通过评估者和决策者的眼睛探索部委的评估部门","authors":"Lotte Levelt, Nicky R. M. Pouw","doi":"10.1177/13563890221109620","DOIUrl":null,"url":null,"abstract":"‘Evidence-based’ development policy has caused impact evaluations to prioritise accountability over addressing processual learning questions. Moreover, evaluation scholarship is dominated by surveys, whereas qualitative research remains scant. This article traces one particular evaluation, within the independent Evaluation Department of the Dutch Ministry of Foreign Affairs. It asks, ‘How do evaluators and policymakers interact and what adjustments follow from the illustrative evaluation?’ It used participant observations, documents and interviews with policymakers and evaluators. An in-depth thematic analysis resulted in a typology of evaluator roles: (1) knowledge broker, (2) facilitator, (3) archive, (4) truth-revealing and (5) critical voice. Finally, policymakers and managers adjusted in three ways: symbolic, instrumental and empowerment. These results imply that if evaluators deliberate a suitable role, they (1) increase their partial understandings of the programme under scrutiny and the involved stakeholders, and (2) enhance the potential of synergies in collective learning to emerge in an evaluation team and the broader institution.","PeriodicalId":19964,"journal":{"name":"Performance Evaluation","volume":null,"pages":null},"PeriodicalIF":1.0000,"publicationDate":"2022-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Speaking truth to power: Exploring a Ministry’s evaluation department through evaluators’ and policymakers’ eyes\",\"authors\":\"Lotte Levelt, Nicky R. M. Pouw\",\"doi\":\"10.1177/13563890221109620\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"‘Evidence-based’ development policy has caused impact evaluations to prioritise accountability over addressing processual learning questions. 
Moreover, evaluation scholarship is dominated by surveys, whereas qualitative research remains scant. This article traces one particular evaluation, within the independent Evaluation Department of the Dutch Ministry of Foreign Affairs. It asks, ‘How do evaluators and policymakers interact and what adjustments follow from the illustrative evaluation?’ It used participant observations, documents and interviews with policymakers and evaluators. An in-depth thematic analysis resulted in a typology of evaluator roles: (1) knowledge broker, (2) facilitator, (3) archive, (4) truth-revealing and (5) critical voice. Finally, policymakers and managers adjusted in three ways: symbolic, instrumental and empowerment. These results imply that if evaluators deliberate a suitable role, they (1) increase their partial understandings of the programme under scrutiny and the involved stakeholders, and (2) enhance the potential of synergies in collective learning to emerge in an evaluation team and the broader institution.\",\"PeriodicalId\":19964,\"journal\":{\"name\":\"Performance Evaluation\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.0000,\"publicationDate\":\"2022-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Performance Evaluation\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1177/13563890221109620\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"COMPUTER SCIENCE, HARDWARE & ARCHITECTURE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Performance 
Evaluation","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1177/13563890221109620","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, HARDWARE & ARCHITECTURE","Score":null,"Total":0}
Speaking truth to power: Exploring a Ministry’s evaluation department through evaluators’ and policymakers’ eyes
Abstract:
‘Evidence-based’ development policy has caused impact evaluations to prioritise accountability over addressing processual learning questions. Moreover, evaluation scholarship is dominated by surveys, whereas qualitative research remains scant. This article traces one particular evaluation, within the independent Evaluation Department of the Dutch Ministry of Foreign Affairs. It asks, ‘How do evaluators and policymakers interact and what adjustments follow from the illustrative evaluation?’ It used participant observations, documents and interviews with policymakers and evaluators. An in-depth thematic analysis resulted in a typology of evaluator roles: (1) knowledge broker, (2) facilitator, (3) archive, (4) truth-revealing and (5) critical voice. Finally, policymakers and managers adjusted in three ways: symbolic, instrumental and empowerment. These results imply that if evaluators deliberate a suitable role, they (1) increase their partial understandings of the programme under scrutiny and the involved stakeholders, and (2) enhance the potential of synergies in collective learning to emerge in an evaluation team and the broader institution.
Journal introduction:
Performance Evaluation functions as a leading journal in the area of modeling, measurement, and evaluation of performance aspects of computing and communication systems. As such, it aims to present a balanced and complete view of the entire Performance Evaluation profession. Hence, the journal is interested in papers that focus on one or more of the following dimensions:
-Define new performance evaluation tools, including measurement and monitoring tools as well as modeling and analytic techniques
-Provide new insights into the performance of computing and communication systems
-Introduce new application areas where performance evaluation tools can play an important role, and create new uses for performance evaluation tools
More specifically, common application areas of interest include the performance of:
-Resource allocation and control methods and algorithms (e.g. routing and flow control in networks, bandwidth allocation, processor scheduling, memory management)
-System architecture, design and implementation
-Cognitive radio
-VANETs
-Social networks and media
-Energy efficient ICT
-Energy harvesting
-Data centers
-Data centric networks
-System reliability
-System tuning and capacity planning
-Wireless and sensor networks
-Autonomic and self-organizing systems
-Embedded systems
-Network science