{"title":"重新评估在线口语考试","authors":"Felicitas Starr-Egger","doi":"10.1111/tger.12232","DOIUrl":null,"url":null,"abstract":"<p>Testing students’ proficiency in an oral, face-to-face setting has been a central part of education in many disciplines from medicine to modern foreign languages in the United Kingdom for a long time. Uses range from admission interviews to PhD defenses. However, the Coronavirus (COVID-19) pandemic afforded a re-evaluation of this assessment method and provided an opportunity for online implementation. There is a wealth of literature on online assessment in general as well as on the increased use in response to the pandemic (Butler-Henderson & Crawford, <span>2020</span>; Clark et al., <span>2020</span>; Montenegro-Rueda, <span>2021</span>; Pokhrel & Chhetri, <span>2021</span>; Jadav, <span>2022</span>), including guidelines (Haus et al., <span>2020</span>; Tuah & Naing, <span>2020</span>) to help prevent an uncritical transfer of the previously used traditional format, that is, in-person, on campus, and on paper, to a virtual learning environment. Shelton et al. (<span>2020</span>) also warn about the possible de-humanization of learning along with assessment and, citing Callaghan (<span>1964</span>), highlight the risk of “<i>becoming swept up in the flawed cult of efficiency… with a crude focus on quick, standardized evaluation of student learning at scale</i>” (p. 125). It could be argued that oral examinations—whether online or face-to-face—are the very antithesis to mass assessment events, re-humanize examinations, and produce better outcomes (Houston et al., <span>2006</span>; Odafe, <span>2006</span>; Roecker, <span>2007</span>). Akimov and Malin (<span>2020</span>) claim that “literature that discusses oral examination in an online context is practically non-existent” (p. 5); Graf et al. (<span>2021</span>, p. 5) make a similar assertion. This is perhaps not quite the case since studies investigating the use of online video conferencing tools for assessment purposes appear to go back many years (Isbell & Winke, <span>2019</span>; Isbell et al., <span>2019</span>; Li & Link, <span>2018</span>; Newhouse & Cooper, <span>2013</span>; Okada et al., <span>2015</span>). However, for universities in the United Kingdom, online oral examinations were and still are a novelty.</p><p>The following discussion focuses on German courses within the institution-wide modern foreign language program (IWLP) at a UK university, spanning six proficiency levels (A1–C1/C2 of the Common European Framework of Reference for Languages CEFR). Such courses are called modules in the United Kingdom and go over two terms with 40 contact hours between October and March. Students take these modules for degree credit (factored into their overall year grade) or extra credit (recorded on their transcript but not part of their degree); content and assessment are identical in both. Assessment combines “take-home” coursework, a written, and an oral examination (both of which are compulsory). Each examination contributes about one-third to the overall module grade. Oral examinations are always conducted by two examiners and vary in length from 10 to 12 min for level one to 23 to 25 min for level six.</p><p>From February 2020 onward, normal operations of teaching and assessment were severely disrupted when COVID-19 case numbers in the United Kingdom increased drastically. 
Although it was possible to complete teaching face-to-face as scheduled on campus (as the academic year runs from October to June), examinations were a challenge for two reasons: First, a large number of students left for home in the third week of March just prior to when exams were to take place. Second, a large number of lecturers called in sick or declared that they were unable to attend for health and safety reasons. Both of these factors required that examinations be postponed and later moved to an online environment. It was decided to hold all oral examinations online on Teams (as the university's video conference tool of choice, and later Zoom became an option). The transition required the following steps: prompt communication with the students about the new modus operandi; staff training on the use of Teams; setting up of Teams exam meetings; evaluation and re-design of the exam content. Following intense staff training (partly as online workshops or mini-teaching sessions, partly as short mock exams among colleagues), the first oral examination was held for German (level two, A2 CEFR) within a few weeks.</p><p>The next step was the scheduling phase. Although traditionally, oral examinations have been associated with a heavy workload (Chambers & Richards, <span>2007</span>; Antwerp University's Centre for Expertise in Higher Education, <span>n.d</span>.) often due to the sheer numbers of candidates involved, the established use of shared, online spreadsheets for scheduling greatly reduced the administrative effort. Nevertheless, some administrative effort was required as individual Teams meetings had to be created.</p><p>A simple comparison of estimated support staff hours spent on preparing online versus on-campus oral exams shows significant time savings for the online format: a Teams meeting in Outlook can be created and sent to candidates and examiners in approximately 2 min with no further input from the admin team required. For the 742 candidates examined since the start of the pandemic, the total time would be just over 3 working days. For on-campus exams, which require candidate check-in, workspace allocation, invigilation, and so forth, in addition to the initial emailing, the total estimated time required is almost 8 working days. Another clear benefit of Teams meetings is the centrally stored video recordings for all examinations, which can be used for quality assurance purposes, that is, checking by external examiners, in potential student appeals or academic misconduct investigations. Apart from the time savings outlined above, the travel time (and money) saved both by candidates and examiners, is also worth mentioning.</p><p>Pertinent discussions of the COVID-19-induced shift to online assessment in general invariably include a focus on cheating (Nažnean, <span>2021</span>), ranging from reports of relatively small numbers to staggering percentages (Janke, et al., <span>2021</span>; Sokol, <span>2022</span>). Furthermore, it has to be borne in mind that, as Nažnean citing Wenzel and Reinhard (<span>2020</span>) puts it, “cases of cheating may be underreported due to the fact that people rarely admit to their own cheating” (p. 104). It is equally likely that educational establishments are reluctant to publish accurate statistics on academic misconduct for fear of reputational damage. This would strengthen the case for a return to traditional on-campus examinations in general. 
However, as far as online oral examinations are concerned, the German lecturing team reported no cases of academic misconduct in the first 2 years of operation. In 2021–2022, unfortunately, a few cases of two forms of academic misconduct came to light. Candidates either appeared to type words or phrases into translation software or run simultaneous interpreting software alongside the call. Students used stalling expressions, such as, “<i>Können Sie das bitte wiederholen</i>” or “<i>Entschuldigung, ich habe nicht verstanden</i>” to gain time. However, both crude cheating methods were very easily detected, students were asked to desist, or the examination would be terminated, and an investigation launched.</p><p>While it has to be acknowledged that online oral examinations are not cheat-proof, the format may probably, at least for the time being, be assumed to be more robust than written online examinations as the candidates are known to examiners and can be identified. Furthermore, the examination produces an authentic response.</p><p>Returning to the concept of oral examinations in general, on the one hand, concerns regarding their reliability and validity (Fulcher, <span>2015</span>; Memon et al., <span>2010</span>) are a clear disadvantage. However, on the other hand, some studies have shown that they can be regarded as a form of inclusive assessment (Huxhama et al., <span>2012</span>; Symonds, <span>2008</span>), and there is evidence that an online oral examination format supports anxious students better (Theobold, <span>2021</span>, Waterford & West, <span>2006</span>) and benefits students with a physical disability (Basilaia & Kvavadze, <span>2020</span>).</p><p>Following a review of the instructor and student experiences with the online oral examination format and its pros and cons, it was decided to continue its use within the IWLP for the foreseeable future.</p>","PeriodicalId":43693,"journal":{"name":"Unterrichtspraxis-Teaching German","volume":"56 1","pages":"53-57"},"PeriodicalIF":0.6000,"publicationDate":"2023-05-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1111/tger.12232","citationCount":"0","resultStr":"{\"title\":\"Re-evaluating online oral examinations\",\"authors\":\"Felicitas Starr-Egger\",\"doi\":\"10.1111/tger.12232\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Testing students’ proficiency in an oral, face-to-face setting has been a central part of education in many disciplines from medicine to modern foreign languages in the United Kingdom for a long time. Uses range from admission interviews to PhD defenses. However, the Coronavirus (COVID-19) pandemic afforded a re-evaluation of this assessment method and provided an opportunity for online implementation. There is a wealth of literature on online assessment in general as well as on the increased use in response to the pandemic (Butler-Henderson & Crawford, <span>2020</span>; Clark et al., <span>2020</span>; Montenegro-Rueda, <span>2021</span>; Pokhrel & Chhetri, <span>2021</span>; Jadav, <span>2022</span>), including guidelines (Haus et al., <span>2020</span>; Tuah & Naing, <span>2020</span>) to help prevent an uncritical transfer of the previously used traditional format, that is, in-person, on campus, and on paper, to a virtual learning environment. Shelton et al. 
(<span>2020</span>) also warn about the possible de-humanization of learning along with assessment and, citing Callaghan (<span>1964</span>), highlight the risk of “<i>becoming swept up in the flawed cult of efficiency… with a crude focus on quick, standardized evaluation of student learning at scale</i>” (p. 125). It could be argued that oral examinations—whether online or face-to-face—are the very antithesis to mass assessment events, re-humanize examinations, and produce better outcomes (Houston et al., <span>2006</span>; Odafe, <span>2006</span>; Roecker, <span>2007</span>). Akimov and Malin (<span>2020</span>) claim that “literature that discusses oral examination in an online context is practically non-existent” (p. 5); Graf et al. (<span>2021</span>, p. 5) make a similar assertion. This is perhaps not quite the case since studies investigating the use of online video conferencing tools for assessment purposes appear to go back many years (Isbell & Winke, <span>2019</span>; Isbell et al., <span>2019</span>; Li & Link, <span>2018</span>; Newhouse & Cooper, <span>2013</span>; Okada et al., <span>2015</span>). However, for universities in the United Kingdom, online oral examinations were and still are a novelty.</p><p>The following discussion focuses on German courses within the institution-wide modern foreign language program (IWLP) at a UK university, spanning six proficiency levels (A1–C1/C2 of the Common European Framework of Reference for Languages CEFR). Such courses are called modules in the United Kingdom and go over two terms with 40 contact hours between October and March. Students take these modules for degree credit (factored into their overall year grade) or extra credit (recorded on their transcript but not part of their degree); content and assessment are identical in both. Assessment combines “take-home” coursework, a written, and an oral examination (both of which are compulsory). Each examination contributes about one-third to the overall module grade. Oral examinations are always conducted by two examiners and vary in length from 10 to 12 min for level one to 23 to 25 min for level six.</p><p>From February 2020 onward, normal operations of teaching and assessment were severely disrupted when COVID-19 case numbers in the United Kingdom increased drastically. Although it was possible to complete teaching face-to-face as scheduled on campus (as the academic year runs from October to June), examinations were a challenge for two reasons: First, a large number of students left for home in the third week of March just prior to when exams were to take place. Second, a large number of lecturers called in sick or declared that they were unable to attend for health and safety reasons. Both of these factors required that examinations be postponed and later moved to an online environment. It was decided to hold all oral examinations online on Teams (as the university's video conference tool of choice, and later Zoom became an option). The transition required the following steps: prompt communication with the students about the new modus operandi; staff training on the use of Teams; setting up of Teams exam meetings; evaluation and re-design of the exam content. Following intense staff training (partly as online workshops or mini-teaching sessions, partly as short mock exams among colleagues), the first oral examination was held for German (level two, A2 CEFR) within a few weeks.</p><p>The next step was the scheduling phase. 
Although traditionally, oral examinations have been associated with a heavy workload (Chambers & Richards, <span>2007</span>; Antwerp University's Centre for Expertise in Higher Education, <span>n.d</span>.) often due to the sheer numbers of candidates involved, the established use of shared, online spreadsheets for scheduling greatly reduced the administrative effort. Nevertheless, some administrative effort was required as individual Teams meetings had to be created.</p><p>A simple comparison of estimated support staff hours spent on preparing online versus on-campus oral exams shows significant time savings for the online format: a Teams meeting in Outlook can be created and sent to candidates and examiners in approximately 2 min with no further input from the admin team required. For the 742 candidates examined since the start of the pandemic, the total time would be just over 3 working days. For on-campus exams, which require candidate check-in, workspace allocation, invigilation, and so forth, in addition to the initial emailing, the total estimated time required is almost 8 working days. Another clear benefit of Teams meetings is the centrally stored video recordings for all examinations, which can be used for quality assurance purposes, that is, checking by external examiners, in potential student appeals or academic misconduct investigations. Apart from the time savings outlined above, the travel time (and money) saved both by candidates and examiners, is also worth mentioning.</p><p>Pertinent discussions of the COVID-19-induced shift to online assessment in general invariably include a focus on cheating (Nažnean, <span>2021</span>), ranging from reports of relatively small numbers to staggering percentages (Janke, et al., <span>2021</span>; Sokol, <span>2022</span>). Furthermore, it has to be borne in mind that, as Nažnean citing Wenzel and Reinhard (<span>2020</span>) puts it, “cases of cheating may be underreported due to the fact that people rarely admit to their own cheating” (p. 104). It is equally likely that educational establishments are reluctant to publish accurate statistics on academic misconduct for fear of reputational damage. This would strengthen the case for a return to traditional on-campus examinations in general. However, as far as online oral examinations are concerned, the German lecturing team reported no cases of academic misconduct in the first 2 years of operation. In 2021–2022, unfortunately, a few cases of two forms of academic misconduct came to light. Candidates either appeared to type words or phrases into translation software or run simultaneous interpreting software alongside the call. Students used stalling expressions, such as, “<i>Können Sie das bitte wiederholen</i>” or “<i>Entschuldigung, ich habe nicht verstanden</i>” to gain time. However, both crude cheating methods were very easily detected, students were asked to desist, or the examination would be terminated, and an investigation launched.</p><p>While it has to be acknowledged that online oral examinations are not cheat-proof, the format may probably, at least for the time being, be assumed to be more robust than written online examinations as the candidates are known to examiners and can be identified. Furthermore, the examination produces an authentic response.</p><p>Returning to the concept of oral examinations in general, on the one hand, concerns regarding their reliability and validity (Fulcher, <span>2015</span>; Memon et al., <span>2010</span>) are a clear disadvantage. 
However, on the other hand, some studies have shown that they can be regarded as a form of inclusive assessment (Huxhama et al., <span>2012</span>; Symonds, <span>2008</span>), and there is evidence that an online oral examination format supports anxious students better (Theobold, <span>2021</span>, Waterford & West, <span>2006</span>) and benefits students with a physical disability (Basilaia & Kvavadze, <span>2020</span>).</p><p>Following a review of the instructor and student experiences with the online oral examination format and its pros and cons, it was decided to continue its use within the IWLP for the foreseeable future.</p>\",\"PeriodicalId\":43693,\"journal\":{\"name\":\"Unterrichtspraxis-Teaching German\",\"volume\":\"56 1\",\"pages\":\"53-57\"},\"PeriodicalIF\":0.6000,\"publicationDate\":\"2023-05-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://onlinelibrary.wiley.com/doi/epdf/10.1111/tger.12232\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Unterrichtspraxis-Teaching German\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://onlinelibrary.wiley.com/doi/10.1111/tger.12232\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"0\",\"JCRName\":\"LANGUAGE & LINGUISTICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Unterrichtspraxis-Teaching German","FirstCategoryId":"1085","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1111/tger.12232","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"0","JCRName":"LANGUAGE & LINGUISTICS","Score":null,"Total":0}
Testing students’ proficiency in an oral, face-to-face setting has long been a central part of education in the United Kingdom in many disciplines, from medicine to modern foreign languages. Uses range from admission interviews to PhD defenses. However, the Coronavirus (COVID-19) pandemic prompted a re-evaluation of this assessment method and provided an opportunity for online implementation. There is a wealth of literature on online assessment in general as well as on its increased use in response to the pandemic (Butler-Henderson & Crawford, 2020; Clark et al., 2020; Montenegro-Rueda, 2021; Pokhrel & Chhetri, 2021; Jadav, 2022), including guidelines (Haus et al., 2020; Tuah & Naing, 2020) to help prevent an uncritical transfer of the previously used traditional format, that is, in-person, on campus, and on paper, to a virtual learning environment. Shelton et al. (2020) also warn about the possible de-humanization of learning along with assessment and, citing Callaghan (1964), highlight the risk of “becoming swept up in the flawed cult of efficiency… with a crude focus on quick, standardized evaluation of student learning at scale” (p. 125). It could be argued that oral examinations, whether online or face-to-face, are the very antithesis of mass assessment events, re-humanize examinations, and produce better outcomes (Houston et al., 2006; Odafe, 2006; Roecker, 2007). Akimov and Malin (2020) claim that “literature that discusses oral examination in an online context is practically non-existent” (p. 5); Graf et al. (2021, p. 5) make a similar assertion. This is perhaps not quite the case, since studies investigating the use of online video conferencing tools for assessment purposes appear to go back many years (Isbell & Winke, 2019; Isbell et al., 2019; Li & Link, 2018; Newhouse & Cooper, 2013; Okada et al., 2015). However, for universities in the United Kingdom, online oral examinations were and still are a novelty.
The following discussion focuses on German courses within the institution-wide modern foreign language program (IWLP) at a UK university, spanning six proficiency levels (A1–C1/C2 of the Common European Framework of Reference for Languages, CEFR). Such courses are called modules in the United Kingdom and run over two terms, with 40 contact hours between October and March. Students take these modules for degree credit (factored into their overall year grade) or extra credit (recorded on their transcript but not part of their degree); content and assessment are identical in both. Assessment combines “take-home” coursework with a written and an oral examination (both compulsory). Each examination contributes about one-third to the overall module grade. Oral examinations are always conducted by two examiners and vary in length from 10 to 12 min at level one to 23 to 25 min at level six.
From February 2020 onward, normal operations of teaching and assessment were severely disrupted when COVID-19 case numbers in the United Kingdom increased drastically. Although it was possible to complete teaching face-to-face as scheduled on campus (as the academic year runs from October to June), examinations were a challenge for two reasons: First, a large number of students left for home in the third week of March, just before the exams were due to take place. Second, a large number of lecturers called in sick or declared that they were unable to attend for health and safety reasons. Both of these factors required that examinations be postponed and later moved to an online environment. It was decided to hold all oral examinations online via Teams, the university's video conferencing tool of choice (Zoom later became an option). The transition required the following steps: prompt communication with the students about the new modus operandi; staff training on the use of Teams; setting up of Teams exam meetings; and evaluation and re-design of the exam content. Following intensive staff training (partly as online workshops or mini-teaching sessions, partly as short mock exams among colleagues), the first oral examination, for German (level two, A2 CEFR), was held within a few weeks.
The next step was the scheduling phase. Although, traditionally, oral examinations have been associated with a heavy workload (Chambers & Richards, 2007; Antwerp University's Centre for Expertise in Higher Education, n.d.), often due to the sheer number of candidates involved, the established use of shared online spreadsheets for scheduling greatly reduced the administrative effort. Nevertheless, some administrative effort was required, as individual Teams meetings had to be created.
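To make the spreadsheet-driven workflow more concrete, the following minimal Python sketch shows one way a shared schedule could be turned into individual calendar invites. It is an illustration only, not the workflow described in the article; the CSV column names and file names are hypothetical.

```python
# Minimal sketch only: turn a shared scheduling spreadsheet (exported as CSV)
# into one calendar invite (.ics) per oral-exam slot. The column names
# ("candidate", "email", "start", "minutes") and file names are hypothetical;
# examiner attendees could be added in the same way. A production invite would
# also use CRLF line endings and UTC timestamps, as the iCalendar spec requires.
import csv
from datetime import datetime, timedelta
from pathlib import Path

ICS_TEMPLATE = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//IWLP oral exams//EN
BEGIN:VEVENT
UID:{uid}
DTSTAMP:{stamp}
DTSTART:{start}
DTEND:{end}
SUMMARY:Oral examination: {candidate}
ATTENDEE:mailto:{email}
END:VEVENT
END:VCALENDAR
"""


def fmt(dt: datetime) -> str:
    """Format a datetime in the basic iCalendar form (floating local time)."""
    return dt.strftime("%Y%m%dT%H%M%S")


def make_invites(schedule_csv: str, out_dir: str = "invites") -> None:
    """Write one .ics file per row of the schedule spreadsheet."""
    Path(out_dir).mkdir(exist_ok=True)
    with open(schedule_csv, newline="", encoding="utf-8") as fh:
        for i, row in enumerate(csv.DictReader(fh)):
            start = datetime.fromisoformat(row["start"])           # e.g. "2022-05-10T09:00"
            end = start + timedelta(minutes=int(row["minutes"]))   # 10-25 min depending on level
            ics = ICS_TEMPLATE.format(
                uid=f"oral-exam-{i}@example.org",
                stamp=fmt(datetime.utcnow()),
                start=fmt(start),
                end=fmt(end),
                candidate=row["candidate"],
                email=row["email"],
            )
            Path(out_dir, f"{row['candidate']}.ics").write_text(ics, encoding="utf-8")


if __name__ == "__main__":
    make_invites("oral_exam_schedule.csv")
```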
A simple comparison of estimated support staff hours spent on preparing online versus on-campus oral exams shows significant time savings for the online format: a Teams meeting in Outlook can be created and sent to candidates and examiners in approximately 2 min, with no further input from the admin team required. For the 742 candidates examined since the start of the pandemic, the total time would be just over 3 working days. For on-campus exams, which require candidate check-in, workspace allocation, invigilation, and so forth, in addition to the initial emailing, the total estimated time required is almost 8 working days. Another clear benefit of Teams meetings is the centrally stored video recordings of all examinations, which can be used for quality assurance purposes, that is, for checking by external examiners, and in potential student appeals or academic misconduct investigations. Apart from the time savings outlined above, the travel time (and money) saved by both candidates and examiners is also worth mentioning.
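As a rough check on these figures, the back-of-the-envelope calculation below reproduces the estimates; the assumption of an 8-hour working day is not stated in the text.

```python
# Back-of-the-envelope check of the admin-time comparison above.
# Assumption (not stated in the article): one working day = 8 hours.
CANDIDATES = 742
ONLINE_MIN_PER_CANDIDATE = 2   # creating and sending one Teams meeting in Outlook
ONCAMPUS_TOTAL_DAYS = 8        # estimate quoted in the text ("almost 8 working days")
HOURS_PER_DAY = 8              # assumed

online_days = CANDIDATES * ONLINE_MIN_PER_CANDIDATE / 60 / HOURS_PER_DAY
implied_oncampus_min = ONCAMPUS_TOTAL_DAYS * HOURS_PER_DAY * 60 / CANDIDATES

print(f"Online format:  {online_days:.1f} working days")  # ~3.1, i.e. "just over 3 working days"
print(f"On-campus:      about {ONCAMPUS_TOTAL_DAYS} working days, "
      f"or roughly {implied_oncampus_min:.1f} min of admin per candidate")
```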
Pertinent discussions of the COVID-19-induced shift to online assessment in general invariably include a focus on cheating (Nažnean, 2021), ranging from reports of relatively small numbers to staggering percentages (Janke et al., 2021; Sokol, 2022). Furthermore, it has to be borne in mind that, as Nažnean (2021), citing Wenzel and Reinhard (2020), puts it, “cases of cheating may be underreported due to the fact that people rarely admit to their own cheating” (p. 104). It is equally likely that educational establishments are reluctant to publish accurate statistics on academic misconduct for fear of reputational damage. This would strengthen the case for a return to traditional on-campus examinations in general. However, as far as online oral examinations are concerned, the German lecturing team reported no cases of academic misconduct in the first 2 years of operation. In 2021–2022, unfortunately, a few cases of two forms of academic misconduct came to light: candidates either appeared to type words or phrases into translation software or to run simultaneous interpreting software alongside the call. Students used stalling expressions, such as “Können Sie das bitte wiederholen” (“Could you please repeat that?”) or “Entschuldigung, ich habe nicht verstanden” (“Sorry, I did not understand”), to gain time. However, both crude cheating methods were easily detected; students were asked to desist, failing which the examination would be terminated and an investigation launched.
While it has to be acknowledged that online oral examinations are not cheat-proof, the format can, at least for the time being, be assumed to be more robust than written online examinations, as the candidates are known to the examiners and can be identified. Furthermore, the examination produces an authentic response.
Returning to the concept of oral examinations in general: on the one hand, concerns regarding their reliability and validity (Fulcher, 2015; Memon et al., 2010) are a clear disadvantage. On the other hand, some studies have shown that they can be regarded as a form of inclusive assessment (Huxham et al., 2012; Symonds, 2008), and there is evidence that an online oral examination format better supports anxious students (Theobold, 2021; Waterford & West, 2006) and benefits students with a physical disability (Basilaia & Kvavadze, 2020).
Following a review of instructors' and students' experiences with the online oral examination format, and of its pros and cons, it was decided to continue its use within the IWLP for the foreseeable future.