Due to globalization, many graduate programs in Taiwan offer English medium instruction (EMI) to encourage enrollment of international students and to prepare local students for competitive job markets. These programs require evaluation of English oral proficiency for student admission purposes. Current standardized speaking tests are designed to assess general proficiency and are scored manually, which is costly and time-consuming. The present paper describes the development of a computer-assisted speaking test scored using Automatic Speech Recognition technology. The new speaking test was administered along with existing standardized speaking, reading, and listening tests to a group of science and engineering graduate students before and after a six-week summer intensive course in English for specific academic purposes (ESAP). The results show that the new speaking test was highly correlated with the full-length manually scored general speaking test and with the ESAP reading and listening tests taken before the course. The new speaking test was also correlated with the ESAP reading and listening tests taken after the course. The students also made significant progress in speaking and reading in their professional areas. The student perception questionnaires indicated that the speaking test was well received by the test takers. This study contributes to an improved understanding of the design and implementation of computer-assisted tests used to assess content-related speaking skills.