MOOC environments generate an enormous volume of assignments, leading some instructors to adopt peer assessment mechanisms. Prior research has shown that analytical rubrics are a useful and practical evaluation tool for making learners' scoring fairer and more diagnostically meaningful. This study constructed a rubric scale for an animation course on a MOOC learning platform, aiming to help learners understand the assignment scoring criteria and to encourage raters to remain objective, rational, and fair. The participants were learners who enrolled in the elective "2D Animation Production" and submitted at least four assignments. The analysis showed that, after a period of learning, the peer raters' scores converged with the experts' scores, and the raters' scores exhibited high validity. This study confirms that analytical rubrics alleviate the problems of peer assessment, while also enabling learners to self-examine their work against the learning dimensions in the rubric, thereby improving their scoring ability.
The prevalence of massive open online courses (MOOCs) has forced teachers to rely on peer assessment systems to cope with the increased grading workload. A number of past studies have indicated that analytical rubrics are a favorable approach to maintaining scoring fairness and diagnosing learners' learning performance. However, few studies have examined scoring rubrics for design-oriented courses on MOOCs. In this paper, we developed a scoring rubric scale for an applied animation course on a MOOC platform, aiming to help learners understand the scoring standards for their assignments while ensuring that raters remain objective, rational, and fair. The data analyzed in this paper were peer assessment scores and interview outcomes. The participants were learners who took "2D Animation Production" as an elective course and submitted at least four assignments. The findings showed that the correlation between the raters' scores and the experts' scores increased over time to a high level, suggesting growing consistency between the scores produced by the raters and those produced by the experts. This paper verifies that analytical rubrics can effectively resolve the fairness problem of peer assessment. Learners can also use the learning dimensions contained in the rubric to perform self-assessments and improve their scoring abilities.