
Sharing a Teaching Experience in Medical Radiation Technology on Rater Consistency of Direct Observation of Procedural Skills

The Experience on Radiological Training for Consistency of Direct Observation of Procedural Skills

Abstract


This study used the Direct Observation of Procedural Skills (DOPS) checklist to establish standardized scoring criteria, to verify the reliability and validity of the checklist, and to conduct rater-consistency training, with the aim of improving teaching effectiveness and learning outcomes. The reliability and validity of the checklists for two assessment tasks (mask fabrication and lead-alloy shielding block fabrication) were compared, and rater-consistency training results were collected from 15 clinical teachers who scored a standard teaching video, as well as side-shot videos of 25 medical interns and 4 PGY trainees. The results showed: (1) For both assessment tasks, the percentage agreement on the record-review form reached 80%, and the content was judged correct and appropriate, indicating that the checklist has expert validity; the Cronbach's α coefficients were 0.848 and 0.787, respectively, meeting the accepted standard for checklist reliability. (2) The Pearson correlation coefficients between pre- and post-test scores for the standard teaching video were 0.83 (P < 0.005) and 0.87 (P < 0.001), showing a significant positive correlation among the 15 raters and good consistency. (3) Reliability analysis of the medical interns' side-shot videos: Kappa (K) agreement coefficients of 0.83 and 0.87; Kendall's W values of 0.328 (P < 0.01) and 0.032 (P < 0.01); intraclass correlation coefficients (ICC, α) of 0.83 and 0.87. (4) Reliability analysis of the PGY trainees' side-shot videos: Kappa (K) agreement coefficients of 0.92 and 0.86; Kendall's W values of 0.45 (P = 0.00) and 0.39 (P = 0.00); intraclass correlation coefficients (ICC, α) of 0.92 and 0.86. All results indicate good inter-rater consistency.
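The abstract reports Cronbach's α coefficients of 0.848 and 0.787 as evidence of checklist reliability. As a minimal sketch of how this statistic is computed from an examinee-by-item score matrix (the ratings below are hypothetical toy data, not the study's data):

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for rows of examinee scores, one column per checklist item.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(scores[0])                       # number of checklist items
    items = list(zip(*scores))               # transpose: one tuple per item
    item_var = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(row) for row in scores])
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical DOPS checklist ratings (5 examinees x 4 items, 1-5 scale)
ratings = [
    [4, 4, 5, 4],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
    [2, 3, 3, 2],
    [4, 5, 4, 4],
]
print(round(cronbach_alpha(ratings), 3))  # 0.929 for this toy data
```

Values of roughly 0.7 or above are conventionally taken as acceptable internal consistency, which is the criterion the abstract's 0.848 and 0.787 are measured against.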

Keywords

Reliability Validity Consistency

Parallel Abstract


The purpose of this study was to formulate standardized scoring criteria with the Direct Observation of Procedural Skills (DOPS) checklist, to establish the reliability and validity of the checklist and rater-consistency training, and to improve teaching and learning outcomes. The reliability and validity of the checklists for two assessment tasks (mask making and block making) were compared, and consistency-training results were collected from 15 clinical teachers who watched the standard training video, together with side-shot videos made by 25 medical interns and 4 PGY trainees. The results show: (1) The percentage agreement on the record checklist reached 80% for both tasks, and the content was rated correct and suitable, indicating that the checklist has expert validity; the Cronbach's α coefficients were 0.848 and 0.787, respectively, reaching the acceptable criterion for a reliable checklist. (2) Pre- and post-test scores for the standard training video had Pearson correlation coefficients of 0.83 (P < 0.005) and 0.87 (P < 0.001), showing a significant positive correlation among the 15 raters and good consistency. (3) Reliability analysis of the side-shot videos made by the medical interns: Kappa (K) coefficients of agreement of 0.83 and 0.87; Kendall's W values of 0.328 (P < 0.01) and 0.032 (P < 0.01); intraclass correlation coefficients (ICC, α) of 0.83 and 0.87. (4) Reliability analysis of the side-shot videos made by the PGY trainees: Kappa (K) coefficients of agreement of 0.92 and 0.86; Kendall's W values of 0.45 (P = 0.00) and 0.39 (P = 0.00); intraclass correlation coefficients (ICC, α) of 0.92 and 0.86. All results indicate good inter-rater consistency.
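The Kappa agreement coefficients reported above quantify inter-rater agreement beyond chance. A minimal sketch of Cohen's kappa for two raters follows (the categorical grades below are hypothetical examples, not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement rate and p_e is the agreement expected by chance, computed
    from each rater's marginal category frequencies."""
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(count_a[c] * count_b[c] for c in count_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical DOPS grades (1 = below, 2 = meets, 3 = exceeds expectations)
rater_a = [1, 1, 2, 2, 3, 3, 1, 2]
rater_b = [1, 1, 2, 3, 3, 3, 1, 1]
print(round(cohens_kappa(rater_a, rater_b), 3))  # 0.628 for this toy data
```

By the common rule of thumb, kappa above about 0.8 indicates almost perfect agreement, which is the range the abstract's values (0.83-0.92) fall into.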

Parallel Keywords

Reliability Validity Consistency
