
IS COHEN'S KAPPA A GOOD MEASURE OF AGREEMENT IN TRADITIONAL CHINESE MEDICINE?


Abstract


The Four Diagnostic Methods (FDM) and the Eight Principal Syndromes (EPS) are the two pillars of syndrome differentiation in Traditional Chinese Medicine (TCM) diagnosis. Syndrome identification is the main characteristic of TCM diagnosis, whereas Western medicine focuses mainly on disease identification. Therefore, choosing a suitable measure of agreement for TCM diagnoses has become an increasingly important issue in the study of TCM. For inter-rater agreement, the well-known Cohen's kappa coefficient is a popular measure. However, when the observed proportion of agreement is extremely high or extremely low, Cohen's kappa may underestimate or overestimate the degree of agreement. In this paper, we propose a novel and simple corrected measure of agreement to remedy this drawback. To demonstrate that the corrected kappa is a better alternative to Cohen's kappa in TCM diagnosis, we recruited twenty participants in a TCM health evaluation program and collected the corresponding tongue, palpation, and hybrid diagnosis data. Five experienced TCM physicians from the Department of Traditional Chinese Medicine, Changhua Christian Hospital, Taiwan, were then asked to classify the participants' syndromes according to the EPS (CCH IRB No. 140704). The results show that the proposed corrected measure is more reasonable than Cohen's kappa in most TCM cases.
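The kappa behaviour described above can be made concrete with a small numerical sketch. The Python snippet below is illustrative only: the table values, function names, and the use of the Brennan and Prediger (1981) coefficient from the reference list are assumptions for demonstration, not the corrected measure proposed in the paper (which is not specified in this abstract). It computes Cohen's kappa, kappa = (p_o - p_e) / (1 - p_e), for a 2 x 2 rating table with 90% observed agreement but highly skewed marginals, and contrasts it with a chance correction that does not depend on the marginals.

import numpy as np

def cohen_kappa(table):
    # Cohen's (1960) kappa for a k x k inter-rater contingency table.
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_o = np.trace(table) / n                                # observed agreement
    p_e = (table.sum(axis=0) / n) @ (table.sum(axis=1) / n)  # chance agreement from the raters' marginals
    return (p_o - p_e) / (1.0 - p_e)

def brennan_prediger_kappa(table):
    # Brennan & Prediger's (1981) kappa_n: chance agreement fixed at 1/k, independent of marginals.
    table = np.asarray(table, dtype=float)
    k = table.shape[0]
    p_o = np.trace(table) / table.sum()
    return (p_o - 1.0 / k) / (1.0 - 1.0 / k)

# Hypothetical example: two physicians classify 100 cases as "syndrome present" / "absent".
# They agree on 90 of 100 cases, but both rate "present" 95% of the time.
skewed_table = [[90, 5],
                [5,  0]]
print(cohen_kappa(skewed_table))             # about -0.05 despite 90% observed agreement
print(brennan_prediger_kappa(skewed_table))  # 0.80, in line with the observed agreement

With 90 of 100 cases rated "present" by both physicians, p_o = 0.90 but p_e = 0.905, so Cohen's kappa is slightly negative; this is the "high agreement but low kappa" paradox analysed by Cicchetti and Feinstein (1990), whereas the marginal-free correction stays close to the observed agreement.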

Parallel Abstract


The Four Diagnostic Methods and the Eight Principal Syndromes are the two pillars of Traditional Chinese Medicine (TCM) diagnosis. Unlike Western medicine, which focuses on identifying diseases, TCM focuses on identifying syndromes; consequently, TCM physicians with different training backgrounds may reach different diagnostic conclusions, and a suitable measure of the agreement among different physicians' diagnoses has become increasingly important. Cohen's kappa is the most widely used measure of agreement between pairs of raters; however, when the observed proportion of agreement is extremely high or extremely low, Cohen's kappa often underestimates or overestimates the true agreement. In this study, we propose a simple new measure to remedy this drawback of Cohen's kappa. To show that our measure is more reasonable, we used the tongue and pulse diagnosis data of twenty adults who underwent a TCM health examination at the Department of Traditional Chinese Medicine, Changhua Christian Hospital, invited five professional TCM physicians to read these twenty cases based on tongue and pulse diagnosis and the Eight Principal Syndromes, and performed an agreement analysis of their readings of the Eight Principal Syndromes (yin, yang, cold, heat, exterior, interior, deficiency, and excess). The results show that, in most situations, our proposed measure is considerably more reasonable than Cohen's kappa.

Keywords

Traditional Chinese Medicine; agreement analysis; Cohen's kappa

References


Artstein, R., & Poesio, M. (2005). Kappa3 (NLE Technical Note, Vol. 05-1). Colchester: University of Essex.
Bakeman, R., Quera, V., McArthur, D., & Robinson, B. F. (1997). Detecting sequential patterns and determining their reliability with fallible observers. Psychological Methods, 2(4), 357-370.
Brennan, R. L., & Prediger, D. J. (1981). Coefficient kappa: Some uses, misuses, and alternatives. Educational and Psychological Measurement, 41, 687-699.
Cicchetti, D. V., & Feinstein, A. R. (1990). High agreement but low kappa: II. Resolving the paradoxes. Journal of Clinical Epidemiology, 43, 551-558.
Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20, 213-220.
