
Graduate student: 王成元
Thesis title: Using a Cognitive Diagnostic Model to Analyze the TIMSS 2007 Mathematics Achievement of Eighth-Grade Students in Taiwan and Four Asian Countries (Regions): The DINA Model as an Example
Advisor: 蔡蓉青
Degree: Master
Department: Department of Mathematics
Year of publication: 2012
Graduation academic year: 100 (ROC calendar)
Language: Chinese
Pages: 240
Chinese keywords: TIMSS 2007、認知診斷模型、DINA
English keywords: TIMSS 2007, cognitive diagnostic model, DINA
Document type: Academic thesis
Usage: viewed 436 times, downloaded 47 times
    This study uses the DINA model, a cognitive diagnostic model, to analyze the cognitive attributes required to solve the items of the TIMSS 2007 eighth-grade mathematics achievement test, in order to understand and compare the attribute mastery of eighth-grade students in Taiwan and four Asian countries (regions). Based on the items in Test Booklet 4, the study constructed a Q-matrix framework of 20 cognitive attributes in three broad categories: content, process, and skill/item type. The sample comprised 290 students from Taiwan, 301 from Korea, 306 from Singapore, 243 from Hong Kong, and 301 from Japan. The main findings are as follows:
    1. In Taiwan and the four Asian countries (regions), at least half of the students mastered most cognitive attributes. Taiwan's mastery was significantly better than that of some countries (regions) for attributes in the dimensions of "number", "algebra", "applying knowledge of number, quantity, and shape to computation or judgment", "probability, statistics, and reading comprehension", "mathematical thinking", and "item characteristics", but for some attributes in the "probability, statistics, and reading comprehension" dimension Taiwan's mastery was significantly worse than Japan's and Korea's.
    2. Grouping the attributes and examining the distribution of attribute patterns showed that: (1) for "number", "algebra", and "probability, statistics, and reading comprehension", a relative majority of students mastered all related attributes; (2) for "geometry", a relative majority of students mastered either all or none of the related attributes, a bimodal pattern; (3) for "mathematical thinking", a relative majority of students mastered all related attributes, mastered none, or failed to master only the attribute "analytical thinking"; (4) for "item characteristics", a relative majority of students mastered all related attributes, mastered none, or mastered only the attribute "open-ended items"; (5) Taiwan performed best in "algebra" and "item characteristics" and worst in "probability, statistics, and reading comprehension"; (6) Taiwan's geometry performance was slightly bimodal, with more students than in most countries (regions) mastering either all or none of the related attributes.
    3. Regarding mastery of all attributes an item requires, Taiwan did better on items in the number and algebra dimensions than on those in the geometry and probability-and-statistics dimensions, while the four Asian countries (regions) did better in the number, algebra, and probability-and-statistics dimensions than in geometry. In addition, Taiwan's mastery was similar to Korea's; Taiwan's advantage over Singapore, Hong Kong, and Japan was greatest on algebra items, next greatest on number and geometry items, and least evident on probability-and-statistics items.

    This study focuses on the cognitive attributes required for solving the TIMSS 2007 eighth-grade mathematics items. It conducts an analysis with the DINA model, a cognitive diagnostic model, in order to understand and compare the mastery of cognitive attributes among eighth-grade students in Taiwan and four other Asian countries (regions). Based on the items in Test Booklet 4, the study constructed a Q-matrix framework covering 20 cognitive attributes, categorized into three major groups: "content", "process", and "skill/item type". The sample comprises 290 examinees from Taiwan, 301 from Korea, 306 from Singapore, 243 from Hong Kong, and 301 from Japan. The major findings are as follows:
    1) In Taiwan and the four other Asian countries (regions), at least half of the students show mastery of most cognitive attributes. Taiwanese students significantly outperform some countries (regions) on attributes in the dimensions of "number", "algebra", "applying knowledge of number, quantity, and shape to computation or judgment", "probability, statistics, and reading comprehension", "mathematical thinking", and "characteristics of items", while they underperform Japan and Korea on some attributes in "probability, statistics, and reading comprehension".
    2) Examining the distribution of attribute patterns after grouping the attributes shows that: (1) for "number", "algebra", and "probability, statistics, and reading comprehension", a relative majority of students master all related attributes; (2) for "geometry", a relative majority of students master either all or none of the related attributes, a bimodal pattern; (3) for "mathematical thinking", a relative majority of students master all related attributes, master none, or fail to master only the attribute "analytical thinking"; (4) for "characteristics of items", a relative majority of students master all related attributes, master none, or master only the attribute "open-ended items"; (5) Taiwan performs best in "algebra" and "characteristics of items" and worst in "probability, statistics, and reading comprehension"; (6) Taiwan shows a slightly bimodal pattern in "geometry", with more students mastering either all or none of the related attributes than in most other countries (regions).
    3) Regarding mastery of all attributes an item requires, Taiwan performs better on "number" and "algebra" items than on "geometry" and "probability and statistics" items, while the four other Asian countries (regions) perform better in "number", "algebra", and "probability and statistics" than in "geometry". In addition, Taiwanese students show mastery similar to Korean students'; their advantage over students from Singapore, Hong Kong, and Japan is greatest in "algebra", next greatest in "number" and "geometry", and least evident in "probability and statistics".
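    The DINA ("deterministic inputs, noisy 'and' gate") model the abstract describes predicts a correct answer only when an examinee masters every attribute the item's Q-matrix row requires, up to slip and guessing parameters. A minimal sketch of this item response function (the attribute vectors and parameter values below are hypothetical illustrations, not taken from the thesis):

```python
def dina_correct_prob(alpha, q_row, slip, guess):
    """P(correct answer) under the DINA model for one examinee and one item.

    alpha : 0/1 list, the examinee's attribute-mastery vector
    q_row : 0/1 list, the item's Q-matrix row (attributes the item requires)
    slip  : probability of answering wrongly despite mastering all required attributes
    guess : probability of answering correctly without mastering them all
    """
    # eta = 1 only if every required attribute (q = 1) is mastered (a = 1)
    eta = all(a >= q for a, q in zip(alpha, q_row))
    return 1 - slip if eta else guess

# Hypothetical item requiring attributes 1 and 3 (of 3)
q_row = [1, 0, 1]
print(dina_correct_prob([1, 1, 1], q_row, slip=0.1, guess=0.2))  # masters all: 0.9
print(dina_correct_prob([1, 1, 0], q_row, slip=0.1, guess=0.2))  # lacks attribute 3: 0.2
```

    Estimating the slip and guessing parameters from response data (e.g., by EM) is beyond this sketch; the point is only that mastering all required attributes raises the success probability from `guess` to `1 - slip`.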

    Chinese Abstract
    English Abstract
    Contents
    List of Figures
    List of Tables
    Chapter 1  Introduction
      1.1  Research motivation
      1.2  Research purposes and questions
      1.3  Definition of terms
      1.4  Scope and limitations
    Chapter 2  Literature Review
      2.1  Overview of the Trends in International Mathematics and Science Study (TIMSS)
      2.2  Cognitive diagnostic models
      2.3  The cognitive attribute set
    Chapter 3  Research Design and Implementation
      3.1  Research framework
      3.2  Participants
      3.3  Instruments and procedures
      3.4  Data analysis methods
    Chapter 4  Results
      4.1  Analysis and comparison of Taiwanese and the four Asian countries' (regions') students' mastery of each cognitive attribute required to solve the TIMSS 2007 mathematics items
      4.2  Analysis and comparison of the distribution of attribute patterns over the required cognitive attributes
      4.3  Analysis and comparison of students' mastery of all cognitive attributes required to solve the items
    Chapter 5  Conclusions and Suggestions
      5.1  Conclusions
      5.2  Suggestions
    References
    Appendices 1-7

