
Graduate Student: Wu, Huang-Ching (吳皇慶)
Thesis Title: MOST for LESS: The Next Generation of Scientific Literacy Assessment (新世代多媒體線上科學素養評量的發展與應用)
Advisor: Chang, Chun-Yen (張俊彥)
Degree: Doctorate
Department: Department of Earth Sciences
Year of Publication: 2019
Academic Year of Graduation: 107 (2018–2019)
Language: English
Number of Pages: 76
Chinese Keywords: 科學素養 (scientific literacy), 探究能力 (inquiry ability), 知識信念 (epistemic beliefs), 評量 (assessment)
English Keywords: Scientific literacy, Inquiry ability, Epistemic beliefs, Assessment
DOI: http://doi.org/10.6345/NTNU201900382
Document Type: Academic thesis
Usage Statistics: Views: 125; Downloads: 35
  • Abstract (translated from the Chinese): In this study, the author developed a multimedia online scientific literacy assessment for sixth-grade students (MOST for LESS). The work comprised the following: (1) discussing the background of science education reform in Taiwan; (2) introducing and describing the features of the natural sciences curriculum guidelines of Taiwan's 12-Year Basic Education and their alignment with related international research; (3) evaluating the relative strengths and limitations of existing assessment items; (4) describing the item framework of MOST for LESS; (5) conducting a series of item analyses to establish validity; and (6) using the items to assess sixth graders' scientific literacy. Drawing on the literature on scientific literacy assessment and adopting the natural sciences curriculum guidelines as the item framework, the researcher ultimately developed 33 multimedia online items measuring students' inquiry abilities, together with 20 questionnaire items measuring students' attitudes toward science and views of the nature of science. The results indicate that MOST for LESS has acceptable reliability, validity, difficulty, and discrimination. The study also found a direct causal relationship between students' inquiry abilities (including problem-solving ability and thinking skills) and their epistemic beliefs. Although MOST for LESS exhibits acceptable item quality, further research is needed to establish the test's validity. Moreover, complementing it with more diverse assessment methods in the future would yield more complete and accurate measures of scientific literacy and enhance the study's reference value. Finally, beyond offering the domestic and international science education communities a reference module for developing scientific literacy assessments for sixth graders, this study shows that guiding sixth graders to develop scientific epistemic beliefs can help them demonstrate stronger inquiry abilities.

    In this dissertation, the author (a) discussed the background of science education reform in Taiwan, (b) introduced and described the features of the new science curriculum guidelines (NSCG), (c) evaluated the relative strengths and limitations of existing assessments, (d) described a framework for aligning assessment with the NSCG, (e) conducted a series of pilot studies for item analysis, and (f) examined 6th graders' scientific literacy (SL) with the validated assessment. After reviewing the relevant literature on SL and drawing on the framework of Taiwan's NSCG, the researcher developed a multimedia online scientific test for literacy of elementary school students (MOST for LESS). The instrument contains 33 multimedia online items measuring students' inquiry abilities and 20 items measuring students' attitudes toward science and beliefs about the nature of science. The test results indicate a direct causal link between students' scientific inquiry (including problem-solving and thinking abilities) and their epistemic beliefs (EB). Although MOST for LESS meets accepted standards of reliability and validity, it needs further validation, for example by applying it directly to SL-oriented curricula, to establish its capacity to measure students' SL. In the future, MOST for LESS results should be complemented with other relevant assessment methods to provide an accurate evaluation of students' SL. MOST for LESS will be applied in Taiwan's science education community to help students develop SL, while also offering the international science education community a more convenient, low-burden, and accurate way of assessing primary students' SL.
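    The item analyses mentioned above rest on three standard psychometric quantities: internal-consistency reliability (Cronbach's alpha), item difficulty (the proportion of correct responses), and item discrimination (the gap in correct rates between high- and low-scoring groups). The sketch below is an illustrative reconstruction of such an analysis for dichotomously scored items, not the dissertation's actual code; the function name and the upper/lower 27% grouping convention are assumptions.

    ```python
    # Illustrative item analysis for a 0/1-scored test (assumed sketch,
    # not the dissertation's pipeline).
    from statistics import variance

    def item_analysis(scores):
        """scores: list of student rows, each a list of 0/1 item scores."""
        n_students, n_items = len(scores), len(scores[0])
        totals = [sum(row) for row in scores]
        # Cronbach's alpha = k/(k-1) * (1 - sum of item variances / total-score variance)
        item_vars = [variance([row[i] for row in scores]) for i in range(n_items)]
        alpha = n_items / (n_items - 1) * (1 - sum(item_vars) / variance(totals))
        # Difficulty: proportion of students answering each item correctly.
        difficulty = [sum(row[i] for row in scores) / n_students
                      for i in range(n_items)]
        # Discrimination: correct rate in the top 27% of total scores
        # minus the rate in the bottom 27% (a common convention).
        k = max(1, round(0.27 * n_students))
        ranked = sorted(scores, key=sum)
        lower, upper = ranked[:k], ranked[-k:]
        discrimination = [
            sum(r[i] for r in upper) / k - sum(r[i] for r in lower) / k
            for i in range(n_items)
        ]
        return alpha, difficulty, discrimination
    ```

    On real response data, alpha above roughly 0.7, difficulty between about 0.3 and 0.8, and discrimination above about 0.3 are conventional thresholds for "acceptable" item quality of the kind the abstract reports.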

    Table of Contents: Note on contributors (p. 1), Acknowledgment (p. 2), Abstract in Chinese (p. 3), Epitome (p. 4), Prologue (p. 5), Episode 1 (p. 7), Episode 2 (p. 25), Episode 3 (p. 53), Epilogue (p. 76)

