Tahsin OĞUZ BAŞOKÇU
(Ege University, Faculty of Education, Department of Educational Sciences, Division of Measurement and Evaluation in Education, İzmir, Turkey)
Tuncay ÖĞRETMEN
(No address provided)
Erdinç ÇAKIROĞLU
(Middle East Technical University, Ankara, Turkey)
Vefa BARDAKÇI
(No address provided)
Bünyamin YURDAKUL
(No address provided)
Gözde AKYÜZ
(No address provided)
Project Group: TÜBİTAK SOBAG
Page Count: 154
Project No: 115K531
Project Completion Date: 15.11.2018
Language: Turkish

A Model Proposal for Improving Turkey's Mathematics Achievement in International Large-Scale Assessments: The Effectiveness of a Cognitive Diagnosis-Based Monitoring Model
The main aim of this project is to improve Turkey's mathematics achievement in international large-scale assessments (PISA, TIMSS, etc.) by designing instruments that jointly measure content knowledge and cognitive process skills in 6th-grade mathematics, determining students' mastery of this knowledge and these skills with effective statistical methods, and monitoring the development of achievement by giving teachers and students feedback on the identified learning deficiencies and their sources. The project was carried out using an experimental design. The sample comprised 4,562 students selected via stratified random cluster sampling from 148 classes in 20 schools across 10 districts of İzmir. As the design required, the sample was divided into Experiment-1, Experiment-2, and Control groups. Experiment-1 consisted of 1,481 students in 51 classes of 8 schools; Experiment-2 of 1,433 students in 42 classes of 5 schools; and the Control group of 1,678 students in 55 classes of 7 schools. In addition to the monitoring tests, Experiment-1 students received detailed feedback on learning objectives, cognitive processes, and errors based on cognitive diagnosis model (BTM) analyses. Experiment-2 was formed to isolate the testing effect: its students took the monitoring tests but received feedback only on their total scores. Control-group students took only the pre- and post-tests. Thus, Experiment-1 and Experiment-2 together isolate the effect of BTM-based feedback, while the Control group isolates the combined effect of the monitoring tests and the feedback. A pre-test shown to be equivalent to the PISA test was administered to all groups and re-administered as the post-test at the end of the experimental period. The results supported the study's main hypothesis: significant differences in post-test means were found both between Experiment-1 and Experiment-2 and between Experiment-2 and the Control group.
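The attribute-level feedback described above rests on a cognitive diagnosis model. As an illustrative sketch only (the attributes, items, and Q-matrix below are hypothetical, not the project's), a DINA-type model links each item to the attributes it requires through a Q-matrix, so a student's mastery profile determines which items they should answer correctly:

```python
import numpy as np

# Q-matrix: rows = items, columns = attributes (1 = item requires attribute).
# Hypothetical example; the project's actual Q-matrix is not reproduced here.
Q = np.array([
    [1, 0, 0],   # item 1 requires attribute A1 only
    [1, 1, 0],   # item 2 requires A1 and A2
    [0, 1, 1],   # item 3 requires A2 and A3
])

def ideal_response(alpha, Q):
    """Deterministic DINA core: an item is answered correctly (1) iff the
    student masters every attribute it requires. Slip and guess parameters
    of the full DINA model are omitted for clarity."""
    return np.all(Q <= alpha, axis=1).astype(int)

# A student who masters A1 and A2 but not A3:
alpha = np.array([1, 1, 0])
print(ideal_response(alpha, Q))  # [1 1 0] -> item 3 flagged for feedback
```

Comparing the observed response pattern with this ideal pattern is what lets the model report deficiencies at the level of individual attributes rather than a single total score.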
The findings show that mastery of the competencies measured in international large-scale assessments such as PISA and TIMSS can be raised by using tests that measure higher-order thinking skills. Moreover, feedback based on the student profiles identified with the BTM raised achievement further still.
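The reported significant differences in post-test means rest on a standard comparison of group means. The following is a minimal sketch with simulated data (the scores, means, and variances below are invented for illustration and are not the project's results), comparing three groups with a one-way ANOVA and pairwise t-tests:

```python
import numpy as np
from scipy import stats

# Simulated post-test scores; group sizes match the report, but the score
# distributions are assumptions made purely for this illustration.
rng = np.random.default_rng(0)
exp1    = rng.normal(540, 80, 1481)  # Experiment-1: BTM-based feedback
exp2    = rng.normal(510, 80, 1433)  # Experiment-2: total-score feedback only
control = rng.normal(490, 80, 1678)  # Control: pre- and post-tests only

# Omnibus test across the three groups, then the two pairwise contrasts
# the abstract reports (Exp-1 vs Exp-2, Exp-2 vs Control).
f_stat, p_anova = stats.f_oneway(exp1, exp2, control)
t12, p12 = stats.ttest_ind(exp1, exp2)
t2c, p2c = stats.ttest_ind(exp2, control)
print(f"ANOVA: F = {f_stat:.1f}, p = {p_anova:.3g}")
```

With group sizes in the thousands, even modest mean differences yield very small p-values, which is consistent with the abstract's report of significant post-test differences.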

TÜBİTAK ULAKBİM National Academic Network and Information Center, Cahit Arf Information Center © 2019. All rights reserved.