SELDA YILDIRIM
(Abant İzzet Baysal Üniversitesi, Bolu, Türkiye)
Year: 2008 | Volume: 2008 | Issue: 34 | ISSN: 1300-5340 / 2536-4758 | Pages: 297-307 | Language: Turkish

Comparison of Restricted Factor Analysis with the Likelihood-Ratio and Mantel-Haenszel Methods in Detecting Differentially Functioning Items
The aim of this study is to compare Restricted Factor Analysis (RFA), a method used less frequently in detecting differentially functioning items, with the more common Mantel-Haenszel (M-H) and Likelihood Ratio Analysis (LRA) methods. PISA 2003 mathematics items were used, and items with the potential to function in a biased way between the English and Turkish forms were identified. In addition to these real-data analyses, a simulation study was conducted to verify and explain the findings. The results showed that, whether the compared group means were equal or different, RFA detected differentially functioning items at a more accurate rate than the M-H and LRA methods.
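The Mantel-Haenszel procedure named in the abstract matches examinees on a criterion such as total test score and then compares reference- and focal-group performance on one item within each score stratum, pooling the 2×2 tables into a chi-square statistic and a common odds ratio. The sketch below is a minimal illustrative implementation of that idea, not code from the article; the function name and the synthetic data are my own.

```python
def mantel_haenszel_dif(scores, group, item):
    """Mantel-Haenszel DIF statistic for one dichotomous item.

    scores: matching score (e.g. total test score) per examinee
    group:  'R' (reference) or 'F' (focal) per examinee
    item:   0/1 response to the studied item per examinee
    Returns (mh_chi_square, common_odds_ratio).
    """
    # Group examinees into strata by matching score.
    strata = {}
    for s, g, u in zip(scores, group, item):
        strata.setdefault(s, []).append((g, u))

    sum_a = sum_ea = sum_var = 0.0   # observed, expected, variance terms
    num = den = 0.0                  # odds-ratio numerator / denominator
    for cell in strata.values():
        a = sum(1 for g, u in cell if g == 'R' and u == 1)  # reference correct
        b = sum(1 for g, u in cell if g == 'R' and u == 0)  # reference incorrect
        c = sum(1 for g, u in cell if g == 'F' and u == 1)  # focal correct
        d = sum(1 for g, u in cell if g == 'F' and u == 0)  # focal incorrect
        t = a + b + c + d
        # Skip strata with an empty margin: they carry no information.
        if t < 2 or 0 in (a + b, c + d, a + c, b + d):
            continue
        sum_a += a
        sum_ea += (a + b) * (a + c) / t
        sum_var += (a + b) * (c + d) * (a + c) * (b + d) / (t * t * (t - 1))
        num += a * d / t
        den += b * c / t

    # Chi-square with the usual 0.5 continuity correction.
    chi2 = (abs(sum_a - sum_ea) - 0.5) ** 2 / sum_var if sum_var > 0 else 0.0
    alpha = num / den if den > 0 else float('nan')
    return chi2, alpha


# Balanced synthetic data with no DIF: identical response patterns for the
# reference (R) and focal (F) groups within each of two score strata.
scores, grp, resp = [], [], []
for s, n_correct, n_wrong in [(1, 30, 20), (2, 40, 10)]:
    for g in 'RF':
        scores += [s] * (n_correct + n_wrong)
        grp += [g] * (n_correct + n_wrong)
        resp += [1] * n_correct + [0] * n_wrong

chi2, alpha = mantel_haenszel_dif(scores, grp, resp)
print(chi2, alpha)  # near-zero chi-square, common odds ratio of 1.0 (no DIF)
```

An odds ratio near 1 and a non-significant chi-square indicate the item behaves the same way in both groups once ability is matched; a large chi-square flags potential DIF for closer review.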
Social Sciences > Education, Educational Research
Journal | Research Article | Open Access
  • Ackerman, T.A. (1992). A didactic explanation of item bias, item impact, and item validity from a multidimensional perspective. Journal of Educational Measurement, 29(1), 67-91.
  • Allalouf, A., Hambleton, R.K. ve Sireci, S.G. (1999). Identifying the causes of DIF in translated verbal items. Journal of Educational Measurement, 36(3), 185 – 198.
  • Angoff, W.H. ve Ford, S.F. (1973). Item-race interaction on a test of scholastic aptitude. Journal of Educational Measurement, 10, 95-105.
  • Arim, R.G. ve Ercikan, K. (2005, April). Comparability Between the US and Turkish Versions of the Third International Mathematics and Science Study’s Mathematics Test Results. Paper presented at the National Council on Measurement in Education, Montreal, Canada.
  • Beaton, A.E. (1998). Comparing cross-national student performance on TIMSS using different test items. International Journal of Educational Research, 29, 529-542.
  • Benito, J.G. ve Ara, M.J.N. (2000). A comparison of χ2, RFA and IRT based procedures in the detection of DIF. Quality & Quantity, 34, 17-31.
  • Benjamini, Y. ve Hochberg, Y. (1995). Controlling the false discovery rate: a practical and powerful approach to multiple testing. Journal of the Royal Statistical Society, Series B, 57, 289-300.
  • Camilli, G. ve Congdon, P. (1999). Application of a method of estimating DIF for polytomous test items. Journal of Educational and Behavioral Statistics, 24(4), 323-341.
  • Camilli, G. ve Shepard, L.A. (1994). Methods for identifying biased test items. California: Sage Publications.
  • Çet, S., Yıldırım, H.H. ve Berberoğlu, G. (2006 June-July). Differential item functioning (DIF) analysis of PISA 2003 mathematics items across Gender and SES groups. Paper presented at the Third International Conference on the Teaching of Mathematics, İstanbul, Turkey.
  • Deng, L.Y. ve Lin, D.K.J. (2000). Random Number Generation for the New Century. The American Statistician, 54(2), 145-150.
  • Donoghue, J.R. ve Allen, N.L. (1993). Thin versus thick matching in the Mantel-Haenszel procedure for detecting DIF. Journal of Educational Statistics, 18(2), 131 –154.
  • Engelhard, G. (1990). Gender differences in performance on mathematics items: evidence from the United States and Thailand. Contemporary Educational Psychology, 15, 13-26.
  • Ercikan, K. (2002). Disentangling sources of differential item functioning in multilanguage assessments. International Journal of Testing, 2(3&4), 199-215.
  • Gao, L. ve Wang, C. (2005). Using five procedures to detect DIF with passage-based testlets. A Paper prepared for the Poster Presentation at the Graduate Student Poster Session at the Annual Meeting of the National Council of Measurement in Education, Montreal, Quebec.
  • Gierl, M. J., Jodoin, M. ve Ackerman T. (2000). Performance of Mantel-Haenszel, Simultaneous Item Bias Test, and Logistic Regression when the proportion of DIF items is large. Paper Presented at the Annual Meeting of the American Educational Research Association, New Orleans, Louisiana, USA.
  • Gierl, M. ve Khaliq, S. (2001). Identifying sources of differential item functioning on translated achievement tests: A confirmatory analysis. Journal of Educational Measurement, 38, 164-187.
  • Hambleton, R., Clauser, B., Mazor,K. ve Jones, R. (1993). Advances in the detection of differentially functioning test items. European Journal of Psychological Assessment, 9(1), 1-18.
  • Hambleton, R.K. ve Rogers, H.J. (1989). Detecting potentially biased test items: comparison of IRT area and Mantel-Haenszel methods. Applied Measurement in Education, 2(4), 313-334.
  • Harris, A. M. ve Carlton, S. T. (1993). Patterns of gender differences on mathematics items on the scholastic aptitude test. Applied Measurement in Education, 6, 137-151.
  • Hidalgo, M.D. ve Pina, S.A.L. (2004). Differential item functioning detection and effect size: a comparison between logistic regression and Mantel-Haenszel procedures. Educational and Psychological Measurement, 64(6), 905-915.
  • Hui, C.H. ve Triandis, H.C. (1989). Effects of culture and response format on extreme response style. Journal of Cross-Cultural Psychology, 20(3), 296-309.
  • Jöreskog, K. ve Sörbom, D. (2001). LISREL 8: User’s reference guide. Chicago: Scientific Software International Inc, USA.
  • Jöreskog, K. ve Sörbom, D. (2002). PRELIS 2: User’s reference guide. Chicago: Scientific Software International Inc, USA.
  • Kelloway, E. K. (1998). Using LISREL for structural equation modeling. London, New Delhi: Sage Publications.
  • Kim, S. ve Cohen, A.S. (1992). Effects of linking methods on detection of DIF. Journal of Educational Measurement, 29(1), 51-66.
  • Lim, R.G. ve Drasgow, F. (1990). Evaluation of two methods for estimating item response theory parameters when assessing differential item functioning. Journal of Applied Psychology, 75, 164-174.
  • Mislevy, R. J. ve Bock, R. D. (1984). BILOG: Maximum likelihood item analysis and test scoring with logistic models. Mooresville, IN: Scientific Software.
  • Narayanan, P. ve Swaminathan, H. (1996). Identification of items that show nonuniform DIF. Applied Psychological Measurement, 20, 257-274.
  • Oort, F.J. (1992). Using restricted factor analysis to detect item bias. Methodika, 6, 150-166.
  • Rogers, J. ve Swaminathan, H. (1993). A comparison of logistic regression and Mantel-Haenszel procedures for detecting differential item functioning. Applied Psychological Measurement, 17(2), 105-116.
  • Sireci, S.G. ve Allalouf, A. (2003). Appraising item equivalence across multiple languages and cultures. Language Testing, 20(2), 148-166.
  • Sireci, S.G., Bastari, B. ve Allalouf, A. (1998 August). Evaluating construct equivalence across adapted tests. Paper presented at American Psychological Association, San Francisco, CA.
  • Sireci, S.G. ve Berberoğlu, G. (2000). Using bilingual respondents to evaluate translated-adapted items. Applied Measurement in Education, 13(3), 229-248.
  • Swaminathan H. ve Rogers, J.H. (1990). Detecting differential item functioning using logistic regression Procedures. Journal of Educational Measurement, 27(4), 361-370.
  • Thissen, D. (2001). IRTLRDIF v.2.0b: Software for the computation of the statistics involved in item response theory likelihood-ratio tests for differential item functioning. Retrieved October 15, 2005, from http://www.unc.edu/~dthissen/dl.html
  • Thissen, D., Steinberg, L. ve Wainer, H. (1988). Use of item response theory in the study of group differences in trace lines. In H. Wainer ve H. Braun (Eds.), Test Validity. (pp. 147-169). Hillsdale, NJ: Erlbaum.
  • Waller, N.G. (2005). EZDIF: A computer program for detecting uniform and nonuniform differential item functioning with the Mantel-Haenszel and logistic regression procedures. Retrieved April 10, 2005, from http://peabody.vanderbilt.edu/depts/psych_and_hd/faculty/wallern/
  • Williams, V.S.L., Jones, L.V. ve Tukey, J.W. (1999). Controlling error in multiple comparisons with examples from state-to-state differences in educational achievement. Journal of Educational and Behavioral Statistics, 24(1), 42-69.
  • Yurdugül, H. ve Aşkar, P. (2004a). Ortaöğretim kurumları öğrenci seçme ve yerleştirme sınavının cinsiyete göre madde yanlılığı açısından incelenmesi. Eğitim Bilimleri ve Uygulama Dergisi, 3(5), 3-20.
  • Yurdugül, H. ve Aşkar, P. (2004b). Ortaöğretim kurumları öğrenci seçme ve yerleştirme sınavının öğrencilerin yerleşim yerlerine göre diferansiyel madde fonksiyonu açısından incelenmesi. Hacettepe Üniversitesi Eğitim Fakültesi Dergisi, 27, 268-275.
  • Zenisky, A.L., Hambleton, R.K. ve Robin, F. (2003). Detection of DIF in large-scale state assessments: a study evaluating a two-stage approach. Educational and Psychological Measurement, 63(1), 51-64.

TÜBİTAK ULAKBİM National Academic Network and Information Center, Cahit Arf Information Center © 2019. All rights reserved.