
Project Group: EEEAG Page Count: 104 Project No: 115E248 Project End Date: 01.05.2018 Text Language: Turkish Index Date: 05-03-2020

Farelerde Ağrıya Bağlı Yüz İfadesinin Otomatik Değerlendirilmesi (FARE-MİMİK) [Automated Assessment of Pain-Related Facial Expressions in Mice]

Abstract:
Despite major advances in understanding the molecular mechanisms of pain and large investments by industry, translational success between basic research for analgesic drug development and clinical application has remained quite limited. One important obstacle in this pursuit is the low specificity of existing animal models, and consequently the lack of a fast and reliable experimental screening test for candidate drugs. To address this need, this project aimed to develop computational methods that automatically grade the pain grimaces appearing on mouse faces from video recordings acquired while pain paradigms are applied in mouse experiments. A manual method, the Mouse Grimace Scale (MGS), developed by Langford and colleagues (2010), exists in the literature. Automatic grading of pain grimaces in mice is important not only because it is faster and yields more objective labeling, but also because it removes the need to train experts to perform the labeling manually. The project was carried out by combining the experience of Hacettepe University (HÜ) and Middle East Technical University (ODTÜ), which run a joint doctoral program, in the fields of neurological sciences and technology. In the work conducted at the HÜ Institute of Neurological Sciences and Psychiatry (NBPE), headache and abdominal pain were induced in mice using two different pain paradigms, video recordings of the resulting pain-related facial expressions were collected, and the data were labeled by experts who manually graded the amount of pain.
In the work on the ODTÜ Neuroscience and Neurotechnology (NSNT) - Electrical and Electronics Engineering (EEMB) side, computational methods based on computer vision and machine learning, specifically deep learning, were developed to detect and track the mouse face in the collected videos and to automatically grade pain from facial expressions. For automatic pain grading in freely moving mice, the 6-camera ODTÜ-HÜ observation box proposed and built in this project proved more practical and more successful than the 2-camera Langford box, in which the mouse's movements are restricted. The method developed in this project is significant in that it provides an objective, easily applicable and reliable approach in translational medicine for the automatic detection of different types of pain from mouse facial expressions and for the rapid screening of potential analgesics.
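The Mouse Grimace Scale referenced above scores five facial action units (orbital tightening, nose bulge, cheek bulge, ear position, whisker change), each rated 0 (not present), 1 (moderately visible), or 2 (obvious); the overall score is the mean across units. The following minimal Python sketch, which is purely illustrative and not the project's code, shows how such per-unit annotations aggregate into frame and clip scores; the function and field names are assumptions.

```python
# Illustrative sketch of Mouse Grimace Scale (MGS; Langford et al., 2010)
# score aggregation. Function/field names are hypothetical, not the
# project's actual implementation.

MGS_ACTION_UNITS = (
    "orbital_tightening",
    "nose_bulge",
    "cheek_bulge",
    "ear_position",
    "whisker_change",
)

def mgs_score(unit_scores: dict) -> float:
    """Mean MGS score over the five action units (each scored 0, 1, or 2)."""
    values = [unit_scores[u] for u in MGS_ACTION_UNITS]
    if any(v not in (0, 1, 2) for v in values):
        raise ValueError("each action unit must be scored 0, 1 or 2")
    return sum(values) / len(values)

def clip_score(frame_scores) -> float:
    """Average the per-frame MGS scores over a video clip."""
    frame_scores = list(frame_scores)
    return sum(frame_scores) / len(frame_scores)
```

An automatic grader would replace the manual per-unit annotations with predictions from a trained model, but the aggregation step stays the same.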
Keywords: machine learning; computer vision; automatic facial expression recognition for pain in mice; pain; deep learning

Subjects: Medical Informatics; Engineering, Electrical and Electronic; Computer Science, Theory and Methods; Psychiatry
Access Type: Open Access
  • 1. Aifanti, N., Papachristou, C. and Delopoulos, A., 2010, April. The MUG facial expression database. In Image analysis for multimedia interactive services (WIAMIS), IEEE, 2010 11th international workshop on, 1-4.
  • 1- Mouse face tracking using convolutional neural networks (Article - Other Refereed Journal),
  • 2. Ashraf, A.B., Lucey, S., Cohn, J.F., Chen, T., Ambadar, Z., Prkachin, K.M. and Solomon, P.E., 2009. “The painful face–pain expression recognition using active appearance models”. Image and vision computing, 27(12), 1788-1796.
  • 2- Eral M, Çakır-Aktas C, Eren-Koçak E, Dalkara T, Halıcı U, Assessment of pain in mouse facial images / Fare Yüzü Görüntülerinde Ağrı Derecelendirilmesi, DOI: 10.1109/BIYOMUT.2016.7849416 (Paper - National Conference - Poster Presentation),
  • 3. Bartlett, M.S., Javier, R., Littlewort, M. and Fasel, I. 2014. “Fully automatic coding of basic expressions from video”. Machine Perception Laboratory, Institute for Neural Computation, University of California, San Diego, CA 92093,
  • 3- Akkaya B, Tabar YR, Gharbalchi F, Ulusoy I, Halıcı U, Tracking Mice Face in Video / Fare Yüzünün Videoda Takip Edilmesi, DOI: 10.1109/BIYOMUT.2016.7849406 (Paper - National Conference - Poster Presentation),
  • 4. Bartlett, M.S., Littlewort, G., Fasel, I., and Movellan, J.R. June 2003. “Real time face detection and facial expression recognition: Development and applications to human computer interaction”. In Computer Vision and Pattern Recognition Workshop, 5, 53–53.
  • 5. Cehovin, L., Kristan, M., and Leonardis, A. 2014. “Is my new tracker really better than yours?”, In IEEE Winter Conference on Applications of Computer Vision, 540–547.
  • 6. Chatfield, K., Simonyan, K., Vedaldi, A., Zisserman, A. 2014. “Return of the Devil in the Details: Delving Deep into Convolutional Nets”, British Machine Vision Conference.
  • 6- Mouse Face Tracking Using Convolutional Neural Networks (Thesis (Researcher Training) - M.Sc. Thesis),
  • 7. Cootes, T.F., Edwards, G.J., Taylor, C.J. 1998. “Active Appearance Models”, Computer Vision, ECCV'98. Lecture Notes in Computer Science, 1407, 484.
  • 7- Deep Learning Approach for Laboratory Mice Grimace Scaling (Thesis (Researcher Training) - M.Sc. Thesis),
  • 8. Cottrell, G. and Padgett, C. 1996. “Representing face images for emotion classification”. Department of Computer Science University of California, San Diego La Jolla.
  • 9. Craig, K.D., Prkachin, K.M., Grunau, R.V.E. 2001. “The facial expression of pain”, Handbook of pain assessment, Editors: Turk D.C. and R. Melzack, New York: Guilford.
  • 10. Danelljan, M., Hager, G., Shahbaz Khan, F. and Felsberg, M. 2016. “Adaptive decontamination of the training set: A unified formulation for discriminative visual tracking”. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 1430-1438.
  • 11. Danelljan, M., Hager, G., Shahbaz Khan, F. and Felsberg, M., 2015. “Learning spatially regularized correlation filters for visual tracking”. In Proceedings of the IEEE International Conference on Computer Vision, 4310-4318.
  • 12. Danelljan, M., Robinson, A., Sahbaz Khan, F.S. and Felsberg, M. 2016. “Beyond correlation filters: Learning continuous convolution operators for visual tracking”. In European Conference on Computer Vision, Springer, Cham, 472-488.
  • 13. Darwin, C. 1872. The Expression of the Emotions in Man and Animals. London: Albemarle.
  • 14. Rumelhart, D.E., Hinton, G.E. and Williams, R.J., 1986. “Learning representations by back-propagating errors”. Nature, 323(6088), 533.
  • 15. Defensor, E.B., Corley, M.J., Blanchard, R.J., Blanchard, D.C. 2012. “Facial expressions of mice in aggressive and fearful contexts”, Physiology and Behavior, 107, 680–685.
  • 16. Ekman, P., Friesen, W.V. 1978. “Manual for the Facial Action Coding System”. Palo Alto CA: Consulting Psychologists Press.
  • 17. Girshick, R., Donahue, J., Darrell, T. and Malik, J. 2014. “Rich feature hierarchies for accurate object detection and semantic segmentation”. In Proceedings of the IEEE conference on computer vision and pattern recognition, 580-587.
  • 18. Hai, T.S., Thai, L.H. and Thuy, N.T., 2015. “Facial expression classification using artificial neural network and k-nearest neighbor”. International Journal of Information Technology and Computer Science (IJITCS), 7(3), 27.
  • 19. Hammal, Z. and Cohn, J.F. 2012. “Automatic detection of pain intensity”. In Proceedings of the 14th ACM international conference on Multimodal interaction, 47-52.
  • 20. Hammal, Z., Kunz, M. 2012. “Pain Monitoring: A Dynamic and Context-sensitive System”, Pattern Recognition, 45(4), 1265-1280.
  • 21. Hammal, Z. 2009. “Context based recognition of pain expression intensities”. In The 5th Workshop on Emotion in Human-Computer Interaction-Real World Challenges-held at the 23rd BCS HCI Group conference. Cambridge University, Cambridge, UK.
  • 22. Held, D., Thrun, S. and Savarese, S., 2016. “Learning to track at 100 fps with deep regression networks”. In European Conference on Computer Vision, 749-765. Springer, Cham.
  • 23. Hong, S., You, T., Kwak, S. and Han, B. 2015. “Online tracking by learning discriminative saliency map with convolutional neural network”. In International Conference on Machine Learning, 597-606.
  • 24. Houben, T., Loonen, I.C.M., Baca, S.M., Schenke, M., Meijer, J.H., Ferrari, M.D., Terwindt, G.M., Voskuyl, R.A., Charles, A., Maagdenberg, A.M.J.M., Tolner, E.A. 2016. "Optogenetic induction of cortical spreading depression in anesthetized and freely behaving mice", Journal of Cerebral Blood Flow and Metabolism.
  • 25. Ioffe, S. and Szegedy, C., 2015. “Batch normalization: Accelerating deep network training by reducing internal covariate shift”. arXiv preprint arXiv:1502.03167.
  • 26. Jiang, H. and Learned-Miller, E., 2017. “Face detection with the faster R-CNN”. In Automatic Face & Gesture Recognition (FG 2017), 2017 12th IEEE International Conference on, 650-657.
  • 27. Karatas, H., Erdener, S.E., Gursoy-Ozdemir, Y., Lule, S., Eren-Koçak, E., Sen, Z.D. and Dalkara, T., 2013. “Spreading depression triggers headache by activating neuronal Panx1 channels”. Science, 339(6123), 1092-1095.
  • 28. Khanam, A., Shafiq, M.Z. and Akram, M.U., 2008. “Fuzzy based facial expression recognition”. IEEE, In Image and Signal Processing, 2008. CISP'08. Congress on, 1, 598-602.
  • 29. Lee, Y.H., Han, W., Kim, Y. and Kim, C.G., 2014. “Robust emotion recognition algorithm for ambiguous facial expression using optimized AAM and k-NN”. International Journal of Security and Its Applications, 8(5), 203-212.
  • 30. Kotsia, I. and I. Pitas. 2005. “Real time facial expression recognition from image sequences using support vector machines”. In Image Processing, ICIP 2005. IEEE International Conference on, 2, II–966–9.
  • 31. Kotsia, I., Nikolaidis, N. and Pitas, I. 2007. “Facial expression recognition in videos using a novel multi-class support vector machines variant”. In Acoustics, Speech and Signal Processing, ICASSP 2007. IEEE International Conference on, 2, II-585.
  • 32. Krizhevsky, A., Sutskever, I. and Hinton, G.E. 2012. “Imagenet classification with deep convolutional neural networks”. In Advances in neural information processing systems 1097-1105.
  • 33. Kumar, V., Namboodiri, A. and Jawahar, C.V. 2015. “Visual phrases for exemplar face detection”. In Proceedings of the IEEE International Conference on Computer Vision, 1994-2002.
  • 34. Langford, D.J., Bailey, A.L., Chanda, M.L., Clarke, S.E., Drummond, T.E., Echols, S., Glick, S., Ingrao, J., Klassen-Ross, T., LaCroix-Fralish, M.L. and Matsumiya, L. 2010. “Coding of facial expressions of pain in the laboratory mouse”. Nature methods, 7(6), 447-449.
  • 35. Leach, M.C., Klaus, K., Miller, A.L., Di Perrotolo, M.S., Sotocinal, S.G. and Flecknell, P.A., 2012. “The assessment of post-vasectomy pain in mice using behaviour and the Mouse Grimace Scale”. PloS one, 7(4), e35656.
  • 36. LeCun, Y., Boser, B., Denker, J.S., Howard, R.E., Hubbard, W., Jackel, L.D. and Henderson, D., 1990. “Handwritten digit recognition with a back-propagation network”. In Advances in neural information processing systems, Morgan Kaufmann Publishers Inc., 2, 396-404.
  • 37. LeCun, Y., Bottou, L., Bengio, Y. and Haffner, P., 1998. “Gradient-based learning applied to document recognition”. Proceedings of the IEEE, 86(11), 2278-2324.
  • 38. Lei, G., Li, X.H., Zhou, J.L. and Gong, X.G., 2009. “Geometric feature based facial expression recognition using multiclass support vector machines”. In Granular Computing, 2009, GRC'09. IEEE International Conference on, 318-321.
  • 39. Li, H., Li, Y. and Porikli, F. 2016. “Deeptrack: Learning discriminative feature representations online for robust visual tracking”. IEEE Transactions on Image Processing, 25(4), 1834-1848.
  • 40. Li, W., Li, M., Su, Z. and Zhu, Z. 2015. “A deep-learning approach to facial expression recognition with candid images”. In Machine Vision Applications (MVA), 2015 14th IAPR International Conference on, 279-282.
  • 41. Li, Y. and Zhu, J. 2014. “A scale adaptive kernel correlation filter tracker with feature integration”. In European Conference on Computer Vision, Springer, Cham, 254-265.
  • 42. Li, Y., Zhu, J. and Hoi, S.C., 2015. “Reliable patch trackers: Robust visual tracking by exploiting reliable patches”. In Computer Vision and Pattern Recognition (CVPR), 2015 IEEE Conference on, 353-361.
  • 43. Liao, S., Jain, A.K. and Li, S.Z. 2016. “A fast and accurate unconstrained face detector”. IEEE Transactions on Pattern Analysis and Machine Intelligence, 38(2), 211-223.
  • 44. Gwen, C., Bartlett, M.S., Littlewort, G.C. and Kang, L. 2007. “Faces of Pain: Automated Measurement of Spontaneous Facial Expressions of Genuine and Posed Pain”. In Proceedings of ICMI, 7, 12-15.
  • 45. Ma, C., Huang, J.B., Yang, X. and Yang, M.H. 2015. “Hierarchical convolutional features for visual tracking”. In Proceedings of the IEEE International Conference on Computer Vision, 3074-3082.
  • 46. Makowska, J., Weary, D.M. 2013. “Assessing the emotions of laboratory rats”, Applied Animal Behaviour Science, 148, 1-12.
  • 47. Martins, P., 2008. “Active appearance models for facial expression recognition and monocular head pose estimation”. Portugal: Department of Electrical and Computer Engineering, Faculty of Sciences and Technology, University of Coimbra.
  • 48. Mogil, J.S., Crager, S.E. 2004. “What should we be measuring in behavioral studies of chronic pain in animals?” Pain, 112,12-15.
  • 49. Mogil, J.S., Davis, K.D., Derbyshire, S.W. 2010. “The necessity of animal models in pain research”, Pain, 151, 12-17.
  • 50. Mogil, J.S. 2009. “Animal models of pain: progress and challenges”, Nat Rev Neurosci, 10, 283-294.
  • 51. Monwar, M.M. and Rezaei, S. 2006. “Pain recognition using artificial neural network”. In Signal Processing and Information Technology, 2006 IEEE International Symposium on, 28-33.
  • 52. Mufti, M. and Khanam, A. 2006. “Fuzzy rule based facial expression recognition”. In Computational Intelligence for Modelling, Control and Automation, 2006 and International Conference on Intelligent Agents, Web Technologies and Internet Commerce, International Conference on, 57-57.
  • 53. Nam, H. and Han, B. 2016. “Learning multi-domain convolutional neural networks for visual tracking”. In Computer Vision and Pattern Recognition (CVPR), 2016 IEEE Conference on, 4293-4302.
  • 54. Nam, H. and Han, B. 2015. “Learning multi-domain convolutional neural networks for visual tracking,” arXiv preprint arXiv:1510.07945.
  • 55. Pantic, M. and Rothkrantz, L.J.M. 2000. “Automatic analysis of facial expressions: The state of the art”. IEEE Transactions on pattern analysis and machine intelligence, 22(12), 1424-1445.
  • 56. Pantic, M. and Rothkrantz, L.J. 2004. “Facial action recognition for facial expression analysis from static face images”. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 34(3), 1449-1461.
  • 57. Patil, R.A. and Sahula, V. 2012. “Features classification using support vector machine for a facial expression recognition system”. Journal of Electronic Imaging, 21(4), 043003.
  • 58. Prkachin, K.M., Solomon, P.E., Ross, A.J. 2007. “The underestimation of pain among health-care providers” Can J Nurs Res., 39, 88-106.
  • 59. Prkachin, K.M. 1992. “The consistency of facial expressions of pain: a comparison across modalities”, Pain, 51, 297-306.
  • 60. Ranjan, R., Patel, V.M., Chellappa, R. 2015. “A Deep Pyramid Deformable Part Model for Face Detection”. IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS).
  • 61. Ranjan, R., Patel, V.M. and Chellappa, R. 2017. “Hyperface: A deep multi-task learning framework for face detection, landmark localization, pose estimation, and gender recognition”. IEEE Transactions on Pattern Analysis and Machine Intelligence.
  • 62. Russakovsky, O., Deng, J., Su, H., Krause, J., Satheesh, S., Ma, S., Huang, Z., Karpathy, A., Khosla, A., Bernstein, M. and Berg, A.C. 2015. “Imagenet large scale visual recognition challenge”. International Journal of Computer Vision, 115(3), 211-252.
  • 63. Simonyan, K. and Zisserman, A. 2014. “Very deep convolutional networks for large-scale image recognition”. https://arxiv.org/abs/1409.1556, last accessed 29 June 2018.
  • 64. Silva, C., Sobral, A. and Vieira, R.T., 2014. “An automatic facial expression recognition system evaluated by different classifiers”. In X Workshop de Visao Computacinal (WVC 2014), Uberlandia, Minas Gerais, Brazil.
  • 65. Sohail, A.S.M. and Bhattacharya, P. 2007. “Classification of facial expressions using knearest neighbor classifier”. In International Conference on Computer Vision/Computer Graphics Collaboration Techniques and Applications, Springer, Berlin, Heidelberg, 555- 566.
  • 66. Sotocinal, S.G., Sorge, R.E., Zaloum, A., Tuttle, A.H., Martin, L.J., Wieskopf, J.S., Mapplebeck, J.C., Wei, P., Zhan, S., Zhang, S. and McDougall, J.J. 2011. “The Rat Grimace Scale: a partially automated method for quantifying pain in the laboratory rat via facial expressions”. Molecular pain, 7(1), p.55.
  • 67. Stanley, K.L., Paice, J.A. 1997. “Animal models in pain research”. Seminars in Oncology Nursing, 13(1), 3-9.
  • 68. Suja, P., Tripathi, S. and Deepthy, J., 2014. Emotion recognition from facial expressions using frequency domain techniques. In Advances in signal processing and intelligent recognition systems, Springer, Cham, 299-310.
  • 69. Tuttle, A.H., Molinaro, M.J., Jethwa, J.F., Sotocinal, S.G., Prieto, J.C., Styner, M.A., Mogil, J.S. and Zylka, M.J. 2018. “A deep neural network to assess spontaneous pain from mouse facial expressions”. Molecular pain, 14, 1744806918763658.
  • 70. Uijlings, J.R., Van De Sande, K.E., Gevers, T. and Smeulders, A.W. 2013. “Selective search for object recognition”. International journal of computer vision, 104(2), 154-171.
  • 71. Viola, P. and Jones, M. 2001. “Rapid object detection using a boosted cascade of simple features”. In Computer Vision and Pattern Recognition, 2001. CVPR 2001. Proceedings of the 2001 IEEE Computer Society Conference on, 1, I-I.
  • 72. Wan, S., Chen, Z., Zhang, T., Zhang, B. and Wong, K.K., 2016. “Bootstrapping face detection with hard negative examples". arXiv preprint arXiv:1608.02236.
  • 73. Wang, L., Liu, T., Wang, G., Chan, K.L. and Yang, Q. 2015. “Video tracking using learned hierarchical features”. IEEE Transactions on Image Processing, 24(4), 1424-1435.
  • 74. Wang, L., Ouyang, W., Wang, X. and Lu, H. 2015. “Visual tracking with fully convolutional networks”. In Proceedings of the IEEE International Conference on Computer Vision, 3119-3127.
  • 75. Wang, N. and Yeung, D.Y. 2013. “Learning a deep compact image representation for visual tracking”. In Advances in neural information processing systems, 809-817.
  • 76. Wang, N., Li, S., Gupta, A. and Yeung, D.Y., 2015. “Transferring rich feature hierarchies for robust visual tracking”. https://arxiv.org/abs/1501.04587, last accessed 29 June 2018.
  • 78. Wen, C.J. and Zhan Y. Z. 2008. “Hmm+knn classifier for facial expression recognition”. In 2008 3rd IEEE Conference on Industrial Electronics and Applications, 260–263.
  • 79. Whittaker, A.L. and Howarth, G.S. 2014. “Use of spontaneous behaviour measures to assess pain in laboratory rats and mice: How are we progressing?”. Applied Animal Behaviour Science, 151, 1-12.
  • 80. Williams, A.C.D.C. 2002. “Facial expression of pain: an evolutionary account”. Behavioral and brain sciences, 25(4), 439-455.
  • 81. Yang, B., Yan, J., Lei, Z. and Li, S.Z. 2015. “Convolutional channel features”. In Computer Vision (ICCV), 2015 IEEE International Conference on, 82-90.
  • 82. Yang, S., Luo, P., Loy, C.C. and Tang, X., 2015. “From facial parts responses to face detection: A deep learning approach”. In Proceedings of the IEEE International Conference on Computer Vision, 3676-3684.
  • 83. Ye, X., Chen, X., Chen, H., Gu, Y. and Lv, Q., 2015. “ Deep learning network for face detection”. In Communication Technology (ICCT), 2015 IEEE 16th International Conference on, 504-509.
  • 84. Yu, J., Jiang, Y., Wang, Z., Cao, Z. and Huang, T., 2016. “Unitbox: An advanced object detection network”. In Proceedings of the 2016 ACM on Multimedia Conference, 516- 520.
  • 85. Li, Y., Sun, B., Wu, T. and Wang, Y., 2016. “Face detection with end-to-end integration of a convnet and a 3d model”. In European Conference on Computer Vision, Springer, Cham, 420-436.
  • 86. Zhang, K., Zhang, Z., Li, Z. and Qiao, Y. 2016. “Joint face detection and alignment using multitask cascaded convolutional networks”. IEEE Signal Processing Letters, 23(10), 1499-1503.
APA HALICI U, EREN KOÇAK E, DALKARA T, ulusoy i (2018). Farelerde Ağrıya Bağlı Yüz İfadesinin Otomatik Değerlendirilmesi (FARE-MİMİK). , 1 - 104.
Chicago HALICI Uğur,EREN KOÇAK Emine,DALKARA Turgay,ulusoy ilkay Farelerde Ağrıya Bağlı Yüz İfadesinin Otomatik Değerlendirilmesi (FARE-MİMİK). (2018): 1 - 104.
MLA HALICI Uğur,EREN KOÇAK Emine,DALKARA Turgay,ulusoy ilkay Farelerde Ağrıya Bağlı Yüz İfadesinin Otomatik Değerlendirilmesi (FARE-MİMİK). , 2018, ss.1 - 104.
AMA HALICI U,EREN KOÇAK E,DALKARA T,ulusoy i Farelerde Ağrıya Bağlı Yüz İfadesinin Otomatik Değerlendirilmesi (FARE-MİMİK). . 2018; 1 - 104.
Vancouver HALICI U,EREN KOÇAK E,DALKARA T,ulusoy i Farelerde Ağrıya Bağlı Yüz İfadesinin Otomatik Değerlendirilmesi (FARE-MİMİK). . 2018; 1 - 104.
IEEE HALICI U,EREN KOÇAK E,DALKARA T,ulusoy i "Farelerde Ağrıya Bağlı Yüz İfadesinin Otomatik Değerlendirilmesi (FARE-MİMİK)." , ss.1 - 104, 2018.
ISNAD HALICI, Uğur vd. "Farelerde Ağrıya Bağlı Yüz İfadesinin Otomatik Değerlendirilmesi (FARE-MİMİK)". (2018), 1-104.