Year: 2019 Volume: 12 Issue: 4 Page Range: 307 - 317 Text Language: Turkish DOI: 10.17671/gazibtd.569827 Indexing Date: 15-11-2020

Uyarlanır Yerel Bağlı Nöron Modelinin İncelemesi (A Study on the Adaptive Locally Connected Neuron Model)

Abstract:
This study presents an examination of the adaptive locally connected (focusing) neuron model. First, the model's relation to existing neuron models is examined. Then, the feed-forward operation of the model and its training with backpropagation are discussed. The working principles of the model are demonstrated through experiments on synthetic classification data sets. Finally, it is shown comparatively on popular image recognition data sets such as MNIST, CIFAR10, and FASHION that using focusing neurons in the hidden layers of simple and convolutional networks can achieve better performance than fully connected neurons.

A Study on Adaptive Locally Connected Neuron Model

Abstract:
This manuscript presents a detailed study of the adaptive locally connected (focusing) neuron model. Our analysis starts with the model's relation to other neuron models. We then describe the feed-forward operation and its training with the backpropagation gradient-descent algorithm. The operating principles of the model are demonstrated on synthetically sampled data sets. Finally, comparative experiments on popular image recognition data sets such as MNIST, CIFAR10, and FASHION show that focusing neuron layers can improve classification performance on some data sets.
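To make the described operation concrete, the sketch below shows one way such a focusing layer could be written. It is a minimal, hypothetical example, assuming that each unit scales its ordinary weights with a Gaussian focus envelope over normalised input positions, so that the centre (mu) and aperture (sigma) of its receptive field are trained jointly with the weights by backpropagation; the class name FocusingDense, the parameterisation, and the initial values are illustrative assumptions rather than the authors' released implementation.

```python
import torch
import torch.nn as nn

class FocusingDense(nn.Module):
    """Hypothetical sketch of an adaptive locally connected (focusing) layer."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.weight = nn.Parameter(0.05 * torch.randn(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))
        # One trainable focus centre and aperture per output unit (assumed parameterisation).
        self.mu = nn.Parameter(torch.rand(out_features, 1))
        self.sigma = nn.Parameter(torch.full((out_features, 1), 0.25))
        # Fixed, normalised input positions on [0, 1].
        self.register_buffer("pos", torch.linspace(0.0, 1.0, in_features).unsqueeze(0))

    def focus_envelope(self) -> torch.Tensor:
        # g[k, i] = exp(-(pos_i - mu_k)^2 / (2 * sigma_k^2)); sigma controls how local unit k is.
        return torch.exp(-((self.pos - self.mu) ** 2) / (2.0 * self.sigma ** 2 + 1e-6))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The envelope gates the weights, giving each unit an adaptive local receptive field.
        effective_weight = self.weight * self.focus_envelope()
        return x @ effective_weight.t() + self.bias
```

In a hidden layer, such a module would simply replace a fully connected layer, e.g. FocusingDense(784, 256) followed by a nonlinearity and a standard output layer for flattened MNIST images.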

Document Type: Article Article Type: Research Article Access Type: Open Access
APA Tek F (2019). Uyarlanır Yerel Bağlı Nöron Modelinin İncelemesi. Bilişim Teknolojileri Dergisi, 12(4), 307 - 317. 10.17671/gazibtd.569827
Chicago Tek Faik Boray Uyarlanır Yerel Bağlı Nöron Modelinin İncelemesi. Bilişim Teknolojileri Dergisi 12, no.4 (2019): 307 - 317. 10.17671/gazibtd.569827
MLA Tek Faik Boray Uyarlanır Yerel Bağlı Nöron Modelinin İncelemesi. Bilişim Teknolojileri Dergisi, vol.12, no.4, 2019, ss.307 - 317. 10.17671/gazibtd.569827
AMA Tek F Uyarlanır Yerel Bağlı Nöron Modelinin İncelemesi. Bilişim Teknolojileri Dergisi. 2019; 12(4): 307 - 317. 10.17671/gazibtd.569827
Vancouver Tek F Uyarlanır Yerel Bağlı Nöron Modelinin İncelemesi. Bilişim Teknolojileri Dergisi. 2019; 12(4): 307 - 317. 10.17671/gazibtd.569827
IEEE Tek F "Uyarlanır Yerel Bağlı Nöron Modelinin İncelemesi." Bilişim Teknolojileri Dergisi, 12, ss.307 - 317, 2019. 10.17671/gazibtd.569827
ISNAD Tek, Faik Boray. "Uyarlanır Yerel Bağlı Nöron Modelinin İncelemesi". Bilişim Teknolojileri Dergisi 12/4 (2019), 307-317. https://doi.org/10.17671/gazibtd.569827