Advances and Applications of Machine Learning in Industry 4.0


Dilovan Asaad

Abstract

Recently, humans have become so immersed in the world of smartphones and portable electronic devices that this world can numb us and separate us from the real one. We, and the generations after us, are fully adapted to dealing with huge amounts of digital information, ready to absorb it faster and handle it more efficiently than previous generations. Yet during this rapid adaptation to the digital age, we gradually begin to lose one thing the machine has not yet been given: human emotions. A key step in the humanization of robotics is the ability to classify the emotion of the human operator. In this paper we present the design of an artificially intelligent system capable of emotion recognition through facial expressions. Three promising neural network architectures are customized, trained, and subjected to various classification tasks, after which the best-performing network is further optimized. The applicability of the final model is demonstrated in a live video application that instantaneously returns the emotion of the user. Technology experts have found that empathy can also be created through technology, which leads to what is known as "emotional intelligence." Rather than viewing digital communication as something that costs us real communication, experts have found that technology can be employed in favor of that kind of communication, restoring soul to the social and emotional relationships it has eroded for many years.
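The pipeline the abstract describes (take a face image, run it through a trained network, report the recognized emotion) can be illustrated with a minimal sketch. This is not the paper's actual model: the single-layer "network" below uses random placeholder weights, the 48x48 grayscale input size and the seven emotion labels follow the common FER-2013 convention, and all names here are hypothetical.

```python
import numpy as np

# Seven emotion classes commonly used in facial-expression datasets
# (an assumption; the paper does not list its label set).
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class TinyEmotionClassifier:
    """Stand-in for a trained network: one linear layer plus softmax.

    A real system would use a convolutional network trained on labeled
    face images; here the weights are random placeholders, so the
    predicted label is meaningless and only the pipeline shape matters.
    """

    def __init__(self, input_dim=48 * 48, n_classes=len(EMOTIONS), seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.01, size=(input_dim, n_classes))
        self.b = np.zeros(n_classes)

    def predict(self, face_gray_48x48):
        """Map a 48x48 grayscale face crop to an emotion label."""
        x = face_gray_48x48.astype(np.float32).ravel() / 255.0
        probs = softmax(x @ self.W + self.b)
        return EMOTIONS[int(probs.argmax())], probs

clf = TinyEmotionClassifier()
fake_face = np.zeros((48, 48), dtype=np.uint8)  # placeholder for a video-frame face crop
label, probs = clf.predict(fake_face)
print(label)
```

In a live application, each video frame would be cropped to the detected face region, resized to the network's input size, and passed through `predict` in the same way.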

