An Improved Face-Emotion Recognition to Automatically Generate Human Expression With Emoticons


Basetty Mallikarjuna, M Sethu Ram, Supriya Addanke
Copyright: © 2022 |Pages: 18
DOI: 10.4018/IJRQEH.314945

Abstract

Human facial expressions naturally convey emotions such as happiness and sadness; recognizing them from an image can be difficult, however, because an expression is sometimes a combination of two emotions. The existing literature covers face-emotion classification and image recognition, and studies on deep learning with convolutional neural networks (CNNs) show that face-emotion recognition is especially useful for healthcare, though the existing algorithms are among the most complex. This paper improves human face-emotion recognition and uses the recognized feeling to generate emoticons on a smartphone. Face-emotion recognition with CNNs plays a major role in deep learning and artificial intelligence for healthcare services. The proposed automatic facial-emotion recognition consists of two stages: face detection with an AdaBoost classifier, and emotion classification, in which features are extracted with deep learning methods such as a CNN to identify the seven emotions used to generate emoticons.
Article Preview

Motivation

The human brain is made up of billions of neurons, each enclosed in a cell membrane; every neuron has an axon, and one neuron's axon connects to other neurons (Li et al., 2020; Mallikarjuna & Reddy, 2019). The nucleus is the center of the neuron, from which the dendrites spread outward (Gao et al., 2020; Mohit et al., 2020). Input received along the dendrites is transmitted as electrical signals down the axon into the rest of the neural system; the biological neuron and its artificial counterpart are compared in Figure 1.

Figure 1.

Neural system: biological neuron versus artificial neuron (Mane & Kulkarni, 2020)


The artificial neural network (ANN) is inspired by the biological neural system and imitates the architecture of the human brain. An ANN consists of many layers, and each layer is composed of many nodes (Mane & Kulkarni, 2020; Mallikarjuna et al., 2020b).

The first layer, called the input layer, receives the input signal, with each input associated with a corresponding weight; the output of each layer becomes the input to the next. The layers following the input layer are hidden layers (Mane & Kulkarni, 2020); an ANN may have many hidden layers, and its last layer is the output layer, as shown in Figure 1. Depending on the application, the output layer applies an activation function such as an 'S'-shaped curve or a step function (Mane & Kulkarni, 2020). The 'S'-shaped curve is the sigmoid function, σ(x) = 1 / (1 + e^(-x)) (eq. 1), which produces values between 0 and 1 on the Y-axis; the related hyperbolic tangent function produces values between -1 and 1. Various other activation functions can generate the output layer as the application requires; sigmoid-based classifiers have been applied, for example, to finding the groundwater table and to other classification tasks (Mane & Kulkarni, 2020). An ANN can also use the step function, f(x) = 1 if x ≥ 0, else 0 (eq. 2), applied to the weighted sum of the inputs of a layer; the values in the step-function graph take only the two values 0 or 1 (Mane & Kulkarni, 2020).
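The weighted-sum-plus-activation behavior described above can be sketched in a few lines of Python. This is a minimal illustration, not code from the paper; the names `sigmoid`, `step`, and `node_output` are chosen here for clarity.

```python
import math

def sigmoid(x):
    # 'S'-shaped activation (eq. 1): maps any real input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def step(x):
    # Binary step activation (eq. 2): outputs 1 when input >= 0, else 0
    return 1 if x >= 0 else 0

def node_output(inputs, weights, bias, activation):
    # One ANN node: weighted sum of its inputs plus a bias, then activation
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return activation(z)

# A single node with two inputs: z = 0.8*0.5 + 0.2*(-1.0) + 0.1 = 0.3
y_sigmoid = node_output([0.5, -1.0], [0.8, 0.2], 0.1, sigmoid)
y_step = node_output([0.5, -1.0], [0.8, 0.2], 0.1, step)
```

Stacking many such nodes side by side forms a layer, and feeding each layer's outputs into the next layer reproduces the input-hidden-output structure of Figure 1.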
