In this tutorial, you'll learn how to estimate the age, gender, and emotions of a face using Face SDK. The result will be displayed next to the detected face. This tutorial assumes that you already have a project with face detection in a video stream or on an image. You can learn how to detect faces in a video stream in the tutorial Face Detection and Tracking in a Video Stream. In this tutorial, we estimate age, gender, and emotions on an image.
You can find the tutorial project in Face SDK: examples/tutorials/age_gender_and_emotions
- Specify the path to the image in the field `QImage image`. Set the parameters of the labels with information about gender, age, and emotions (text size and style).
- Set the indent `margin_from_rect` between the face bounding rectangle and the text with information about a face. Set the distance `text_element_position` between the elements "age group", "age", and "emotion". These elements are displayed to the right of a detected face, one below the other.
- Use the method `pbio::FacerecService::createAgeGenderEstimator` to create the `AgeGenderEstimator` object and estimate gender and age. When calling this method, specify the configuration file.
- Calculate the starting point `base_point` for displaying the information about gender, age, and emotions: get the face bounding rectangle from the `RawSample` object, which stores the face information, and calculate the point taking into account the indent `margin_from_rect`.
- Estimate the gender and age of a face `(*sample)` using the `pbio::AgeGenderEstimator` object. Display the age group based on the estimated age. Four age groups are available:
- Kid (under 18 years)
- Young (18-37 years)
- Adult (37-55 years)
- Senior (55 years and older)
The age group is taken from the enumeration `pbio::AgeGenderEstimator::Age`, and the result is stored in the `age_group_text` variable. Display the label with the age group using the method `painter.drawText`. The label will be on the first line to the right of the starting point `base_point`.
- Display the age in years on the second line (under the age group).
- Display the gender on the third line (under the age).
- Run the project. At this stage, you'll see the information about the age group, age, and gender to the right of the detected face.
- Use the method `pbio::FacerecService::createEmotionsEstimator` to create the `EmotionsEstimator` object and estimate emotions. When calling this method, specify the configuration file.
- Use the method `pbio::EmotionsEstimator::estimateEmotions` to estimate the emotions of a detected face `(*sample)` and get a confidence coefficient (from 0 to 1). The enumeration `pbio::EmotionsEstimator::Emotion` includes all available emotions. Face SDK estimates four emotions:
  - Neutral
  - Happy
  - Angry
  - Surprise

Each emotion is assigned an index from 0 to 3. In this project, we display emotions as four columns of different colors (blue, green, red, yellow) with the corresponding labels (Neutral, Happy, Angry, Surprise). If the confidence coefficient of an emotion is high, the column for this emotion is longer than the other three columns. This clearly shows which emotion prevails. All emotion parameters are stored in a dictionary.
- Set the starting point `emotions_base_point`, which we'll use to draw the columns with emotions (to the right of the face bounding rectangle, next to the labels). Also set the size of the columns with emotions and the offset `bar_offset` from the emotion labels.
- Display the labels and columns. In the loop, calculate the starting point `emotion_row_base_point` for displaying the information about each emotion and the starting point `text_base_point` for displaying the name of the emotion. Then, get the name of the emotion from the dictionary and display the label `emotion_label`. Display the column for each emotion: calculate the starting point of the column `bar_base_point` and the length of the column `bar_size`. To do this, multiply the value of `bar_base_size.width` by the confidence coefficient. Color the columns according to the colors from the dictionary.
- Run the project. To the right of the face bounding rectangle, you'll see the information about emotions (the length of each column visualizes the probability distribution over the described emotions).