In this tutorial, you'll learn how to estimate the age, gender, and emotions of a face using Face SDK. The result is displayed next to the detected face. The tutorial assumes that you already have a project with face detection in a video stream or in an image; you can learn how to detect faces in a video stream in the [Face Detection and Tracking in a Video Stream](/docs/tutorials/face_detection_and_tracking_in_a_video_stream) tutorial. In this tutorial, age, gender, and emotions are estimated on an image.
You can find the tutorial project in Face SDK: `examples/tutorials/age_gender_and_emotions`
- Specify the path to the image in the `QImage image` field. Set the parameters of the labels with information about gender, age, and emotions (text size and style).
- Set the `margin_from_rect` indent between the face bounding rectangle and the text with information about a face. Set the `text_element_position` distance between the "age group", "age", and "emotion" elements. These elements are displayed to the right of a detected face, one below the other.
- Use the `pbio::FacerecService::createAgeGenderEstimator` method to create the `AgeGenderEstimator` object, which estimates gender and age. When you call this method, specify the name of the configuration file.
- Calculate the `base_point` starting point for displaying the information about gender, age, and emotions: get the face bounding rectangle from the `RawSample` object, which stores the face information, and compute the point taking the `margin_from_rect` indent into account.
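This calculation can be sketched as follows, assuming a simple point/rectangle representation; the struct and helper names are illustrative stand-ins, not the SDK's or Qt's actual types:

```cpp
// Illustrative stand-ins for the Qt/Face SDK types used in the tutorial.
struct Point { int x; int y; };
struct Rectangle { int x; int y; int width; int height; };

// Place the text anchor to the right of the face bounding rectangle,
// shifted horizontally by the margin_from_rect indent set earlier.
Point computeBasePoint(const Rectangle& face_rect, int margin_from_rect) {
    return { face_rect.x + face_rect.width + margin_from_rect,
             face_rect.y };
}
```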
- Estimate the gender and age of a face using the `pbio::AgeGenderEstimator::estimateAgeGender` method. Display the age group based on the estimated age. Four age groups are available:
- Kid (under 18 years)
- Young (18-37 years)
- Adult (37-55 years)
- Senior (55 years and older)
The age group is taken from the `pbio::AgeGenderEstimator::Age` enumeration, and the result is stored in the `age_group_text` variable. Display the label with the age group using the `painter.drawText` method. The label is placed on the first line to the right of the `base_point` starting point.
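The grouping above can be sketched as a plain mapping from the estimated age in years. In the project itself the group comes directly from the `pbio::AgeGenderEstimator::Age` enumeration; this helper is only an illustration of the boundaries:

```cpp
#include <string>

// Map an age in years to one of the four age groups listed above.
// The function name is illustrative, not part of the SDK.
std::string ageGroupLabel(int age_years) {
    if (age_years < 18) return "Kid";
    if (age_years < 37) return "Young";
    if (age_years < 55) return "Adult";
    return "Senior";
}
```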
- Display the age in years on the second line (under the age group).
- Display the gender on the third line (under the age).
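The three stacked lines (age group, age, gender) can be positioned with a small helper that offsets each line by `text_element_position` below `base_point`; the struct and function names here are illustrative:

```cpp
struct Point { int x; int y; };

// Return the anchor for text line line_index (0 = age group, 1 = age,
// 2 = gender), each line text_element_position pixels below the previous.
Point textLinePosition(Point base_point, int text_element_position, int line_index) {
    return { base_point.x, base_point.y + line_index * text_element_position };
}
```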
- Run the project. You'll see the information about the age group, age, and gender to the right of the detected face.
- Use the `pbio::FacerecService::createEmotionsEstimator` method to create the `EmotionsEstimator` object, which estimates emotions. When you call this method, specify the name of the configuration file.
- Use the `pbio::EmotionsEstimator::estimateEmotions` method to estimate the emotions of a detected face `(*sample)` and get a confidence coefficient (from 0 to 1) for each emotion. The `pbio::EmotionsEstimator::Emotion` enumeration includes all available emotions. Face SDK estimates four emotions: Neutral, Happy, Angry, and Surprised.
Each emotion is assigned an index from 0 to 3. In this project, emotions are displayed as four columns of different colors (blue, green, red, yellow) with corresponding labels (Neutral, Happy, Angry, Surprised). If the confidence coefficient of an emotion is high, the column for this emotion is longer than the other three, which clearly shows which emotion prevails. All emotion parameters are stored in a dictionary.
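Such a dictionary can be sketched as follows, pairing each emotion index with its label and column color as described above; the struct and variable names are illustrative:

```cpp
#include <array>
#include <string>

// Illustrative per-emotion display parameters: label and bar color,
// indexed 0-3 in the order described above.
struct EmotionStyle { std::string label; std::string color; };

const std::array<EmotionStyle, 4> kEmotionStyles = {{
    {"Neutral",   "blue"},
    {"Happy",     "green"},
    {"Angry",     "red"},
    {"Surprised", "yellow"}
}};
```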
- Set the `emotions_base_point` starting point used to draw the emotion columns with their labels to the right of the face bounding rectangle. Also set the size of the emotion columns (`bar_base_size`) and the `bar_offset` indent from the emotion labels.
- Display the labels and the columns. In a loop, calculate the `emotion_row_base_point` starting point for displaying the information about each emotion and the `text_base_point` starting point for displaying the emotion name. Then get the emotion name from the dictionary and display the `emotion_label` label. Display the column for each emotion: calculate the `bar_base_point` starting point of the column and the `bar_size` length of the column. To calculate the length, multiply `bar_base_size.width` by the confidence coefficient. Color the columns according to the colors from the dictionary.
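The column-length calculation is a simple scaling of the base width by the confidence coefficient; a sketch with an illustrative helper name:

```cpp
// Scale the base column width by the confidence coefficient (0..1)
// to get the drawn column length in pixels.
int barLength(int bar_base_width, double confidence) {
    return static_cast<int>(bar_base_width * confidence);
}
```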
- Run the project. To the right of the face bounding rectangle, you'll see the information about emotions (the length of each column visualizes the probability distribution over the estimated emotions).