The demo directory contains the source code of the Face SDK VideoEngine JS sample, which demonstrates Face SDK features such as face detection and tracking, and active liveness estimation. To run the sample, follow the steps below. For details on how to integrate and use the Face SDK VideoEngine JS plugin, see the section Interface of the Sample.
- Software requirements
  - Node >= 10
  - [Optional] Yarn >= 1.22
To run the demo, follow the steps below:
- Plug in a webcam to your computer.
- In the directory with the sample, run the `yarn install` command in the console.
- After the project is initialized (you'll see a message about successful initialization in the console), run the `yarn start` command. After that, the browser will automatically open a page with the sample.
- In the browser pop-up window, select "Allow camera access".
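The camera-access prompt in the last step appears when the page requests the webcam stream. A minimal sketch of how a browser page typically asks for it, using the standard `getUserMedia` Web API (this is generic browser code, not the plugin's internal implementation; the guard makes the sketch safe outside a browser):

```javascript
// Request webcam access; the browser shows the "Allow camera access"
// prompt the first time this runs on a page.
async function openCamera() {
  // Guard so the sketch does not crash outside a browser (e.g. under Node).
  if (typeof navigator === "undefined" || !navigator.mediaDevices) {
    return null;
  }
  // Resolves with a MediaStream once the user allows access.
  return navigator.mediaDevices.getUserMedia({ video: true, audio: false });
}

openCamera().then((stream) => {
  if (stream) {
    // In the demo, the stream is shown in the Stream pane on the left.
    console.log("camera stream opened");
  } else {
    console.log("no browser media API available in this environment");
  }
});
```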
This section provides information on available features of the Face SDK VideoEngine JS demo that demonstrates face detection and tracking, as well as active liveness estimation.
- The left side of the screen displays the video stream from the webcam (Stream).
- To start the Active Liveness check, press the Start button. During the processing, a progress bar and the phrase "Wait, detection is in progress ..." are displayed above the window with the video stream. If at least one blink is made and the head is turned at least once during processing, the face is considered to belong to a real person.
- The best shot of the face is displayed to the right of the video stream. If the face is not positioned correctly, the browser will display hints during processing. For example, "Please turn your head up", "Please come closer to the camera", etc.
- To stop the Active Liveness check, press the Stop button.
- To clear the received data, press the Clear button.
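The pass criterion described above (at least one blink and at least one head turn during processing) can be sketched as a simple check. The event labels and function name here are illustrative, not the plugin's actual API:

```javascript
// Decide whether an Active Liveness session passed, given a hypothetical
// log of observed events. "blink" and "head_turn" are illustrative labels,
// not real SDK constants.
function isRealPerson(events) {
  const blinks = events.filter((e) => e === "blink").length;
  const headTurns = events.filter((e) => e === "head_turn").length;
  // The demo's rule: at least one blink AND at least one head turn.
  return blinks >= 1 && headTurns >= 1;
}

console.log(isRealPerson(["blink", "head_turn", "blink"])); // true
console.log(isRealPerson(["blink"]));                       // false
```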