Head Tracking and Face and Emotion Recognition to control a WebVR Game

Awwwards · Published in Awwwards Magazine · Aug 9, 2016 · 3 min read

Take a look inside the new series of eBooks for Digital Creatives, Volume 1: Space Lamb by 12 Wave.

With the help of a webcam, it is possible to provide a new, emotional way of interacting with websites and games. In this experiment, we decided to combine JSARToolKit and WebVR. For WebVR, you can use your mobile phone with a headset (Google Cardboard or equivalent) to look at the lamb from different angles in 3D, moving away from it or getting closer.

The Use of Camera in the SpaceLamb Game

The webcam can be used to recognize a human face, including its outline, eyebrows, eyes, nose and lips. It is also possible to recognize hands and the various markers used for augmented reality. We implemented all of these control methods in a game about a lamb flying home through outer space, dodging asteroids.

The monitor acts as a porthole for the player, who controls the flight with their head. For this we used the Headtrackr library, which can detect the head's position and whether it is moving closer to or further from the camera. It works as follows: the video from the camera is streamed to the page, rendered frame by frame into an image, and each frame is passed to the library for processing. The head position is determined relative to this image, with the origin in the upper-left corner of the video frame. This lets us track how far the head has moved from the centre of the camera view and use that offset to steer the lamb.
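The steering step described above can be sketched as a small helper. This is not the game's actual code: it is a minimal illustration assuming a Headtrackr-style tracker that reports the head position in pixels with the origin in the upper-left corner of the frame.

```javascript
// Map a head position (in video-frame pixels, origin at the top-left,
// as Headtrackr reports it) to a steering vector for the lamb,
// measured relative to the centre of the frame.
function headToSteering(headX, headY, frameWidth, frameHeight) {
  // Offset from the frame centre, normalised to the range [-1, 1].
  const dx = (headX - frameWidth / 2) / (frameWidth / 2);
  const dy = (headY - frameHeight / 2) / (frameHeight / 2);
  // Clamp so an extreme or noisy detection cannot over-steer.
  const clamp = (v) => Math.max(-1, Math.min(1, v));
  return { x: clamp(dx), y: clamp(dy) };
}
```

A head sitting exactly at the centre of the frame yields zero steering, so the lamb flies straight; leaning toward an edge of the frame steers it proportionally in that direction.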

Watch face tracking in action in this video

Face & Emotion Recognition by the Camera

The author of Headtrackr has another library for webcam interaction: the multipurpose library clmtrackr. You can use it to recognize human facial features: eyes, nose, lips, eyebrows and the face contour. With it, we can tell whether a person is smiling, has turned their head to the side or has bent it down. You can see how it works in the following experiment: go.12wave.com/1Tza66Y

Turn your head in different directions, open your mouth, smile. The lamb will copy your head movements, turning or tilting its head along with you. If you open your mouth, the lamb opens its mouth too. And the wider you smile, the higher the lamb raises its ears. All of this is driven by the 70 facial points we get from the camera data.
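The mouth-open check can be illustrated with a small geometric helper. This is a hypothetical sketch, not the game's code: which indices of the clmtrackr point array correspond to the lips and nose depends on the face model, so the relevant points are passed in explicitly here rather than hard-coding index numbers.

```javascript
// Decide whether the mouth is open by comparing the gap between the
// inner lip points to the overall face size (nose-to-chin distance),
// so the check is independent of how close the face is to the camera.
// Each point is an {x, y} object taken from the tracked facial points.
function isMouthOpen(upperLip, lowerLip, noseTip, chin) {
  const lipGap = Math.hypot(lowerLip.x - upperLip.x, lowerLip.y - upperLip.y);
  const faceScale = Math.hypot(chin.x - noseTip.x, chin.y - noseTip.y);
  // Treat a gap larger than 25% of the nose-to-chin distance as "open";
  // the threshold is a tuning parameter, not a value from the article.
  return lipGap > 0.25 * faceScale;
}
```

The same distance-ratio idea extends to the smile: the further apart the mouth corners are relative to the face width, the higher the lamb raises its ears.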

This library provides 70 points of the face, from which we determine what a person is doing. In this experiment we took a simple approach. For the head turn, we compared the horizontal distances between the outer points of the face and the middle point (the nose): if the nose is closer to one side of the face than the other, the person has turned their head to that side, and the lamb turns its head the same way. For the head tilt, we used the same outer points of the face together with the nose point and checked the vertical deviation of the nose, which tells us whether the head is bent forward or backward. The library can also classify several emotions: joy, anger, surprise and sadness, each scored from 0 to 1. With this data we can measure the degree of a person's happiness.
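The turn and tilt checks above reduce to a few distance comparisons. The sketch below is illustrative rather than the game's actual implementation; as before, the outline and nose points are passed in explicitly instead of assuming particular indices in the clmtrackr point array.

```javascript
// Head turn: compare the horizontal distance from the nose point to the
// left and right outline points of the face. When the head turns, the
// near side of the face shortens and the far side lengthens.
// Returns a value in (-1, 1): negative = turned left, positive = right.
function headTurn(leftOutline, rightOutline, nose) {
  const toLeft = Math.abs(nose.x - leftOutline.x);
  const toRight = Math.abs(rightOutline.x - nose.x);
  return (toLeft - toRight) / (toLeft + toRight);
}

// Head tilt: vertical deviation of the nose point from the midpoint of
// the two outline points. A positive value means the nose has dropped
// below the midpoint, i.e. the head is bent forward.
function headTilt(leftOutline, rightOutline, nose) {
  const midY = (leftOutline.y + rightOutline.y) / 2;
  return nose.y - midY;
}
```

Feeding these values to the lamb each frame makes it mirror the player: a negative `headTurn` result turns its head left, and a positive `headTilt` result bends it forward.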

Game available at: spacelamb.12wave.com

Brain Food! for Digital Creatives: Case Studies Vol. 1

Offering in-depth knowledge on themes such as Gamification, Animation, Face and Emotion Recognition, VR filming, and Interactive Experiments, Vol.1 of the new media rich series can be viewed on your tablet with EPUB or downloaded as a PDF and features case studies on iconic digital projects by Superhero Cheesecake, Resn, 12 Wave, Exzeb and DPDK.

Download Vol.1 of Brain Food! for Digital Creatives for free here.

