FaceAR Glossary

AR Technologies

FRX (Face tracking)

Technology that detects and tracks a human face in a digital video frame to enable FaceAR camera experiences.

Background separation

A neural network that separates the user in the foreground from the background in a sequence of video frames, so the background can be removed or replaced with another graphical image.
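
As an illustration, here is a minimal sketch of how such a segmentation mask is typically consumed: compositing the person over a new background. The function name and the soft per-pixel mask input are assumptions for illustration, not the SDK API.

```python
import numpy as np

def replace_background(frame, background, mask):
    """Composite the segmented person over a new background.

    frame, background: HxWx3 uint8 images of the same size.
    mask: HxW float array in [0, 1], 1.0 where the person is
    (the soft output of a segmentation network).
    """
    alpha = mask[..., None]  # broadcast the mask over the color channels
    out = (alpha * frame.astype(np.float32)
           + (1.0 - alpha) * background.astype(np.float32))
    return out.astype(np.uint8)
```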

Occlusion

A face occlusion detection neural network that keeps FRX performing under facial occlusions such as sunglasses, a scarf, or a mask.

Skin segmentation

A neural network employed in face filters, beautification, and virtual makeovers to segment the skin and allow for its modification, such as changing its color and tone.

Hair Segmentation

A neural network that detects and segments the image into hair, face, and background, allowing real-time hair modification such as changing the hair color.
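
For illustration, a naive recoloring sketch that tints only the masked hair region; production recoloring normally works in a luminance-preserving color space, and all names here are hypothetical:

```python
import numpy as np

def recolor_hair(frame, hair_mask, target_rgb, opacity=0.5):
    """Blend a target color into the hair region only.

    frame: HxWx3 uint8 image; hair_mask: HxW float in [0, 1];
    target_rgb: (r, g, b) tuple; opacity controls the tint strength.
    """
    tint = np.array(target_rgb, dtype=np.float32)
    alpha = (hair_mask * opacity)[..., None]
    out = (1.0 - alpha) * frame.astype(np.float32) + alpha * tint
    return out.astype(np.uint8)
```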

Eye segmentation

A neural network that detects and segments the eye into iris, pupil, and eyeball to allow for real-time eye modification, such as recoloring, size adjustment, and the application of eyeshadow and eyelashes.

Lips segmentation

A neural network that detects and segments the lips on the user's face for real-time lip recoloring and lip resizing in virtual makeup applications and beauty face filters.

Full body segmentation

Full-length human body segmentation in images and videos. The core of the background separation technology is a convolutional neural network that returns a binary output, tagging each image region as either human or background.
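
A minimal sketch of the binarization step described above, assuming the network outputs per-pixel logits (the exact output format of the real network is an assumption):

```python
import numpy as np

def binarize(logits, threshold=0.5):
    """Turn per-pixel network scores into the binary
    human/background tags described above."""
    probs = 1.0 / (1.0 + np.exp(-logits))  # sigmoid
    return probs > threshold               # True = human, False = background
```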

Acne removal (tap)

An algorithm that removes acne from photos with a single tap on the selected area.

Acne removal (auto)

An algorithm that automatically removes acne in photos.

Eye bags removal

The algorithm segments and lightens the area under the eyes in photos for beautification purposes.

Hair strands painting

Algorithms that change the hair color with several colors applied simultaneously, e.g. for highlighting or coloring individual strands.

Features

Ruler

Real-time estimation of the distance between the user's face and the device camera. Normally used to give the user feedback when the device is held too far from or too close to the face.
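
One common way to estimate this distance is the pinhole-camera relation between a known facial measurement, such as the interpupillary distance, and its size in pixels. A sketch under that assumption; it is not necessarily the SDK's actual method:

```python
def face_distance_mm(focal_length_px, ipd_px, real_ipd_mm=63.0):
    """Pinhole-camera estimate of the face-to-camera distance.

    focal_length_px: camera focal length in pixels (from the intrinsics);
    ipd_px: measured distance between the pupils in the image, in pixels;
    real_ipd_mm: assumed real interpupillary distance (~63 mm on average).
    """
    return focal_length_px * real_ipd_mm / ipd_px
```

The returned distance can then be compared against near/far thresholds to prompt the user to move the device.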

Eye-gaze direction

An algorithm that detects micro-movements of the eye and its regions with subpixel accuracy in real time. From this data a movement vector can be derived. The technology makes it possible not only to track a person's gaze but also to control a smartphone's functions with it.
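
As a rough illustration of turning eye landmarks into such a vector (a simplification of the actual algorithm; the landmark inputs are hypothetical):

```python
import numpy as np

def gaze_vector(pupil, inner_corner, outer_corner):
    """Normalized 2D offset of the pupil within the eye opening.

    All inputs are (x, y) landmark positions in pixels. The pupil's
    offset from the eye center, scaled by the eye width, gives a rough
    left/right and up/down gaze direction.
    """
    pupil = np.asarray(pupil, dtype=np.float32)
    inner = np.asarray(inner_corner, dtype=np.float32)
    outer = np.asarray(outer_corner, dtype=np.float32)
    center = (inner + outer) / 2.0
    eye_width = np.linalg.norm(outer - inner)
    return (pupil - center) / eye_width
```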

2+ faces detection

An algorithm that applies masks to several people simultaneously for more engaging group AR experiences. For a quality user experience, we generally don't recommend supporting more than three faces on mobile devices due to their limited computing capabilities.

Pulse (Heart rate)

An algorithm that analyses fine patterns in facial areas and their color variations over time to detect the pulse frequency in real time.
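
A minimal sketch of the underlying idea, known as remote photoplethysmography (rPPG): track the mean green-channel value of a facial region over time and find the dominant frequency. The production algorithm is more robust; this is only an illustration:

```python
import numpy as np

def estimate_bpm(green_means, fps):
    """Estimate heart rate from the mean green-channel value of a
    facial region, sampled once per frame.

    green_means: 1D array with one sample per frame; fps: capture rate.
    """
    signal = green_means - np.mean(green_means)        # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)  # in Hz
    band = (freqs >= 0.7) & (freqs <= 4.0)             # 42-240 BPM range
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0                              # Hz -> beats per minute
```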

Mask on a picture from Camera Roll

Mask application on pre-recorded images the user uploads from the Camera Roll.

Mask on video from Camera Roll

Mask application on pre-recorded videos the user uploads from the Camera Roll.

Post-processing effects

Graphical camera effects and animations applied on pre-recorded videos.

Continuous photo editing

An AR effect processed on the image in real time, e.g. a beautification slider that controls the degree of face modification, or a "Before/After" slider.
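
A minimal sketch of how such a slider can be implemented by blending the original and the fully processed image (an illustration, not the SDK mechanism):

```python
import numpy as np

def apply_slider(original, processed, t):
    """Blend the original and fully processed images by the slider
    value t in [0, 1], so the effect strength updates continuously."""
    return ((1.0 - t) * original.astype(np.float32)
            + t * processed.astype(np.float32)).astype(np.uint8)
```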

Touches

Small AR scenarios enabled through the user's touches on the screen. AR objects or camera effects can change color and behaviour. Applied in FaceAR games or interactive face filters to increase engagement.

Trigger

Small AR scenarios enabled through user facial expressions. The user can interact with effects or invoke them by opening the mouth, smiling, raising eyebrows, or frowning. Applied in FaceAR games or interactive face filters to increase engagement.
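
For illustration, a typical mouth-open trigger can be approximated from face landmarks; the landmark inputs and the threshold value are assumptions:

```python
import math

def mouth_open(upper_lip, lower_lip, left_corner, right_corner,
               threshold=0.35):
    """Fire the trigger when the mouth opening exceeds a fraction of
    the mouth width; inputs are (x, y) landmark positions in pixels."""
    opening = math.dist(upper_lip, lower_lip)
    width = math.dist(left_corner, right_corner)
    return opening / width > threshold
```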

SFX

Sound effects support in FaceAR experiences, e.g. adding music to filters.

Voice changer

Third-party software that changes the tone of the user's voice, e.g. making the user sound like a robot, a kid, or a male/female voice.

Glasses detection

An algorithm that detects whether the user wears glasses and removes them in the virtual mask. Applied in glasses try-on for convenient frame selection, and in face filters so that AR glasses do not overlay the real ones.

Graphical technologies

Morphing

Changing the size and proportions of the face, e.g. slimming down the cheeks or nose, or modifying them for fun masks.
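
As an illustration, a simplified slimming warp that pulls mesh vertices toward a region center; real morphing operates on the full face mesh with a smooth falloff, which this sketch omits:

```python
import numpy as np

def slim_region(points, center, strength=0.15):
    """Pull mesh vertices toward the region's center to slim it down;
    strength is the fraction of each vertex's distance that is removed.

    points: Nx2 (or Nx3) array of vertex positions; center: matching vector.
    """
    points = np.asarray(points, dtype=np.float32)
    center = np.asarray(center, dtype=np.float32)
    return center + (points - center) * (1.0 - strength)
```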

Skinned mesh animation

AR models do not look static: they move, animate, and transform.

Physically-based rendering

AR models behave like real objects under real-world light and physics, e.g. they respond to gravity or reflect light as the camera rotates and the user tilts the device.

LUT post-processing

Real-time or offline color correction of pre-recorded images, e.g. Instagram-like filters.
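
A minimal sketch of applying a 3D LUT with nearest-neighbor sampling; production LUTs are usually applied on the GPU with trilinear interpolation, and the LUT axis order here is an assumption:

```python
import numpy as np

def apply_lut(image, lut):
    """Apply a 3D color lookup table to an image.

    image: HxWx3 uint8; lut: NxNxNx3 array mapping input RGB
    (indexed R, G, B along the axes here) to output RGB.
    """
    n = lut.shape[0]
    idx = (image.astype(np.int32) * (n - 1)) // 255  # quantize to the LUT grid
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]
```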

Texture sequences

A sequence of digital textures representing the surface of an AR object, providing a sophisticated, life-like look.

Video textures

Infusing a static image with dynamic qualities and explicit motion to achieve an enhanced look and feel of the FaceAR video experience.

Face Beautification

Face retouch and modification features, including skin smoothing, morphing, teeth whitening, eye modification, etc.

Action units

The fundamental actions of individual facial muscles or muscle groups that enable an AR mask to follow the user's facial expressions, e.g. in emojis, avatars, or full-face AR masks.
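
For illustration, action-unit activations are commonly mapped onto blendshape weights and summed over a neutral mesh; a sketch under that assumption, with hypothetical names:

```python
import numpy as np

def apply_action_units(base_mesh, blendshapes, weights):
    """Deform a neutral face mesh by a weighted sum of blendshape
    deltas, one per action unit.

    base_mesh: Nx3 vertex array; blendshapes: dict name -> Nx3 delta;
    weights: dict name -> activation in [0, 1] from the face tracker.
    """
    mesh = base_mesh.astype(np.float32).copy()
    for name, w in weights.items():
        mesh += w * blendshapes[name]
    return mesh
```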
