
How to set Triggers in face filters

Triggers are small AR scenarios enabled through user facial expressions. The user can interact with effects or call them by opening the mouth, smiling, raising the eyebrows, or frowning. You can use triggers to create Face AR games or interactive face filters that increase engagement.

Please refer to the list of triggers Face AR SDK supports to see which facial expressions can animate AR models.

Native platform triggers

Banuba SDK has two types of triggers: native platform triggers and face state triggers. Native platform triggers include touch action triggers and gesture action triggers. To use a native platform trigger, declare a global function with the trigger's name in the config.js file. The Banuba SDK JS API has 4 touch triggers:

  • onTouchesBegan - called when the user taps the screen
  • onTouchesEnded - called when the user removes the finger from the screen
  • onTouchesMoved - called when the user moves the finger over the screen
  • onTouchesCancelled - called when the native platform cancels the touch (on iOS this happens when the touch count is greater than 5)

All touch triggers receive an array of maps with the touch ID and the touch position in NDC (normalized device coordinates). In JS, the argument looks like: [{id: 0, x: 0.5, y: 0.5}]

Example:

function onTouchesBegan(touches) {
    for (var i = 0; i < touches.length; i++) {
        Api.print("TOUCH WITH ID: " + touches[i].id + " ON POSITION X: " + touches[i].x + " Y: " + touches[i].y + " BEGAN");
    }
}
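
The remaining touch triggers can be declared the same way and receive the same array of touches. A minimal sketch, where the handler bodies and log messages are only illustrative:

function onTouchesEnded(touches) {
    for (var i = 0; i < touches.length; i++) {
        Api.print("TOUCH WITH ID: " + touches[i].id + " ENDED AT X: " + touches[i].x + " Y: " + touches[i].y);
    }
}

function onTouchesMoved(touches) {
    // called while the finger moves over the screen
    Api.print("TOUCHES MOVED, COUNT: " + touches.length);
}

function onTouchesCancelled(touches) {
    // the native platform cancelled the touch sequence
    Api.print("TOUCHES CANCELLED, COUNT: " + touches.length);
}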

The Banuba SDK JS API also has 6 gesture triggers:

  1. onRotationGesture - receives the angle in degrees.

    Example:

    function onRotationGesture(angle) {
    Api.print("ROTATION RECOGNIZED. CURRENT ANGLE: " + angle);
    }
  2. onScaleGesture - receives the current scale coefficient.

    Example:

    function onScaleGesture(scale) {
    Api.print("PINCH RECOGNIZED. CURRENT SCALE: " + scale);
    }

  3. onSwipeGesture - receives 2 arguments: the swipe direction on the x and y axes.

  • If the user swipes left, x will be equal to -1, otherwise 1.

  • If the user swipes down, y will be equal to -1, otherwise 1.

    Example:

    function onSwipeGesture(dirX, dirY) {
    Api.print("SWIPE RECOGNIZED. CURRENT DIRECTION X: " + dirX + " CURRENT DIRECTION Y: " + dirY );
    }
  4. onDoubleTapGesture - receives 2 arguments: x, y position of the tap in NDC.

    Example:

    function onDoubleTapGesture(x, y) {
    Api.print("DOUBLE TAP RECOGNIZED. X: " + x + " Y: " + y );
    }
  5. onLongTapGesture - receives the same 2 arguments as onDoubleTapGesture (see the sketch after this list).

  6. onGestureEnded - receives a string with the name of the ended gesture (see the sketch after this list):

  • "Swipe" - when onSwipeGesture is ended
  • "LongTap" - when onLongTapGesture is ended
  • "DoubleTap" - when onDoubleTapGesture is ended
  • "Scale" - when onScaleGesture is ended
  • "Rotation" - when onRotationGesture is ended

Banuba SDK calls these functions automatically when the native platform recognizes the corresponding touches and gestures.

Test Effect: Bulb (with gestures)

  • onScaleGesture: bulbs can be increased and decreased.
  • onRotationGesture: bulbs can be rotated around the Z axis.
  • onSwipeGesture: swipe right or up to turn the lights on, swipe left or down to turn them off, depending on which ones are currently lit. Swiping up or right when all lights are lit turns them all off.
  • onDoubleTapGesture: bulbs turn on one by one. Once all are on, they turn off.
  • onLongTapGesture: turns on and off all lights.
Download Bulb_gestures effect
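
As a rough illustration of how an effect like this can wire gesture triggers to effect state, the sketch below keeps a counter of lit bulbs in config.js. The bulbCount variable and the Api.print calls standing in for the actual scene updates are assumptions for illustration, not part of the Bulb effect:

var bulbCount = 3; // assumed number of bulbs in the scene
var litBulbs = 0;  // how many bulbs are currently on

function onDoubleTapGesture(x, y) {
    // turn the bulbs on one by one; once all are on, turn them off
    litBulbs = litBulbs < bulbCount ? litBulbs + 1 : 0;
    Api.print("LIT BULBS: " + litBulbs);
}

function onLongTapGesture(x, y) {
    // toggle all lights at once
    litBulbs = litBulbs > 0 ? 0 : bulbCount;
    Api.print("LIT BULBS: " + litBulbs);
}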

Face State Triggers

Banuba SDK has several face state triggers. To use them, call the following functions:

  • isMouthOpen - returns true when the mouth is open, otherwise returns false.
  • isSmile - returns true when the user smiles, otherwise returns false.
  • isEyebrowUp - returns false when brows are down, otherwise returns true.
  • isDisgust - returns false when brows are up, otherwise returns true.
  • hasGlasses - returns true if the user wears glasses, otherwise returns false (to make it work, enable the recognizer’s “glasses” feature in config.json).
  • getEyesStatus - returns an array of two booleans: the first element is the left eye state, the second is the right eye state. If an eye is closed, its state is false, otherwise true.

Example:

function Effect() {
    this.faceActions = [function() {
        Api.print(isMouthOpen()); // prints true at every frame when a face is recognized and the mouth is open, and false when the mouth is closed
    }];
    this.noFaceActions = [];
    this.videoRecordStartActions = [];
    this.videoRecordDiscardActions = [];
    this.videoRecordFinishActions = [];
}
configure(new Effect());
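
The other face state triggers are polled the same way inside faceActions. A minimal sketch using getEyesStatus and isSmile, with illustrative log messages:

function Effect() {
    this.faceActions = [function() {
        var eyes = getEyesStatus(); // [left eye open, right eye open]
        if (!eyes[0] && !eyes[1]) {
            Api.print("BOTH EYES CLOSED");
        }
        if (isSmile()) {
            Api.print("USER IS SMILING");
        }
    }];
    this.noFaceActions = [];
    this.videoRecordStartActions = [];
    this.videoRecordDiscardActions = [];
    this.videoRecordFinishActions = [];
}
configure(new Effect());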

Test Effects: Bulb (with face triggers)

Each trigger has its own effect:

  • bulb_smile: turn on/off the bulb with a smile
  • bulb_brows_down: on/off the bulb with brows frowned/returned to normal
  • bulb_brows_up: on/off the bulb raising your eyebrows
  • bulb_eyes_state: on/off the left and right bulbs with eyes opened/closed (beta version)
  • bulb_has_glasses: turn on the bulb if glasses are detected on the face and turn it off if they are missing
  • bulb_mouth_open: on/off the bulb with mouth opened/closed
Download Archive with 6 separate face trigger effects
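
Because face state triggers report the state on every frame, a toggle such as bulb_smile typically reacts to the moment the state changes rather than to the state itself. A minimal sketch of that edge detection, assuming the same Effect structure as above; the bulbOn flag and the log message are illustrative:

var wasSmiling = false;
var bulbOn = false;

function Effect() {
    this.faceActions = [function() {
        var smiling = isSmile();
        // toggle only on the transition from "not smiling" to "smiling"
        if (smiling && !wasSmiling) {
            bulbOn = !bulbOn;
            Api.print("BULB IS NOW " + (bulbOn ? "ON" : "OFF"));
        }
        wasSmiling = smiling;
    }];
    this.noFaceActions = [];
    this.videoRecordStartActions = [];
    this.videoRecordDiscardActions = [];
    this.videoRecordFinishActions = [];
}
configure(new Effect());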