Examples of using Banuba SDK
To start working with the Banuba SDK examples, you need a client token. To receive one, please fill in the form on banuba.com or contact us at info@banuba.com.
Our GitHub
Visit our GitHub page to see all available examples.
- iOS
- Android
- Web
- Flutter
- React Native
- macOS
- Desktop
iOS samples (Swift)
This repository contains basic samples showing how to use the Player API.
https://github.com/Banuba/banuba-sdk-ios-samples
Agora plugin example (Swift)
This example shows how to use Banuba SDK as an Agora plugin for a video call between two devices.
https://github.com/Banuba/agora-plugin-filters-ios
OpenTok example (Swift)
This example demonstrates the use of the Banuba SDK in conjunction with the OpenTok SDK.
https://github.com/Banuba/opentok-ios-swift
WebRTC example (Objective-C)
This example demonstrates how to use Banuba SDK in conjunction with WebRTC.
https://github.com/Banuba/quickstart_webrtc_ios
Beauty example (Swift)
This example demonstrates how to correctly use the Makeup effect.
This example uses SPM
https://github.com/Banuba/beauty-ios
ARCloud example (Swift)
This example shows how to dynamically load effects from the network.
https://github.com/Banuba/arcloud-ios-swift
Quickstart example CPP (Objective-C and C++)
This example shows how to use the C++ EffectPlayer.
https://github.com/Banuba/quickstart-ios-cpp
Quickstart example (Objective-C)
This example shows how to use the Banuba SDK in an Objective-C application.
https://github.com/Banuba/quickstart-ios-objc
"Banuba Technologies" sample app (Swift)
A demo application showcasing almost all of our technologies.
Requirements
- Latest stable Android Studio
- Latest Gradle plugin
- Latest NDK
Banuba SDK examples (Kotlin)
This example shows how to use the Player API for various tasks. It uses the Player, CameraInput, SurfaceOutput, and VideoOutput classes.
https://github.com/Banuba/banuba-sdk-android-samples
Agora plugin example (Kotlin)
An example of using Banuba SDK as an Agora plugin for a video call between two devices.
https://github.com/Banuba/agora-plugin-filters-android
OpenTok example (Java)
This example demonstrates the use of the Banuba SDK in conjunction with the OpenTok SDK.
https://github.com/Banuba/opentok-android-java
WebRTC example (Kotlin)
This example demonstrates how to use the Banuba SDK in conjunction with WebRTC. It uses the WebRTC camera and, after processing, draws the frame onto the surface via WebRTC. This example is based on the Player API.
https://github.com/Banuba/quickstart_webrtc_android
Beauty example (Java)
This example demonstrates how to correctly use the Makeup effect and how to call Makeup API scripts. This example is based on the Player API.
https://github.com/Banuba/beauty-android-java
Beauty example (Kotlin)
This example demonstrates how to correctly use the Makeup effect and how to call Makeup API scripts.
https://github.com/Banuba/beauty-android-kotlin
ARCloud example (Kotlin)
This example shows how to dynamically load effects from the network. This example is based on the Player API.
https://github.com/Banuba/arcloud-android-kotlin
Quickstart CPP (Java and C++)
This example shows how to use the C++ EffectPlayer on Android.
https://github.com/Banuba/quickstart-android-cpp
"Banuba SDK" sample app (Kotlin)
A demo application showcasing almost all of our technologies.
Quickstart
Beauty
Vue
Angular
React
import React, { useEffect } from "react"
import data from "@banuba/webar/BanubaSDK.data"
import wasm from "@banuba/webar/BanubaSDK.wasm"
import simd from "@banuba/webar/BanubaSDK.simd.wasm"
import FaceTracker from "@banuba/webar/face_tracker.zip"
import Glasses from "/path/to/Glasses.zip"
import { Webcam, Player, Module, Effect, Dom } from "@banuba/webar"
export default function WebARComponent() {
// componentDidMount
useEffect(() => {
let webcam
Player.create({
clientToken: "xxx-xxx-xxx",
locateFile: {
"BanubaSDK.data": data,
"BanubaSDK.wasm": wasm,
"BanubaSDK.simd.wasm": simd,
},
}).then(async (player) => {
await player.addModule(new Module(FaceTracker))
await player.use(webcam = new Webcam())
player.applyEffect(new Effect(Glasses))
Dom.render(player, "#webar")
})
// componentWillUnmount
return () => {
webcam?.stop()
Dom.unmount("#webar")
}
}, [])
return <div id="webar"></div>
}
See Bundlers for notes about specific bundlers and locateFile usage.
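For instance, with webpack 5 one possible setup is to emit the imported .wasm, .data and effect .zip files as separate assets whose URLs can then be fed to locateFile. A sketch of one such configuration (an assumption about your build setup, not the only option):

```javascript
// webpack.config.js (fragment)
module.exports = {
  module: {
    rules: [
      {
        // Emit SDK binaries and effect archives as separate files,
        // so their emitted URLs can be passed to locateFile
        test: /\.(wasm|data|zip)$/i,
        type: "asset/resource",
      },
    ],
  },
}
```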
Agora
OpenTok (TokBox)
Customizing video source
You can easily modify the video of the built-in Webcam module by passing parameters to it.
Switching from front to back camera
For example, you can use the back camera of the device by passing the facingMode parameter:
// ...
// The default facingMode value is "user" which means front camera
// The "environment" value here means back camera
await player.use(new Webcam({ facingMode: "environment" }))
// ...
Rendering WebAR video in full screen on mobile
Simply add the following CSS to force the WebAR video to fill the viewport:
<style>
/* The simplest way to force Banuba WebAR canvas to fill viewport */
#webar > canvas {
width: 100vw;
height: 100vh;
object-fit: cover;
}
</style>
<script type="module">
import { Webcam, Player, Dom } from "https://cdn.jsdelivr.net/npm/@banuba/webar/dist/BanubaSDK.browser.esm.js"
Player.create({ clientToken: "xxx-xxx-xxx" })
.then(async (player) => {
await player.use(new Webcam())
Dom.render(player, "#webar")
})
</script>
If you decide to specify the exact width and height for the Webcam, note that on some mobile devices/operating systems the webcam video width and height may be flipped. It's a known platform-specific webcam bug. To work around it, swap the width and height values:
const desiredWidth = 360
const desiredHeight = 540
await player.use(new Webcam({
width: desiredHeight,
height: desiredWidth,
}))
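If you prefer to handle this dynamically rather than hard-coding the swap, one option is to compare the dimensions actually delivered (e.g. read from MediaStreamTrack.getSettings()) with the dimensions you requested, and swap only on an orientation mismatch. A minimal sketch; swapIfFlipped is a hypothetical helper of our own, not part of the SDK:

```javascript
// Hypothetical helper (not an SDK API): given the dimensions actually
// delivered by the webcam and the dimensions that were requested,
// return [width, height], swapped when the orientations do not match.
function swapIfFlipped(actualWidth, actualHeight, desiredWidth, desiredHeight) {
  const actualPortrait = actualHeight > actualWidth
  const desiredPortrait = desiredHeight > desiredWidth
  return actualPortrait === desiredPortrait
    ? [desiredWidth, desiredHeight]
    : [desiredHeight, desiredWidth]
}

// e.g. a device delivering landscape frames when portrait was requested:
// swapIfFlipped(540, 360, 360, 540) yields [540, 360]
```

The resulting pair could then be passed to the Webcam constructor as width and height.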
Also, you may want to check out the Video cropping section for more advanced scenarios.
External MediaStream
If the built-in Webcam does not fit your needs, you can use a custom MediaStream with the Player:
import { MediaStream /* ... */ } from "@banuba/webar"
// ...
/* process video from the camera */
const camera = await navigator.mediaDevices.getUserMedia({ audio: true, video: true })
await player.use(new MediaStream(camera))
/* or even from another canvas */
const canvas = $("canvas").captureStream()
await player.use(new MediaStream(canvas))
// ...
See MediaStream docs for more details.
Capturing processed video
You can easily capture the processed video: take screenshots, record video, or pass the captured video to a WebRTC connection.
Screenshot
import { ImageCapture /* ... */ } from "@banuba/webar"
// ...
const capture = new ImageCapture(player)
const photo = await capture.takePhoto()
// ...
See ImageCapture.takePhoto() docs for more details.
Video
import { VideoRecorder /* ... */ } from "@banuba/webar"
// ...
const recorder = new VideoRecorder(player)
recorder.start()
await new Promise((r) => setTimeout(r, 5000)) // wait for 5 sec
const video = await recorder.stop()
// ...
See VideoRecorder docs for more details.
MediaStream
import { MediaStreamCapture /* ... */ } from "@banuba/webar"
// ...
// the capture is an instance of window.MediaStream
const capture = new MediaStreamCapture(player)
// so it can be used as a video source
$("video").srcObject = capture
// or can be added to a WebRTC peer connection
const connection = new RTCPeerConnection()
connection.addTrack(capture.getVideoTrack())
// ...
See MediaStreamCapture docs for more details.
Video cropping
You can adjust video frame dimensions via Webcam constructor parameters:
const wcam = new Webcam({ width: 320, height: 240 })
But this approach is platform-dependent and varies between browsers: some browsers may be unable to produce frames of the requested dimensions and can yield frames of close but different dimensions instead (e.g. 352x288 instead of the requested 320x240).
To work around these platform-specific limitations, you can leverage the built-in SDK crop modifier:
const desiredWidth = 320
const desiredHeight = 240
function crop(renderWidth, renderHeight) {
const dx = (renderWidth - desiredWidth) / 2
const dy = (renderHeight - desiredHeight) / 2
return [dx, dy, desiredWidth, desiredHeight]
}
await player.use(webcam, { crop })
This way you can get the desired frame size regardless of the platform used.
See Player.use() and InputOptions docs for more details.
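The same center-crop logic can also be wrapped in a small factory so the target size is passed in only once. A sketch; makeCenterCrop is a name of our own, not an SDK API:

```javascript
// Build a crop callback that centers a fixed-size target region
// inside whatever frame dimensions the platform actually renders.
function makeCenterCrop(targetWidth, targetHeight) {
  return (renderWidth, renderHeight) => {
    const dx = (renderWidth - targetWidth) / 2
    const dy = (renderHeight - targetHeight) / 2
    return [dx, dy, targetWidth, targetHeight]
  }
}

// e.g. centering 320x240 inside a 352x288 frame:
// makeCenterCrop(320, 240)(352, 288) yields [16, 24, 320, 240]
```

It would then be passed the same way as before: player.use(webcam, { crop: makeCenterCrop(320, 240) }).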
Postprocessing
It's possible to post-process the video produced by the WebAR SDK.
The following code snippet illustrates the idea:
import { MediaStreamCapture /* ... */ } from "@banuba/webar"
// ...
const capture = document.createElement("video")
capture.autoplay = true
capture.srcObject = new MediaStreamCapture(player)
const canvas = document.getElementById("postprocessed")
const ctx = canvas.getContext("2d")
const fontSize = 48 * window.devicePixelRatio
function postprocess() {
canvas.width = capture.videoWidth
canvas.height = capture.videoHeight
ctx.drawImage(capture, 0, 0)
ctx.font = `${fontSize}px serif`
ctx.fillStyle = "red"
ctx.fillText("A Watermark", 0.5 * fontSize, 1.25 * fontSize)
}
;(function loop() {
postprocess()
requestAnimationFrame(loop)
})()
See Capturing processed video > MediaStream for details.
Minimal sample
A one-file example, a good starting point.
https://github.com/Banuba/banuba-sdk-flutter/tree/master/example
Quickstart Flutter sample
This project demonstrates how to apply effects and makeup, and how to process photos.
https://github.com/Banuba/quickstart-flutter-plugin/tree/master
Video call sample
This example demonstrates the use of the Banuba SDK in conjunction with the AgoraRTC SDK for a video call in Flutter. It is forked from Agora's Flutter plugin.
Banuba SDK is integrated in the JoinChannelVideo subsample. See videocall.
https://github.com/Banuba/banuba-agora-flutter-sdk
ARCloud sample
Read more about ARCloud.
https://github.com/Banuba/arcloud-flutter/tree/master/example
Minimal sample
A one-file example and a good starting point. It is part of the Banuba React Native module.
https://github.com/Banuba/banuba-sdk-react-native/tree/master/example
Videocall example
This example demonstrates the use of the Banuba SDK in conjunction with the AgoraRTC SDK for a video call in React Native. It is forked from the Agora React Native module.
Banuba SDK is integrated in the JoinChannelVideo subsample. See videocall.
macOS sample (Swift)
This repository contains basic samples showing how to use the Banuba SDK on macOS.
https://github.com/Banuba/quickstart-macos-swift
Offscreen Effect Player (C++, Objective-C++)
The examples below are written in C++ and run on both Windows and macOS.
Quickstart Desktop (C++)
The starting point for desktop integration in C++.
https://github.com/Banuba/quickstart-desktop-cpp/tree/master
Demonstrates: