Using video calls with the Banuba SDK
In our example, the AgoraRTC SDK is used for video streaming, but the integration can be built on top of any video streaming library.
You need client tokens for both the AgoraRTC SDK and the Banuba SDK.
To receive a Banuba SDK token, fill in the form on banuba.com or contact us via info@banuba.com.
To generate AgoraRTC SDK tokens, visit the Agora website.
- iOS
- Android
- Web
- Flutter
- ReactNative
Example of using video calls with the Banuba SDK
Installation
- Add the `banuba-sdk-podspecs` repo along with the `AgoraRtcEngine_iOS` and `BanubaSdk` packages to your `Podfile`. Alternatively, you may use our SPM modules.
See the details about the Banuba SDK packages in Installation.
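For reference, a minimal `Podfile` could look like the following sketch. The podspecs repo URL, platform version, and target name are assumptions for illustration — check the Installation page for the exact values:

```ruby
# Podfile sketch — repo URL, platform version, and target name are illustrative
source 'https://github.com/sdk-banuba/banuba-sdk-podspecs.git'
source 'https://cdn.cocoapods.org'

platform :ios, '13.0'

target 'YourApp' do
  use_frameworks!
  pod 'BanubaSdk'
  pod 'AgoraRtcEngine_iOS'
end
```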
Integration
- Set up client tokens
loading...
- Initialize `BanubaSdkManager`
loading...
- Initialize `AgoraRtcEngineKit`, set up video/audio encoders and join the channel
loading...
- Set up `Player`, load the effect and start `Camera` frames forwarding
loading...
- Run the application! 🎉 🚀 💅
Example of using video calls with the Banuba SDK
For a video call, you need to receive frames as an array of pixels, frame by frame, in RGBA format. This can be done using `FrameOutput`. Just create a variable called `frameOutput` and add a callback that will receive an array of pixels, `framePixelBuffer`:
loading...
Now, when initializing the player, pass the created variable to `player.use(myInput, frameOutput)`.
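The RGBA layout the callback receives is language-neutral: each pixel is four bytes (red, green, blue, alpha), rows laid out top to bottom. A minimal JavaScript illustration (the helper names `rgbaBufferSize` and `pixelAt` are ours, not part of the Banuba SDK):

```javascript
// Each pixel occupies 4 bytes: R, G, B, A.
function rgbaBufferSize(width, height) {
  return width * height * 4
}

// Read one pixel out of a flat RGBA byte buffer.
function pixelAt(buffer, width, x, y) {
  const i = (y * width + x) * 4
  return { r: buffer[i], g: buffer[i + 1], b: buffer[i + 2], a: buffer[i + 3] }
}

// A 2x2 frame takes 16 bytes: red, green / blue, white.
const frame = new Uint8Array([
  255, 0, 0, 255,   0, 255, 0, 255,   // row 0
  0, 0, 255, 255,   255, 255, 255, 255, // row 1
])
```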
Follow these steps to configure a video call.
To get started, add the Banuba SDK integration code to your project.
Add the AgoraRTC SDK dependency to your `build.gradle.kts`:
loading...
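For reference, the dependency declaration could look like the following sketch. The artifact coordinates and version number are assumptions — verify them against Agora's documentation:

```kotlin
dependencies {
    // Agora RTC SDK for Android — coordinates and version are illustrative
    implementation("io.agora.rtc:full-sdk:4.2.6")
}
```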
This example is based on the Player API.
First, create the Banuba SDK core, `player`. Create a `surfaceOutput` that will draw the processed image from the Banuba SDK. Create a `frameOutput` that will produce the processed image and transfer it to Agora as an array of pixels. Also create a camera, `cameraDevice`, and manage it yourself. Agora has its own camera module, but it is not used in this example, so `setExternalVideoSource(...)` is called to disable the Agora camera:
loading...
Create the Agora core with which the video call will be made, `agoraRtc`; inside, we indicate where Agora will draw the received frames:
loading...
Then initialize everything and set up the video call in the `onCreate(...)` method.
How it works
Frames from the Banuba camera are processed in the `player`, and the result is passed to the `onFrame(...)` handler. In the handler, frames are handed to the Agora module via `agoraRtc.pushExternalVideoFrame(...)`. Agora then transmits the pushed frames to the server, and this is how the video call works.
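The frame flow described above can be sketched with stub objects. This is an illustration of the data flow only — the names mirror the prose, not the actual Banuba or Agora Android APIs:

```javascript
// Stub pipeline: camera frame -> player (applies the effect) -> onFrame -> Agora.
function makeVideoCallPipeline(agoraRtc) {
  const player = {
    effect: null,
    // stand-in for Banuba effect processing
    process(frame) {
      return { ...frame, processedWith: this.effect }
    },
  }
  // handler invoked with each processed frame
  const onFrame = (processedFrame) => {
    // hand the processed frame to Agora, which transmits it to the server
    agoraRtc.pushExternalVideoFrame(processedFrame)
  }
  return {
    player,
    onCameraFrame(frame) {
      onFrame(player.process(frame))
    },
  }
}
```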
Fully working code
loading...
- Agora
- OpenTok
- WebRTC
Agora
Check the official Banuba Agora Extension for Web, `@banuba/agora-extension`.
Or, if you need finer control, you may use Banuba WebAR directly:
- Import AgoraRTC and Banuba WebAR
loading...
loading...
- Set up client tokens
loading...
loading...
- Initialize `Player`
loading...
- Initialize `AgoraRTC`
loading...
- Connect `Player` to `AgoraRTC`
loading...
- Run the application! 🎉 🚀 💅
See AgoraWebSDK NG API docs for details.
See Banuba Video call demo app for more code examples.
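The "Connect Player to AgoraRTC" step boils down to publishing the WebAR-processed video as a custom Agora video track. A minimal sketch follows; the function name and parameter injection are ours (in a real app, `client` comes from `AgoraRTC.createClient(...)`, `capture` is a Banuba `MediaStreamCapture` wrapping the `Player`, and `createCustomVideoTrack` is the AgoraRTC Web SDK (NG) factory):

```javascript
// Sketch: publish the WebAR-processed video plus the original microphone audio.
// SDK objects are passed in as parameters so the wiring itself stays testable.
async function publishWebarVideo(client, capture, createCustomVideoTrack, audioTrack) {
  // WebAR-processed video track from the Banuba capture
  const mediaStreamTrack = capture.getVideoTracks()[0]
  // wrap it into an Agora custom video track
  const videoTrack = createCustomVideoTrack({ mediaStreamTrack })
  await client.publish([audioTrack, videoTrack])
  return videoTrack
}
```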
OpenTok (TokBox)
```js
import "https://cdn.jsdelivr.net/npm/@opentok/client"
import { MediaStream, Player, Module, Effect, MediaStreamCapture } from "https://cdn.jsdelivr.net/npm/@banuba/webar/dist/BanubaSDK.browser.esm.js"

// ...

const camera = await navigator.mediaDevices.getUserMedia({ audio: true, video: true })

const player = await Player.create({ clientToken: "xxx-xxx-xxx" })
await player.addModule(new Module("https://cdn.jsdelivr.net/npm/@banuba/webar/dist/modules/background.zip"))

const webar = new MediaStreamCapture(player)
await player.use(new MediaStream(camera))
player.applyEffect(new Effect("BackgroundBlur.zip"))
player.play()

// original audio
const audio = camera.getAudioTracks()[0]
// webar processed video
const video = webar.getVideoTracks()[0]

const session = OT.initSession("OT API KEY", "OT SESSION ID")
session.connect("OT SESSION TOKEN", async () => {
  const publisher = await OT.initPublisher(
    "publisher",
    {
      insertMode: "append",
      audioSource: audio,
      videoSource: video,
      width: "100%",
      height: "100%",
    },
    () => {},
  )
  session.publish(publisher, () => {})
})

// ...
```
See TokBox Video API docs for details.
See Banuba Video call (TokBox) demo app for more code examples.
WebRTC
This example builds on the Fireship WebRTC demo:

```js
import {
  MediaStream as BanubaMediaStream,
  Player,
  Module,
  Effect,
  MediaStreamCapture,
} from "https://cdn.jsdelivr.net/npm/@banuba/webar/dist/BanubaSDK.browser.esm.js"

// ...

webcamButton.onclick = async () => {
  localStream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true })
  remoteStream = new MediaStream()

  const player = await Player.create({ clientToken: "xxx-xxx-xxx" })
  await player.addModule(new Module("https://cdn.jsdelivr.net/npm/@banuba/webar/dist/modules/background.zip"))

  const webar = new MediaStreamCapture(player)
  await player.use(new BanubaMediaStream(localStream))
  player.applyEffect(new Effect("BackgroundBlur.zip"))
  player.play()

  // original audio
  const audio = localStream.getAudioTracks()[0]
  // webar processed video
  const video = webar.getVideoTracks()[0]
  localStream = new MediaStream([audio, video])

  // Push tracks from local stream to peer connection
  // (`pc` is the RTCPeerConnection created in the Fireship demo)
  localStream.getTracks().forEach((track) => {
    pc.addTrack(track, localStream)
  })

  // ...
}
```
Due to Flutter limitations, you have to create a native Flutter plugin for every video call solution. We have developed one for integration with Agora; if Agora isn't suitable for you, you are expected to develop your own Flutter plugin.
We have created an Agora Extension which is accessible from Flutter.
The sample described below is a fork of the Agora Flutter plugin. Follow the instructions in its README.md to run it.
These are the general steps to integrate the sample code into your app:
- Android
- iOS
- Add the Client Token and extension properties key constants
loading...
- Add common methods to interact with the Banuba extension
loading...
- Initialize Banuba and load an effect
loading...
- Copy effects into the `assets/effects` folder
- Add a reference to the Banuba Maven repo
loading...
- Add Banuba dependencies and prepare a task to copy effects into app
loading...
- Add the Client Token and extension properties key constants
loading...
- Add common methods to interact with the Banuba extension
loading...
- Initialize Banuba and load an effect
loading...
- Copy effects into the `assets/effects` folder
- Add Banuba dependencies to `Podfile`
loading...
- Add the `effects` folder created earlier to your project. Link it with your app: add the folder to the `Runner` Xcode project (File -> Add Files to 'Runner'...).
Due to React Native limitations, you have to create a React Native native module for every video call solution. We have developed one for integration with Agora; if Agora isn't suitable for you, you are expected to develop your own React Native module.
We have created an Agora Extension which is accessible from React Native.
The sample described below is a fork of the React Native Agora plugin. Follow the instructions in its README.md to run it.
These are general steps to integrate the sample code into your app:
- Android
- iOS
- Add the Client Token and extension properties key constants
loading...
- Add the common methods to interact with the Banuba extension
loading...
- Initialize Banuba and load an effect
loading...
`intiBanuba()` must be called just after `engine.initialize(...)`; calling it later will cause an error during extension loading.
- Copy effects into the `effects` folder
- Add a reference to the Banuba Maven repo
loading...
- Add Banuba dependencies and prepare a task to copy effects into app
loading...
- Add the Client Token and extension properties key constants
loading...
- Add common methods to interact with the Banuba extension
loading...
- Initialize Banuba and load an effect
loading...
`intiBanuba()` must be called immediately after `engine.initialize(...)`; calling it later will cause an error during extension loading.
- Copy effects into the `effects` folder
- Add Banuba dependencies to `Podfile`
loading...
- Add the `effects` folder created earlier to your project. Link it with your app: add the folder to the Xcode project (File -> Add Files to '<your project>'...).