


This demo showcases a 3D passive liveness detection technique for detecting spoofed faces.

In real time, the demo computes a liveness score from a single image captured by the iOS camera.

It can also calculate liveness scores for images selected from the gallery and display the results.


SDK License

This project uses kby-ai’s liveness detection SDK. The SDK requires a license per bundle ID.

  • The code below shows how to use the license:

    ```swift
    var ret = FaceSDK.setActivation(
        "glF/068/SCjGIKN/o1HwZKKEANRWrO0eCTY7VinO46mMiXWcccNfCGLzAe2aNU2JBbjcn+IY000Q" +
        "zwYQUjEqx8X4Dkx41KMcZhJQSSdPg9KBEpxbOaksjsPCktAy78wlUJ9L+zmX6oa0h/3H45gMCHka" +
        "qPjjEWqrtGotonspHOxo5Z8TqofHSWJ04ORGllILdB4UQELaeToomCJMSNMJRKt425sIEdidO/+2" +
        "cQTdw04ShTpyMRgFI4B/sY5XMlz8Jyh1L9X3Yf5vEzj/Dk6d/7mtp1r3vRzIaBiFvk8M0Y0CyIwK")
    if ret == SDK_SUCCESS.rawValue {
        ret = FaceSDK.initSDK()
    }
    ```
  • To request a license, please contact us:

Email: [email protected]

Telegram: @kbyai

WhatsApp: +19092802609

Skype: live:.cid.66e2522354b1049b

About SDK

Set up

  1. Copy the SDK (facesdk.framework folder) to the root folder of your project.

  2. Add the SDK framework to the project in Xcode:

Project Navigator -> General -> Frameworks, Libraries, and Embedded Content


  3. Add the bridging header to your project settings:

Project Navigator -> Build Settings -> Swift Compiler – General


Initializing the SDK

  • Step One

To begin, you need to activate the SDK using the license that you have received.
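A minimal sketch of the activation call (here `licenseKey` is a placeholder for the license string you received, not an SDK constant):

```swift
// Activate the SDK with your license; the return value signals success or failure.
let ret = FaceSDK.setActivation(licenseKey)
if ret != SDK_SUCCESS.rawValue {
    print("Activation failed with error code \(ret)")
}
```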


If activation is successful, the return value will be SDK_SUCCESS. Otherwise, an error value will be returned.

  • Step Two

After activation, call the SDK’s initialization function.
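The initialization step can be sketched as:

```swift
// Initialize the SDK once activation has succeeded.
let ret = FaceSDK.initSDK()
if ret == SDK_SUCCESS.rawValue {
    // The SDK is ready; face detection calls can now be made.
}
```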


If initialization is successful, the return value will be SDK_SUCCESS. Otherwise, an error value will be returned.

Face Detection and Liveness Detection

The FaceSDK offers a single function that performs both face detection and liveness detection, which can be used as follows:

```swift
let faceBoxes = FaceSDK.faceDetection(image)
```

For example, inside an `AVCaptureVideoDataOutput` delegate:

```swift
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    guard let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags.readOnly)
    let image = CIImage(cvPixelBuffer: pixelBuffer).oriented(CGImagePropertyOrientation.leftMirrored)
    let capturedImage = UIImage(ciImage: image)
    CVPixelBufferUnlockBaseAddress(pixelBuffer, CVPixelBufferLockFlags.readOnly)
    let faceBoxes = FaceSDK.faceDetection(capturedImage)
    DispatchQueue.main.sync {
        self.faceView.setFrameSize(frameSize: capturedImage.size)
        self.faceView.setFaceBoxes(faceBoxes: faceBoxes)
    }
}
```

This function takes a single parameter, which is a UIImage object. The return value of the function is a list of FaceBox objects. Each FaceBox object contains the detected face rectangle, liveness score, and facial angles such as yaw, roll, and pitch.
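As a sketch, the results might be consumed like this (property names such as `liveness`, `yaw`, `roll`, and `pitch` are assumed from the description above; check the SDK headers for the actual `FaceBox` interface):

```swift
for faceBox in faceBoxes {
    // Each FaceBox carries the face rectangle, a liveness score,
    // and head-pose angles (property names assumed, not confirmed).
    print("liveness: \(faceBox.liveness)")
    print("yaw: \(faceBox.yaw), roll: \(faceBox.roll), pitch: \(faceBox.pitch)")
}
```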


