MediaPipeTasksVision Framework Reference

GestureRecognizer

class GestureRecognizer : NSObject

@brief Performs gesture recognition on images.

This API expects a pre-trained TFLite hand gesture recognizer model or a custom one created using MediaPipe Solutions Model Maker. See https://developers.google.com/mediapipe/solutions/model_maker.

  • Creates a new instance of GestureRecognizer from an absolute path to a TensorFlow Lite model file stored locally on the device and the default GestureRecognizerOptions.

    Declaration

    Swift

    convenience init(modelPath: String) throws

    Parameters

    modelPath

    An absolute path to a TensorFlow Lite model file stored locally on the device.

    error

    An optional error parameter populated when there is an error in initializing the gesture recognizer.

    Return Value

A new instance of GestureRecognizer with the given model path. Throws an error if the gesture recognizer could not be initialized.
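A minimal sketch of this initializer in use. The file name `gesture_recognizer.task` is a placeholder; substitute the model file bundled with your app:

```swift
import MediaPipeTasksVision

// "gesture_recognizer.task" is a placeholder name; use your bundled model file.
guard let modelPath = Bundle.main.path(forResource: "gesture_recognizer",
                                       ofType: "task") else {
    fatalError("Gesture recognizer model not found in the app bundle.")
}

do {
    // Uses the default GestureRecognizerOptions (running mode .image).
    let gestureRecognizer = try GestureRecognizer(modelPath: modelPath)
    // Pass MPImage instances to gestureRecognizer.recognize(image:).
} catch {
    print("Failed to initialize the gesture recognizer: \(error)")
}
```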

  • Creates a new instance of GestureRecognizer from the given GestureRecognizerOptions.

    Declaration

    Swift

    init(options: GestureRecognizerOptions) throws

    Parameters

    options

    The options of type GestureRecognizerOptions to use for configuring the GestureRecognizer.

    error

    An optional error parameter populated when there is an error in initializing the gesture recognizer.

    Return Value

A new instance of GestureRecognizer with the given options. Throws an error if the gesture recognizer could not be initialized.
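A sketch of initialization via GestureRecognizerOptions. The model path is a placeholder, and the option properties shown (`numHands`, `minHandDetectionConfidence`) are illustrative; consult GestureRecognizerOptions for the full set:

```swift
import MediaPipeTasksVision

let options = GestureRecognizerOptions()
// "/path/to/gesture_recognizer.task" is a placeholder; point this at your model.
options.baseOptions.modelAssetPath = "/path/to/gesture_recognizer.task"
options.runningMode = .image
options.numHands = 2
options.minHandDetectionConfidence = 0.5

do {
    let gestureRecognizer = try GestureRecognizer(options: options)
    // Use gestureRecognizer according to the configured running mode.
} catch {
    print("Failed to initialize the gesture recognizer: \(error)")
}
```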

  • Performs gesture recognition on the provided MPImage using the whole image as the region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the GestureRecognizer is created with the running mode .image.

    This method supports gesture recognition of RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must have one of the following pixel format types:

    1. kCVPixelFormatType_32BGRA
    2. kCVPixelFormatType_32RGBA

    If your MPImage has a source type of .image, ensure that the color space is RGB with an Alpha channel.

    Declaration

    Swift

    func recognize(image: MPImage) throws -> GestureRecognizerResult

    Parameters

    image

    The MPImage on which gesture recognition is to be performed.

    error

    An optional error parameter populated when there is an error in performing gesture recognition on the input image.

    Return Value

    A GestureRecognizerResult object that contains the hand gesture recognition results.
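A sketch of single-image recognition, assuming a recognizer already created with running mode .image and a UIImage input:

```swift
import UIKit
import MediaPipeTasksVision

// Returns the top-ranked gesture name for the first detected hand, if any.
// `gestureRecognizer` must have been created with running mode .image.
func topGesture(in uiImage: UIImage,
                using gestureRecognizer: GestureRecognizer) -> String? {
    do {
        // MPImage carries the UIImage's orientation for rotation handling.
        let mpImage = try MPImage(uiImage: uiImage)
        let result = try gestureRecognizer.recognize(image: mpImage)
        // `gestures` holds one array of ranked categories per detected hand.
        return result.gestures.first?.first?.categoryName
    } catch {
        print("Gesture recognition failed: \(error)")
        return nil
    }
}
```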

  • Performs gesture recognition on the provided video frame of type MPImage using the whole image as the region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the GestureRecognizer is created with the running mode .video.

    It’s required to provide the video frame’s timestamp (in milliseconds). The input timestamps must be monotonically increasing.

    This method supports gesture recognition of RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must have one of the following pixel format types:

    1. kCVPixelFormatType_32BGRA
    2. kCVPixelFormatType_32RGBA

    If your MPImage has a source type of .image, ensure that the color space is RGB with an Alpha channel.

    Declaration

    Swift

    func recognize(videoFrame image: MPImage, timestampInMilliseconds: Int) throws -> GestureRecognizerResult

    Parameters

    image

    The MPImage on which gesture recognition is to be performed.

    timestampInMilliseconds

    The video frame’s timestamp (in milliseconds). The input timestamps must be monotonically increasing.

    error

    An optional error parameter populated when there is an error in performing gesture recognition on the input video frame.

    Return Value

    A GestureRecognizerResult object that contains the hand gesture recognition results.
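A sketch of per-frame recognition in video mode. The frame rate and frame-index bookkeeping are assumptions for deriving monotonically increasing timestamps:

```swift
import MediaPipeTasksVision

// `gestureRecognizer` must have been created with running mode .video.
// Timestamps must increase monotonically across successive calls.
let frameRate = 30.0 // assumed frame rate of the source video

func recognize(frame: MPImage, frameIndex: Int,
               with gestureRecognizer: GestureRecognizer) {
    // Derive a millisecond timestamp from the frame index.
    let timestampMs = Int(Double(frameIndex) * 1000.0 / frameRate)
    do {
        let result = try gestureRecognizer.recognize(
            videoFrame: frame,
            timestampInMilliseconds: timestampMs)
        print("Frame \(frameIndex): \(result.gestures.count) hand(s) detected")
    } catch {
        print("Gesture recognition failed at frame \(frameIndex): \(error)")
    }
}
```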

  • Sends live stream image data of type MPImage to perform gesture recognition using the whole image as the region of interest. Rotation will be applied according to the orientation property of the provided MPImage. Only use this method when the GestureRecognizer is created with the running mode .liveStream.

    The object which needs to be continuously notified of the available results of gesture recognition must conform to the GestureRecognizerLiveStreamDelegate protocol and implement the gestureRecognizer(_:didFinishRecognitionWithResult:timestampInMilliseconds:error:) delegate method.

    It’s required to provide a timestamp (in milliseconds) to indicate when the input image is sent to the gesture recognizer. The input timestamps must be monotonically increasing.

    This method supports gesture recognition of RGBA images. If your MPImage has a source type of .pixelBuffer or .sampleBuffer, the underlying pixel buffer must have one of the following pixel format types:

    1. kCVPixelFormatType_32BGRA
    2. kCVPixelFormatType_32RGBA

    If the input MPImage has a source type of .image, ensure that the color space is RGB with an Alpha channel.

    If this method is used for performing gesture recognition on live camera frames using AVFoundation, ensure that you request AVCaptureVideoDataOutput to output frames in kCVPixelFormatType_32BGRA using its videoSettings property.

    Declaration

    Swift

    func recognizeAsync(image: MPImage, timestampInMilliseconds: Int) throws

    Parameters

    image

    Live stream image data of type MPImage on which gesture recognition is to be performed.

    timestampInMilliseconds

    The timestamp (in milliseconds) which indicates when the input image is sent to the gesture recognizer. The input timestamps must be monotonically increasing.

    error

    An optional error parameter populated when there is an error in performing gesture recognition on the input live stream image data.

    Return Value

    None. This method returns no value; it throws an error if the image could not be sent to the task. (The corresponding Objective-C method returns YES if the image was sent successfully, otherwise NO.)
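A sketch of the live stream setup. The delegate method name below is the Swift rendering of the documented Objective-C selector, and the model path is a placeholder:

```swift
import MediaPipeTasksVision

// Receives asynchronous recognition results from recognizeAsync(image:timestampInMilliseconds:).
final class GestureResultHandler: NSObject, GestureRecognizerLiveStreamDelegate {
    func gestureRecognizer(_ gestureRecognizer: GestureRecognizer,
                           didFinishRecognition result: GestureRecognizerResult?,
                           timestampInMilliseconds: Int,
                           error: Error?) {
        if let error = error {
            print("Recognition error at \(timestampInMilliseconds) ms: \(error)")
        } else if let result = result {
            print("\(result.gestures.count) hand(s) at \(timestampInMilliseconds) ms")
        }
    }
}

let handler = GestureResultHandler()
let options = GestureRecognizerOptions()
options.baseOptions.modelAssetPath = "/path/to/gesture_recognizer.task" // placeholder
options.runningMode = .liveStream
options.gestureRecognizerLiveStreamDelegate = handler

// In the camera capture callback, forward each frame with its timestamp:
// try gestureRecognizer.recognizeAsync(image: mpImage,
//                                      timestampInMilliseconds: frameTimestampMs)
```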
