Creating a Frame Processor Plugin for iOS

The Frame Processor Plugin API is built to be as extensible as possible, which allows you to create custom Frame Processor Plugins. In this guide we will create a custom Face Detector Plugin which can be used from JS.

iOS Frame Processor Plugins can be written in either Objective-C or Swift.

Automatic setup

Run the Vision Camera Plugin Builder CLI:

npx vision-camera-plugin-builder ios

The CLI will ask you for the path to your project's .xcodeproj file, the name of the plugin (e.g. FaceDetectorFrameProcessorPlugin), the name of the exposed method (e.g. detectFaces), and the language you want to use for plugin development (Objective-C, Objective-C++ or Swift). For reference, see the CLI's docs.

Manual setup

  1. Open your Project in Xcode
  2. Create an Objective-C source file; for the Face Detector Plugin this will be called FaceDetectorFrameProcessorPlugin.m.
  3. Add the following code:
#import <VisionCamera/FrameProcessorPlugin.h>
#import <VisionCamera/FrameProcessorPluginRegistry.h>
#import <VisionCamera/Frame.h>

@interface FaceDetectorFrameProcessorPlugin : FrameProcessorPlugin
@end

@implementation FaceDetectorFrameProcessorPlugin

- (instancetype)initWithOptions:(NSDictionary* _Nullable)options {
  self = [super initWithOptions:options];
  return self;
}

- (id)callback:(Frame*)frame withArguments:(NSDictionary*)arguments {
  CMSampleBufferRef buffer = frame.buffer;
  UIImageOrientation orientation = frame.orientation;
  // code goes here
  return nil;
}

VISION_EXPORT_FRAME_PROCESSOR(FaceDetectorFrameProcessorPlugin, detectFaces)

@end


The Frame Processor Plugin will be exposed to JS through the VisionCameraProxy object. In this case, it would be VisionCameraProxy.initFrameProcessorPlugin("detectFaces").

  4. Implement your Frame Processing. See the Example Plugin (Objective-C) for reference.

🚀 Next section: Finish creating your Frame Processor Plugin (or add Android support to your Frame Processor Plugin)