Migrating for iOS


Before you start to migrate your code, be sure you meet these requirements:

  • ML Kit supports Xcode 13.2.1 or greater.
  • ML Kit supports iOS version 10 or greater.
  • ML Kit does not support 32-bit architectures (i386 and armv7). ML Kit does support 64-bit architectures (x86_64 and arm64).
  • The ML Kit library is only provided as CocoaPods pods. You can't mix frameworks and CocoaPods, so to use this library you must first migrate your app to CocoaPods.

Update CocoaPods

Update the dependencies for the ML Kit iOS pods in your app’s Podfile:

  • Barcode scanning: Firebase/MLVision → GoogleMLKit/BarcodeScanning
  • Face detection: Firebase/MLVision → GoogleMLKit/FaceDetection
  • Image labeling: Firebase/MLVision → GoogleMLKit/ImageLabeling
  • Object detection and tracking: Firebase/MLVisionObjectDetection → GoogleMLKit/ObjectDetection
  • Text recognition: Firebase/MLVision → GoogleMLKit/TextRecognition
  • AutoML image labeling (bundled model): Firebase/MLVisionAutoML → GoogleMLKit/ImageLabelingCustom
  • AutoML image labeling (model download from Firebase): Firebase/MLVisionAutoML → GoogleMLKit/ImageLabelingCustom
  • Language ID: Firebase/MLNaturalLanguage → GoogleMLKit/LanguageID
  • Smart reply: Firebase/MLNaturalLanguage → GoogleMLKit/SmartReply
  • Translate: Firebase/MLNaturalLanguage → GoogleMLKit/Translate
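For example, an app that uses face detection and barcode scanning might update its Podfile like this sketch (the target name MyApp is a placeholder):

```ruby
platform :ios, '10.0'

target 'MyApp' do
  # Old Firebase ML Kit pod, now removed:
  # pod 'Firebase/MLVision'

  # New standalone ML Kit pods, one per API:
  pod 'GoogleMLKit/FaceDetection'
  pod 'GoogleMLKit/BarcodeScanning'
end
```

After editing the Podfile, run pod install and open the generated .xcworkspace.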

Update names of classes, enums, and types

In general, classes, enums, and types are renamed as follows:

  • Swift: Remove the Vision prefix from class and enum names
  • Objective-C: Replace both the FIRVision and FIR class name and enum prefixes with MLK

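As a sketch of the general rule, the face detection options class (which follows the rule) is renamed like this:

```swift
// Old (Firebase ML Kit):
// let options = VisionFaceDetectorOptions()

// New (ML Kit) — the Vision prefix is dropped:
let options = FaceDetectorOptions()
options.performanceMode = .accurate
```

In Objective-C, the same class goes from FIRVisionFaceDetectorOptions to MLKFaceDetectorOptions.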
For some class names and types this general rule does not apply:


Swift:

  • AutoMLLocalModel → LocalModel
  • AutoMLRemoteModel → CustomRemoteModel
  • VisionBarcodeDetectionCallback → BarcodeScanningCallback
  • VisionBarcodeDetector → BarcodeScanner
  • VisionBarcodeDetectorOptions → BarcodeScannerOptions
  • VisionImage → VisionImage (no change)
  • VisionPoint → VisionPoint (no change)
  • VisionOnDeviceAutoMLImageLabelerOptions → CustomImageLabelerOptions
  • VisionOnDeviceImageLabelerOptions → ImageLabelerOptions


Objective-C:

  • FIRAutoMLLocalModel → MLKLocalModel
  • FIRAutoMLRemoteModel → MLKCustomRemoteModel
  • FIRVisionBarcodeDetectionCallback → MLKBarcodeScanningCallback
  • FIRVisionBarcodeDetector → MLKBarcodeScanner
  • FIRVisionBarcodeDetectorOptions → MLKBarcodeScannerOptions
  • FIRVisionImage → MLKVisionImage
  • FIRVisionOnDeviceAutoMLImageLabelerOptions → MLKCustomImageLabelerOptions
  • FIRVisionOnDeviceImageLabelerOptions → MLKImageLabelerOptions
  • FIRVisionPoint → MLKVisionPoint


Update method names

Update method names according to these rules:

  • Domain entry point classes (Vision, NaturalLanguage) no longer exist. They have been replaced by task-specific classes. Replace calls to their various factory methods for getting detectors with direct calls to each detector's factory method.

  • The VisionImageMetadata class has been removed, along with the VisionDetectorImageOrientation enum. Use the orientation property of VisionImage to specify the display orientation of an image.

  • The onDeviceTextRecognizer method that gets a new TextRecognizer instance has been renamed to textRecognizer.

  • The confidence property has been removed from text recognition result classes, including TextElement, TextLine, and TextBlock.

  • The onDeviceImageLabeler and onDeviceImageLabeler(options:) methods to get a new ImageLabeler instance have been merged and renamed to imageLabeler(options:).

  • The objectDetector method to get a new ObjectDetector instance has been removed. Use objectDetector(options:) instead.

  • The type property has been removed from ImageLabeler and the entityID property has been removed from the image labeling result class, ImageLabel.

  • The barcode scanning API detect(in:completion:) has been renamed to process(_:completion:) to be consistent with other vision APIs.

  • The Natural Language APIs now use the term "language tag" (as defined by the BCP-47 standard) instead of "language code".

  • TranslateLanguage now uses readable names (like .english) for its constants instead of language tags (like .en).
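For example, replacing VisionImageMetadata with the orientation property looks roughly like this (uiImage is a placeholder for your input UIImage):

```swift
// Old (Firebase ML Kit): orientation was carried by a separate
// VisionImageMetadata object attached to the image.

// New (ML Kit): set orientation directly on VisionImage.
let visionImage = VisionImage(image: uiImage)
visionImage.orientation = uiImage.imageOrientation
```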

Here are some examples of old and new Swift methods:


Old:

let options = VisionOnDeviceImageLabelerOptions()
options.confidenceThreshold = 0.75
let labeler = Vision.vision().onDeviceImageLabeler(options: options)

let detector = Vision.vision().faceDetector(options: options)

let localModel = AutoMLLocalModel(manifestPath: "automl/manifest.json")
let options = VisionOnDeviceAutoMLImageLabelerOptions(localModel: localModel)
options.confidenceThreshold = 0.75
let labeler = vision.onDeviceAutoMLImageLabeler(options: options)

let detector = Vision.vision().objectDetector()


New:

let options = ImageLabelerOptions()
options.confidenceThreshold = NSNumber(value: 0.75)
let labeler = ImageLabeler.imageLabeler(options: options)

let detector = FaceDetector.faceDetector(options: options)

let localModel = LocalModel(manifestPath: "automl/manifest.json")
let options = CustomImageLabelerOptions(localModel: localModel)
options.confidenceThreshold = NSNumber(value: 0.75)
let labeler = ImageLabeler.imageLabeler(options: options)

let detector = ObjectDetector.objectDetector(options: ObjectDetectorOptions())
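Following the detect-to-process rename described above, a barcode scanning call might look like this sketch (visionImage is assumed to be a prepared VisionImage; error handling is minimal):

```swift
// Old (Firebase ML Kit):
// barcodeDetector.detect(in: visionImage) { barcodes, error in ... }

// New (ML Kit): the method is now process(_:completion:)
let barcodeScanner = BarcodeScanner.barcodeScanner()
barcodeScanner.process(visionImage) { barcodes, error in
  guard error == nil, let barcodes = barcodes else { return }
  for barcode in barcodes {
    print(barcode.rawValue ?? "")
  }
}
```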

Here are some examples of old and new Objective-C methods:


Old:

FIRVisionOnDeviceImageLabelerOptions *options =
    [[FIRVisionOnDeviceImageLabelerOptions alloc] init];
options.confidenceThreshold = 0.75;
FIRVisionImageLabeler *labeler = 
    [[FIRVision vision] onDeviceImageLabelerWithOptions:options];

FIRVisionFaceDetector *detector =
    [[FIRVision vision] faceDetectorWithOptions: options];

FIRAutoMLLocalModel *localModel =
    [[FIRAutoMLLocalModel alloc] initWithManifestPath:@"automl/manifest.json"];
FIRVisionOnDeviceAutoMLImageLabelerOptions *options =
    [[FIRVisionOnDeviceAutoMLImageLabelerOptions alloc]
        initWithLocalModel: localModel];
options.confidenceThreshold = 0.75;
FIRVisionImageLabeler *labeler =
    [[FIRVision vision] onDeviceAutoMLImageLabelerWithOptions:options];

FIRVisionObjectDetector *detector =
    [[FIRVision vision] objectDetector];


New:

MLKImageLabelerOptions *options =
    [[MLKImageLabelerOptions alloc] init];
options.confidenceThreshold = @(0.75);
MLKImageLabeler *labeler =
    [MLKImageLabeler imageLabelerWithOptions:options];

MLKFaceDetector *detector =
    [MLKFaceDetector faceDetectorWithOptions:options];

MLKLocalModel *localModel =
    [[MLKLocalModel alloc] initWithManifestPath:@"automl/manifest.json"];
MLKCustomImageLabelerOptions *options =
    [[MLKCustomImageLabelerOptions alloc] initWithLocalModel:localModel];
options.confidenceThreshold = @(0.75);
MLKImageLabeler *labeler =
    [MLKImageLabeler imageLabelerWithOptions:options];

MLKObjectDetectorOptions *options = [[MLKObjectDetectorOptions alloc] init];
MLKObjectDetector *detector = [MLKObjectDetector objectDetectorWithOptions:options];

API-specific changes

Object detection and tracking

If your app uses object classification, be aware that the new SDK has changed the way it returns the classification category for detected objects.

VisionObjectCategory in VisionObject is returned as text in an ObjectLabel object, instead of an integer. All possible string categories are included in the DetectedObjectLabel enum.

Note that the .unknown category has been removed. When the confidence of classifying an object is low, the classifier returns no label at all.

Here is an example of the old and new Swift code:


Old:

if (object.classificationCategory == .food) {


New:

if let label = object.labels.first {
  if (label.text == DetectedObjectLabel.food.rawValue) {
// or
if let label = object.labels.first {
  if (label.index == DetectedObjectLabelIndex.food.rawValue) {

Here is an example of the old and new Objective-C code:


Old:

if (object.classificationCategory == FIRVisionObjectCategoryFood) {


New:

if ([object.labels[0].text isEqualToString:MLKDetectedObjectLabelFood]) {
// or
if (object.labels[0].index == MLKDetectedObjectLabelIndexFood) {

Remove Firebase dependencies (Optional)

This step only applies when these conditions are met:

  • Firebase ML Kit is the only Firebase component you use
  • You only use on-device APIs
  • You don't use model serving

If this is the case, you can remove Firebase dependencies after migration. Follow these steps:

  • Remove the Firebase configuration file by deleting the GoogleService-Info.plist file from your app’s directory and your Xcode project.
  • Remove any Firebase cocoapod, such as pod 'Firebase/Analytics', from your Podfile.
  • Remove any FirebaseApp initialization, such as FirebaseApp.configure() from your code.
  • Delete your Firebase app at the Firebase console according to the instructions on the Firebase support site.

Getting help

If you run into any issues, please check out our Community page where we outline the channels available for getting in touch with us.