ML Kit Release Notes

July 28, 2020

Android: 16.*.* / iOS: 0.62.0

This release includes new features, improvements and bug fixes.

Android API update details

The following table lists the Android APIs that have changed in this release.

Group ID                 Artifact name                           Version
com.google.android.gms   play-services-mlkit-barcode-scanning    16.1.1
com.google.mlkit         barcode-scanning                        16.0.2
com.google.mlkit         digital-ink-recognition                 16.0.0
com.google.mlkit         image-labeling-automl                   16.2.0
com.google.mlkit         image-labeling-custom                   16.2.0
com.google.mlkit         image-labeling                          16.2.0
com.google.mlkit         object-detection-custom                 16.2.0
com.google.mlkit         object-detection                        16.2.0

New features

  • Added a new Digital Ink Recognition API that recognizes text and shapes handwritten on a digital surface (e.g. a touch screen). It supports 300+ languages, as well as emoji and autodraw, and is the same technology that powers handwriting recognition in Gboard, the Google Translate apps, and the Quick, Draw! game.

    Digital Ink Recognition works fully offline (aside from a one-time download of language packs), and supports both Android and iOS.

  • On Android, added support for specifying your own custom Executor for expensive tasks such as model loading and inference to the Image Labeling and Object Detection and Tracking APIs. With this release, all ML Kit APIs now support custom executors.

    By default, ML Kit uses a highly optimized, internally managed thread pool to run background tasks. This API helps with specialized use cases where developers want to retain full control over the threads used in their app.
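As a sketch of this pattern in plain Java: the app creates an executor it fully controls and hands it to an ML Kit options builder's setExecutor method. The ML Kit call itself is shown only in a comment (it requires the ML Kit dependency, and ObjectDetectorOptions is used illustratively); the runnable part below just demonstrates the executor the app would supply.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class CustomExecutorDemo {

    // A dedicated single-thread executor the app fully controls,
    // in place of ML Kit's internally managed thread pool.
    public static ExecutorService newMlKitExecutor() {
        return Executors.newSingleThreadExecutor(runnable -> {
            Thread t = new Thread(runnable, "mlkit-worker");
            t.setDaemon(true);
            return t;
        });
    }

    public static void main(String[] args) throws Exception {
        ExecutorService executor = newMlKitExecutor();

        // With the ML Kit dependency on the classpath, the executor would be
        // passed to an options builder, e.g. (illustrative only):
        //   ObjectDetectorOptions options = new ObjectDetectorOptions.Builder()
        //       .setExecutor(executor)
        //       .build();

        // Stand-in task: model loading/inference would run on "mlkit-worker".
        String thread = executor.submit(() -> Thread.currentThread().getName()).get();
        System.out.println("task ran on: " + thread);
        executor.shutdown();
    }
}
```

Because the app owns the executor, it can size, name, prioritize, or shut down the worker threads itself rather than relying on ML Kit's defaults.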

Improvements and bug fixes

  • On Android, for the Barcode Scanning API, fixed rotation handling of the input image.

Known issues

  • On Android, for the Text Recognition, Image Labeling, and Object Detection and Tracking APIs, the performance with CameraX and Camera2 is still not on par with Camera1. This will be addressed in an upcoming release.
  • On Android, Task callbacks might execute after the Activity or Fragment in which they were registered has been destroyed. This can lead to an exception if the callback tries to access a detector that has been closed in the meantime. If you are using ML Kit in an Activity, you can register an Activity-scoped listener, which is automatically removed when the Activity is stopped. If you are using ML Kit in a Fragment or another environment, you can provide a custom executor that shuts down execution when the Fragment is destroyed; see ScopedExecutor.java in the Vision Quickstart for an example. This will be addressed in an upcoming release.
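The executor-scoping workaround described above can be sketched in plain Java. This is a simplified illustration of the idea behind ScopedExecutor.java, not the Quickstart's exact code: a wrapper that silently drops tasks once its scope is shut down, so late callbacks never touch a destroyed Fragment or a closed detector.

```java
import java.util.concurrent.Executor;
import java.util.concurrent.atomic.AtomicBoolean;

// Wraps a delegate Executor and drops tasks after shutdown() is called,
// so late ML Kit Task callbacks never run against a destroyed scope.
public class ScopedExecutorSketch implements Executor {

    private final Executor delegate;
    private final AtomicBoolean shutdown = new AtomicBoolean(false);

    public ScopedExecutorSketch(Executor delegate) {
        this.delegate = delegate;
    }

    @Override
    public void execute(Runnable command) {
        if (shutdown.get()) {
            return; // Scope already destroyed; drop the callback.
        }
        delegate.execute(() -> {
            // Re-check: shutdown may have happened after scheduling.
            if (!shutdown.get()) {
                command.run();
            }
        });
    }

    // Call from onDestroy() / onDestroyView() to stop further callbacks.
    public void shutdown() {
        shutdown.set(true);
    }
}
```

In a Fragment, you would pass this executor when registering Task listeners (the Tasks API accepts an Executor overload, e.g. task.addOnSuccessListener(scopedExecutor, listener)) and call shutdown() from the teardown callback.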

July 15, 2020

iOS: GoogleMLKit/Common 0.61.1

This release includes improvements and bug fixes.

Improvements and bug fixes

  • Upgraded the version range of the GoogleDataTransport dependency in MLKitCommon from 3.2 or higher to 7.0 or higher.
  • Removed the GoogleDataTransportCCTSupport dependency from MLKitCommon.

July 1, 2020

Android: 16.0.1, 16.1.0 / iOS: 0.61.0

This release includes new features, improvements and bug fixes.

Android API update details

The following table lists the Android APIs that have changed in this release.

Group ID                 Artifact name                           Version
com.google.android.gms   play-services-mlkit-barcode-scanning    16.1.0
com.google.android.gms   play-services-mlkit-face-detection      16.1.0
com.google.android.gms   play-services-mlkit-text-recognition    16.1.0
com.google.mlkit         barcode-scanning                        16.0.1
com.google.mlkit         face-detection                          16.0.1
com.google.mlkit         image-labeling-automl                   16.1.0
com.google.mlkit         image-labeling-custom                   16.1.0
com.google.mlkit         image-labeling                          16.1.0
com.google.mlkit         language-id                             16.1.0
com.google.mlkit         object-detection-common                 16.1.0
com.google.mlkit         object-detection-custom                 16.1.0
com.google.mlkit         object-detection                        16.1.0
com.google.mlkit         smart-reply                             16.1.0
com.google.mlkit         translate                               16.1.0

New features

  • The Image Labeling and Object Detection and Tracking APIs now support float-based custom models.
  • On Android, added support for specifying your own custom Executor for expensive tasks such as model loading and inference to all APIs except Image Labeling and Object Detection and Tracking.

    By default, ML Kit uses a highly optimized, internally managed thread pool to run background tasks. This API helps with specialized use cases where developers want to retain full control over the threads used in their app.

Improvements and bug fixes

  • Breaking change: On iOS, for the Barcode Scanning API, fixed a typo in BarcodePersonName and renamed its pronounciation property to pronunciation.
  • On iOS, fixed an issue in the Translate and Smart Reply APIs where the app would crash if the device locale did not specify a region (e.g. "en" instead of "en-US").

Known issues

  • On Android, the Image Labeling and Object Detection and Tracking APIs don't support specifying custom executors yet. This will be added in an upcoming release.
  • On Android, for the Text Recognition, Image Labeling, and Object Detection and Tracking APIs, the performance with CameraX and Camera2 is still not on par with Camera1. This will be addressed in an upcoming release.
  • On Android, Task callbacks might execute after the Activity or Fragment in which they were registered has been destroyed. This can lead to an exception if the callback tries to access a detector that has been closed in the meantime. If you are using ML Kit in an Activity, you can register an Activity-scoped listener, which is automatically removed when the Activity is stopped. If you are using ML Kit in a Fragment or another environment, you can provide a custom executor that shuts down execution when the Fragment is destroyed; see ScopedExecutor.java in the Vision Quickstart for an example. This will be addressed in an upcoming release.

June 3, 2020

Android: 16.0.0 / iOS: 0.60.0

This is the first release of ML Kit as a standalone SDK, independent from Firebase. This SDK offers all the on-device APIs that were previously offered through the ML Kit for Firebase SDK. For more information on this change and instructions on migrating your existing apps, please follow our migration guide.

This release includes new features, improvements and bug fixes.

Improvements and bug fixes

  • On Android, added support for Face Contours in the “thin” variant of the Face Detection API, backed by Google Play services. This reduces the size impact on your app by ~20.3 MB compared to the bundled variant.
  • On Android, improved CameraX and Camera2 performance for Barcode Scanning and Face Detection APIs by moving image processing code from Java to native code.
  • Android Jetpack Lifecycle support has been added to all APIs. Developers can use addObserver to automatically manage the initiation and teardown of ML Kit APIs as the app goes through screen rotation or closure by the user or system. This makes CameraX integration easier.
  • Text Recognition API: the most prevalent recognized language is now provided.
  • Face Detection API: the Euler X angle of a face is now provided.
  • Barcode Scanning API [bundled]: added support for broken PDF417 start/stop pattern detection, improving recall by 10%.
  • Object Detection and Tracking API: updated the localizer model, improving average precision and reducing footprint by ~700 KB.

Known issues

  • On Android, for the Text Recognition, Image Labeling, and Object Detection and Tracking APIs, the performance with CameraX and Camera2 is still not on par with Camera1. This will be addressed in an upcoming release.
  • On Android, Task callbacks might execute after the Activity or Fragment in which they were registered has been destroyed. This can lead to an exception if the callback tries to access a detector that has been closed in the meantime. If you are using ML Kit in an Activity, you can register an Activity-scoped listener, which is automatically removed when the Activity is stopped. If you are using ML Kit in a Fragment or another environment, you can provide a custom executor that shuts down execution when the Fragment is destroyed; see ScopedExecutor.java in the Vision Quickstart for an example. This will be addressed in an upcoming release.

Legacy releases

Changes prior to June 3, 2020 can be found in the Firebase release notes.