Mobile devices have become a ubiquitous part of daily life for many. Your users have their phones with them throughout the day as they drive, walk, exercise, work, and play.
Understanding what users are doing in the physical world allows your app to be smarter about how to interact with them. For example, one app can start tracking a user's heart rate when the user starts running, while another can switch to car mode when it detects that the user has started driving.
The Activity Recognition API is built on top of the sensors available in a device. Device sensors provide insights into what users are currently doing. However, with dozens of signals from multiple sensors and slight variations in how people do things, detecting what users are doing is not easy.
The Activity Recognition API automatically detects activities by periodically reading short bursts of sensor data and processing them using machine learning models. To optimize resources, the API may stop activity reporting if the device has been still for a while, and it uses low-power sensors to resume reporting when it detects movement.
Perform an action when your app receives activity information
The Activity Recognition API delivers its results to a callback, which is usually implemented as an IntentService in your app. The results are delivered at intervals that you specify, or your app can use the results requested by other clients without consuming additional power itself.
You can tell the API how to deliver results by using a PendingIntent, which removes the need to keep a service constantly running in the background for activity detection. Your app receives the corresponding Intents from the API, extracts the detected activities, and decides whether it should take an action. Invoking the service only when an activity is received conserves resources, such as memory.
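The push-based flow above can be modeled in plain Java. This is a minimal sketch, not the real API: the actual pieces (ActivityRecognitionClient, PendingIntent, IntentService, ActivityRecognitionResult) ship with Android and Google Play services, so the class and method names below are illustrative stand-ins.

```java
import java.util.function.Consumer;

// Minimal sketch of the PendingIntent-style delivery flow, using
// plain-Java stand-ins for the Android classes named in the text.
class ActivityDeliverySketch {
    // Models the payload your service would extract from the Intent:
    // an activity type plus a confidence percentage (0-100).
    record Result(String type, int confidence) {}

    private final Consumer<Result> onResult;

    ActivityDeliverySketch(Consumer<Result> onResult) {
        this.onResult = onResult;
    }

    // The API pushes each result to the callback; no code runs between
    // deliveries, which is what makes this approach cheap on resources.
    void deliver(Result result) {
        onResult.accept(result);
    }
}
```

In a real app, the `deliver` call corresponds to the system invoking your IntentService with an Intent, from which you extract the detected activities and decide whether to act.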
Receive detected activities that include a confidence grade
The Activity Recognition API does the heavy lifting by processing the signals from the device to identify the current activities. Your app receives a list of detected activities, each of which includes confidence and type properties. The confidence property indicates the likelihood that the user is performing the activity represented in the result. The type property represents the detected activity of the device relative to entities in the physical world; for example, the device is on a bicycle, or the device is on a user who is running.
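The result structure can be sketched as follows. This is an illustrative stand-in, assuming a simplified shape: the real DetectedActivity class comes from Google Play services, and the type strings below mirror its constants (such as IN_VEHICLE, ON_BICYCLE, RUNNING, and STILL) rather than reproduce them.

```java
import java.util.Comparator;
import java.util.List;

// Illustrative stand-in for a detected-activity result: a type plus
// a confidence percentage, as described in the text above.
class DetectedActivitySketch {
    final String type;     // what the device appears to be doing
    final int confidence;  // likelihood, as a percentage from 0 to 100

    DetectedActivitySketch(String type, int confidence) {
        this.type = type;
        this.confidence = confidence;
    }

    // Pick the entry with the highest confidence from the reported list,
    // the usual first step before an app decides whether to act.
    static DetectedActivitySketch mostProbable(List<DetectedActivitySketch> results) {
        return results.stream()
                .max(Comparator.comparingInt(r -> r.confidence))
                .orElseThrow();
    }
}
```

An app would typically compare the top result's confidence against a threshold before taking any action, since low-confidence results are common during transitions between activities.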