Read It

Read It is a Google Assistant feature for supported Android devices that offers an alternate method for reading news articles, chat messages, and other in-app text elements. Users can say things like, "Hey Google, read it" to have Assistant read long-form web-based content and visible text elements aloud, highlight the words being read, and auto-scroll the page where supported. Adding Read It support to your app enables users to engage with your content in additional hands-free, eyes-free, and voice-forward contexts.

Figure 1. Listening to an app read web content aloud.

Read web-based content

Users can ask Assistant to read web-based content displayed in an app. Apps can implement the onProvideAssistContent() method to optimize this content for Read It. This process helps maintain the structure of data as it's shared with Assistant. Users who receive shared app content can then be deep-linked to it or receive the content directly, instead of as text or a screenshot.

Read It only works with publicly available web content. It does not work with sites that require a sign-in, such as articles protected by a paywall. Assistant may still be able to read portions of protected content that are visible on the screen using optical character recognition (OCR), which is described in the next section.

Read native app elements

In addition to reading web-based content, users can use Read It to have visible app elements on their screen read aloud, such as chat messages, images containing text, and PDFs. When invoked, Assistant uses OCR to identify, highlight, and read the visible text elements on the screen. Assistant supports OCR reading natively, with no developer effort required.

Developers can exclude sensitive web articles and app screens from Read It support using the available opt-out mechanisms.

Provide web-based content to Assistant

You can implement the onProvideAssistContent() method to provide the web URI and basic article context to Assistant and optimize your web content for Read It. Otherwise, Assistant attempts to identify the content URI from the text available on the screen. Assistant then retrieves the web content to read aloud to the user. For Android apps that already implement web-based content using WebViews or Chrome Custom Tabs, we recommend using the same web URIs for Read It as a starting point.
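For example, an activity that displays articles in a WebView might keep the URI of the content it shows in a property, so the same URI can later be reported to Assistant. The following is a minimal sketch; the activity name, layout resources, and the url property are illustrative assumptions rather than required API:

import android.app.Activity
import android.net.Uri
import android.os.Bundle
import android.webkit.WebView

class ArticleActivity : Activity() {
    // Hypothetical property holding the URI of the currently displayed
    // article; onProvideAssistContent() can report this same URI to Assistant.
    lateinit var url: Uri

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.article)

        // Assume the launching intent carries the article URL as its data URI.
        url = intent?.data ?: Uri.parse("https://example.com/articles/example")

        // Load the same URL the user sees; Read It reads from this URI.
        findViewById<WebView>(R.id.article_web_view).loadUrl(url.toString())
    }
}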

When combining Read It functionality with built-in intents, you only need to implement onProvideAssistContent() for the final app activity in the user's task flow after invoking the App Action. For example, if your app displays news articles, you implement onProvideAssistContent() in the final screen showing the article; you don't need to implement it for any in-progress or preview screens.

Provide a web URI for your content in the uri field of AssistContent. Provide contextual information as a JSON-LD object using schema.org vocabulary in the structuredData field.

The following code snippet shows an example of providing content to Assistant:

override fun onProvideAssistContent(outContent: AssistContent) {
    super.onProvideAssistContent(outContent)

    // Set the web URI for the content to be read from a
    // WebView, Chrome Custom Tab, or other source. Here, `url`
    // is the URI of the content this activity currently displays.
    val urlString = url.toString()
    outContent.webUri = Uri.parse(urlString)

    // Create a JSON-LD object based on schema.org structured data
    // and attach it to the AssistContent
    val structuredData = JSONObject()
        .put("@type", "Article")
        .put("name", "Example name of blog post")
        .put("url", outContent.webUri)
        .toString()
    outContent.structuredData = structuredData
}

When implementing onProvideAssistContent(), we recommend providing as much data as possible about each entity. The following fields are required:

  • @type
  • name
  • url (only required if the content is URL-addressable)
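
For illustration, a structured data payload containing only these required fields might look like the following sketch (the title and URL are placeholders):

import org.json.JSONObject

// Minimal structured data with only the required fields, assuming the
// content is URL-addressable. The values are illustrative.
fun buildMinimalStructuredData(): String =
    JSONObject()
        .put("@type", "Article")
        .put("name", "Example article title")
        .put("url", "https://example.com/articles/example-article")
        .toString()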

To learn more about using onProvideAssistContent(), see the Optimizing Contextual Content for the Assistant guide in the Android developer documentation.

Opt out of screen reading

Your app can restrict Read It functionality for sensitive content, such as non-public web content or app screens that display sensitive information. When a user invokes screen reading, Assistant reads web content and screen elements aloud to the intended user and anyone within hearing range of the user's device. We recommend using the following opt-out mechanisms to protect sensitive information from being read aloud.

Exclude web-based content

Online publishers can opt out web-based content using the noPageReadAloud meta tag. This tag prevents text-to-speech (TTS) readers, like Read It, from reading specific articles aloud. The following sample HTML demonstrates this tag on a web article:

<!DOCTYPE html>
<html>
  <head>
    <meta name="google" content="noPageReadAloud" />
    <title>Example subscriber-only news article</title>
  </head>
</html>

Read It only works with publicly available web content. To exclude content that requires an account to view, such as paywalled articles, set the isAccessibleForFree property in your page's structured data markup.
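
The following is a minimal sketch of how the isAccessibleForFree property might appear in JSON-LD markup on an article page; the headline value is illustrative:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example subscriber-only news article",
  "isAccessibleForFree": false
}
</script>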

Exclude native app elements

Apps can exclude specific screens from Read It using the FLAG_SECURE window flag. This flag prevents Read It from performing TTS on screens containing sensitive information, such as user account details or financial information. The following code demonstrates this flag on a sample Activity:

class SecureFlagActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Treat this window's content as secure, which excludes
        // the screen from Read It
        window.setFlags(
            WindowManager.LayoutParams.FLAG_SECURE,
            WindowManager.LayoutParams.FLAG_SECURE
        )
        setContentView(R.layout.main)
    }
}