REST Resource: anomalies

Resource: Anomaly

Represents an anomaly detected in a dataset.

Our anomaly detection systems flag datapoints in a time series that fall outside of an expected range derived from historical data. Although those expected ranges have both an upper and a lower bound, we only flag anomalies when the data has become unexpectedly worse, which usually corresponds to the metric crossing the upper bound.

Multiple contiguous datapoints in a timeline that fall outside of the expected range are grouped into a single anomaly. An anomaly therefore effectively represents a segment of a metric's timeline. The information stored in timelineSpec, dimensions, and metric can be used to fetch a full timeline with an extended range for context.

Required permissions: to access this resource, the calling user needs the View app information (read-only) permission for the app.

JSON representation
{
  "name": string,
  "metricSet": string,
  "timelineSpec": {
    object (TimelineSpec)
  },
  "dimensions": [
    {
      object (DimensionValue)
    }
  ],
  "metric": {
    object (MetricValue)
  }
}
Fields
name

string

Identifier. Name of the anomaly.

Format: apps/{app}/anomalies/{anomaly}

metricSet

string

Metric set resource where the anomaly was detected.

timelineSpec

object (TimelineSpec)

Timeline specification that covers the anomaly period.

dimensions[]

object (DimensionValue)

Combination of dimensions in which the anomaly was detected.

metric

object (MetricValue)

Metric where the anomaly was detected, together with the anomalous value.
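For illustration only, a populated Anomaly resource might look like the following. The app package, anomaly ID, metric set, dimension, and metric names are example values; the dimensions and metrics actually available depend on the metric set being queried.

{
  "name": "apps/com.example.app/anomalies/123",
  "metricSet": "apps/com.example.app/crashRateMetricSet",
  "timelineSpec": {
    "aggregationPeriod": "DAILY",
    "startTime": {
      "year": 2021,
      "month": 11,
      "day": 1,
      "timeZone": { "id": "America/Los_Angeles" }
    },
    "endTime": {
      "year": 2021,
      "month": 11,
      "day": 4,
      "timeZone": { "id": "America/Los_Angeles" }
    }
  },
  "dimensions": [
    {
      "dimension": "countryCode",
      "stringValue": "ES",
      "valueLabel": "Spain"
    }
  ],
  "metric": {
    "metric": "crashRate",
    "decimalValue": { "value": "0.0132" }
  }
}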

TimelineSpec

Specification of the time-related aggregation parameters of a timeline.

Timelines have an aggregation period (DAILY, HOURLY, etc.), which defines how events are aggregated in metrics.

The points in a timeline are defined by the starting DateTime of the aggregation period. The duration is implicit in the AggregationPeriod.

Hourly aggregation periods, when supported by a metric set, are always specified in UTC to avoid ambiguities around daylight saving time transitions, where an hour is skipped when adopting DST, and repeated when abandoning DST. For example, the timestamp '2021-11-07 01:00:00 America/Los_Angeles' is ambiguous since it can correspond to '2021-11-07 08:00:00 UTC' or '2021-11-07 09:00:00 UTC'.

Daily aggregation periods require specifying a timezone which will determine the precise instants of the start and the end of the day. Not all metric sets support all timezones, so make sure to check which timezones are supported by the metric set you want to query.

JSON representation
{
  "aggregationPeriod": enum (AggregationPeriod),
  "startTime": {
    object (DateTime)
  },
  "endTime": {
    object (DateTime)
  }
}
Fields
aggregationPeriod

enum (AggregationPeriod)

Type of the aggregation period of the datapoints in the timeline.

Intervals are identified by the date and time at the start of the interval.

startTime

object (DateTime)

Starting datapoint of the timeline (inclusive). Must be aligned to the aggregation period as follows:

  • HOURLY: the 'minutes', 'seconds' and 'nanos' fields must be unset. The timeZone can be left unset (defaults to UTC) or set explicitly to "UTC". Setting any other utcOffset or timezone id will result in a validation error.
  • DAILY: the 'hours', 'minutes', 'seconds' and 'nanos' fields must be unset. Different metric sets support different timezones. It can be left unset to use the default timezone specified by the metric set.

The timezone of the end point must match the timezone of the start point.

endTime

object (DateTime)

Ending datapoint of the timeline (exclusive). See startTime for restrictions. The timezone of the end point must match the timezone of the start point.
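For illustration, a DAILY timeline specification aligned according to the rules above might look like the following. The America/Los_Angeles time zone is only an example; check which time zones are supported by the metric set you are querying.

{
  "aggregationPeriod": "DAILY",
  "startTime": {
    "year": 2021,
    "month": 11,
    "day": 1,
    "timeZone": { "id": "America/Los_Angeles" }
  },
  "endTime": {
    "year": 2021,
    "month": 11,
    "day": 8,
    "timeZone": { "id": "America/Los_Angeles" }
  }
}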

DimensionValue

Represents the value of a single dimension.

JSON representation
{
  "dimension": string,
  "valueLabel": string,

  // Union field value can be only one of the following:
  "stringValue": string,
  "int64Value": string
  // End of list of possible types for union field value.
}
Fields
dimension

string

Name of the dimension.

valueLabel

string

Optional. Human-friendly label for the value, always in English. For example, 'Spain' for the 'ES' country code.

Whereas the dimension value is stable, this value label is subject to change. Do not assume that the (value, valueLabel) relationship is stable. For example, the country with ISO code 'MK' recently changed its name to 'North Macedonia'.

Union field value. Actual value of the dimension. Type-dependent. value can be only one of the following:
stringValue

string

Actual value, represented as a string.

int64Value

string (int64 format)

Actual value, represented as an int64.
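For illustration, a dimension value holding the country code from the example above might be represented as follows (the dimension name is illustrative and depends on the metric set):

{
  "dimension": "countryCode",
  "stringValue": "ES",
  "valueLabel": "Spain"
}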

MetricValue

Represents the value of a metric.

JSON representation
{
  "metric": string,

  // Union field value can be only one of the following:
  "decimalValue": {
    object (Decimal)
  }
  // End of list of possible types for union field value.

  // Union field confidence_interval can be only one of the following:
  "decimalValueConfidenceInterval": {
    object (DecimalConfidenceInterval)
  }
  // End of list of possible types for union field confidence_interval.
}
Fields
metric

string

Name of the metric.

Union field value. Actual value of the metric. Type-dependent. value can be only one of the following:
decimalValue

object (Decimal)

Actual value, represented as a decimal number.

Union field confidence_interval. If given, represents a confidence interval for value. confidence_interval can be only one of the following:
decimalValueConfidenceInterval

object (DecimalConfidenceInterval)

Confidence interval of a value that is of type type.Decimal.
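For illustration, a metric value carrying a decimal value together with its confidence interval might look like the following (the metric name is illustrative; Decimal values are serialized as strings):

{
  "metric": "crashRate",
  "decimalValue": { "value": "0.0132" },
  "decimalValueConfidenceInterval": {
    "lowerBound": { "value": "0.0121" },
    "upperBound": { "value": "0.0143" }
  }
}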

DecimalConfidenceInterval

Represents the confidence interval of a metric.

JSON representation
{
  "lowerBound": {
    object (Decimal)
  },
  "upperBound": {
    object (Decimal)
  }
}
Fields
lowerBound

object (Decimal)

The confidence interval's lower bound.

upperBound

object (Decimal)

The confidence interval's upper bound.

Methods

list

Lists anomalies in any of the datasets.
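For illustration, and assuming the usual paginated List response shape for this API (an array of resources plus a nextPageToken), a response body from the list method might look like the following; consult the method's own reference page for the exact request parameters and response fields.

{
  "anomalies": [
    {
      "name": "apps/com.example.app/anomalies/123",
      "metricSet": "apps/com.example.app/crashRateMetricSet"
    }
  ],
  "nextPageToken": "example-page-token"
}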