"Street View ready pro" specifications

Introduction

These specs, which are updated from time to time, detail all hardware, timing, and data requirements for advanced 360 cameras that offer high-speed, high-accuracy Street View capture and publishing capabilities. Please note that this program does not apply to any operational or mechanical functions.

Imagery

  • ≥15 MP at ≥7 FPS
  • 360° horizontal FOV
  • ≥135° contiguous vertical FOV
  • Google will review image and geometry quality

IMU

Recommended components:

The accelerometer should satisfy the following specifications:

  • Resolution: ≥16 bit
  • Range: ≥ +/- 8 g with ≥4096 LSB/g typically
  • Sampling rate: ≥200 Hz with <1% jitter
  • Low-pass filtering must be enabled to eliminate aliasing. The cut-off frequency should be set at the highest possible value below the Nyquist frequency, which is half the sampling rate. For example, if the sampling rate is 200 Hz, then the low-pass filter cut-off should be below 100 Hz but as close to it as possible.
  • Noise Density must be ≤300 μg/√Hz, and should be ≤150 μg/√Hz
  • Stationary noise bias stability: <15 μg * √Hz from a 24-hour static dataset
  • Bias change vs. temperature: ≤ +/- 1 mg/°C
  • Best-fit line non-linearity: ≤0.5%
  • Sensitivity change vs. temperature: ≤0.03%/°C

The gyroscope should satisfy the following specifications:

  • Resolution: ≥16 bit
  • Range: ≥ +/- 1000 deg/s with ≥32 LSB/dps
  • Sampling rate: ≥200 Hz with <1% jitter
  • Low-pass filtering must be enabled to eliminate aliasing. The cut-off frequency should be set at the highest possible value below the Nyquist frequency, which is half the sampling rate. For example, if the sampling rate is 200 Hz, then the low-pass filter cut-off should be below 100 Hz but as close to it as possible.
  • Noise Density: ≤0.01 °/s/√Hz
  • Stationary noise bias stability: <0.0002 °/s * √Hz from a 24-hour static dataset
  • Bias change vs. temperature: ≤ +/- 0.015 °/ s / °C
  • Best-fit line non-linearity must be ≤0.2%, should be ≤0.1%
  • Sensitivity change vs. temperature: ≤0.02% / °C

GPS

Recommended components

  • Either the u-blox MAX-M8 series or the u-blox NEO-M8 series

Requirements

  • Sampling rate: ≥4 Hz
  • Constellation: simultaneous tracking of at least GPS and GLONASS
  • Time to first fix:
    • Cold: ≤40 seconds
    • Hot: ≤5 seconds
  • Sensitivity:
    • Tracking: -158 dBm
    • Acquisition: -145 dBm
  • Horizontal position accuracy: 2.5 meters (circular error probable (CEP), 50%, 24 hours static, >6 SVs)
  • Velocity accuracy: 0.06 m/s (50% at 30 m/s)
  • Operational limit: ≥4 g
  • Internal antenna or rigidly affixed external antenna of known type

Antenna design

Physically small products, such as cameras, that contain both a GPS receiver and numerous other complex electronic systems are prone to degraded radio receiver performance caused by RF emissions from those electronics. This interference is often in-band to the radio receiver and, as such, cannot be filtered out.

Timing specifications

All sensor measurements must be accurately timestamped with respect to the same stable system clock. Each measurement must be timestamped at the moment the sensor measured the quantity, not when the processor received the message from the sensor chip. The timestamp jitter between the different sensor readings should be <1 ms. All timestamps recorded in the same sensor data log must be monotonic, with no discontinuities. If the hardware reboots or resets and the system clock resets, then a new log must be created to store the new incoming data.
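
As a minimal illustration of one way this could be implemented (the clock source, the SensorLogger type, and the StartNewLog helper below are assumptions, not part of this specification):

// Minimal sketch: a single monotonic clock timestamps all sensors, and a
// new log is started whenever the clock is observed to reset.
#include <chrono>
#include <cstdint>

// Nanoseconds from a stable, monotonic system clock shared by all sensors.
int64_t StableClockNanos() {
  return std::chrono::duration_cast<std::chrono::nanoseconds>(
             std::chrono::steady_clock::now().time_since_epoch())
      .count();
}

// If the clock ever moves backwards (for example, after a hardware reset),
// the current log must be closed and a new one opened for subsequent data.
struct SensorLogger {
  int64_t last_timestamp_ns = 0;

  void Record(int64_t timestamp_ns /*, sensor payload... */) {
    if (timestamp_ns < last_timestamp_ns) {
      StartNewLog();  // hypothetical helper: rotate to a fresh log file
    }
    last_timestamp_ns = timestamp_ns;
    // ... append the timestamp and payload to the current log ...
  }

  void StartNewLog() { /* open a new log file */ }
};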

GPS

The GPS sensor should support output of a time pulse and an associated message containing the GPS time that corresponds to the pulse. The device should have an input to receive these time pulses; when it detects the leading or trailing edge (whichever is appropriate), it should record the timestamp from the stable system clock. When the corresponding message containing the GPS time of that pulse arrives, the device can relate GPS time to the stable system clock, and can then use that relationship to timestamp the navigation messages and other GPS data packets that carry GPS epoch time.
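
As a rough sketch of that correlation step (the type and function names below are illustrative assumptions, not a u-blox or Street View API):

#include <cstdint>

// Sketch: relate GPS time to the stable system clock using the time pulse.
struct GpsTimeSync {
  int64_t last_pulse_system_ns = 0;  // system-clock time of the last pulse edge
  double  system_minus_gps_s = 0.0;  // current estimate of the clock offset
  bool    valid = false;

  // Called from the capture/interrupt handler on the time pulse edge.
  void OnTimePulseEdge(int64_t system_ns) { last_pulse_system_ns = system_ns; }

  // Called when the message carrying the GPS time of that pulse arrives.
  void OnTimePulseMessage(double gps_epoch_seconds) {
    system_minus_gps_s = last_pulse_system_ns * 1e-9 - gps_epoch_seconds;
    valid = true;
  }

  // Converts a GPS-epoch timestamp from a later navigation message into a
  // timestamp on the stable system clock.
  int64_t GpsToSystemNanos(double gps_epoch_seconds) const {
    return static_cast<int64_t>((gps_epoch_seconds + system_minus_gps_s) * 1e9);
  }
};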

Video / images

The image sensor must support hardware timing to determine the precise time with respect to the stable system clock. In the event of dropped frames, subsequent frames must still reflect accurate timestamps. The timestamp must be with respect to the first active photon in the image. The manufacturer must specify which pixel this corresponds to.
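
For instance, if the hardware latches a timestamp when sensor readout starts, the image timestamp might be derived by backing out the exposure of the manufacturer-specified reference pixel; the function and offsets below are illustrative assumptions, not specified values.

#include <cstdint>

// Illustrative only: derive the image timestamp (first active photon of the
// manufacturer-specified reference pixel) from a hardware-latched
// readout-start timestamp.
int64_t ImageTimestampNanos(int64_t readout_start_ns,
                            int64_t exposure_time_ns,
                            int64_t rolling_shutter_offset_ns) {
  // Exposure of the reference pixel begins one exposure time before its
  // readout; rolling_shutter_offset_ns accounts for the line position of
  // that pixel (zero for a global shutter).
  return readout_start_ns + rolling_shutter_offset_ns - exposure_time_ns;
}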

IMU

The IMU (accelerometer and gyroscope) measurements must be timestamped with respect to when the measurement was taken, not when received.

Data specifications

Street View optimized cameras and systems must collect multiple data measurements per sensor per second. The following sections detail the data required for each individual measurement.

IMU data requirements

IMU (accelerometer and gyroscope) measurement data:

int64 time_accel;    // The time in nanoseconds when the accelerometer
                     // measurement was taken. This is from the same stable
                     // system clock that is used to timestamp the GPS and
                     // image measurements.
// The accelerometer readings in meters/sec^2. The x, y, z refer to axes of
// the sensor.
float accel_x;
float accel_y;
float accel_z;

int64 time_gyro;     // The time in nanoseconds when the gyroscope
                     // measurement was taken. This is from the same stable
                     // system clock that is used to timestamp the GPS and
                     // image measurements.
// The gyro readings in radians/sec. The x, y, z refer to axes of the sensor.
float gyro_x;
float gyro_y;
float gyro_z;
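
For illustration, these fields could be grouped into one record per IMU sample; the struct name and layout below are an assumption about one possible log format, not a requirement.

#include <cstdint>

// One possible in-memory layout for a single IMU measurement record
// (illustrative; the specification only defines the fields themselves).
struct ImuRecord {
  int64_t time_accel;  // ns, stable system clock, when the accel measurement was taken
  float accel_x;       // m/s^2, sensor x-axis
  float accel_y;       // m/s^2, sensor y-axis
  float accel_z;       // m/s^2, sensor z-axis

  int64_t time_gyro;   // ns, stable system clock, when the gyro measurement was taken
  float gyro_x;        // rad/s, sensor x-axis
  float gyro_y;        // rad/s, sensor y-axis
  float gyro_z;        // rad/s, sensor z-axis
};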

GPS data requirements

int64 time;         // Time in nanoseconds, representing when the GPS
                    // measurement was taken, based on the same stable
                    // system clock that issues timestamps to the IMU
                    // and image measurements
double time_gps_epoch;      // Seconds from GPS epoch when measurement was taken
int gps_fix_type;           // The GPS fix type
                            // 0: no fix
                            // 2: 2D fix
                            // 3: 3D fix
double latitude;            // Latitude in degrees
double longitude;           // Longitude in degrees
float altitude;             // Height above the WGS-84 ellipsoid in meters
float horizontal_accuracy;  // Horizontal (lat/long) accuracy in meters
float vertical_accuracy;    // Vertical (altitude) accuracy in meters
float velocity_east;        // Velocity in the east direction represented in
                            // meters/second
float velocity_north;       // Velocity in the north direction represented in
                            // meters/second
float velocity_up;          // Velocity in the up direction represented in
                            // meters/second
float speed_accuracy;       // Speed accuracy represented in meters/second

Video requirements

Video must be recorded at a frame rate of 7 Hz or greater. The camera should also record metadata associated with each image frame. For each image, record:

int64 time;   // The time in nanoseconds when the image was taken.
              // This is from the same stable system clock that is used to
              // timestamp the IMU and GPS measurements.

// The corresponding frame in the video.
int32 frame_num;

You must also fill in the following user-data atoms in your MP4 360 video:

  • moov/udta/manu: Camera manufacturer (make) as a string
  • moov/udta/modl: Camera model as a string
  • moov/udta/meta/ilst/FIRM: Firmware version as a string

You can verify your video with the ffprobe command:
$ ffprobe your_video.mp4
...
  Metadata:
    make            : my.camera.make
    model           : my.camera.model
    firmware        : v_1234.4321
...

Camera architecture

The six degrees of freedom (6-DOF) transformation (relative position and orientation) between each sensor’s and each camera’s frame of reference (FOR) must be specified with respect to the accelerometer FOR. The sensor FOR must be as defined in the sensor’s data sheet and aligned with the sensor’s physical placement in the device. The FOR for each camera has the positive z-axis pointing away from the device along the optical axis into the camera’s FOV, the x-axis pointing to the right, the y-axis pointing down (from the top of the image to the bottom), and the origin at the camera’s optical center. The GPS FOR is located at the antenna.

The 6-DOF transformation (3-DOF for position and 3-DOF for orientation) of each sensor or camera is represented as a 3x4 transformation matrix T = [R p], where R is the 3x3 rotation matrix representing the orientation of the sensor or camera FOR in the accelerometer FOR, and p is the 3x1 position vector (x, y, z) in meters representing the origin of the sensor or camera FOR in the accelerometer FOR.
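
As a concrete (hypothetical) example, a camera whose axes happen to coincide with the accelerometer axes and whose optical center sits 5 cm along the accelerometer z-axis could be reported as follows; the numbers and the array name are illustrative only.

// T = [R p] stored as a row-major 3x4 matrix. The columns of R are the
// camera axes expressed in accelerometer coordinates (identity here), and
// the last column is the camera origin in meters in the accelerometer FOR.
const float kCameraPoseInAccelFor[3][4] = {
    {1.0f, 0.0f, 0.0f, 0.00f},
    {0.0f, 1.0f, 0.0f, 0.00f},
    {0.0f, 0.0f, 1.0f, 0.05f},
};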

The requested transformations can be taken from a computer-aided design (CAD) model of the device and do not need to be device-specific to account for manufacturing variations. This information must be shared with Google at the start of the evaluation process.

Camera configuration

  • The camera should not perform any motion stabilization to the images.
  • The camera settings should be tuned for capturing imagery indoors and outdoors.

Power (either or both of the following models should be employed):

  • USB 3.1 tethered power and recharging, supporting ≥4 hours of recording
  • Battery-powered operation supporting >1 hour recording and upload

Software implementation reminders

Support for upload via the Street View Publish API is required. Please note that all requests to the API must be authenticated as described here.

For all imagery uploaded to Street View:

For all 360 videos uploaded to Street View:

  • Telemetry data must be communicated using Camera Motion Metadata.
  • The photo sequence must be encoded at the frame rate at which the video was captured.

Please also include the following language and line within your application prior to the user publishing (at least the first time):

“This content will be public on Google Maps and may also appear in other Google products. You can learn more about the Maps User Contributed Content Policy here.”

Product evaluation

  • Interested in Street View ready pro? Get ready!
    • Review Open Spherical Camera API and Street View Publish API
    • Request access to 360 photo sequences via Street View Publish API Support with a description of how your product meets the above specifications. You may also be asked to provide the below information using a template provided by our team.
      • 3 MP4 files and 3 photos that comply with the above specifications, including the Camera Motion Metadata Specification
      • Accounts to whitelist for access to the 360 photo sequence documentation and methods necessary for Street View ready pro eligibility.
      We will review your submission and provide feedback. Once we confirm that the test data is complete and compliant, please proceed to the next step.
  • Selected for Street View ready pro? Get started!
    • Share with us your product’s camera architecture
    • Enable your product to upload 360 photos and photo sequences to Street View, using the Street View Publish API
    • Publish 12 photo sequences (covering at least 20 km per photo sequence) and 12 photos, spread equally across the categories below. Please share the results with us using the template provided by our team.
      • camera control operating systems: Android, iOS, on-device
      • uploading software operating systems: Android, iOS, MacOS, Windows, on-device
      • area types: urban canyons, other urban areas, suburban neighborhoods
      We will review your submission and provide feedback. Once we confirm that the test data is complete and compliant, please proceed to the next step.
    • Engage at least 5 beta testers to upload at least 3 photo sequences each (covering at least 5 km per photo sequence). Please share the results with us using the template provided by our team. We will review your submission and provide feedback. Once we confirm that the test data is complete and compliant, please proceed to the next step.
    • Coordinate with our team to supply any necessary equipment (including accessories), access, and help content so that we can assess your product’s end-to-end Street View experience. We will review the results of our tests and provide our feedback. Once we confirm that the test data and publishing flow are compliant, please proceed to the next step.
  • Approved as Street View ready pro? Congrats!
    • One last step: please submit a launch plan, including links to help content and support channels, to prepare for possible co-marketing opportunities (subject to our branding guidelines). Please share your plan using the template provided by our team. Once your submission has been fully approved, we will provide access to the Street View ready pro badge and coordinate any additional co-marketing opportunities.
    • Congrats on being approved as Street View ready (pro grade)! This status remains valid for 1 year; products are automatically eligible for a second year if their users publish over 5,000 km of imagery to Google Maps during their first year.

Exceptions

Exceptions may be granted for specific hardware and software solutions that do not match individual requirements but meet the overall end-to-end performance metrics stipulated in this document.