Maps Unity SDK Key Concepts

The following sections contain descriptions of the concepts that are key to understanding and using the Maps Unity SDK.

Maps Unity SDK Demo

This release features the demonstration version of the Maps Unity SDK. It contains the same functionality as the full version of the Maps Unity SDK, but it doesn't include access to the Google Maps geo database. Instead, it includes geo data for a limited dataset centered on Manhattan, New York.

The MapsService class and component

The MapsService class serves as the entry point for interacting with the Maps Unity SDK. It encapsulates the ApiKey, and it exposes the GameObjectManager, the LoadMap function, and the events from the GameObject creation pipeline.

To use the Maps Unity SDK in your Unity project, you add the Maps Service script component to an empty GameObject in your scene. The Maps Service automatically adds generated map feature GameObjects as children of this anchor GameObject. With the Maps Service (Script) attached to your base GameObject, you can access its public attributes in the Unity Inspector, as shown here.

Maps Service (Script)
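As a rough sketch, a loading script attached to the same GameObject might look like the following. Note that InitFloatingOrigin, LatLng, and the ExampleDefaults helpers shown here are assumptions based on the SDK's example scenes, not APIs confirmed by this document, and the coordinates are illustrative:

```csharp
using Google.Maps;
using Google.Maps.Coord;
using UnityEngine;

// Attach to the same GameObject as the Maps Service (Script) component.
public class BasicMapLoader : MonoBehaviour {
  void Start() {
    // Fetch the MapsService component on this GameObject.
    MapsService mapsService = GetComponent<MapsService>();

    // Center the map on a real-world location (illustrative coordinates;
    // InitFloatingOrigin is an assumed method name).
    mapsService.InitFloatingOrigin(new LatLng(40.7484, -73.9857));

    // Load map features around the origin; the bounds and styling options
    // here are assumed helper values for illustration.
    mapsService.LoadMap(ExampleDefaults.DefaultBounds,
                        ExampleDefaults.DefaultGameObjectOptions);
  }
}
```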

Geographic features as Unity GameObjects

The Maps Unity SDK renders geographic features (such as buildings, roads, and water) from the Google Maps database as Unity GameObjects in your game. At runtime, they're created as children of the GameObject that the MapsService component is attached to, and they have names of the form {MapFeatureType} ({PlaceID}).

SDK Game Objects

GameObject creation

During gameplay, the SDK pulls down geo data from the Google Maps database as semantic vector tiles (via the Google Vector Tile API). It decodes this data on the fly, transforming it into Unity GameObjects. This process occurs in stages, in what we refer to as the production pipeline. This approach allows you to access map feature data (both the metadata and the geometry data) at the earliest opportunity, so you can customize the GameObjects before they reach the end of the pipeline.

The first thing that the Maps Unity SDK does when it receives vector tile data is construct a MapFeature object from it. This is an intermediate object composed of two component objects:

MapFeatureMeta
Encapsulates the map feature's metadata. Metadata varies based on the type of map feature. For example, a building might store metadata such as its height, volume, and the points of interest that it contains. Or a road might store its width, the number of lanes it contains, and the names of roads it intersects.
TileCoord
Encapsulates the Cartesian coordinates of the semantic vector tile.

At an intermediate stage in the pipeline, MapFeature objects are specialized. That is, they become specific types (for example, a Google.Maps.Feature.ModeledStructure). These specialized MapFeature objects contain the Shape geometry details (ModeledVolume in the case of a ModeledStructure). These details include both MapFeature-specific data (such as vertices and triangles) and shared interfaces for accessing common fields (such as bounding boxes).

Geometry data is converted into a Unity Mesh, added to the spawned GameObject via a MeshFilter, and then displayed with a MeshRenderer.
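This last step can be sketched in plain Unity code. The following is not SDK code, just an illustration of how decoded geometry becomes a visible object:

```csharp
using UnityEngine;

// Illustrative only: builds a single-triangle Mesh and attaches it the way
// the SDK attaches decoded map-feature geometry.
public static class MeshAttachExample {
  public static GameObject CreateTriangle(Material material) {
    var featureObject = new GameObject("ExampleFeature");

    var mesh = new Mesh();
    mesh.vertices = new[] {
        Vector3.zero, new Vector3(0, 0, 1), new Vector3(1, 0, 0) };
    mesh.triangles = new[] { 0, 1, 2 };
    mesh.RecalculateNormals();

    // The Mesh is added via a MeshFilter, then displayed with a MeshRenderer.
    featureObject.AddComponent<MeshFilter>().mesh = mesh;
    featureObject.AddComponent<MeshRenderer>().material = material;
    return featureObject;
  }
}
```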

Accessing the pipeline

MapFeatures are exposed to you through events that are triggered during various stages of the pipeline. These include WillCreate events—which are fired just before the GameObject is created, allowing you to specify the styling options, or even cancel creation; and DidCreate events—fired just after the GameObject is created, allowing you to make additions or changes to the finished mesh.

As an example, you could examine ExtrudedStructures after their WillCreateExtrudedStructureEvent fires, and hide all buildings shorter than 20 meters (or you could simply skip creating them altogether).
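A sketch of that idea follows. The subscription path mirrors the DidCreate example later in this document, and GetBuildingHeight is a hypothetical helper standing in for whatever metadata accessor exposes a building's height:

```csharp
mapsService.Events.ExtrudedStructureEvents.WillCreate.AddListener(args => {
  // Cancel creation of buildings shorter than 20 meters.
  // GetBuildingHeight is a hypothetical helper; the real metadata field
  // for building height may differ.
  if (GetBuildingHeight(args.MapObject) < 20f) {
    args.Cancel = true;
  }
});
```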

Types of events

The Google.Maps.Event namespace contains an event class for each type of geographic feature.

Each of these event types has a WillCreate and a DidCreate public member event object that you can subscribe to, as demonstrated in the following code example.

dynamicMapsService.MapsService.Events.ExtrudedStructureEvents.DidCreate.AddListener(args => {
  // Apply nine-sliced walls and roof materials to this building.
  buildingTexturer.AssignNineSlicedMaterials(args.GameObject);

  // Add a border around the building base using the Building Border Builder class,
  // coloring it using the given border Material.
  Extruder.AddBuildingBorder(args.GameObject, args.MapFeature.Shape, BuildingAndRoadBorder);
});

WillCreate events

WillCreate events are fired immediately after a MapFeature is created, but before the final GameObject is generated from it. WillCreate events allow you to suppress or customize the GameObjects created from a MapFeature. WillCreate event arguments have the following form:

using System.ComponentModel;
using Google.Maps.Decoded;
using UnityEngine;

namespace Google.Maps {
  public class WillCreateGameObjectEventArgs<T, U>
      : CancelEventArgs where T : IMapObject where U : IGameObjectStyle {

    public readonly T MapObject;
    public U Style;
    public GameObject Prefab;

    public WillCreateGameObjectEventArgs(T mapObject, U defaultStyle, GameObject prefab) {
      MapObject = mapObject;
      Style = defaultStyle;
      Prefab = prefab;
    }
  }
}
  • Setting Cancel (inherited from CancelEventArgs) to true suppresses the creation of the GameObject.
  • MapObject is readonly.
  • Setting Style allows you to customize the appearance of the created GameObject.
  • Setting Prefab replaces the GameObject that would have been generated with the prefab.
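Putting these fields together, a hedged sketch of a WillCreate listener follows. The field names MapObject, Style, Cancel, and Prefab come from the argument class above; UseCustomPrefab, WallMaterial, and the material and prefab variables are illustrative assumptions:

```csharp
mapsService.Events.ExtrudedStructureEvents.WillCreate.AddListener(args => {
  // Replace selected buildings with a custom prefab instead of generated
  // geometry. UseCustomPrefab is a hypothetical predicate.
  if (UseCustomPrefab(args.MapObject)) {
    args.Prefab = customBuildingPrefab;
    return;
  }

  // Otherwise, customize the default styling before the GameObject is
  // created. WallMaterial is an assumed style field for illustration.
  args.Style.WallMaterial = brickMaterial;
});
```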

DidCreate events

DidCreate events are fired after a GameObject is generated and added to the scene. They notify you that the creation of the GameObject was successful, allowing you to perform further processing. DidCreate event arguments have the following form:

using System;
using System.ComponentModel;
using Google.Maps.Decoded;
using UnityEngine;

namespace Google.Maps {
  public class DidCreateGameObjectEventArgs<T, U>
      : EventArgs where T : IMapObject where U : IGameObjectStyle {

    public readonly T MapObject;
    public GameObject CreatedObject;

    public DidCreateGameObjectEventArgs(T mapObject, GameObject createdObject) {
      MapObject = mapObject;
      CreatedObject = createdObject;
    }
  }
}
  • MapObject is readonly, so mutating it will not cause any change to the scene.
  • Altering CreatedObject will alter the GameObject added to the scene.

Buildings

There are two types of buildings: extruded buildings, and modeled structures.

Extruded buildings

Extruded buildings are generated from an outline (that is, a 2D footprint) and a height. The SDK represents most buildings in this manner, and it generates them in one of the following three ways:

  • Using real-world height data (where this information is available). This is the default behavior.

  • By providing a fixed height for all buildings, disregarding their real-world heights.

  • By providing a backup height for all buildings that don't have a real-world height (by default, this value is set to 10 meters).

Combining these three methods allows the Maps Unity SDK to create cityscapes with realistic variance reflecting the real world, or with a constant building height, or a mixture of the two. The KitchenSink example scene demonstrates all of these methods.
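The height-selection logic amounts to the following, sketched as plain C#. All names here are illustrative, not SDK API; the SDK applies equivalent logic internally:

```csharp
// Returns the height to extrude a building to, given the three modes above.
public static float EffectiveHeight(
    float? realWorldHeight, float? fixedHeightOverride, float backupHeight) {
  // Mode 2: a fixed height for all buildings wins if specified.
  if (fixedHeightOverride.HasValue) {
    return fixedHeightOverride.Value;
  }
  // Mode 1: use real-world height data where available.
  if (realWorldHeight.HasValue) {
    return realWorldHeight.Value;
  }
  // Mode 3: fall back to the backup height (10 meters by default).
  return backupHeight;
}
```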

Modeled structures

Modeled structures are generated using the standard 3D modeling approach of tessellated triangles. This approach is typically used for landmark buildings, such as the Statue of Liberty.

The Modeled Statue of Liberty

Applying materials

In Unity, the rendering process uses shaders, materials, and textures to add realism to GameObjects. Shaders define how textures, colors, and lighting are applied to displayed geometry, with the specific textures, colors, and other settings stored as a material. You use materials to define how a surface is rendered, including references to the textures it uses, tiling information, and color.

Shaders are small scripts that contain the logic for calculating the color of each pixel, based on the lighting input and the material configuration. The Maps Unity SDK comes with a standard shader for modeled structures, and another for basemap features, but it also supports advanced material application. UV coordinates are calculated for map feature GameObjects in such a way that any basic material can be applied, and it will look reasonable without modification.

For more advanced material effects, the Maps Unity SDK provides additional per-vertex data via extra UV channels, as well as a number of convenience functions for Cg shaders via the GoogleMapsShaderLib library. This enables effects such as nine-sliced building textures: cutting up a texture into roof, ground, wall-corner, and tiled wall sections for a building.

For more information, see Creating and Using Materials in the Unity User Manual.

UV channels

The UV channels for each MapFeature type contain data of the following form:

ExtrudedStructure

Walls

Each wall on an ExtrudedStructure is constructed as a quad of the following form:

Walls

UV coordinates for walls are calculated per quad. Vertices are not shared between quads—to allow for hard normals between walls (that is, letting the corners of walls appear as hard angles, rather than soft rounded edges).

Channel 0: (x, y, width, height)
x and y are the coordinates relative to the bottom-left corner of this quad (square section) of the wall, whereas width and height are the width and height of this quad of the wall. This applies to every quad making up the wall.
Roof

Roof textures have the option of being either axis-aligned, or aligned to the direction of the ExtrudedStructure. You set this via the ExtrudedStructureStyle object.

Channel 0: (x, y, width, height)
x and y are the coordinates of each vertex, relative to the bottom-left corner of the roof (specifically, the corner of the minimum-cover axis-aligned bounding box for the roof). width and height define the size of the roof's bounding box.

Region

Channel 0: (x, y, width, height)
x and y are the coordinates of each vertex relative to the bottom-left corner of the axis-aligned bounding box for the region. width and height define the size of the bounding box.

Segment

Channel 0: (x, y, width, length)
x and y are the coordinates of each vertex, calculated as if the segment were completely straight—to allow texturing to bend around corners. width and length define the dimensions of the segment.

ModeledStructure

Channel 0:
Each coordinate is set to (0, 0, 0, 0) because there is currently no texture-coordinate implementation.

GoogleMapsShaderLib

The Maps Unity SDK includes a shader library called GoogleMapsShaderLib, to help you build shaders that work well with MapFeature GameObjects. The library is implemented in the file GoogleMapsShaderLib.cginc. You can use the library by adding the following #include directive between the CGPROGRAM and ENDCG directives in your shader script.

CGPROGRAM
// ...
#include "/Assets/GoogleMaps/Materials/GoogleMapsShaderLib.cginc"
// ...
ENDCG

The shader library is bundled inside the GoogleMaps.unitypackage. After you import the package, you can find GoogleMapsShaderLib.cginc inside the project folder /Assets/GoogleMaps/Materials/.

Nine-slicing

GoogleMapsShaderLib includes a convenience function that you can use in fragment shaders to provide nine-slicing of textures. Nine-slicing is a technique for covering surfaces with a texture, where the texture is divided into nine portions using a series of bounds. Areas between the bounds are tiled, and areas outside the bounds remain fixed—as illustrated here:

Nine-slicing

For example, when applying a nine-sliced texture to a building's wall, the top of the texture is applied to the top of the wall (just under the roof), the bottom of the texture is applied to the bottom of the wall (connected to the ground), the sides of the texture are applied to the edges of the wall, and the area in the middle of the texture is tiled evenly across the wall.

On roads (for another example), nine-slicing allows you to have a sidewalk of fixed width, but with a variable number of lanes, depending on the width of the road.

You can use nine-slicing by including GoogleMapsShaderLib.cginc in your shader, and then calling the nineSlice function. Sample shaders and materials are included in the GoogleMaps.unitypackage to demonstrate how you can use the nineSlice function to create a realistic skyscraper of variable size—without stretching or tearing.

Material location
/Assets/GoogleMaps/Materials/NineSlicing/NineSlicedWall.mat
Shader location
/Assets/GoogleMaps/Materials/NineSlicing/NineSlicingExample.shader

You can use nine-slicing on any MapFeature, except for ModeledStructures, which don't currently have any texturing coordinates.

Decorations

The Maps Unity SDK includes a set of helper methods for decorating a map with Unity Prefabs—such as adding TV antennas to a rooftop, or adding trees to a park. The Decoration namespace includes classes that support the following features:

ExtrudedStructureDecorator

ExtrudedStructureDecorator is a class that exposes a decorator for ExtrudedStructure MapFeatures (2.5D buildings).

Single roof decoration

This API allows you to add a single Prefab to the rooftop area of a specified ExtrudedStructure. Examples of roof decorations include antennas, air-conditioning units, and flags.

The Maps Unity SDK supports two types of roof decorations:

Point roof decoration

This type of roof decoration is treated as infinitesimally small, so it can be placed anywhere on a roof (antennas, for instance).

Example
mapsService.Events.ExtrudedStructureEvents.DidCreate.AddListener((e) => {
  ExtrudedStructureDecorator.AddPointRoofDecoration(
      e.CreatedGameObject, yourDecorationPrefab);
});
Volume roof decoration

This type of roof decoration is an object with a width (such as an air-conditioning unit), but it is processed as a cylindrical object, so that the API can place it without overhanging the boundary of the roof. You must define the radius of GameObjects used as roof decorations. To do this, add a CircularMarker component to the root GameObject of the Prefab, and then adjust the Radius property.

It's possible that the API won't find an appropriate position for the roof decoration. This can happen when the target rooftop area is too small. In this case, the API won't place anything on the rooftop, and the method returns false.

Example
mapsService.Events.ExtrudedStructureEvents.DidCreate.AddListener((e) => {
  bool isSuccess = ExtrudedStructureDecorator.AddVolumeRoofDecoration(
      e.CreatedGameObject, yourDecorationPrefab);
});

Multi roof decorations

This is similar to Point Roof Decoration, except that this API prevents multiple roof decorations from overlapping.

To use this API, you must pass in a Func<GameObject> delegate to use as the Prefab generator. Roof decorations are placed iteratively; the generator is called each time the API attempts to place another roof decoration, and the API then attempts to find a space that can accommodate the returned Prefab. When the API can no longer place roof decorations, it stops calling the Prefab generator. You can also stop the API from placing more roof decorations by making the Prefab generator return a null value.

Example

The following code example places air-conditioning units and solar cells with equal probability onto each building's rooftop. The maximum number of decorations per roof is 10.

mapsService.Events.ExtrudedStructureEvents.DidCreate.AddListener((e) => {
  // Determines whether to place air-conditioning units or solar cells.
  bool placeAirConditioningUnit = Random.value < 0.5f;

  // The current number of roof decorations returned from the delegate.
  int count = 0;

  int actualNumberOfPlacedRoofDecorations =
      ExtrudedStructureDecorator.AddMultipleVolumeRoofDecorations(
          e.GameObject, () => {
            if (count >= 10) {
              return null;
            }
            count++;
            if (placeAirConditioningUnit) {
              return airConditioningUnitPrefab;
            } else {
              return solarCellsPrefab;
            }
          });
});

The coordinate system

The Maps Unity SDK coordinate system uses the Web Mercator Projection to convert between spherical WGS 84 latitude-longitude coordinates and Cartesian Unity Worldspace coordinates (Vector3).

Vector3 values are relative to a floating origin, which is typically set to the user's starting location. As a result, you should not persist Vector3 values outside of a session (that is, on your server, or on the user's device). We recommend that you specify physical world locations using latitude-longitude pairs.

A floating origin is used to avoid floating-point stability issues. Unity's Vector3 class uses single-precision floating-point numbers, and the density of representable floating-point values decreases as their magnitude increases (meaning larger values become less precise). You should update the floating origin whenever users move far enough away from the origin for this to become an issue. The update threshold can be relatively small (for example, 100 or 200 meters) or larger (greater than 1 km), depending on how often you want to update.
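A hedged sketch of such an update follows, assuming the SDK exposes a MoveFloatingOrigin-style method on MapsService (the method name is an assumption here, as are the component fields):

```csharp
using Google.Maps;
using UnityEngine;

// Recenters the world when the player strays too far from the origin.
public class FloatingOriginUpdater : MonoBehaviour {
  public MapsService MapsService;
  public Transform Player;

  // Distance from the origin at which to recenter, in meters.
  public float Threshold = 200f;

  void Update() {
    if (Player.position.magnitude > Threshold) {
      // MoveFloatingOrigin is an assumed method name; the idea is that the
      // SDK shifts all loaded geometry so the player's position becomes
      // the new (0, 0, 0).
      MapsService.MoveFloatingOrigin(Player.position);
    }
  }
}
```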

Unity Worldspace is scaled to 1:1 (meters), based on the initial origin's latitude. In the Mercator Projection, scale varies slightly by latitude, so the Unity Worldspace scale diverges marginally from 1:1 as users move north or south; however, users are not expected to move far (or fast) enough for this to be noticeable.

The Maps Unity SDK contains functions for converting between LatLng and Unity Worldspace (Vector3) that take the floating origin and scale into account.
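For example, assuming the conversions are exposed through a Coords-style accessor on MapsService (the accessor and method names here are assumptions, and the coordinates are illustrative):

```csharp
using Google.Maps.Coord;
using UnityEngine;

// Convert a latitude-longitude pair to Unity Worldspace, and back.
// Coords, FromLatLngToVector3, and FromVector3ToLatLng are assumed names.
Vector3 worldPosition = mapsService.Coords.FromLatLngToVector3(
    new LatLng(40.7128, -74.0060));
LatLng roundTrip = mapsService.Coords.FromVector3ToLatLng(worldPosition);
```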

Load errors

Errors that occur while loading map data from the network can be handled with the MapLoadErrorEvent event. The Maps Unity SDK handles most types of errors itself if you don't add an event handler. However, one error requires your app to take action. It is identified by MapLoadErrorArgs.DetailedErrorCode and is described below.

UnsupportedClientVersion

This version of the Maps Unity SDK (possibly in combination with the current API key) is no longer supported. Typically, your app should prompt the user to update to a newer version of your app.

This error usually means that the Maps Unity SDK version is too old. In rare cases, we might return this error if we discover a critical problem with a version of the Maps Unity SDK or with an API key. We will make every effort to communicate this, and to ensure that it doesn't happen until there is a working version of the app available to update to.
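A sketch of a handler for this case follows. The subscription path and the enum nesting are assumptions; only MapLoadErrorArgs.DetailedErrorCode and the UnsupportedClientVersion value come from the text above, and ShowUpdateRequiredDialog is a hypothetical app function:

```csharp
mapsService.Events.MapEvents.LoadError.AddListener(args => {
  if (args.DetailedErrorCode ==
      MapLoadErrorArgs.DetailedErrorEnum.UnsupportedClientVersion) {
    // Prompt the user to update to a newer version of the app.
    ShowUpdateRequiredDialog();
  }
});
```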

Known issues

Currently, you must take care when applying new materials to basemap features. The SDK makes explicit use of Unity's render queue setting on the materials assigned to these features. When replacing the materials on such features, two properties of the replacement material are important for correct rendering: the material should have z-testing set to ZTest Always, and the replacement material should use the same render queue setting as the material it replaces. We recommend that you keep a cache of materials keyed by render queue value, to allow simple reuse. Each feature type (park, forest, and so on) appears on the map with a handful of different render queue settings.
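A minimal sketch of such a cache follows, copying the render queue of the material being replaced. This is plain Unity code, with no SDK-specific API:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Caches one replacement material per render queue value, so basemap
// features keep their original draw order when their materials change.
public class BaseMapMaterialCache {
  private readonly Material Template;
  private readonly Dictionary<int, Material> Cache =
      new Dictionary<int, Material>();

  public BaseMapMaterialCache(Material template) {
    Template = template;
  }

  // Returns a replacement material with the same render queue as `original`.
  public Material Replace(Material original) {
    Material replacement;
    if (!Cache.TryGetValue(original.renderQueue, out replacement)) {
      replacement = new Material(Template);
      replacement.renderQueue = original.renderQueue;
      Cache[original.renderQueue] = replacement;
    }
    return replacement;
  }
}
```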

Google has included a sample shader that addresses these issues in the GoogleMaps.unitypackage. It's called BaseMapTextured.shader, and it's located in the /Assets/GoogleMaps/Materials/ folder. To use it on a material, select Google > Maps > Shaders > BaseMap Textured from the shader drop-down in the material inspector.

When styling a Feature.Region or a Feature.AreaWater object, you can apply a fill using a material, a custom color, or an automatically generated color, chosen via the FillModeType enum in the styling object. Auto colors are generated based on the value of the Region's usage type, as shown in the default Region style in the KitchenSink example scene.