On-demand Rides and Deliveries Solution is currently available only to select partners.


Fleet Engine offers a simple logging service that lets you save its API requests and response payloads. With these logs, you can debug your integration, create monitoring metrics, and analyze traffic patterns.

Fleet Engine sends the logs as platform logs to Cloud Logging, so that you can use the Cloud Logging tools to easily access them.

What Fleet Engine logs

Fleet Engine sends all authenticated REST and gRPC requests and responses, including error responses, to Cloud Logging. These logs also include calls initiated by the Driver SDK to Fleet Engine. However, Fleet Engine does not log authentication errors, and it can redact some request and response fields for data-protection reasons.

For the available log messages and their schema, see the Logging Reference.

Enabling Cloud Logging

Logging might not be enabled by default for projects created before February 10, 2022. You can confirm whether logging is enabled by running the following query in the Logs Explorer:

    resource.type:"fleetengine.googleapis.com"

If you don't see any logs for that query, Cloud Logging might not have been enabled for your project. Contact Support if you want to enable the feature.

Accessing your logs

Cloud Logging logs are structured around the LogEntry format. Fleet Engine sends logs to Cloud Logging with the LogEntry resource.type set to fleetengine.googleapis.com. You can use the Logs Explorer to write queries for viewing your logs.

For example, to view all RPCs to Fleet Engine that returned an error, use the following Logs Explorer query:
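One form of this query combines the Fleet Engine resource type with the standard LogEntry severity field:

    resource.type:"fleetengine.googleapis.com"
    severity=ERROR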


To view logs of RPCs made to the UpdateVehicle method for the project example-project-id, use the following Logs Explorer query:
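One form of this query uses the logName pattern shown in the LogEntry example below:

    resource.type="fleetengine.googleapis.com/Fleet"
    logName="projects/example-project-id/logs/fleetengine.googleapis.com%2Fupdate_vehicle"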


The following example shows a LogEntry for the UpdateVehicle log. The RPC request and response are located inside the jsonPayload field:

    {
      "insertId": "c6b85fbc927343fc8a85338c57a65733",
      "jsonPayload": {
        "request": {
          "header": {...},
          "updateMask": "deviceSettings",
          "vehicleId": "uniqueVehicleId",
          "vehicle": {...}
        },
        "response": {
          "name": "providers/example-project-id/vehicles/uniqueVehicleId",
          "availableCapacity": 2,
          "state": "VEHICLE_STATE_OFFLINE",
          "maximumCapacity": 2,
          "vehicleType": {...},
          "supportedTrips": {...}
        },
        "@type": "type.googleapis.com/maps.fleetengine.v1.UpdateVehicleLog"
      },
      "resource": {
        "type": "fleetengine.googleapis.com/Fleet",
        "labels": {...}
      },
      "timestamp": "2021-01-01T00:00:00.000000000Z",
      "labels": {...},
      "logName": "projects/example-project-id/logs/fleetengine.googleapis.com%2Fupdate_vehicle",
      "receiveTimestamp": "2021-01-01T00:00:00.000000000Z"
    }

If an RPC error is returned, the response field is cleared and the errorResponse field is set and populated within jsonPayload:

    {
      "insertId": "c6b85fbc927343fc8a85338c57a65733",
      "jsonPayload": {
        "errorResponse": {
          "httpStatusCode": 404,
          "code": "NOT_FOUND",
          "message": "No entity with id invalidVehicleId exists"
        },
        "@type": "type.googleapis.com/maps.fleetengine.v1.UpdateVehicleLog",
        "request": {
          "vehicle": {...},
          "updateMask": "deviceSettings",
          "vehicleId": "fakeVehicleId",
          "header": {...}
        }
      },
      "resource": {
        "type": "fleetengine.googleapis.com/Fleet",
        "labels": {...}
      },
      "timestamp": "2021-01-01T00:00:00.000000000Z",
      "severity": "ERROR",
      "labels": {...},
      "logName": "projects/example-project-id/logs/fleetengine.googleapis.com%2Fupdate_vehicle",
      "receiveTimestamp": "2021-01-01T00:00:00.000000000Z"
    }
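To illustrate the two payload shapes above, the following Python sketch (illustrative only; the function and sample data are not part of any Fleet Engine SDK) counts error entries in a list of exported LogEntry dicts by checking for an errorResponse key:

```python
# Minimal sketch: classify exported Fleet Engine LogEntry dicts.
# An entry is an error if jsonPayload contains an errorResponse field.
def count_errors(entries):
    """Return (num_errors, num_successes) for a list of LogEntry dicts."""
    errors = sum(1 for e in entries if "errorResponse" in e.get("jsonPayload", {}))
    return errors, len(entries) - errors

sample = [
    {"jsonPayload": {"request": {}, "response": {"name": "providers/p/vehicles/v1"}}},
    {"jsonPayload": {"request": {}, "errorResponse": {"httpStatusCode": 404}}},
]
print(count_errors(sample))  # (1, 1)
```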

For more information about the logging query language, see Logging query language. For information about how you can use your logs to create metrics, see Overview of logs-based metrics.

Managing logging costs

After logging is enabled, you are responsible for deciding how to route, store, and retain your logs. You may incur additional Google Cloud charges for log ingestion and retention if you exceed the free usage and retention limits. You can control logging costs in any of the following ways:

Reducing logging usage

You can limit the amount of log data ingestion by excluding certain log entries.

Exporting or routing logs

You can route logs to other GCP or external destinations to avoid the default ingestion and storage costs. Make sure you turn off log ingestion, as described below, to avoid ingestion costs.
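For example, a sink that routes Fleet Engine logs to a BigQuery dataset might be created as follows (the sink name, project ID, and dataset name are placeholders):

    gcloud logging sinks create fleetengine-logs-sink \
      bigquery.googleapis.com/projects/example-project-id/datasets/FleetEngineLogs \
      --log-filter='resource.type:"fleetengine.googleapis.com"'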

Turning off log ingestion

Reducing logging usage, or exporting or routing logs, are preferred over turning off log ingestion. However, if you don't intend to use Fleet Engine logs, you can avoid potential Cloud Logging charges by turning off ingestion. By default, Fleet Engine logs are routed to the _Default log bucket.

The following command updates the _Default logging bucket to not ingest Fleet Engine logs.

gcloud logging sinks update _Default --log-filter='
  NOT LOG_ID("cloudaudit.googleapis.com/activity")
  AND NOT LOG_ID("externalaudit.googleapis.com/activity")
  AND NOT LOG_ID("cloudaudit.googleapis.com/system_event")
  AND NOT LOG_ID("externalaudit.googleapis.com/system_event")
  AND NOT LOG_ID("cloudaudit.googleapis.com/access_transparency")
  AND NOT LOG_ID("externalaudit.googleapis.com/access_transparency")
  AND NOT resource.type:"fleetengine.googleapis.com"'

For more information, see Cloud Logging Exclusions and Excluding logs, and Cloud Logging Exports and Exporting logs.

Using the Logs Explorer

To use the Logs Explorer, open the Cloud Console, select Logging, and then Logs Explorer. To see a list of all the Fleet Engine logs available, click on the Fleet Engine Resource Type. Some Delivery API logs are labeled with a Task ID and a Delivery Vehicle ID. You can use these labels to select logs for the tasks or vehicles that interest you.

Log labels

Filtering logs by delivery vehicle ID

In the Logs Explorer, you can use the following query to restrict logs to a specific vehicle:
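For example, assuming the delivery_vehicle_id label carried by Delivery API logs, a query of the following shape works (substitute your own vehicle ID):

    resource.type="fleetengine.googleapis.com/Fleet"
    labels.delivery_vehicle_id="uniqueVehicleId"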



Filtering logs by task ID

In the Logs Explorer, you can use the following query to restrict logs to a specific task:
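Similarly, assuming the task_id label carried by Delivery API logs, a query of the following shape works (substitute your own task ID):

    resource.type="fleetengine.googleapis.com/Fleet"
    labels.task_id="taskId"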


Filtering logs for a vehicle over a specific time period

In the Logs Explorer, you can use the following query to restrict logs to those for a vehicle over a specific time period:
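Combining the vehicle label with a timestamp range gives a query like the following (the label name and timestamps are illustrative):

    resource.type="fleetengine.googleapis.com/Fleet"
    labels.delivery_vehicle_id="uniqueVehicleId"
    timestamp>="2021-01-01T00:00:00Z"
    timestamp<="2021-01-02T00:00:00Z"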


Log-based metrics example

The following example shows how to use log-based metrics to track the number of tasks created over time.

  1. In the Cloud Console, select Logging and then Logs Explorer to open the Logs Explorer. Then apply the following filter:

    resource.type="fleetengine.googleapis.com/Fleet"
    resource.labels.location="global"
    jsonPayload.response.type=("TASK_TYPE_LOG_PICKUP" OR "TASK_TYPE_LOG_DELIVERY")
  2. In the Query Results pane, select the Actions drop-down and then select Create Metric.


  3. In the Metric Editor dialog:

    • Specify a metric name (for example, billable_tasks).
    • Specify a metric description (for example, The number of Billable Trip calls).
    • Leave the Units option blank.
    • Leave the Type option as Counter.

    Then select the Create Metric button.

  4. On the Logs-Based Metrics page, you should see a message confirming that the metric was created successfully, and the new metric should appear in the User-defined metrics section. The metric is populated as matching logs are generated.

  5. Select the vertical drop down on the right side of the new metric and then select View in Metrics Explorer.


  6. On the left pane under Build Your Query, set the Resource Type to Fleet Engine and search for the billable_tasks metric.


    The graph on the right shows the rate of billable_tasks calls.

Using BigQuery

BigQuery is a powerful tool for performing analytics. You can use it to store logs long term and to run ad hoc SQL queries against the data.

Routing logs to BigQuery

To take advantage of BigQuery, logs must be routed to a BigQuery data store, as follows:

  1. In the Cloud Console, select Logging and then Logs Explorer.

  2. Create a filter that isolates Fleet Engine logs. In the Logs Field Explorer, select the fleetengine.googleapis.com/Fleet resource type.


  3. In the Query Results pane, click the Actions drop-down and choose Create Sink.


  4. In the Select sink service dialog, select BigQuery dataset.


  5. In the Edit Sink dialog, specify the following options:

    • Specify a sink name (for example, FleetEngineLogsSink).
    • Leave Sink Service as BigQuery.
    • Select the Use Partitioned Tables option. This will boost query performance.
    • Under Sink Destination, select Create New BigQuery Data Set, and then specify a BigQuery data set name (for example, FleetEngineLogs).
    • Click the Create Sink button.


Your logs should now begin to populate the BigQuery data set. You can see the data in the BigQuery section of the Cloud Console.


Several tables under the FleetEngineLogs data set will be populated automatically, one for each log type:

  • CreateDeliveryVehicle
  • GetDeliveryVehicle
  • ListDeliveryVehicle
  • UpdateDeliveryVehicle
  • CreateTask
  • GetTask
  • UpdateTask
  • ListTasks
  • SearchTasks

The table names use the following pattern:

    project_id.data_set.log_name

For example, if the project is called test_project and the dataset name is FleetEngineLogs, the CreateTask table has the following name:

    test_project.FleetEngineLogs.fleetengine_googleapis_com_create_task
Example queries

This section shows examples of queries you can create.

Tasks created per hour

The following query counts the number of CreateTask logs and groups them by hour:

    SELECT TIMESTAMP_TRUNC(timestamp, HOUR) AS hour,
           COUNT(*) AS num_tasks_created
    FROM `test_project.FleetEngineLogs.fleetengine_googleapis_com_create_task`
    GROUP BY hour
    ORDER BY hour

Number of stops per vehicle per hour

The following query generates a count of the stops that a vehicle served, broken down by hour.

For example, this query could tell us that in the last hour:

  • Vehicle A completed 10 stops in hour 12 and 8 stops in hour 13.
  • Vehicle B completed 5 stops in hour 11 and 7 stops in hour 12.
  • Vehicle C completed 12 stops in hour 13 and 9 stops in hour 14.

    SELECT
      jsonpayload_v1_updatedeliveryvehiclelog.request.deliveryvehicleid AS vehicle,
      TIMESTAMP_TRUNC(timestamp, HOUR) AS hour,
      COUNT(*) AS num_stops
    FROM `test_project.FleetEngineLogs.fleetengine_googleapis_com_update_delivery_vehicle`
    WHERE ARRAY_LENGTH(jsonpayload_v1_updatedeliveryvehiclelog.request.deliveryvehicle.remainingvehiclejourneysegments) > 0
      AND jsonpayload_v1_updatedeliveryvehiclelog.request.deliveryvehicle.remainingvehiclejourneysegments[OFFSET(0)].stop.state = 'VEHICLE_STOP_STATE_LOG_ARRIVED'
    GROUP BY 1, 2
    ORDER BY 1, 2

First delivery success rate

The following query shows the success rate of first delivery attempts:

    SELECT
      COUNTIF(outcome = "TASK_OUTCOME_LOG_SUCCEEDED") AS num_success,
      COUNT(*) AS total_deliveries,
      COUNTIF(outcome = "TASK_OUTCOME_LOG_SUCCEEDED") * 100 / COUNT(*) AS success_rate
    FROM (
      SELECT
        labels.delivery_vehicle_id AS vehicle_id,
        jsonpayload_v1_updatetasklog.response.trackingid AS trackingid,
        ARRAY_AGG(jsonpayload_v1_updatetasklog.response.taskoutcome ORDER BY timestamp ASC)[ORDINAL(1)] AS outcome
      FROM `test_project.FleetEngineLogs.fleetengine_googleapis_com_update_task`
      WHERE jsonpayload_v1_updatetasklog.response.type = "TASK_TYPE_LOG_DELIVERY"
      GROUP BY 1, 2
      ORDER BY 1, 2)

Data Studio dashboards

BigQuery integrates with business intelligence tools, and you can build dashboards on top of it for business analytics.

The following example shows how to build a dashboard on which tasks and vehicle movements can be visualized on a map.

  1. Launch a new Data Studio dashboard and select BigQuery as the data connection.


  2. Select Custom Query and select the Cloud Project to which it should be billed.


  3. Enter the following query into the query box.


    SELECT
      timestamp,
      labels.delivery_vehicle_id,
      jsonpayload_v1_updatedeliveryvehiclelog.response.lastlocation.rawlocation.latitude AS lat,
      jsonpayload_v1_updatedeliveryvehiclelog.response.lastlocation.rawlocation.longitude AS lng
    FROM `test_project.FleetEngineLogs.fleetengine_googleapis_com_update_delivery_vehicle`
  4. Select Chart Type as Bubble Map, and then select the location field.


  5. Select Create Field.


  6. Name the field and add the following formula: CONCAT(lat, ",", lng).

    Then set type to Geo->Latitude, Longitude.


  7. You can add controls to the dashboard to filter data. For example, select the Date-range filter.


  8. Edit the date range box to select a default date range.


  9. You can add additional Drop-down list controls for delivery_vehicle_id.


With these controls, you can visualize the movements of a single vehicle or the movements within a trip.