Manage data efficiently

A core function of many Google Ads applications is retrieving account data for use cases such as data analysis, customer queries, and policy compliance checks. When fetching this data, you should optimize your usage so that you don't overload Google's servers or risk being rate limited. For more details, see the guides on rate limiting and maintaining an up-to-date contact email address.

Understand Google's resource usage policy for reports

To ensure the stability of its servers, the Google Ads API throttles GoogleAdsService.Search and GoogleAdsService.SearchStream query patterns that consume excessive amounts of API resources. If a particular query pattern is throttled, other services, methods, and query patterns continue to work unaffected. The following errors are thrown for throttled requests:

API version   Error code
>= v17        QuotaError.EXCESSIVE_SHORT_TERM_QUERY_RESOURCE_CONSUMPTION or QuotaError.EXCESSIVE_LONG_TERM_QUERY_RESOURCE_CONSUMPTION, depending on the duration of high resource usage.

To help you identify and monitor your expensive reports, we also return a cost metric for individual reports.

Method                         Cost field
GoogleAdsService.Search        SearchGoogleAdsResponse.query_resource_consumption
GoogleAdsService.SearchStream  SearchGoogleAdsStreamResponse.query_resource_consumption

The cost metric returned in these fields depends on various factors, such as:

  • The size of your accounts
  • The views and columns you fetch in your reports
  • The load on the Google Ads API servers
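You can track these fields as part of your regular reporting runs. The following Python sketch sums the consumption reported across the batches of a streaming report; the batch objects here are illustrative stand-ins for the API's response messages, not real client-library calls.

```python
from types import SimpleNamespace

def total_query_resource_consumption(batches):
    """Sum the query_resource_consumption values reported across the
    response batches of a single streaming report."""
    return sum(batch.query_resource_consumption for batch in batches)

# Illustrative stand-ins for SearchGoogleAdsStreamResponse messages.
batches = [
    SimpleNamespace(query_resource_consumption=250),
    SimpleNamespace(query_resource_consumption=350),
]

print(total_query_resource_consumption(batches))  # total cost of this report
```

Logging this total per report, alongside the query text, makes it easy to spot which query patterns are the expensive ones.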

To help you track expensive queries, we are publishing initial aggregated statistics on the resource consumption of various query patterns we see on our servers. We will periodically publish updated numbers to help you fine-tune your queries.

Time window          Average (p50)   P70 (moderately high)   P95 (very high)
Short term (5 mins)  6000            30000                   1800000
Long term (24 hrs)   16000           90000                   8400000

As an example, assume you are running a query pattern like the following, which consumes 600 units of resources per report:

SELECT campaign.id, campaign.name, metrics.cost_micros FROM campaign WHERE segments.date = "YYYY-MM-DD"

You run this query for multiple customer accounts and several individual dates by substituting different values into the date filter. The following table shows the number of reports you can run in a given time window so that your resource usage fits into each of the resource usage buckets.

Time window          Average   Moderately high   Very high
Short term (5 mins)  10        50                3000
Long term (24 hrs)   26        150               14000

Running this query pattern 10 times in 5 minutes counts as average usage, whereas running 3000 reports in 5 minutes counts as very high usage.
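The figures in the table follow from dividing each bucket's resource budget by the per-report cost. A minimal sketch in plain Python, using the budgets from the aggregated statistics above:

```python
# Resource budgets from the aggregated statistics table.
SHORT_TERM_BUDGETS = {"average": 6000, "moderately_high": 30000, "very_high": 1800000}
LONG_TERM_BUDGETS = {"average": 16000, "moderately_high": 90000, "very_high": 8400000}

def reports_per_bucket(budgets, cost_per_report):
    """Number of reports of a given cost that fit in each usage bucket."""
    return {bucket: budget // cost_per_report for bucket, budget in budgets.items()}

print(reports_per_bucket(SHORT_TERM_BUDGETS, 600))
# {'average': 10, 'moderately_high': 50, 'very_high': 3000}
```

Running the same calculation against the long-term budgets reproduces the 26 / 150 / 14000 figures in the table.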

There are several strategies to optimize the resource consumption of your reports. The rest of this guide covers some of these strategies.

Cache your data

You should cache the entity details you fetch from the API servers in a local database instead of calling the server every time you need the data, particularly for entities that are accessed frequently or change infrequently. Use the change event and change status resources where possible to detect which objects have changed since your last sync.
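As a sketch of the idea, the following Python snippet keeps a local cache and refreshes only the entities that a change status check reported as modified. The `fetch_entity` helper is an illustrative stand-in for your own API wrapper, not a real library call.

```python
local_cache = {}  # entity id -> entity details, e.g. rows in a local database

def sync_cache(changed_ids, fetch_entity):
    """Refresh only the entities reported as changed since the last sync,
    instead of re-downloading every entity on every run."""
    for entity_id in changed_ids:
        local_cache[entity_id] = fetch_entity(entity_id)

# Illustrative stand-in for an API fetch.
def fetch_entity(entity_id):
    return {"id": entity_id, "name": f"Campaign {entity_id}"}

local_cache[1] = {"id": 1, "name": "Old name"}
sync_cache([1], fetch_entity)  # only entity 1 is re-fetched
```

With this pattern, a full download happens once, and subsequent runs cost only as much as the set of changed objects.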

Optimize the frequency of running reports

Google Ads has published guidelines around data freshness and how frequently the data is updated. You should use this guidance to determine how frequently to fetch reports.

If you need to update accounts on a regular basis, we recommend limiting the number of such accounts to a small set, for example, only the top twenty Google Ads accounts. The rest can be updated at a lower frequency, for example, once or twice a day.
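One simple way to implement this is to partition accounts into a small high-frequency tier and a larger low-frequency tier, as in this Python sketch. The tier size and the assumption that accounts are pre-sorted by importance (for example, by spend) are illustrative choices.

```python
def partition_accounts(accounts_by_priority, top_n=20):
    """Split accounts into a small set to refresh frequently and the
    rest to refresh once or twice a day. The input list should already
    be ordered by importance, for example by spend."""
    return accounts_by_priority[:top_n], accounts_by_priority[top_n:]

accounts = [f"customer-{i}" for i in range(100)]
hourly, daily = partition_accounts(accounts, top_n=20)
print(len(hourly), len(daily))  # 20 80
```

Your scheduler can then run the hourly jobs only over the first set, cutting the bulk of report traffic to once or twice a day.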

Optimize the size of your reports

Your application should fetch large batches of data instead of running a large number of small reports. One factor that plays into this choice is account limits.

For example, consider the following code that pulls the stats for specific ad groups and updates a stats database table:

  List<long> adGroupIds = FetchAdGroupIdsFromLocalDatabase();

  foreach (long adGroupId in adGroupIds)
  {
      string query = "SELECT ad_group.id, ad_group.name, metrics.clicks, " +
          "metrics.cost_micros, metrics.impressions, segments.date FROM " +
          "ad_group WHERE segments.date DURING LAST_7_DAYS AND " +
          $"ad_group.id = {adGroupId}";
      List<GoogleAdsRow> rows = RunGoogleAdsReport(customerId, query);
      InsertRowsIntoStatsTable(adGroupId, rows);
  }

This code works well on a small test account. However, Google Ads supports up to 20,000 ad groups per campaign and 10,000 campaigns per account. So if this code runs against a large Google Ads account, it can overload the Google Ads API servers, leading to rate limiting and throttling.

A better approach is to run a single report and process it locally. One such approach, using an in-memory map, is shown here:

  HashSet<long> adGroupIds = FetchAdGroupIdsFromLocalDatabase();

  string query = "SELECT ad_group.id, ad_group.name, metrics.clicks, " +
      "metrics.cost_micros, metrics.impressions, segments.date FROM " +
      "ad_group WHERE segments.date DURING LAST_7_DAYS";
  List<GoogleAdsRow> rows = RunGoogleAdsReport(customerId, query);

  var memoryMap = new Dictionary<long, List<GoogleAdsRow>>();
  foreach (GoogleAdsRow row in rows)
  {
      long adGroupId = row.AdGroup.Id;

      if (adGroupIds.Contains(adGroupId))
      {
          CheckAndAddRowIntoMemoryMap(row, adGroupId, memoryMap);
      }
  }
  foreach (long adGroupId in memoryMap.Keys)
  {
      InsertRowsIntoStatsTable(adGroupId, memoryMap[adGroupId]);
  }

This reduces the load on the Google Ads API servers due to the lower number of reports being run.

If you find that the report is too big to hold in memory, you can break the query into smaller batches by filtering on groups of IDs and adding a LIMIT clause, like this:

SELECT ad_group.id, ad_group.name, metrics.clicks,
    metrics.cost_micros, metrics.impressions, segments.date
FROM ad_group
WHERE segments.date DURING LAST_7_DAYS
  AND ad_group.id IN (id1, id2, ...)
LIMIT 100000
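A sketch of that batching in Python: split the cached ad group IDs into fixed-size chunks and build one query per chunk. The chunk size and the narrowed column list are illustrative choices.

```python
def build_batched_queries(ad_group_ids, chunk_size=1000):
    """Yield one query per chunk of ad group IDs, so that no single
    report has to cover the whole account."""
    for start in range(0, len(ad_group_ids), chunk_size):
        chunk = ad_group_ids[start:start + chunk_size]
        id_list = ", ".join(str(i) for i in chunk)
        yield (
            "SELECT ad_group.id, metrics.clicks, metrics.impressions "
            "FROM ad_group "
            "WHERE segments.date DURING LAST_7_DAYS "
            f"AND ad_group.id IN ({id_list}) "
            "LIMIT 100000"
        )

queries = list(build_batched_queries(list(range(2500)), chunk_size=1000))
print(len(queries))  # 3 queries for 2500 ad groups
```

Each generated query can then be run and processed independently, keeping memory usage bounded.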

Labels are another way to group entities and reduce the number of reporting queries. See the labels guide to learn more.

Optimize what you fetch

When running reports, you should be mindful of the columns that you include in your queries. Consider the following example that is scheduled to run every hour:

SELECT campaign.id, campaign.name, ad_group.id, ad_group.name,
    ad_group_criterion.criterion_id, ad_group_criterion.keyword.text,
    ad_group_criterion.keyword.match_type, ad_group_criterion.status,
    metrics.clicks, metrics.impressions
FROM keyword_view

The only columns likely to change every hour are metrics.clicks and metrics.impressions. All the other columns are updated infrequently or not at all, so it's highly inefficient to fetch them hourly. You could store these values in a local database and run a change event or change status report to download changes once or twice a day.
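A sketch of that split in Python: keep the slow-changing attributes in a local store and join them with a narrow hourly metrics report. The row shapes are illustrative, not the API's actual response format.

```python
# Slow-changing keyword attributes, refreshed once or twice a day.
keyword_attributes = {
    101: {"text": "running shoes", "match_type": "EXACT"},
}

def merge_hourly_metrics(metric_rows, attributes):
    """Join a narrow hourly metrics report with locally cached
    attributes, so the hourly query only fetches metrics columns."""
    merged = []
    for row in metric_rows:
        record = dict(attributes.get(row["criterion_id"], {}))
        record.update(row)
        merged.append(record)
    return merged

hourly_rows = [{"criterion_id": 101, "clicks": 42, "impressions": 1000}]
print(merge_hourly_metrics(hourly_rows, keyword_attributes))
```

The hourly query then only needs an ID column plus the metrics, which keeps its resource consumption low.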

In some cases, you could reduce the number of rows you download by applying appropriate filters.

Clean up unused accounts

If your application manages third-party advertiser accounts, you need to develop it with customer churn in mind. You should periodically clean up your processes and data stores to remove accounts for customers who no longer use your application. When cleaning up unused Google Ads accounts, keep the following guidance in mind:

  • Revoke the authorization that your customer gave your application to manage their account.
  • Stop making API calls to the customer's Google Ads accounts. This applies especially to offline jobs such as cron jobs and data pipelines that are designed to run without user intervention.
  • If the customer revoked their authorization, then your application should gracefully handle the situation and avoid sending invalid API calls to Google's API servers.
  • If the customer has cancelled their Google Ads account, then you should detect it and avoid sending invalid API calls to Google's API servers.
  • Delete the data you downloaded from the customer's Google Ads accounts from your local database after an appropriate period of time.
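The revocation and cancellation cases above can be handled with a simple guard around your offline jobs. This Python sketch uses a made-up AuthorizationRevokedError to stand in for whatever authorization failure your client library actually raises.

```python
class AuthorizationRevokedError(Exception):
    """Stand-in for the auth error raised when a customer has revoked
    access or cancelled their Google Ads account."""

active_accounts = {"customer-1": True, "customer-2": True}

def run_account_job(customer_id, fetch_report):
    """Skip inactive accounts, and deactivate accounts whose calls fail
    with an authorization error so that offline jobs stop retrying them."""
    if not active_accounts.get(customer_id):
        return None
    try:
        return fetch_report(customer_id)
    except AuthorizationRevokedError:
        active_accounts[customer_id] = False  # stop future API calls
        return None

# Illustrative stand-in for an API call that fails for a revoked account.
def fetch_report(customer_id):
    if customer_id == "customer-2":
        raise AuthorizationRevokedError(customer_id)
    return "report"

run_account_job("customer-2", fetch_report)  # deactivates customer-2
```

Flagging the account rather than deleting it immediately lets a later cleanup pass revoke stored credentials and purge local data after an appropriate retention period.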