While most services provide synchronous APIs, requiring you to make a request and then wait for a response, BatchJobService provides a way to perform batches of operations on multiple services without waiting for the operations to complete.
Unlike service-specific mutate operations, a single job in BatchJobService can operate against a mixed collection of campaigns, ad groups, ads, criteria, labels, and feed items. Submitted jobs run in parallel, and BatchJobService automatically retries operations that fail due to transient errors such as `RateExceededError`.
In addition, BatchJobService allows you to use temporary IDs within your requests so you can submit dependent operations in a single job.
Supported operations
BatchJobService supports the following operations:
Schema
Although each client library contains utilities that handle the XML serialization of uploaded operations and XML deserialization of downloaded results, the complete schema for batch job requests and responses is available at:
https://adwords.google.com/api/adwords/cm/v201809/BatchJobOpsService?wsdl
Batch job flow
The steps for using a batch job are:
1. Create the `BatchJob` and capture its `uploadUrl` from the `mutate()` response.
2. Upload the list of operations you want executed to the `uploadUrl`.
3. Poll the batch job's `status` periodically until it is `CANCELED` or `DONE`.
4. Download the results of the job from its `downloadUrl` and check for `processingErrors`.
In addition, you can cancel a `BatchJob` in the `AWAITING_FILE` or `ACTIVE` state by setting its `status` to `CANCELING`.
Create a batch job
Create a batch job by sending an `ADD` operation containing a new `BatchJob` object.
```java
// Create a BatchJob.
BatchJobOperation addOp = new BatchJobOperation();
addOp.setOperator(Operator.ADD);
addOp.setOperand(new BatchJob());

BatchJob batchJob = batchJobService.mutate(new BatchJobOperation[] {addOp}).getValue(0);

// Get the upload URL from the new job.
String uploadUrl = batchJob.getUploadUrl().getUrl();

System.out.printf("Created BatchJob with ID %d, status '%s' and upload URL %s.%n",
    batchJob.getId(), batchJob.getStatus(), uploadUrl);
```
At this point in the process, the `status` of the job will be `AWAITING_FILE`.
Create operations for the batch job
For this step, create the operations for your batch job using the same approach you would use for the synchronous API services. For example, the following snippet creates `CampaignOperation` objects for adding new campaigns.
```java
List<CampaignOperation> operations = new ArrayList<>();
for (int i = 0; i < NUMBER_OF_CAMPAIGNS_TO_ADD; i++) {
  Campaign campaign = new Campaign();
  campaign.setName(String.format("Batch Campaign %s.%s", namePrefix, i));

  // Recommendation: Set the campaign to PAUSED when creating it to prevent
  // the ads from immediately serving. Set to ENABLED once you've added
  // targeting and the ads are ready to serve.
  campaign.setStatus(CampaignStatus.PAUSED);

  campaign.setId(tempIdGenerator.next());
  campaign.setAdvertisingChannelType(AdvertisingChannelType.SEARCH);

  Budget budget = new Budget();
  budget.setBudgetId(budgetId);
  campaign.setBudget(budget);

  BiddingStrategyConfiguration biddingStrategyConfiguration =
      new BiddingStrategyConfiguration();
  biddingStrategyConfiguration.setBiddingStrategyType(BiddingStrategyType.MANUAL_CPC);

  // You can optionally provide a bidding scheme in place of the type.
  ManualCpcBiddingScheme cpcBiddingScheme = new ManualCpcBiddingScheme();
  biddingStrategyConfiguration.setBiddingScheme(cpcBiddingScheme);

  campaign.setBiddingStrategyConfiguration(biddingStrategyConfiguration);

  CampaignOperation operation = new CampaignOperation();
  operation.setOperand(campaign);
  operation.setOperator(Operator.ADD);
  operations.add(operation);
}
return operations;
```
If you're creating dependent objects, such as a complete campaign consisting of a new campaign and corresponding ad groups, ads, and keywords, you can use temporary IDs in your `ADD` operations.
```java
// Create a temporary ID generator that will produce a sequence of descending
// negative numbers.
Iterator<Long> tempIdGenerator = new AbstractSequentialIterator<Long>(-1L) {
  @Override
  protected Long computeNext(Long previous) {
    return Long.MIN_VALUE == previous ? null : previous - 1;
  }
};

// Use a random UUID name prefix to avoid name collisions.
String namePrefix = UUID.randomUUID().toString();

// Create the mutate request that will be sent to the upload URL.
List<Operation> operations = new ArrayList<>();

// Create and add an operation to create a new budget.
BudgetOperation budgetOperation = buildBudgetOperation(tempIdGenerator, namePrefix);
operations.add(budgetOperation);

// Create and add operations to create new campaigns.
List<CampaignOperation> campaignOperations =
    buildCampaignOperations(tempIdGenerator, namePrefix, budgetOperation);
operations.addAll(campaignOperations);

// Create and add operations to create new negative keyword criteria
// for each campaign.
operations.addAll(buildCampaignCriterionOperations(campaignOperations));

// Create and add operations to create new ad groups.
List<AdGroupOperation> adGroupOperations =
    new ArrayList<>(buildAdGroupOperations(tempIdGenerator, namePrefix, campaignOperations));
operations.addAll(adGroupOperations);

// Create and add operations to create new ad group criteria (keywords).
operations.addAll(buildAdGroupCriterionOperations(adGroupOperations));

// Create and add operations to create new ad group ads (text ads).
operations.addAll(buildAdGroupAdOperations(adGroupOperations));
```
Upload operations to the upload URL
Once you've collected the set of operations for your job, the next step is to send them to the upload URL.
If you're using the utility in one of the client libraries, you don't need to worry about the underlying details: the utility constructs and sends the requests for you, and provides methods for both of the following options:
- Uploading all operations at once.
- Uploading operations using multiple calls to the utility.
Option 1: Upload all operations at once
The example below uses the `BatchJobHelper` utility from the Java client library to upload all operations at once.
```java
// Use a BatchJobHelper to upload all operations.
BatchJobHelper batchJobHelper = adWordsServices.getUtility(session, BatchJobHelper.class);
batchJobHelper.uploadBatchJobOperations(operations, uploadUrl);
```
Option 2: Upload operations using multiple calls to the utility
The example below uses the `BatchJobHelper` utility from the Java client library to upload operations incrementally via multiple calls to the utility's `uploadIncrementalBatchJobOperations()` method.
```java
// Use a BatchJobHelper to upload operations incrementally.
BatchJobHelper batchJobUploadHelper = new BatchJobHelper(session);
BatchJobUploadStatus startingUploadStatus =
    new BatchJobUploadStatus(0, URI.create(batchJob.getUploadUrl().getUrl()));
BatchJobUploadResponse uploadResponse;

// Create and upload the first operation to create a new budget.
BudgetOperation budgetOperation = buildBudgetOperation(tempIdGenerator, namePrefix);
uploadResponse = batchJobUploadHelper.uploadIncrementalBatchJobOperations(
    Lists.newArrayList(budgetOperation),
    false, /* pass isLastRequest = false */
    startingUploadStatus);
System.out.printf("First upload response: %s%n", uploadResponse);

// Create and upload intermediate operations to create new campaigns.
List<CampaignOperation> campaignOperations =
    buildCampaignOperations(budgetOperation, tempIdGenerator, namePrefix);
uploadResponse = batchJobUploadHelper.uploadIncrementalBatchJobOperations(
    campaignOperations,
    false, /* pass isLastRequest = false */
    uploadResponse.getBatchJobUploadStatus());
System.out.printf("Intermediate upload response: %s%n", uploadResponse);

// Upload more intermediate requests...

// Create and upload operations to create new ad group ads (text ads).
// This is the final upload request for the BatchJob.
uploadResponse = batchJobUploadHelper.uploadIncrementalBatchJobOperations(
    buildAdGroupAdOperations(adGroupOperations),
    true, /* pass isLastRequest = true */
    uploadResponse.getBatchJobUploadStatus());
System.out.printf("Last upload response: %s%n", uploadResponse);
```
Poll the batch job status
After you upload your operations, the batch job will be submitted to the job queue, so you'll want to periodically check the job's status until it is `CANCELED` or `DONE`.
Use an exponential backoff policy to avoid polling too aggressively. The
snippet below will wait 30 seconds on the first attempt, 60 seconds on the
second attempt, 120 seconds on the third attempt, and so on.
```java
int pollAttempts = 0;
boolean isPending;
Selector selector =
    new SelectorBuilder()
        .fields(
            BatchJobField.Id,
            BatchJobField.Status,
            BatchJobField.DownloadUrl,
            BatchJobField.ProcessingErrors,
            BatchJobField.ProgressStats)
        .equalsId(batchJob.getId())
        .build();
do {
  long sleepSeconds = (long) Math.scalb(30, pollAttempts);
  System.out.printf("Sleeping %d seconds...%n", sleepSeconds);
  Thread.sleep(sleepSeconds * 1000);

  batchJob = batchJobService.get(selector).getEntries(0);
  System.out.printf(
      "Batch job ID %d has status '%s'.%n", batchJob.getId(), batchJob.getStatus());

  pollAttempts++;
  isPending = PENDING_STATUSES.contains(batchJob.getStatus());
} while (isPending && pollAttempts < MAX_POLL_ATTEMPTS);
```
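The doubling delay can be isolated in a small helper. This is a self-contained sketch; the class and method names are illustrative and not part of the client library:

```java
/** Illustrative helper computing the doubling poll delay described above. */
public class PollBackoff {
  private static final long BASE_SECONDS = 30;

  /** Returns the sleep time in seconds for a 0-based poll attempt: 30, 60, 120, ... */
  public static long delaySeconds(int pollAttempt) {
    // Math.scalb multiplies by 2^pollAttempt, matching the snippet above.
    return (long) Math.scalb((double) BASE_SECONDS, pollAttempt);
  }
}
```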
Download the batch job results and check for errors
At this stage, your job should be in one of two states.
| Status | Description | Actions to take |
|---|---|---|
| `DONE` | BatchJobService successfully parsed and attempted each of the uploaded operations. | Download the results from the job's `downloadUrl` and check each `mutateResult` for errors. |
| `CANCELED` | The job was canceled as requested, BatchJobService could not parse the uploaded operations, or an unexpected error occurred during job execution. | Check the job's `processingErrors`. Also download any available results, since some operations may have been attempted before the job was canceled. |
The download URL will return a `mutateResult` element for every uploaded operation. Each result will have the following attributes, as defined in BatchJobOpsService.wsdl:

| Attribute | Type | Description |
|---|---|---|
| `result` | `Operand` | If the operation was successful, this will contain exactly one child element holding the resulting object. For example, if the operation was a successful `CampaignOperation`, a `Campaign` element will be returned here. |
| `errorList` | `ErrorList` | If the operation failed, this will have one or more `errors` elements, each of which will be an instance of `ApiError` or one of its subclasses. |
| `index` | `long` | The 0-based operation number of the operation. Use this to correlate this result to the corresponding operation in your upload. |
The code below shows one approach for processing the results retrieved from a download URL.
```java
if (batchJob.getDownloadUrl() != null && batchJob.getDownloadUrl().getUrl() != null) {
  BatchJobMutateResponse mutateResponse =
      batchJobHelper.downloadBatchJobMutateResponse(batchJob.getDownloadUrl().getUrl());
  System.out.printf("Downloaded results from %s:%n", batchJob.getDownloadUrl().getUrl());
  for (MutateResult mutateResult : mutateResponse.getMutateResults()) {
    String outcome = mutateResult.getErrorList() == null ? "SUCCESS" : "FAILURE";
    System.out.printf("  Operation [%d] - %s%n", mutateResult.getIndex(), outcome);
  }
} else {
  System.out.println("No results available for download.");
}
```
The batch job's `processingErrors` will contain all errors encountered while pre-processing your uploaded operations, such as input file corruption errors. The code below shows one approach for processing such errors.
```java
if (batchJob.getProcessingErrors() != null) {
  int i = 0;
  for (BatchJobProcessingError processingError : batchJob.getProcessingErrors()) {
    System.out.printf(
        "  Processing error [%d]: errorType=%s, trigger=%s, errorString=%s, fieldPath=%s"
            + ", reason=%s%n",
        i++,
        processingError.getApiErrorType(),
        processingError.getTrigger(),
        processingError.getErrorString(),
        processingError.getFieldPath(),
        processingError.getReason());
  }
} else {
  System.out.println("No processing errors found.");
}
```
Using temporary IDs
A powerful feature of BatchJobService is its support for temporary IDs. A temporary ID is a negative number (`long`) that allows an operation in a batch job to reference the result of an `ADD` operation from an earlier operation in the same job. Simply specify a negative number for the ID in the `ADD` operation for the parent object, then reuse that ID in subsequent operations for dependent objects in the same batch job.
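The descending negative-ID generator used in the snippets above can be written without any client-library dependencies. This is a minimal, self-contained sketch; the class and method names are illustrative:

```java
import java.util.Iterator;
import java.util.NoSuchElementException;

/**
 * Illustrative generator of temporary IDs: a descending sequence of
 * negative longs (-1, -2, -3, ...), mirroring the tempIdGenerator above.
 */
public class TempIds {
  public static Iterator<Long> descendingTempIds() {
    return new Iterator<Long>() {
      private long next = -1L;

      @Override
      public boolean hasNext() {
        return next != Long.MIN_VALUE;
      }

      @Override
      public Long next() {
        if (!hasNext()) {
          throw new NoSuchElementException();
        }
        return next--; // returns the current ID, then moves to the next lower one
      }
    };
  }
}
```

A parent object's `ADD` operation would take the first ID (for example `-1` for a campaign), and each dependent object's operation would reference that same negative number as its parent ID.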
A common use case for temporary IDs is to create a complete campaign in a single batch job. For example, a single job could contain a sequence of `ADD` operations that:

- Add a campaign with temporary ID `-1`
- Add an ad group with temporary ID `-2` for campaign `-1`
- Add an ad group ad for ad group `-2`
- Add multiple ad group criteria (keywords) for ad group `-2`
- Apply a label to campaign `-1`
- Add a campaign negative criterion (keyword) for campaign `-1`
Use a new temporary ID for each new object you create. If you reuse a temporary ID, you will receive a `TaskExecutionError.TEMP_ID_ALREADY_USED` error.
Canceling a batch job
A `BatchJob` can be canceled if its `status` is `AWAITING_FILE` or `ACTIVE`. Simply issue a `BatchJobService.mutate()` request and pass a `BatchJobOperation` with:

- `operator` set to `SET`
- an `operand` containing the job's `id` and a `status` of `CANCELING`
If the `status` of the `BatchJob` at the time of the request is neither `AWAITING_FILE` nor `ACTIVE`, the request will fail with a `BatchJobError.INVALID_STATE_CHANGE` error.
Canceling a job is an asynchronous operation, so after your `mutate()` request, poll the batch job status until it reaches either `CANCELED` or `DONE`. Make sure you download the results and check for errors as well, since some of the operations in your job may have been attempted before the job was canceled.
Upload requirements
Non-incremental uploads
Each client library utility provides a convenience method for uploading operations in a single step. However, if you're not using a client library, note that non-incremental uploads are not supported.
Incremental uploads
Incremental uploads let you send multiple upload requests to the batch job's `uploadUrl`. Your job starts executing only after you've uploaded the last set of operations.
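When uploading directly, one way to satisfy the request-size rules of the resumable protocol described below (every request except the last a multiple of 256 KB) is to split the serialized payload into fixed-size chunks. A minimal sketch, with an illustrative class name:

```java
import java.util.ArrayList;
import java.util.List;

/** Illustrative splitter for a serialized operations payload. */
public class UploadChunker {
  public static final int CHUNK_SIZE = 256 * 1024; // 262144 bytes

  /** Splits payload into CHUNK_SIZE pieces; only the final piece may be shorter. */
  public static List<byte[]> split(byte[] payload) {
    List<byte[]> chunks = new ArrayList<>();
    for (int offset = 0; offset < payload.length; offset += CHUNK_SIZE) {
      int len = Math.min(CHUNK_SIZE, payload.length - offset);
      byte[] chunk = new byte[len];
      System.arraycopy(payload, offset, chunk, 0, len);
      chunks.add(chunk);
    }
    return chunks;
  }
}
```

For a 524305-byte payload (the example used later in this section), this yields three chunks of 262144, 262144, and 17 bytes.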
Exchange the BatchJob uploadUrl for a resumable upload URL
The upload process follows the Google Cloud Storage guidelines for resumable uploads with the XML API.
The `uploadUrl` of your `BatchJob` must be exchanged for a resumable upload URL. To perform the exchange, send a request to the `uploadUrl` that meets the following specifications:
| Request attribute | Value |
|---|---|
| Request method | `POST` |
| URL | Upload URL returned by `BatchJobService.mutate` |
| `Content-Type` HTTP header | `application/xml` |
| `Content-Length` HTTP header | `0` |
| `x-goog-resumable` HTTP header | `start` |
| Request body | None required |
If your request is successful, the response will have a status of `201 Created` and a `Location` header whose value is the resumable upload URL.
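Using only the JDK's `java.net.http` client (Java 11+), the initiation request can be sketched as follows. The class name and the URL passed in are illustrative:

```java
import java.net.URI;
import java.net.http.HttpRequest;

/** Illustrative builder for the resumable-upload initiation request. */
public class ResumableUploadInit {
  public static HttpRequest initRequest(String uploadUrl) {
    // POST with an empty body; java.net.http sets Content-Length: 0
    // automatically (it does not allow setting that header by hand).
    return HttpRequest.newBuilder(URI.create(uploadUrl))
        .header("Content-Type", "application/xml")
        .header("x-goog-resumable", "start")
        .POST(HttpRequest.BodyPublishers.noBody())
        .build();
  }
}
```

Send the request with an `HttpClient` and read the resumable upload URL from the `Location` header of the `201 Created` response.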
Upload operations to the resumable upload URL
Once you have the resumable upload URL, you can start uploading your operations. Each request sent to the resumable upload URL must meet the following specifications.
| Request attribute | Value |
|---|---|
| Request method | `PUT` |
| URL | Resumable upload URL from the initialization step above. |
| `Content-Type` HTTP header | `application/xml` |
| `Content-Length` HTTP header | The number of bytes in the contents of the current request. |
| `Content-Range` HTTP header | Range of bytes in the request, followed by the total bytes. Total bytes will be `*` for the first and intermediate requests, but should be set to the final total when sending the last request. Examples: `bytes 0-262143/*`, `bytes 262144-524287/*`, `bytes 524288-786431/786432` |
| Request body | Operations in XML form, as specified in BatchJobOpsService.wsdl. |
Request body for the resumable upload URL
BatchJobService will ultimately concatenate all of the XML uploaded to the `uploadUrl` and parse it as a single request, so take care to include the opening `mutate` element only on the first request and the closing `mutate` element only on the last request:

| Request | Start `mutate` element | End `mutate` element |
|---|---|---|
| First | Included | Omitted |
| Intermediate | Omitted | Omitted |
| Last | Omitted | Included |
In addition, since BatchJobService will parse the uploaded XML as a single document, the request body for a single request need not contain a complete XML document. For example, if you were uploading 524305 bytes (256K + 256K + 17), your requests could be as follows:
Request 1
```xml
<?xml version="1.0" encoding="UTF-8"?>
<ns1:mutate xmlns:ns1="https://adwords.google.com/api/adwords/cm/v201809">
  <operations xsi:type="ns1:CampaignOperation"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <operator xsi:type="ns1:Operator">ADD</operator>
    <operand xsi:type="ns1:Campaign">
    ...
  </operations>
  <operations>
  ...
  </operat
```
Content length of 262144, where the `t` in the last line is the 262144th byte.
Request 2
```xml
ions>
  <operations xsi:type="ns1:AdGroupOperation"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <operator xsi:type="ns1:Operator">ADD</operator>
    <operand xsi:type="ns1:AdGroup">
    ...
  </operations>
  <operations>
  ...
  </ope
```
Content length of 262144, where the `e` in the last line is the 262144th byte.
Request 3
```xml
rations></mutate> ... (padded to 262144 bytes)
```
Content length without padding is 17 bytes, where the closing `>` of `</mutate>` is the 17th byte. Total content length with padding is 262144 bytes.
Best practices
Consider these guidelines when using BatchJobService:
- For better throughput, fewer, larger jobs are preferable to many smaller jobs.
- When submitting multiple concurrent jobs for the same clientCustomerId, try to reduce the likelihood of jobs operating on the same objects at the same time while maintaining large job sizes. Many unfinished jobs (with a status of `ACTIVE` or `CANCELING`) that try to mutate the same set of objects may lead to deadlock-like conditions, resulting in severe slowdowns and even job failures.
- Don't submit multiple operations that mutate the same object in the same job. The results are unpredictable if you do.
- For better throughput, order uploaded operations by operation type. For example, if your job contains operations to add campaigns, ad groups, and ad group criteria, order the operations in your upload so that all of the `CampaignOperations` come first, followed by all of the `AdGroupOperations`, and finally all of the `AdGroupCriterionOperations`.
- Don't poll the job status too frequently, or you risk hitting rate limit errors.
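The type-ordering guideline can be implemented as a stable sort over a type rank. A self-contained sketch, using operation type names as stand-ins for the real operation objects (all names here are illustrative):

```java
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

/** Illustrative ordering of uploaded operations by type, per the guideline above. */
public class OperationOrdering {
  /** Lower rank uploads earlier: campaigns, then ad groups, then ad group criteria. */
  static int rank(String operationType) {
    switch (operationType) {
      case "CampaignOperation":
        return 0;
      case "AdGroupOperation":
        return 1;
      case "AdGroupCriterionOperation":
        return 2;
      default:
        return 3;
    }
  }

  /** Returns the operation types sorted into upload order (stable for ties). */
  public static List<String> ordered(List<String> operationTypes) {
    return operationTypes.stream()
        .sorted(Comparator.comparingInt(OperationOrdering::rank))
        .collect(Collectors.toList());
  }
}
```

Because the sort is stable, operations of the same type keep their relative order, which preserves any temporary-ID dependencies within a type.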
Working with Shopping campaigns
When updating product partition trees using BatchJobService, the following restrictions apply:
- If a list of operations on a product partition tree results in a structurally invalid tree (for example, a node is subdivided without creating an `other` node), the entire list of operations on that tree will fail.
- Operations that don't introduce structural changes in a product partition tree (for example, bid changes on an existing node) are executed independently.
- A `BatchJob` can contain operations that introduce product partition tree structural changes for more than two ad groups; the two-ad-group limit does not apply to batch jobs.
- When removing a product partition node, set the `AdGroupCriterion` object's `criterion` field to a `ProductPartition` instance. Setting this field to a `Criterion` instance will cause the operation to fail with an `AdGroupCriterionError.CONCRETE_TYPE_REQUIRED` error.
Limitations
- At any given time, a Google Ads account can have up to 1 GB of uploaded operations across all of its batch jobs that have not yet completed. Once your account reaches this limit, you will get a `BatchJobError` with reason `DISK_QUOTA_EXCEEDED` if you try to add new batch jobs. If you encounter this error, wait until the size of pending uploaded operations falls below the limit before creating new jobs.
- Test accounts are limited to 250 new jobs per day.
- Batch jobs are retained for 60 days after creation. After that, they won't be retrievable through either `get()` or `query()` requests.
Code examples
Each of the client libraries contains a complete code example that illustrates all of the features above.