Storage Transfer API . transferJobs

Instance Methods

create(body=None, x__xgafv=None)

Creates a transfer job that runs periodically.

get(jobName=*, projectId=None, x__xgafv=None)

Gets a transfer job.

list(pageSize=None, pageToken=None, x__xgafv=None, filter=None)

Lists transfer jobs.

list_next(previous_request=*, previous_response=*)

Retrieves the next page of results.
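The `list`/`list_next` pair follows the standard google-api-python-client pagination pattern: `list()` returns a request, `execute()` yields one page, and `list_next()` returns the request for the following page or `None` when exhausted. A minimal sketch of draining all pages — the helper name is illustrative, and `jobs_resource` stands for whatever `service.transferJobs()` returns:

```python
def list_all_transfer_jobs(jobs_resource, filter_str):
    """Collect every transfer job across result pages.

    `jobs_resource` is assumed to behave like the object returned by
    service.transferJobs(): list() returns a request whose execute()
    yields one page, and list_next() returns the request for the next
    page, or None when there are no more pages.
    """
    jobs = []
    request = jobs_resource.list(filter=filter_str, pageSize=100)
    while request is not None:
        response = request.execute()
        # Each page carries its jobs under the "transferJobs" key.
        jobs.extend(response.get("transferJobs", []))
        # list_next returns None once the response has no nextPageToken.
        request = jobs_resource.list_next(request, response)
    return jobs
```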

patch(jobName=*, body=None, x__xgafv=None)

Updates a transfer job. Updating a job's transfer spec does not affect transfer operations that are already running.

Method Details

create(body=None, x__xgafv=None)
Creates a transfer job that runs periodically.

Args:
  body: object, The request body.
    The object takes the form of:

{ # This resource represents the configuration of a transfer job that runs
      # periodically.
    "transferSpec": { # Configuration for running a transfer. # Transfer specification.
      "objectConditions": { # Conditions that determine which objects will be transferred. Applies only
          # to S3 and Cloud Storage objects. Only objects that satisfy these
          # object conditions are included in the set of data source and data
          # sink objects. Object conditions based on objects' "last modification
          # time" do not exclude objects in a data sink.
          #
          # The "last modification time" refers to the time of the last change
          # to the object's content or metadata; specifically, this is the
          # `updated` property of Cloud Storage objects and the `LastModified`
          # field of S3 objects.
        "maxTimeElapsedSinceLastModification": "A String", # If specified, only objects with a "last modification time" on or after
            # `NOW` - `max_time_elapsed_since_last_modification` and objects that don't
            # have a "last modification time" are transferred.
            #
            # For each TransferOperation started by this TransferJob,
            # `NOW` refers to the start_time of the
            # `TransferOperation`.
        "includePrefixes": [ # If `include_prefixes` is specified, objects that satisfy the object
            # conditions must have names that start with one of the `include_prefixes`
            # and that do not start with any of the exclude_prefixes. If
            # `include_prefixes` is not specified, all objects except those that have
            # names starting with one of the `exclude_prefixes` must satisfy the object
            # conditions.
            #
            # Requirements:
            #
            #   * Each include-prefix and exclude-prefix can contain any sequence of
            #     Unicode characters, to a max length of 1024 bytes when UTF8-encoded,
            #     and must not contain Carriage Return or Line Feed characters.  Wildcard
            #     matching and regular expression matching are not supported.
            #
            #   * Each include-prefix and exclude-prefix must omit the leading slash.
            #     For example, to include the `requests.gz` object in a transfer from
            #     `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
            #     prefix as `logs/y=2015/requests.gz`.
            #
            #   * None of the include-prefix or the exclude-prefix values can be empty,
            #     if specified.
            #
            #   * Each include-prefix must include a distinct portion of the object
            #     namespace. No include-prefix may be a prefix of another
            #     include-prefix.
            #
            #   * Each exclude-prefix must exclude a distinct portion of the object
            #     namespace. No exclude-prefix may be a prefix of another
            #     exclude-prefix.
            #
            #   * If `include_prefixes` is specified, then each exclude-prefix must start
            #     with the value of a path explicitly included by `include_prefixes`.
            #
            # The max size of `include_prefixes` is 1000.
          "A String",
        ],
        "minTimeElapsedSinceLastModification": "A String", # If specified, only objects with a "last modification time" before
            # `NOW` - `min_time_elapsed_since_last_modification` and objects that don't
            #  have a "last modification time" are transferred.
            #
            # For each TransferOperation started by this TransferJob, `NOW`
            # refers to the start_time of the
            # `TransferOperation`.
        "lastModifiedBefore": "A String", # If specified, only objects with a "last modification time" before this
            # timestamp and objects that don't have a "last modification time" will be
            # transferred.
        "lastModifiedSince": "A String", # If specified, only objects with a "last modification time" on or after
            # this timestamp and objects that don't have a "last modification time" are
            # transferred.
            #
            # The `last_modified_since` and `last_modified_before` fields can be used
            # together for chunked data processing. For example, consider a script that
            # processes each day's worth of data at a time. For that you'd set each
            # of the fields as follows:
            #
            # *  `last_modified_since` to the start of the day
            #
            # *  `last_modified_before` to the end of the day
        "excludePrefixes": [ # `exclude_prefixes` must follow the requirements described for
            # include_prefixes.
            #
            # The max size of `exclude_prefixes` is 1000.
          "A String",
        ],
      },
      "gcsDataSource": { # In a GcsData resource, an object's name is the Cloud Storage object's # A Cloud Storage data source.
          # name and its "last modification time" refers to the object's `updated`
          # property of Cloud Storage objects, which changes when the content or the
          # metadata of the object is updated.
        "bucketName": "A String", # Required. Cloud Storage bucket name (see
            # [Bucket Name
            # Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
      },
      "httpDataSource": { # An HttpData resource specifies a list of objects on the web to be transferred # An HTTP URL data source.
          # over HTTP.  The information of the objects to be transferred is contained in
          # a file referenced by a URL. The first line in the file must be
          # `"TsvHttpData-1.0"`, which specifies the format of the file.  Subsequent
          # lines specify the information of the list of objects, one object per list
          # entry. Each entry has the following tab-delimited fields:
          #
          # * **HTTP URL** — The location of the object.
          #
          # * **Length** — The size of the object in bytes.
          #
          # * **MD5** — The base64-encoded MD5 hash of the object.
          #
          # For an example of a valid TSV file, see
          # [Transferring data from
          # URLs](https://cloud.google.com/storage-transfer/docs/create-url-list).
          #
          # When transferring data based on a URL list, keep the following in mind:
          #
          # * When an object located at `http(s)://hostname:port/<URL-path>` is
          # transferred to a data sink, the name of the object at the data sink is
          # `<hostname>/<URL-path>`.
          #
          # * If the specified size of an object does not match the actual size of the
          # object fetched, the object will not be transferred.
          #
          # * If the specified MD5 does not match the MD5 computed from the transferred
          # bytes, the object transfer will fail. For more information, see
          # [Generating MD5
          # hashes](https://cloud.google.com/storage-transfer/docs/create-url-list#md5)
          #
          # * Ensure that each URL you specify is publicly accessible. For
          # example, in Cloud Storage you can
          # [share an object publicly](https://cloud.google.com/storage/docs/cloud-console#_sharingdata)
          # and get a link to it.
          #
          # * Storage Transfer Service obeys `robots.txt` rules and requires the source
          # HTTP server to support `Range` requests and to return a `Content-Length`
          # header in each response.
          #
          # * ObjectConditions have no effect when filtering objects to transfer.
        "listUrl": "A String", # Required. The URL that points to the file that stores the object list
            # entries. This file must allow public access.  Currently, only URLs with
            # HTTP and HTTPS schemes are supported.
      },
      "transferOptions": { # TransferOptions uses three boolean parameters to define the actions
          # to be performed on objects in a transfer. If the option
          # delete_objects_unique_in_sink is `true`, object conditions based on
          # objects' "last modification time" are ignored and do not exclude
          # objects in a data source or a data sink.
        "overwriteObjectsAlreadyExistingInSink": True or False, # Whether overwriting objects that already exist in the sink is allowed.
        "deleteObjectsFromSourceAfterTransfer": True or False, # Whether objects should be deleted from the source after they are
            # transferred to the sink.
            #
            # **Note:** This option and delete_objects_unique_in_sink are mutually
            # exclusive.
        "deleteObjectsUniqueInSink": True or False, # Whether objects that exist only in the sink should be deleted.
            #
            # **Note:** This option and delete_objects_from_source_after_transfer are
            # mutually exclusive.
      },
      "gcsDataSink": { # In a GcsData resource, an object's name is the Cloud Storage object's # A Cloud Storage data sink.
          # name and its "last modification time" refers to the object's `updated`
          # property of Cloud Storage objects, which changes when the content or the
          # metadata of the object is updated.
        "bucketName": "A String", # Required. Cloud Storage bucket name (see
            # [Bucket Name
            # Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
      },
      "awsS3DataSource": { # An AwsS3Data resource can be a data source, but not a data sink. # An AWS S3 data source.
          # In an AwsS3Data resource, an object's name is the S3 object's key name.
        "awsAccessKey": { # AWS access key (see
            # [AWS Security
            # Credentials](https://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
            # Required. AWS access key used to sign the API requests to the AWS
            # S3 bucket. Permissions on the bucket must be granted to the access
            # ID of the AWS access key.
          "secretAccessKey": "A String", # Required. AWS secret access key. This field is not returned in RPC
              # responses.
          "accessKeyId": "A String", # Required. AWS access key ID.
        },
        "bucketName": "A String", # Required. S3 Bucket name (see
            # [Creating a
            # bucket](https://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
      },
      "azureBlobStorageDataSource": { # An AzureBlobStorageData resource can be a data source, but not a data sink. # An Azure Blob Storage data source.
          # An AzureBlobStorageData resource represents one Azure container. The storage
          # account determines the [Azure
          # endpoint](https://docs.microsoft.com/en-us/azure/storage/common/storage-create-storage-account#storage-account-endpoints).
          # In an AzureBlobStorageData resource, a blob's name is the [Azure Blob
          # Storage blob's key
          # name](https://docs.microsoft.com/en-us/rest/api/storageservices/naming-and-referencing-containers--blobs--and-metadata#blob-names).
        "container": "A String", # Required. The container to transfer from the Azure Storage account.
        "azureCredentials": { # Azure credentials # Required. Credentials used to authenticate API requests to Azure.
          "sasToken": "A String", # Required. Azure shared access signature. (see
              # [Grant limited access to Azure Storage resources using shared access
              # signatures
              # (SAS)](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview)).
        },
        "storageAccount": "A String", # Required. The name of the Azure Storage account.
      },
    },
    "status": "A String", # Status of the job. This value MUST be specified for
        # `CreateTransferJobRequests`.
        # 
        # **Note:** The effect of the new job status takes place during a subsequent
        # job run. For example, if you change the job status from
        # ENABLED to DISABLED, and an operation
        # spawned by the transfer is running, the status change would not affect the
        # current operation.
    "deletionTime": "A String", # Output only. The time that the transfer job was deleted.
    "schedule": { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
      "startTimeOfDay": { # Represents a time of day. The date and time zone are either not
          # significant or are specified elsewhere. An API may choose to allow
          # leap seconds. Related types are google.type.Date and
          # `google.protobuf.Timestamp`.
          #
          # The time in UTC that a transfer job is scheduled to run. Transfers
          # may start later than this time.
          #
          # If `start_time_of_day` is not specified:
          #
          # *   One-time transfers run immediately.
          # *   Recurring transfers run immediately, and each day at midnight
          #     UTC, through schedule_end_date.
          #
          # If `start_time_of_day` is specified:
          #
          # *   One-time transfers run at the specified time.
          # *   Recurring transfers run at the specified time each day, through
          #     `schedule_end_date`.
        "hours": 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
            # to allow the value "24:00:00" for scenarios like business closing time.
        "nanos": 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
        "seconds": 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
            # allow the value 60 if it allows leap-seconds.
        "minutes": 42, # Minutes of hour of day. Must be from 0 to 59.
      },
      "scheduleStartDate": { # Represents a whole or partial calendar date, e.g. a birthday. The time
          # of day and time zone are either specified elsewhere or are not
          # significant. The date is relative to the Proleptic Gregorian
          # Calendar. This can represent:
          #
          # * A full date, with non-zero year, month and day values
          # * A month and day value, with a zero year, e.g. an anniversary
          # * A year on its own, with zero month and day values
          # * A year and month value, with a zero day, e.g. a credit card
          #   expiration date
          #
          # Related types are google.type.TimeOfDay and
          # `google.protobuf.Timestamp`.
          #
          # Required. The start date of a transfer. Date boundaries are
          # determined relative to UTC time. If `schedule_start_date` and
          # start_time_of_day are in the past relative to the job's creation
          # time, the transfer starts the day after you schedule the transfer
          # request.
          #
          # **Note:** When starting jobs at or near midnight UTC it is possible
          # that a job will start later than expected. For example, if you send
          # an outbound request on June 1 one millisecond prior to midnight UTC
          # and the Storage Transfer Service server receives the request on
          # June 2, then it will create a TransferJob with
          # `schedule_start_date` set to June 2 and a `start_time_of_day` set
          # to midnight UTC. The first scheduled TransferOperation will take
          # place on June 3 at midnight UTC.
        "year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
            # a year.
        "day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
            # if specifying a year by itself or a year and month where the day is not
            # significant.
        "month": 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
            # month and day.
      },
      "scheduleEndDate": { # Represents a whole or partial calendar date, e.g. a birthday. The time
          # of day and time zone are either specified elsewhere or are not
          # significant. The date is relative to the Proleptic Gregorian
          # Calendar. This can represent:
          #
          # * A full date, with non-zero year, month and day values
          # * A month and day value, with a zero year, e.g. an anniversary
          # * A year on its own, with zero month and day values
          # * A year and month value, with a zero day, e.g. a credit card
          #   expiration date
          #
          # Related types are google.type.TimeOfDay and
          # `google.protobuf.Timestamp`.
          #
          # The last day a transfer runs. Date boundaries are determined
          # relative to UTC time. A job will run once per 24 hours within the
          # following guidelines:
          #
          # *   If `schedule_end_date` and schedule_start_date are the same and
          #     in the future relative to UTC, the transfer is executed only
          #     one time.
          # *   If `schedule_end_date` is later than `schedule_start_date` and
          #     `schedule_end_date` is in the future relative to UTC, the job
          #     will run each day at start_time_of_day through
          #     `schedule_end_date`.
        "year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
            # a year.
        "day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
            # if specifying a year by itself or a year and month where the day is not
            # significant.
        "month": 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
            # month and day.
      },
    },
    "projectId": "A String", # The ID of the Google Cloud Platform Project that owns the job.
    "description": "A String", # A description provided by the user for the job. Its max length is 1024
        # bytes when Unicode-encoded.
    "lastModificationTime": "A String", # Output only. The time that the transfer job was last modified.
    "creationTime": "A String", # Output only. The time that the transfer job was created.
    "notificationConfig": { # Specification to configure notifications published to Cloud Pub/Sub. # Notification configuration.
        # Notifications will be published to the customer-provided topic using the
        # following `PubsubMessage.attributes`:
        #
        # * `"eventType"`: one of the EventType values
        # * `"payloadFormat"`: one of the PayloadFormat values
        # * `"projectId"`: the project_id of the
        # `TransferOperation`
        # * `"transferJobName"`: the
        # transfer_job_name of the
        # `TransferOperation`
        # * `"transferOperationName"`: the name of the
        # `TransferOperation`
        #
        # The `PubsubMessage.data` will contain a TransferOperation resource
        # formatted according to the specified `PayloadFormat`.
      "eventTypes": [ # Event types for which a notification is desired. If empty, send
          # notifications for all event types.
        "A String",
      ],
      "payloadFormat": "A String", # Required. The desired format of the notification message payloads.
      "pubsubTopic": "A String", # Required. The `Topic.name` of the Cloud Pub/Sub topic to which to publish
          # notifications. Must be of the format: `projects/{project}/topics/{topic}`.
          # Not matching this format will result in an
          # INVALID_ARGUMENT error.
    },
    "name": "A String", # A unique name (within the transfer project) assigned when the job is
        # created.  If this field is empty in a CreateTransferJobRequest, Storage
        # Transfer Service will assign a unique name. Otherwise, the specified name
        # is used as the unique name for this job.
        # 
        # If the specified name is in use by a job, the creation request fails with
        # an ALREADY_EXISTS error.
        # 
        # This name must start with the `"transferJobs/"` prefix, end with a
        # letter or a number, and be no more than 128 characters.
        # Example: `"transferJobs/[A-Za-z0-9-._~]*[A-Za-z0-9]$"`
        # 
        # Invalid job names will fail with an
        # INVALID_ARGUMENT error.
  }

  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format
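As a sketch of how the request body above fits together, the following builds a minimal Cloud Storage-to-Cloud Storage job body that runs exactly once. The helper name and bucket names are illustrative, not part of the API; the field names follow the schema above:

```python
def build_one_time_gcs_transfer_job(project_id, source_bucket, sink_bucket,
                                    start_date):
    """Build a minimal request body for transferJobs.create().

    A sketch only: `start_date` is a datetime.date. Setting the same
    schedule start and end date makes the job run exactly once.
    """
    date_fields = {
        "year": start_date.year,
        "month": start_date.month,
        "day": start_date.day,
    }
    return {
        "projectId": project_id,
        "status": "ENABLED",  # status MUST be specified on create requests.
        "transferSpec": {
            "gcsDataSource": {"bucketName": source_bucket},
            "gcsDataSink": {"bucketName": sink_bucket},
            "transferOptions": {
                "overwriteObjectsAlreadyExistingInSink": True,
            },
        },
        "schedule": {
            "scheduleStartDate": dict(date_fields),
            "scheduleEndDate": dict(date_fields),
        },
    }

# With an authenticated client the call itself would be roughly:
#   service = googleapiclient.discovery.build("storagetransfer", "v1")
#   job = service.transferJobs().create(body=body).execute()
```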

Returns:
  An object of the form:

    { # This resource represents the configuration of a transfer job that runs
        # periodically.
      "transferSpec": { # Configuration for running a transfer. # Transfer specification.
        "objectConditions": { # Conditions that determine which objects will be transferred. Applies only
            # to S3 and Cloud Storage objects. Only objects that satisfy these
            # object conditions are included in the set of data source and data
            # sink objects. Object conditions based on objects' "last modification
            # time" do not exclude objects in a data sink.
            #
            # The "last modification time" refers to the time of the last change
            # to the object's content or metadata; specifically, this is the
            # `updated` property of Cloud Storage objects and the `LastModified`
            # field of S3 objects.
          "maxTimeElapsedSinceLastModification": "A String", # If specified, only objects with a "last modification time" on or after
              # `NOW` - `max_time_elapsed_since_last_modification` and objects that don't
              # have a "last modification time" are transferred.
              #
              # For each TransferOperation started by this TransferJob,
              # `NOW` refers to the start_time of the
              # `TransferOperation`.
          "includePrefixes": [ # If `include_prefixes` is specified, objects that satisfy the object
              # conditions must have names that start with one of the `include_prefixes`
              # and that do not start with any of the exclude_prefixes. If
              # `include_prefixes` is not specified, all objects except those that have
              # names starting with one of the `exclude_prefixes` must satisfy the object
              # conditions.
              #
              # Requirements:
              #
              #   * Each include-prefix and exclude-prefix can contain any sequence of
              #     Unicode characters, to a max length of 1024 bytes when UTF8-encoded,
              #     and must not contain Carriage Return or Line Feed characters.  Wildcard
              #     matching and regular expression matching are not supported.
              #
              #   * Each include-prefix and exclude-prefix must omit the leading slash.
              #     For example, to include the `requests.gz` object in a transfer from
              #     `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
              #     prefix as `logs/y=2015/requests.gz`.
              #
              #   * None of the include-prefix or the exclude-prefix values can be empty,
              #     if specified.
              #
              #   * Each include-prefix must include a distinct portion of the object
              #     namespace. No include-prefix may be a prefix of another
              #     include-prefix.
              #
              #   * Each exclude-prefix must exclude a distinct portion of the object
              #     namespace. No exclude-prefix may be a prefix of another
              #     exclude-prefix.
              #
              #   * If `include_prefixes` is specified, then each exclude-prefix must start
              #     with the value of a path explicitly included by `include_prefixes`.
              #
              # The max size of `include_prefixes` is 1000.
            "A String",
          ],
          "minTimeElapsedSinceLastModification": "A String", # If specified, only objects with a "last modification time" before
              # `NOW` - `min_time_elapsed_since_last_modification` and objects that don't
              #  have a "last modification time" are transferred.
              #
              # For each TransferOperation started by this TransferJob, `NOW`
              # refers to the start_time of the
              # `TransferOperation`.
          "lastModifiedBefore": "A String", # If specified, only objects with a "last modification time" before this
              # timestamp and objects that don't have a "last modification time" will be
              # transferred.
          "lastModifiedSince": "A String", # If specified, only objects with a "last modification time" on or after
              # this timestamp and objects that don't have a "last modification time" are
              # transferred.
              #
              # The `last_modified_since` and `last_modified_before` fields can be used
              # together for chunked data processing. For example, consider a script that
              # processes each day's worth of data at a time. For that you'd set each
              # of the fields as follows:
              #
              # *  `last_modified_since` to the start of the day
              #
              # *  `last_modified_before` to the end of the day
          "excludePrefixes": [ # `exclude_prefixes` must follow the requirements described for
              # include_prefixes.
              #
              # The max size of `exclude_prefixes` is 1000.
            "A String",
          ],
        },
        "gcsDataSource": { # In a GcsData resource, an object's name is the Cloud Storage object's # A Cloud Storage data source.
            # name and its "last modification time" refers to the object's `updated`
            # property of Cloud Storage objects, which changes when the content or the
            # metadata of the object is updated.
          "bucketName": "A String", # Required. Cloud Storage bucket name (see
              # [Bucket Name
              # Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
        },
        "httpDataSource": { # An HttpData resource specifies a list of objects on the web to be transferred # An HTTP URL data source.
            # over HTTP.  The information of the objects to be transferred is contained in
            # a file referenced by a URL. The first line in the file must be
            # `"TsvHttpData-1.0"`, which specifies the format of the file.  Subsequent
            # lines specify the information of the list of objects, one object per list
            # entry. Each entry has the following tab-delimited fields:
            #
            # * **HTTP URL** — The location of the object.
            #
            # * **Length** — The size of the object in bytes.
            #
            # * **MD5** — The base64-encoded MD5 hash of the object.
            #
            # For an example of a valid TSV file, see
            # [Transferring data from
            # URLs](https://cloud.google.com/storage-transfer/docs/create-url-list).
            #
            # When transferring data based on a URL list, keep the following in mind:
            #
            # * When an object located at `http(s)://hostname:port/<URL-path>` is
            # transferred to a data sink, the name of the object at the data sink is
            # `<hostname>/<URL-path>`.
            #
            # * If the specified size of an object does not match the actual size of the
            # object fetched, the object will not be transferred.
            #
            # * If the specified MD5 does not match the MD5 computed from the transferred
            # bytes, the object transfer will fail. For more information, see
            # [Generating MD5
            # hashes](https://cloud.google.com/storage-transfer/docs/create-url-list#md5)
            #
            # * Ensure that each URL you specify is publicly accessible. For
            # example, in Cloud Storage you can
            # [share an object publicly](https://cloud.google.com/storage/docs/cloud-console#_sharingdata)
            # and get a link to it.
            #
            # * Storage Transfer Service obeys `robots.txt` rules and requires the source
            # HTTP server to support `Range` requests and to return a `Content-Length`
            # header in each response.
            #
            # * ObjectConditions have no effect when filtering objects to transfer.
          "listUrl": "A String", # Required. The URL that points to the file that stores the object list
              # entries. This file must allow public access.  Currently, only URLs with
              # HTTP and HTTPS schemes are supported.
        },
        "transferOptions": { # TransferOptions uses three boolean parameters to define the actions
            # to be performed on objects in a transfer. If the option
            # delete_objects_unique_in_sink is `true`, object conditions based on
            # objects' "last modification time" are ignored and do not exclude
            # objects in a data source or a data sink.
          "overwriteObjectsAlreadyExistingInSink": True or False, # Whether overwriting objects that already exist in the sink is allowed.
          "deleteObjectsFromSourceAfterTransfer": True or False, # Whether objects should be deleted from the source after they are
              # transferred to the sink.
              #
              # **Note:** This option and delete_objects_unique_in_sink are mutually
              # exclusive.
          "deleteObjectsUniqueInSink": True or False, # Whether objects that exist only in the sink should be deleted.
              #
              # **Note:** This option and delete_objects_from_source_after_transfer are
              # mutually exclusive.
        },
        "gcsDataSink": { # In a GcsData resource, an object's name is the Cloud Storage object's # A Cloud Storage data sink.
            # name and its "last modification time" refers to the object's `updated`
            # property of Cloud Storage objects, which changes when the content or the
            # metadata of the object is updated.
          "bucketName": "A String", # Required. Cloud Storage bucket name (see
              # [Bucket Name
              # Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
        },
        "awsS3DataSource": { # An AwsS3Data resource can be a data source, but not a data sink. # An AWS S3 data source.
            # In an AwsS3Data resource, an object's name is the S3 object's key name.
          "awsAccessKey": { # AWS access key (see
              # [AWS Security
              # Credentials](https://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
              # Required. AWS access key used to sign the API requests to the AWS
              # S3 bucket. Permissions on the bucket must be granted to the access
              # ID of the AWS access key.
            "secretAccessKey": "A String", # Required. AWS secret access key. This field is not returned in RPC
                # responses.
            "accessKeyId": "A String", # Required. AWS access key ID.
          },
          "bucketName": "A String", # Required. S3 Bucket name (see
              # [Creating a
              # bucket](https://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
        },
        "azureBlobStorageDataSource": { # An AzureBlobStorageData resource can be a data source, but not a data sink. # An Azure Blob Storage data source.
            # An AzureBlobStorageData resource represents one Azure container. The storage
            # account determines the [Azure
            # endpoint](https://docs.microsoft.com/en-us/azure/storage/common/storage-create-storage-account#storage-account-endpoints).
            # In an AzureBlobStorageData resource, a blobs's name is the [Azure Blob
            # Storage blob's key
            # name](https://docs.microsoft.com/en-us/rest/api/storageservices/naming-and-referencing-containers--blobs--and-metadata#blob-names).
          "container": "A String", # Required. The container to transfer from the Azure Storage account.
          "azureCredentials": { # Azure credentials # Required. Credentials used to authenticate API requests to Azure.
            "sasToken": "A String", # Required. Azure shared access signature. (see
                # [Grant limited access to Azure Storage resources using shared access
                # signatures
                # (SAS)](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview)).
          },
          "storageAccount": "A String", # Required. The name of the Azure Storage account.
        },
      },
      "status": "A String", # Status of the job. This value MUST be specified for
          # `CreateTransferJobRequests`.
          #
          # **Note:** The effect of the new job status takes place during a subsequent
          # job run. For example, if you change the job status from
          # ENABLED to DISABLED, and an operation
          # spawned by the transfer is running, the status change would not affect the
          # current operation.
      "deletionTime": "A String", # Output only. The time that the transfer job was deleted.
      "schedule": { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
        "startTimeOfDay": { # Represents a time of day. The date and time zone are either not significant # The time in UTC that a transfer job is scheduled to run. Transfers may
            # start later than this time.
            #
            # If `start_time_of_day` is not specified:
            #
            # *   One-time transfers run immediately.
            # *   Recurring transfers run immediately, and each day at midnight UTC,
            #     through schedule_end_date.
            #
            # If `start_time_of_day` is specified:
            #
            # *   One-time transfers run at the specified time.
            # *   Recurring transfers run at the specified time each day, through
            #     `schedule_end_date`.
            # or are specified elsewhere. An API may choose to allow leap seconds. Related
            # types are google.type.Date and `google.protobuf.Timestamp`.
          "hours": 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
              # to allow the value "24:00:00" for scenarios like business closing time.
          "nanos": 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
          "seconds": 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
              # allow the value 60 if it allows leap-seconds.
          "minutes": 42, # Minutes of hour of day. Must be from 0 to 59.
        },
        "scheduleStartDate": { # Represents a whole or partial calendar date, e.g. a birthday. The time of day # Required. The start date of a transfer. Date boundaries are determined
            # relative to UTC time. If `schedule_start_date` and start_time_of_day
            # are in the past relative to the job's creation time, the transfer starts
            # the day after you schedule the transfer request.
            #
            # **Note:** When starting jobs at or near midnight UTC it is possible that
            # a job will start later than expected. For example, if you send an outbound
            # request on June 1 one millisecond prior to midnight UTC and the Storage
            # Transfer Service server receives the request on June 2, then it will create
            # a TransferJob with `schedule_start_date` set to June 2 and a
            # `start_time_of_day` set to midnight UTC. The first scheduled
            # TransferOperation will take place on June 3 at midnight UTC.
            # and time zone are either specified elsewhere or are not significant. The date
            # is relative to the Proleptic Gregorian Calendar. This can represent:
            #
            # * A full date, with non-zero year, month and day values
            # * A month and day value, with a zero year, e.g. an anniversary
            # * A year on its own, with zero month and day values
            # * A year and month value, with a zero day, e.g. a credit card expiration date
            #
            # Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
          "year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
              # a year.
          "day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
              # if specifying a year by itself or a year and month where the day is not
              # significant.
          "month": 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
              # month and day.
        },
        "scheduleEndDate": { # Represents a whole or partial calendar date, e.g. a birthday. The time of day # The last day a transfer runs. Date boundaries are determined relative to
            # UTC time. A job will run once per 24 hours within the following guidelines:
            #
            # *   If `schedule_end_date` and schedule_start_date are the same and in
            #     the future relative to UTC, the transfer is executed only one time.
            # *   If `schedule_end_date` is later than `schedule_start_date`  and
            #     `schedule_end_date` is in the future relative to UTC, the job will
            #     run each day at start_time_of_day through `schedule_end_date`.
            # and time zone are either specified elsewhere or are not significant. The date
            # is relative to the Proleptic Gregorian Calendar. This can represent:
            #
            # * A full date, with non-zero year, month and day values
            # * A month and day value, with a zero year, e.g. an anniversary
            # * A year on its own, with zero month and day values
            # * A year and month value, with a zero day, e.g. a credit card expiration date
            #
            # Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
          "year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
              # a year.
          "day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
              # if specifying a year by itself or a year and month where the day is not
              # significant.
          "month": 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
              # month and day.
        },
      },
      "projectId": "A String", # The ID of the Google Cloud Platform Project that owns the job.
      "description": "A String", # A description provided by the user for the job. Its max length is 1024
          # bytes when Unicode-encoded.
      "lastModificationTime": "A String", # Output only. The time that the transfer job was last modified.
      "creationTime": "A String", # Output only. The time that the transfer job was created.
      "notificationConfig": { # Specification to configure notifications published to Cloud Pub/Sub. # Notification configuration.
          # Notifications will be published to the customer-provided topic using the
          # following `PubsubMessage.attributes`:
          #
          # * `"eventType"`: one of the EventType values
          # * `"payloadFormat"`: one of the PayloadFormat values
          # * `"projectId"`: the project_id of the
          # `TransferOperation`
          # * `"transferJobName"`: the
          # transfer_job_name of the
          # `TransferOperation`
          # * `"transferOperationName"`: the name of the
          # `TransferOperation`
          #
          # The `PubsubMessage.data` will contain a TransferOperation resource
          # formatted according to the specified `PayloadFormat`.
        "eventTypes": [ # Event types for which a notification is desired. If empty, send
            # notifications for all event types.
          "A String",
        ],
        "payloadFormat": "A String", # Required. The desired format of the notification message payloads.
        "pubsubTopic": "A String", # Required. The `Topic.name` of the Cloud Pub/Sub topic to which to publish
            # notifications. Must be of the format: `projects/{project}/topics/{topic}`.
            # Not matching this format will result in an
            # INVALID_ARGUMENT error.
      },
      "name": "A String", # A unique name (within the transfer project) assigned when the job is
          # created.  If this field is empty in a CreateTransferJobRequest, Storage
          # Transfer Service will assign a unique name. Otherwise, the specified name
          # is used as the unique name for this job.
          #
          # If the specified name is in use by a job, the creation request fails with
          # an ALREADY_EXISTS error.
          #
          # This name must start with the `"transferJobs/"` prefix and end with a
          # letter or a number, and should be no more than 128 characters.
          # Example: `"transferJobs/[A-Za-z0-9-._~]*[A-Za-z0-9]$"`
          #
          # Invalid job names will fail with an
          # INVALID_ARGUMENT error.
    }
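As an illustrative sketch (not an authoritative recipe), the request body above can be assembled as a plain Python dict and passed to `create()`. The project ID, bucket names, and dates below are hypothetical placeholders; the client call is shown commented out because it needs the `google-api-python-client` package and configured credentials.

```python
# Minimal sketch of a recurring Cloud Storage-to-Cloud Storage transfer job.
# All identifiers here are placeholders, not real resources.
transfer_job = {
    "description": "Nightly backup of logs",
    "status": "ENABLED",
    "projectId": "my-project",  # hypothetical project ID
    "schedule": {
        "scheduleStartDate": {"year": 2020, "month": 7, "day": 1},
        # Omitting startTimeOfDay makes a recurring job run at midnight UTC.
    },
    "transferSpec": {
        "gcsDataSource": {"bucketName": "example-source-bucket"},
        "gcsDataSink": {"bucketName": "example-sink-bucket"},
        "transferOptions": {"overwriteObjectsAlreadyExistingInSink": True},
    },
}

# With google-api-python-client installed and credentials configured:
# from googleapiclient.discovery import build
# client = build("storagetransfer", "v1")
# created = client.transferJobs().create(body=transfer_job).execute()
```

Note that `status` must be set on creation, and fields marked "Output only" above (such as `creationTime`) are omitted from the request body.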
get(jobName=*, projectId=None, x__xgafv=None)
Gets a transfer job.

Args:
  jobName: string, Required. The job to get. (required)
  projectId: string, Required. The ID of the Google Cloud Platform Console project that owns the
job.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # This resource represents the configuration of a transfer job that runs
        # periodically.
      "transferSpec": { # Configuration for running a transfer. # Transfer specification.
        "objectConditions": { # Conditions that determine which objects will be transferred. Applies only # Only objects that satisfy these object conditions are included in the set
            # of data source and data sink objects.  Object conditions based on
            # objects' "last modification time" do not exclude objects in a data sink.
            # to S3 and Cloud Storage objects.
            #
            # The "last modification time" refers to the time of the
            # last change to the object's content or metadata — specifically, this is
            # the `updated` property of Cloud Storage objects and the `LastModified`
            # field of S3 objects.
          "maxTimeElapsedSinceLastModification": "A String", # If specified, only objects with a "last modification time" on or after
              # `NOW` - `max_time_elapsed_since_last_modification` and objects that don't
              # have a "last modification time" are transferred.
              #
              # For each TransferOperation started by this TransferJob,
              # `NOW` refers to the start_time of the
              # `TransferOperation`.
          "includePrefixes": [ # If `include_prefixes` is specified, objects that satisfy the object
              # conditions must have names that start with one of the `include_prefixes`
              # and that do not start with any of the exclude_prefixes. If
              # `include_prefixes` is not specified, all objects except those that have
              # names starting with one of the `exclude_prefixes` must satisfy the object
              # conditions.
              #
              # Requirements:
              #
              #   * Each include-prefix and exclude-prefix can contain any sequence of
              #     Unicode characters, to a max length of 1024 bytes when UTF8-encoded,
              #     and must not contain Carriage Return or Line Feed characters.  Wildcard
              #     matching and regular expression matching are not supported.
              #
              #   * Each include-prefix and exclude-prefix must omit the leading slash.
              #     For example, to include the `requests.gz` object in a transfer from
              #     `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
              #     prefix as `logs/y=2015/requests.gz`.
              #
              #   * None of the include-prefix or the exclude-prefix values can be empty,
              #     if specified.
              #
              #   * Each include-prefix must include a distinct portion of the object
              #     namespace. No include-prefix may be a prefix of another
              #     include-prefix.
              #
              #   * Each exclude-prefix must exclude a distinct portion of the object
              #     namespace. No exclude-prefix may be a prefix of another
              #     exclude-prefix.
              #
              #   * If `include_prefixes` is specified, then each exclude-prefix must start
              #     with the value of a path explicitly included by `include_prefixes`.
              #
              # The max size of `include_prefixes` is 1000.
            "A String",
          ],
          "minTimeElapsedSinceLastModification": "A String", # If specified, only objects with a "last modification time" before
              # `NOW` - `min_time_elapsed_since_last_modification` and objects that don't
              #  have a "last modification time" are transferred.
              #
              # For each TransferOperation started by this TransferJob, `NOW`
              # refers to the start_time of the
              # `TransferOperation`.
          "lastModifiedBefore": "A String", # If specified, only objects with a "last modification time" before this
              # timestamp and objects that don't have a "last modification time" will be
              # transferred.
          "lastModifiedSince": "A String", # If specified, only objects with a "last modification time" on or after
              # this timestamp and objects that don't have a "last modification time" are
              # transferred.
              #
              # The `last_modified_since` and `last_modified_before` fields can be used
              # together for chunked data processing. For example, consider a script that
              # processes each day's worth of data at a time. For that you'd set each
              # of the fields as follows:
              #
              # *  `last_modified_since` to the start of the day
              #
              # *  `last_modified_before` to the end of the day
          "excludePrefixes": [ # `exclude_prefixes` must follow the requirements described for
              # include_prefixes.
              #
              # The max size of `exclude_prefixes` is 1000.
            "A String",
          ],
        },
        "gcsDataSource": { # In a GcsData resource, an object's name is the Cloud Storage object's # A Cloud Storage data source.
            # name and its "last modification time" refers to the object's `updated`
            # property of Cloud Storage objects, which changes when the content or the
            # metadata of the object is updated.
          "bucketName": "A String", # Required. Cloud Storage bucket name (see
              # [Bucket Name
              # Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
        },
        "httpDataSource": { # An HttpData resource specifies a list of objects on the web to be transferred # An HTTP URL data source.
            # over HTTP.  The information of the objects to be transferred is contained in
            # a file referenced by a URL. The first line in the file must be
            # `"TsvHttpData-1.0"`, which specifies the format of the file.  Subsequent
            # lines specify the information of the list of objects, one object per list
            # entry. Each entry has the following tab-delimited fields:
            #
            # * **HTTP URL** — The location of the object.
            #
            # * **Length** — The size of the object in bytes.
            #
            # * **MD5** — The base64-encoded MD5 hash of the object.
            #
            # For an example of a valid TSV file, see
            # [Transferring data from
            # URLs](https://cloud.google.com/storage-transfer/docs/create-url-list).
            #
            # When transferring data based on a URL list, keep the following in mind:
            #
            # * When an object located at `http(s)://hostname:port/<URL-path>` is
            # transferred to a data sink, the name of the object at the data sink is
            # `<hostname>/<URL-path>`.
            #
            # * If the specified size of an object does not match the actual size of the
            # object fetched, the object will not be transferred.
            #
            # * If the specified MD5 does not match the MD5 computed from the transferred
            # bytes, the object transfer will fail. For more information, see
            # [Generating MD5
            # hashes](https://cloud.google.com/storage-transfer/docs/create-url-list#md5)
            #
            # * Ensure that each URL you specify is publicly accessible. For
            # example, in Cloud Storage you can
            # [share an object
            # publicly](https://cloud.google.com/storage/docs/cloud-console#_sharingdata)
            # and get a link to it.
            #
            # * Storage Transfer Service obeys `robots.txt` rules and requires the source
            # HTTP server to support `Range` requests and to return a `Content-Length`
            # header in each response.
            #
            # * ObjectConditions have no effect when filtering objects to transfer.
          "listUrl": "A String", # Required. The URL that points to the file that stores the object list
              # entries. This file must allow public access.  Currently, only URLs with
              # HTTP and HTTPS schemes are supported.
        },
        "transferOptions": { # TransferOptions uses three boolean parameters to define the actions # If the option
            # delete_objects_unique_in_sink
            # is `true`, object conditions based on objects' "last modification time" are
            # ignored and do not exclude objects in a data source or a data sink.
            # to be performed on objects in a transfer.
          "overwriteObjectsAlreadyExistingInSink": True or False, # Whether overwriting objects that already exist in the sink is allowed.
          "deleteObjectsFromSourceAfterTransfer": True or False, # Whether objects should be deleted from the source after they are
              # transferred to the sink.
              #
              # **Note:** This option and delete_objects_unique_in_sink are mutually
              # exclusive.
          "deleteObjectsUniqueInSink": True or False, # Whether objects that exist only in the sink should be deleted.
              #
              # **Note:** This option and delete_objects_from_source_after_transfer are
              # mutually exclusive.
        },
        "gcsDataSink": { # In a GcsData resource, an object's name is the Cloud Storage object's # A Cloud Storage data sink.
            # name and its "last modification time" refers to the object's `updated`
            # property of Cloud Storage objects, which changes when the content or the
            # metadata of the object is updated.
          "bucketName": "A String", # Required. Cloud Storage bucket name (see
              # [Bucket Name
              # Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
        },
        "awsS3DataSource": { # An AwsS3Data resource can be a data source, but not a data sink. # An AWS S3 data source.
            # In an AwsS3Data resource, an object's name is the S3 object's key name.
          "awsAccessKey": { # AWS access key (see # Required. AWS access key used to sign the API requests to the AWS S3
              # bucket. Permissions on the bucket must be granted to the access ID of the
              # AWS access key.
              # [AWS Security
              # Credentials](https://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
            "secretAccessKey": "A String", # Required. AWS secret access key. This field is not returned in RPC
                # responses.
            "accessKeyId": "A String", # Required. AWS access key ID.
          },
          "bucketName": "A String", # Required. S3 Bucket name (see
              # [Creating a
              # bucket](https://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
        },
        "azureBlobStorageDataSource": { # An AzureBlobStorageData resource can be a data source, but not a data sink. # An Azure Blob Storage data source.
            # An AzureBlobStorageData resource represents one Azure container. The storage
            # account determines the [Azure
            # endpoint](https://docs.microsoft.com/en-us/azure/storage/common/storage-create-storage-account#storage-account-endpoints).
            # In an AzureBlobStorageData resource, a blobs's name is the [Azure Blob
            # Storage blob's key
            # name](https://docs.microsoft.com/en-us/rest/api/storageservices/naming-and-referencing-containers--blobs--and-metadata#blob-names).
          "container": "A String", # Required. The container to transfer from the Azure Storage account.
          "azureCredentials": { # Azure credentials # Required. Credentials used to authenticate API requests to Azure.
            "sasToken": "A String", # Required. Azure shared access signature. (see
                # [Grant limited access to Azure Storage resources using shared access
                # signatures
                # (SAS)](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview)).
          },
          "storageAccount": "A String", # Required. The name of the Azure Storage account.
        },
      },
      "status": "A String", # Status of the job. This value MUST be specified for
          # `CreateTransferJobRequests`.
          #
          # **Note:** The effect of the new job status takes place during a subsequent
          # job run. For example, if you change the job status from
          # ENABLED to DISABLED, and an operation
          # spawned by the transfer is running, the status change would not affect the
          # current operation.
      "deletionTime": "A String", # Output only. The time that the transfer job was deleted.
      "schedule": { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
        "startTimeOfDay": { # Represents a time of day. The date and time zone are either not significant # The time in UTC that a transfer job is scheduled to run. Transfers may
            # start later than this time.
            #
            # If `start_time_of_day` is not specified:
            #
            # *   One-time transfers run immediately.
            # *   Recurring transfers run immediately, and each day at midnight UTC,
            #     through schedule_end_date.
            #
            # If `start_time_of_day` is specified:
            #
            # *   One-time transfers run at the specified time.
            # *   Recurring transfers run at the specified time each day, through
            #     `schedule_end_date`.
            # or are specified elsewhere. An API may choose to allow leap seconds. Related
            # types are google.type.Date and `google.protobuf.Timestamp`.
          "hours": 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
              # to allow the value "24:00:00" for scenarios like business closing time.
          "nanos": 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
          "seconds": 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
              # allow the value 60 if it allows leap-seconds.
          "minutes": 42, # Minutes of hour of day. Must be from 0 to 59.
        },
        "scheduleStartDate": { # Represents a whole or partial calendar date, e.g. a birthday. The time of day # Required. The start date of a transfer. Date boundaries are determined
            # relative to UTC time. If `schedule_start_date` and start_time_of_day
            # are in the past relative to the job's creation time, the transfer starts
            # the day after you schedule the transfer request.
            #
            # **Note:** When starting jobs at or near midnight UTC it is possible that
            # a job will start later than expected. For example, if you send an outbound
            # request on June 1 one millisecond prior to midnight UTC and the Storage
            # Transfer Service server receives the request on June 2, then it will create
            # a TransferJob with `schedule_start_date` set to June 2 and a
            # `start_time_of_day` set to midnight UTC. The first scheduled
            # TransferOperation will take place on June 3 at midnight UTC.
            # and time zone are either specified elsewhere or are not significant. The date
            # is relative to the Proleptic Gregorian Calendar. This can represent:
            #
            # * A full date, with non-zero year, month and day values
            # * A month and day value, with a zero year, e.g. an anniversary
            # * A year on its own, with zero month and day values
            # * A year and month value, with a zero day, e.g. a credit card expiration date
            #
            # Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
          "year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
              # a year.
          "day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
              # if specifying a year by itself or a year and month where the day is not
              # significant.
          "month": 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
              # month and day.
        },
        "scheduleEndDate": { # Represents a whole or partial calendar date, e.g. a birthday. The time of day # The last day a transfer runs. Date boundaries are determined relative to
            # UTC time. A job will run once per 24 hours within the following guidelines:
            #
            # *   If `schedule_end_date` and schedule_start_date are the same and in
            #     the future relative to UTC, the transfer is executed only one time.
            # *   If `schedule_end_date` is later than `schedule_start_date`  and
            #     `schedule_end_date` is in the future relative to UTC, the job will
            #     run each day at start_time_of_day through `schedule_end_date`.
            # and time zone are either specified elsewhere or are not significant. The date
            # is relative to the Proleptic Gregorian Calendar. This can represent:
            #
            # * A full date, with non-zero year, month and day values
            # * A month and day value, with a zero year, e.g. an anniversary
            # * A year on its own, with zero month and day values
            # * A year and month value, with a zero day, e.g. a credit card expiration date
            #
            # Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
          "year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
              # a year.
          "day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
              # if specifying a year by itself or a year and month where the day is not
              # significant.
          "month": 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
              # month and day.
        },
      },
      "projectId": "A String", # The ID of the Google Cloud Platform Project that owns the job.
      "description": "A String", # A description provided by the user for the job. Its max length is 1024
          # bytes when Unicode-encoded.
      "lastModificationTime": "A String", # Output only. The time that the transfer job was last modified.
      "creationTime": "A String", # Output only. The time that the transfer job was created.
      "notificationConfig": { # Specification to configure notifications published to Cloud Pub/Sub. # Notification configuration.
          # Notifications will be published to the customer-provided topic using the
          # following `PubsubMessage.attributes`:
          #
          # * `"eventType"`: one of the EventType values
          # * `"payloadFormat"`: one of the PayloadFormat values
          # * `"projectId"`: the project_id of the
          # `TransferOperation`
          # * `"transferJobName"`: the
          # transfer_job_name of the
          # `TransferOperation`
          # * `"transferOperationName"`: the name of the
          # `TransferOperation`
          #
          # The `PubsubMessage.data` will contain a TransferOperation resource
          # formatted according to the specified `PayloadFormat`.
        "eventTypes": [ # Event types for which a notification is desired. If empty, send
            # notifications for all event types.
          "A String",
        ],
        "payloadFormat": "A String", # Required. The desired format of the notification message payloads.
        "pubsubTopic": "A String", # Required. The `Topic.name` of the Cloud Pub/Sub topic to which to publish
            # notifications. Must be of the format: `projects/{project}/topics/{topic}`.
            # Not matching this format will result in an
            # INVALID_ARGUMENT error.
      },
      "name": "A String", # A unique name (within the transfer project) assigned when the job is
          # created.  If this field is empty in a CreateTransferJobRequest, Storage
          # Transfer Service will assign a unique name. Otherwise, the specified name
          # is used as the unique name for this job.
          #
          # If the specified name is in use by a job, the creation request fails with
          # an ALREADY_EXISTS error.
          #
          # This name must start with the `"transferJobs/"` prefix and end with a
          # letter or a number, and should be no more than 128 characters.
          # Example: `"transferJobs/[A-Za-z0-9-._~]*[A-Za-z0-9]$"`
          #
          # Invalid job names will fail with an
          # INVALID_ARGUMENT error.
    }
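A minimal sketch of calling `get()`, assuming a hypothetical job name and project ID. Both parameters are required; the client call is shown commented out because it needs the `google-api-python-client` package and configured credentials.

```python
# Placeholders only: substitute a real job name and project ID.
job_name = "transferJobs/OPI1234567890"  # hypothetical job name
project_id = "my-project"                # hypothetical project ID

# Job names carry the "transferJobs/" prefix described in the name field above.
assert job_name.startswith("transferJobs/")

# With google-api-python-client installed and credentials configured:
# from googleapiclient.discovery import build
# client = build("storagetransfer", "v1")
# job = client.transferJobs().get(jobName=job_name, projectId=project_id).execute()
```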
list(pageSize=None, pageToken=None, x__xgafv=None, filter=None)
Lists transfer jobs.

Args:
  pageSize: integer, The list page size. The max allowed value is 256.
  pageToken: string, The list page token.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format
  filter: string, Required. A list of query parameters specified as JSON text in the form of:
{"project<span>_</span>id":"my_project_id",
 "job_names":["jobid1","jobid2",...],
 "job_statuses":["status1","status2",...]}.
Since `job_names` and `job_statuses` support multiple values, their values
must be specified with array notation. `project_id` is
required.  `job_names` and `job_statuses` are optional.  The valid values
for `job_statuses` are case-insensitive:
ENABLED,
DISABLED, and
DELETED.
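A minimal sketch of building the required `filter` argument as JSON text. The project ID is a placeholder, and the commented-out call assumes a service object built with `googleapiclient.discovery.build('storagetransfer', 'v1')`:

```python
import json

# The filter must be JSON text; `job_names` and `job_statuses` take
# array notation, and `project_id` is required.
filter_params = {
    "project_id": "my-project-id",  # placeholder project ID
    "job_statuses": ["ENABLED"],
}
filter_string = json.dumps(filter_params)

# With a built service object:
# response = service.transferJobs().list(
#     filter=filter_string, pageSize=100).execute()
```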

Returns:
  An object of the form:

    { # Response from ListTransferJobs.
    "nextPageToken": "A String", # The list next page token.
    "transferJobs": [ # A list of transfer jobs.
      { # This resource represents the configuration of a transfer job that runs
            # periodically.
          "transferSpec": { # Configuration for running a transfer. # Transfer specification.
            "objectConditions": { # Conditions that determine which objects will be transferred. Applies only to S3 and Cloud Storage objects. # Only objects that satisfy these object conditions are included in the set
                # of data source and data sink objects.  Object conditions based on
                # objects' "last modification time" do not exclude objects in a data sink.
                #
                # The "last modification time" refers to the time of the
                # last change to the object's content or metadata — specifically, this is
                # the `updated` property of Cloud Storage objects and the `LastModified`
                # field of S3 objects.
              "maxTimeElapsedSinceLastModification": "A String", # If specified, only objects with a "last modification time" on or after
                  # `NOW` - `max_time_elapsed_since_last_modification` and objects that don't
                  # have a "last modification time" are transferred.
                  #
                  # For each TransferOperation started by this TransferJob,
                  # `NOW` refers to the start_time of the
                  # `TransferOperation`.
              "includePrefixes": [ # If `include_prefixes` is specified, objects that satisfy the object
                  # conditions must have names that start with one of the `include_prefixes`
                  # and that do not start with any of the exclude_prefixes. If
                  # `include_prefixes` is not specified, all objects except those that have
                  # names starting with one of the `exclude_prefixes` must satisfy the object
                  # conditions.
                  #
                  # Requirements:
                  #
                  #   * Each include-prefix and exclude-prefix can contain any sequence of
                  #     Unicode characters, to a max length of 1024 bytes when UTF8-encoded,
                  #     and must not contain Carriage Return or Line Feed characters.  Wildcard
                  #     matching and regular expression matching are not supported.
                  #
                  #   * Each include-prefix and exclude-prefix must omit the leading slash.
                  #     For example, to include the `requests.gz` object in a transfer from
                  #     `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
                  #     prefix as `logs/y=2015/requests.gz`.
                  #
                  #   * None of the include-prefix or the exclude-prefix values can be empty,
                  #     if specified.
                  #
                  #   * Each include-prefix must include a distinct portion of the object
                  #     namespace. No include-prefix may be a prefix of another
                  #     include-prefix.
                  #
                  #   * Each exclude-prefix must exclude a distinct portion of the object
                  #     namespace. No exclude-prefix may be a prefix of another
                  #     exclude-prefix.
                  #
                  #   * If `include_prefixes` is specified, then each exclude-prefix must start
                  #     with the value of a path explicitly included by `include_prefixes`.
                  #
                  # The max size of `include_prefixes` is 1000.
                "A String",
              ],
              "minTimeElapsedSinceLastModification": "A String", # If specified, only objects with a "last modification time" before
                  # `NOW` - `min_time_elapsed_since_last_modification` and objects that don't
                  #  have a "last modification time" are transferred.
                  #
                  # For each TransferOperation started by this TransferJob, `NOW`
                  # refers to the start_time of the
                  # `TransferOperation`.
              "lastModifiedBefore": "A String", # If specified, only objects with a "last modification time" before this
                  # timestamp and objects that don't have a "last modification time" will be
                  # transferred.
              "lastModifiedSince": "A String", # If specified, only objects with a "last modification time" on or after
                  # this timestamp and objects that don't have a "last modification time" are
                  # transferred.
                  #
                  # The `last_modified_since` and `last_modified_before` fields can be used
                  # together for chunked data processing. For example, consider a script that
                  # processes each day's worth of data at a time. For that you'd set each
                  # of the fields as follows:
                  #
                  # *  `last_modified_since` to the start of the day
                  #
                  # *  `last_modified_before` to the end of the day
              "excludePrefixes": [ # `exclude_prefixes` must follow the requirements described for
                  # include_prefixes.
                  #
                  # The max size of `exclude_prefixes` is 1000.
                "A String",
              ],
            },
            "gcsDataSource": { # In a GcsData resource, an object's name is the Cloud Storage object's # A Cloud Storage data source.
                # name and its "last modification time" refers to the object's `updated`
                # property of Cloud Storage objects, which changes when the content or the
                # metadata of the object is updated.
              "bucketName": "A String", # Required. Cloud Storage bucket name (see
                  # [Bucket Name
                  # Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
            },
            "httpDataSource": { # An HttpData resource specifies a list of objects on the web to be transferred # An HTTP URL data source.
                # over HTTP.  The information of the objects to be transferred is contained in
                # a file referenced by a URL. The first line in the file must be
                # `"TsvHttpData-1.0"`, which specifies the format of the file.  Subsequent
                # lines specify the information of the list of objects, one object per list
                # entry. Each entry has the following tab-delimited fields:
                #
                # * **HTTP URL** — The location of the object.
                #
                # * **Length** — The size of the object in bytes.
                #
                # * **MD5** — The base64-encoded MD5 hash of the object.
                #
                # For an example of a valid TSV file, see
                # [Transferring data from
                # URLs](https://cloud.google.com/storage-transfer/docs/create-url-list).
                #
                # When transferring data based on a URL list, keep the following in mind:
                #
                # * When an object located at `http(s)://hostname:port/<URL-path>` is
                # transferred to a data sink, the name of the object at the data sink is
                # `<hostname>/<URL-path>`.
                #
                # * If the specified size of an object does not match the actual size of the
                # object fetched, the object will not be transferred.
                #
                # * If the specified MD5 does not match the MD5 computed from the transferred
                # bytes, the object transfer will fail. For more information, see
                # [Generating MD5
                # hashes](https://cloud.google.com/storage-transfer/docs/create-url-list#md5)
                #
                # * Ensure that each URL you specify is publicly accessible. For
                # example, in Cloud Storage you can
                # [share an object publicly]
                # (https://cloud.google.com/storage/docs/cloud-console#_sharingdata) and get
                # a link to it.
                #
                # * Storage Transfer Service obeys `robots.txt` rules and requires the source
                # HTTP server to support `Range` requests and to return a `Content-Length`
                # header in each response.
                #
                # * ObjectConditions have no effect when filtering objects to transfer.
              "listUrl": "A String", # Required. The URL that points to the file that stores the object list
                  # entries. This file must allow public access.  Currently, only URLs with
                  # HTTP and HTTPS schemes are supported.
            },
            "transferOptions": { # TransferOptions uses three boolean parameters to define the actions to be performed on objects in a transfer. # If the option
                # delete_objects_unique_in_sink
                # is `true`, object conditions based on objects' "last modification time" are
                # ignored and do not exclude objects in a data source or a data sink.
              "overwriteObjectsAlreadyExistingInSink": True or False, # Whether overwriting objects that already exist in the sink is allowed.
              "deleteObjectsFromSourceAfterTransfer": True or False, # Whether objects should be deleted from the source after they are
                  # transferred to the sink.
                  #
                  # **Note:** This option and delete_objects_unique_in_sink are mutually
                  # exclusive.
              "deleteObjectsUniqueInSink": True or False, # Whether objects that exist only in the sink should be deleted.
                  #
                  # **Note:** This option and delete_objects_from_source_after_transfer are
                  # mutually exclusive.
            },
            "gcsDataSink": { # In a GcsData resource, an object's name is the Cloud Storage object's # A Cloud Storage data sink.
                # name and its "last modification time" refers to the object's `updated`
                # property of Cloud Storage objects, which changes when the content or the
                # metadata of the object is updated.
              "bucketName": "A String", # Required. Cloud Storage bucket name (see
                  # [Bucket Name
                  # Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
            },
            "awsS3DataSource": { # An AwsS3Data resource can be a data source, but not a data sink. # An AWS S3 data source.
                # In an AwsS3Data resource, an object's name is the S3 object's key name.
              "awsAccessKey": { # AWS access key (see [AWS Security Credentials](https://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)). # Required. AWS access key used to sign the API requests to the AWS S3
                  # bucket. Permissions on the bucket must be granted to the access ID of the
                  # AWS access key.
                "secretAccessKey": "A String", # Required. AWS secret access key. This field is not returned in RPC
                    # responses.
                "accessKeyId": "A String", # Required. AWS access key ID.
              },
              "bucketName": "A String", # Required. S3 Bucket name (see
                  # [Creating a
                  # bucket](https://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
            },
            "azureBlobStorageDataSource": { # An AzureBlobStorageData resource can be a data source, but not a data sink. # An Azure Blob Storage data source.
                # An AzureBlobStorageData resource represents one Azure container. The storage
                # account determines the [Azure
                # endpoint](https://docs.microsoft.com/en-us/azure/storage/common/storage-create-storage-account#storage-account-endpoints).
                # In an AzureBlobStorageData resource, a blob's name is the [Azure Blob
                # Storage blob's key
                # name](https://docs.microsoft.com/en-us/rest/api/storageservices/naming-and-referencing-containers--blobs--and-metadata#blob-names).
              "container": "A String", # Required. The container to transfer from the Azure Storage account.
              "azureCredentials": { # Azure credentials # Required. Credentials used to authenticate API requests to Azure.
                "sasToken": "A String", # Required. Azure shared access signature. (see
                    # [Grant limited access to Azure Storage resources using shared access
                    # signatures
                    # (SAS)](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview)).
              },
              "storageAccount": "A String", # Required. The name of the Azure Storage account.
            },
          },
          "status": "A String", # Status of the job. This value MUST be specified for
              # `CreateTransferJobRequests`.
              #
              # **Note:** The effect of the new job status takes place during a subsequent
              # job run. For example, if you change the job status from
              # ENABLED to DISABLED, and an operation
              # spawned by the transfer is running, the status change would not affect the
              # current operation.
          "deletionTime": "A String", # Output only. The time that the transfer job was deleted.
          "schedule": { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
            "startTimeOfDay": { # Represents a time of day. The date and time zone are either not significant or are specified elsewhere. An API may choose to allow leap seconds. Related types are google.type.Date and `google.protobuf.Timestamp`. # The time in UTC that a transfer job is scheduled to run. Transfers may
                # start later than this time.
                #
                # If `start_time_of_day` is not specified:
                #
                # *   One-time transfers run immediately.
                # *   Recurring transfers run immediately, and each day at midnight UTC,
                #     through schedule_end_date.
                #
                # If `start_time_of_day` is specified:
                #
                # *   One-time transfers run at the specified time.
                # *   Recurring transfers run at the specified time each day, through
                #     `schedule_end_date`.
              "hours": 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
                  # to allow the value "24:00:00" for scenarios like business closing time.
              "nanos": 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
              "seconds": 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
                  # allow the value 60 if it allows leap-seconds.
              "minutes": 42, # Minutes of hour of day. Must be from 0 to 59.
            },
            "scheduleStartDate": { # Represents a whole or partial calendar date, e.g. a birthday. The time of day and time zone are either specified elsewhere or are not significant. # Required. The start date of a transfer. Date boundaries are determined
                # relative to UTC time. If `schedule_start_date` and start_time_of_day
                # are in the past relative to the job's creation time, the transfer starts
                # the day after you schedule the transfer request.
                #
                # **Note:** When starting jobs at or near midnight UTC it is possible that
                # a job will start later than expected. For example, if you send an outbound
                # request on June 1 one millisecond prior to midnight UTC and the Storage
                # Transfer Service server receives the request on June 2, then it will create
                # a TransferJob with `schedule_start_date` set to June 2 and a
                # `start_time_of_day` set to midnight UTC. The first scheduled
                # TransferOperation will take place on June 3 at midnight UTC.
                #
                # The date is relative to the Proleptic Gregorian Calendar. This can represent:
                #
                # * A full date, with non-zero year, month and day values
                # * A month and day value, with a zero year, e.g. an anniversary
                # * A year on its own, with zero month and day values
                # * A year and month value, with a zero day, e.g. a credit card expiration date
                #
                # Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
              "year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
                  # a year.
              "day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
                  # if specifying a year by itself or a year and month where the day is not
                  # significant.
              "month": 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
                  # month and day.
            },
            "scheduleEndDate": { # Represents a whole or partial calendar date, e.g. a birthday. The time of day and time zone are either specified elsewhere or are not significant. # The last day a transfer runs. Date boundaries are determined relative to
                # UTC time. A job will run once per 24 hours within the following guidelines:
                #
                # *   If `schedule_end_date` and schedule_start_date are the same and in
                #     the future relative to UTC, the transfer is executed only one time.
                # *   If `schedule_end_date` is later than `schedule_start_date` and
                #     `schedule_end_date` is in the future relative to UTC, the job will
                #     run each day at start_time_of_day through `schedule_end_date`.
                #
                # The date is relative to the Proleptic Gregorian Calendar. This can represent:
                #
                # * A full date, with non-zero year, month and day values
                # * A month and day value, with a zero year, e.g. an anniversary
                # * A year on its own, with zero month and day values
                # * A year and month value, with a zero day, e.g. a credit card expiration date
                #
                # Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
              "year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
                  # a year.
              "day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
                  # if specifying a year by itself or a year and month where the day is not
                  # significant.
              "month": 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
                  # month and day.
            },
          },
          "projectId": "A String", # The ID of the Google Cloud Platform Project that owns the job.
          "description": "A String", # A description provided by the user for the job. Its max length is 1024
              # bytes when Unicode-encoded.
          "lastModificationTime": "A String", # Output only. The time that the transfer job was last modified.
          "creationTime": "A String", # Output only. The time that the transfer job was created.
          "notificationConfig": { # Specification to configure notifications published to Cloud Pub/Sub. # Notification configuration.
              # Notifications will be published to the customer-provided topic using the
              # following `PubsubMessage.attributes`:
              #
              # * `"eventType"`: one of the EventType values
              # * `"payloadFormat"`: one of the PayloadFormat values
              # * `"projectId"`: the project_id of the
              # `TransferOperation`
              # * `"transferJobName"`: the
              # transfer_job_name of the
              # `TransferOperation`
              # * `"transferOperationName"`: the name of the
              # `TransferOperation`
              #
              # The `PubsubMessage.data` will contain a TransferOperation resource
              # formatted according to the specified `PayloadFormat`.
            "eventTypes": [ # Event types for which a notification is desired. If empty, send
                # notifications for all event types.
              "A String",
            ],
            "payloadFormat": "A String", # Required. The desired format of the notification message payloads.
            "pubsubTopic": "A String", # Required. The `Topic.name` of the Cloud Pub/Sub topic to which to publish
                # notifications. Must be of the format: `projects/{project}/topics/{topic}`.
                # Not matching this format will result in an
                # INVALID_ARGUMENT error.
          },
          "name": "A String", # A unique name (within the transfer project) assigned when the job is
              # created.  If this field is empty in a CreateTransferJobRequest, Storage
              # Transfer Service will assign a unique name. Otherwise, the specified name
              # is used as the unique name for this job.
              #
              # If the specified name is in use by a job, the creation request fails with
              # an ALREADY_EXISTS error.
              #
              # This name must start with `"transferJobs/"` prefix and end with a letter or
              # a number, and should be no more than 128 characters.
              # Example: `"transferJobs/[A-Za-z0-9-._~]*[A-Za-z0-9]$"`
              #
              # Invalid job names will fail with an
              # INVALID_ARGUMENT error.
        },
    ],
  }
list_next(previous_request=*, previous_response=*)
Retrieves the next page of results.

Args:
  previous_request: The request for the previous page. (required)
  previous_response: The response from the request for the previous page. (required)

Returns:
  A request object that you can call 'execute()' on to request the next
  page. Returns None if there are no more items in the collection.
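The `list`/`list_next` pair can be wrapped in a simple pagination loop. This is a sketch assuming `service` was built with `googleapiclient.discovery.build('storagetransfer', 'v1')` and `filter_string` is a valid filter as described above:

```python
def iter_transfer_jobs(service, filter_string, page_size=256):
    """Yield every transfer job across all result pages."""
    request = service.transferJobs().list(
        filter=filter_string, pageSize=page_size)
    while request is not None:
        response = request.execute()
        for job in response.get("transferJobs", []):
            yield job
        # list_next returns None when there are no more pages.
        request = service.transferJobs().list_next(request, response)
```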
    
patch(jobName=*, body=None, x__xgafv=None)
Updates a transfer job. Updating a job's transfer spec does not affect
transfer operations that are running already. Updating a job's schedule
is not allowed.

**Note:** The job's status field can be modified
using this RPC (for example, to set a job's status to
DELETED,
DISABLED, or
ENABLED).
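For example, a request body that disables a job by updating only its `status` field might look like the following sketch. The project and job names are placeholders, and the commented-out call assumes a service built with `googleapiclient.discovery.build('storagetransfer', 'v1')`:

```python
# UpdateTransferJobRequest body: transferJob carries only the fields
# being changed, and the field mask names them.
update_body = {
    "projectId": "my-project-id",            # placeholder project ID
    "transferJob": {
        "status": "DISABLED",
    },
    "updateTransferJobFieldMask": "status",
}

# With a built service object (job name is a placeholder):
# service.transferJobs().patch(
#     jobName="transferJobs/123456789", body=update_body).execute()
```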

Args:
  jobName: string, Required. The name of job to update. (required)
  body: object, The request body.
    The object takes the form of:

{ # Request passed to UpdateTransferJob.
    "projectId": "A String", # Required. The ID of the Google Cloud Platform Console project that owns the
        # job.
    "updateTransferJobFieldMask": "A String", # The field mask of the fields in `transferJob` that are to be updated in
        # this request.  Fields in `transferJob` that can be updated are:
        # description,
        # transfer_spec,
        # notification_config, and
        # status.  To update the `transfer_spec` of the job, a
        # complete transfer specification must be provided. An incomplete
        # specification missing any required fields will be rejected with the error
        # INVALID_ARGUMENT.
    "transferJob": { # This resource represents the configuration of a transfer job that runs periodically. # Required. The job to update. `transferJob` is expected to specify only
        # four fields: description,
        # transfer_spec,
        # notification_config, and
        # status.  An `UpdateTransferJobRequest` that specifies
        # other fields will be rejected with the error
        # INVALID_ARGUMENT.
        "transferSpec": { # Configuration for running a transfer. # Transfer specification.
          "objectConditions": { # Conditions that determine which objects will be transferred. Applies only to S3 and Cloud Storage objects. # Only objects that satisfy these object conditions are included in the set
              # of data source and data sink objects.  Object conditions based on
              # objects' "last modification time" do not exclude objects in a data sink.
              #
              # The "last modification time" refers to the time of the
              # last change to the object's content or metadata — specifically, this is
              # the `updated` property of Cloud Storage objects and the `LastModified`
              # field of S3 objects.
            "maxTimeElapsedSinceLastModification": "A String", # If specified, only objects with a "last modification time" on or after
                # `NOW` - `max_time_elapsed_since_last_modification` and objects that don't
                # have a "last modification time" are transferred.
                #
                # For each TransferOperation started by this TransferJob,
                # `NOW` refers to the start_time of the
                # `TransferOperation`.
            "includePrefixes": [ # If `include_prefixes` is specified, objects that satisfy the object
                # conditions must have names that start with one of the `include_prefixes`
                # and that do not start with any of the exclude_prefixes. If
                # `include_prefixes` is not specified, all objects except those that have
                # names starting with one of the `exclude_prefixes` must satisfy the object
                # conditions.
                #
                # Requirements:
                #
                #   * Each include-prefix and exclude-prefix can contain any sequence of
                #     Unicode characters, to a max length of 1024 bytes when UTF8-encoded,
                #     and must not contain Carriage Return or Line Feed characters.  Wildcard
                #     matching and regular expression matching are not supported.
                #
                #   * Each include-prefix and exclude-prefix must omit the leading slash.
                #     For example, to include the `requests.gz` object in a transfer from
                #     `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
                #     prefix as `logs/y=2015/requests.gz`.
                #
                #   * None of the include-prefix or the exclude-prefix values can be empty,
                #     if specified.
                #
                #   * Each include-prefix must include a distinct portion of the object
                #     namespace. No include-prefix may be a prefix of another
                #     include-prefix.
                #
                #   * Each exclude-prefix must exclude a distinct portion of the object
                #     namespace. No exclude-prefix may be a prefix of another
                #     exclude-prefix.
                #
                #   * If `include_prefixes` is specified, then each exclude-prefix must start
                #     with the value of a path explicitly included by `include_prefixes`.
                #
                # The max size of `include_prefixes` is 1000.
              "A String",
            ],
            "minTimeElapsedSinceLastModification": "A String", # If specified, only objects with a "last modification time" before
                # `NOW` - `min_time_elapsed_since_last_modification` and objects that don't
                #  have a "last modification time" are transferred.
                #
                # For each TransferOperation started by this TransferJob, `NOW`
                # refers to the start_time of the
                # `TransferOperation`.
            "lastModifiedBefore": "A String", # If specified, only objects with a "last modification time" before this
                # timestamp and objects that don't have a "last modification time" will be
                # transferred.
            "lastModifiedSince": "A String", # If specified, only objects with a "last modification time" on or after
                # this timestamp and objects that don't have a "last modification time" are
                # transferred.
                #
                # The `last_modified_since` and `last_modified_before` fields can be used
                # together for chunked data processing. For example, consider a script that
                # processes each day's worth of data at a time. For that you'd set each
                # of the fields as follows:
                #
                # *  `last_modified_since` to the start of the day
                #
                # *  `last_modified_before` to the end of the day
            "excludePrefixes": [ # `exclude_prefixes` must follow the requirements described for
                # include_prefixes.
                #
                # The max size of `exclude_prefixes` is 1000.
              "A String",
            ],
          },
          "gcsDataSource": { # In a GcsData resource, an object's name is the Cloud Storage object's # A Cloud Storage data source.
              # name and its "last modification time" refers to the object's `updated`
              # property of Cloud Storage objects, which changes when the content or the
              # metadata of the object is updated.
            "bucketName": "A String", # Required. Cloud Storage bucket name (see
                # [Bucket Name
                # Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
          },
          "httpDataSource": { # An HttpData resource specifies a list of objects on the web to be transferred # An HTTP URL data source.
              # over HTTP.  The information of the objects to be transferred is contained in
              # a file referenced by a URL. The first line in the file must be
              # `"TsvHttpData-1.0"`, which specifies the format of the file.  Subsequent
              # lines specify the information of the list of objects, one object per list
              # entry. Each entry has the following tab-delimited fields:
              #
              # * **HTTP URL** — The location of the object.
              #
              # * **Length** — The size of the object in bytes.
              #
              # * **MD5** — The base64-encoded MD5 hash of the object.
              #
              # For an example of a valid TSV file, see
              # [Transferring data from
              # URLs](https://cloud.google.com/storage-transfer/docs/create-url-list).
              #
              # When transferring data based on a URL list, keep the following in mind:
              #
              # * When an object located at `http(s)://hostname:port/<URL-path>` is
              # transferred to a data sink, the name of the object at the data sink is
              # `<hostname>/<URL-path>`.
              #
              # * If the specified size of an object does not match the actual size of the
              # object fetched, the object will not be transferred.
              #
              # * If the specified MD5 does not match the MD5 computed from the transferred
              # bytes, the object transfer will fail. For more information, see
              # [Generating MD5
              # hashes](https://cloud.google.com/storage-transfer/docs/create-url-list#md5).
              #
              # * Ensure that each URL you specify is publicly accessible. For
              # example, in Cloud Storage you can
              # [share an object publicly](https://cloud.google.com/storage/docs/cloud-console#_sharingdata)
              # and get a link to it.
              #
              # * Storage Transfer Service obeys `robots.txt` rules and requires the source
              # HTTP server to support `Range` requests and to return a `Content-Length`
              # header in each response.
              #
              # * ObjectConditions have no effect when filtering objects to transfer.
            "listUrl": "A String", # Required. The URL that points to the file that stores the object list
                # entries. This file must allow public access.  Currently, only URLs with
                # HTTP and HTTPS schemes are supported.
          },
          "transferOptions": { # TransferOptions uses three boolean parameters to define the actions # If the option
              # delete_objects_unique_in_sink
              # is `true`, object conditions based on objects' "last modification time" are
              # ignored and do not exclude objects in a data source or a data sink.
              # to be performed on objects in a transfer.
            "overwriteObjectsAlreadyExistingInSink": True or False, # Whether overwriting objects that already exist in the sink is allowed.
            "deleteObjectsFromSourceAfterTransfer": True or False, # Whether objects should be deleted from the source after they are
                # transferred to the sink.
                #
                # **Note:** This option and delete_objects_unique_in_sink are mutually
                # exclusive.
            "deleteObjectsUniqueInSink": True or False, # Whether objects that exist only in the sink should be deleted.
                #
                # **Note:** This option and delete_objects_from_source_after_transfer are
                # mutually exclusive.
          },
          "gcsDataSink": { # In a GcsData resource, an object's name is the Cloud Storage object's # A Cloud Storage data sink.
              # name and its "last modification time" refers to the object's `updated`
              # property of Cloud Storage objects, which changes when the content or the
              # metadata of the object is updated.
            "bucketName": "A String", # Required. Cloud Storage bucket name (see
                # [Bucket Name
                # Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
          },
          "awsS3DataSource": { # An AwsS3Data resource can be a data source, but not a data sink. # An AWS S3 data source.
              # In an AwsS3Data resource, an object's name is the S3 object's key name.
            "awsAccessKey": { # AWS access key (see # Required. AWS access key used to sign the API requests to the AWS S3
                # bucket. Permissions on the bucket must be granted to the access ID of the
                # AWS access key.
                # [AWS Security
                # Credentials](https://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
              "secretAccessKey": "A String", # Required. AWS secret access key. This field is not returned in RPC
                  # responses.
              "accessKeyId": "A String", # Required. AWS access key ID.
            },
            "bucketName": "A String", # Required. S3 Bucket name (see
                # [Creating a
                # bucket](https://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
          },
          "azureBlobStorageDataSource": { # An AzureBlobStorageData resource can be a data source, but not a data sink. # An Azure Blob Storage data source.
              # An AzureBlobStorageData resource represents one Azure container. The storage
              # account determines the [Azure
              # endpoint](https://docs.microsoft.com/en-us/azure/storage/common/storage-create-storage-account#storage-account-endpoints).
              # In an AzureBlobStorageData resource, a blob's name is the [Azure Blob
              # Storage blob's key
              # name](https://docs.microsoft.com/en-us/rest/api/storageservices/naming-and-referencing-containers--blobs--and-metadata#blob-names).
            "container": "A String", # Required. The container to transfer from the Azure Storage account.
            "azureCredentials": { # Azure credentials # Required. Credentials used to authenticate API requests to Azure.
              "sasToken": "A String", # Required. Azure shared access signature. (see
                  # [Grant limited access to Azure Storage resources using shared access
                  # signatures
                  # (SAS)](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview)).
            },
            "storageAccount": "A String", # Required. The name of the Azure Storage account.
          },
        },
        "status": "A String", # Status of the job. This value MUST be specified for
            # `CreateTransferJobRequests`.
            #
            # **Note:** The effect of the new job status takes place during a subsequent
            # job run. For example, if you change the job status from
            # ENABLED to DISABLED, and an operation
            # spawned by the transfer is running, the status change would not affect the
            # current operation.
        "deletionTime": "A String", # Output only. The time that the transfer job was deleted.
        "schedule": { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
          "startTimeOfDay": { # Represents a time of day. The date and time zone are either not significant # The time in UTC that a transfer job is scheduled to run. Transfers may
              # start later than this time.
              #
              # If `start_time_of_day` is not specified:
              #
              # *   One-time transfers run immediately.
              # *   Recurring transfers run immediately, and each day at midnight UTC,
              #     through schedule_end_date.
              #
              # If `start_time_of_day` is specified:
              #
              # *   One-time transfers run at the specified time.
              # *   Recurring transfers run at the specified time each day, through
              #     `schedule_end_date`.
              # or are specified elsewhere. An API may choose to allow leap seconds. Related
              # types are google.type.Date and `google.protobuf.Timestamp`.
            "hours": 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
                # to allow the value "24:00:00" for scenarios like business closing time.
            "nanos": 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
            "seconds": 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
                # allow the value 60 if it allows leap-seconds.
            "minutes": 42, # Minutes of hour of day. Must be from 0 to 59.
          },
          "scheduleStartDate": { # Represents a whole or partial calendar date, e.g. a birthday. The time of day # Required. The start date of a transfer. Date boundaries are determined
              # relative to UTC time. If `schedule_start_date` and start_time_of_day
              # are in the past relative to the job's creation time, the transfer starts
              # the day after you schedule the transfer request.
              #
              # **Note:** When starting jobs at or near midnight UTC it is possible that
              # a job will start later than expected. For example, if you send an outbound
              # request on June 1 one millisecond prior to midnight UTC and the Storage
              # Transfer Service server receives the request on June 2, then it will create
              # a TransferJob with `schedule_start_date` set to June 2 and a
              # `start_time_of_day` set to midnight UTC. The first scheduled
              # TransferOperation will take place on June 3 at midnight UTC.
              # and time zone are either specified elsewhere or are not significant. The date
              # is relative to the Proleptic Gregorian Calendar. This can represent:
              #
              # * A full date, with non-zero year, month and day values
              # * A month and day value, with a zero year, e.g. an anniversary
              # * A year on its own, with zero month and day values
              # * A year and month value, with a zero day, e.g. a credit card expiration date
              #
              # Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
            "year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
                # a year.
            "day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
                # if specifying a year by itself or a year and month where the day is not
                # significant.
            "month": 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
                # month and day.
          },
          "scheduleEndDate": { # Represents a whole or partial calendar date, e.g. a birthday. The time of day # The last day a transfer runs. Date boundaries are determined relative to
              # UTC time. A job will run once per 24 hours within the following guidelines:
              #
              # *   If `schedule_end_date` and schedule_start_date are the same and in
              #     the future relative to UTC, the transfer is executed only one time.
              # *   If `schedule_end_date` is later than `schedule_start_date`  and
              #     `schedule_end_date` is in the future relative to UTC, the job will
              #     run each day at start_time_of_day through `schedule_end_date`.
              # and time zone are either specified elsewhere or are not significant. The date
              # is relative to the Proleptic Gregorian Calendar. This can represent:
              #
              # * A full date, with non-zero year, month and day values
              # * A month and day value, with a zero year, e.g. an anniversary
              # * A year on its own, with zero month and day values
              # * A year and month value, with a zero day, e.g. a credit card expiration date
              #
              # Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
            "year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
                # a year.
            "day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
                # if specifying a year by itself or a year and month where the day is not
                # significant.
            "month": 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
                # month and day.
          },
        },
        "projectId": "A String", # The ID of the Google Cloud Platform Project that owns the job.
        "description": "A String", # A description provided by the user for the job. Its max length is 1024
            # bytes when Unicode-encoded.
        "lastModificationTime": "A String", # Output only. The time that the transfer job was last modified.
        "creationTime": "A String", # Output only. The time that the transfer job was created.
        "notificationConfig": { # Specification to configure notifications published to Cloud Pub/Sub. # Notification configuration.
            # Notifications will be published to the customer-provided topic using the
            # following `PubsubMessage.attributes`:
            #
            # * `"eventType"`: one of the EventType values
            # * `"payloadFormat"`: one of the PayloadFormat values
            # * `"projectId"`: the project_id of the
            # `TransferOperation`
            # * `"transferJobName"`: the
            # transfer_job_name of the
            # `TransferOperation`
            # * `"transferOperationName"`: the name of the
            # `TransferOperation`
            #
            # The `PubsubMessage.data` will contain a TransferOperation resource
            # formatted according to the specified `PayloadFormat`.
          "eventTypes": [ # Event types for which a notification is desired. If empty, send
              # notifications for all event types.
            "A String",
          ],
          "payloadFormat": "A String", # Required. The desired format of the notification message payloads.
          "pubsubTopic": "A String", # Required. The `Topic.name` of the Cloud Pub/Sub topic to which to publish
              # notifications. Must be of the format: `projects/{project}/topics/{topic}`.
              # Not matching this format will result in an
              # INVALID_ARGUMENT error.
        },
        "name": "A String", # A unique name (within the transfer project) assigned when the job is
            # created.  If this field is empty in a CreateTransferJobRequest, Storage
            # Transfer Service will assign a unique name. Otherwise, the specified name
            # is used as the unique name for this job.
            #
            # If the specified name is in use by a job, the creation request fails with
            # an ALREADY_EXISTS error.
            #
            # This name must start with `"transferJobs/"` prefix and end with a letter or
            # a number, and should be no more than 128 characters.
            # Example: `"transferJobs/[A-Za-z0-9-._~]*[A-Za-z0-9]$"`
            #
            # Invalid job names will fail with an
            # INVALID_ARGUMENT error.
      },
  }

  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format
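
  As a sketch of how the request body above is assembled in practice (the
  helper name and the daily Cloud-Storage-to-Cloud-Storage scenario are
  illustrative, not part of the client library):

```python
import datetime

# Illustrative helper (not part of the library): builds a minimal
# TransferJob body for a recurring bucket-to-bucket copy.
def make_transfer_job_body(project_id, src_bucket, dst_bucket, start_date):
    return {
        "projectId": project_id,
        "status": "ENABLED",  # status MUST be set in CreateTransferJobRequests
        "transferSpec": {
            "gcsDataSource": {"bucketName": src_bucket},
            "gcsDataSink": {"bucketName": dst_bucket},
        },
        "schedule": {
            # Date fields follow the google.type.Date convention shown above.
            "scheduleStartDate": {
                "year": start_date.year,
                "month": start_date.month,
                "day": start_date.day,
            },
        },
    }

body = make_transfer_job_body("my-project", "src-bucket", "dst-bucket",
                              datetime.date(2020, 6, 1))
```

  The resulting dict would be passed as `body` to
  `service.transferJobs().create(body=body).execute()`, where `service` comes
  from `googleapiclient.discovery.build('storagetransfer', 'v1')` with
  appropriate credentials.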

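  The `TsvHttpData-1.0` URL list consumed by `httpDataSource` (described under
  `transferSpec` above) can be generated with a short script. This sketch
  assumes the object bytes are available locally so the Length and
  base64-encoded MD5 fields can be computed:

```python
import base64
import hashlib

def make_url_list(entries):
    """Build a TsvHttpData-1.0 URL list from (url, object_bytes) pairs."""
    lines = ["TsvHttpData-1.0"]  # required format header on the first line
    for url, data in entries:
        md5 = base64.b64encode(hashlib.md5(data).digest()).decode("ascii")
        # Tab-delimited fields: HTTP URL, Length in bytes, base64-encoded MD5.
        lines.append("\t".join([url, str(len(data)), md5]))
    return "\n".join(lines) + "\n"

tsv = make_url_list([("http://example.com/logs/requests.gz", b"example bytes")])
```

  The generated file must then be hosted at a publicly accessible HTTP(S) URL
  and that URL supplied as `listUrl`.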
Returns:
  An object of the form:

    { # This resource represents the configuration of a transfer job that runs
        # periodically.
      "transferSpec": { # Configuration for running a transfer. # Transfer specification.
        "objectConditions": { # Conditions that determine which objects will be transferred. Applies only # Only objects that satisfy these object conditions are included in the set
            # of data source and data sink objects.  Object conditions based on
            # objects' "last modification time" do not exclude objects in a data sink.
            # to S3 and Cloud Storage objects.
            #
            # The "last modification time" refers to the time of the
            # last change to the object's content or metadata — specifically, this is
            # the `updated` property of Cloud Storage objects and the `LastModified`
            # field of S3 objects.
          "maxTimeElapsedSinceLastModification": "A String", # If specified, only objects with a "last modification time" on or after
              # `NOW` - `max_time_elapsed_since_last_modification` and objects that don't
              # have a "last modification time" are transferred.
              #
              # For each TransferOperation started by this TransferJob,
              # `NOW` refers to the start_time of the
              # `TransferOperation`.
          "includePrefixes": [ # If `include_prefixes` is specified, objects that satisfy the object
              # conditions must have names that start with one of the `include_prefixes`
              # and that do not start with any of the exclude_prefixes. If
              # `include_prefixes` is not specified, all objects except those that have
              # names starting with one of the `exclude_prefixes` must satisfy the object
              # conditions.
              #
              # Requirements:
              #
              #   * Each include-prefix and exclude-prefix can contain any sequence of
              #     Unicode characters, to a max length of 1024 bytes when UTF8-encoded,
              #     and must not contain Carriage Return or Line Feed characters.  Wildcard
              #     matching and regular expression matching are not supported.
              #
              #   * Each include-prefix and exclude-prefix must omit the leading slash.
              #     For example, to include the `requests.gz` object in a transfer from
              #     `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
              #     prefix as `logs/y=2015/requests.gz`.
              #
              #   * None of the include-prefix or the exclude-prefix values can be empty,
              #     if specified.
              #
              #   * Each include-prefix must include a distinct portion of the object
              #     namespace. No include-prefix may be a prefix of another
              #     include-prefix.
              #
              #   * Each exclude-prefix must exclude a distinct portion of the object
              #     namespace. No exclude-prefix may be a prefix of another
              #     exclude-prefix.
              #
              #   * If `include_prefixes` is specified, then each exclude-prefix must start
              #     with the value of a path explicitly included by `include_prefixes`.
              #
              # The max size of `include_prefixes` is 1000.
            "A String",
          ],
          "minTimeElapsedSinceLastModification": "A String", # If specified, only objects with a "last modification time" before
              # `NOW` - `min_time_elapsed_since_last_modification` and objects that don't
              #  have a "last modification time" are transferred.
              #
              # For each TransferOperation started by this TransferJob, `NOW`
              # refers to the start_time of the
              # `TransferOperation`.
          "lastModifiedBefore": "A String", # If specified, only objects with a "last modification time" before this
              # timestamp and objects that don't have a "last modification time" will be
              # transferred.
          "lastModifiedSince": "A String", # If specified, only objects with a "last modification time" on or after
              # this timestamp and objects that don't have a "last modification time" are
              # transferred.
              #
              # The `last_modified_since` and `last_modified_before` fields can be used
              # together for chunked data processing. For example, consider a script that
              # processes each day's worth of data at a time. For that you'd set each
              # of the fields as follows:
              #
              # *  `last_modified_since` to the start of the day
              #
              # *  `last_modified_before` to the end of the day
          "excludePrefixes": [ # `exclude_prefixes` must follow the requirements described for
              # include_prefixes.
              #
              # The max size of `exclude_prefixes` is 1000.
            "A String",
          ],
        },
        "gcsDataSource": { # In a GcsData resource, an object's name is the Cloud Storage object's # A Cloud Storage data source.
            # name and its "last modification time" refers to the object's `updated`
            # property of Cloud Storage objects, which changes when the content or the
            # metadata of the object is updated.
          "bucketName": "A String", # Required. Cloud Storage bucket name (see
              # [Bucket Name
              # Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
        },
        "httpDataSource": { # An HttpData resource specifies a list of objects on the web to be transferred # An HTTP URL data source.
            # over HTTP.  The information of the objects to be transferred is contained in
            # a file referenced by a URL. The first line in the file must be
            # `"TsvHttpData-1.0"`, which specifies the format of the file.  Subsequent
            # lines specify the information of the list of objects, one object per list
            # entry. Each entry has the following tab-delimited fields:
            #
            # * **HTTP URL** — The location of the object.
            #
            # * **Length** — The size of the object in bytes.
            #
            # * **MD5** — The base64-encoded MD5 hash of the object.
            #
            # For an example of a valid TSV file, see
            # [Transferring data from
            # URLs](https://cloud.google.com/storage-transfer/docs/create-url-list).
            #
            # When transferring data based on a URL list, keep the following in mind:
            #
            # * When an object located at `http(s)://hostname:port/<URL-path>` is
            # transferred to a data sink, the name of the object at the data sink is
            # `<hostname>/<URL-path>`.
            #
            # * If the specified size of an object does not match the actual size of the
            # object fetched, the object will not be transferred.
            #
            # * If the specified MD5 does not match the MD5 computed from the transferred
            # bytes, the object transfer will fail. For more information, see
            # [Generating MD5
            # hashes](https://cloud.google.com/storage-transfer/docs/create-url-list#md5).
            #
            # * Ensure that each URL you specify is publicly accessible. For
            # example, in Cloud Storage you can
            # [share an object publicly](https://cloud.google.com/storage/docs/cloud-console#_sharingdata)
            # and get a link to it.
            #
            # * Storage Transfer Service obeys `robots.txt` rules and requires the source
            # HTTP server to support `Range` requests and to return a `Content-Length`
            # header in each response.
            #
            # * ObjectConditions have no effect when filtering objects to transfer.
          "listUrl": "A String", # Required. The URL that points to the file that stores the object list
              # entries. This file must allow public access.  Currently, only URLs with
              # HTTP and HTTPS schemes are supported.
        },
        "transferOptions": { # TransferOptions uses three boolean parameters to define the actions # If the option
            # delete_objects_unique_in_sink
            # is `true`, object conditions based on objects' "last modification time" are
            # ignored and do not exclude objects in a data source or a data sink.
            # to be performed on objects in a transfer.
          "overwriteObjectsAlreadyExistingInSink": True or False, # Whether overwriting objects that already exist in the sink is allowed.
          "deleteObjectsFromSourceAfterTransfer": True or False, # Whether objects should be deleted from the source after they are
              # transferred to the sink.
              #
              # **Note:** This option and delete_objects_unique_in_sink are mutually
              # exclusive.
          "deleteObjectsUniqueInSink": True or False, # Whether objects that exist only in the sink should be deleted.
              #
              # **Note:** This option and delete_objects_from_source_after_transfer are
              # mutually exclusive.
        },
        "gcsDataSink": { # In a GcsData resource, an object's name is the Cloud Storage object's # A Cloud Storage data sink.
            # name and its "last modification time" refers to the object's `updated`
            # property of Cloud Storage objects, which changes when the content or the
            # metadata of the object is updated.
          "bucketName": "A String", # Required. Cloud Storage bucket name (see
              # [Bucket Name
              # Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
        },
        "awsS3DataSource": { # An AwsS3Data resource can be a data source, but not a data sink. # An AWS S3 data source.
            # In an AwsS3Data resource, an object's name is the S3 object's key name.
          "awsAccessKey": { # AWS access key (see # Required. AWS access key used to sign the API requests to the AWS S3
              # bucket. Permissions on the bucket must be granted to the access ID of the
              # AWS access key.
              # [AWS Security
              # Credentials](https://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
            "secretAccessKey": "A String", # Required. AWS secret access key. This field is not returned in RPC
                # responses.
            "accessKeyId": "A String", # Required. AWS access key ID.
          },
          "bucketName": "A String", # Required. S3 Bucket name (see
              # [Creating a
              # bucket](https://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
        },
        "azureBlobStorageDataSource": { # An AzureBlobStorageData resource can be a data source, but not a data sink. # An Azure Blob Storage data source.
            # An AzureBlobStorageData resource represents one Azure container. The storage
            # account determines the [Azure
            # endpoint](https://docs.microsoft.com/en-us/azure/storage/common/storage-create-storage-account#storage-account-endpoints).
            # In an AzureBlobStorageData resource, a blob's name is the [Azure Blob
            # Storage blob's key
            # name](https://docs.microsoft.com/en-us/rest/api/storageservices/naming-and-referencing-containers--blobs--and-metadata#blob-names).
          "container": "A String", # Required. The container to transfer from the Azure Storage account.
          "azureCredentials": { # Azure credentials # Required. Credentials used to authenticate API requests to Azure.
            "sasToken": "A String", # Required. Azure shared access signature. (see
                # [Grant limited access to Azure Storage resources using shared access
                # signatures
                # (SAS)](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview)).
          },
          "storageAccount": "A String", # Required. The name of the Azure Storage account.
        },
      },
      "status": "A String", # Status of the job. This value MUST be specified for
          # `CreateTransferJobRequests`.
          #
          # **Note:** The effect of the new job status takes place during a subsequent
          # job run. For example, if you change the job status from
          # ENABLED to DISABLED, and an operation
          # spawned by the transfer is running, the status change would not affect the
          # current operation.
      "deletionTime": "A String", # Output only. The time that the transfer job was deleted.
      "schedule": { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
        "startTimeOfDay": { # Represents a time of day. The date and time zone are either not significant # The time in UTC that a transfer job is scheduled to run. Transfers may
            # start later than this time.
            #
            # If `start_time_of_day` is not specified:
            #
            # *   One-time transfers run immediately.
            # *   Recurring transfers run immediately, and each day at midnight UTC,
            #     through schedule_end_date.
            #
            # If `start_time_of_day` is specified:
            #
            # *   One-time transfers run at the specified time.
            # *   Recurring transfers run at the specified time each day, through
            #     `schedule_end_date`.
            # or are specified elsewhere. An API may choose to allow leap seconds. Related
            # types are google.type.Date and `google.protobuf.Timestamp`.
          "hours": 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
              # to allow the value "24:00:00" for scenarios like business closing time.
          "nanos": 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
          "seconds": 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
              # allow the value 60 if it allows leap-seconds.
          "minutes": 42, # Minutes of hour of day. Must be from 0 to 59.
        },
        "scheduleStartDate": { # Represents a whole or partial calendar date, e.g. a birthday. The time of day # Required. The start date of a transfer. Date boundaries are determined
            # relative to UTC time. If `schedule_start_date` and start_time_of_day
            # are in the past relative to the job's creation time, the transfer starts
            # the day after you schedule the transfer request.
            #
            # **Note:** When starting jobs at or near midnight UTC it is possible that
            # a job will start later than expected. For example, if you send an outbound
            # request on June 1 one millisecond prior to midnight UTC and the Storage
            # Transfer Service server receives the request on June 2, then it will create
            # a TransferJob with `schedule_start_date` set to June 2 and a
            # `start_time_of_day` set to midnight UTC. The first scheduled
            # TransferOperation will take place on June 3 at midnight UTC.
            # and time zone are either specified elsewhere or are not significant. The date
            # is relative to the Proleptic Gregorian Calendar. This can represent:
            #
            # * A full date, with non-zero year, month and day values
            # * A month and day value, with a zero year, e.g. an anniversary
            # * A year on its own, with zero month and day values
            # * A year and month value, with a zero day, e.g. a credit card expiration date
            #
            # Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
          "year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
              # a year.
          "day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
              # if specifying a year by itself or a year and month where the day is not
              # significant.
          "month": 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
              # month and day.
        },
        "scheduleEndDate": { # Represents a whole or partial calendar date, e.g. a birthday. The time of day # The last day a transfer runs. Date boundaries are determined relative to
            # UTC time. A job will run once per 24 hours within the following guidelines:
            #
            # *   If `schedule_end_date` and schedule_start_date are the same and in
            #     the future relative to UTC, the transfer is executed only one time.
            # *   If `schedule_end_date` is later than `schedule_start_date`  and
            #     `schedule_end_date` is in the future relative to UTC, the job will
            #     run each day at start_time_of_day through `schedule_end_date`.
            # and time zone are either specified elsewhere or are not significant. The date
            # is relative to the Proleptic Gregorian Calendar. This can represent:
            #
            # * A full date, with non-zero year, month and day values
            # * A month and day value, with a zero year, e.g. an anniversary
            # * A year on its own, with zero month and day values
            # * A year and month value, with a zero day, e.g. a credit card expiration date
            #
            # Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
          "year": 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
              # a year.
          "day": 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
              # if specifying a year by itself or a year and month where the day is not
              # significant.
          "month": 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
              # month and day.
        },
      },
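As a sketch of the schedule rules above (all date values here are illustrative placeholders), a daily job and a one-time job differ only in whether the two date boundaries coincide:

```python
# Illustrative schedule fragment: the job runs once per 24 hours from
# scheduleStartDate through scheduleEndDate (dates interpreted in UTC).
daily_schedule = {
    "scheduleStartDate": {"year": 2020, "month": 1, "day": 1},
    "scheduleEndDate": {"year": 2020, "month": 12, "day": 31},
}

# Using the same date for both boundaries (in the future relative to UTC)
# executes the transfer exactly once.
one_time_schedule = {
    "scheduleStartDate": {"year": 2020, "month": 6, "day": 15},
    "scheduleEndDate": {"year": 2020, "month": 6, "day": 15},
}
```

Note that a zero value in `year`, `month`, or `day` carries the "not significant" meaning described above, so fully specified dates are what a transfer schedule normally uses.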
      "projectId": "A String", # The ID of the Google Cloud Platform Project that owns the job.
      "description": "A String", # A description provided by the user for the job. Its max length is 1024
          # bytes when Unicode-encoded.
      "lastModificationTime": "A String", # Output only. The time that the transfer job was last modified.
      "creationTime": "A String", # Output only. The time that the transfer job was created.
      "notificationConfig": { # Specification to configure notifications published to Cloud Pub/Sub. # Notification configuration.
          # Notifications will be published to the customer-provided topic using the
          # following `PubsubMessage.attributes`:
          #
          # * `"eventType"`: one of the EventType values
          # * `"payloadFormat"`: one of the PayloadFormat values
          # * `"projectId"`: the project_id of the
          # `TransferOperation`
          # * `"transferJobName"`: the
          # transfer_job_name of the
          # `TransferOperation`
          # * `"transferOperationName"`: the name of the
          # `TransferOperation`
          #
          # The `PubsubMessage.data` will contain a TransferOperation resource
          # formatted according to the specified `PayloadFormat`.
        "eventTypes": [ # Event types for which a notification is desired. If empty, send
            # notifications for all event types.
          "A String",
        ],
        "payloadFormat": "A String", # Required. The desired format of the notification message payloads.
        "pubsubTopic": "A String", # Required. The `Topic.name` of the Cloud Pub/Sub topic to which to publish
            # notifications. Must be of the format: `projects/{project}/topics/{topic}`.
            # Not matching this format will result in an
            # INVALID_ARGUMENT error.
      },
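For illustration, a notification configuration publishing JSON payloads to a placeholder topic might look like the following sketch (the project and topic names are hypothetical); checking the topic format locally avoids the documented INVALID_ARGUMENT error:

```python
import re

notification_config = {
    # Placeholder topic; must match projects/{project}/topics/{topic}.
    "pubsubTopic": "projects/my-project/topics/transfer-events",
    "payloadFormat": "JSON",
    # An empty eventTypes list means notifications are sent for all event types.
    "eventTypes": [],
}

# Sanity-check the required topic format before sending the request.
TOPIC_RE = re.compile(r"^projects/[^/]+/topics/[^/]+$")
assert TOPIC_RE.match(notification_config["pubsubTopic"])
```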
      "name": "A String", # A unique name (within the transfer project) assigned when the job is
          # created.  If this field is empty in a CreateTransferJobRequest, Storage
          # Transfer Service will assign a unique name. Otherwise, the specified name
          # is used as the unique name for this job.
          #
          # If the specified name is in use by a job, the creation request fails with
          # an ALREADY_EXISTS error.
          #
          # This name must start with the `"transferJobs/"` prefix and end with a
          # letter or a number, and must be no more than 128 characters. Valid names
          # match the pattern `"transferJobs/[A-Za-z0-9-._~]*[A-Za-z0-9]$"`.
          #
          # Invalid job names will fail with an
          # INVALID_ARGUMENT error.
    }
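Putting the pieces together, a minimal request body for `create()` might look like the following sketch. The project ID, bucket names, and description are placeholders, and the exact `transferSpec` fields depend on your source and sink; the name-pattern check mirrors the documented pattern but does not enforce the 128-character limit:

```python
import re

# Minimal request body sketch; project ID and bucket names are placeholders.
job_body = {
    "projectId": "my-project",
    "description": "Nightly bucket-to-bucket copy",  # max 1024 bytes, Unicode-encoded
    "status": "ENABLED",
    "schedule": {
        "scheduleStartDate": {"year": 2020, "month": 1, "day": 1},
    },
    "transferSpec": {
        "gcsDataSource": {"bucketName": "source-bucket"},
        "gcsDataSink": {"bucketName": "sink-bucket"},
    },
}

# If "name" is supplied, it must match the documented pattern:
# the "transferJobs/" prefix, ending in a letter or digit.
NAME_RE = re.compile(r"^transferJobs/[A-Za-z0-9-._~]*[A-Za-z0-9]$")
assert NAME_RE.match("transferJobs/nightly-copy-01")

# With an authenticated client, e.g.
#   client = googleapiclient.discovery.build("storagetransfer", "v1")
# the job would be created with:
#   client.transferJobs().create(body=job_body).execute()
```

Omitting `"name"`, as here, lets Storage Transfer Service assign a unique name; supplying a name already in use fails with ALREADY_EXISTS.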