
I configured Storage Transfer Service to copy objects from one Google Cloud Storage bucket to another.

About 98% of the objects are copied successfully, but the remaining 2% fail. From the error details, I can see that the failures are NOT_FOUND errors with the following message:

No such object: source-bucket-name/path/to/object/ObjectName.json

But when I check whether the file exists, it does exist. I don't understand why this happens, because GCS only notifies Pub/Sub after the object has been successfully created (OBJECT_FINALIZE).
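For reference, this is roughly how I verify that a reported object really exists in the source bucket (a minimal sketch with the Python client; the object name is the one from the error message above):

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("source-bucket-name")

# Object name taken from the NOT_FOUND error message above.
blob = bucket.get_blob("path/to/object/ObjectName.json")
print(blob is not None)                      # prints True: the object is there
print(blob.time_created if blob else None)   # creation time of the existing object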

Is there a way to fix the NOT_FOUND problem, or any workaround?

Here is my Config Connector and Storage Transfer setup.

---
apiVersion: storage.cnrm.cloud.google.com/v1beta1
kind: StorageNotification
metadata:
  name: source-notification
spec:
  bucketRef:
    name: source-bucket-name
  payloadFormat: JSON_API_V1
  topicRef:
    name: my-notification-topic
  eventTypes:
    - "OBJECT_FINALIZE"
    - "OBJECT_METADATA_UPDATE"
    - "OBJECT_DELETE"
    - "OBJECT_ARCHIVE"

---
apiVersion: pubsub.cnrm.cloud.google.com/v1beta1
kind: PubSubTopic
metadata:
  name: my-notification-topic
spec:
  messageRetentionDuration: 2678400s # 31 days

---
apiVersion: pubsub.cnrm.cloud.google.com/v1beta1
kind: PubSubSubscription
metadata:
  name: my-notification-subscription
spec:
  ackDeadlineSeconds: 300
  enableExactlyOnceDelivery: false
  retainAckedMessages: false
  topicRef:
    name: my-notification-topic
  expirationPolicy:
    ttl: ""

Because Config Connector does not yet support Storage Transfer Service in event-driven mode, the transfer job is set up manually:

Scheduling mode: Event driven
Source
  Type: Google Cloud Storage
  Name: source-bucket-name
  Folder path: path/to/source
  Event stream:
    Pub/Sub subscription name: projects/my-projects/subscriptions/my-notification-subscription
  Filters: None
Destination
  Type: Google Cloud Storage
  Name: destination-bucket-name
  Folder path: path/to/destination
Data handling options:
  Metadata options:
    ACL: Use destination bucket's object ACLs
    KMS key: Use destination bucket's encryption settings
    Storage class: Use destination bucket's storage class
    Temporary hold: Preserve object's original temporary hold status
    Time created: Do not preserve source object's time created
  When to overwrite: Always
  When to delete: Never
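
For completeness, the same event-driven job can also be created through the Storage Transfer API instead of the console. A rough sketch with the Python client (the project ID and description are placeholders, and the metadata options are omitted):

from google.cloud import storage_transfer

client = storage_transfer.StorageTransferServiceClient()

# Rough equivalent of the console setup above.
transfer_job = {
    "project_id": "my-projects",                     # placeholder project ID
    "description": "event-driven copy to destination-bucket-name",
    "status": storage_transfer.TransferJob.Status.ENABLED,
    "transfer_spec": {
        "gcs_data_source": {"bucket_name": "source-bucket-name", "path": "path/to/source/"},
        "gcs_data_sink": {"bucket_name": "destination-bucket-name", "path": "path/to/destination/"},
        "transfer_options": {"overwrite_objects_already_existing_in_sink": True},
    },
    # Event stream pointing at the Pub/Sub subscription defined above.
    "event_stream": {
        "name": "projects/my-projects/subscriptions/my-notification-subscription",
    },
}

job = client.create_transfer_job({"transfer_job": transfer_job})
print(job.name)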

I want to eliminate the NOT_FOUND errors entirely, or find a workaround to copy the objects that failed.
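
If there is no proper fix, I would also be fine with re-copying just the failed objects by hand, e.g. along these lines (the list of failed object names would be collected from the transfer job's error details; the names here are placeholders, and the destination prefix may need adjusting):

from google.cloud import storage

client = storage.Client()
src = client.bucket("source-bucket-name")
dst = client.bucket("destination-bucket-name")

# Object names collected from the transfer job's error details (placeholders).
failed_objects = ["path/to/object/ObjectName.json"]

for name in failed_objects:
    blob = src.get_blob(name)           # returns None if the object is really missing
    if blob is None:
        print(f"still missing in source: {name}")
        continue
    src.copy_blob(blob, dst, name)      # server-side copy, no local download
    print(f"re-copied: {name}")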

1 Answer

I also ran into this issue with transfer jobs. In my case I could reproduce it as follows:

  • Create and run a transfer job from bucket A to bucket B
  • Delete bucket B
  • Create bucket B again (with the same name)
  • Run the transfer from A to B again -> this produced many NOT_FOUND errors (sketched below)
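
Roughly, those steps in code (the job name, project ID, and bucket name are placeholders for my test setup):

from google.cloud import storage, storage_transfer

storage_client = storage.Client()
transfer_client = storage_transfer.StorageTransferServiceClient()

JOB = "transferJobs/0000000000"   # placeholder for the existing A -> B job
PROJECT = "my-project"            # placeholder project ID

# Run the job once against the original bucket B and wait for it to finish.
transfer_client.run_transfer_job({"job_name": JOB, "project_id": PROJECT}).result()

# Delete bucket B and recreate it under the same name.
bucket_b = storage_client.bucket("bucket-b")
bucket_b.delete(force=True)       # force=True deletes the remaining objects first
storage_client.create_bucket("bucket-b")

# Run the job again -> this second run produced the NOT_FOUND errors for me.
transfer_client.run_transfer_job({"job_name": JOB, "project_id": PROJECT}).result()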

My use case was creating backups, and therefore I could work around the issue by never reusing the same bucket name as the target of a backup transfer.
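
In practice that just meant generating a fresh target bucket name for every backup run, roughly like this (the naming scheme and location are placeholders):

from datetime import datetime, timezone
from google.cloud import storage

client = storage.Client()

# A fresh target bucket per backup run, so a bucket name is never reused after deletion.
suffix = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
backup_bucket = client.create_bucket(f"my-backups-{suffix}", location="us-central1")
print(backup_bucket.name)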

Nevertheless, I wrote to Google support. They were able to reproduce the issue with the steps above, created a fix, and rolled it out on 2024-07-03. My issue has been resolved by that rollout.
