Xtrabackup/xbcloud backups broken unexpectedly

Hey awesome community,

I am facing a strange issue with xtrabackup and xbcloud while trying to back up my DB to GCS. It is worth noting that this worked fine for several months, until a few days ago. I am using this command:

xtrabackup --backup --stream=xbstream --compress \
  --compress-threads=5 --extra-lsndir=/tmp \
  --target-dir=/tmp --parallel=5 \
  | xbcloud put --storage=google \
  --google-endpoint='storage.googleapis.com' \
  --google-access-key=GOOGLEACCESSKEY \
  --google-secret-key=GOOGLESECRETKEY \
  --google-bucket=BUCKETNAME \
  --parallel=5 TEST/$HOSTNAME-full-backup

The command above exits after a while, with HTTP/2 error messages like:

210224 22:10:19 xbcloud: Failed to upload object. Error: Error in the HTTP2 framing layer
xtrabackup: Error writing file 'UNOPENED' (Errcode: 32 - Broken pipe)
xb_stream_write_data() failed.
compress: write to the destination stream failed.
xtrabackup: Error writing file 'UNOPENED' (Errcode: 32 - Broken pipe)
[17] xtrabackup: Error: xtrabackup_copy_datafile() failed.
[17] xtrabackup: Error: failed to copy datafile.
Xtrabackup version:
xtrabackup version 2.4.20 based on MySQL server 5.7.26 Linux (x86_64) (revision id: c8b4056)

From my troubleshooting so far, I believe the issue lies somewhere between xbcloud and GCS, but I'm not sure what else to inspect. Does anybody have a clue what's happening here?
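One way I'm thinking of narrowing it down is to split the two steps and run xbcloud on its own against a locally stored stream, roughly like this (the local path below is just a placeholder and needs enough free disk space):

xtrabackup --backup --stream=xbstream --compress \
  --compress-threads=5 --extra-lsndir=/tmp \
  --target-dir=/tmp --parallel=5 > /backups/full-backup.xbstream

xbcloud put --storage=google \
  --google-endpoint='storage.googleapis.com' \
  --google-access-key=GOOGLEACCESSKEY \
  --google-secret-key=GOOGLESECRETKEY \
  --google-bucket=BUCKETNAME \
  --parallel=5 TEST/$HOSTNAME-full-backup < /backups/full-backup.xbstream

If the second step still fails on its own, that would point at xbcloud/GCS rather than the xtrabackup stream.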

UPDATE: Apparently in my case it's a problem on the GCS side; I tried the exact same command against the S3 endpoint and it uploaded successfully.
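For reference, the S3 variant I tried looked roughly like this (endpoint, keys and bucket below are placeholders):

xtrabackup --backup --stream=xbstream --compress \
  --compress-threads=5 --extra-lsndir=/tmp \
  --target-dir=/tmp --parallel=5 \
  | xbcloud put --storage=s3 \
  --s3-endpoint='s3.amazonaws.com' \
  --s3-access-key=S3ACCESSKEY \
  --s3-secret-key=S3SECRETKEY \
  --s3-bucket=BUCKETNAME \
  --parallel=5 TEST/$HOSTNAME-full-backup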

Hi @aorfanos,

Thanks for the update. I've just tested and it seems to be working fine on my side.
If the problem persists, can you please run xbcloud with the --verbose flag and attach the error log?
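Something along these lines should capture xbcloud's verbose output to a file (the log path is just an example):

xtrabackup --backup --stream=xbstream --compress \
  --compress-threads=5 --extra-lsndir=/tmp \
  --target-dir=/tmp --parallel=5 \
  | xbcloud put --storage=google --verbose \
  --google-endpoint='storage.googleapis.com' \
  --google-access-key=GOOGLEACCESSKEY \
  --google-secret-key=GOOGLESECRETKEY \
  --google-bucket=BUCKETNAME \
  --parallel=5 TEST/$HOSTNAME-full-backup \
  2> /tmp/xbcloud-verbose.log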

Thanks

Hi @Marcelo_Altmann,

To replicate this you will need a big database; try something larger than 1 TB. Also, I'm now using xbcloud and xtrabackup version 2.4.21.
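If it helps with reproducing, a sufficiently large test dataset can be generated with something like sysbench (the table count, row count and credentials below are only rough placeholders aimed at getting past 1 TB):

sysbench oltp_read_write \
  --mysql-host=127.0.0.1 --mysql-user=sbtest --mysql-password=PASSWORD \
  --mysql-db=sbtest --tables=100 --table-size=100000000 \
  --threads=16 prepare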

Apparently there’s something off with the HTTP/2 handling, either on the GCS side or on the libcurl side (probably GCS, since I can back up to S3 normally).

Specifically, GCS periodically sends HTTP/2 GOAWAY frames on connections to force clients to reconnect (as they’ve informed me, that’s a new feature as of 18/02/2021). Normally that shouldn’t affect in-flight streams, but it seems to do exactly that in this case. We suspect that libcurl might be responsible; we’re still testing.
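In case it’s useful for whoever wants to dig into the libcurl angle, a quick way to check which libcurl xbcloud is linked against and what a verbose HTTP/2 exchange with the endpoint looks like might be something like:

curl -V                                        # libcurl version and whether HTTP/2 is supported
ldd "$(command -v xbcloud)" | grep -i curl     # which libcurl library xbcloud is linked against
curl -v --http2 -o /dev/null https://storage.googleapis.com/   # verbose HTTP/2 request to the GCS endpoint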

I ran xbcloud with --verbose; here’s the point where the upload breaks:

* Connection state changed (MAX_CONCURRENT_STREAMS == 100)!
210304 17:17:23 xbcloud: Upload failed.
xtrabackup: Error writing file 'UNOPENED' (Errcode: 32 - Broken pipe)
xb_stream_write_data() failed.
compress: write to the destination stream failed.
xtrabackup: Error writing file 'UNOPENED' (Errcode: 32 - Broken pipe)
[04] xtrabackup: Error: xtrabackup_copy_datafile() failed.
[04] xtrabackup: Error: failed to copy datafile.

Let me know if you don’t manage to replicate this with a DB bigger than 1TB.
