On-the-fly xtrabackup straight to AWS S3 and restoring

Streaming my backup straight to S3 in xbstream format:

xtrabackup --encrypt=AES256 --encrypt-key-file=mysecretkey --backup --compress --stream=xbstream --extra-lsndir=/tmp --target-dir=/tmp | xbcloud put --parallel=8 --storage=s3 --s3-bucket=mybucket test-xbcloud

Now my question: is it possible to pipe through xbstream so that the backup is decrypted and decompressed straight to disk, leaving the database files ready as they are streamed?

This is what I thought I should use:

xbcloud get --parallel=8 s3://mybucket/test-xbcloud | xbstream -x | xtrabackup --decrypt=AES256 --encrypt-key-file=mysecretkey --decompress --target-dir=./ --remove-original

This gets me files on disk like these:

ibdata1.qp.xbcrypt

…

Then I have to run a third command with --decrypt and --decompress to get them decrypted and uncompressed. Once again: can I pipe straight through xbstream?


Hi Arnoldasb. You can achieve this using the commands below:

mkfifo /tmp/xtrabackup.fifo

xbstream -x < /tmp/xtrabackup.fifo -C /tmp/backup &

xtrabackup --encrypt=AES256 --encrypt-key-file=mysecretkey --backup --compress --stream=xbstream --extra-lsndir=/tmp --target-dir=/tmp | xbcloud put --parallel=8 --storage=s3 --s3-bucket=mybucket test-xbcloud

xtrabackup --decrypt=AES256 --encrypt-key-file=mysecretkey --decompress --target-dir=/tmp/backup --remove-original

This way you will be streaming the backup to S3 and to /tmp/backup via a FIFO file.


Thanks, Marcelo, for following up 🙂

So streaming the database to S3 works with my command and xbcloud put.

What about restoring the database from S3 on the fly, is that possible with the xbstream format when it is encrypted and compressed?

Could you show me the whole set of commands with “xbcloud get” then?


Hi Arnoldasb.

What about restoring the database from S3 on the fly, is that possible with the xbstream format when it is encrypted and compressed?

I’m not sure what you mean by on the fly here. If you mean in a single command, no. The prepare phase cannot read from stdin, as it has to read multiple files in a specific order. But you can wait until the backup has been streamed to S3 and then pull the backup files and restore them (two commands):

xbcloud get --parallel=8 s3://mybucket/test-xbcloud | xbstream -x -C /tmp/backup

xtrabackup --decrypt=AES256 --encrypt-key-file=mysecretkey --decompress --target-dir=/tmp/backup --remove-original
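For completeness, after that you would still run the usual prepare and copy-back steps (a minimal sketch, assuming /tmp/backup now holds the fully decrypted and decompressed files and that the MySQL datadir configured in my.cnf is empty):

# apply the redo log so the datafiles are consistent
xtrabackup --prepare --target-dir=/tmp/backup

# copy the prepared files into the empty datadir
xtrabackup --copy-back --target-dir=/tmp/backup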

The way you are doing it, you are doubling the network transfer, as you first upload and then download. With a FIFO file you first write the backup locally so it is ready in case you need it, while it also uploads to S3. By the way, in my previous reply I missed the part of the command that simultaneously sends data to the FIFO file and to xbcloud via the tee command:

mkfifo /tmp/xtrabackup.fifo

xbstream -x < /tmp/xtrabackup.fifo -C /tmp/backup &

xtrabackup --encrypt=AES256 --encrypt-key-file=mysecretkey --backup --compress --stream=xbstream --extra-lsndir=/tmp --target-dir=/tmp | tee /tmp/xtrabackup.fifo | xbcloud put --parallel=8 --storage=s3 --s3-bucket=mybucket test-xbcloud

xtrabackup --decrypt=AES256 --encrypt-key-file=mysecretkey --decompress --target-dir=/tmp/backup --remove-original


Yeah, that’s exactly what I came up with for restoring the backup from S3 with these two commands.

xbcloud get --parallel=8 s3://mybucket/test-xbcloud | xbstream -x -C /tmp/backup

xtrabackup --decrypt=AES256 --encrypt-key-file=mysecretkey --decompress --target-dir=/tmp/backup --remove-original

For the prepare phase, I understand that it’s a separate step. I just originally wondered whether it is possible to pipe the above commands as a one-liner; xbstream -x was the gotcha that forces those commands to be separated.

I thought to myself: if you can dump | stream | encrypt | compress | upload straight to S3, maybe it is possible to do the same in reverse when restoring, avoiding the extra I/O of two separate commands by piping “download | decrypt | decompress” so the stream is decrypted and decompressed straight into the original files as it downloads. But I guess right now that’s just on a wishlist; as it currently works, you first need to download everything before you can decrypt and decompress it, and it cannot be achieved through streamed pipes, right?
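Something like this hypothetical one-liner is what I am imagining (the decrypt/decompress options on xbstream below do not exist today):

# hypothetical: xbstream cannot currently decrypt/decompress while extracting
xbcloud get --parallel=8 s3://mybucket/test-xbcloud | xbstream -x --decrypt=AES256 --encrypt-key-file=mysecretkey --decompress -C /tmp/backup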


You are correct. This might be possible to implement, as --decompress and --decrypt could potentially read from stdin and thus be used alongside xbstream. Would you mind raising this as a feature request under https://jira.percona.com/projects/PXB/ ?


Thanks, Marcelo, for the assistance and guidance 🙂 This is the first time I am doing this (newbie), so pardon me if something needs to be done differently.

I have filed the feature request and linked this discussion as well.

https://jira.percona.com/browse/PXB-2385


Can I upload an on-premise backup to AWS S3 using xbcloud, and can I also download from AWS S3 to an EC2 instance using xbcloud?
If yes, is there any limitation in this entire process?
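For reference, this is the flow I have in mind, reusing the commands from earlier in this thread (bucket name, backup name, and extraction directory are placeholders):

# on the on-premise server: stream the backup straight to S3
xtrabackup --backup --stream=xbstream --target-dir=/tmp | xbcloud put --parallel=8 --storage=s3 --s3-bucket=mybucket onprem-backup

# on the EC2 instance: pull the backup down and extract it
xbcloud get --parallel=8 s3://mybucket/onprem-backup | xbstream -x -C /tmp/backup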
