I store my full backups encrypted and compressed, and now I want to retrieve only a specific database. Looking at the documentation (The xbcloud Binary) about partial downloads, it's not clear to me which files I have to pass to restore a specific database from a full backup when I need to gather the files with xbcloud.
Obviously I pass the name of the database (which matches the folder name that contains its files) and ibdata1. Do I need anything else?
So it gives this command line:
xbcloud get --parallel=10 --swift-container=db_full_20210303 ibdata1 my_database
Do I need anything else?
Edit: actually xbcloud does not retrieve ibdata1 or the database files with the above command. It downloads nothing at all (but --verbose shows me it is looking for the files).
Since the files are compressed and encrypted during backup, they end with .qp.xbcrypt (plus the chunk suffix 0000000000000xx). Does that affect the way I should pass the filenames to xbcloud?
Using --verbose I can see the HTTP requests; xbcloud is browsing the files:
GET /v1/AUTH_xxxxxx/full_db_backup_20210303?format=json&limit=10&marker=mysql_full_2021030302%2Fibdata1.qp.xbcrypt.00000000000000000013&prefix=mysql_full_2021030302%2F HTTP/1.1
The way to restore files is either to pull all the files from a folder, or to manually specify every file you want. For example, if I want ibdata1 and a single file from a database (here named m1/a.frm), we have to use:
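(The original command is not preserved in this thread; the following is a sketch based on the documented xbcloud get <backup-name> <files...> form, with a placeholder container and restore path, and marce_test assumed to be the backup name. Adjust the storage flags to your Swift or S3 setup.)
# sketch only: <your-container> and /path/to/restore are placeholders
xbcloud get --storage=swift --swift-container=<your-container> \
    marce_test ibdata1 m1/a.frm --parallel=10 2>download.log \
  | xbstream -xv -C /path/to/restore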
The way xbcloud works: it connects, enters the folder you specify (in this last example, marce_test/m1), lists all the files, and if you have specified a list of files it pulls those files; if no files are specified, it downloads all the files in that folder and its subfolders. In this last example, it downloaded all the files from the m1 database.
Also, please note that you will need xtrabackup_checkpoints, xtrabackup_info and xtrabackup_logfile in order to prepare the backup.
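Once the files are downloaded and extracted, a minimal sketch of the prepare step (assuming a qpress-compressed, AES256-encrypted backup; the key and target directory are placeholders):
# decrypt and decompress in place (placeholder key; .qp files require qpress)
xtrabackup --decrypt=AES256 --encrypt-key='MY_SECRET_KEY' --decompress --target-dir=/restore/partial
# prepare the partial backup; --export generates the per-table metadata needed
# to import individual tablespaces into a running server
xtrabackup --prepare --export --target-dir=/restore/partial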
Perhaps adding the information about the need to download xtrabackup_checkpoints, xtrabackup_info and xtrabackup_logfile would be a nice addition to the man page / documentation, to give more information to other users.
About your mention that I have to match the exact filename when downloading files (hence that xbcloud doesn't perform globbing or regexp matching): I discovered that after my second post. However, I'm unable to download, for instance, ibdata1 from a compressed and/or encrypted backup.
I've passed ibdata1.qp.xbcrypt and nothing got downloaded in the end; the same applies to folders.
Is this configuration supported? I'm using Swift storage, if it matters.
So I launch a partial download this way on a cloud backup which is only compressed (so the files end with the .qp extension):
xbcloud get all_2021030101 --swift-container=database_backup_2021030101 ibdata1.qp
I can see that xbcloud finds the chunks of the file ibdata1.qp:
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000009&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000019&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000030&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000042&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000052&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000062&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000072&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000082&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000092&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000102&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000112&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000122&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000132&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000142&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000152&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000163&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000173&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000183&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000193&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000203&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000213&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000223&prefix=all_2021030101%2F HTTP/1.1
GET /v1/AUTH_7542981824cf4bcb883cf4c4321195ae/database_backup_2021030101?format=json&limit=10&marker=all_2021030101%2Fibdata1.qp.00000000000000000233&prefix=all_2021030101%2F HTTP/1.1
However, the file is not downloaded. The same applies if I try to match a directory, as you suggested previously.
However, Swift uses a different implementation. I currently don't have access to any Swift container. If I sent you my public GPG key, would you be able to create a temporary access key/secret and upload a test backup so I can test?
In that case, please open a JIRA bug. The bug validation team will check that against Swift.
For now, as a workaround, please download all the files in order to restore your backup.
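For example, a full-download sketch reusing the container and backup names from the log above (the restore directory is a placeholder; add your usual Swift authentication options or defaults file):
xbcloud get --storage=swift --swift-container=database_backup_2021030101 \
    all_2021030101 --parallel=10 2>download.log | xbstream -xv -C /restore/full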
Fast-forwarding to today: I found that Swift has an S3 API, so now I use that.
This time I can download single files if I give their names (like xbcloud get full --s3-bucket my-bucket xtrabackup_binlog_info.qp.xbcrypt), but not a whole directory (xbcloud get full --s3-bucket my-bucket my_db), just as with Swift.
Is this case supposed to work as-is, or is this a bug?