
How to restore xbstream compressed backup

sfatula
I have a backup created via the stream method to xbstream, and compressed. The output is of course a single xbstream file, from which everything can be extracted via xbstream -x. But several things are missing here:

1. Is there any way to simply get a listing of all files in the file?
2. After running xbstream, the output is all of the compressed files, each of which can be extracted via qpress. But let's say all I have is the xbstream file. Is there any way I can restore a single file from it, without first running xbstream and then qpress on one of the many extracted files?
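For concreteness, the two-step restore I'm describing looks roughly like this. It's a minimal sketch: the paths are examples, and the guards just skip a step where the tool isn't installed.

```shell
# A minimal sketch of the two-step restore (example paths; the
# guards make each step a no-op where the tool is absent).
workdir=$(mktemp -d)
cd "$workdir"

# Step 1: unpack the whole stream. I know of no option that lists
# the contents or pulls out a single file without unpacking everything.
if command -v xbstream >/dev/null 2>&1; then
    xbstream -x < /backups/full.xbstream
fi

# Step 2: decompress every .qp file where it sits, then delete it.
if command -v qpress >/dev/null 2>&1; then
    find . -name '*.qp' -execdir qpress -d {} . \; -delete
fi
```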

Seems much less convenient than tar. But, maybe there are some methods I am missing here.

Comments

  • Eli Klein
    I'm curious if you've figured this one out. Restoring a compressed backup (independent of xbstream) shouldn't be this painful. Please let me know if you've either figured this out or if anyone has responded to you.

    Thanks!
  • sfatula
    Have not heard a word from anyone. It seems WAY less useful than tar.
  • knoxvilledba
    I wish I had an answer to your problem. But as soon as I ran into tar failing once the database got large enough, I tried the compress option, without stream.

    So what I am using is this:

    innobackupex --slave-info --parallel=2 --compress --compress-threads=2 /backups/

    I am having trouble finding an easy way to restore this other type of compressed backup as well. The compressed files are created with an archiver called qpress. I had never heard of this archiver before playing around with XtraBackup. Why qpress? gzip, anyone?

    The --copy-back option fails because it can't find the uncompressed versions of the files when it goes to copy them back, so you have to "unqpress" them before copying them back, which is not a very clean task. This also defeats the purpose of keeping the compressed files on a separate mount location or across the network.

    So does anyone have a good recipe yet for compressed backups of databases larger than 15G with XtraBackup?
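    A rough sketch of the decompress-before-copy-back workaround described above. The timestamped backup directory path is an assumption, and the step is skipped where qpress isn't installed.

```shell
# Decompress a --compress backup in place so --copy-back can find
# the plain files (example path; a no-op if qpress is missing).
backup_dir=/backups/2017-01-01_00-00-00
if command -v qpress >/dev/null 2>&1 && [ -d "$backup_dir" ]; then
    find "$backup_dir" -name '*.qp' -execdir qpress -d {} . \; -delete
fi
# After that, the usual --apply-log and --copy-back should work.
```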
  • sfatula
    Without xbstream it really isn't much different. xbstream merely adds one step to the restore, namely xbstream -x, which leaves you with the same scenario you are trying to resolve with qpress, which appears too limited.

    So, I had just done a restore to a machine. Had to

    1. xbstream -x
    2. Write a small script to visit each directory and run qpress -d on each .qp file. Of course, I had to FIND qpress first, and download and install it.
    3. apply-log, since this whole process is way too hard and time consuming to run daily once the backup is compressed.
    4. Stop MySQL
    5. copy-back
    6. Copy out stuff I did not restore (mysql database) since the directory must be empty
    7. chown everything in mysql datadir
    8. Start mysql

    Worked fine. Ideally, I'll need to write a script to do the whole thing so it's one command to do the restore. Definitely way harder than a simple tar -xzf.
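    A restore script along the steps above might look like the following sketch. Every path and the service name are assumptions, and the guard makes it a no-op where the XtraBackup tools are missing.

```shell
# End-to-end restore sketch following the numbered steps above
# (all paths and service names are assumptions -- adjust to taste).
stream=/backups/full.xbstream
workdir=$(mktemp -d)
datadir=/var/lib/mysql

if command -v xbstream >/dev/null 2>&1 && command -v innobackupex >/dev/null 2>&1; then
    cd "$workdir"
    xbstream -x < "$stream"                                 # 1. unpack the stream
    find . -name '*.qp' -execdir qpress -d {} . \; -delete  # 2. decompress each .qp
    innobackupex --apply-log "$workdir"                     # 3. apply the log
    service mysql stop                                      # 4. stop MySQL
    innobackupex --copy-back "$workdir"                     # 5. copy back (datadir must be
                                                            #    empty; step 6, saving what
                                                            #    you did not restore, is
                                                            #    still manual)
    chown -R mysql:mysql "$datadir"                         # 7. fix ownership
    service mysql start                                     # 8. start MySQL
fi
```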

    I'll have to revisit this process and see if I can avoid the compress step (and thus qpress), though when sending off site it's sort of nice since it transfers faster. I can see there might be ways to stream to gzip or other such techniques; I'll have to try them one day.
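    One such technique, sketched under the assumption that the tar stream format is available (the paths are examples, and each half is guarded so it does nothing where the tools or files are absent):

```shell
# Backup side: stream an uncompressed backup through gzip instead of
# using --compress/qpress (example paths; no-op without the tools).
if command -v innobackupex >/dev/null 2>&1; then
    innobackupex --stream=tar /tmp | gzip > /backups/full.tar.gz
fi

# Restore side: XtraBackup's tar stream needs tar -i
# (ignore zero blocks in the archive).
if [ -f /backups/full.tar.gz ]; then
    mkdir -p /restore
    gunzip -c /backups/full.tar.gz | tar -ixf - -C /restore
fi
```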

Copyright ©2005 - 2020 Percona LLC. All rights reserved.