
Large files with mk-query-digest

iberkner
I have two slow query log files: one is 6GB, the other 20GB. I'd like to run them through mk-query-digest; any suggestions? I've already had one running for over an hour.

Thanks

Comments

  • justin.swanhart
    You can split up the source files using the 'split' command, run mk-query-digest on the split files to produce one report per file, then use http://www.maatkit.org/doc/mk-merge-mqd-results.html to merge the reports together (a sketch of this workflow appears after the comments below).
  • iberkner
    Thanks, the split command worked well.

    I now have 76 files of about 250MB each.

    I processed each of them through mk-query-digest using the --save-result option, which generated 76 .gz files.

    I then ran mk-merge-mqd-results res1.gz ... res76.gz, but it failed with this error:

    Error merging class/sample: undefined min value at ./mk-merge-mqd-results line 2788

    I'm guessing there are problems with some of the files; the composite report is ultimately still generated, though.
  • xaprb
    That seems to be a bug. Can you generate a reproducible test case and submit it to Maatkit's bug tracker at http://code.google.com/p/maatkit/issues/list? (See the isolation sketch after the comments below.)
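
For reference, a minimal sketch of the split/digest/merge workflow described in the comments above, assuming GNU split and a source file named slow-query.log. The filenames are illustrative, the piece size follows the 250MB figure mentioned above, and the exact name of the save option (written above as --save-result) should be confirmed with mk-query-digest --help for your Maatkit version.

    # Split the 20GB slow log into roughly 250MB pieces. A byte-based split
    # can cut one query event in half at each boundary; mk-query-digest
    # normally just skips the partial event.
    split -b 250M slow-query.log slow-part-

    # Digest each piece, saving an intermediate result file per piece.
    # (The save option is assumed to take the output filename, matching the
    # .gz files described above.)
    for f in slow-part-??; do
        mk-query-digest --save-result "$f.gz" "$f" > "$f.report"
    done

    # Merge the intermediate results into one combined report.
    mk-merge-mqd-results slow-part-??.gz > combined-report.txt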
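
As for the reproducible test case requested above: assuming the "undefined min value" message is tied to particular result files rather than to the order in which they are merged, one hypothetical way to narrow it down is to merge each saved result against a known-good one and flag the files that trigger the message.

    # Pair each result file with the first piece's result and flag the files
    # whose merge prints the error; both stdout and stderr are checked since
    # it is not clear which stream carries the message.
    for f in slow-part-??.gz; do
        if mk-merge-mqd-results slow-part-aa.gz "$f" 2>&1 | grep -q 'undefined min value'; then
            echo "$f triggers the merge error"
        fi
    done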
