Large files with mk-query-digest

I have two slow query log files: one is 6GB, the other 20GB. I'd like to run them through mk-query-digest; any suggestions? I've had one running for over an hour already.

Thanks

You can split the source files with the 'split' command, run mk-query-digest on each split file to produce one result per file, and then use mk-merge-mqd-results (http://www.maatkit.org/doc/mk-merge-mqd-results.html) to merge the results into a single report.
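A rough sketch of that pipeline, assuming GNU split and bash; the file names and the 250M chunk size are placeholders, not something from the thread:

    # Split the big log into ~250MB chunks. -C (--line-bytes) keeps whole
    # lines together, so query entries are not cut mid-line; an entry that
    # spans a chunk boundary can still be lost, which is usually tolerable
    # for aggregate analysis.
    split -C 250M slow.log chunk_

    # Digest each chunk, saving intermediate results for the merge step.
    # A --save-results file name ending in .gz is written gzip-compressed,
    # matching the .gz files mentioned later in this thread.
    i=1
    for chunk in chunk_*; do
        mk-query-digest --save-results "res$i.gz" "$chunk" > "report$i.txt"
        i=$((i + 1))
    done

    # Merge the saved results into one combined report.
    mk-merge-mqd-results res*.gz > combined-report.txt

Each per-chunk run is independent, so with spare cores you can run several mk-query-digest processes in parallel before the final merge.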

Thanks, the split command worked well.

I now have 76 files of 250MB each.

I processed each of them through mk-query-digest with the --save-results option, which generated 76 .gz files.

I then ran mk-merge-mqd-results res1.gz … res76.gz, but it failed with this error:

Error merging class/sample: undefined min value at ./mk-merge-mqd-results line 2788

I guess some of the files have problems, though the composite report is ultimately still generated despite the error.
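Since the merge keeps going after the warning, one crude way to find the offending input is to add the result files one at a time and watch for the first appearance of the message; a sketch, assuming the warning is printed to stderr (it re-merges from scratch each round, so it is slow but simple):

    # Add one result file per iteration; the first iteration that emits
    # the "undefined min value" message identifies a problematic file.
    merged=""
    for f in res*.gz; do
        merged="$merged $f"
        # Send stderr into the pipe and discard the merged report itself.
        if mk-merge-mqd-results $merged 2>&1 > /dev/null \
               | grep -q 'undefined min value'; then
            echo "message first appears after adding: $f"
            break
        fi
    done

A file it flags can then be regenerated from its chunk, or trimmed down into the reproducible test case requested below.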

That seems to be a bug. Can you generate a reproducible test case and submit it to Maatkit's bug tracker on Google Code?