Wondering if anyone has any recommendations around running ANALYZE TABLE as part of archiving.
We are archiving a large chunk of data from our production system (almost 60M records) across 100 different tables. We plan to do this in batches and archive based on date (e.g. for each month in the past). The tables are a mix of InnoDB and MyISAM, so I'm wondering if we are better off running ANALYZE TABLE after all the data has been archived, at every step (i.e. after each month), or even more often than that?
we obviously don’t want to lock down the production tables for long!
Hi,
Actually, when to run ANALYZE TABLE depends on what percentage of the dataset has changed. ANALYZE TABLE recalculates the table and index statistics that the optimizer needs for good query performance. During the analysis, the table is locked with a read lock for both InnoDB and MyISAM tables.
If you are archiving a large chunk of data, you should run ANALYZE TABLE after archiving each table rather than waiting until all the data has been archived. That way, if the archive process takes a long time, at least some of the tables will already have fresh statistics, and queries against those tables won't suffer a performance hit in the meantime.
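The per-table sequence suggested above can be sketched as follows. This is only an illustration of the ordering (archive one table, then immediately ANALYZE it before moving on); the table names, the `archive_` prefix, and the `created < :cutoff` predicate are all hypothetical placeholders for whatever your real archive job uses.

```python
def archive_then_analyze(tables):
    """Yield the SQL statements for archiving one table at a time,
    running ANALYZE TABLE immediately after each table finishes so
    its statistics are refreshed before the next table starts."""
    for table in tables:
        # hypothetical archive step: copy old rows out, then delete them
        yield f"INSERT INTO archive_{table} SELECT * FROM {table} WHERE created < :cutoff"
        yield f"DELETE FROM {table} WHERE created < :cutoff"
        # refresh optimizer statistics right away, per table
        yield f"ANALYZE TABLE {table}"

statements = list(archive_then_analyze(["orders", "invoices"]))
```

Each table's ANALYZE runs while the archive of the next table is still pending, so queries against already-archived tables see accurate statistics as early as possible.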
Thanks, this definitely makes sense. However, my question was more about each individual table: if we are archiving 10 years of data, and each year accounts for about 5 to 10% of the data, should we analyze the table after archiving each year, or only after all the years are done?