I have a statistical database that contains a lot of data for all of the different hardware/software in our organization. The database is partitioned by week into MyISAM tables, running the latest MySQL 5.1 release (installed on Monday).
I'm migrating everything to a new database and starting to see some behaviors I need to plan for, so that everyone can query the data in a timely manner while bulk inserts are running.
There are a lot of LOAD DATA INFILE queries that run throughout the day, and some users write very long-running queries that block those loads from happening. I've thought about several options (short of smacking users) and need some opinions.
- Create a BLACKHOLE-engine front-end database that handles all of the inserts and replicates to the main database for reading
- Switch to InnoDB (not sure whether this will help if LOAD DATA locks the entire table anyway)
- Load into a staging table in the main database and create an event that moves that data into the main table
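For what it's worth, the staging-table option can be sketched roughly like this (table and event names here are made up for illustration; the idea is that LOAD DATA only ever touches the staging table, so long-running reads on the main table never block the bulk load itself):

```sql
-- Hypothetical names: `stats` is the main table, `stats_staging` the load target.
CREATE TABLE stats_staging LIKE stats;

-- Requires the event scheduler: SET GLOBAL event_scheduler = ON;
-- (In the mysql client you'd need a DELIMITER change around the compound body.)
CREATE EVENT flush_stats_staging
ON SCHEDULE EVERY 5 MINUTE
DO
BEGIN
    -- Move accumulated rows into the main table, then empty the staging table.
    INSERT INTO stats SELECT * FROM stats_staging;
    TRUNCATE TABLE stats_staging;
END;
```

The INSERT ... SELECT step still contends with readers on `stats`, but it batches that contention into short, scheduled windows instead of holding things up during every load.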
Any suggestions are welcome.