Hi. I’m trying to optimise large imports into a big table with keys. Here is the story so far. Can anyone suggest ways of building a scalable solution?
I want to allow periodic, near-real-time imports of up to 100,000 rows into a table that already holds more than 2,000,000 rows. The table has two fulltext indexes.
I use LOAD DATA INFILE, which was quick until the number of rows in the table grew and the inserts slowed down, because of index rebuilding (I assume).
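For reference, the import statement is essentially this (the file path, table, and column names are placeholders, not the real ones):

LOAD DATA INFILE '/tmp/import_batch.csv'
INTO TABLE articles
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(title, summary, body);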
I then started disabling the keys before the import and re-enabling them afterwards, running the ENABLE KEYS step as a background process so the user didn’t have to wait for it.
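In outline the sequence is (table and file names are stand-ins, as above; the last statement is issued over a separate connection by the background process):

ALTER TABLE articles DISABLE KEYS;

LOAD DATA INFILE '/tmp/import_batch.csv' INTO TABLE articles;

-- fired off in the background once the load returns:
ALTER TABLE articles ENABLE KEYS;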
However, ENABLE KEYS locks the table while it rebuilds the indexes, which makes selects and other operations on that table impossible for the duration (currently about two minutes).
This is a problem. Can anyone suggest how I can get around it, either within my current method or with a different approach altogether that would be more scalable?
The table is MyISAM, with fulltext indexes covering 3 varchar columns.
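Schematically it looks something like this (column names changed for the post; it’s the index layout that matters):

CREATE TABLE articles (
  id INT UNSIGNED NOT NULL AUTO_INCREMENT,
  title VARCHAR(255) NOT NULL,
  summary VARCHAR(255) NOT NULL,
  body VARCHAR(255) NOT NULL,
  PRIMARY KEY (id),
  FULLTEXT KEY ft_title (title),
  FULLTEXT KEY ft_content (summary, body)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;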
My system variables look like this:
[mysqld]
max_connections=500
safe-show-database
ft_stopword_file=''
ft_min_word_len=3
default-character-set=utf8
init-connect='SET NAMES utf8'
query_cache_type=1
query_cache_limit=2M
query_cache_size=64M
bulk_insert_buffer_size=64M
key_buffer_size=128M
log-slow-queries
delay-key-write=ON