large table - index creation

I am running MySQL version 5.0.24 and have a table with over 100 million records. A slimmed-down version of the InnoDB table looks like the following:

Individual_id varchar(10),
Family_id varchar(10),
Value varchar(100)

The goal was to create a table that could be used to generate a list of values by individual and by family id. So, each of the following queries would produce the list of values that I’m looking for.

select individual_id, group_concat(value)
from table
group by individual_id

example: individual1, value1, value2, value3, …, valueN

select family_id, group_concat(value)
from table
group by family_id

example: family1, value1, value2, value3, …, valueN

I’ve imported all the data, but creating indexes on the Individual_id and Family_id columns is taking a very long time. Individual_id has a cardinality of about 20,000, while Family_id has a cardinality of about 4,000. The index creation has now been running for over 4 hours. Can anyone suggest a more efficient approach to creating these indexes? In hindsight, I probably should have converted individual_id and family_id to integer values. Should I cut my losses and make that switch before creating the indexes?
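For reference, here is a sketch of the index build (the table and index names are assumptions, since the real names aren’t shown). Adding both indexes in a single ALTER TABLE means the 100-million-row table is rebuilt only once instead of once per index:

```sql
-- Hypothetical table/index names; one ALTER statement builds
-- both secondary indexes in a single table rebuild.
ALTER TABLE value_list
    ADD INDEX idx_individual (Individual_id),
    ADD INDEX idx_family (Family_id);
```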

Thanks in advance.

Are these Innodb tables or MyISAM tables?

If MyISAM, do a “SHOW PROCESSLIST” and see if the
MySQL thread is stuck in the “Repair with keycache” state. It should
be in the “Repair by sorting” state instead. Make sure
key_buffer_size is big enough to get around this.
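For example (a sketch; the 512 MB figure is just an illustrative value, to be sized against available RAM):

```sql
-- Illustrative size; key_buffer_size is a global variable and can be
-- raised at runtime before rebuilding MyISAM indexes.
SET GLOBAL key_buffer_size = 512 * 1024 * 1024;
```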

I missed that you said they were Innodb tables…

Maybe make innodb_buffer_pool_size and sort_buffer_size
as big as possible; that should help a lot. You can
set sort_buffer_size dynamically.

SET sort_buffer_size=1000000

Slight correction… if MyISAM, make sure myisam_sort_buffer_size
is also big.
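For example (a sketch; the 256 MB figure is just an illustrative value):

```sql
-- Illustrative size; myisam_sort_buffer_size governs the buffer used
-- when repairing/rebuilding MyISAM indexes by sorting.
SET GLOBAL myisam_sort_buffer_size = 256 * 1024 * 1024;
```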