I have a large CSV file of about 2 million domain names from urlblacklist.com.
I have an app that queries the table a lot, and in order to look up the domain names quickly they need to be indexed. The table is InnoDB with columns for domain and categoryID; the combination of the two is the unique key.
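For reference, the table looks roughly like this (the column names are real; the types and lengths are approximations):

    CREATE TABLE blacklist (
        domain     VARCHAR(255) NOT NULL,  -- domain name from the CSV
        categoryID INT NOT NULL,           -- category the domain belongs to
        UNIQUE KEY uk_domain_category (domain, categoryID)
    ) ENGINE=InnoDB;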
When I try to load the 2 million records into the table with LOAD DATA INFILE, it gets to about 1.5 million rows and then the insert rate drops dramatically. I let it run for 12 hours but it didn't finish, so I gave up.
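The load itself is a plain LOAD DATA INFILE; the path and delimiters below are placeholders, but this is the shape of it:

    LOAD DATA INFILE '/path/to/domains.csv'
    INTO TABLE blacklist
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
    (domain, categoryID);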
My guess is that this is because it has to check the unique key on every insert, so as the number of records grows each insert gets progressively more expensive.
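One thing I was considering, based on the bulk-loading tips in the MySQL docs, is relaxing the checks for the duration of the load and wrapping it in a single transaction. Something like the sketch below, though I haven't verified it helps at this scale, and the docs warn that unique_checks = 0 can let duplicate rows in, so the file would need to be de-duplicated first:

    SET unique_checks = 0;       -- defer unique index checks during the load
    SET foreign_key_checks = 0;  -- skip FK checks (none here, but harmless)
    SET autocommit = 0;          -- load as one big transaction

    LOAD DATA INFILE '/path/to/domains.csv'
    INTO TABLE blacklist
    FIELDS TERMINATED BY ','
    (domain, categoryID);

    COMMIT;
    SET unique_checks = 1;
    SET foreign_key_checks = 1;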
Is there something I can do to make this work? Is MySQL just not the right tool? The Oracle buffs out there are telling me I'm simply on the wrong platform. Is there something I can do to use MySQL reasonably for this application?