I have a large table (MyISAM, 65+ million records) built from the logs of a security product. The logs contain:
Date, Time, IP address, Userid, Process name, File name, Event code, Sub code
I then run detailed queries against this data for research. The machine is not networked, so there is only ever one user: me. However, it has become very slow for some of the queries I am running now. I have added a column containing the MD5 hash of the Process name so that I can query on process name faster.
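To show what I mean, the hash-column idea looks roughly like this (a minimal sketch using Python with SQLite as a stand-in for MySQL/MyISAM; the table and column names are invented):

```python
import hashlib
import sqlite3

# Hypothetical schema mirroring the log fields; SQLite stands in for MySQL.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE logs (
           log_date TEXT, log_time TEXT, ip TEXT, userid TEXT,
           process_name TEXT, file_name TEXT,
           event_code INTEGER, sub_code INTEGER,
           process_md5 TEXT  -- 32-char hex MD5 of process_name
       )"""
)
# Index the fixed-width hash instead of the long, variable-length path.
conn.execute("CREATE INDEX idx_process_md5 ON logs (process_md5)")

def md5_hex(s: str) -> str:
    """Hex MD5 digest of a string, matching MySQL's MD5() output."""
    return hashlib.md5(s.encode("utf-8")).hexdigest()

proc = r"C:\Windows\System32\dhdfgg.dll"
conn.execute(
    "INSERT INTO logs (process_name, process_md5) VALUES (?, ?)",
    (proc, md5_hex(proc)),
)

# Look up by the short indexed hash rather than the raw path string.
rows = conn.execute(
    "SELECT process_name FROM logs WHERE process_md5 = ?",
    (md5_hex(proc),),
).fetchall()
print(rows[0][0])
```

The point of the hash column is that equality lookups hit a compact, fixed-width index instead of comparing long path strings.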
I want to start again, but I cannot see a logical way to split the table or derive other tables from it. Any advice? Surely this has been done before with things like web log tables? For instance, the URL field in a web log is very similar to my Process name field, hence the MD5 hash approach. (Example Process value: C:\Windows\System32\dhdfgg.dll)
I have several fast machines available and wonder how I could split the table across several servers. Depending on the query, I am bound on CPU and/or reads, so I was hoping to spread the work across several machines.
Thanks very much