I have the following table with 2 million records.
Books (BIBID, TITLE, AUTHOR, ISBN, LOC, CALLNO)
Users will be searching the table (via a PHP form) by either TITLE or AUTHOR.
Two sample queries:
SELECT * FROM (SELECT * FROM CATALOG2 WHERE TITLE LIKE '%xml%') AS CATALOGRESULT LIMIT 60, 20
SELECT * FROM (SELECT * FROM CATALOG2 WHERE AUTHOR LIKE '%tom%') AS CATALOGRESULT LIMIT 40, 20
(I use LIMIT because only 20 records are displayed on a single PHP page. When users click Prev or Next, a new query is executed that retrieves another 20 records.)
Please point me in the right direction to increase performance.
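To make the paging scheme above concrete, here is a small sketch of the offset arithmetic the two sample queries imply (assuming 1-based page numbers; the helper name `limit_clause` is hypothetical, not part of my code):

```python
def limit_clause(page, page_size=20):
    """Build the LIMIT clause for a given 1-based page number.

    MySQL's LIMIT takes (offset, row_count): skip `offset` rows,
    then return the next `row_count` rows.
    """
    offset = (page - 1) * page_size
    return f"LIMIT {offset}, {page_size}"

# Page 4 of 20-row pages skips 60 rows, matching the first sample query.
print(limit_clause(4))  # LIMIT 60, 20
```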
- I already use indexes, but they don't seem to be enough. Is there any specific type of index I should focus on?
- What about stored procedures and functions? (And what's the difference between the two?)
- Currently using MyISAM. Do we need to move to InnoDB or MEMORY?
- A good tutorial for creating a FULLTEXT index, if that's the best way to go.
- MySQL client version: 5.0.51b
- Server: Solaris 10 with 16GB RAM (the table is only 400MB, so we could even dedicate 1GB of RAM just to the table if there's a way to do that)
- This is a static table that doesn't get updated after we set up the system.
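For reference, a minimal FULLTEXT setup on the sample table might look like the sketch below (index names are mine; note that MySQL 5.0 supports FULLTEXT only on MyISAM tables, and the default `ft_min_word_len` of 4 would exclude 3-letter tokens such as "xml" unless lowered in my.cnf):

```sql
-- Sketch only: FULLTEXT requires MyISAM in MySQL 5.0.
CREATE FULLTEXT INDEX ft_title ON CATALOG2 (TITLE);
CREATE FULLTEXT INDEX ft_author ON CATALOG2 (AUTHOR);

-- MATCH ... AGAINST can use the index, unlike LIKE '%...%',
-- which forces a full table scan.
SELECT * FROM CATALOG2
WHERE MATCH (TITLE) AGAINST ('xml')
LIMIT 60, 20;
```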
Thanks in advance.