The subject says it all, really. I am curious whether, on relatively large tables, there is a better way to get ‘paged’ data than using the LIMIT clause.
For example, I have a comments table containing around 300k comments. The table structure is:
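Something like this, simplified from memory (I’m calling the main table comment here and only showing the columns relevant to paging):

    CREATE TABLE comment (
        comment_id   INT UNSIGNED NOT NULL AUTO_INCREMENT,
        item_ref     INT UNSIGNED NOT NULL,   -- the item being commented on
        date_created DATETIME     NOT NULL,   -- comments are paged on this
        PRIMARY KEY (comment_id),
        KEY item_ref (item_ref)
    );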
The actual comment text is held in comment_body:
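Again roughly (the body column’s name and exact type are approximate):

    CREATE TABLE comment_body (
        comment_id   INT UNSIGNED NOT NULL,   -- same value as comment.comment_id
        comment_body TEXT         NOT NULL,   -- the comment text itself
        PRIMARY KEY (comment_id)
    );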
And finally I’ve a comment_stats table:
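Which just holds the per-comment counters, along these lines (total_views is my stand-in name for the counter column):

    CREATE TABLE comment_stats (
        comment_id  INT UNSIGNED NOT NULL,             -- same value as comment.comment_id
        total_views INT UNSIGNED NOT NULL DEFAULT 0,   -- bumped on every view
        PRIMARY KEY (comment_id)
    );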
I split the ‘total views’ out into a separate table because the stats table receives a large number of update requests, whereas the main comment and comment_body tables are heavily read but rarely modified.
Keeping the comment text itself out of the main comment table keeps its row format fixed-length. comment_body is LEFT JOINed to comment_ref on comment_id.
comment_id is my primary key (auto inc, unsigned)
To page through the comments, which are displayed ordered by date_created, I’m using LIMIT X,Y to show 50 comments per page. The ‘item_ref’ field links each comment to the item being commented on, so some items may only have a couple of comments while others have thousands.
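So a typical page request ends up running something along these lines (column list trimmed, newest-first assumed; page 2 shown):

    SELECT c.comment_id, c.date_created, cb.comment_body
    FROM comment AS c
    LEFT JOIN comment_body AS cb ON cb.comment_id = c.comment_id
    WHERE c.item_ref = :item_ref
    ORDER BY c.date_created DESC
    LIMIT 50, 50;   -- page 2: skip the first 50 rows, return the next 50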
I program with PHP 5 (PDO extension), and I have to run my query twice: once with a COUNT(*) to get the total number of matching rows, and then again with LIMIT to get the block of rows I actually want to display. EXPLAIN shows that MySQL examines ALL the comments matching item_ref, even though only the 50 I require are returned.
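That is, for each page I first run the count (same WHERE clause) just to be able to build the pager links:

    SELECT COUNT(*) AS total_comments
    FROM comment
    WHERE item_ref = :item_ref;

and then the paged SELECT above to fetch the rows themselves.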
Is there a better way to do this? I’m perfectly happy to move the query into a stored procedure if it will help (and use a cursor to loop through the data?), but is the problem more intrinsic than that? Perhaps the solution isn’t MySQL-based at all; maybe I should just be caching, say, the first few pages’ worth of results for each item and not worry about the MySQL hit.
Either way, I’d truly appreciate your advice.