Joining many tables vs. executing many queries

In my experience it has always been better to include as much information as possible in one statement. So I have sometimes created really complex queries across 5-10 tables and joined them (also using left joins). My question is this:

If only a few of the tables are actually restricted in the where clause, is it faster to join only those restricting tables and then execute additional queries for each record in the limited result set, or does MySQL optimize such a thing?

For example:

    select t1.*, t2.*, t3.*, t4.*
    from t1
    left join t2 on t1.id = t2.t1Id
    left join t3 on t1.id = t3.t1Id
    left join t4 on t1.id = t4.t1Id
    where t1.value = 'xy' and t2.otherValue = 'foo'
    limit 0, 20;

as opposed to:

    select t1.*, t2.*
    from t1
    left join t2 on t1.id = t2.t1Id
    where t1.value = 'xy' and t2.otherValue = 'foo'
    limit 0, 20;

and then fetch the values for t3 and t4 separately for each row.

This may again seem a strange question, as the first statement makes much more sense... but I was wondering whether MySQL actually optimizes this itself and takes the limit into consideration. It would only need to fetch the data from t3 and t4 for the first 20 rows found to match the criteria...
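One way to investigate is to compare the EXPLAIN output of both variants; as a sketch (table and column names are the hypothetical ones from the example above), it shows the join order, the access type per table, and the estimated rows examined:

```sql
-- Plan for the full four-table join:
explain
select t1.*, t2.*, t3.*, t4.*
from t1
left join t2 on t1.id = t2.t1Id
left join t3 on t1.id = t3.t1Id
left join t4 on t1.id = t4.t1Id
where t1.value = 'xy' and t2.otherValue = 'foo'
limit 0, 20;

-- Plan for the reduced join (with t3/t4 fetched per row afterwards):
explain
select t1.*, t2.*
from t1
left join t2 on t1.id = t2.t1Id
where t1.value = 'xy' and t2.otherValue = 'foo'
limit 0, 20;
```

Note that EXPLAIN reports the plan, not how LIMIT cuts off the actual row fetching, so timing both variants on real data is still the decisive test.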

Any thoughts on this?

If MySQL optimizes the query well, it is better to leave it to do its job, meaning it performs less work than you would doing it manually: row examination, IO, sorting, etc.

In many simple cases MySQL does a better job by itself; there are, however, cases where you can do a better job manually. Examples:

  1. A constant IN subselect. Replacing it with a previously selected list of constants, such as IN (5, 2345, 8909), may perform much better due to problems with the MySQL optimizer.
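As a sketch of that rewrite (reusing the hypothetical t1/t2 schema from the question; the returned id values are made up for illustration):

```sql
-- Form that can be slow: older MySQL versions may execute the
-- subselect as a dependent subquery, once per row of t1.
select * from t1
where t1.id in (select t2.t1Id from t2 where t2.otherValue = 'foo');

-- Manual rewrite: run the subselect once up front...
select t2.t1Id from t2 where t2.otherValue = 'foo';
-- ...suppose it returns 5, 2345, 8909; then inline the result
-- as a constant list, which the optimizer handles well:
select * from t1 where t1.id in (5, 2345, 8909);
```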

  2. A complex join touching only a portion of some of the joined tables. It may be faster to preload that portion into a temporary MEMORY table before performing the join. This is because MySQL, as of version 5.0, still does not have a hash join, and the nested-loop method may require too many random IO requests.
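A sketch of that preloading idea, again borrowing the hypothetical t1/t2 schema from the question:

```sql
-- Preload only the slice of t2 we actually need into an in-memory
-- temporary table, so the join touches far fewer rows.
create temporary table t2_slice engine=memory
select t1Id, otherValue
from t2
where otherValue = 'foo';

-- Join against the small preloaded slice instead of the full table.
select t1.*, s.otherValue
from t1
join t2_slice s on t1.id = s.t1Id;

drop temporary table t2_slice;
```

Keep in mind that the MEMORY engine does not support BLOB/TEXT columns, so only fixed-width columns belong in the preloaded slice.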