Hi all.
I’m running some benchmarks at the moment, using a smaller subset of (what will likely be) a “real” production data set, for a couple of reasons: 1. it doesn’t take days to load; 2. my benchmark queries can complete in an overnight run (as opposed to taking weeks).
The problem is that, although I scale down the MySQL caches so that only a realistic proportion of my data fits in them, the OS disk cache just keeps growing and growing until it uses most of the machine’s 32GB of RAM. This makes my benchmark results unrealistic, because far more of the data ends up in memory than would in production.
Does anyone know of a way to limit the disk cache? I’ve tried dropping it entirely (echo 1 > /proc/sys/vm/drop_caches), but this really kills performance, and is unrealistic in a different way.
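For reference, here is a sketch of what I’ve been doing, with the drop_caches variants as documented in the kernel’s Documentation/sysctl/vm.txt (writing to the file requires root, so the sketch guards on that):

```shell
# Drop the Linux disk caches via /proc/sys/vm/drop_caches.
# Writing requires root, so skip otherwise.
if [ "$(id -u)" -eq 0 ]; then
    sync                                 # flush dirty pages first so they become droppable
    echo 1 > /proc/sys/vm/drop_caches    # free page cache only
    # echo 2 > /proc/sys/vm/drop_caches  # free dentries and inodes (reclaimable slab)
    # echo 3 > /proc/sys/vm/drop_caches  # free both
fi
```

Even with the sync first, the next run starts completely cold, which is the opposite extreme from the all-hot cache I’m trying to avoid.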
Alternatively, does anyone know of a way around this problem? What do people typically do when they want realistic benchmarks but don’t want to use huge data sets?
At this point I’m considering buying a new test server and removing RAM from it.
Thanks.
Neil.