I am organizing a database for all the experiments on microarray chips. There are about 230,000 probes, each of which will have a different reading that needs to be stored in the database. We will perform multiple experiments, each of which will output a huge amount of data. I wonder which setup would give me the best performance when updating and querying the data.
1. One wide table with 230,000 rows, where a new column is added for each experiment we perform. I think this will make every update painful.
2. One long table with fixed columns, where only the rows increase, one row per probe per experiment. That means we will have a LOT of rows (around 4.6 million) by the end of the 20th experiment. This makes updating easier, but I'm not sure it is recommended.
3. A separate table for each experiment. This seems to be the easiest, but is it the optimal solution?
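To make the second option concrete, here is a rough sketch of the fixed-column "long" layout using SQLite; the table and column names (`readings`, `experiment_id`, `probe_id`, `reading`) are made up for illustration:

```python
import sqlite3

# Sketch of the "long table" layout (option 2): one row per
# (experiment, probe) reading, with a fixed set of columns.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE readings (
        experiment_id INTEGER NOT NULL,
        probe_id      TEXT    NOT NULL,
        reading       REAL,
        PRIMARY KEY (experiment_id, probe_id)
    )
""")
# Secondary index so lookups by probe across experiments stay fast.
conn.execute("CREATE INDEX idx_readings_probe ON readings (probe_id)")

# Loading a new experiment is a plain batch insert; no schema change needed.
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [(1, "probe_0001", 0.42), (1, "probe_0002", 1.87),
     (2, "probe_0001", 0.39), (2, "probe_0002", 2.01)],
)

# One probe's readings across all experiments is a simple indexed query.
rows = conn.execute(
    "SELECT experiment_id, reading FROM readings "
    "WHERE probe_id = ? ORDER BY experiment_id",
    ("probe_0001",),
).fetchall()
print(rows)  # [(1, 0.42), (2, 0.39)]
```

The appeal of this shape is that adding experiment 21 touches no schema, only data, whereas the wide-table option (1) requires an `ALTER TABLE` per experiment.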
Thank you for any help!