Dear Community,
I need your advice :) In our company we have a database with lots of tables, and each table contains more than 500,000 rows.
My job is to create a backend interface where a user can select data via a filter and then keep the results in a new table for later use. So far so good, but I have run into the problem that handling this much data takes a lot of time.
For example:
CREATE TABLE searcg_xxtemp
( id bigint(20) unsigned NOT NULL auto_increment,
  name varchar(30) default NULL,
  pw varchar(30) default NULL,
  email varchar(255) default NULL,
  isanonym tinyint(4) NOT NULL default '0',
  lastlogindatum datetime NOT NULL default '0000-00-00 00:00:00',
  PRIMARY KEY (id),
  UNIQUE KEY name (name),
  KEY user_idkennung (id, kennung),
  FULLTEXT KEY pw (pw)
) ENGINE=MyISAM
SELECT * FROM user WHERE {Filterdata goes here}
This takes a lot of time, because in the worst case the SELECT statement returns 500,000 rows. What is the best and fastest way to handle this? I need a good solution for storing the temporary results for later use!
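One idea I have been playing with is to store only the ids of the matching rows instead of copying every column into the new table. This is just an untested sketch, and the name searchresult_xxtemp is only a placeholder, not a table we have:

-- Keep only the ids of the rows that match the filter
CREATE TABLE searchresult_xxtemp
( id bigint(20) unsigned NOT NULL,
  PRIMARY KEY (id)
) ENGINE=MyISAM
SELECT id FROM user WHERE {Filterdata goes here}

-- Later, join back against user to get the full rows for this saved search
SELECT u.*
FROM user u
JOIN searchresult_xxtemp r ON u.id = r.id;

Would something along those lines be noticeably faster, or is there a better pattern for this? Maybe you can give me a good hint :)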