Creating an index on a large table (60 million rows)

Hi all,
Can somebody point me in the right direction for creating an index on a large table (about 60 million rows)? Right now I am stuck with my database and can't do anything with the data.

I was trying to create an index on the datetime field so that afterwards I could chunk the data by the last 6 months and increase performance.

I tried to create the index on the table in place, but it ran for a week. After that the database crashed; I restored it from my backup, which took another week.

I found an article on the Internet where someone had done some research. It is not mentioned in the MySQL documentation, but when you add an index to a large table, MySQL copies all the data into a temporary table and rebuilds all the existing indexes (I have 10 of them) again.

His recommendation was to create a new table with the appropriate index, run “INSERT INTO new_tbl SELECT * FROM old_tbl;”, then drop the old table and rename the new one.
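For illustration, the new-table/copy/rename approach looks roughly like this. This is only a sketch: it uses Python's built-in SQLite as a stand-in for MySQL, and the table and column names (old_tbl, new_tbl, created_at, payload) are invented, not taken from the poster's schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stand-in for the existing, unindexed table.
cur.execute("CREATE TABLE old_tbl (id INTEGER PRIMARY KEY, created_at TEXT, payload TEXT)")
cur.executemany("INSERT INTO old_tbl VALUES (?, ?, ?)",
                [(i, f"2008-0{1 + i % 9}-01", f"row {i}") for i in range(100)])

# 1. Create the new table with the index already in place.
cur.execute("CREATE TABLE new_tbl (id INTEGER PRIMARY KEY, created_at TEXT, payload TEXT)")
cur.execute("CREATE INDEX idx_created_at ON new_tbl (created_at)")

# 2. Copy everything across (note the FROM keyword the quoted query was missing).
cur.execute("INSERT INTO new_tbl SELECT * FROM old_tbl")

# 3. Drop the old table and rename the new one into its place.
cur.execute("DROP TABLE old_tbl")
cur.execute("ALTER TABLE new_tbl RENAME TO old_tbl")
conn.commit()

print(cur.execute("SELECT COUNT(*) FROM old_tbl").fetchone()[0])
```

On a 60-million-row table the single INSERT … SELECT is one giant transaction, which is exactly why it can run for days; the batched approach suggested in the replies avoids that.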

I tried it, but it hadn't finished after half a week, so I stopped it. I need to understand what is wrong with my database. Is there any way to improve the performance?

P.S.
MySQL 5.x is running in a local Solaris 10 zone.
The box has 4 GB of RAM.

Thanks in advance.

Create a PHP script which will:
a) fetch a batch of rows from the old table
b) insert them into the new table with the indexes (as a multi-row insert or a delayed insert)
c) delete the fetched rows from the old table (or remember which rows were fetched)
Check that it runs without errors.
After that you should add

          * * * * * /path/to/bin/php /path/to/script.php

to the crontab.

Cron will run this script every minute. The migration may take a while, but you will be able to kill the importing script at any point without hurting the database.

Add more RAM, use faster disks…