Does anyone know the best way to extract all the INSERTs and UPDATEs from a MariaDB binlog file of 1 GB in size? I have more than 20 binlog files, and I would like to extract the INSERT and UPDATE statements from each one.
Note: each binlog file is 1 GB, and I have 20 of them. I will take one at a time.
I have tried a few approaches, but I do not want the process to impact my server. Thank you.
To impact the server the least, the best approach is to copy the files to another server for processing.
Then, to inspect the content of the binary log, you need to use the mysqlbinlog tool, as explained in the following links:
Examining the content of the binary log will show everything inside the log, i.e. not only INSERTs and UPDATEs but also DELETEs and DDL statements. You will need to use grep and other Linux commands to filter out just what you need, as in the sketch below.
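For example, a minimal sketch of that filtering, assuming the file has already been copied off the production server (the paths are placeholders; with row-based logging, mysqlbinlog --verbose decodes row events into pseudo-SQL comment lines starting with "### ", which the pattern below also matches):

    # Decode one binlog copy and keep only INSERT/UPDATE statements.
    # The input path and output file name are placeholders.
    mysqlbinlog --verbose /backup/binlogs/mysql-bin.000001 \
      | grep -iE '^(### )?(INSERT|UPDATE)' \
      > inserts_updates.sql

Note that grep keeps only the first line of each statement; for row-based events, the decoded SET and WHERE clauses span several lines, so a small awk or scripting pass is needed if you want complete statements.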
Thank you @CTutte for looking at this. Now I'm trying to replay all the binary logs; they are 1 GB each, as I said earlier. When I replayed one, it took more than 2 hours to finish. Is there a better way to get this done? At that rate, if I have 40 of them, it is going to take days.
Also, I piped them into .sql files and found that each binlog was effectively larger than the 1 GB I expected: each one amounted to roughly 6 GB in my .sql file. This behavior is quite strange.
Note: Binary_transaction_compressed_level was not in use on the server these binary logs came from.
Rather than replaying each file manually with mysqlbinlog, consider setting up the destination server as a replica of the source. This way you can take advantage of parallel replication, start/stop control, replication filters, etc.
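A hedged sketch of that setup, assuming the source server is still reachable; the host, credentials, and binlog coordinates below are placeholders, and slave_parallel_threads is MariaDB's parallel replication setting:

    # On the scratch replica: point it at the source, enable parallel apply,
    # and start replication. All connection details are placeholders.
    mysql -u root -p -e "
    CHANGE MASTER TO
      MASTER_HOST='source-host',
      MASTER_USER='repl',
      MASTER_PASSWORD='repl-password',
      MASTER_LOG_FILE='mysql-bin.000001',
      MASTER_LOG_POS=4;
    SET GLOBAL slave_parallel_threads = 4;
    START SLAVE;"

You can then use START SLAVE UNTIL or replication filters (e.g. replicate-do-db) to control how far and what gets applied.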
If you only want to apply certain statements, then you need to manually parse the binary log files, which will be time-consuming and might not be compatible with the above strategy.
Also, do check system resources: does the server have enough CPU and disk capacity? Applying 1 GB worth of binary logs should take well under a day unless the server is short on capacity.
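A quick way to check, using standard Linux tools (the data-directory path shown is the common default and may differ on your system):

    top -b -n 1 | head -n 15      # one-shot CPU and load snapshot
    iostat -x 5 3                 # extended per-device disk stats, 3 samples
    df -h /var/lib/mysql          # free space under the data directory

If iostat shows the disks near 100% utilization while a binlog is being applied, the bottleneck is I/O rather than the replay method itself.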