When I used pt-table-sync to synchronize MySQL tinyblob data, I found that the data on the two sides was inconsistent, yet there was no error in the log; I only discovered it by accident. I also could not set the --charset parameter. See the screenshot for the specific results.
@yu.wang.gogo,
You should be able to use --charset and --set-vars to enforce the proper character set. If these options are not working, please show us the command you are using and the error output.
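For example, here is a minimal sketch of how those two options might be combined on a pt-archiver run (the hosts, credentials, and database/table names below are placeholders, not taken from your setup):

```shell
# Sketch only: placeholder connection details.
# --charset sets the connection character set on the tool side;
# --set-vars sets session variables when each connection is made.
pt-archiver \
  --source h=source_host,P=3306,u=user,p=secret,D=mydb,t=mytable \
  --dest   h=dest_host,P=3306,u=user,p=secret,D=mydb,t=mytable \
  --charset=utf8 \
  --set-vars='character_set_client=utf8,character_set_connection=utf8,character_set_results=utf8' \
  --where '1=1' --no-delete --limit=1000
```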
--set-vars is an array parameter, and I don't really know how to set it up, right?
--set-vars charset=utf8
My command is as follows: docker run --network net-db -i -t --rm perconalab/percona-toolkit /usr/bin/pt-archiver --source h=xx.xx.xx.xx,P=3306,u=root,p=Rds-sso-Mysql-01,D=qgbbs,t=post --dest h=mysql,P=3306,u=root,p=fYMF8HabJFWzG6,D=qgbbs,t=post --where "id > 139554432 and id <= 174443040" --bulk-insert --no-delete --limit=4000 --txn-size=4000 --skip-foreign-key-checks --set-vars='character_set_client=utf8,character_set_connection=utf8,character_set_results=utf8' --charset=UTF8 --nosafe-auto-increment --progress=100000 --statistics
I don't know if I have added my --set-vars parameter correctly:
--set-vars='character_set_client=utf8,character_set_connection=utf8,character_set_results=utf8'
But the garbled data is still there.
Can you provide a repeatable test case? Table schema and example data?
See the attachment for my table schema and sample data.
post.txt (2.6 KB)
Running without --bulk-insert worked just fine for me. I keep running into permissions issues when I use --bulk-insert because that requires LOAD INFILE LOCAL privs. Can you try that please?
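In case it helps, a sketch for checking and enabling local_infile on the destination server (the hostname and credentials are placeholders; on MySQL 8.0 local_infile is OFF by default):

```shell
# Placeholder host/credentials. --bulk-insert uses LOAD DATA LOCAL INFILE,
# which requires local_infile=1 on the server side.
mysql -h dest_host -u root -p -e "SHOW GLOBAL VARIABLES LIKE 'local_infile'"
mysql -h dest_host -u root -p -e "SET GLOBAL local_infile = 1"
```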
Indeed, if I remove the --bulk-insert parameter, the data is inserted normally and is correct. But with LOAD DATA LOCAL INFILE enabled on my target and --bulk-insert added back, the insert still succeeds, only the blob field data is wrong. I suspect this is a problem with how pt-archiver handles blob data in bulk mode.
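One way to confirm exactly which rows are affected is to compare per-row checksums of the blob column on both sides. This is only a sketch: the hosts and credentials are placeholders, and `message` stands in for whatever your tinyblob column is actually named; the database, table, and id range come from the command above.

```shell
# Placeholders: source/dest hosts, credentials, and the blob column name.
mysql -h source_host -u root -p qgbbs --batch \
  -e "SELECT id, MD5(message) FROM post WHERE id > 139554432 AND id <= 174443040 ORDER BY id" > source.md5
mysql -h dest_host -u root -p qgbbs --batch \
  -e "SELECT id, MD5(message) FROM post WHERE id > 139554432 AND id <= 174443040 ORDER BY id" > dest.md5
diff source.md5 dest.md5   # lines that differ are the corrupted rows
```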