Graph for MySQL Replication Lag has some “gaps”, moments where it seems it cannot get the right value for the system

Hello, I have a situation where the graph for MySQL Replication Lag has some “gaps”, moments where it seems like it cannot get the right value from the system, but when I check with the command ‘show slave status\G’ everything is OK and there is no gap. Which parameter does PMM take this information from, and what should be checked on the PMM side?
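In case it helps, this is roughly how I verify it on the replica itself (a sketch; the mysql client invocation is simplified and credentials are omitted):

mysql -e "SHOW SLAVE STATUS\G" | grep -E "Seconds_Behind_Master|Slave_IO_Running|Slave_SQL_Running"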

I would check the pmm-agent logs on the MySQL server and see if it is reporting any issues, since it is the agent that executes the SQL.
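A minimal sketch of what I mean, assuming pmm-agent runs under systemd with the default unit name:

journalctl -u pmm-agent --since "1 hour ago" | grep -i error
pmm-admin list     # lists the exporters the agent runs and their status
pmm-admin status   # shows agent connectivity to the PMM server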

@matthewb Here is part of the output of the pmm-agent logs:

Nov 08 12:48:32 mysql-replica-1 pmm-agent[1078]: time="2023-11-08T12:48:32.560+04:00" level=error msg="ts=2023-11-08T08:48:32.560Z caller=stdlib.go:105 level=error msg=\"error gathering metrics: 3 error(s) occurred:\\n* [from Gatherer #2] collected metric \\\"node_textfile_scrape_error\\\" { gauge:<value:0 > } was collected before with the same name and label values\\n* [from Gatherer #2] collected metric \\\"node_textfile_scrape_error\\\" { gauge:<value:0 > } was collected before with the same name and label values\\n* [from Gatherer #2] collected metric \\\"node_textfile_scrape_error\\\" { gauge:<value:0 > } was collected before with the same name and label values\"" agentID=/agent_id/d90dcee0-653d-4827-ac07-dccad9349041 component=agent-process type=node_exporter
Nov 08 12:49:33 mysql-replica-1 pmm-agent[1078]: time="2023-11-08T12:49:33.709+04:00" level=error msg="ts=2023-11-08T08:49:33.709Z caller=stdlib.go:105 level=error msg=\"error gathering metrics: 3 error(s) occurred:\\n* [from Gatherer #2] collected metric \\\"node_textfile_scrape_error\\\" { gauge:<value:0 > } was collected before with the same name and label values\\n* [from Gatherer #2] collected metric \\\"node_textfile_scrape_error\\\" { gauge:<value:0 > } was collected before with the same name and label values\\n* [from Gatherer #2] collected metric \\\"node_textfile_scrape_error\\\" { gauge:<value:0 > } was collected before with the same name and label values\"" agentID=/agent_id/d90dcee0-653d-4827-ac07-dccad9349041 component=agent-process type=node_exporter
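For what it's worth, these errors are tagged type=node_exporter, while I assume the replication lag value itself is gathered by mysqld_exporter, so filtering the same log for that exporter is probably also worth doing on my side (a sketch, assuming the default systemd unit name):

journalctl -u pmm-agent | grep mysqld_exporter | grep -i error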