We are testing backup and restore of a sharded cluster with Percona Backup for MongoDB (PBM). The /data/backup directory is accessible from all servers in both clusters, but the backup taken on the source cluster is not recognized by the restore target cluster.
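Both clusters are pointed at the same filesystem storage. A minimal sketch of the config applied on each cluster (the /etc/pbm_config.yaml path is just illustrative; the storage keys follow PBM's filesystem storage format):

$ cat /etc/pbm_config.yaml
storage:
  type: filesystem
  filesystem:
    path: /data/backup
$ pbm config --file /etc/pbm_config.yaml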
Backup server (source cluster):
$ pbm status
Cluster:
========
config-gf:
- config-gf/data-mongodb-wb901:27300: pbm-agent v1.6.1 OK
- config-gf/data-mongodb-wb902:27300: pbm-agent v1.6.1 OK
- config-gf/data-mongodb-wb903:27300: pbm-agent v1.6.1 OK
shard-gf1:
- shard-gf1/data-mongodb-wb901:27201: pbm-agent v1.6.1 OK
- shard-gf1/data-mongodb-wb902:27201: pbm-agent v1.6.1 OK
- shard-gf1/data-mongodb-wb903:27201: pbm-agent v1.6.1 OK
shard-gf2:
- shard-gf2/data-mongodb-wb901:27202: pbm-agent v1.6.1 OK
- shard-gf2/data-mongodb-wb902:27202: pbm-agent v1.6.1 OK
- shard-gf2/data-mongodb-wb903:27202: pbm-agent v1.6.1 OK
PITR incremental backup:
========================
Status [ON]
Currently running:
==================
(none)
Backups:
========
FS /data/backup
Snapshots:
2021-12-28T20:13:54Z 229.30MB [complete: 2021-12-28T20:13:58]
PITR chunks [304.55KB]:
2021-12-28T20:13:59 - 2021-12-28T20:54:10
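To rule out a storage visibility issue, the snapshot can also be listed directly from a restore-cluster node; PBM keeps a <snapshot-name>.pbm.json metadata file alongside the dump data (the exact file layout varies by PBM version, so the names below are an assumption):

$ ls /data/backup
# expected: 2021-12-28T20:13:54Z.pbm.json plus the per-replica-set archive files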
Restore server (target cluster):
$ pbm status
Cluster:
========
config-gf:
- config-gf/data-mongodb-wa901:27300: pbm-agent v1.6.1 OK
- config-gf/data-mongodb-wa902:27300: pbm-agent v1.6.1 OK
- config-gf/data-mongodb-wa903:27300: pbm-agent v1.6.1 OK
shard-gf1:
- shard-gf1/data-mongodb-wa901:27201: pbm-agent v1.6.1 OK
- shard-gf1/data-mongodb-wa902:27201: pbm-agent v1.6.1 OK
- shard-gf1/data-mongodb-wa903:27201: pbm-agent v1.6.1 OK
shard-gf2:
- shard-gf2/data-mongodb-wa901:27202: pbm-agent v1.6.1 OK
- shard-gf2/data-mongodb-wa902:27202: pbm-agent v1.6.1 OK
- shard-gf2/data-mongodb-wa903:27202: pbm-agent v1.6.1 OK
PITR incremental backup:
========================
Status [OFF]
Currently running:
==================
(none)
Backups:
========
FS /data/backup
(none)
On the restore cluster we forced a resync of the backup list and checked the agent logs, but the backup list stays empty:

$ pbm config --force-resync
$ pbm logs
2021-12-28T21:00:02Z I [shard-gf1/data-mongodb-wa903:27201] [resyncBcpList] not a member of the leader rs
2021-12-28T21:00:02Z I [shard-gf2/data-mongodb-wa903:27202] got command resyncBcpList <ts: 1640725201>
2021-12-28T21:00:02Z I [shard-gf2/data-mongodb-wa903:27202] got epoch {1640725201 28}
2021-12-28T21:00:02Z I [shard-gf2/data-mongodb-wa903:27202] [resyncBcpList] not a member of the leader rs
2021-12-28T21:00:02Z I [config-gf/data-mongodb-wa901:27300] got command resyncBcpList <ts: 1640725201>
2021-12-28T21:00:02Z I [config-gf/data-mongodb-wa902:27300] got command resyncBcpList <ts: 1640725201>
2021-12-28T21:00:02Z I [config-gf/data-mongodb-wa901:27300] got epoch {1640725201 28}
2021-12-28T21:00:02Z I [config-gf/data-mongodb-wa902:27300] got epoch {1640725201 28}
2021-12-28T21:00:02Z I [shard-gf1/data-mongodb-wa902:27201] got command resyncBcpList <ts: 1640725201>
2021-12-28T21:00:02Z I [shard-gf1/data-mongodb-wa902:27201] got epoch {1640725201 28}
2021-12-28T21:00:02Z I [shard-gf2/data-mongodb-wa901:27202] got command resyncBcpList <ts: 1640725201>
2021-12-28T21:00:02Z I [shard-gf1/data-mongodb-wa902:27201] [resyncBcpList] not a member of the leader rs
2021-12-28T21:00:02Z I [shard-gf2/data-mongodb-wa901:27202] got epoch {1640725201 28}
2021-12-28T21:00:02Z I [shard-gf2/data-mongodb-wa901:27202] [resyncBcpList] not a member of the leader rs
2021-12-28T21:00:02Z I [shard-gf1/data-mongodb-wa901:27201] got command resyncBcpList <ts: 1640725201>
2021-12-28T21:00:02Z I [shard-gf1/data-mongodb-wa901:27201] got epoch {1640725201 28}
2021-12-28T21:00:02Z I [shard-gf1/data-mongodb-wa901:27201] [resyncBcpList] not a member of the leader rs
2021-12-28T21:00:02Z I [shard-gf2/data-mongodb-wa902:27202] got command resyncBcpList <ts: 1640725201>
2021-12-28T21:00:02Z I [shard-gf2/data-mongodb-wa902:27202] got epoch {1640725201 28}
2021-12-28T21:00:02Z I [shard-gf2/data-mongodb-wa902:27202] [resyncBcpList] not a member of the leader rs
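Only the config replica set acts as the resync leader here (the shard members log "not a member of the leader rs" and skip the command), so the actual resync outcome should be logged by a config-gf agent. Those lines can be isolated like this (flag names as we understand them from pbm logs --help in v1.6):

$ pbm logs -t 100 -n config-gf/data-mongodb-wa901:27300

If the leader never reports the resync completing, the remaining thing to verify is whether the config server nodes themselves can actually read /data/backup.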