pump 10.97.6.44 8250 linux/x86_64 /data/deploy/pump,/data/deploy/pump/data.pump
pump 10.97.6.46 8250 linux/x86_64 /data/deploy/pump,/data/deploy/pump/data.pump
Attention:
1. If the topology is not what you expected, check your yaml file.
2. Please confirm there is no port/directory conflicts in same host.
Do you want to continue? [y/N]: y
[ Serial ] - SSHKeySet: privateKey=/home/appadmin/.tiup/storage/cluster/clusters/servicecloud_oltp/ssh/id_rsa, publicKey=/home/appadmin/.tiup/storage/cluster/clusters/servicecloud_oltp/ssh/id_rsa.pub
Download pump:v4.0.8 (linux/amd64) … Done
[Parallel] - UserSSH: user=tidb, host=10.97.6.47
[Parallel] - UserSSH: user=tidb, host=10.97.6.44
[Parallel] - UserSSH: user=tidb, host=10.97.6.47
[Parallel] - UserSSH: user=tidb, host=10.97.6.47
[Parallel] - UserSSH: user=tidb, host=10.97.6.46
[Parallel] - UserSSH: user=tidb, host=10.97.6.45
[Parallel] - UserSSH: user=tidb, host=10.97.6.46
[Parallel] - UserSSH: user=tidb, host=10.97.6.52
[Parallel] - UserSSH: user=tidb, host=10.97.6.45
[Parallel] - UserSSH: user=tidb, host=10.97.6.51
[Parallel] - UserSSH: user=tidb, host=10.97.6.47
[Parallel] - UserSSH: user=tidb, host=10.97.6.44
[Parallel] - UserSSH: user=tidb, host=10.97.6.46
[Parallel] - UserSSH: user=tidb, host=10.97.6.50
[Parallel] - UserSSH: user=tidb, host=10.97.6.45
[ Serial ] - UserSSH: user=tidb, host=10.97.6.46
[ Serial ] - UserSSH: user=tidb, host=10.97.6.44
[ Serial ] - Mkdir: host=10.97.6.44, directories='/data/deploy/pump','/data/deploy/pump/log','/data/deploy/pump/bin','/data/deploy/pump/conf','/data/deploy/pump/scripts'
[ Serial ] - Mkdir: host=10.97.6.46, directories='/data/deploy/pump','/data/deploy/pump/log','/data/deploy/pump/bin','/data/deploy/pump/conf','/data/deploy/pump/scripts'
[ Serial ] - Mkdir: host=10.97.6.44, directories='/data/deploy/pump/data.pump'
[ Serial ] - Mkdir: host=10.97.6.46, directories='/data/deploy/pump/data.pump'
[ Serial ] - CopyComponent: component=pump, version=v4.0.8, remote=10.97.6.44:/data/deploy/pump os=linux, arch=amd64
[ Serial ] - CopyComponent: component=pump, version=v4.0.8, remote=10.97.6.46:/data/deploy/pump os=linux, arch=amd64
Error: init config failed: 10.97.6.50:20160: transfer from /home/appadmin/.tiup/storage/cluster/clusters/servicecloud_oltp/config-cache/tikv-10.97.6.50-20160.service to /tmp/tikv_f88f850a-fb6a-4b4f-bfee-2f5217bc026b.service failed: failed to scp /home/appadmin/.tiup/storage/cluster/clusters/servicecloud_oltp/config-cache/tikv-10.97.6.50-20160.service to tidb@10.97.6.50:/tmp/tikv_f88f850a-fb6a-4b4f-bfee-2f5217bc026b.service: Process exited with status 1
Verbose debug logs has been written to /home/appadmin/logs/tiup-cluster-debug-2021-04-20-14-34-57.log.
Error: run /home/appadmin/.tiup/components/cluster/v1.2.3/tiup-cluster (wd:/home/appadmin/.tiup/data/SV6LSMX) failed: exit status 1
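The error above means tiup could not scp the generated tikv systemd unit file to `/tmp` on 10.97.6.50, but it hides the underlying scp error. A sketch of how to reproduce the transfer by hand and surface the real cause (the key and source paths below are copied from the error message; run this as `appadmin` on the control machine):

```shell
# Rebuild the exact scp command tiup attempted, using the cluster's SSH key
# and the cached config file from the error message, then print it so it can
# be run by hand against 10.97.6.50.
KEY=/home/appadmin/.tiup/storage/cluster/clusters/servicecloud_oltp/ssh/id_rsa
SRC=/home/appadmin/.tiup/storage/cluster/clusters/servicecloud_oltp/config-cache/tikv-10.97.6.50-20160.service
echo scp -i "$KEY" "$SRC" tidb@10.97.6.50:/tmp/
```

Running the printed command manually typically shows the real failure: "Permission denied" points at the tidb user's `authorized_keys` on 10.97.6.50, while "No space left on device" points at a full `/tmp` on the target host (`ssh -i "$KEY" tidb@10.97.6.50 'df -h /tmp'` to check). The tiup debug log at `/home/appadmin/logs/tiup-cluster-debug-2021-04-20-14-34-57.log` should contain the same stderr.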
Running `grep -i err tikv.log` found no error entries, only some WARN messages like the following. I have uploaded tikv.log.
[2021/04/20 14:35:54.709 +08:00] [WARN] [endpoint.rs:527] [error-response] [err="Region error (will back off and retry) message: \"peer is not leader for region 897, leader may Some(id: 899 store_id: 5)\" not_leader { region_id: 897 leader { id: 899 store_id: 5 } }"]
[2021/04/20 14:35:54.709 +08:00] [WARN] [endpoint.rs:527] [error-response] [err="Region error (will back off and retry) message: \"peer is not leader for region 905, leader may Some(id: 908 store_id: 4)\" not_leader { region_id: 905 leader { id: 908 store_id: 4 } }"]
[2021/04/20 14:35:54.709 +08:00] [WARN] [endpoint.rs:527] [error-response] [err="Region error (will back off and retry) message: \"peer is not leader for region 913, leader may Some(id: 916 store_id: 4)\" not_leader { region_id: 913 leader { id: 916 store_id: 4 } }"]
[2021/04/20 14:35:54.709 +08:00] [WARN] [endpoint.rs:527] [error-response] [err="Region error (will back off and retry) message: \"peer is not leader for region 913, leader may Some(id: 916 store_id: 4)\" not_leader { region_id: 913 leader { id: 916 store_id: 4 } }"]
[2021/04/20 14:35:54.761 +08:00] [WARN] [endpoint.rs:527] [error-response] [err="Region error (will back off and retry) message: \"peer is not leader for region 897, leader may Some(id: 899 store_id: 5)\" not_leader { region_id: 897 leader { id: 899 store_id: 5 } }"]
[2021/04/20 14:35:54.761 +08:00] [WARN] [endpoint.rs:527] [error-response] [err="Region error (will back off and retry) message: \"peer is not leader for region 905, leader may Some(id: 908 store_id: 4)\" not_leader { region_id: 905 leader { id: 908 store_id: 4 } }"]
[2021/04/20 14:35:54.761 +08:00] [WARN] [endpoint.rs:527] [error-response] [err="Region error (will back off and retry) message: \"peer is not leader for region 913, leader may Some(id: 916 store_id: 4)\" not_leader { region_id: 913 leader { id: 916 store_id: 4 } }"]
[2021/04/20 14:35:54.761 +08:00] [WARN] [endpoint.rs:527] [error-response] [err="Region error (will back off and retry) message: \"peer is not leader for region 913, leader may Some(id: 916 store_id: 4)\" not_leader { region_id: 913 leader { id: 916 store_id: 4 } }"]
[2021/04/20 14:35:54.970 +08:00] [WARN] [endpoint.rs:527] [error-response] [err="Region error (will back off and retry) message: \"peer is not leader for region 13353, leader may Some(id: 13355 store_id: 5)\" not_leader { region_id: 13353 leader { id: 13355 store_id: 5 } }"]
[2021/04/20 14:36:00.027 +08:00] [WARN] [endpoint.rs:527] [error-response] [err="Region error (will back off and retry) message: \"peer is not leader for region 12045, leader may Some(id: 12048 store_id: 4)\" not_leader { region_id: 12045 leader { id: 12048 store_id: 4 } }"]
[2021/04/20 14:36:00.110 +08:00] [WARN] [endpoint.rs:527] [error-response] [err="Region error (will back off and retry) message: \"peer is not leader for region 12045, leader may Some(id: 12048 store_id: 4)\" not_leader { region_id: 12045 leader { id: 12048 store_id: 4 } }"]