To help us resolve this efficiently, please provide the following information; a clearly described problem gets answered faster:
[TiDB version]
v4.0.8
[Problem description]
[tidb@data11 ~]$ tiup cluster display jiuji-tidb-cluster-v2
Starting component `cluster`: /home/tidb/.tiup/components/cluster/v1.3.2/tiup-cluster display jiuji-tidb-cluster-v2
Cluster type: tidb
Cluster name: jiuji-tidb-cluster-v2
Cluster version: v4.0.8
SSH type: builtin
Dashboard URL: http://192.168.254.32:9379/dashboard
ID Role Host Ports OS/Arch Status Data Dir Deploy Dir
-- ---- ---- ----- ------- ------ -------- ----------
192.168.254.12:9095 alertmanager 192.168.254.12 9095/9096 linux/x86_64 Up /data/tidb-data-v2/alertmanager-9095 /data/tidb-deploy-v2/alertmanager-9095
192.168.254.12:3030 grafana 192.168.254.12 3030 linux/x86_64 Up - /data/tidb-deploy-v2/grafana-3030
192.168.254.13:9379 pd 192.168.254.13 9379/9380 linux/x86_64 Up /data/tidb-data-v2/pd-9379 /data/tidb-deploy-v2/pd-9379
192.168.254.31:9379 pd 192.168.254.31 9379/9380 linux/x86_64 Up|L /data/tidb-data-v2/pd-9379 /data/tidb-deploy-v2/pd-9379
192.168.254.32:9379 pd 192.168.254.32 9379/9380 linux/x86_64 Up|UI /data/tidb-data-v2/pd-9379 /data/tidb-deploy-v2/pd-9379
192.168.254.12:9080 prometheus 192.168.254.12 9080 linux/x86_64 Up /data/tidb-data-v2/prometheus-9080 /data/tidb-deploy-v2/prometheus-9080
192.168.254.13:9383 tidb 192.168.254.13 9383/9384 linux/x86_64 Up - /data/tidb-deploy-v2/tidb-9383
192.168.254.31:9383 tidb 192.168.254.31 9383/9384 linux/x86_64 Up - /data/tidb-deploy-v2/tidb-9383
192.168.254.32:9383 tidb 192.168.254.32 9383/9384 linux/x86_64 Up - /data/tidb-deploy-v2/tidb-9383
192.168.254.12:8000 tiflash 192.168.254.12 8000/7123/2930/20270/20192/7234 linux/x86_64 Up /data/tidb-data-v2/tiflash-8000 /data/tidb-deploy-v2/tiflash-8000
192.168.254.13:9385 tikv 192.168.254.13 9385/9386 linux/x86_64 Up /data/tidb-data-v2/tikv-9385 /data/tidb-deploy-v2/tikv-9385
192.168.254.31:9385 tikv 192.168.254.31 9385/9386 linux/x86_64 Up /data/tidb-data-v2/tikv-9385 /data/tidb-deploy-v2/tikv-9385
192.168.254.32:9385 tikv 192.168.254.32 9385/9386 linux/x86_64 Up /data/tidb-data-v2/tikv-9385 /data/tidb-deploy-v2/tikv-9385
Total nodes: 13
[tidb@data11 ~]$ tiup update --self
download https://tiup-mirrors.pingcap.com/tiup-v1.3.2-linux-amd64.tar.gz 8.49 MiB / 8.49 MiB 100.00% 2.25 GiB p/s
Updated successfully!
[tidb@data11 ~]$ tiup update cluster
component cluster version v1.3.2 is already installed
Updated successfully!
Running

tiup cluster upgrade jiuji-tidb-cluster-v2 v4.0.9

produces the error log below (upgrading directly to v4.0.10 fails with the same error):
+ [ Serial ] - InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.12, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/grafana-3030.service, deploy_dir=/data/tidb-deploy-v2/grafana-3030, data_dir=[], log_dir=/data/tidb-deploy-v2/grafana-3030/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache
+ [ Serial ] - InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.31, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/tikv-9385.service, deploy_dir=/data/tidb-deploy-v2/tikv-9385, data_dir=[/data/tidb-data-v2/tikv-9385], log_dir=/data/tidb-deploy-v2/tikv-9385/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache
+ [ Serial ] - InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.32, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/tikv-9385.service, deploy_dir=/data/tidb-deploy-v2/tikv-9385, data_dir=[/data/tidb-data-v2/tikv-9385], log_dir=/data/tidb-deploy-v2/tikv-9385/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache
+ [ Serial ] - InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.12, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/tiflash-8000.service, deploy_dir=/data/tidb-deploy-v2/tiflash-8000, data_dir=[/data/tidb-data-v2/tiflash-8000], log_dir=/data/tidb-deploy-v2/tiflash-8000/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache
Error: init config failed: 192.168.254.31:9385: executor.ssh.execute_failed: Failed to execute command over SSH for 'tidb@192.168.254.31:22' {ssh_stderr: invalid configuration: default rocksdb not exist, buf raftdb exist
, ssh_stdout: , ssh_command: export LANG=C; PATH=$PATH:/usr/bin:/usr/sbin /data/tidb-deploy-v2/tikv-9385/bin/tikv-server --config-check --config=/data/tidb-deploy-v2/tikv-9385/conf/tikv.toml --pd=""}, cause: Process exited with status 1: check config failed
Verbose debug logs has been written to /home/tidb/.tiup/logs/tiup-cluster-debug-2021-02-02-01-41-00.log.
Error: run `/home/tidb/.tiup/components/cluster/v1.3.2/tiup-cluster` (wd:/home/tidb/.tiup/data/SNmy76V) failed: exit status 1
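For context, the check fails while tiup runs `tikv-server --config-check` against the rendered tikv.toml on 192.168.254.31, and the stderr ("default rocksdb not exist, buf raftdb exist") suggests the file defines a [raftdb] section without any [rocksdb] section. The fragment below is a hypothetical illustration of a tikv.toml shape that would trip such a check — it is an assumption for discussion, not the actual file from this cluster:

```toml
# Hypothetical tikv.toml fragment (illustration only, not the real file
# from this cluster). The config check appears to reject a file that
# customizes [raftdb] while no [rocksdb] section exists at all:
[raftdb]
max-background-jobs = 4

# Presumably the checker expects a [rocksdb] section to be present
# alongside [raftdb], e.g.:
# [rocksdb]
# max-background-jobs = 8
```

The actual rendered file on the failing node is at /data/tidb-deploy-v2/tikv-9385/conf/tikv.toml (per the ssh_command above), so comparing its top-level sections against this shape may confirm or rule out the guess.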
2021-02-02T01:40:59.859+0800 DEBUG TaskFinish {"task": "BackupComponent: component=tiflash, currentVersion=v4.0.8, remote=192.168.254.12:/data/tidb-deploy-v2/tiflash-8000\
CopyComponent: component=tiflash, version=v4.0.9, remote=192.168.254.12:/data/tidb-deploy-v2/tiflash-8000 os=linux, arch=amd64\
InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.12, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/tiflash-8000.service, deploy_dir=/data/tidb-deploy-v2/tiflash-8000, data_dir=[/data/tidb-data-v2/tiflash-8000], log_dir=/data/tidb-deploy-v2/tiflash-8000/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache\
BackupComponent: component=pd, currentVersion=v4.0.8, remote=192.168.254.13:/data/tidb-deploy-v2/pd-9379\
CopyComponent: component=pd, version=v4.0.9, remote=192.168.254.13:/data/tidb-deploy-v2/pd-9379 os=linux, arch=amd64\
InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.13, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/pd-9379.service, deploy_dir=/data/tidb-deploy-v2/pd-9379, data_dir=[/data/tidb-data-v2/pd-9379], log_dir=/data/tidb-deploy-v2/pd-9379/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache\
BackupComponent: component=pd, currentVersion=v4.0.8, remote=192.168.254.31:/data/tidb-deploy-v2/pd-9379\
CopyComponent: component=pd, version=v4.0.9, remote=192.168.254.31:/data/tidb-deploy-v2/pd-9379 os=linux, arch=amd64\
InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.31, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/pd-9379.service, deploy_dir=/data/tidb-deploy-v2/pd-9379, data_dir=[/data/tidb-data-v2/pd-9379], log_dir=/data/tidb-deploy-v2/pd-9379/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache\
BackupComponent: component=pd, currentVersion=v4.0.8, remote=192.168.254.32:/data/tidb-deploy-v2/pd-9379\
CopyComponent: component=pd, version=v4.0.9, remote=192.168.254.32:/data/tidb-deploy-v2/pd-9379 os=linux, arch=amd64\
InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.32, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/pd-9379.service, deploy_dir=/data/tidb-deploy-v2/pd-9379, data_dir=[/data/tidb-data-v2/pd-9379], log_dir=/data/tidb-deploy-v2/pd-9379/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache\
BackupComponent: component=tikv, currentVersion=v4.0.8, remote=192.168.254.13:/data/tidb-deploy-v2/tikv-9385\
CopyComponent: component=tikv, version=v4.0.9, remote=192.168.254.13:/data/tidb-deploy-v2/tikv-9385 os=linux, arch=amd64\
InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.13, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/tikv-9385.service, deploy_dir=/data/tidb-deploy-v2/tikv-9385, data_dir=[/data/tidb-data-v2/tikv-9385], log_dir=/data/tidb-deploy-v2/tikv-9385/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache\
BackupComponent: component=tikv, currentVersion=v4.0.8, remote=192.168.254.31:/data/tidb-deploy-v2/tikv-9385\
CopyComponent: component=tikv, version=v4.0.9, remote=192.168.254.31:/data/tidb-deploy-v2/tikv-9385 os=linux, arch=amd64\
InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.31, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/tikv-9385.service, deploy_dir=/data/tidb-deploy-v2/tikv-9385, data_dir=[/data/tidb-data-v2/tikv-9385], log_dir=/data/tidb-deploy-v2/tikv-9385/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache\
BackupComponent: component=tikv, currentVersion=v4.0.8, remote=192.168.254.32:/data/tidb-deploy-v2/tikv-9385\
CopyComponent: component=tikv, version=v4.0.9, remote=192.168.254.32:/data/tidb-deploy-v2/tikv-9385 os=linux, arch=amd64\
InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.32, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/tikv-9385.service, deploy_dir=/data/tidb-deploy-v2/tikv-9385, data_dir=[/data/tidb-data-v2/tikv-9385], log_dir=/data/tidb-deploy-v2/tikv-9385/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache\
BackupComponent: component=tidb, currentVersion=v4.0.8, remote=192.168.254.13:/data/tidb-deploy-v2/tidb-9383\
CopyComponent: component=tidb, version=v4.0.9, remote=192.168.254.13:/data/tidb-deploy-v2/tidb-9383 os=linux, arch=amd64\
InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.13, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/tidb-9383.service, deploy_dir=/data/tidb-deploy-v2/tidb-9383, data_dir=[], log_dir=/data/tidb-deploy-v2/tidb-9383/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache\
BackupComponent: component=tidb, currentVersion=v4.0.8, remote=192.168.254.31:/data/tidb-deploy-v2/tidb-9383\
CopyComponent: component=tidb, version=v4.0.9, remote=192.168.254.31:/data/tidb-deploy-v2/tidb-9383 os=linux, arch=amd64\
InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.31, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/tidb-9383.service, deploy_dir=/data/tidb-deploy-v2/tidb-9383, data_dir=[], log_dir=/data/tidb-deploy-v2/tidb-9383/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache\
BackupComponent: component=tidb, currentVersion=v4.0.8, remote=192.168.254.32:/data/tidb-deploy-v2/tidb-9383\
CopyComponent: component=tidb, version=v4.0.9, remote=192.168.254.32:/data/tidb-deploy-v2/tidb-9383 os=linux, arch=amd64\
InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.32, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/tidb-9383.service, deploy_dir=/data/tidb-deploy-v2/tidb-9383, data_dir=[], log_dir=/data/tidb-deploy-v2/tidb-9383/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache\
BackupComponent: component=prometheus, currentVersion=v4.0.8, remote=192.168.254.12:/data/tidb-deploy-v2/prometheus-9080\
CopyComponent: component=prometheus, version=v4.0.9, remote=192.168.254.12:/data/tidb-deploy-v2/prometheus-9080 os=linux, arch=amd64\
InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.12, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/prometheus-9080.service, deploy_dir=/data/tidb-deploy-v2/prometheus-9080, data_dir=[/data/tidb-data-v2/prometheus-9080], log_dir=/data/tidb-deploy-v2/prometheus-9080/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache\
BackupComponent: component=grafana, currentVersion=v4.0.8, remote=192.168.254.12:/data/tidb-deploy-v2/grafana-3030\
CopyComponent: component=grafana, version=v4.0.9, remote=192.168.254.12:/data/tidb-deploy-v2/grafana-3030 os=linux, arch=amd64\
InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.12, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/grafana-3030.service, deploy_dir=/data/tidb-deploy-v2/grafana-3030, data_dir=[], log_dir=/data/tidb-deploy-v2/grafana-3030/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache\
BackupComponent: component=alertmanager, currentVersion=v4.0.8, remote=192.168.254.12:/data/tidb-deploy-v2/alertmanager-9095\
CopyComponent: component=alertmanager, version=v0.17.0, remote=192.168.254.12:/data/tidb-deploy-v2/alertmanager-9095 os=linux, arch=amd64\
InitConfig: cluster=jiuji-tidb-cluster-v2, user=tidb, host=192.168.254.12, path=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache/alertmanager-9095.service, deploy_dir=/data/tidb-deploy-v2/alertmanager-9095, data_dir=[/data/tidb-data-v2/alertmanager-9095], log_dir=/data/tidb-deploy-v2/alertmanager-9095/log, cache_dir=/home/tidb/.tiup/storage/cluster/clusters/jiuji-tidb-cluster-v2/config-cache", "error": "init config failed: 192.168.254.31:9385: executor.ssh.execute_failed: Failed to execute command over SSH for 'tidb@192.168.254.31:22' {ssh_stderr: invalid configuration: default rocksdb not exist, buf raftdb exist\
, ssh_stdout: , ssh_command: export LANG=C; PATH=$PATH:/usr/bin:/usr/sbin /data/tidb-deploy-v2/tikv-9385/bin/tikv-server --config-check --config=/data/tidb-deploy-v2/tikv-9385/conf/tikv.toml --pd=\"\"}, cause: Process exited with status 1: check config failed", "errorVerbose": "check config failed\
executor.ssh.execute_failed: Failed to execute command over SSH for 'tidb@192.168.254.31:22' {ssh_stderr: invalid configuration: default rocksdb not exist, buf raftdb exist\
, ssh_stdout: , ssh_command: export LANG=C; PATH=$PATH:/usr/bin:/usr/sbin /data/tidb-deploy-v2/tikv-9385/bin/tikv-server --config-check --config=/data/tidb-deploy-v2/tikv-9385/conf/tikv.toml --pd=\"\"}, cause: Process exited with status 1\
github.com/pingcap/tiup/pkg/cluster/spec.checkConfig\
\tgithub.com/pingcap/tiup@/pkg/cluster/spec/server_config.go:268\
github.com/pingcap/tiup/pkg/cluster/spec.(*TiKVInstance).InitConfig\
\tgithub.com/pingcap/tiup@/pkg/cluster/spec/tikv.go:272\
github.com/pingcap/tiup/pkg/cluster/task.(*InitConfig).Execute\
\tgithub.com/pingcap/tiup@/pkg/cluster/task/init_config.go:49\
github.com/pingcap/tiup/pkg/cluster/task.(*Serial).Execute\
\tgithub.com/pingcap/tiup@/pkg/cluster/task/task.go:196\
github.com/pingcap/tiup/pkg/cluster/task.(*Parallel).Execute.func1\
\tgithub.com/pingcap/tiup@/pkg/cluster/task/task.go:241\
runtime.goexit\
\truntime/asm_amd64.s:1357\
init config failed: 192.168.254.31:9385"}
2021-02-02T01:40:59.860+0800 INFO Execute command finished {"code": 1, "error": "init config failed: 192.168.254.31:9385: executor.ssh.execute_failed: Failed to execute command over SSH for 'tidb@192.168.254.31:22' {ssh_stderr: invalid configuration: default rocksdb not exist, buf raftdb exist\
, ssh_stdout: , ssh_command: export LANG=C; PATH=$PATH:/usr/bin:/usr/sbin /data/tidb-deploy-v2/tikv-9385/bin/tikv-server --config-check --config=/data/tidb-deploy-v2/tikv-9385/conf/tikv.toml --pd=\"\"}, cause: Process exited with status 1: check config failed", "errorVerbose": "check config failed\
executor.ssh.execute_failed: Failed to execute command over SSH for 'tidb@192.168.254.31:22' {ssh_stderr: invalid configuration: default rocksdb not exist, buf raftdb exist\
, ssh_stdout: , ssh_command: export LANG=C; PATH=$PATH:/usr/bin:/usr/sbin /data/tidb-deploy-v2/tikv-9385/bin/tikv-server --config-check --config=/data/tidb-deploy-v2/tikv-9385/conf/tikv.toml --pd=\"\"}, cause: Process exited with status 1\
github.com/pingcap/tiup/pkg/cluster/spec.checkConfig\
\tgithub.com/pingcap/tiup@/pkg/cluster/spec/server_config.go:268\
github.com/pingcap/tiup/pkg/cluster/spec.(*TiKVInstance).InitConfig\
\tgithub.com/pingcap/tiup@/pkg/cluster/spec/tikv.go:272\
github.com/pingcap/tiup/pkg/cluster/task.(*InitConfig).Execute\
\tgithub.com/pingcap/tiup@/pkg/cluster/task/init_config.go:49\
github.com/pingcap/tiup/pkg/cluster/task.(*Serial).Execute\
\tgithub.com/pingcap/tiup@/pkg/cluster/task/task.go:196\
github.com/pingcap/tiup/pkg/cluster/task.(*Parallel).Execute.func1\
\tgithub.com/pingcap/tiup@/pkg/cluster/task/task.go:241\
runtime.goexit\
\truntime/asm_amd64.s:1357\
init config failed: 192.168.254.31:9385"}