With TiSpark authentication enabled, how do I pass TiDB credentials when connecting remotely through the Spark Thrift Server? I'm currently getting: Caused by: java.util.NoSuchElementException: spark.sql.tidb.user

TiSpark has TiDB authentication enabled.
The Spark Thrift Server has started normally:
root@master:/opt/spark/spark-3.1.3/sbin# ./start-thriftserver.sh
root@master:/opt/spark/spark-3.1.3/bin# jps -l
79219 org.apache.hadoop.yarn.server.resourcemanager.ResourceManager
79701 org.apache.spark.deploy.SparkSubmit
78998 org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode
78699 org.apache.hadoop.hdfs.server.namenode.NameNode
79564 org.apache.spark.deploy.master.Master
80286 sun.tools.jps.Jps

I start beeline and connect with !connect jdbc:hive2://192.168.1.110:10000.
The connection fails with **Error: Could not open client transport with JDBC Uri: jdbc:hive2://192.168.1.110:10000: Failed to open new session: java.lang.reflect.InvocationTargetException (state=08S01,code=0)**.
root@slave2:/opt/spark/spark-3.1.3/bin# ./beeline
Beeline version 2.3.7 by Apache Hive
beeline> !connect jdbc:hive2://192.168.1.110:10000
Connecting to jdbc:hive2://192.168.1.110:10000
Enter username for jdbc:hive2://192.168.1.110:10000: root
Enter password for jdbc:hive2://192.168.1.110:10000: *********
22/09/09 14:35:27 INFO Utils: Supplied authorities: 192.168.1.110:10000
22/09/09 14:35:27 INFO Utils: Resolved authority: 192.168.1.110:10000
22/09/09 14:35:39 WARN HiveConnection: Failed to connect to 192.168.1.110:10000
Error: Could not open client transport with JDBC Uri: jdbc:hive2://192.168.1.110:10000: Failed to open new session: java.lang.reflect.InvocationTargetException (state=08S01,code=0)

The thriftserver log shows:
22/09/09 14:49:54 INFO ThriftCLIService: Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V10
22/09/09 14:49:54 INFO deprecation: No unit for dfs.client.datanode-restart.timeout(30) assuming SECONDS
22/09/09 14:49:54 INFO SessionState: Created HDFS directory: /tmp/hive/test_spark
22/09/09 14:49:54 INFO SessionState: Created HDFS directory: /tmp/hive/test_spark/5984aa7c-8d6a-46cc-8f6b-7654dac0d9a0
22/09/09 14:49:54 INFO SessionState: Created local directory: /tmp/root/5984aa7c-8d6a-46cc-8f6b-7654dac0d9a0
22/09/09 14:49:54 INFO SessionState: Created HDFS directory: /tmp/hive/test_spark/5984aa7c-8d6a-46cc-8f6b-7654dac0d9a0/_tmp_space.db
22/09/09 14:49:54 INFO HiveSessionImpl: Operation log session directory is created: /tmp/root/operation_logs/5984aa7c-8d6a-46cc-8f6b-7654dac0d9a0
22/09/09 14:49:54 INFO TiAuthRuleFactory: TiSpark running in auth mode
22/09/09 14:49:54 INFO SessionState: Deleted directory: /tmp/hive/test_spark/5984aa7c-8d6a-46cc-8f6b-7654dac0d9a0 on fs with scheme hdfs
22/09/09 14:49:54 INFO SessionState: Deleted directory: /tmp/root/5984aa7c-8d6a-46cc-8f6b-7654dac0d9a0 on fs with scheme file
22/09/09 14:49:54 WARN ThriftCLIService: Error opening session:
org.apache.hive.service.cli.HiveSQLException: Failed to open new session: java.lang.reflect.InvocationTargetException
at org.apache.spark.sql.hive.thriftserver.SparkSQLSessionManager.openSession(SparkSQLSessionManager.scala:85)
at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:204)
at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:371)
at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:243)
at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1497)
at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1482)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:38)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:53)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:310)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.pingcap.tispark.utils.ReflectionUtil$.newTiAuthRule(ReflectionUtil.scala:140)
at org.apache.spark.sql.extensions.TiAuthRuleFactory.apply(rules.scala:36)
at org.apache.spark.sql.extensions.TiAuthRuleFactory.apply(rules.scala:25)
at org.apache.spark.sql.SparkSessionExtensions.$anonfun$buildResolutionRules$1(SparkSessionExtensions.scala:141)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at scala.collection.TraversableLike.map(TraversableLike.scala:238)
at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
at scala.collection.AbstractTraversable.map(Traversable.scala:108)
at org.apache.spark.sql.SparkSessionExtensions.buildResolutionRules(SparkSessionExtensions.scala:141)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.customResolutionRules(BaseSessionStateBuilder.scala:200)
at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:82)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:73)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.$anonfun$build$2(BaseSessionStateBuilder.scala:342)
at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:84)
at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:84)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:73)
at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:143)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:143)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:73)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:71)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:63)
at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:98)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:650)
at org.apache.spark.sql.hive.thriftserver.SparkSQLSessionManager.openSession(SparkSQLSessionManager.scala:73)
… 12 more
Caused by: java.util.NoSuchElementException: spark.sql.tidb.user
at org.apache.spark.sql.internal.SQLConf.$anonfun$getConfString$3(SQLConf.scala:3732)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.internal.SQLConf.getConfString(SQLConf.scala:3732)
at com.pingcap.tispark.auth.TiAuthorization$.tiAuthorization(TiAuthorization.scala:244)
at org.apache.spark.sql.TiContext.tiAuthorization$lzycompute(TiContext.scala:42)
at org.apache.spark.sql.TiContext.tiAuthorization(TiContext.scala:42)
at org.apache.spark.sql.TiContext.<init>(TiContext.scala:47)
at org.apache.spark.sql.TiExtensions.getOrCreateTiContext(TiExtensions.scala:49)
at org.apache.spark.sql.TiExtensions.$anonfun$apply$2(TiExtensions.scala:36)
at org.apache.spark.sql.extensions.TiAuthorizationRule.<init>(TiAuthorizationRule.scala:34)
… 50 more
The key line in the log is `Caused by: java.util.NoSuchElementException: spark.sql.tidb.user`, which means the configuration entry `spark.sql.tidb.user` cannot be found.
The problem: TiSpark has TiDB authentication enabled, and because the production database is sensitive, the user and password must not be written in plain text in the configuration file. How can I pass the user, password, and other credentials through beeline so that the connection to the Spark Thrift Server succeeds?
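For context, these are the configuration keys that TiSpark's auth mode reads (the `spark.sql.tidb.user` key is taken from the stack trace above; the sibling keys follow TiSpark's authorization setup and should be verified against your TiSpark version — all values below are placeholders). Putting them in `spark-defaults.conf` is exactly what we want to avoid here, but it shows what the server is missing:

```
# spark-defaults.conf — keys TiSpark auth mode expects (placeholder values)
spark.sql.auth.enable      true
spark.sql.tidb.addr        192.168.1.110
spark.sql.tidb.port        4000
spark.sql.tidb.user        <tidb-user>       # the key reported missing in the error
spark.sql.tidb.password    <tidb-password>   # plain-text here, hence the question
```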

You can run `set` inside beeline; any configuration you want to add can be set with `set property=value`.
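A sketch of what that looks like in a beeline session (key names from the stack trace; the password is a placeholder). If setting the values after connecting turns out to be too late — the failure above happens while the session is being opened — an alternative worth trying is passing the same keys in the `hive_conf_list` part of the JDBC URL, which HiveServer2-compatible servers apply at session open:

```sql
-- inside beeline, after connecting:
set spark.sql.tidb.user=root;
set spark.sql.tidb.password=<your-password>;

-- or supply them at connect time via the JDBC URL's conf list
-- (format: jdbc:hive2://host:port/db;sess_vars?conf_list#hive_vars):
-- !connect jdbc:hive2://192.168.1.110:10000/default?spark.sql.tidb.user=root;spark.sql.tidb.password=<your-password>
```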

OK! I'll give it a try.

Hi, has this been resolved? If it has, please remember to mark it as solved. Thanks!
beeline usage --> https://blog.csdn.net/qq_31382921/article/details/73925140