To help us respond efficiently, please provide the information below when asking a question; clearly described problems can be prioritized.
- 【TiDB Version】: 3.0.12
- 【Problem Description】:
Scenario:
We use MaxCompute integrated into DataWorks, Alibaba's big-data platform. MaxCompute supports development with Spark. Because Spark is bundled into the platform, we cannot edit the spark-defaults.conf file, so we specified the following directly via the --conf launch parameters:
spark.tispark.pd.addresses $your_pd_servers
spark.sql.extensions org.apache.spark.sql.TiExtensions
and referenced the corresponding tispark.jar. A test run fails with the following error:
20/06/09 15:14:48 ERROR ApplicationMaster: User class threw exception: java.lang.NullPointerException: Failed to init client for PD cluster.
java.lang.NullPointerException: Failed to init client for PD cluster.
at shade.com.google.common.base.Preconditions.checkNotNull(Preconditions.java:228)
at com.pingcap.tikv.PDClient.initCluster(PDClient.java:368)
at com.pingcap.tikv.PDClient.createRaw(PDClient.java:395)
at com.pingcap.tikv.TiSession.getPDClient(TiSession.java:80)
at com.pingcap.tikv.TiSession.<init>(TiSession.java:49)
at com.pingcap.tikv.TiSession.create(TiSession.java:147)
at org.apache.spark.sql.TiContext.<init>(TiContext.scala:43)
at org.apache.spark.sql.TiExtensions.getOrCreateTiContext(TiExtensions.scala:15)
at org.apache.spark.sql.TiExtensions$$anonfun$apply$5.apply(TiExtensions.scala:24)
at org.apache.spark.sql.TiExtensions$$anonfun$apply$5.apply(TiExtensions.scala:24)
at org.apache.spark.sql.extensions.TiResolutionRule.<init>(rules.scala:31)
at org.apache.spark.sql.TiExtensions$$anonfun$apply$6.apply(TiExtensions.scala:24)
at org.apache.spark.sql.TiExtensions$$anonfun$apply$6.apply(TiExtensions.scala:24)
at org.apache.spark.sql.SparkSessionExtensions$$anonfun$buildResolutionRules$1.apply(SparkSessionExtensions.scala:75)
at org.apache.spark.sql.SparkSessionExtensions$$anonfun$buildResolutionRules$1.apply(SparkSessionExtensions.scala:75)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractTraversable.map(Traversable.scala:104)
at org.apache.spark.sql.SparkSessionExtensions.buildResolutionRules(SparkSessionExtensions.scala:75)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.customResolutionRules(BaseSessionStateBuilder.scala:183)
at org.apache.spark.sql.odps.OdpsSessionStateBuilder$$anon$1.<init>(OdpsSessionStateBuilder.scala:67)
at org.apache.spark.sql.odps.OdpsSessionStateBuilder.analyzer(OdpsSessionStateBuilder.scala:62)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:58)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:56)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:48)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:638)
at com.hydee.dataworks.SparkApplication$.main(SparkApplication.scala:50)
at com.hydee.dataworks.SparkApplication.main(SparkApplication.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:708)
, stdout content:
at com.aliyun.odps.cupid.CupidUtil.errMsg2SparkException(CupidUtil.java:43)
at com.aliyun.odps.cupid.CupidUtil.getResult(CupidUtil.java:123)
at com.aliyun.odps.cupid.requestcupid.YarnClientImplUtil.pollAMStatus(YarnClientImplUtil.java:107)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.applicationReportTransform(YarnClientImpl.java:340)
... 15 more
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.aliyun.odps.SubmitJob.main(SubmitJob.java:74)
Caused by: org.apache.spark.SparkException: Application application_1591686825162_922794149 finished with failed status
at org.apache.spark.deploy.yarn.Client.run(Client.scala:1185)
at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1542)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:881)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
... 5 more
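For context, when spark-defaults.conf is not editable, the two properties described above are typically passed on the launch command in `--conf key=value` form. A minimal sketch (the jar names, class name, and PD address are placeholders, not taken from the actual environment):

```shell
# Hypothetical spark-submit invocation; adjust paths and addresses to your setup.
# spark.tispark.pd.addresses must point at the PD endpoints (host:port),
# and --conf values must use the key=value form, not whitespace-separated pairs.
spark-submit \
  --jars /path/to/tispark-assembly.jar \
  --conf spark.tispark.pd.addresses=pd-host:2379 \
  --conf spark.sql.extensions=org.apache.spark.sql.TiExtensions \
  --class com.hydee.dataworks.SparkApplication \
  /path/to/your-app.jar
```

This is only a sketch of the standard form; whether DataWorks forwards these parameters unchanged to the embedded Spark runtime is exactly what the NullPointerException ("Failed to init client for PD cluster") calls into question, since that error is raised when the PD client cannot be initialized from the configured addresses.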