Spark Streaming error: java.lang.NoSuchMethodError: org.apache.spark.internal.Logging.$init$(Lorg/apache/spark/internal/Logging;)V
This error typically occurs in Spark Streaming applications and is thrown while initializing a SparkSession. The root cause is that the org.apache.spark.internal.Logging.$init$ method cannot be found at runtime.
Error message:
Exception in thread "JobGenerator" java.lang.NoSuchMethodError: org.apache.spark.internal.Logging.$init$(Lorg/apache/spark/internal/Logging;)V
at org.apache.spark.sql.SparkSession$.<init>(SparkSession.scala:771)
at org.apache.spark.sql.SparkSession$.<clinit>(SparkSession.scala)
at com.sparkstreaming.test.BlackListFilter2$$anonfun$main$2.apply(BlackListFilter2.scala:32)
at com.sparkstreaming.test.BlackListFilter2$$anonfun$main$2.apply(BlackListFilter2.scala:31)
at org.apache.spark.streaming.dstream.DStream$$anonfun$transform$1$$anonfun$apply$21.apply(DStream.scala:666)
at org.apache.spark.streaming.dstream.DStream$$anonfun$transform$1$$anonfun$apply$21.apply(DStream.scala:666)
at org.apache.spark.streaming.dstream.DStream$$anonfun$transform$2$$anonfun$5.apply(DStream.scala:680)
at org.apache.spark.streaming.dstream.DStream$$anonfun$transform$2$$anonfun$5.apply(DStream.scala:678)
at org.apache.spark.streaming.dstream.TransformedDStream.compute(TransformedDStream.scala:46)
at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1$$anonfun$apply$7.apply(DStream.scala:341)
at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1$$anonfun$apply$7.apply(DStream.scala:341)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1.apply(DStream.scala:340)
at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1.apply(DStream.scala:340)
at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:415)
at org.apache.spark.streaming.dstream.TransformedDStream.createRDDWithLocalProperties(TransformedDStream.scala:65)
at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:335)
at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:333)
at scala.Option.orElse(Option.scala:289)
at org.apache.spark.streaming.dstream.DStream.getOrCompute(DStream.scala:330)
at org.apache.spark.streaming.dstream.ForEachDStream.generateJob(ForEachDStream.scala:48)
at org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:117)
at org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:116)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
at org.apache.spark.streaming.DStreamGraph.generateJobs(DStreamGraph.scala:116)
at org.apache.spark.streaming.scheduler.JobGenerator$$anonfun$3.apply(JobGenerator.scala:249)
at org.apache.spark.streaming.scheduler.JobGenerator$$anonfun$3.apply(JobGenerator.scala:247)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.spark.streaming.scheduler.JobGenerator.generateJobs(JobGenerator.scala:247)
at org.apache.spark.streaming.scheduler.JobGenerator.org$apache$spark$streaming$scheduler$JobGenerator$$processEvent(JobGenerator.scala:183)
at org.apache.spark.streaming.scheduler.JobGenerator$$anon$1.onReceive(JobGenerator.scala:89)
at org.apache.spark.streaming.scheduler.JobGenerator$$anon$1.onReceive(JobGenerator.scala:88)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
Cause analysis:
This error usually means your code was compiled against Spark artifacts that are incompatible with the Spark version on the runtime classpath. Note that $init$ is the trait-initializer method generated by the Scala 2.12+ compiler (Scala 2.11 used a different trait encoding), so this particular NoSuchMethodError very often points to a Scala binary-version mismatch, e.g. mixing _2.11 and _2.12 artifacts.
Solutions:
- Check the Spark version: first, confirm which Spark version your code is built against. Look at the version of the spark-core dependency in your build file, or use spark-submit to see the version used at runtime:
  spark-submit --version
- Check the dependencies: verify that every Spark-related dependency in your project is compatible with your Spark version. Consult the Spark documentation or search for the relevant compatibility information.
- Update the dependencies: if the versions are incompatible, update them to versions that match your current Spark installation, using a build tool such as Maven or Gradle.
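One way to keep versions from drifting apart is to declare a single Spark version in one place in the build definition and reuse it for every Spark artifact. A minimal sbt sketch follows; the version numbers are assumptions for illustration, so substitute the ones matching your cluster:

```scala
// build.sbt (sketch): one Spark version and one Scala version, reused everywhere.
// The version numbers below are examples, not a recommendation.
val sparkVersion = "3.0.1"

scalaVersion := "2.12.12"

libraryDependencies ++= Seq(
  // Marked "provided" because the cluster supplies these jars at runtime.
  "org.apache.spark" %% "spark-core"      % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"       % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
)
```

The %% operator appends the Scala binary suffix (e.g. _2.12) automatically, which also prevents the Scala-version mismatch described above.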
Example:
If you are running Spark 3.0, make sure that the spark-core dependency in your project is also version 3.0 or a compatible later release.
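To make the "matching versions" rule concrete, here is a small self-contained Scala helper (hypothetical, not part of any Spark API) that compares the major.minor prefix of two artifact version strings:

```scala
object SparkVersionCheck {
  // Parse the leading "major.minor" of a version string like "3.0.1".
  def majorMinor(version: String): (Int, Int) = {
    val parts = version.split("\\.")
    (parts(0).toInt, parts(1).toInt)
  }

  // A rough compatibility rule: the version compiled against and the
  // runtime version should share the same major.minor number.
  def compatible(buildVersion: String, runtimeVersion: String): Boolean =
    majorMinor(buildVersion) == majorMinor(runtimeVersion)
}
```

For instance, compatible("3.0.1", "3.0.2") yields true, while compatible("2.4.8", "3.0.1") yields false.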
Other possible causes:
Besides version incompatibility, other issues can trigger this error, for example:
- Incorrect classpath configuration: a misconfigured classpath can also make the org.apache.spark.internal.Logging.$init$ method unresolvable.
- Code errors: the code may be calling a method that does not exist.
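One quick way to investigate a classpath problem is to ask the JVM where it actually loaded a class from. The helper below is a hypothetical diagnostic (not a Spark API); pointing it at org.apache.spark.internal.Logging on your driver reveals which Spark jar is really on the classpath:

```scala
object ClasspathDiagnostics {
  // Returns the jar or directory a class was loaded from, when the JVM
  // exposes it. JDK core classes typically report no code source (None).
  def sourceOf(className: String): Option[String] = {
    val cls = Class.forName(className)
    Option(cls.getProtectionDomain.getCodeSource).map(_.getLocation.toString)
  }
}
```

For example, ClasspathDiagnostics.sourceOf("org.apache.spark.internal.Logging") on a Spark driver shows the path of the spark-core jar that was actually picked up, which you can then compare against the version you expect.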
Suggestions:
- Carefully review your code and dependencies, and make sure they are compatible with your Spark version.
- If you cannot resolve the problem, consult the Spark documentation or community forums for help.
Original source: http://www.cveoy.top/t/topic/b7iZ. Copyright belongs to the author. Please do not reproduce or scrape!