Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging

Date: 2022-02-15 20:50:28

I am new to Spark MLlib and I just tried to run the sample code from their website. However, I get a Logging error. I got the same error when I tried doing some Twitter analysis too. The error is as follows:


**Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging**
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.mllib.recommendation.ALS$.train(ALS.scala:599)
at org.apache.spark.mllib.recommendation.ALS$.train(ALS.scala:616)
at scalamornprac.ML$.main(ML.scala:30)
at scalamornprac.ML.main(ML.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 21 more
16/11/10 10:02:29 INFO SparkContext: Invoking stop() from shutdown hook

I use IntelliJ IDEA. My build.sbt is as follows:


name := "Spark-Packt"
version := "1.0"
scalaVersion := "2.10.6"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.0.0"
libraryDependencies += "org.apache.spark" % "spark-mllib-local_2.10" % "2.0.0"

Also note that I have imported org.apache.log4j.{Level, Logger} in my code. It still doesn't work.


1 Answer

#1



Check from spark-shell whether import org.apache.spark.Logging works; if it does, make the corresponding changes in build.sbt:

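A sketch of that check from the Spark shell, assuming a Spark 1.x installation where the trait is still public (exact REPL output may differ by version):

$ spark-shell
scala> import org.apache.spark.Logging
import org.apache.spark.Logging

On Spark 2.x the same import fails with an error along the lines of "object Logging is not a member of package org.apache.spark", because the trait was moved to org.apache.spark.internal.Logging and is no longer public.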

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.1"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.6.1"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.6.1"
libraryDependencies += "org.apache.spark" % "spark-mllib-local_2.10" % "1.6.1"

If it does not work, it means that the Spark packages available in your environment do not include org.apache.spark.Logging.

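In other words, every Spark artifact in build.sbt should be pinned to the same version as the Spark installation you run against. The question's build.sbt mixes spark-core 2.0.0 with spark-mllib 1.0.0; the old MLlib still references the public org.apache.spark.Logging trait, which no longer exists in Spark 2.x, hence the NoClassDefFoundError. A minimal build.sbt sketch, assuming a Scala 2.10 / Spark 1.6.1 environment (the sparkVersion value is illustrative and should match your cluster):

name := "Spark-Packt"

version := "1.0"

scalaVersion := "2.10.6"

// Keep all Spark modules on one version; %% appends the Scala suffix (_2.10) automatically.
val sparkVersion = "1.6.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-mllib"     % sparkVersion
  // spark-mllib-local is omitted: as far as I know it only exists as a separate
  // artifact from Spark 2.x onward; in 1.6.x the local linear algebra is inside spark-mllib.
)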
