Running Spark with Scala in IntelliJ IDEA throws:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
Check build.sbt:
name := "ScalaSBT" version := "1.0" scalaVersion := "2.11.8" libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.6.1"
This error is a Scala binary-compatibility problem: you need to make sure that the Scala version the Spark artifact was built against (the _2.11 suffix in spark-core_2.11) matches the Scala version your project is compiled with (scalaVersion). If the two major versions differ (for example 2.10 vs 2.11), classes such as GenTraversableOnce$class are missing at runtime.
You can also write the dependency like this, so sbt appends the matching Scala suffix automatically:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"