Spark 1.3.1 build fails in the MLlib module when I run make-distribution.sh on Ubuntu 14.04
- `java -version`:
  java version "1.7.0_80" Java(TM) SE Runtime Environment (build 1.7.0_80-b15) Java HotSpot(TM) 64-Bit Server VM (build 24.80-b11, mixed mode)
- `scala -version`:
  Scala code runner version 2.10.4 -- Copyright 2002-2013, LAMP/EPFL
- Failure message:
```
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project ML Library 1.3.2-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for net.sf.opencsv:opencsv:jar:2.3 is invalid, transitive dependencies (if any) will not be available, enable debug logging for more details
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ spark-mllib_2.10 ---
[INFO] Deleting /home/tongz/project/spark/spark/mllib/target
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-versions) @ spark-mllib_2.10 ---
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:add-source (eclipse-add-source) @ spark-mllib_2.10 ---
[INFO] Add Source directory: /home/tongz/project/spark/spark/mllib/src/main/scala
[INFO] Add Test Source directory: /home/tongz/project/spark/spark/mllib/src/test/scala
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-scala-sources) @ spark-mllib_2.10 ---
[INFO] Source directory: /home/tongz/project/spark/spark/mllib/src/main/scala added.
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-mllib_2.10 ---
[WARNING] Invalid POM for net.sf.opencsv:opencsv:jar:2.3, transitive dependencies (if any) will not be available, enable debug logging for more details
[WARNING] Invalid project model for artifact [opencsv:net.sf.opencsv:2.3]. It will be ignored by the remote resources Mojo.
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-mllib_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 26 resources
[INFO] Copying 3 resources
[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @ spark-mllib_2.10 ---
[INFO] Using zinc server for incremental compilation
[INFO] compiler plugin: BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)
[info] Compiling 144 Scala sources and 2 Java sources to /home/tongz/project/spark/spark/mllib/target/scala-2.10/classes...
[error] error while loading , error in opening zip file
[error] object scala.runtime in compiler mirror not found.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM .......................... SUCCESS [4.145s]
[INFO] Spark Project Networking .......................... SUCCESS [11.811s]
[INFO] Spark Project Shuffle Streaming Service ........... SUCCESS [6.064s]
[INFO] Spark Project Core ................................ SUCCESS [2:39.458s]
[INFO] Spark Project Bagel ............................... SUCCESS [5.837s]
[INFO] Spark Project GraphX .............................. SUCCESS [17.580s]
[INFO] Spark Project Streaming ........................... SUCCESS [30.898s]
[INFO] Spark Project Catalyst ............................ SUCCESS [34.868s]
[INFO] Spark Project SQL ................................. SUCCESS [41.695s]
[INFO] Spark Project ML Library .......................... FAILURE [0.522s]
[INFO] Spark Project Tools ............................... SKIPPED
[INFO] Spark Project Hive ................................ SKIPPED
[INFO] Spark Project REPL ................................ SKIPPED
[INFO] Spark Project Assembly ............................ SKIPPED
[INFO] Spark Project External Twitter .................... SKIPPED
[INFO] Spark Project External Flume Sink ................. SKIPPED
[INFO] Spark Project External Flume ...................... SKIPPED
[INFO] Spark Project External MQTT ....................... SKIPPED
[INFO] Spark Project External ZeroMQ ..................... SKIPPED
[INFO] Spark Project External Kafka ...................... SKIPPED
[INFO] Spark Project Examples ............................ SKIPPED
[INFO] Spark Project External Kafka Assembly ............. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 5:13.600s
[INFO] Finished at: Sun May 03 21:23:26 EDT 2015
[INFO] Final Memory: 41M/499M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-mllib_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed. CompileFailed -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :spark-mllib_2.10
```
These are the last few lines of the error message; I can provide more if needed.
Thanks in advance!
1 Solution
#1
OK, I waited 12 hours and still got no answer. After a lot of digging I think I found the answer myself. Here is the trick: run `sbt clean clean-files`, then `rm -rf ~/.ivy2 ~/.m2 ~/.sbt` (see the commands below).
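For clarity, here are those same commands split onto separate lines, a minimal sketch of the cleanup. Note that removing ~/.ivy2, ~/.m2, and ~/.sbt wipes the local dependency caches, so Maven and sbt will re-download everything on the next build:

```
# Clean the project's compiled state tracked by sbt
sbt clean clean-files

# Wipe the local dependency caches; Maven and sbt will re-download
# everything on the next build (this can take a while)
rm -rf ~/.ivy2 ~/.m2 ~/.sbt
```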
These two lines are the problem: `[error] error while loading , error in opening zip file` and `[error] object scala.runtime in compiler mirror not found.`
From what I understand, some Scala or Maven artifacts in my local caches were corrupted, which caused this error, so I had to remove them. It may also be that my sbt installation was old, which is why I cleaned that as well.
PS: if you want to find out which packages are broken, run `find ~/.ivy2 ~/.m2 ~/.sbt -name "*.jar" -exec unzip -qqt {} \;`, as shown below.
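For reference, here is that check as a runnable command. `unzip -qqt` quietly test-extracts each jar, so any corrupted archive shows up as an unzip error naming the offending file:

```
# Test every cached jar; corrupted archives are reported by unzip
find ~/.ivy2 ~/.m2 ~/.sbt -name "*.jar" -exec unzip -qqt {} \;
```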