I got this error. I'm not sure why, because there is a coalesce method in org.apache.spark.rdd.RDD. Any ideas? Am I running incompatible versions of Spark and org.apache.spark.rdd.RDD?
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.rdd.RDD.coalesce$default$3(IZ)Lscala/math/Ordering;
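For context, the call site presumably looks something like the sketch below (the question does not show its code, so this is an assumed shape; names and values are placeholders). The error names the compiler-generated default-argument method coalesce$default$3, which is what breaks when code compiled against one Spark version runs against a different one.

// Minimal sketch of an assumed call site; names and values are placeholders.
import org.apache.spark.{SparkConf, SparkContext}

object CoalesceRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("coalesce-repro").setMaster("local[*]"))
    val rdd = sc.parallelize(1 to 100, numSlices = 8)
    // This relies on coalesce's default arguments; resolving them is where the
    // NoSuchMethodError surfaces when compile-time and runtime Spark versions differ.
    val fewer = rdd.coalesce(2)
    println(fewer.partitions.length)
    sc.stop()
  }
}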
2 Answers
#1
As I suspected, this is a library compatibility issue. Everything works (no code change) after downgrading Spark alone.
Before:
- scala 2.11.8
- spark 2.0.1
- Java 1.8.0_92
After:
- scala 2.11.8
- spark 1.6.2
- Java 1.8.0_92
OS: OSX 10.11.6
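In build-tool terms, the downgrade amounts to pinning the Spark dependency back to the 1.6 line. A minimal build.sbt sketch, assuming an sbt project with spark-core as the only Spark dependency (the answer does not show its build file):

scalaVersion := "2.11.8"

// Before: "org.apache.spark" %% "spark-core" % "2.0.1"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.2"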
#2
This happens because some part of your code or your project dependencies calls the old (pre-2.0.0) Spark 'coalesce' API, while in newer Spark versions that API has been removed and replaced by 'repartition'.
To fix this, you can either downgrade your Spark runtime environment to a version below 2.0.0, or upgrade your Spark SDK to 2.0.0 or above and upgrade your project dependency versions to be compatible with Spark 2.0.0 or above.
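If you take the upgrade route and recompile against Spark 2.x, the replacement call this answer suggests would look roughly like the sketch below (the RDD variable and partition count are placeholders):

import org.apache.spark.rdd.RDD

def reducePartitions(rdd: RDD[Int]): RDD[Int] =
  rdd.repartition(2)  // repartition always shuffles data into the new partitions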
For more details, see this thread: https://github.com/twitter/algebird/issues/549 and this commit: https://github.com/EugenCepoi/algebird/commit/0dc7d314cba3be588897915c8dcfb14964933c31