Spark Streaming with sbt + Cassandra connector dependency issue

Time: 2021-08-22 20:50:29

Folks,

I am trying to integrate Cassandra with Spark Streaming. Below is the sbt file:

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
  "org.apache.spark" %% "spark-streaming" % "2.0.0" % "provided",
  "org.apache.spark" %% "spark-sql" % "1.6.1",
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.6.2",
  "com.datastax.cassandra" % "cassandra-driver-core" % "3.0.0",
  ("org.apache.spark" %% "spark-streaming-kafka" % "1.6.0").
    exclude("org.spark-project.spark", "unused")
)
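
For context: with scalaVersion set to 2.11.8, sbt's %% operator appends the Scala binary version to each artifact name, so this file resolves to roughly the following jars on one classpath:

spark-core_2.11-2.0.0.jar
spark-streaming_2.11-2.0.0.jar
spark-sql_2.11-1.6.1.jar
spark-cassandra-connector_2.11-1.6.2.jar
cassandra-driver-core-3.0.0.jar
spark-streaming-kafka_2.11-1.6.0.jar

Mixing 2.0.0 and 1.6.x Spark artifacts like this is the kind of version mismatch that typically surfaces as binary-incompatibility errors, which foreshadows the fix in the answer below.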

I added the below line (the error line is marked in the snippet) for Cassandra integration:

val lines = KafkaUtils.createDirectStream[
  String, String, StringDecoder, StringDecoder](
  ssc, kafkaParams, topics)

// Getting errors once I add the below line in the program
lines.saveToCassandra("test", "test", SomeColumns("key", "value"))

lines.print()
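
For reference, saveToCassandra is added to DStreams by an implicit conversion from the connector's streaming package, and the connector reads its contact point from SparkConf. A minimal sketch of the setup this usually requires (the app name, host, and batch interval here are placeholders, not taken from the original post):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import com.datastax.spark.connector._            // SomeColumns and friends
import com.datastax.spark.connector.streaming._  // implicit saveToCassandra on DStreams

val conf = new SparkConf()
  .setAppName("KafkaToCassandra")                       // hypothetical app name
  .set("spark.cassandra.connection.host", "127.0.0.1")  // placeholder Cassandra host
val ssc = new StreamingContext(conf, Seconds(10))       // placeholder batch interval

With those imports in scope, the saveToCassandra call above compiles against a DStream of (key, value) pairs, provided the connector version actually matches the Spark version, which is the crux of this question.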

Once I add the above line, I see the below error in the IDE:

(error screenshot)

I see a similar error if I try to package the project from the command prompt:

(error screenshot)

FYR, I am using the below versions:

scala - 2.11

kafka - kafka_2.11-0.8.2.1

java - 8

cassandra - datastax-community-64bit_2.2.8

Please help to resolve the issue.

1 solution

#1

As expected, it was a dependency issue, which was resolved by updating the sbt file as below:

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
  "org.apache.spark" %% "spark-streaming" % "2.0.0" % "provided",
  "org.apache.spark" %% "spark-sql" % "2.0.0",
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.0-RC1",
  "com.datastax.cassandra" % "cassandra-driver-core" % "3.0.0",
  ("org.apache.spark" %% "spark-streaming-kafka" % "1.6.0").
    exclude("org.spark-project.spark", "unused")
)
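
One way to sanity-check a dependency fix like this (my suggestion, not something mentioned in the original answer) is sbt's built-in evicted task, available since sbt 0.13.6, which lists library versions that were replaced during conflict resolution:

sbt evicted    # reports dependencies evicted by newer versions on the classpath

If spark-core or the Cassandra driver shows up there with an unexpected version, the build is still mixing incompatible artifacts.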
