Why does the following line with KafkaUtils.createStream
val reciver = KafkaUtils.createStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topics).map(_._2)
give me the error "overloaded method value createStream with alternatives"?
error: overloaded method value createStream with alternatives:
(jssc: org.apache.spark.streaming.api.java.JavaStreamingContext,keyTypeClass: Class[String],valueTypeClass: Class[String],keyDecoderClass: Class[kafka.serializer.StringDecoder],valueDecoderClass: Class[kafka.serializer.StringDecoder],kafkaParams: java.util.Map[String,String],topics: java.util.Map[String,Integer],storageLevel: org.apache.spark.storage.StorageLevel)org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream[String,String] <and>
(ssc: org.apache.spark.streaming.StreamingContext,kafkaParams: scala.collection.immutable.Map[String,String],topics: scala.collection.immutable.Map[String,Int],storageLevel: org.apache.spark.storage.StorageLevel)(implicit evidence$1: scala.reflect.ClassTag[String], implicit evidence$2: scala.reflect.ClassTag[String], implicit evidence$3: scala.reflect.ClassTag[kafka.serializer.StringDecoder], implicit evidence$4: scala.reflect.ClassTag[kafka.serializer.StringDecoder])org.apache.spark.streaming.dstream.ReceiverInputDStream[(String, String)]
cannot be applied to (org.apache.spark.streaming.StreamingContext, scala.collection.immutable.Map[String,String], scala.collection.immutable.Set[String])
KafkaUtils.createStream(ssc, kafkaParams, topics)
^
1 Solution
#1
cannot be applied to (org.apache.spark.streaming.StreamingContext, scala.collection.mutable.Map[String,String], scala.collection.immutable.Set[String])
Mind the types as they don't match what KafkaUtils.createStream supports.
You seem to be closest to the following overload:
createStream[K, V, U <: Decoder[_], T <: Decoder[_]](
ssc: StreamingContext,
kafkaParams: Map[String, String],
topics: Map[String, Int],
storageLevel: StorageLevel)(/** implicits removed */): ReceiverInputDStream[(K, V)]
and your topics is of type Set, not Map (!)
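A minimal sketch of the corrected call, assuming topics starts out as a Set[String] and one consumer thread per topic is enough (ssc, kafkaParams and topics are the names from the question; the storage level is passed explicitly because this overload has no default for it):

import kafka.serializer.StringDecoder
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.kafka.KafkaUtils

// createStream wants Map[String, Int]: topic name -> number of consumer threads
val topicMap: Map[String, Int] = topics.map(topic => topic -> 1).toMap

val receiver = KafkaUtils.createStream[String, String, StringDecoder, StringDecoder](
  ssc,
  kafkaParams,
  topicMap,
  StorageLevel.MEMORY_AND_DISK_SER_2
).map(_._2)

Note also that the error you quote shows kafkaParams as a scala.collection.mutable.Map; the overload expects an immutable Map[String, String], so convert it first (e.g. kafkaParams.toMap) if you built it as a mutable one.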
Use the official KafkaWordCount example for support.
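For comparison, a rough sketch of the pattern that example follows, where topics arrives as a comma-separated String and numThreads as an Int (zkQuorum, group and numThreads here are placeholders taken from that example's command-line arguments, not names from the question):

// (uses the KafkaUtils import from the snippet above)
// Build the Map[String, Int] from a comma-separated topic list
val topicMap = topics.split(",").map(topic => (topic, numThreads)).toMap
// The simpler, non-generic overload takes a ZooKeeper quorum and consumer group id
val lines = KafkaUtils.createStream(ssc, zkQuorum, group, topicMap).map(_._2)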