
Date: 2025-04-03 09:42:50

Runs without errors in local (single-machine) mode, but fails in cluster mode:

This is because the local single-machine setup has the following configured under spark/conf/:

export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath):/usr/local/spark/examples/jars/*:/usr/local/spark/jars/kafka/*:/usr/local/kafka/libs/*

In the cluster environment, the Kafka jars had not been deployed on every machine.
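Before picking a fix, it can help to confirm which workers are actually missing the jars. A minimal sketch, assuming hypothetical worker hostnames (worker1, worker2, worker3) and passwordless SSH; adjust both to your cluster:

```shell
# Check each worker for the Kafka jar directories referenced in
# SPARK_DIST_CLASSPATH; report any host where they are absent.
for host in worker1 worker2 worker3; do
  ssh "$host" 'ls /usr/local/spark/jars/kafka /usr/local/kafka/libs >/dev/null 2>&1' \
    || echo "$host: Kafka jar directories missing"
done
```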

Solutions:

1. Configure spark/conf/ on every machine in the cluster (this was tried in practice and did not work; why?)

export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath):/usr/local/spark/examples/jars/*:/usr/local/spark/jars/kafka/*:/usr/local/kafka/libs/*
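One possible reason this approach fails (an assumption here, since the note does not say whether it was done): spark-env.sh is only read when the standalone daemons start, so after editing it on every node the cluster has to be restarted for workers to pick up the new SPARK_DIST_CLASSPATH:

```shell
# Restart the Spark standalone cluster after editing conf/spark-env.sh
# so the workers re-read SPARK_DIST_CLASSPATH.
/usr/local/spark/sbin/stop-all.sh
/usr/local/spark/sbin/start-all.sh
```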

2. Pass the Kafka assembly jar to spark-submit explicitly with --jars, so it is shipped to the executors with the job:

spark-submit --master spark://master:7077 --jars spark-streaming-kafka-0-8-assembly_2.11-2.3. spark_streaming.py
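A related alternative, assuming the machines have internet (or an internal Maven mirror) access: let spark-submit resolve the Kafka integration from Maven coordinates instead of shipping a local jar file. The version placeholder 2.3.x below is an assumption and must match the installed Spark version:

```shell
# --packages downloads the artifact and its dependencies and distributes
# them to the executors automatically.
spark-submit --master spark://master:7077 \
  --packages org.apache.spark:spark-streaming-kafka-0-8_2.11:2.3.x \
  spark_streaming.py
```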