Running Spark Standalone from IntelliJ IDEA on Windows

Time: 2022-12-28 16:51:56

Repost

http://www.cnblogs.com/one--way/archive/2016/08/29/5818989.html

http://www.cnblogs.com/one--way/p/5814148.html

Prerequisites:

1. A Spark Standalone cluster has been deployed.

2. IntelliJ IDEA can already run Spark programs in local mode.

Source code:

import org.apache.spark.{SparkConf, SparkContext}
import scala.math._

/**
  * Created by Edward on 2016/8/27.
  * Despite the name, this job estimates Pi with the Monte Carlo method.
  */
object WordCount {
  def main(args: Array[String]) {
    // setMaster points at the standalone master; setJars ships the packaged
    // artifact to the cluster so the executors can load our classes.
    val sparkConf = new SparkConf().setAppName("WordCount")
      .setMaster("spark://node1:7077")
      .setJars(List("D:\\documents\\Spark\\MyDemo\\Test\\out\\artifacts\\spark_sample_jar\\Test.jar"))
    val spark = new SparkContext(sparkConf)
    val slices = if (args.length > 0) args(0).toInt else 2
    val n = math.min(100000L * slices, Int.MaxValue).toInt // avoid overflow
    // Sample random points in the unit square; the fraction that falls
    // inside the unit circle approaches Pi/4.
    val count = spark.parallelize(1 until n, slices).map { i =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y < 1) 1 else 0
    }.reduce(_ + _)
    println("Pi is roughly " + 4.0 * count / n)
    spark.stop()
  }
}
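The sampling logic of the job can be checked locally without a cluster. Below is a minimal plain-Scala sketch of the same Monte Carlo estimate; the seed and sample count are illustrative choices, not from the original:

```scala
import scala.util.Random

object LocalPi {
  // Monte Carlo estimate of Pi: the fraction of random points in the
  // unit square that land inside the unit circle approaches Pi/4.
  def estimate(samples: Int, seed: Long = 42L): Double = {
    val rng = new Random(seed)
    val inside = (1 to samples).count { _ =>
      val x = rng.nextDouble() * 2 - 1
      val y = rng.nextDouble() * 2 - 1
      x * x + y * y < 1
    }
    4.0 * inside / samples
  }

  def main(args: Array[String]): Unit = {
    println("Pi is roughly " + estimate(200000))
  }
}
```

With 200,000 samples the estimate is typically within a few thousandths of Pi, which is a quick sanity check before submitting the packaged jar to the cluster.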

The key idea is to submit the packaged jar to the cluster via the .setJars method.