Pitfall: SparkSQL code written in Eclipse on Windows cannot read or save local parquet files

时间:2023-01-20 21:19:08

What a pitfall this was...

As the title says: when writing SparkSQL code in Eclipse on Windows, the following code threw a pile of NullPointerExceptions the moment it ran:

    // Imports needed for this snippet (Spark 1.x API)
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.DataFrame;
    import org.apache.spark.sql.SQLContext;

    // First, create the SparkConf
    SparkConf conf = new SparkConf()
            .setMaster("local")
            .setAppName("HiveDataSource");
    // Create the JavaSparkContext
    JavaSparkContext sc = new JavaSparkContext(conf);
    SQLContext sqlContext = new SQLContext(sc);
    // Reading the same file from HDFS works fine:
    // DataFrame usersDF = sqlContext.read().parquet("hdfs://spark2:9000/francis/spark-core/users.parquet");
    DataFrame usersDF = sqlContext.read().parquet("users.parquet");

So frustrating...

Later I found that saving the data to HDFS worked, so I mistakenly assumed that Spark simply could not save to the local filesystem. But after some googling I saw that plenty of demos save data to a local parquet file, so that theory was ruled out.

I eventually found the answer here: http://*.com/questions/25505365/parquet-file-in-spark-sql

The reply reads as follows:

Spark is compatible with Windows. You can run your program in a spark-shell session in Windows or you can run it using spark-submit with necessary arguments such as "--master" (again, in Windows or other OS). You cannot just run your Spark program as an ordinary Java program in Eclipse without properly setting up the Spark environment and so on. Your problem has nothing to do with Windows.

I then verified this in a spark-shell session on Linux: saving to the local filesystem does work!

So, if you want to read or save local parquet files, submit the job with spark-submit instead of running it directly from Eclipse.
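Following that advice, a minimal sketch of what the submission could look like, assuming the class above lives in a hypothetical package com.example and has been packaged (e.g. with Maven) into a jar named spark-hive-demo.jar; both names are placeholders, not from the original post:

```shell
# Submit the packaged job to a local Spark instead of running it inside Eclipse.
# --master local matches the setMaster("local") used in the code above.
spark-submit \
  --class com.example.HiveDataSource \
  --master local \
  spark-hive-demo.jar
```

Run this from a machine where Spark is installed and spark-submit is on the PATH; spark-submit sets up the Spark runtime environment that a plain "Run as Java Application" in Eclipse does not.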