Starting the pyspark shell on Windows fails with: Failed to find Spark jars directory. You need to build Spark before running this program

Date: 2022-11-22 16:21:02

D:\Develop tools\spark-2.2.0-bin-hadoop2.7\bin>pyspark2.cmd

'tools\spark-2.2.0-bin-hadoop2.7\bin\..\jars""\' is not recognized as an internal or external command,
operable program or batch file.
Failed to find Spark jars directory.
You need to build Spark before running this program.


Cause of the error: the installation path contains a space ("Develop tools" in D:\Develop tools\spark-2.2.0-bin-hadoop2.7\bin). The launcher script bin\spark-class2.cmd builds the jars path from SPARK_HOME, and its quoting does not survive an embedded space: cmd.exe splits the path at the space and tries to execute the fragment after it, which produces the mangled 'tools\...\jars""\' token above, and the subsequent check for the jars directory fails with "Failed to find Spark jars directory." The fix is to move the Spark distribution to a path with no spaces.
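
A minimal sketch of the fix, assuming you are free to relocate the installation; the space-free target D:\spark-2.2.0-bin-hadoop2.7 is just an example of a suitable path:

rem Move the distribution to a path without spaces (same drive, so "move" handles the directory)
move "D:\Develop tools\spark-2.2.0-bin-hadoop2.7" "D:\spark-2.2.0-bin-hadoop2.7"
rem Point SPARK_HOME at the new location for future sessions
setx SPARK_HOME "D:\spark-2.2.0-bin-hadoop2.7"

Then open a new command prompt (setx only affects newly started sessions) and run D:\spark-2.2.0-bin-hadoop2.7\bin\pyspark2.cmd again.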