I keep trying to get the Spark shell running to no avail.
OS: Windows 8.1
Spark: 1.3.1
Java: 8
I've downloaded both the prepackaged and unpackaged variants (the unpackaged variant built with Maven and sbt, the simple build tool). I've attempted to resolve my issue three different ways, to no avail.
1) From my Spark directory, I attempt to start its shell with variants of spark-shell.cmd or .\bin\spark-shell.cmd.
I consistently get an error along these lines:
'C:\Program' is not recognized as an internal or external command, operable program or batch file.
Recognizing a likely whitespace error when I see one, I've attempted variants of my command with quotes, full paths, etc. No results thus far.
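For what it's worth, this is the classic cmd.exe word-splitting problem: if a launch script expands %JAVA_HOME% unquoted and the value contains a space, cmd tries to execute everything up to the first space. A minimal illustration (the JDK path below is hypothetical):

```bat
REM Hypothetical JDK path containing a space, as Spark's launch scripts may see it
set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_45

REM Unquoted expansion: cmd splits at the space and tries to run "C:\Program",
REM producing: 'C:\Program' is not recognized as an internal or external command
%JAVA_HOME%\bin\java -version
```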
2) Next, I tried simply moving my Spark directory to the top level of my hard drive (C:\spark-1.3.1-bin-hadoop2.6).
With whitespace eliminated as a possible issue, my error messages now fall along these lines:
find: 'version': No such file or directory else was unexpected at this time.
3) I've tried to invoke Spark through Scala somehow (as some documents and screencasts give the impression is possible). I can confirm that Scala (2.11.6) is properly configured in my environment variables; its shell works correctly.
If there's a command to make it start the Spark shell, I'm listening. My current attempts through Scala are another dead end.
Thank you.
5 Solutions
#1
In the file bin\spark-class2.cmd, find the line
set RUNNER="%JAVA_HOME%\bin\java"
and replace it with (removing the quotes):
set RUNNER=%JAVA_HOME%\bin\java
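An alternative that avoids editing the script, assuming your JDK lives under C:\Program Files (the exact JDK folder name below is hypothetical): point JAVA_HOME at the space-free 8.3 short-name alias instead, so the unquoted expansion no longer breaks.

```bat
REM Use the 8.3 short name so %JAVA_HOME% contains no spaces (folder name is illustrative)
set JAVA_HOME=C:\Progra~1\Java\jdk1.8.0_45
.\bin\spark-shell.cmd
```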
#2
- Moving the Spark directory under C:\ worked for me (C:\spark-1.6.0-bin-hadoop2.6).
- I also updated the system's PATH variable so it picks up the right find.exe.
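The "find: 'version': No such file or directory" error in the question typically means a GNU find (e.g. from Git or Cygwin) on PATH is shadowing the Windows find.exe that Spark's .cmd scripts expect. A sketch of checking and fixing this for the current cmd session (adjust to your setup):

```bat
REM Show every find.exe on PATH, in resolution order
where find

REM Put the Windows tools directory first for this session
set PATH=C:\Windows\System32;%PATH%
```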
#3
To solve the "C:\Program is not recognized" error, change C:\Program Files to C:\Progra~1, which is an alias for C:\Program Files.
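If you'd rather confirm the short name on your own machine than assume Progra~1, cmd can report it directly (a quick check, not specific to Spark):

```bat
REM List the 8.3 short names of entries under C:\
dir /x C:\

REM Or echo the short form of one path (inside a .cmd script, use %%I instead of %I)
for %I in ("C:\Program Files") do @echo %~sI
```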
#4
Apart from moving Spark and Scala to C:\, I had to move my Java installation to C:\ as well.
#5
I just attempted to install Spark on my Windows 8.1 laptop. I followed the instructions given in the Safari online video titled "Apache Spark with Scala". After following the simple instructions, the spark-shell command refused to work at all. I tried everything: reinstalling, changing directory names, and so on. I even attempted to rewrite the maze of intertwining shell scripts that bring it up, all with no success. Finally I tried installing an earlier release: the 2.1.0 release instead of 2.1.1. Everything worked then. This seems to imply that the Windows .cmd scripts are totally broken in the 2.1.1 release.