I've been through all the questions on * with a similar error. I added the JRE/bin folder to the PATH and CLASSPATH environment variables, both as user and system variables, and set JAVA_HOME. I've reinstalled Java twice, in both Program Files and Program Files (x86).
Still, when I try to run PySpark, I get the error message in the title. Any ideas?
I have Windows 8 64-bit, JRE 8, and Python 2.7.
Java 8 is installed in C:\Program Files\Java\jre1.8.0_31\
The values of my environment variables are:
User variables:
CLASSPATH: C:\ProgramData\Oracle\Java\javapath
PATH: %PATH%;%JAVA_HOME%\bin;C:\Users\Alexis\AppData\Local\Continuum\Anaconda;C:\Users\Alexis\AppData\Local\Continuum\Anaconda\Scripts;C:\Program Files\Java\jre1.8.0_31\bin\;C:\ProgramData\Oracle\Java\javapath
System variables:
Path: C:\Program Files\Java\jre1.8.0_31\bin;C:\ProgramData\Oracle\Java\javapath;C:\Program Files (x86)\Intel\iCLS Client\;C:\Program Files\Intel\iCLS Client\;%SystemRoot%\system32;%SystemRoot%;%SystemRoot%\System32\Wbem;%SYSTEMROOT%\System32\WindowsPowerShell\v1.0\;C:\Program Files\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\IPT\;C:\ProgramData\Oracle\Java\javapath
PATHEXT: .COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC
JAVA_HOME: C:\Program Files\Java\jre1.8.0_31\
Update 1: I now get this error:
C:\Spark\spark-1.1.0-bin-hadoop1\bin>pyspark
Running python with PYTHONPATH=C:\Spark\spark-1.1.0-bin-hadoop1\bin\..\python\lib\py4j-0.8.2.1-src.zip;C:\Spark\spark-1.1.0-bin-hadoop1\bin\..\python;
Python 2.7.6 |Anaconda 2.0.0 (64-bit)| (default, May 27 2014, 15:00:33) [MSC v.1500 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Anaconda is brought to you by Continuum Analytics.
Please check out: http://continuum.io and https://binstar.org
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
Picked up _JAVA_OPTIONS: -Xmx512M
#
# A fatal error has been detected by the Java Runtime Environment:
#
# EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x0000000000c51e10, pid=4488, tid=9484
#
# JRE version: Java(TM) SE Runtime Environment (8.0_31-b13) (build 1.8.0_31-b13)
# Java VM: Java HotSpot(TM) 64-Bit Server VM (25.31-b07 mixed mode windows-amd64 compressed oops)
# Problematic frame:
# C 0x0000000000c51e10
#
# Failed to write core dump. Minidumps are not enabled by default on client versions of Windows
#
# An error report file with more information is saved as:
# C:\Spark\spark-1.1.0-bin-hadoop1\bin\hs_err_pid4488.log
#
# If you would like to submit a bug report, please visit:
# http://bugreport.java.com/bugreport/crash.jsp
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#
Traceback (most recent call last):
  File "C:\Spark\spark-1.1.0-bin-hadoop1\bin\..\python\pyspark\shell.py", line 44, in <module>
    sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
  File "C:\Spark\spark-1.1.0-bin-hadoop1\python\pyspark\context.py", line 107, in __init__
    conf)
  File "C:\Spark\spark-1.1.0-bin-hadoop1\python\pyspark\context.py", line 155, in _do_init
    self._jsc = self._initialize_context(self._conf._jconf)
  File "C:\Spark\spark-1.1.0-bin-hadoop1\python\pyspark\context.py", line 201, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "C:\Spark\spark-1.1.0-bin-hadoop1\python\lib\py4j-0.8.2.1-src.zip\py4j\java_gateway.py", line 699, in __call__
  File "C:\Spark\spark-1.1.0-bin-hadoop1\python\lib\py4j-0.8.2.1-src.zip\py4j\java_gateway.py", line 369, in send_command
  File "C:\Spark\spark-1.1.0-bin-hadoop1\python\lib\py4j-0.8.2.1-src.zip\py4j\java_gateway.py", line 362, in send_command
  File "C:\Spark\spark-1.1.0-bin-hadoop1\python\lib\py4j-0.8.2.1-src.zip\py4j\java_gateway.py", line 318, in _get_connection
  File "C:\Spark\spark-1.1.0-bin-hadoop1\python\lib\py4j-0.8.2.1-src.zip\py4j\java_gateway.py", line 325, in _create_connection
  File "C:\Spark\spark-1.1.0-bin-hadoop1\python\lib\py4j-0.8.2.1-src.zip\py4j\java_gateway.py", line 432, in start
py4j.protocol.Py4JNetworkError: An error occurred while trying to connect to the Java server
>>>
Update 2: Content of hs_err_pid4488.log
#
# A fatal error has been detected by the Java Runtime Environment:
#
# EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x0000000000c51e10, pid=4488, tid=9484
#
# JRE version: Java(TM) SE Runtime Environment (8.0_31-b13) (build 1.8.0_31-b13)
# Java VM: Java HotSpot(TM) 64-Bit Server VM (25.31-b07 mixed mode windows-amd64 compressed oops)
# Problematic frame:
# C 0x0000000000c51e10
#
# Failed to write core dump. Minidumps are not enabled by default on client versions of Windows
#
# If you would like to submit a bug report, please visit:
# http://bugreport.java.com/bugreport/crash.jsp
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#
--------------- T H R E A D ---------------
Current thread (0x0000000015bd7000): JavaThread "Thread-2" [_thread_in_native, id=9484, stack(0x0000000017d70000,0x0000000017e70000)]
siginfo: ExceptionCode=0xc0000005, ExceptionInformation=0x0000000000000008 0x0000000000c51e10
Registers:
RAX=0x0000000000000000, RBX=0x0000000000c51e10, RCX=0x0000000000c51e10, RDX=0x0000000017e6de28
RSP=0x0000000017e6dec0, RBP=0x0000000000000000, RSI=0x000000018002db42, RDI=0x0000000015bd5aa0
R8 =0x000000018006eb38, R9 =0x0000000000008000, R10=0x00000000f7508910, R11=0x000000005e6fcaa0
R12=0x0000000015bd5b08, R13=0x0000000015c38b50, R14=0x0000000017e6e030, R15=0x0000000000000004
RIP=0x0000000000c51e10, EFLAGS=0x0000000000010202
Top of Stack: (sp=0x0000000017e6dec0)
0x0000000017e6dec0: 0000000015bd71e8 000000005e858d94
0x0000000017e6ded0: 0000000015c6dd20 0000000000000000
0x0000000017e6dee0: 0000000015c6d720 0000000015c6dd20
0x0000000017e6def0: 0000000015bd71e8 000000005df13cd2
0x0000000017e6df00: 0000000000c51e10 0000000015bd71e8
0x0000000017e6df10: 0000000015c6dd20 0000000015c6d720
0x0000000017e6df20: 0000000000000001 0000000000c51e10
0x0000000017e6df30: 0000000015c6d720 0000000000000002
0x0000000017e6df40: 0000000000000000 0000000000000000
0x0000000017e6df50: 0000000000000000 0000000000000000
0x0000000017e6df60: 0000000000000000 0000000002825a06
0x0000000017e6df70: 0000000015bd7000 0000000017e6e038
0x0000000017e6df80: 00000000135e1e78 0000000000000000
0x0000000017e6df90: 0000000000000258 0000000002825b74
0x0000000017e6dfa0: 00000000135e1e80 0000000017e6e010
0x0000000017e6dfb0: 0000000001000020 0000000015bd7000
Instructions: (pc=0x0000000000c51e10)
0x0000000000c51df0: 80 66 c7 6c ff 7f 00 00 70 e0 d5 6c ff 7f 00 00
0x0000000000c51e00: 58 9a c8 6c ff 7f 00 00 5e 84 5a 42 00 14 00 90
0x0000000000c51e10: 00 00 00 00 17 00 00 00 00 00 00 00 00 00 00 00
0x0000000000c51e20: 1c 00 00 00 00 00 00 00 00 83 c7 00 00 00 00 00
Register to memory mapping:
RAX=0x0000000000000000 is an unknown value
RBX=0x0000000000c51e10 is an unknown value
RCX=0x0000000000c51e10 is an unknown value
RDX=0x0000000017e6de28 is pointing into the stack for thread: 0x0000000015bd7000
RSP=0x0000000017e6dec0 is pointing into the stack for thread: 0x0000000015bd7000
RBP=0x0000000000000000 is an unknown value
RSI=0x000000018002db42 is an unknown value
RDI=0x0000000015bd5aa0 is an unknown value
R8 =0x000000018006eb38 is an unknown value
R9 =0x0000000000008000 is an unknown value
R10=0x00000000f7508910 is an oop
[Ljava.net.InetAddress;
- klass: 'java/net/InetAddress'[]
- length: 4
R11=0x000000005e6fcaa0 is an unknown value
R12=0x0000000015bd5b08 is an unknown value
R13=0x0000000015c38b50 is an unknown value
R14=0x0000000017e6e030 is pointing into the stack for thread: 0x0000000015bd7000
R15=0x0000000000000004 is an unknown value
Stack: [0x0000000017d70000,0x0000000017e70000], sp=0x0000000017e6dec0, free space=1015k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
Java frames: (J=compiled Java code, j=interpreted, Vv=VM code)
j java.net.Inet6AddressImpl.lookupAllHostAddr(Ljava/lang/String;)[Ljava/net/InetAddress;+0
j java.net.InetAddress$2.lookupAllHostAddr(Ljava/lang/String;)[Ljava/net/InetAddress;+4
j java.net.InetAddress.getAddressesFromNameService(Ljava/lang/String;Ljava/net/InetAddress;)[Ljava/net/InetAddress;+51
j java.net.InetAddress.getLocalHost()Ljava/net/InetAddress;+90
j org.apache.spark.util.Utils$.findLocalIpAddress()Ljava/lang/String;+19
j org.apache.spark.util.Utils$.localIpAddress$lzycompute()Ljava/lang/String;+17
j org.apache.spark.util.Utils$.localIpAddress()Ljava/lang/String;+12
j org.apache.spark.util.Utils$.localIpAddressHostname$lzycompute()Ljava/lang/String;+18
j org.apache.spark.util.Utils$.localIpAddressHostname()Ljava/lang/String;+12
j org.apache.spark.util.Utils$$anonfun$localHostName$1.apply()Ljava/lang/String;+3
j org.apache.spark.util.Utils$$anonfun$localHostName$1.apply()Ljava/lang/Object;+1
j scala.Option.getOrElse(Lscala/Function0;)Ljava/lang/Object;+8
j org.apache.spark.util.Utils$.localHostName()Ljava/lang/String;+11
j org.apache.spark.SparkContext.<init>(Lorg/apache/spark/SparkConf;)V+101
j org.apache.spark.api.java.JavaSparkContext.<init>(Lorg/apache/spark/SparkConf;)V+6
v ~StubRoutines::call_stub
j sun.reflect.NativeConstructorAccessorImpl.newInstance0(Ljava/lang/reflect/Constructor;[Ljava/lang/Object;)Ljava/lang/Object;+0
j sun.reflect.NativeConstructorAccessorImpl.newInstance([Ljava/lang/Object;)Ljava/lang/Object;+85
j sun.reflect.DelegatingConstructorAccessorImpl.newInstance([Ljava/lang/Object;)Ljava/lang/Object;+5
j java.lang.reflect.Constructor.newInstance([Ljava/lang/Object;)Ljava/lang/Object;+79
j py4j.reflection.MethodInvoker.invoke(Ljava/lang/Object;[Ljava/lang/Object;)Ljava/lang/Object;+108
j py4j.reflection.ReflectionEngine.invoke(Ljava/lang/Object;Lpy4j/reflection/MethodInvoker;[Ljava/lang/Object;)Ljava/lang/Object;+6
j py4j.Gateway.invoke(Ljava/lang/String;Ljava/util/List;)Lpy4j/ReturnObject;+69
j py4j.commands.ConstructorCommand.invokeConstructor(Ljava/lang/String;Ljava/util/List;)Lpy4j/ReturnObject;+8
j py4j.commands.ConstructorCommand.execute(Ljava/lang/String;Ljava/io/BufferedReader;Ljava/io/BufferedWriter;)V+18
j py4j.GatewayConnection.run()V+77
j java.lang.Thread.run()V+11
v ~StubRoutines::call_stub
--------------- P R O C E S S ---------------
Java Threads: ( => current thread )
=>0x0000000015bd7000 JavaThread "Thread-2" [_thread_in_native, id=9484, stack(0x0000000017d70000,0x0000000017e70000)]
0x0000000015bd4800 JavaThread "Thread-1" [_thread_in_native, id=10208, stack(0x0000000016c70000,0x0000000016d70000)]
0x0000000015b28000 JavaThread "pool-1-thread-1" [_thread_blocked, id=9232, stack(0x0000000016e70000,0x0000000016f70000)]
0x00000000154bb000 JavaThread "Service Thread" daemon [_thread_blocked, id=6636, stack(0x00000000158a0000,0x00000000159a0000)]
0x0000000013c3a800 JavaThread "C1 CompilerThread2" daemon [_thread_blocked, id=9640, stack(0x00000000153a0000,0x00000000154a0000)]
0x0000000013c36000 JavaThread "C2 CompilerThread1" daemon [_thread_blocked, id=10144, stack(0x00000000152a0000,0x00000000153a0000)]
0x0000000013c31800 JavaThread "C2 CompilerThread0" daemon [_thread_blocked, id=9208, stack(0x00000000151a0000,0x00000000152a0000)]
0x0000000013c30000 JavaThread "Attach Listener" daemon [_thread_blocked, id=9692, stack(0x00000000150a0000,0x00000000151a0000)]
0x0000000013c2f000 JavaThread "Signal Dispatcher" daemon [_thread_blocked, id=7508, stack(0x0000000014fa0000,0x00000000150a0000)]
0x0000000002804800 JavaThread "Finalizer" daemon [_thread_blocked, id=8024, stack(0x0000000014ea0000,0x0000000014fa0000)]
0x0000000013bf7800 JavaThread "Reference Handler" daemon [_thread_blocked, id=4584, stack(0x0000000014da0000,0x0000000014ea0000)]
0x0000000002713000 JavaThread "main" [_thread_in_native, id=3784, stack(0x0000000002610000,0x0000000002710000)]
Other Threads:
0x0000000013bf6800 VMThread [stack: 0x0000000014ca0000,0x0000000014da0000] [id=10220]
0x0000000013c8e000 WatcherThread [stack: 0x00000000159a0000,0x0000000015aa0000] [id=9396]
VM state:not at safepoint (normal execution)
VM Mutex/Monitor currently owned by a thread: None
Heap:
PSYoungGen total 153088K, used 34360K [0x00000000f5580000, 0x0000000100000000, 0x0000000100000000)
eden space 131584K, 26% used [0x00000000f5580000,0x00000000f770e128,0x00000000fd600000)
from space 21504K, 0% used [0x00000000feb00000,0x00000000feb00000,0x0000000100000000)
to space 21504K, 0% used [0x00000000fd600000,0x00000000fd600000,0x00000000feb00000)
ParOldGen total 349696K, used 0K [0x00000000e0000000, 0x00000000f5580000, 0x00000000f5580000)
object space 349696K, 0% used [0x00000000e0000000,0x00000000e0000000,0x00000000f5580000)
Metaspace used 8933K, capacity 9204K, committed 9344K, reserved 1056768K
class space used 1355K, capacity 1392K, committed 1408K, reserved 1048576K
Card table byte_map: [0x0000000011bd0000,0x0000000011ce0000] byte_map_base: 0x00000000114d0000
Marking Bits: (ParMarkBitMap*) 0x000000005e76d4f0
Begin Bits: [0x0000000012190000, 0x0000000012990000)
End Bits: [0x0000000012990000, 0x0000000013190000)
Polling page: 0x0000000000e40000
CodeCache: size=245760Kb used=2356Kb max_used=2356Kb free=243403Kb
bounds [0x0000000002810000, 0x0000000002a80000, 0x0000000011810000]
total_blobs=829 nmethods=503 adapters=240
compilation: enabled
Compilation events (10 events):
Event: 3.568 Thread 0x0000000013c3a800 502 3 scala.collection.immutable.HashMap::<init> (9 bytes)
Event: 3.569 Thread 0x0000000013c3a800 nmethod 502 0x0000000002a59390 code [0x0000000002a59580, 0x0000000002a59d28]
Event: 3.569 Thread 0x0000000013c3a800 501 3 scala.collection.immutable.HashMap$HashTrieMap::<init> (20 bytes)
Event: 3.570 Thread 0x0000000013c3a800 nmethod 501 0x0000000002a5a210 code [0x0000000002a5a3a0, 0x0000000002a5a7e8]
Event: 3.570 Thread 0x0000000013c3a800 499 3 scala.collection.immutable.HashMap$HashTrieMap::updated0 (240 bytes)
Event: 3.575 Thread 0x0000000013c3a800 nmethod 499 0x0000000002a5aa90 code [0x0000000002a5adc0, 0x0000000002a5c7f8]
Event: 3.575 Thread 0x0000000013c3a800 500 3 scala.collection.mutable.MapBuilder::$plus$eq (9 bytes)
Event: 3.576 Thread 0x0000000013c3a800 nmethod 500 0x0000000002a5da50 code [0x0000000002a5dbe0, 0x0000000002a5e008]
Event: 3.576 Thread 0x0000000013c3a800 503 3 scala.collection.immutable.HashMap$HashTrieMap::size (5 bytes)
Event: 3.576 Thread 0x0000000013c3a800 nmethod 503 0x0000000002a5e190 code [0x0000000002a5e2e0, 0x0000000002a5e450]
GC Heap History (0 events):
No events
Deoptimization events (1 events):
Event: 2.283 Thread 0x0000000002713000 Uncommon trap: reason=unloaded action=reinterpret pc=0x00000000029efb88 method=sun.misc.URLClassPath.getLoader(I)Lsun/misc/URLClassPath$Loader; @ 113
Internal exceptions (10 events):
Event: 3.554 Thread 0x0000000015bd7000 Exception <a 'java/security/PrivilegedActionException'> (0x00000000f74d7910) thrown at [C:\workspace\8-2-build-windows-amd64-cygwin\jdk8u31\2394\hotspot\src\share\vm\prims\jvm.cpp, line 1312]
Event: 3.555 Thread 0x0000000015bd7000 Exception <a 'java/security/PrivilegedActionException'> (0x00000000f74dbb48) thrown at [C:\workspace\8-2-build-windows-amd64-cygwin\jdk8u31\2394\hotspot\src\share\vm\prims\jvm.cpp, line 1312]
Event: 3.557 Thread 0x0000000015bd7000 Exception <a 'java/security/PrivilegedActionException'> (0x00000000f74dfb48) thrown at [C:\workspace\8-2-build-windows-amd64-cygwin\jdk8u31\2394\hotspot\src\share\vm\prims\jvm.cpp, line 1312]
Event: 3.558 Thread 0x0000000015bd7000 Exception <a 'java/security/PrivilegedActionException'> (0x00000000f74e3130) thrown at [C:\workspace\8-2-build-windows-amd64-cygwin\jdk8u31\2394\hotspot\src\share\vm\prims\jvm.cpp, line 1312]
Event: 3.559 Thread 0x0000000015bd7000 Exception <a 'java/security/PrivilegedActionException'> (0x00000000f74e6080) thrown at [C:\workspace\8-2-build-windows-amd64-cygwin\jdk8u31\2394\hotspot\src\share\vm\prims\jvm.cpp, line 1312]
Event: 3.561 Thread 0x0000000015bd7000 Exception <a 'java/security/PrivilegedActionException'> (0x00000000f74ed1c0) thrown at [C:\workspace\8-2-build-windows-amd64-cygwin\jdk8u31\2394\hotspot\src\share\vm\prims\jvm.cpp, line 1312]
Event: 3.568 Thread 0x0000000015bd7000 Exception <a 'java/security/PrivilegedActionException'> (0x00000000f74f7100) thrown at [C:\workspace\8-2-build-windows-amd64-cygwin\jdk8u31\2394\hotspot\src\share\vm\prims\jvm.cpp, line 1312]
Event: 3.571 Thread 0x0000000015bd7000 Exception <a 'java/security/PrivilegedActionException'> (0x00000000f74fcf08) thrown at [C:\workspace\8-2-build-windows-amd64-cygwin\jdk8u31\2394\hotspot\src\share\vm\prims\jvm.cpp, line 1312]
Event: 3.573 Thread 0x0000000015bd7000 Exception <a 'java/security/PrivilegedActionException'> (0x00000000f7501178) thrown at [C:\workspace\8-2-build-windows-amd64-cygwin\jdk8u31\2394\hotspot\src\share\vm\prims\jvm.cpp, line 1312]
Event: 3.576 Thread 0x0000000015bd7000 Exception <a 'java/security/PrivilegedActionException'> (0x00000000f75051d0) thrown at [C:\workspace\8-2-build-windows-amd64-cygwin\jdk8u31\2394\hotspot\src\share\vm\prims\jvm.cpp, line 1312]
Events (10 events):
Event: 3.561 loading class org/apache/spark/SparkConf$$anonfun$validateSettings$4
Event: 3.561 loading class org/apache/spark/SparkConf$$anonfun$validateSettings$4 done
Event: 3.568 loading class org/apache/spark/SparkConf$$anonfun$validateSettings$5
Event: 3.568 loading class org/apache/spark/SparkConf$$anonfun$validateSettings$5 done
Event: 3.571 loading class org/apache/spark/SparkConf$$anonfun$getBoolean$2
Event: 3.571 loading class org/apache/spark/SparkConf$$anonfun$getBoolean$2 done
Event: 3.573 loading class org/apache/spark/SparkConf$$anonfun$getBoolean$1
Event: 3.573 loading class org/apache/spark/SparkConf$$anonfun$getBoolean$1 done
Event: 3.576 loading class org/apache/spark/util/Utils$$anonfun$localHostName$1
Event: 3.576 loading class org/apache/spark/util/Utils$$anonfun$localHostName$1 done
Dynamic libraries:
0x00007ff644d20000 - 0x00007ff644d54000 C:\Program Files\Java\jre1.8.0_31\bin\java.exe
0x00007fff70190000 - 0x00007fff70336000 C:\windows\SYSTEM32\ntdll.dll
0x00007fff6efc0000 - 0x00007fff6f0fa000 C:\windows\system32\KERNEL32.DLL
0x00007fff6c550000 - 0x00007fff6c65f000 C:\windows\system32\KERNELBASE.dll
0x00007fff6a9c0000 - 0x00007fff6aa48000 C:\windows\system32\apphelp.dll
0x00007fff53810000 - 0x00007fff5385f000 C:\windows\AppPatch\AppPatch64\AcGenral.DLL
0x00007fff6e550000 - 0x00007fff6e5f7000 C:\windows\system32\msvcrt.dll
0x00007fff6c260000 - 0x00007fff6c28b000 C:\windows\SYSTEM32\SspiCli.dll
0x00007fff6e7f0000 - 0x00007fff6e841000 C:\windows\system32\SHLWAPI.dll
0x00007fff6e850000 - 0x00007fff6e9c1000 C:\windows\system32\USER32.dll
0x00007fff6e670000 - 0x00007fff6e7e8000 C:\windows\system32\ole32.dll
0x00007fff6ce10000 - 0x00007fff6e21f000 C:\windows\system32\SHELL32.dll
0x00007fff6ba70000 - 0x00007fff6ba8e000 C:\windows\SYSTEM32\USERENV.dll
0x00007fff6ebe0000 - 0x00007fff6ec85000 C:\windows\system32\ADVAPI32.dll
0x00007fff5ab50000 - 0x00007fff5ab6b000 C:\windows\SYSTEM32\MPR.dll
0x00007fff6edd0000 - 0x00007fff6ef07000 C:\windows\system32\RPCRT4.dll
0x00007fff6cdb0000 - 0x00007fff6ce07000 C:\windows\SYSTEM32\sechost.dll
0x00007fff6cbd0000 - 0x00007fff6cda6000 C:\windows\SYSTEM32\combase.dll
0x00007fff6c990000 - 0x00007fff6cad4000 C:\windows\system32\GDI32.dll
0x00007fff6c480000 - 0x00007fff6c494000 C:\windows\SYSTEM32\profapi.dll
0x00007fff6a880000 - 0x00007fff6a91f000 C:\windows\SYSTEM32\SHCORE.dll
0x00007fff6cb80000 - 0x00007fff6cbb4000 C:\windows\system32\IMM32.DLL
0x00007fff6ec90000 - 0x00007fff6edc9000 C:\windows\system32\MSCTF.dll
0x00007fff68940000 - 0x00007fff68b9a000 C:\windows\WinSxS\amd64_microsoft.windows.common-controls_6595b64144ccf1df_6.0.9600.17031_none_6242a4b3ecbb55a1\COMCTL32.dll
0x00007fff64920000 - 0x00007fff649a9000 C:\Program Files (x86)\AVG\AVG2015\avghooka.dll
0x000000005e7f0000 - 0x000000005e8c2000 C:\Program Files\Java\jre1.8.0_31\bin\msvcr100.dll
0x000000005df90000 - 0x000000005e7ea000 C:\Program Files\Java\jre1.8.0_31\bin\server\jvm.dll
0x00007fff5a420000 - 0x00007fff5a429000 C:\windows\SYSTEM32\WSOCK32.dll
0x00007fff67d90000 - 0x00007fff67daf000 C:\windows\SYSTEM32\WINMM.dll
0x00007fff6ebd0000 - 0x00007fff6ebd7000 C:\windows\system32\PSAPI.DLL
0x00007fff6e9d0000 - 0x00007fff6ea28000 C:\windows\system32\WS2_32.dll
0x00007fff67c90000 - 0x00007fff67cba000 C:\windows\SYSTEM32\WINMMBASE.dll
0x00007fff6cbc0000 - 0x00007fff6cbc9000 C:\windows\system32\NSI.dll
0x00007fff6c940000 - 0x00007fff6c98a000 C:\windows\SYSTEM32\cfgmgr32.dll
0x00007fff6b2c0000 - 0x00007fff6b2e6000 C:\windows\SYSTEM32\DEVOBJ.dll
0x000000005df80000 - 0x000000005df8f000 C:\Program Files\Java\jre1.8.0_31\bin\verify.dll
0x000000005df50000 - 0x000000005df78000 C:\Program Files\Java\jre1.8.0_31\bin\java.dll
0x000000005df30000 - 0x000000005df46000 C:\Program Files\Java\jre1.8.0_31\bin\zip.dll
0x000000005df10000 - 0x000000005df2a000 C:\Program Files\Java\jre1.8.0_31\bin\net.dll
0x0000000180000000 - 0x0000000180078000 C:\windows\system32\ASProxy64.dll
0x00007fff6bcc0000 - 0x00007fff6bd18000 C:\windows\SYSTEM32\MSWSOCK.dll
0x00007fff6b530000 - 0x00007fff6b559000 C:\windows\SYSTEM32\IPHLPAPI.DLL
0x00007fff6e480000 - 0x00007fff6e541000 C:\windows\system32\OLEAUT32.dll
0x00007fff6b520000 - 0x00007fff6b52a000 C:\windows\SYSTEM32\VERSION.dll
0x00007fff6b510000 - 0x00007fff6b51a000 C:\windows\SYSTEM32\WINNSI.DLL
0x00007fff6b290000 - 0x00007fff6b29a000 C:\windows\SYSTEM32\kernel.appcore.dll
0x00007fff6c350000 - 0x00007fff6c35a000 C:\windows\SYSTEM32\CRYPTBASE.dll
0x00007fff6c2f0000 - 0x00007fff6c350000 C:\windows\SYSTEM32\bcryptPrimitives.dll
0x00007fff6b160000 - 0x00007fff6b281000 C:\windows\system32\uxtheme.dll
0x00007fff66990000 - 0x00007fff66a28000 C:\Program Files\Common Files\microsoft shared\ink\tiptsf.dll
0x00007fff6ef10000 - 0x00007fff6efb4000 C:\windows\SYSTEM32\clbcatq.dll
0x00007fff6bd20000 - 0x00007fff6bd3e000 C:\windows\SYSTEM32\CRYPTSP.dll
0x00007fff6b960000 - 0x00007fff6b995000 C:\windows\system32\rsaenh.dll
0x00007fff6bf60000 - 0x00007fff6bf86000 C:\windows\SYSTEM32\bcrypt.dll
0x00007fff6c360000 - 0x00007fff6c3f7000 C:\windows\SYSTEM32\sxs.dll
0x00007fff66370000 - 0x00007fff66384000 C:\windows\SYSTEM32\dhcpcsvc6.DLL
0x00007fff66550000 - 0x00007fff66569000 C:\windows\SYSTEM32\dhcpcsvc.DLL
0x00007fff5ee20000 - 0x00007fff5ee34000 C:\windows\system32\napinsp.dll
0x00007fff5ed10000 - 0x00007fff5ed29000 C:\windows\system32\pnrpnsp.dll
0x00007fff6ab30000 - 0x00007fff6ab4b000 C:\windows\system32\NLAapi.dll
0x00007fff6bad0000 - 0x00007fff6bb73000 C:\windows\SYSTEM32\DNSAPI.dll
0x00007fff5ed00000 - 0x00007fff5ed0c000 C:\windows\System32\winrnr.dll
0x00007fff5ece0000 - 0x00007fff5ecf3000 C:\windows\system32\wshbth.dll
0x000000005efb0000 - 0x000000005efd6000 C:\Program Files\Bonjour\mdnsNSP.dll
0x00007fff64140000 - 0x00007fff64149000 C:\Windows\System32\rasadhlp.dll
0x00007fff66480000 - 0x00007fff664e8000 C:\windows\System32\fwpuclnt.dll
0x00007fff5b8e0000 - 0x00007fff5ba68000 C:\windows\SYSTEM32\dbghelp.dll
VM Arguments:
jvm_args: -XX:MaxPermSize=128m -Xms512m -Xmx512m -Xmx512M
java_command: org.apache.spark.deploy.SparkSubmit pyspark-shell
java_class_path (initial): ;;C:\Spark\spark-1.1.0-bin-hadoop1\bin\..\conf;C:\Spark\spark-1.1.0-bin-hadoop1\bin\..\lib\spark-assembly-1.1.0-hadoop1.0.4.jar;;C:\Spark\spark-1.1.0-bin-hadoop1\bin\..\lib\datanucleus-api-jdo-3.2.1.jar;C:\Spark\spark-1.1.0-bin-hadoop1\bin\..\lib\datanucleus-core-3.2.2.jar;C:\Spark\spark-1.1.0-bin-hadoop1\bin\..\lib\datanucleus-rdbms-3.2.1.jar;
Launcher Type: SUN_STANDARD
Environment Variables:
JAVA_HOME=C:\Program Files\Java\jre1.8.0_31\
_JAVA_OPTIONS=-Xmx512M
CLASSPATH=;;C:\Spark\spark-1.1.0-bin-hadoop1\bin\..\conf;C:\Spark\spark-1.1.0-bin-hadoop1\bin\..\lib\spark-assembly-1.1.0-hadoop1.0.4.jar;;C:\Spark\spark-1.1.0-bin-hadoop1\bin\..\lib\datanucleus-api-jdo-3.2.1.jar;C:\Spark\spark-1.1.0-bin-hadoop1\bin\..\lib\datanucleus-core-3.2.2.jar;C:\Spark\spark-1.1.0-bin-hadoop1\bin\..\lib\datanucleus-rdbms-3.2.1.jar;
PATH=C:\Users\Alexis\AppData\Local\Continuum\Anaconda\lib\site-packages\numpy\core;C:\Program Files\Java\jre1.8.0_31\bin;C:\ProgramData\Oracle\Java\javapath;C:\Program Files (x86)\Intel\iCLS Client\;C:\Program Files\Intel\iCLS Client\;C:\windows\system32;C:\windows;C:\windows\System32\Wbem;C:\windows\System32\WindowsPowerShell\v1.0\;C:\Program Files\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\IPT\;C:\ProgramData\Oracle\Java\javapath;C:\Program Files\Java\jre1.8.0_31\bin;C:\ProgramData\Oracle\Java\javapath;C:\Program Files (x86)\Intel\iCLS Client\;C:\Program Files\Intel\iCLS Client\;C:\windows\system32;C:\windows;C:\windows\System32\Wbem;C:\windows\System32\WindowsPowerShell\v1.0\;C:\Program Files\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\IPT\;C:\ProgramData\Oracle\Java\javapath;C:\Program Files\Java\jre1.8.0_31\\bin;C:\Users\Alexis\AppData\Local\Continuum\Anaconda;C:\Users\Alexis\AppData\Local\Continuum\Anaconda\Scripts;C:\Program Files\Java\jre1.8.0_31\bin\;C:\ProgramData\Oracle\Java\javapath
USERNAME=Alexis
OS=Windows_NT
PROCESSOR_IDENTIFIER=Intel64 Family 6 Model 69 Stepping 1, GenuineIntel
--------------- S Y S T E M ---------------
OS: Windows 8.1 , 64 bit Build 9600
CPU:total 4 (2 cores per cpu, 2 threads per core) family 6 model 69 stepping 1, cmov, cx8, fxsr, mmx, sse, sse2, sse3, ssse3, sse4.1, sse4.2, popcnt, avx, avx2, aes, clmul, erms, lzcnt, ht, tsc, tscinvbit, bmi1, bmi2
Memory: 4k page, physical 8298776k(3960856k free), swap 16687384k(11119384k free)
vm_info: Java HotSpot(TM) 64-Bit Server VM (25.31-b07) for windows-amd64 JRE (1.8.0_31-b13), built on Dec 17 2014 21:00:28 by "java_re" with MS VC++ 10.0 (VS2010)
time: Fri Jan 30 14:03:30 2015
elapsed time: 3 seconds (0d 0h 0m 3s)
3 Answers
#1
1
Your error's stack trace shows the crash happening here:
j java.net.Inet6AddressImpl.lookupAllHostAddr(Ljava/lang/String;)[Ljava/net/InetAddress;+0
I suggest executing this command in a Windows cmd prompt:
setx _JAVA_OPTIONS -Djava.net.preferIPv4Stack=true
More information about this Java option: http://docs.oracle.com/javase/6/docs/technotes/guides/net/ipv6_guide/
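The crash occurs inside the JVM's native local-hostname lookup (`InetAddress.getLocalHost()`, called by Spark's `Utils.findLocalIpAddress`). As a quick diagnostic sketch (not part of Spark, just plain Python), you can perform the equivalent lookup yourself to check whether local name resolution is healthy at all on the machine:

```python
import socket

# The JVM crashed inside InetAddress.getLocalHost(); this performs the
# equivalent lookup from Python to check whether local hostname
# resolution works at all on this machine.
hostname = socket.gethostname()
try:
    name, aliases, addresses = socket.gethostbyname_ex(hostname)
    print("resolved %s -> %s" % (hostname, addresses))
except socket.gaierror as exc:
    print("local hostname %r does not resolve: %s" % (hostname, exc))
```

If this fails or hangs, the problem lies in the OS resolver stack (or a third-party DLL hooked into it) rather than in Spark itself, which would also explain a native-code crash during the same lookup from Java.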
#2
1
I had the same problem. It turned out my PATHEXT variable was messed up: it needs to contain .EXE, but mine contained only .PY. Now it reads .COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC;.PY and this fixed the problem.
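If you want to check a PATHEXT value programmatically, a tiny helper can parse the semicolon-separated format (the function is illustrative, not from any library):

```python
def pathext_contains(ext, pathext):
    """Return True if `ext` (e.g. '.EXE') is listed in a PATHEXT-style,
    semicolon-separated string (case-insensitive, as on Windows)."""
    entries = [e.strip().upper() for e in pathext.split(';') if e.strip()]
    return ext.upper() in entries

# The broken value described above vs. the fixed one:
print(pathext_contains('.EXE', '.PY'))                                # False
print(pathext_contains('.EXE',
      '.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC;.PY'))  # True

# On a real Windows box you would check the live value:
# import os; pathext_contains('.EXE', os.environ.get('PATHEXT', ''))
```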
#3
0
I recommend starting your Spark (or general software) journey under Linux; what you describe suggests your development environment is not configured correctly.
I haven't used Spark under Windows, but I can show you my configuration under Linux, in the hope that it gives you some hints for figuring out your problem.
root@ubuntu2[13:26:05]:~/Desktop#vi company_spark_1.2.0.bashrc
# define the spark edition used in company
export PATH=/root/anaconda/bin:/opt/anaconda/bin:/bin:/usr/bin:/usr/X11R6/bin:/usr/local/bin:/sbin:/usr/local/hadoop/bin:/usr/local/hive/bin:
export PATH=/usr/local/spark-1.2.0-bin-cdh4/bin:$PATH
export PS1="\[\033[01;31m\]\u\[\033[00m\]@\[\033[01;32m\]\h\[\033[00m\][\[\033[01;33m\]\t\[\033[00m\]]:\[\033[01;34m\]\w\[\033[00m\]#"
The most important line is export PATH=/usr/local/spark-1.2.0-bin-cdh4/bin:$PATH, which puts the Spark commands on the PATH.
root@ubuntu2[13:28:25]:~/Desktop#java -version
java version "1.7.0_71"
Java(TM) SE Runtime Environment (build 1.7.0_71-b14)
Java HotSpot(TM) 64-Bit Server VM (build 24.71-b01, mixed mode)
root@ubuntu2[13:28:28]:~/Desktop#$JAVA_HOME
-bash: /usr/local/jdk1.7.0_71/: Is a directory
root@ubuntu2[13:29:16]:~/Desktop#vi java_1.7_path.bashrc
export PATH=/usr/local/jdk1.7.0_71/bin:$PATH
export CLASSPATH="/usr/local/jdk1.7.0_71/lib:."
export JAVA_HOME="/usr/local/jdk1.7.0_71/"
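Whichever OS you use, the key invariant these exports establish is that JAVA_HOME and PATH agree, i.e. that JAVA_HOME's bin directory is actually one of the PATH entries. A small hypothetical checker (function name and parameters are illustrative):

```python
def java_home_on_path(java_home, path, sep=':', slash='/'):
    """Return True if <JAVA_HOME>/bin appears as an entry of PATH.
    Pass sep=';' and slash='\\' to check Windows-style values."""
    bin_dir = java_home.rstrip(slash) + slash + 'bin'
    entries = [p.rstrip(slash) for p in path.split(sep) if p]
    return bin_dir in entries

# The Linux configuration from this answer:
print(java_home_on_path('/usr/local/jdk1.7.0_71/',
                        '/usr/local/jdk1.7.0_71/bin:/usr/local/spark-1.2.0-bin-cdh4/bin'))  # True
```

A mismatch here (e.g. JAVA_HOME pointing at one JRE while PATH resolves `java` from another) is a common cause of the kind of environment confusion described in the question.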