Spark UI errors when running Spark locally

Time: 2022-07-17 17:04:54
java.lang.NoSuchMethodError: javax.servlet.http.HttpServletRequest.isAsyncStarted()Z
at org.spark_project.jetty.servlets.gzip.GzipHandler.handle(GzipHandler.java:)
at org.spark_project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:)
at org.spark_project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:)
at org.spark_project.jetty.server.Server.handle(Server.java:)
at org.spark_project.jetty.server.HttpChannel.handle(HttpChannel.java:)
at org.spark_project.jetty.server.HttpConnection.onFillable(HttpConnection.java:)
at org.spark_project.jetty.io.AbstractConnection$.run(AbstractConnection.java:)
at org.spark_project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:)
at org.spark_project.jetty.util.thread.QueuedThreadPool$.run(QueuedThreadPool.java:)
at java.lang.Thread.run(Thread.java:)
WARN HttpChannel: Could not send response error : java.lang.NoSuchMethodError: javax.servlet.http.HttpServletRequest.isAsyncStarted()Z

The UI is unreachable because of a JAR conflict: `isAsyncStarted()` was introduced in Servlet 3.0, so this `NoSuchMethodError` means an older servlet-api won on the classpath.
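A quick way to confirm this kind of conflict is a reflective probe: ask whether the class that loads at runtime actually declares the missing method. The class and method names below come from the stack trace; the `probe` helper itself is just an illustrative sketch, not part of Spark.

```java
public class ServletApiProbe {
    // Probe whether a class on the classpath declares a given no-arg method.
    // Returns "OK" when the method exists, "NO_METHOD" when the class loads
    // but the method is missing (the symptom behind NoSuchMethodError),
    // and "NO_CLASS" when the class cannot be found at all.
    static String probe(String className, String methodName) {
        try {
            Class.forName(className).getMethod(methodName);
            return "OK";
        } catch (NoSuchMethodException e) {
            return "NO_METHOD";
        } catch (ClassNotFoundException e) {
            return "NO_CLASS";
        }
    }

    public static void main(String[] args) {
        // "NO_METHOD" here means a pre-3.0 servlet-api is winning the race.
        System.out.println(
            probe("javax.servlet.http.HttpServletRequest", "isAsyncStarted"));
    }
}
```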

<dependency>
    <groupId>javax.servlet</groupId>
    <artifactId>javax.servlet-api</artifactId>
    <version>3.0.1</version>
</dependency>

Placing this dependency first in the pom's `<dependencies>` section solves the problem: when conflicting versions sit at the same depth in the tree, Maven picks the one declared first, so the 3.0.1 API wins over the older servlet JAR pulled in transitively.
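Whether the reorder actually took effect can be verified at runtime by asking the class loader which classpath entry supplies `HttpServletRequest`. A minimal sketch (the `locationOf` helper is made up for illustration):

```java
import java.net.URL;

public class JarLocator {
    // Resolve the classpath entry (JAR or directory) that supplies a class,
    // or null when the class cannot be found at all.
    static String locationOf(String className) {
        String resource = className.replace('.', '/') + ".class";
        URL url = JarLocator.class.getClassLoader().getResource(resource);
        return url == null ? null : url.toString();
    }

    public static void main(String[] args) {
        // After the fix this should point at javax.servlet-api-3.0.1.jar,
        // not at an older servlet JAR dragged in transitively.
        System.out.println(locationOf("javax.servlet.http.HttpServletRequest"));
    }
}
```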

After making the UI reachable, another problem surfaced: clicking the Executors tab returned nothing. At first I suspected a conflict between JAX-RS 1.0 and JAX-RS 2.0.

Like many posts online, my initial suspicion was a JAR conflict involving:

<dependency>
    <groupId>javax.ws.rs</groupId>
    <artifactId>javax.ws.rs-api</artifactId>
    <version>2.0.1</version>
</dependency>

But after walking through the Maven dependency tree, I found that my local dependencies contained only a single 2.0 artifact; there was no JAX-RS 1.0 JAR at all.
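Since JAX-RS 2.0 added `UriBuilder.resolveTemplate(String, Object)`, a reflective check can tell which generation of the API the JVM is actually loading. Another illustrative sketch in the same style as the probe above (the helper name is invented):

```java
public class JaxRsGenerationCheck {
    // Classify the javax.ws.rs UriBuilder on the classpath:
    // "2.x" if it has the resolveTemplate method added in JAX-RS 2.0,
    // "1.x" if the class loads without it, "absent" if it is missing.
    static String generation(String uriBuilderClass) {
        try {
            Class.forName(uriBuilderClass)
                 .getMethod("resolveTemplate", String.class, Object.class);
            return "2.x";
        } catch (NoSuchMethodException e) {
            return "1.x";
        } catch (ClassNotFoundException e) {
            return "absent";
        }
    }

    public static void main(String[] args) {
        System.out.println(generation("javax.ws.rs.core.UriBuilder"));
    }
}
```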

After a long search online, I finally found a blog post that solves exactly this kind of problem:

http://data4q.com/2017/10/30/%E8%A7%A3%E5%86%B3jersey-2-x-jersey-1-x%E5%86%B2%E7%AA%81%E9%97%AE%E9%A2%98-uribuilder%E9%97%AE%E9%A2%98/

The fix is to add the following dependencies:

<dependency>
    <groupId>javax.ws.rs</groupId>
    <artifactId>javax.ws.rs-api</artifactId>
    <version>2.0.1</version>
</dependency>
<dependency>
    <groupId>org.glassfish.jersey.core</groupId>
    <artifactId>jersey-server</artifactId>
    <version>2.23.1</version>
</dependency>
<dependency>
    <groupId>org.glassfish.jersey.containers</groupId>
    <artifactId>jersey-container-servlet-core</artifactId>
    <version>2.23.1</version>
</dependency>

With these in place, the locally launched Spark application can finally open the Executors page normally.