PhoenixOutputFormat not found when running a Spark job on CDH 5.4 with Phoenix 4.5

Date: 2022-02-13 16:47:31

I managed to configure Phoenix 4.5 on Cloudera CDH 5.4 by recompiling the source code. sqlline.py works well, but there are problems with Spark:

spark-submit --class my.JobRunner \
--master yarn --deploy-mode client \
--jars `ls -dm /myapp/lib/* | tr -d ' \r\n'` \
/myapp/mainjar.jar

The /myapp/lib folder contains the Phoenix core jar, which includes the class org.apache.phoenix.mapreduce.PhoenixOutputFormat. But it seems that the driver/executor cannot see it.
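For reference, the save that triggers the exception is a phoenix-spark saveToPhoenix call along these lines (a minimal sketch; the table name, columns and ZooKeeper quorum are hypothetical placeholders):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.phoenix.spark._  // adds saveToPhoenix to RDDs of tuples

object SaveToPhoenixSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("phoenix-save"))

    // Rows matching the target table's (ID, NAME) columns.
    val rows = sc.parallelize(Seq((1L, "foo"), (2L, "bar")))

    // Per the stack trace below, this goes through saveAsNewAPIHadoopFile,
    // so the Hadoop Configuration must be able to load PhoenixOutputFormat
    // by name.
    rows.saveToPhoenix("MY_TABLE", Seq("ID", "NAME"), zkUrl = Some("zkhost:2181"))
  }
}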

Exception in thread "main" java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.phoenix.mapreduce.PhoenixOutputFormat not found
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2112)
    at org.apache.hadoop.mapreduce.task.JobContextImpl.getOutputFormatClass(JobContextImpl.java:232)
    at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:971)
    at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopFile(PairRDDFunctions.scala:903)
    at org.apache.phoenix.spark.ProductRDDFunctions.saveToPhoenix(ProductRDDFunctions.scala:51)
    at com.mypackage.save(DAOImpl.scala:41)
    at com.mypackage.ProtoStreamingJob.execute(ProtoStreamingJob.scala:58)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.mypackage.SparkApplication.sparkRun(SparkApplication.scala:95)
    at com.mypackage.SparkApplication$delayedInit$body.apply(SparkApplication.scala:112)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at scala.App$$anonfun$main$1.apply(App.scala:71)
    at scala.App$$anonfun$main$1.apply(App.scala:71)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
    at scala.App$class.main(App.scala:71)
    at com.mypackage.SparkApplication.main(SparkApplication.scala:15)
    at com.mypackage.ProtoStreamingJobRunner.main(ProtoStreamingJob.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: Class org.apache.phoenix.mapreduce.PhoenixOutputFormat not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2018)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2110)
    ... 30 more

What can I do to overcome this exception?

1 Solution

#1

Adding phoenix-core to classpath.txt solves the problem. This file is usually located under /etc/spark/conf.
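For example, a minimal sketch of the fix, assuming the default CDH location of classpath.txt; the jar path is a placeholder for wherever your recompiled build put it:

# Append the recompiled phoenix-core jar to Spark's extra classpath file
# (the jar name/path below is a placeholder for your actual build output).
echo "/myapp/lib/phoenix-core-4.5.0.jar" | sudo tee -a /etc/spark/conf/classpath.txt

On CDH, spark-env.sh folds the entries of classpath.txt into the driver and executor classpath, which reaches the class loader that Hadoop's Configuration.getClass uses; --jars in yarn-client mode apparently does not. Passing the jar via --driver-class-path on spark-submit should work for the same reason.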
