I am trying to follow these instructions to enable history logs for the Spark Oozie action: https://archive.cloudera.com/cdh5/cdh/5/oozie/DG_SparkActionExtension.html
To ensure that your Spark job shows up in the Spark History Server, make sure to specify these three Spark configuration properties, either in spark-opts with --conf (as shown below) or via oozie.service.SparkConfigurationService.spark.configurations:
- spark.yarn.historyServer.address=http://SPH-HOST:18088
- spark.eventLog.dir=hdfs://NN:8020/user/spark/applicationHistory
- spark.eventLog.enabled=true
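Passed through spark-opts, the same three properties would look like the line below (SPH-HOST and NN stand for the history server and NameNode hosts; note the http:// and hdfs:// schemes):

--conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs://NN:8020/user/spark/applicationHistory --conf spark.yarn.historyServer.address=http://SPH-HOST:18088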
The workflow definition looks like this:
<action name="spark-9e7c">
<spark xmlns="uri:oozie:spark-action:0.1">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<master>yarn-cluster</master>
<mode>cluster</mode>
<name>Correlation Engine</name>
<class>Main Class</class>
<jar>hdfs://<MACHINE IP>:8020/USER JAR</jar>
<spark-opts> --conf spark.eventLog.dir=<MACHINE IP>:8020/user/spark/applicationHistory --conf spark.eventLog.enabled=true --conf spark.yarn.historyServer.address=<MACHINE IP>:18088/</spark-opts>
</spark>
<ok to="email-f5d5"/>
<error to="email-a687"/>
</action>
When I test from a shell script, the history logs are written correctly, but with the Oozie action they are not. I have set all three properties.
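For reference, a plain spark-submit carrying the same three --conf flags (roughly what the working shell test amounts to) would look like this; the host names, class, and jar path are placeholders:

spark-submit --master yarn-cluster --class <MAIN CLASS> \
    --conf spark.eventLog.enabled=true \
    --conf spark.eventLog.dir=hdfs://<NN>:8020/user/spark/applicationHistory \
    --conf spark.yarn.historyServer.address=http://<SPH-HOST>:18088 \
    hdfs://<MACHINE IP>:8020/<USER JAR>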
1 Solution
#1
In my experience, I think you have passed the arguments in the wrong place.
Please refer to the XML snippet below:
<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns='uri:oozie:workflow:0.4' name='sparkjob'>
    <start to='spark-process' />
    <action name='spark-process'>
        <spark xmlns='uri:oozie:spark-action:0.1'>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>oozie.service.SparkConfigurationService.spark.configurations</name>
                    <value>spark.eventLog.dir=hdfs://node1.analytics.sub:8020/user/spark/applicationHistory,spark.yarn.historyServer.address=http://node1.analytics.sub:18088,spark.eventLog.enabled=true</value>
                </property>
                <!--property>
                    <name>oozie.hive.defaults</name>
                    <value>/user/ambari-qa/sparkActionPython/hive-config.xml</value>
                </property-->
                <!--property>
                    <name>oozie.use.system.libpath</name>
                    <value>true</value>
                </property-->
                <property>
                    <name>oozie.service.WorkflowAppService.system.libpath</name>
                    <value>/user/oozie/share/lib/lib_20150831190253/spark</value>
                </property>
            </configuration>
            <master>yarn-client</master>
            <!--master>local[4]</master-->
            <mode>client</mode>
            <name>wordcount</name>
            <jar>/usr/hdp/current/spark-client/AnalyticsJar/wordcount.py</jar>
            <spark-opts>--executor-memory 1G --driver-memory 1G --executor-cores 4 --num-executors 2 --jars /usr/hdp/current/spark-client/lib/spark-assembly-1.3.1.2.3.0.0-2557-hadoop2.7.1.2.3.0.0-2557.jar</spark-opts>
        </spark>
        <ok to='end'/>
        <error to='spark-fail'/>
    </action>
    <kill name='spark-fail'>
        <message>Spark job failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name='end' />
</workflow-app>
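For completeness, a workflow like this is typically submitted with the Oozie CLI against a job.properties file that supplies ${jobTracker} and ${nameNode}; the host names and paths below are illustrative placeholders, not values from the original post:

# job.properties
nameNode=hdfs://node1.analytics.sub:8020
jobTracker=node1.analytics.sub:8050
oozie.wf.application.path=${nameNode}/user/oozie/apps/sparkjob
oozie.use.system.libpath=true

# Submit and start the workflow
oozie job -oozie http://node1.analytics.sub:11000/oozie -config job.properties -run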