I guess it's a silly question, but I couldn't find an answer anywhere.
Can I configure logging in Spark using log4j.xml?
The Spark documentation mentions that you can configure logging with log4j.properties; I'd like to use log4j.xml instead, for more advanced log4j capabilities such as the async appender. My job will run in cluster mode over YARN (CDH), scheduled with Oozie. I'm aware that whatever the solution, I'll need to use --files.
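For reference, this is roughly the kind of log4j.xml I have in mind (a minimal sketch assuming log4j 1.x, which is what Spark on CDH bundles; the appender names are just illustrative), with an AsyncAppender wrapped around a console appender, which is the part a plain log4j.properties file cannot express:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
  <!-- plain console appender, same pattern as Spark's default log4j.properties -->
  <appender name="console" class="org.apache.log4j.ConsoleAppender">
    <layout class="org.apache.log4j.PatternLayout">
      <param name="ConversionPattern" value="%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n"/>
    </layout>
  </appender>
  <!-- async wrapper: only configurable via the XML format -->
  <appender name="async" class="org.apache.log4j.AsyncAppender">
    <appender-ref ref="console"/>
  </appender>
  <root>
    <priority value="INFO"/>
    <appender-ref ref="async"/>
  </root>
</log4j:configuration>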
1 Answer
#1
You can set spark.executor.extraJavaOptions to add -Dlog4j.configuration=log4j.xml, and include your log4j.xml file on the classpath of the workers (either by bundling it in the application jar or by adding it to the files included with --files).
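As a sketch only (the jar name, main class, and file path are placeholders, and in cluster mode you may want the same option on spark.driver.extraJavaOptions so the driver logs the same way), the spark-submit call could look like:

spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files /path/to/log4j.xml \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.xml" \
  --class com.example.MyJob \
  my-job.jar

Files passed with --files are placed in each container's working directory, which YARN puts on the classpath, so a bare file name in -Dlog4j.configuration should resolve there.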