How to change Spark's default log4j configuration file

cbwuti44 · published 2022-11-06 in Spark

I am running PySpark in the Spyder IDE, and every run prints the following warnings:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/02/15 17:05:12 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/02/15 17:05:29 WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped

I tried editing the file C:\spark\spark-3.2.1-bin-hadoop2.7\conf\log4j.properties.template to change the log level to "ERROR", but it has no effect.


pgvzfuti #1

1. Rename log4j.properties.template to log4j.properties. Spark ignores the .template file; only a file named log4j.properties is actually loaded.
2. Make sure log4j.properties is on the classpath or in $SPARK_HOME/conf/.
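After renaming, the file's contents would look roughly like this; a minimal sketch for Spark 3.2 (which still uses log4j 1.x), assuming the standard console appender from Spark's bundled defaults with the root level changed from INFO to ERROR:

```properties
# Route everything through the root logger at ERROR level
# to the console appender defined below.
log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

Alternatively, as the startup message itself suggests, the level can be set at runtime with `sc.setLogLevel("ERROR")`, though this only takes effect after the SparkContext has started, so it will not suppress warnings printed during startup.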
