Excessive parquet messages on Spark 2.1


After trying every approach shown in the question "How to suppress parquet log messages in Spark?", none of them works on Spark 2.1 - except the blunt-instrument approach of disabling all logging below WARN level:

log4j.rootCategory=WARN, console

That is not an acceptable approach (our app writes INFO messages for a reason ..).

Note: the first approach taken was to add

log4j.logger.parquet=ERROR
log4j.logger.org.apache.spark.sql.execution.datasources.parquet=ERROR
log4j.logger.org.apache.spark.sql.execution.datasources.FileScanRDD=ERROR
log4j.logger.org.apache.hadoop.io.compress.CodecPool=ERROR

to log4j.properties. These had no effect. Another approach included in the attempts:

org.apache.parquet.handlers=java.util.logging.ConsoleHandler
java.util.logging.ConsoleHandler.level=SEVERE

with the following JVM options added:

-Dspark.driver.extraJavaOptions="-Djava.util.logging.config.file=/tmp/parquet.logging.properties"
-Dspark.executor.extraJavaOptions="-Djava.util.logging.config.file=/tmp/parquet.logging.properties"

Likewise, no change.

If anyone has found the magic quiet-down-parquet button, please chime in.

Answer: add

log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR

to the log4j.properties file.
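For context, here is a minimal sketch of what the resulting log4j.properties might look like, keeping the application's INFO logging while silencing Parquet. The console appender section is an assumption modeled on Spark's default template; keep your existing appender configuration if you already have one:

```
# Keep the root at INFO so application messages still appear
log4j.rootCategory=INFO, console

# Console appender (layout pattern is an assumption; adjust to taste)
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Both logger names are needed: Parquet logs under org.apache.parquet
# in newer artifacts and under the bare "parquet" package in older ones
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR
```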
