Currently, Spark SQL doesn't support reading SQL-specific configurations from system properties. For |HiveContext|, however, you can put them in |hive-site.xml|.
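For example, a minimal |hive-site.xml| fragment carrying the two properties from the question might look like the sketch below (the property names are taken from the question; whether they take effect this way depends on your Spark version and deployment, so treat this as an illustration rather than a guaranteed recipe):

```xml
<?xml version="1.0"?>
<configuration>
  <!-- Enable compression for in-memory columnar storage -->
  <property>
    <name>spark.sql.inMemoryColumnarStorage.compressed</name>
    <value>true</value>
  </property>
  <!-- Batch size (number of rows) for columnar caching -->
  <property>
    <name>spark.sql.inMemoryColumnarStorage.batchSize</name>
    <value>1000000</value>
  </property>
</configuration>
```

Alternatively, as you observed, calling |HiveContext.setConf(key, value)| programmatically before caching the table also works.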

On 10/13/14 4:28 PM, Kevin Paul wrote:

Hi all, I tried to set the configurations spark.sql.inMemoryColumnarStorage.compressed and spark.sql.inMemoryColumnarStorage.batchSize in spark.executor.extraJavaOptions, but it does not work. My spark.executor.extraJavaOptions contains "-Dspark.sql.inMemoryColumnarStorage.compressed=true -Dspark.sql.inMemoryColumnarStorage.batchSize=1000000", and the Spark UI does show these settings, but somehow the memory footprint of my cacheTable RDDs is the same as without them. Only when I use HiveContext.setConf does my RDDs' memory usage go down. Is this a bug, or are users required to set these configs through HiveContext's setConf function?

Regards,
Kelvin Paul
