Spark 2.2.0 - Odd Hive SQL Warnings

2017-09-01 Thread Don Drake
I'm in the process of migrating a few applications from Spark 2.1.1 to Spark 2.2.0, and so far the transition has been smooth. One odd thing is that when I query a Hive table that I do not own but do have read access to, I get a very long WARNING with a stack trace that basically says I do not have ...

Re: how can I set the log configuration file for spark history server ?

2016-12-08 Thread Don Drake
You can update $SPARK_HOME/conf/spark-env.sh by setting the environment variable SPARK_HISTORY_OPTS. See http://spark.apache.org/docs/latest/monitoring.html#spark-configuration-options for the options you can set (e.g. spark.history.fs.logDirectory). There is log rotation built in (by time, not size) to the ...
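The preview is cut off here, but for reference, here is a minimal sketch of that spark-env.sh setting, assuming an HDFS event-log directory and the time-based cleaner properties (spark.history.fs.cleaner.*) the message alludes to; the path and retention values are placeholders:

    # $SPARK_HOME/conf/spark-env.sh
    # Point the history server at the event-log directory and turn on the
    # built-in time-based cleaner; it prunes old logs by age, not by size.
    export SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=hdfs:///spark-event-logs \
      -Dspark.history.fs.cleaner.enabled=true \
      -Dspark.history.fs.cleaner.interval=1d \
      -Dspark.history.fs.cleaner.maxAge=7d"

Restart the history server after editing spark-env.sh so the new JVM options take effect.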

Fwd: Outer Explode needed

2016-07-25 Thread Don Drake
No response on the Users list, so I thought I would repost here. See below. -Don -- Forwarded message -- From: Don Drake <dondr...@gmail.com> Date: Sun, Jul 24, 2016 at 2:18 PM Subject: Outer Explode needed To: user <u...@spark.apache.org> I have a nested data structure ...
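The preview is truncated, but the subject suggests the ask is an explode that keeps parent rows whose array is empty or null instead of dropping them. A minimal Scala sketch of that behavior using explode_outer, which was added later in Spark 2.2.0; the DataFrame and column names here are illustrative only:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, explode_outer}

    object OuterExplodeSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("outer-explode-sketch")
          .master("local[*]")
          .getOrCreate()
        import spark.implicits._

        // Hypothetical nested data: the second parent row has no children.
        val df = Seq(
          ("order-1", Seq("itemA", "itemB")),
          ("order-2", Seq.empty[String]) // a plain explode would drop this row
        ).toDF("order_id", "items")

        // explode_outer keeps order-2, emitting a null item instead of dropping the row.
        df.select(col("order_id"), explode_outer(col("items")).as("item"))
          .show()

        spark.stop()
      }
    }

On versions before 2.2 the same effect is usually obtained in Spark SQL with LATERAL VIEW OUTER explode(...).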