I'm in the process of migrating a few applications from Spark 2.1.1 to
Spark 2.2.0, and so far the transition has been smooth. One odd thing is
that when I query a Hive table that I do not own, but do have read access
to, I get a very long WARNING with a stack trace that basically says I do
not have write permission on the table.
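If the query itself still succeeds and the warning is just noise, one common workaround is to raise the log threshold for the logger that emits it in conf/log4j.properties. The logger name below is an assumption for illustration — copy the exact name printed at the head of the stack trace in your own logs:

```properties
# conf/log4j.properties -- the logger name here is an assumption;
# substitute the one shown at the top of the warning in your own logs
log4j.logger.org.apache.hadoop.hive.metastore=ERROR
```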
You can configure this in $SPARK_HOME/conf/spark-env.sh by setting the
environment variable SPARK_HISTORY_OPTS.
See
http://spark.apache.org/docs/latest/monitoring.html#spark-configuration-options
for the options (e.g. spark.history.fs.logDirectory) you can set.
There is log rotation built in (by time, not size) to the history server;
see the spark.history.fs.cleaner.* options on the same page.
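Concretely, a sketch of what that could look like in spark-env.sh — the log directory and retention age below are placeholder values, not recommendations; adjust them for your cluster:

```shell
# $SPARK_HOME/conf/spark-env.sh -- example values only; the event-log
# directory and 7-day retention age are assumptions for illustration.
export SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=hdfs:///spark-history \
  -Dspark.history.fs.cleaner.enabled=true \
  -Dspark.history.fs.cleaner.maxAge=7d"
```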
Having gotten no response on the users list, I thought I would repost here.
See below.
-Don
-- Forwarded message --
From: Don Drake <dondr...@gmail.com>
Date: Sun, Jul 24, 2016 at 2:18 PM
Subject: Outer Explode needed
To: user <u...@spark.apache.org>
I have a nested data structure.
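For context on what an "outer explode" means: a plain explode drops rows whose array column is empty or null, while the outer variant keeps them with a null element (Spark 2.2 later added this as the explode_outer function). A minimal plain-Python sketch of the row-level semantics, using dicts to stand in for rows:

```python
# Row-level semantics of explode vs. outer explode, sketched on plain
# Python dicts (purely illustrative -- not the Spark API itself).

def explode(rows, key):
    # inner explode: a row with an empty or None array produces no output rows
    return [dict(r, **{key: v}) for r in rows for v in (r[key] or [])]

def explode_outer(rows, key):
    # outer explode: such a row is kept once, with a None element
    out = []
    for r in rows:
        vals = r[key] if r[key] else [None]
        for v in vals:
            out.append(dict(r, **{key: v}))
    return out

rows = [{"id": 1, "tags": ["a", "b"]}, {"id": 2, "tags": []}]
print(explode(rows, "tags"))        # row with id=2 disappears
print(explode_outer(rows, "tags"))  # row with id=2 kept, tags=None
```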