Looking at https://github.com/apache/spark/pull/1222/files , the following
change may have caused what Stephen described:
+ if (!fileSystem.isDirectory(new Path(logBaseDir))) {
When there is no scheme associated with logBaseDir, a local path should
be assumed.
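Defaulting to the local filesystem when no scheme is given could look roughly like this; a minimal sketch only, where `resolveLogDir` is a hypothetical helper, not Spark's actual code:

```scala
import java.net.URI

// Hypothetical helper: when logBaseDir carries no scheme,
// assume the local filesystem; otherwise leave the URI as given.
def resolveLogDir(logBaseDir: String): String =
  if (new URI(logBaseDir).getScheme == null) "file://" + logBaseDir
  else logBaseDir

println(resolveLogDir("/mnt/spark/work/history")) // file:///mnt/spark/work/history
println(resolveLogDir("hdfs://nn/app/logs"))      // hdfs://nn/app/logs
```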
On Fri, Jan 30, 2015 at 8:37 AM,
Yes, that looks right. In
Understood.
However, the previous default was a local directory. Now the user has to
specify the file:// scheme.
Maybe add a release note to SPARK-2261 ?
Cheers
On Sat, Jan 31, 2015 at 8:40 AM, Sean Owen so...@cloudera.com wrote:
This might have been on purpose, since the goal is to make this
HDFS-friendly, and of course still allow local directories. With no
scheme, a path is ambiguous.
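The ambiguity is easy to see: a scheme-less path parses with no scheme, so either expansion below is plausible (host name and port are illustrative):

```scala
import java.net.URI

// A scheme-less path carries no filesystem information.
val logDir = "/spark/history"
assert(new URI(logDir).getScheme == null)

// Both of these are plausible readings of the same setting:
val asLocal = "file://" + logDir               // local filesystem
val asHdfs  = "hdfs://namenode:8020" + logDir  // HDFS

println(asLocal) // file:///spark/history
println(asHdfs)  // hdfs://namenode:8020/spark/history
```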
On Sat, Jan 31, 2015 at 4:18 PM, Ted Yu yuzhih...@gmail.com wrote:
Hi Krishna/all,
I think I found it, and it wasn't related to Scala-2.11...
I had spark.eventLog.dir=/mnt/spark/work/history, which worked
in Spark 1.2, but now I am running Spark master, and it wants a
Hadoop URI, e.g. file:///mnt/spark/work/history (I believe due to
commit 45645191).
This looks
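With the change in place, the event-log setting needs an explicit scheme; an illustrative spark-defaults.conf fragment (paths and host names are placeholders):

```
spark.eventLog.enabled  true
spark.eventLog.dir      file:///mnt/spark/work/history
# on HDFS instead:
# spark.eventLog.dir    hdfs://namenode:8020/spark/history
```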