A crude workaround may be to run your spark-shell with a sudo command.
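Alternatively, since the error complains that /tmp/hive is only writable by its owner (rwx------), a common sketch is to widen the permissions on that scratch directory yourself. This is an assumption on my part, not a confirmed fix for SPARK-10066; the local-filesystem commands below apply when you run in local mode, and the commented-out line shows the equivalent for a real HDFS deployment:

```shell
# Create the Hive scratch dir if it is missing, then open it up.
# (Local-mode sketch; adjust the path if your hive.exec.scratchdir differs.)
mkdir -p /tmp/hive
chmod 777 /tmp/hive

# Verify the permission bits are now 777:
stat -c '%a' /tmp/hive

# On a cluster where /tmp/hive lives on HDFS, the analogous command would be:
# hadoop fs -chmod 777 /tmp/hive
```

After that, restarting spark-shell should let it create the sqlContext, assuming the permission check was the only obstacle.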

Hope this helps,
Rick Hillegas


Sourav Mazumder <sourav.mazumde...@gmail.com> wrote on 10/15/2015 09:59:02 AM:

> From: Sourav Mazumder <sourav.mazumde...@gmail.com>
> To: user <user@spark.apache.org>
> Date: 10/15/2015 09:59 AM
> Subject: SQL Context error in 1.5.1 - any work around ?
>
> I keep getting this error whenever I start spark-shell: The
> root scratch dir: /tmp/hive on HDFS should be writable. Current
> permissions are: rwx------.

> I cannot work with this if I need to do anything with sqlContext as
> that does not get created.
>
> I could see that a bug is raised for this https://issues.apache.org/
> jira/browse/SPARK-10066.

> However, is there any workaround for this?

> I didn't face this problem in 1.4.1.

> Regards,
> Sourav
