There's really nothing of interest about the code itself - any paragraph
introduced with %pyspark exhibits this issue.
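
For reference, even a trivial paragraph hangs - something along these lines
(a minimal repro sketch; the exact code doesn't matter):

```
%pyspark
print(sc.version)
```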

Do you have any examples of Ubuntu 14.04 using a local[*] master and Spark
1.3.0 that *don't* show this problem? If so, perhaps we could compare
configs.
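
For what it's worth, this is roughly what we have in conf/zeppelin-env.sh
(a sketch - the paths are placeholders for our local install, and the py4j
version should match whatever ships under $SPARK_HOME/python/lib):

```shell
# conf/zeppelin-env.sh - paths below are placeholders for our local setup
export SPARK_HOME=/opt/spark-1.3.0
export MASTER="local[*]"
# pyspark needs both the python dir and the bundled py4j zip on PYTHONPATH;
# py4j-0.8.2.1 is the version bundled with Spark 1.3.0
export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH"
```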

Cheers,
/T

On 15 August 2015 at 18:21, Felix Cheung <felixcheun...@hotmail.com> wrote:

> Hmm .. it could help if you could share the code/sample data to reproduce
> this.
>
>
> ------------------------------
> Date: Sat, 15 Aug 2015 08:34:35 +0100
> Subject: Re: pyspark "running" hang?
> From: exception.bad...@gmail.com
> To: users@zeppelin.incubator.apache.org
>
>
> There's nothing of interest in any of these three - anywhere else to look?
>
> We've independently reproduced this on several machines/environments, all
> Ubuntu 14.04.
>
> zeppelin-*.out
> zeppelin-*.log
> zeppelin-interpreter-spark-*.log
>
> /T
>
> On 11 August 2015 at 08:27, Felix Cheung <felixcheun...@hotmail.com>
> wrote:
>
> Could you check under the log directory for log files to see if there is
> any error?
>
> On Mon, Aug 10, 2015 at 1:08 PM -0700, "Exception Badger" <
> exception.bad...@gmail.com> wrote:
>
> Hi all,
>
> We've been using Zeppelin for a little while with CDH clusters and it's
> great.
>
> Recently a few of us have tried getting it working on local dev machines
> (Ubuntu 14.04) without clusters, i.e. a local[*] master and a separately
> downloaded Spark 1.3.0 referenced through spark.home.
>
> What we're seeing is pyspark notes hanging in the "running" state.
>
> Following a few suggestions on the web we've tried setting SPARK_HOME and
> PYTHONPATH explicitly both in the environment and also in the zeppelin
> config script. None of this seems to help.
>
> I've also tried building the branch and master and I see the same
> behaviour with both.
>
> It would be really good to get this working but we're kind of stumped.
>
> Any help appreciated!
> /T
>
