I mean: if I start Spark for our cluster (16 nodes) and run Zeppelin on the
master node with the local[*] setting in the Spark interpreter, I can't see
the Zeppelin application under running or completed applications on port
8080.
Is that because I am setting local[*]?
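(For context: with master set to local[*], the Spark interpreter runs an embedded Spark inside the Zeppelin JVM, so nothing registers with the standalone master and its UI on port 8080 stays empty. A minimal sketch of the interpreter setting that would attach Zeppelin to the cluster instead; the host `spark-master` and port 7077 are placeholders for your actual standalone master URL:)

```properties
# Zeppelin Spark interpreter setting (Interpreter page -> spark group).
# local[*] runs Spark inside the Zeppelin process, so the standalone
# master UI (port 8080) never sees an application. Point the interpreter
# at the cluster's master URL instead:
master=spark://spark-master:7077
```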

On Wed, Mar 22, 2017 at 5:30 PM, Jianfeng (Jeff) Zhang <
jzh...@hortonworks.com> wrote:

> >>>  I can't see the application of Zeppelin running in Spark.
>
> What do you mean? Do you mean you don't see the YARN app? Maybe you didn't
> run in yarn-client mode.
>
>
>
> Best Regards,
> Jeff Zhang
>
>
> From: mingda li <limingda1...@gmail.com>
> Reply-To: "users@zeppelin.apache.org" <users@zeppelin.apache.org>
> Date: Thursday, March 23, 2017 at 4:58 AM
> To: "users@zeppelin.apache.org" <users@zeppelin.apache.org>
> Subject: Re: Does Spark need to run for Zeppelin
>
> If I set Spark's home in Zeppelin's conf, will Zeppelin start Spark by itself?
>
> On Wed, Mar 22, 2017 at 1:45 PM, mingda li <limingda1...@gmail.com> wrote:
>
>> Hi,
>>
>> I recently ran into a strange problem. My Spark cluster is not running,
>> but when I restart Zeppelin and run a Spark program on it, it still works.
>> And even after I start Spark and run Zeppelin, I can't see the Zeppelin
>> application running in Spark.
>> Does anyone have any idea about this?
>>
>> Thanks,
>> Mingda
>>
>
>