You should be able to enable logging for the interpreter group. It's controlled 
by the log4j properties file under conf. There are instructions around, but let 
me know if you can't get it working.
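
For example, something like the following in conf/log4j.properties should raise 
the log level for the interpreters you're using (a rough sketch; the logger 
names below are taken from the packages in your log, so adjust to match your 
build):

    # Raise verbosity for the interpreter framework and the Spark/R interpreters
    # (assumes the stock conf/log4j.properties; appenders are inherited from root)
    log4j.logger.org.apache.zeppelin.interpreter=DEBUG
    log4j.logger.org.apache.zeppelin.spark=DEBUG
    log4j.logger.org.apache.zeppelin.rinterpreter=DEBUG

After restarting zeppelin-server and rerunning the paragraph, the interpreter 
group should also write its own file under logs/, typically named something 
like zeppelin-interpreter-spark-*.log - that's the one to send.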

> On May 11, 2016, at 12:06 AM, Samuel Alexander <sam...@palmtreeinfotech.com> 
> wrote:
> 
> Hi Amos,
> 
> It doesn't contain any other information apart from what I've sent earlier.
> 
> Thanks,
> Sam.
> 
>> On Tue, May 10, 2016 at 2:10 AM, Amos Elberg <amos.elb...@me.com> wrote:
>> Yes, please. 
>> 
>>> On May 9, 2016, at 12:14 PM, Samuel Alexander <sam...@palmtreeinfotech.com> 
>>> wrote:
>>> 
>>> Hi Amos,
>>> 
>>> I am using an Ubuntu Docker container on an Ubuntu host.
>>> 
>>> Yes, it seems Spark is not starting. I don't see the statement 
>>> "------ Create new SparkContext" in my console either.
>>> 
>>> I see a single file in zeppelin/logs, zeppelin--7ed5948b0d52.log, and have 
>>> attached it. It doesn't contain any errors. Do you want me to enable 
>>> additional logging and rerun the paragraph?
>>> 
>>> Thanks,
>>> Sam.
>>> 
>>>> On Mon, May 9, 2016 at 8:42 PM, Amos Elberg <amos.elb...@me.com> wrote:
>>>> Sam - it should run fine in Docker, but if you are running a Linux Docker 
>>>> container on Windows, I have heard the environment can sometimes be flaky.
>>>> 
>>>> What you're seeing sounds like Spark is failing to start.
>>>> 
>>>> You should be able to get Zeppelin to output a log for the entire spark 
>>>> interpreter group. (What you pasted from is the Zeppelin log.) Can you 
>>>> tell us what's in there?
>>>> 
>>>> > On May 9, 2016, at 10:22 AM, Samuel Alexander 
>>>> > <sam...@palmtreeinfotech.com> wrote:
>>>> >
>>>> > Hi,
>>>> >
>>>> > I am running Zeppelin in a Docker container.
>>>> >
>>>> > I built Zeppelin using `mvn clean package -Pspark-1.6 -Ppyspark
>>>> > -Phadoop-2.4 -Pr -DskipTests`. I am using the RRepl interpreter.
>>>> >
>>>> > I just created a very simple paragraph with the below content.
>>>> >
>>>> > <<<
>>>> > %r
>>>> >
>>>> > print(1+1)
>>>> >
>>>> >
>>>> > It goes into the pending state and stays there until I restart
>>>> > zeppelin-server.
>>>> >
>>>> > I am not able to execute any R paragraph. I don't see any log statements
>>>> > either.
>>>> >
>>>> > Here is the log snippet
>>>> >
>>>> > <<<
>>>> >
>>>> > INFO [2016-05-09 14:06:34,311] ({pool-1-thread-2}
>>>> > SchedulerFactory.java[jobStarted]:131) - Job
>>>> > paragraph_1461917041860_145219673 started by scheduler
>>>> > org.apache.zeppelin.interpreter.remote.RemoteInterpretershared_session499376351
>>>> > INFO [2016-05-09 14:06:34,312] ({pool-1-thread-2}
>>>> > Paragraph.java[jobRun]:226) - run paragraph 20160429-080401_1472155327
>>>> > using r org.apache.zeppelin.interpreter.LazyOpenInterpreter@47debb2d
>>>> > INFO [2016-05-09 14:06:34,321] ({pool-1-thread-2}
>>>> > RemoteInterpreterProcess.java[reference]:119) - Run interpreter process
>>>> > [/work/incubator-zeppelin/bin/interpreter.sh, -d,
>>>> > /work/incubator-zeppelin/interpreter/spark, -p, 48832, -l,
>>>> > /work/incubator-zeppelin/local-repo/2B894THHQ]
>>>> > INFO [2016-05-09 14:06:35,516] ({pool-1-thread-2}
>>>> > RemoteInterpreter.java[init]:149) - Create remote interpreter
>>>> > org.apache.zeppelin.rinterpreter.RRepl
>>>> > INFO [2016-05-09 14:06:35,547] ({pool-1-thread-2}
>>>> > RemoteInterpreter.java[pushAngularObjectRegistryToRemote]:416) - Push 
>>>> > local
>>>> > angular object registry from ZeppelinServer to remote interpreter group
>>>> > 2B894THHQ
>>>> > INFO [2016-05-09 14:06:35,578] ({pool-1-thread-2}
>>>> > RemoteInterpreter.java[init]:149) - Create remote interpreter
>>>> > org.apache.zeppelin.spark.SparkInterpreter
>>>> > INFO [2016-05-09 14:06:35,581] ({pool-1-thread-2}
>>>> > RemoteInterpreter.java[init]:149) - Create remote interpreter
>>>> > org.apache.zeppelin.spark.PySparkInterpreter
>>>> > INFO [2016-05-09 14:06:35,584] ({pool-1-thread-2}
>>>> > RemoteInterpreter.java[init]:149) - Create remote interpreter
>>>> > org.apache.zeppelin.rinterpreter.KnitR
>>>> > INFO [2016-05-09 14:06:35,585] ({pool-1-thread-2}
>>>> > RemoteInterpreter.java[init]:149) - Create remote interpreter
>>>> > org.apache.zeppelin.spark.SparkSqlInterpreter
>>>> > INFO [2016-05-09 14:06:35,586] ({pool-1-thread-2}
>>>> > RemoteInterpreter.java[init]:149) - Create remote interpreter
>>>> > org.apache.zeppelin.spark.DepInterpreter
>>>> >
>>>> >
>>>> > Please note that I've set SPARK_HOME in the environment.
>>>> > Also, when I did the same on my Ubuntu instance, it works fine.
>>>> >
>>>> > Is there any issue when the R interpreter (RRepl) runs in a Docker
>>>> > container? Or did I miss something?
>>>> >
>>>> > Thanks,
>>>> > Sam.
>>> 
>>> <zeppelin--7ed5948b0d52.log>
> 
