Are you building with -Pspark-1.3?

https://github.com/NFLabs/zeppelin/pull/384
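
If not, it may be worth a try. Keeping your other flags, the command would look something like this (assuming the profile from that PR pins the matching Spark dependencies):

mvn package -DskipTests -Pspark-1.3 -Dhadoop.version=2.4.0 -Pbuild-distr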

--- Original Message ---

From: "Nirav Mehta" <mehtani...@gmail.com>
Sent: March 24, 2015 8:30 AM
To: users@zeppelin.incubator.apache.org
Subject: Re: Zeppelin not holding contexts

mvn package -DskipTests -Dspark.version=1.3.0 -Dhadoop.version=2.4.0 -Pbuild-distr

I assumed that since I was building against Spark and Hadoop, the libraries
would be built into the distribution. I'm setting the following in my
zeppelin-env.sh, where "spark" is the hostname of the Spark master and Hadoop
namenode:

export MASTER="spark://spark:7077"
export SPARK_YARN_JAR=hdfs://spark:9000/spark/spark-assembly-1.3.0-hadoop2.4.0.jar
export HADOOP_CONF_DIR=/etc/hadoop/conf
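
A quick way to confirm the interpreter picks these up is a one-line notebook
paragraph (a minimal check; the expected value assumes the MASTER export above):

%spark
sc.master  // should print spark://spark:7077 once zeppelin-env.sh is sourced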


I have my build up on Docker. Feel free to pull it with "docker pull
namehta/zeppelin". The code for it is at https://github.com/namehta/zeppelin

On Tue, Mar 24, 2015 at 7:34 AM, moon soo Lee <m...@apache.org> wrote:

> from your attached log
>
> java.lang.NoSuchMethodError:
> akka.actor.ActorContext.dispatcher()Lscala/concurrent/ExecutionContextExecutor;
> at
> org.apache.spark.storage.BlockManagerMasterActor.preStart(BlockManagerMasterActor.scala:63)
>
> I guess this kind of error comes up when the dependency library versions
> don't match.
> Could you share your build command? And are you setting SPARK_HOME or
> HADOOP_HOME in your conf/zeppelin-env.sh?
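> (If set, they would be lines like the following in conf/zeppelin-env.sh;
> the paths here are just placeholders:)
>
> export SPARK_HOME=/opt/spark
> export HADOOP_HOME=/opt/hadoop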
>
> Thanks,
> moon
>
> On Mon, Mar 23, 2015 at 10:39 PM Nirav Mehta <mehtani...@gmail.com> wrote:
>
>> Moon,
>>
>> Thanks for the quick response.
>>
>> Not sure how to interpret this error:
>> ERROR [2015-03-23 13:35:53,149] ({pool-1-thread-3}
>> ProcessFunction.java[process]:41) - Internal error processing open
>> com.nflabs.zeppelin.interpreter.InterpreterException:
>> java.lang.IllegalStateException: cannot create children while terminating
>> or terminated
>>
>>  I've attached the log file for further reference.
>>
>> Thanks,
>> Nirav
>>
>> On Sun, Mar 22, 2015 at 11:06 PM, moon soo Lee <m...@apache.org> wrote:
>>
>>> Hi,
>>>
>>> There's a log file that starts with 'zeppelin-interpreter-spark-*.log'
>>> under the logs directory.
>>> Could you check this file and see if any exceptions occurred?
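>>> For example, something like this run from the Zeppelin directory should
>>> surface any stack traces (the exact log path may vary by install):
>>>
>>> grep -i exception logs/zeppelin-interpreter-spark-*.log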
>>>
>>> Thanks,
>>> moon
>>>
>>> On Mon, Mar 23, 2015 at 10:36 AM Nirav Mehta <mehtani...@gmail.com>
>>> wrote:
>>>
>>>> Hi,
>>>>
>>>> I'm trying to run Zeppelin over an existing Spark cluster.
>>>>
>>>> My zeppelin-env.sh has the entry:
>>>> export MASTER=spark://spark:7077
>>>>
>>>> In the first paragraph, I executed bash commands:
>>>> %sh
>>>> hadoop fs -ls /user/root
>>>>
>>>> This returned:
>>>> drwxr-xr-x - root supergroup 0 2015-01-15 09:05 /user/root/input
>>>> -rw-r--r-- 3 root supergroup 29966462 2015-03-23 01:06 /user/root/product.txt
>>>>
>>>> In the next paragraph, I executed the following:
>>>> %spark
>>>> val prodRaw = sc.textFile("hdfs://user/root/product.txt")
>>>> prodRaw.count
>>>>
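>>>> (As an aside: in an hdfs:// URI, the component right after "hdfs://" is
>>>> the namenode host, so "user" above is parsed as a hostname. Assuming the
>>>> namenode runs at spark:9000, the full form would be, e.g.:)
>>>>
>>>> val prodRaw = sc.textFile("hdfs://spark:9000/user/root/product.txt")
>>>>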
>>>> This doesn't return any result, or any errors on the console. Instead,
>>>> I see a new context created every time I execute something:
>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>> explanation.
>>>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>>> ------ Create new SparkContext spark://spark:7077 -------
>>>> ------ Create new SparkContext spark://spark:7077 -------
>>>> ------ Create new SparkContext spark://spark:7077 -------
>>>>
>>>> Is this expected behavior? Seems like Zeppelin should be holding the
>>>> context.
>>>>
>>>> Same issues when executing the sample notebook.
>>>>
>>>> Appreciate any help!
>>>>
>>>
>>
