It was built from source. I think binaries are only available for official releases.


-- 
Ruslan Dautkhanov

On Wed, Aug 2, 2017 at 4:41 PM, Benjamin Kim <bbuil...@gmail.com> wrote:

> Did you build Zeppelin or download the binary?
>
> On Wed, Aug 2, 2017 at 3:40 PM Ruslan Dautkhanov <dautkha...@gmail.com>
> wrote:
>
>> We're using an ~April snapshot of Zeppelin, so not sure about 0.7.1.
>>
>> Yes, we have that spark home in zeppelin-env.sh
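>>
>> For reference, the relevant line in our conf/zeppelin-env.sh looks roughly
>> like this (same SPARK2 parcel path as mentioned further down the thread):
>>
>> export SPARK_HOME=/opt/cloudera/parcels/SPARK2/lib/spark2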
>>
>>
>>
>> --
>> Ruslan Dautkhanov
>>
>> On Wed, Aug 2, 2017 at 4:31 PM, Benjamin Kim <bbuil...@gmail.com> wrote:
>>
>>> Does this work with Zeppelin 0.7.1? We get an error when setting SPARK_HOME
>>> in zeppelin-env.sh to what you have below.
>>>
>>> On Wed, Aug 2, 2017 at 3:24 PM Ruslan Dautkhanov <dautkha...@gmail.com>
>>> wrote:
>>>
>>>> You don't have to use spark2-shell and spark2-submit to use Spark 2.
>>>> That can be controlled by setting SPARK_HOME and then using the regular
>>>> spark-submit/spark-shell.
>>>>
>>>> $ which spark-submit
>>>> /usr/bin/spark-submit
>>>> $ which spark-shell
>>>> /usr/bin/spark-shell
>>>>
>>>> $ spark-shell
>>>> Welcome to
>>>>       ____              __
>>>>      / __/__  ___ _____/ /__
>>>>     _\ \/ _ \/ _ `/ __/  '_/
>>>>    /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
>>>>       /_/
>>>>
>>>>
>>>>
>>>> $ export SPARK_HOME=/opt/cloudera/parcels/SPARK2/lib/spark2
>>>>
>>>> $ spark-shell
>>>> Welcome to
>>>>       ____              __
>>>>      / __/__  ___ _____/ /__
>>>>     _\ \/ _ \/ _ `/ __/  '_/
>>>>    /___/ .__/\_,_/_/ /_/\_\   version 2.1.0.cloudera1
>>>>       /_/
>>>>
>>>>
>>>> spark-submit and spark-shell are just shell script wrappers.
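>>>>
>>>> In other words, a simplified sketch of what such a wrapper does (this is
>>>> illustrative only, not the actual Cloudera script, and the default path is
>>>> an assumption):
>>>>
>>>> #!/bin/bash
>>>> # fall back to the Spark 1.6 parcel unless the caller has set SPARK_HOME
>>>> SPARK_HOME=${SPARK_HOME:-/opt/cloudera/parcels/CDH/lib/spark}
>>>> export SPARK_HOME
>>>> exec "$SPARK_HOME"/bin/spark-shell "$@"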
>>>>
>>>>
>>>>
>>>> --
>>>> Ruslan Dautkhanov
>>>>
>>>> On Wed, Aug 2, 2017 at 10:22 AM, Benjamin Kim <bbuil...@gmail.com>
>>>> wrote:
>>>>
>>>>> According to the Zeppelin documentation, Zeppelin 0.7.1 supports Spark
>>>>> 2.1, but I don't know if it supports Spark 2.2, or even Cloudera's build of
>>>>> 2.1. For some reason, Cloudera defaults to Spark 1.6, and so do the calls
>>>>> to spark-shell and spark-submit. To force the use of Spark 2.x, the calls
>>>>> need to be spark2-shell and spark2-submit. I wonder if this is causing the
>>>>> problem. By the way, we are using Java 8 corporate-wide, and there seem to
>>>>> be no problems using Zeppelin.
>>>>>
>>>>> Cheers,
>>>>> Ben
>>>>>
>>>>> On Tue, Aug 1, 2017 at 7:05 PM Ruslan Dautkhanov <dautkha...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> You might need to recompile Zeppelin with Scala 2.11.
>>>>>> Also, Spark 2.2 now requires JDK 8, I believe.
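>>>>>>
>>>>>> If a rebuild is needed, it would look roughly like this (profile names are
>>>>>> taken from the Zeppelin build docs and may differ for your snapshot, so
>>>>>> treat this as a sketch):
>>>>>>
>>>>>> # switch the build to Scala 2.11, then package without tests
>>>>>> ./dev/change_scala_version.sh 2.11
>>>>>> mvn clean package -DskipTests -Pscala-2.11 -Pspark-2.1 -Phadoop-2.6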
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Ruslan Dautkhanov
>>>>>>
>>>>>> On Tue, Aug 1, 2017 at 6:26 PM, Benjamin Kim <bbuil...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Here is more.
>>>>>>>
>>>>>>> org.apache.zeppelin.interpreter.InterpreterException: WARNING: User-defined SPARK_HOME (/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2) overrides detected (/opt/cloudera/parcels/SPARK2/lib/spark2).
>>>>>>> WARNING: Running spark-class from user-defined location.
>>>>>>> Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
>>>>>>> at org.apache.spark.util.Utils$.getDefaultPropertiesFile(Utils.scala:2103)
>>>>>>> at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:124)
>>>>>>> at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:124)
>>>>>>> at scala.Option.getOrElse(Option.scala:120)
>>>>>>> at org.apache.spark.deploy.SparkSubmitArguments.mergeDefaultSparkProperties(SparkSubmitArguments.scala:124)
>>>>>>> at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:110)
>>>>>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
>>>>>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>>>>>
>>>>>>> Cheers,
>>>>>>> Ben
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Aug 1, 2017 at 5:24 PM Jeff Zhang <zjf...@gmail.com> wrote:
>>>>>>>
>>>>>>>>
>>>>>>>> Then it is due to some classpath issue. I am not familiar with
>>>>>>>> CDH; please check whether CDH's Spark includes the Hadoop jars with it.
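>>>>>>>>
>>>>>>>> For example, a quick way to check (paths assumed from earlier in this
>>>>>>>> thread):
>>>>>>>>
>>>>>>>> # look for hadoop-* jars under the Spark 2 parcel
>>>>>>>> ls /opt/cloudera/parcels/SPARK2/lib/spark2/jars | grep -i hadoop
>>>>>>>>
>>>>>>>> # if the Spark build is "Hadoop free", point it at the cluster's Hadoop jars
>>>>>>>> export SPARK_DIST_CLASSPATH=$(hadoop classpath)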
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 2, 2017 at 8:22 AM, Benjamin Kim <bbuil...@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Here is the error that was sent to me.
>>>>>>>>>
>>>>>>>>> org.apache.zeppelin.interpreter.InterpreterException: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
>>>>>>>>> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
>>>>>>>>>
>>>>>>>>> Cheers,
>>>>>>>>> Ben
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Aug 1, 2017 at 5:20 PM Jeff Zhang <zjf...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> By default, 0.7.1 doesn't support Spark 2.2. But you can set
>>>>>>>>>> zeppelin.spark.enableSupportedVersionCheck in the interpreter
>>>>>>>>>> settings to disable the supported version check.
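>>>>>>>>>>
>>>>>>>>>> Concretely, in the Spark interpreter settings in the Zeppelin UI, add a
>>>>>>>>>> property along these lines (value assumed; false disables the check):
>>>>>>>>>>
>>>>>>>>>> zeppelin.spark.enableSupportedVersionCheck = false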
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 2, 2017 at 8:18 AM, Jeff Zhang <zjf...@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> What's the error you see in the log?
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Wed, Aug 2, 2017 at 8:18 AM, Benjamin Kim <bbuil...@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Has anyone configured Zeppelin 0.7.1 for Cloudera's release of
>>>>>>>>>>>> Spark 2.2? I can't get it to work. I downloaded the binary and set
>>>>>>>>>>>> SPARK_HOME to /opt/cloudera/parcels/SPARK2/lib/spark2. I must
>>>>>>>>>>>> be missing something.
>>>>>>>>>>>>
>>>>>>>>>>>> Cheers,
>>>>>>>>>>>> Ben
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>
>>>>
>>
