You might need to recompile Zeppelin with Scala 2.11? Also, Spark 2.2 now requires JDK 8, I believe.
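
If you go the rebuild route, it would look roughly like this (only a sketch, assuming the standard Zeppelin build profiles; the exact profiles available in 0.7.1 may differ):

  # switch the build to Scala 2.11
  ./dev/change_scala_version.sh 2.11
  # build the distribution; run the build itself under JDK 8
  mvn clean package -DskipTests -Pscala-2.11 -Pspark-2.1 -Phadoop-2.6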



-- 
Ruslan Dautkhanov

On Tue, Aug 1, 2017 at 6:26 PM, Benjamin Kim <bbuil...@gmail.com> wrote:

> Here is more.
>
> org.apache.zeppelin.interpreter.InterpreterException: WARNING: User-defined SPARK_HOME (/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2) overrides detected (/opt/cloudera/parcels/SPARK2/lib/spark2).
> WARNING: Running spark-class from user-defined location.
> Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
> at org.apache.spark.util.Utils$.getDefaultPropertiesFile(Utils.scala:2103)
> at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:124)
> at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:124)
> at scala.Option.getOrElse(Option.scala:120)
> at org.apache.spark.deploy.SparkSubmitArguments.mergeDefaultSparkProperties(SparkSubmitArguments.scala:124)
> at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:110)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
> Cheers,
> Ben
>
>
> On Tue, Aug 1, 2017 at 5:24 PM Jeff Zhang <zjf...@gmail.com> wrote:
>
>>
>> Then it is due to some classpath issue. I am not familiar with CDH;
>> please check whether CDH's Spark ships the Hadoop jars with it.
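>>
>> For example, something roughly like this (the paths are a guess based on
>> the parcel location mentioned below):
>>
>>   # do Hadoop jars ship inside the Spark 2 parcel?
>>   ls /opt/cloudera/parcels/SPARK2/lib/spark2/jars | grep -i hadoop
>>   # if it is a "hadoop-provided" build, spark-env.sh should add them via SPARK_DIST_CLASSPATH
>>   grep SPARK_DIST_CLASSPATH /opt/cloudera/parcels/SPARK2/lib/spark2/conf/spark-env.sh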
>>
>>
>> Benjamin Kim <bbuil...@gmail.com> wrote on Wed, Aug 2, 2017 at 8:22 AM:
>>
>>> Here is the error that was sent to me.
>>>
>>> org.apache.zeppelin.interpreter.InterpreterException: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
>>> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
>>>
>>> Cheers,
>>> Ben
>>>
>>>
>>> On Tue, Aug 1, 2017 at 5:20 PM Jeff Zhang <zjf...@gmail.com> wrote:
>>>
>>>>
>>>> By default, 0.7.1 doesn't support Spark 2.2, but you can set
>>>> zeppelin.spark.enableSupportedVersionCheck to false in the interpreter
>>>> settings to disable the supported-version check.
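>>>>
>>>> For example, add this to the Spark interpreter's properties on the
>>>> Interpreter page (a sketch; the exact steps may differ in your setup):
>>>>
>>>>   zeppelin.spark.enableSupportedVersionCheck = false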
>>>>
>>>>
>>>> Jeff Zhang <zjf...@gmail.com> wrote on Wed, Aug 2, 2017 at 8:18 AM:
>>>>
>>>>>
>>>>> What's the error you see in the log?
>>>>>
>>>>>
>>>>> Benjamin Kim <bbuil...@gmail.com> wrote on Wed, Aug 2, 2017 at 8:18 AM:
>>>>>
>>>>>> Has anyone configured Zeppelin 0.7.1 for Cloudera's release of Spark 2.2?
>>>>>> I can't get it to work. I downloaded the binary and set SPARK_HOME to
>>>>>> /opt/cloudera/parcels/SPARK2/lib/spark2. I must be missing something.
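>>>>>>
>>>>>> That is, roughly this in conf/zeppelin-env.sh (if that's the right place for it):
>>>>>>
>>>>>>   export SPARK_HOME=/opt/cloudera/parcels/SPARK2/lib/spark2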
>>>>>>
>>>>>> Cheers,
>>>>>> Ben
>>>>>>
>>>>>
