I believe Zeppelin pulls in Spark at build time according to the profile specified, so
I'm not sure switching SPARK_HOME is sufficient. You might want to look for the
Spark jars under Zeppelin.
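One way to check is to look at the jars the build actually bundled; a minimal sketch, assuming a ZEPPELIN_HOME-style layout (the variable name and default path are assumptions, not part of the thread):

```shell
# Hypothetical check of which Spark jars the Zeppelin build bundled.
# ZEPPELIN_HOME is an assumption -- point it at your Zeppelin build/install dir.
ZEPPELIN_HOME="${ZEPPELIN_HOME:-$HOME/incubator-zeppelin}"
if [ -d "$ZEPPELIN_HOME" ]; then
  # The Spark interpreter carries its own dependency jars in the build tree;
  # the version baked into these filenames is what the notebook will use.
  find "$ZEPPELIN_HOME" -name 'spark-core_*.jar' -o -name 'spark-assembly-*.jar'
fi
```

If the jar versions printed here disagree with the install SPARK_HOME points at, that mismatch would explain the behavior described below.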
On Sun, Jul 26, 2015 at 9:50 AM -0700, "Bob Beauchemin" <b...@sqlskills.com> 
wrote:
Thanks, Felix,

The same code on the same box with Spark 1.4.0 or 1.4.1 and spark-shell works 
fine.

I do have four different versions of Spark on the same box (a one-machine setup for
testing, running Spark in local mode) and alternate between them by setting
SPARK_HOME. I haven't run into any other problems doing this, but perhaps it is
setup-related. As a test, I built Zeppelin against the 1.3.1 setup, rebooted, and
pointed SPARK_HOME at 1.4.1; I still get sc.version = 1.3 and 1.3 functionality. So
it's not just switching SPARK_HOME that tells Zeppelin which Spark version and
binaries to use; the choice seems to be baked in at build time. Building against
1.4.1/1.4 on the same machine, with SPARK_HOME set to the 1.4.1/1.4 install,
reproduces the problem.
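For reference, the rebuild-and-switch cycle described above looks roughly like this; the profile name follows the -Pspark-1.3 flag mentioned later in the thread, and the paths and spark.version property are assumptions to be adjusted to the local checkout:

```shell
# Hypothetical rebuild of Zeppelin against Spark 1.4.x, run from the
# Zeppelin source root (profile/property names are assumptions):
mvn clean package -Pspark-1.4 -Dspark.version=1.4.1 -DskipTests

# Then point SPARK_HOME at the matching install before starting Zeppelin,
# so the build-time and runtime Spark versions agree:
export SPARK_HOME=/opt/spark-1.4.1   # assumed install path
```

This is a sketch of the workflow, not a verified command line for any particular Zeppelin release; the build profile, not SPARK_HOME, appears to decide which Spark the interpreter uses.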

Good to hear that it works for you. Guess I’ll do some more digging into the 
source and/or try a different machine setup.

Cheers, and thanks, Bob

From: felixcheun...@hotmail.com [mailto:felixcheun...@hotmail.com]
Sent: Sunday, July 26, 2015 9:23 AM
To: users@zeppelin.incubator.apache.org
Subject: RE: Error rendering Zeppelin sample notebook using Spark 1.4.1


That's odd. I rebuilt and ran against Spark 1.4.1 yesterday and it was fine.

The error is coming from the RDD, so it is not browser-dependent. Could you run
the same code in spark-shell on that box?


On Sat, Jul 25, 2015 at 5:05 PM -0700, "Bob Beauchemin" <b...@sqlskills.com> wrote:

Just rebuilt the same source against a Spark 1.3.1 install with -Pspark-1.3; it
works fine. Then rebuilt against a 1.4.0 install: same error as with 1.4.1.
Cheers, Bob
