Re: Zeppelin packaging for Spark 1.3 gives errors

2015-05-26 Thread Roopa Prabhu
The following command completes without issues: mvn clean package -Pspark-1.3 -Dhadoop.version=2.2.0 -Phadoop-2.2 -DskipTests. But when I try to package the distribution using: mvn clean package -P build-distr -DskipTests, it fails. On Wed, May 27, 2015 at 9:26 AM, Roopa Prabhu wrote: > The fo
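A common cause of this failure (a sketch, not confirmed by the truncated thread) is that the build-distr profile is run without the Spark/Hadoop profiles that made the plain package step succeed, so the distribution build resolves different dependency versions. One way to combine them, assuming the same profile names as above:

```shell
# Sketch (assumption): pass the same Spark/Hadoop profiles and Hadoop
# version to the distribution build that worked for the plain package
# step. Maven accepts a comma-separated profile list after -P.
mvn clean package -P build-distr,spark-1.3,hadoop-2.2 \
    -Dhadoop.version=2.2.0 -DskipTests
```

Whether build-distr actually requires these profiles depends on the Zeppelin POM of that era; the truncated reply from Alexander Bezzubov below appears to be asking for the exact command used.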

Re: Zeppelin packaging for Spark 1.3 gives errors

2015-05-26 Thread Roopa Prabhu
The following command completes without issues: mvn clean package -Pspark-1.3 -Dhadoop.version=2.2.0 -Phadoop-2.2 -DskipTests. But when I try to package the distribution using: mvn clean package -P build-distr On Tue, May 26, 2015 at 7:06 AM, Alexander Bezzubov wrote: > Hi Roopa, > > which comma

Re: Is autocompletion available?

2015-05-26 Thread Giovanni Simonini
Thank you! It works, and on my Mac it is Ctrl-, > On 26 May 2015, at 19:02, Kevin Kuo wrote: > > Ctrl-.

Re: Is autocompletion available?

2015-05-26 Thread Kevin Kuo
Try Ctrl-. On Tue, May 26, 2015 at 6:40 PM, Giovanni Simonini <giovanni.simon...@unimore.it> wrote: > Is autocompletion available (for Scala in particular)? > If so, I can't find how to enable it. >

Is autocompletion available?

2015-05-26 Thread Giovanni Simonini
Is autocompletion available (for Scala in particular)? If so, I can't find how to enable it.

Zeppelin with highcharts on AWS.

2015-05-26 Thread Wood, Dean Jr (GE Oil & Gas)
Hi, I’m trying to use AWS with Zeppelin and am having some issues using Highcharts with Zeppelin. I’m not sure whether this is a Zeppelin issue or a Highcharts issue, but maybe you’ll at least be able to clarify that. So I have a Spark cluster on AWS and I have Zeppelin running on another VM. When I

Re: Spark1.2 Exception with Zeppelin

2015-05-26 Thread Ronen Gross
Hi, I added the -Pyarn flag, modified the master property, and updated the zeppelin-env.sh file, but I still get the same error when running %sql: java.lang.ClassCastException: org.apache.hadoop.mapred.JobConf cannot be cast to org.apache.spark.rdd.RDD at org.apache.spark.SparkContext$$anonfun$27.apply
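A ClassCastException between org.apache.hadoop.mapred.JobConf and org.apache.spark.rdd.RDD typically signals that Zeppelin was compiled against a different Spark/Hadoop combination than the cluster it connects to. A rebuild with profiles matching the cluster is one common remedy; the sketch below assumes a Spark 1.2 cluster on YARN with Hadoop 2.4 (the thread does not state the actual Hadoop version, so -Phadoop-2.4 and -Dhadoop.version=2.4.0 are placeholders to adjust):

```shell
# Sketch (assumption): rebuild Zeppelin so its bundled Spark/Hadoop
# client jars match the running cluster, which usually resolves
# JobConf-vs-RDD cast errors caused by mixed binaries.
mvn clean package -Pspark-1.2 -Pyarn -Phadoop-2.4 \
    -Dhadoop.version=2.4.0 -DskipTests
```

After rebuilding, restart the Zeppelin daemon so the interpreter picks up the new jars.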