Zeppelin hive + Kerberos

2016-04-27 Thread Margus Roo
Hi

Does Zeppelin (0.5.6) support a Hive + Kerberos connection?

--
Margus (margusja) Roo
http://margus.roo.ee
skype: margusja
+372 51 48 780

Re: Zeppelin hive + Kerberos

2016-04-27 Thread Hyung Sung Shim
If you are using the Spark interpreter, you can find the information at [1].

[1] https://zeppelin.incubator.apache.org/docs/0.6.0-incubating-SNAPSHOT/interpreter/spark.html

2016-04-27 17:09 GMT+09:00 Margus Roo :
> Hi
>
> Does Zeppelin (0.5.6) support Hive + Kerberos connection?
>
> --
> Margus (margusja)
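A minimal sketch of the kind of Kerberos setup the linked Spark interpreter docs cover, assuming Spark on YARN; the principal and keytab path below are placeholders, not values from this thread:

    # conf/zeppelin-env.sh (sketch only; principal and keytab are placeholders)
    export SPARK_HOME=/opt/spark
    export HADOOP_CONF_DIR=/etc/hadoop/conf
    # spark-submit logs in with the keytab so the Zeppelin Spark interpreter
    # can reach Kerberized services (e.g. a Hive metastore) when running on YARN:
    export SPARK_SUBMIT_OPTIONS="--principal zeppelin@EXAMPLE.COM --keytab /etc/security/keytabs/zeppelin.keytab"

The Spark interpreter needs a restart after changing zeppelin-env.sh.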

Failed to find data source: com.databricks.spark.avro.

2016-04-27 Thread Paul Buster
What am I doing wrong here? Single-instance EC2 with Spark and Zeppelin. It works using spark/bin/pyspark, but in a Zeppelin notebook it fails to find com.databricks.spark.avro. Thanks.

From the Zeppelin notebook:

    %pyspark
    sc._jsc.hadoopConfiguration().set("fs.s3n.awsAccessKeyId", "xxx")
    sc._jsc
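When something works in spark/bin/pyspark but not in a Zeppelin notebook, a common cause is that the package was passed to pyspark on the command line but not to the spark-submit that Zeppelin launches. A hedged sketch of the conf-file route, with the artifact version taken from the %dep line later in this thread:

    # conf/zeppelin-env.sh (sketch; version taken from the %dep line later in this thread)
    export SPARK_SUBMIT_OPTIONS="--packages com.databricks:spark-avro_2.10:2.0.1"
    # restart Zeppelin (or just the Spark interpreter) so the option takes effect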

Re: ZeppelinContext not found when SPARK_HOME is set.

2016-04-27 Thread Ydalia Delgado
Hi,

Using --jars instead of --driver-class-path solved my problem. Thanks!

On Mon, Apr 25, 2016 at 8:42 PM, Hyung Sung Shim wrote:
> Hello.
>
> If you want to use an external library, use the '--jars' option, not
> '--driver-class-path', in SPARK_SUBMIT_OPTIONS.
>
> Thanks.
>
> 2016-04-26 9:24 GM
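For reference, a sketch of what the working setting might look like; the jar path is a placeholder, not from this thread:

    # conf/zeppelin-env.sh (jar path is a placeholder)
    # --jars ships the jar to the driver and executors and puts it on both classpaths,
    # while --driver-class-path only changes the driver's classpath.
    export SPARK_SUBMIT_OPTIONS="--jars /path/to/your-library.jar"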

Re: Failed to find data source: com.databricks.spark.avro.

2016-04-27 Thread Paul Buster
This works:

    %dep
    z.reset()
    z.addRepo("Spark Packages Repo").url("http://dl.bintray.com/spark-packages/maven")
    z.load("com.databricks:spark-avro_2.10:2.0.1")

But what did I miss when trying to load the dependency in the conf file(s)?

On Wednesday, April 27, 2016 1:18 PM, Paul Buster wrot