Re: Re: -Dspark.jars is ignored when running in yarn-client mode, also when adding the jar with sc.addJars

2016-02-29 Thread Rabe, Jens
on it: http://queirozf.com/entries/apache-zeppelin-spark-streaming-and-amazon-kinesis-simple-guide-and-examples On 26.02.2016 05:57, "Rabe, Jens" <jens.r...@iwes.fraunhofer.de> wrote: Hello, I found out how to add the library. Since
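The preview does not show the exact fix the author found, only the guide he followed. A minimal sketch of the usual way to ship an extra jar to a yarn-client Spark interpreter in Zeppelin is to pass it through SPARK_SUBMIT_OPTIONS in conf/zeppelin-env.sh; the jar path below is hypothetical:

  # conf/zeppelin-env.sh -- sketch only, not the exact configuration from this thread
  # --jars distributes the listed jar(s) to the YARN executors and puts them on the driver classpath
  export SPARK_SUBMIT_OPTIONS="--jars /path/to/my-lib.jar"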

Re: -Dspark.jars is ignored when running in yarn-client mode, also when adding the jar with sc.addJars

2016-02-26 Thread Rabe, Jens
ough. From: Rabe, Jens [mailto:jens.r...@iwes.fraunhofer.de] Sent: Friday, 26 February 2016 09:26 To: users@zeppelin.incubator.apache.org Subject: -Dspark.jars is ignored when running in yarn-client mode, also when adding the jar with sc.addJars Hello, I have a library I want to embed in Zeppe

-Dspark.jars is ignored when running in yarn-client mode, also when adding the jar with sc.addJars

2016-02-26 Thread Rabe, Jens
Hello, I have a library I want to embed in Zeppelin. I am using a build from yesterday's Git sources, and Spark 1.6. Here is my conf/zeppelin-env.sh: export JAVA_HOME=/usr/lib/jvm/java-7-oracle export MASTER=yarn-client export HADOOP_CONF_DIR=/etc/hadoop/conf export ZEPPELIN_PORT=10080 export
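The preview flattens the quoted conf/zeppelin-env.sh into one line and is cut off after the final "export", so anything beyond ZEPPELIN_PORT is unknown. Rewritten as a readable sketch using only what the preview shows:

  # conf/zeppelin-env.sh as quoted in the preview (truncated after the last "export")
  export JAVA_HOME=/usr/lib/jvm/java-7-oracle
  export MASTER=yarn-client              # run the Spark interpreter in yarn-client mode
  export HADOOP_CONF_DIR=/etc/hadoop/conf
  export ZEPPELIN_PORT=10080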

Are the dependencies loaded with %dep interpreter picked up by Flink too?

2015-09-22 Thread Rabe, Jens
Hello, I want to evaluate Zeppelin with Flink. I need to add custom functionality I packed into a fat "uber-jar". When I load this jar with the %dep interpreter, will it then be picked up when using Flink, or does this currently only work with Spark?
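For context, loading a local fat jar with %dep looks roughly like the paragraph below; the jar path is hypothetical, and, as the follow-up thread notes, this only affects the Spark interpreter's classpath, not Flink's:

  %dep
  z.reset()
  // add a local fat jar to the Spark interpreter's dependency list
  z.load("/path/to/uber.jar")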

Re: Are the dependencies loaded with %dep interpreter picked up by Flink too?

2015-09-22 Thread Rabe, Jens
Sorry for replying to my own question; I found the answer myself: the dependencies are not loaded. I am going to raise a JIRA about this. From: Rabe, Jens [mailto:jens.r...@iwes.fraunhofer.de] Sent: Tuesday, 22 September 2015 12:10 To: users@zeppelin.incubator.apache.org Subject: Are the dependencies