Re: Optimizing Spark interpreter startup

2023-05-03 Thread Jeff Zhang
Hi Vladimir, have you compared it with spark-shell? I think it is similar to spark-shell. On Wed, May 3, 2023 at 10:12 PM Vladimir Prus wrote: > Hi, I was profiling the startup time of the Spark Interpreter in our environment, and it looks like a total of 5 seconds is spe

Optimizing Spark interpreter startup

2023-05-03 Thread Vladimir Prus
Hi, I was profiling the startup time of the Spark Interpreter in our environment, and it looks like a total of 5 seconds is spent at this line in SparkScala212Interpreter.scala: sparkILoop.initializeSynchronous(). That line eventually calls the nsc.Global constructor, which spends 5 seconds
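
A minimal sketch for reproducing that cost outside Zeppelin (assumes scala-compiler on the classpath; it times the same nsc.Global construction the profile points at, not Zeppelin's own code path):

    import scala.tools.nsc.{Global, Settings}

    object InitTiming {
      def main(args: Array[String]): Unit = {
        val settings = new Settings()
        settings.usejavacp.value = true // compile against the JVM classpath

        val t0 = System.nanoTime()
        val global = new Global(settings) // the constructor the profile attributes ~5s to
        new global.Run()                  // force full compiler initialization
        println(f"nsc.Global init took ${(System.nanoTime() - t0) / 1e9}%.2f s")
      }
    }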

Running spark interpreter in 0.10.0 docker image fails to delete files

2021-11-02 Thread brian
however, I run into an issue with the Spark Interpreter. In the 0.10.0 docker image, the DockerInterpreter tries to delete all interpreters that are not relevant for Spark, and there I get a permission-denied message. The Dockerfile uses USER 1000 and the Zeppelin files in the image are owned by root, thus

Re: Scala 2.12 version mismatch for Spark Interpreter

2021-10-28 Thread Mich Talebzadeh
Apologies, that should say the docker image should be on 3.1.1.

Re: Scala 2.12 version mismatch for Spark Interpreter

2021-10-28 Thread Mich Talebzadeh
You should go for Spark 3.1.1 for k8s. That is the tried and tested one for Kubernetes in the Spark 3 series, meaning the docker image should be on 3.1.1, and your client, which I think is used to run spark-submit on k8s, should also be on 3.1.1. HTH

Re: Scala 2.12 version mismatch for Spark Interpreter

2021-10-28 Thread Jeff Zhang
Hi Fabrizio, Spark 3.2.0 was recently supported in this PR: https://github.com/apache/zeppelin/pull/4257. The problem you mentioned is solved. Fabrizio Fab wrote on Thu, Oct 28, 2021 at 7:43 PM: > I am aware that Spark 3.2.0 is not officially released, but I am trying to put it to work. > The first thing

Scala 2.12 version mismatch for Spark Interpreter

2021-10-28 Thread Fabrizio Fab
I am aware that Spark 3.2.0 is not officially released, but I am trying to put it to work. The first thing that I noticed is the following: the SparkInterpreter is compiled for Scala 2.12.7, while Spark 3.2 is compiled for Scala 2.12.15. Unfortunately there are some breaking changes between the two

Re: CVE-2019-10095: Apache Zeppelin: bash command injection in spark interpreter

2021-09-28 Thread Michiel Haisma
E List from the CNA. nvd.nist.gov. Many thanks, Michiel. On 2021/09/02 15:56:50, Jeff Zhang wrote: > Description: bash command injection vulnerability in Apache Zeppelin allows an attacker to inject system commands into Spark interpreter settings. This issue affects

CVE-2019-10095: Apache Zeppelin: bash command injection in spark interpreter

2021-09-02 Thread Jeff Zhang
Description: bash command injection vulnerability in Apache Zeppelin allows an attacker to inject system commands into Spark interpreter settings. This issue affects Apache Zeppelin version 0.9.0 and prior versions. Credit: Apache Zeppelin would like to thank HERE Security

Re: Local spark interpreter with extra java options

2021-07-25 Thread Lior Chaga
> logged. > This is how my cmd looks (censored a bit): > /usr/local/spark/bin/spark-submit --class org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer

Re: Local spark interpreter with extra java options

2021-07-25 Thread Jeff Zhang
ocess command that is created is not really identical to the one that is logged. > This is how my cmd looks (censored a bit): > /usr/local/spark/bin/spark-submit --class org.apache.zeppelin.interpreter.remote.R

Re: Local spark interpreter with extra java options

2021-07-25 Thread Lior Chaga
he.zeppelin.interpreter.remote.RemoteInterpreterServer --driver-class-path :/zeppelin/local-repo/spark/*:/zeppelin/interpreter/spark/*:::/zeppelin/interpreter/zeppelin-interpreter-shaded-0.10.0-SNAPSHOT.jar:/zeppelin/interpreter/spark/spark-inte

Re: Local spark interpreter with extra java options

2021-07-11 Thread Lior Chaga
er/spark/*:::/zeppelin/interpreter/zeppelin-interpreter-shaded-0.10.0-SNAPSHOT.jar:/zeppelin/interpreter/spark/spark-interpreter-0.10.0-SNAPSHOT.jar:/etc/hadoop/conf --driver-java-options " -DSERVICENAME=zeppelin_docker -Dfile.encoding=U

Re: Local spark interpreter with extra java options

2021-07-11 Thread Jeff Zhang
r-class-path :/zeppelin/local-repo/spark/*:/zeppelin/interpreter/spark/*:::/zeppelin/interpreter/zeppelin-interpreter-shaded-0.10.0-SNAPSHOT.jar:/zeppelin/interpreter/spark/spark-interpreter-0.10.0-SNAPSHOT.jar:/etc/hadoop/conf --driver-java-options " -DSERVICENAME=zeppelin_

Re: Local spark interpreter with extra java options

2021-07-11 Thread Lior Chaga
org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer --driver-class-path :/zeppelin/local-repo/spark/*:/zeppelin/interpreter/spark/*:::/zeppelin/interpreter/zeppelin-interpreter-shaded-0.10.0-SNAPSHOT.jar:/zeppelin/interpreter/spark/spark-interpreter-0.10.0-SNAPSHOT.jar:/etc/hadoop/conf --driver-java-options " -DSERVICENAME=zeppelin_d

Re: Local spark interpreter with extra java options

2021-07-08 Thread Jeff Zhang
_RUN_COMMAND in interpreter.sh. > I will fix it in my build, but would like confirmation that this is indeed the issue (and I'm not missing anything), so I'd open a pull request. > On Thu, Jul 8, 2021 at 3:05 PM Lior Chaga wrote: >> I'm trying to run zeppelin usin

Re: Local spark interpreter with extra java options

2021-07-08 Thread Lior Chaga
that this is indeed the issue (and I'm not missing anything), so I'd open a pull request. On Thu, Jul 8, 2021 at 3:05 PM Lior Chaga wrote: > I'm trying to run zeppelin using local spark interpreter. > Basically everything works, but if I try to set > `spark.driver.extraJa

Local spark interpreter with extra java options

2021-07-08 Thread Lior Chaga
I'm trying to run zeppelin using local spark interpreter. Basically everything works, but if I try to set `spark.driver.extraJavaOptions` or `spark.executor.extraJavaOptions` containing several arguments, I get an exception. For instance, for providing `-DmyParam=1 -DmyOtherParam=2`, I'd get
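
For reference, the failing settings have this shape (values from the message above; per the follow-ups in this thread, the space inside the value is what the INTERPRETER_RUN_COMMAND assembly in interpreter.sh mishandles):

    spark.driver.extraJavaOptions    -DmyParam=1 -DmyOtherParam=2
    spark.executor.extraJavaOptions  -DmyParam=1 -DmyOtherParam=2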

Re: Running spark interpreter with zeppelin on k8s

2021-06-23 Thread Lior Chaga
wrote at 4:35 PM: > I'm trying to deploy zeppelin 0.10 on k8s, using the following manual build: > mvn clean package -DskipTests -Pspark-scala-2.12 -Pinclude-hadoop -Pspark-3.0 -Phadoop2 -Pbuild-distr -pl zeppelin-interpreter,zeppelin-zengine,spark/interpreter,spar

Re: Running spark interpreter with zeppelin on k8s

2021-06-23 Thread Jeff Zhang
> zeppelin-interpreter,zeppelin-zengine,spark/interpreter,spark/spark-dependencies,zeppelin-web,zeppelin-server,zeppelin-distribion,jdbc,zeppelin-plugins/notebookrepo/filesystem,zeppelin-plugins/launcher/k8s-standard -am > Spark itself is configured to use mesos as resource man

Running spark interpreter with zeppelin on k8s

2021-06-23 Thread Lior Chaga
I'm trying to deploy zeppelin 0.10 on k8s, using the following manual build: mvn clean package -DskipTests -Pspark-scala-2.12 -Pinclude-hadoop -Pspark-3.0 -Phadoop2 -Pbuild-distr -pl zeppelin-interpreter,zeppelin-zengine,spark/interpreter,spark/spark-dependencies,zeppelin-web,zeppelin-server
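
Assembled from the fragments quoted across this thread, the full build command appears to be (module list verbatim, including the sender's "zeppelin-distribion" spelling):

    mvn clean package -DskipTests -Pspark-scala-2.12 -Pinclude-hadoop \
      -Pspark-3.0 -Phadoop2 -Pbuild-distr \
      -pl zeppelin-interpreter,zeppelin-zengine,spark/interpreter,spark/spark-dependencies,zeppelin-web,zeppelin-server,zeppelin-distribion,jdbc,zeppelin-plugins/notebookrepo/filesystem,zeppelin-plugins/launcher/k8s-standard \
      -am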

Re: Custom init for Spark interpreter

2021-05-21 Thread Jeff Zhang
e: > Right, we have hooks for each paragraph execution, but no interpreter process-level hook. Could you create a ticket for that? And welcome to contribute. > Vladimir Prus wrote on Fri, May 21, 2021 at 4:16 PM: >> Hi,

Re: Custom init for Spark interpreter

2021-05-21 Thread Vladimir Prus
level hook. Could you create a ticket for that? And welcome to contribute. > Vladimir Prus wrote on Fri, May 21, 2021 at 4:16 PM: >> Hi, is there a way, when using the Spark interpreter, to always run additional Scala code after startup? E.g. I

Re: Custom init for Spark interpreter

2021-05-21 Thread Jeff Zhang
Right, we have hooks for each paragraph execution, but no interpreter process-level hook. Could you create a ticket for that? And welcome to contribute. Vladimir Prus wrote on Fri, May 21, 2021 at 4:16 PM: > Hi, is there a way, when using the Spark interpreter, to always run additional S

Custom init for Spark interpreter

2021-05-21 Thread Vladimir Prus
Hi, is there a way, when using Spark interpreter, to always run additional Scala code after startup? E.g. I want to automatically execute import com.joom.whatever._ so that users don't have to do it all the time. I see that BaseSparkScalaInterpreter.spark2CreateContext imports a few thing

Re: Zeppelin 0.9 / Kubernetes / Spark interpreter

2021-05-03 Thread Jeff Zhang
It is fixed here: https://github.com/apache/zeppelin/pull/4105. Sylvain Gibier wrote on Sat, May 1, 2021 at 2:37 PM: > Hi, cf. ZEPPELIN-5337. > Switching to isolated mode is not really an option, as it means one spark interpreter per note, per user, which consumes

Re: Zeppelin 0.9 / Kubernetes / Spark interpreter

2021-05-01 Thread Sylvain Gibier
Hi, cf. ZEPPELIN-5337. Switching to isolated mode is not really an option, as it means one spark interpreter per note, per user, which consumes a lot of resources, as there is no mechanism to clean up the k8s pods created afterwards. Scoped mode allows us to share the spark interpreter along

Re: Zeppelin 0.9 / Kubernetes / Spark interpreter

2021-04-30 Thread moon soo Lee
wrote: > Any idea? Is anyone actually using zeppelin 0.9+ on k8s with the spark interpreter scoped per note? > On 2021/04/24 10:46:06, Sylvain Gibier wrote: >> Hi, we have an issue with our current deployment of zeppelin on k8s, and mo

Re: Zeppelin 0.9 / Kubernetes / Spark interpreter

2021-04-27 Thread Sylvain Gibier
Any idea? Is anyone actually using zeppelin 0.9+ on k8s with the spark interpreter scoped per note? On 2021/04/24 10:46:06, Sylvain Gibier wrote: > Hi, we have an issue with our current deployment of zeppelin on k8s, and more precisely with the spark interpreter. > For ref

Zeppelin 0.9 / Kubernetes / Spark interpreter

2021-04-24 Thread Sylvain Gibier
Hi, we have an issue with our current deployment of zeppelin on k8s, and more precisely with the spark interpreter. For reference, the spark context is: scala 2.12.10 / spark 2.4.7. We have a weird behaviour running the spark interpreter in per-note, scoped mode. To reproduce currently, we restart

Re: Spark interpreter Repl injection

2021-03-09 Thread Carlos Diogo
Thanks, I created the issue. Regards, Carlos. On Tue, 9 Mar 2021 at 19:02, moon soo Lee wrote: > The pyspark interpreter has an 'intp' variable exposed in its repl environment (for internal use), and we can resolve a reference to the Spark interpreter from the 'intp' variable. However, s

Re: Spark interpreter Repl injection

2021-03-09 Thread moon soo Lee
The pyspark interpreter has an 'intp' variable exposed in its repl environment (for internal use), and we can resolve a reference to the Spark interpreter from the 'intp' variable. However, the scala repl environment in the Spark Interpreter doesn't expose any variable that is useful for finding the Spark Interpreter

Re: Spark interpreter Repl injection

2021-03-09 Thread Carlos Diogo
Looks good, Moon. Is there a specific reason why you needed the pyspark interpreter to access the spark interpreter? Could the spark interpreter not programmatically access itself (and the same for the pyspark interpreter)? Would the issue be to expose the z.interpret() method? Best regards, Carlos

Re: Spark interpreter Repl injection

2021-03-09 Thread moon soo Lee
ly most of the time. > Carlos > On Sat, 6 Mar 2021 at 11:36, Jeff Zhang wrote: >> Why not copy the scala code into zeppelin and run the notebook directly? >> Carlos Diogo wrote on Sat, Mar 6, 2021 at 3:51 PM:

Re: Spark interpreter Repl injection

2021-03-08 Thread Carlos Diogo
. Mar 2021 at 11:36, Jeff Zhang wrote: > Why not copy the scala code into zeppelin and run the notebook directly? > Carlos Diogo wrote on Sat, Mar 6, 2021 at 3:51 PM: >> Dear all, I have been trying to find a way to inj

Re: Spark interpreter Repl injection

2021-03-08 Thread moon soo Lee
> Dear all, I have been trying to find a way to inject scala code (from a String) into the spark interpreter. > In pyspark it is easy with the exec function. > It should not be very difficult to access from the Note scala repl interpreter, but I could no

Re: Spark interpreter Repl injection

2021-03-06 Thread Carlos Diogo
ve been trying to find a way to inject scala code (from a String) into the spark interpreter. > In pyspark it is easy with the exec function. > It should not be very difficult to access from the Note scala repl interpreter, but I could not find a way. I was even able to create

Re: Spark interpreter Repl injection

2021-03-06 Thread Jeff Zhang
Why not copy the scala code into zeppelin and run the notebook directly? Carlos Diogo wrote on Sat, Mar 6, 2021 at 3:51 PM: > Dear all, I have been trying to find a way to inject scala code (from a String) into the spark interpreter. > In pyspark it is easy with the exec function. > It sho

Spark interpreter Repl injection

2021-03-05 Thread Carlos Diogo
Dear all, I have been trying to find a way to inject scala code (from a String) into the spark interpreter. In pyspark it is easy with the exec function. It should not be very difficult to access from the Note scala repl interpreter, but I could not find a way. I was even able to create a new repl
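
For contrast, the Python-side pattern referred to here is plain exec(), which compiles and runs a string in the current pyspark repl namespace (a minimal sketch; the code string is illustrative):

    %pyspark
    # exec() runs the string in the current namespace, so 'df' stays
    # visible afterwards, just like hand-typed paragraph code.
    code = "df = spark.range(5)\nprint(df.count())"
    exec(code)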

spark.jars.packages not working in spark interpreter tutorial

2020-07-02 Thread David Boyd
All: Trying to run the Spark Interpreter tutorial note. The spark.conf paragraph which specifies spark.jars.packages runs clean, but the next paragraph, which tries to use the avro jar, fails with a class-not-found for org.apache.spark.sql.avro.AvroFileFormat.DefaultSource. Spark is set
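
The setup described reduces to a generic-config paragraph followed by a paragraph that uses the package (a sketch; the avro coordinates are illustrative, and %spark.conf must run before the interpreter process has started):

    %spark.conf
    spark.jars.packages  org.apache.spark:spark-avro_2.11:2.4.0

    %spark
    // fails with the class-not-found above if the package never
    // reached the interpreter's classpath
    val df = spark.read.format("avro").load("/tmp/example.avro")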

Re: Error starting spark interpreter with 0.9.0

2020-06-30 Thread Jeff Zhang
Which spark version do you use? And could you check the spark interpreter log file? It is in ZEPPELIN_HOME/logs/zeppelin-interpreter-spark-*.log. David Boyd wrote on Tue, Jun 30, 2020 at 11:11 PM: > All: > Just trying to get 0.9.0 to work and running into all sorts of issues. > Previous

Error starting spark interpreter with 0.9.0

2020-06-30 Thread David Boyd
PSHOT-shaded.jar /opt/zeppelin/zeppelin-current/interpreter/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT.jar:/opt/zeppelin/zeppelin-current/interpreter/spark/spark-interpreter-0.9.0-SNAPSHOT.jar:/opt/hadoop/hadoop-current/etc/hadoop" --driver-java-options " -Dfile.encoding=UTF-8 -Dlog4j.conf

Re: Question: Adding Dependencies in with the Spark Interpreter with Kubernetes

2020-05-14 Thread Sebastian Albrecht
On Wed, May 13, 2020 at 9:59 PM, Hetul Patel wrote: > Are dependency downloads supported with zeppelin and spark over > kubernetes? Or am I required to add the dependency jars directly to my > spark docker image and add them to the classpath? > Hi Hetul, I don't use docker, but to connect

Question: Adding Dependencies in with the Spark Interpreter with Kubernetes

2020-05-13 Thread Hetul Patel
Hi all, I've been trying the 0.9.0-preview1 build on minikube with the spark interpreter. It's working, but I'm unable to work with any dependencies that I've added to the spark interpreter. (Note: I had to add `SPARK_SUBMIT_OPTIONS=--conf spark.jars.ivy=/tmp/.ivy` and `SPARK_USER=root
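
For reference, the two workarounds quoted in the note, as they would appear in zeppelin-env.sh (placement is an assumption; the values are verbatim from the message):

    export SPARK_SUBMIT_OPTIONS="--conf spark.jars.ivy=/tmp/.ivy"
    export SPARK_USER=root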

Re: Zeppelin 0.8.2 New Spark Interpreter

2019-11-08 Thread Mark Bidewell
ing UI. > Jeff Zhang wrote on Fri, Oct 11, 2019 at 9:54 AM: >> Like I said above, try to set them via spark.jars and spark.jars.packages.

Re: Zeppelin 0.8.2 New Spark Interpreter

2019-11-08 Thread Anton Kulaga
ELIN-4374 that users can still set dependencies in the interpreter setting UI. > Jeff Zhang wrote on Fri, Oct 11, 2019 at 9:54 AM: >> Like I said above, try to set them via spark.jars and

Re: Zeppelin 0.8.2 New Spark Interpreter

2019-10-11 Thread Jeff Zhang
> Like I said above, try to set them via spark.jars and spark.jars.packages. > Don't set them here: [image: image.png]

Re: Zeppelin 0.8.2 New Spark Interpreter

2019-10-11 Thread Mark Bidewell
't set them here: [image: image.png] > Mark Bidewell wrote on Fri, Oct 11, 2019 at 9:35 AM: >> I was specifying them in the interpreter settings in the UI.

Re: Zeppelin 0.8.2 New Spark Interpreter

2019-10-10 Thread Jeff Zhang
he interpreter settings in the UI. > On Thu, Oct 10, 2019 at 9:30 PM Jeff Zhang wrote: >> How do you specify your spark interpreter dependencies? You need to specify it via the property spark.jars or spark.jars.packages for non-local mod

Re: Zeppelin 0.8.2 New Spark Interpreter

2019-10-10 Thread Jeff Zhang
> How do you specify your spark interpreter dependencies? You need to specify it via the property spark.jars or spark.jars.packages for non-local mode. > Mark Bidewell wrote on Fri, Oct 11, 2019 at 3:45 AM: >> I am running some initial tests of Zeppelin 0.8.

Re: Zeppelin 0.8.2 New Spark Interpreter

2019-10-10 Thread Mark Bidewell
I was specifying them in the interpreter settings in the UI. On Thu, Oct 10, 2019 at 9:30 PM Jeff Zhang wrote: > How do you specify your spark interpreter dependencies? You need to specify it via the property spark.jars or spark.jars.packages for non-local mode. > Mark Bidewe

Re: Zeppelin 0.8.2 New Spark Interpreter

2019-10-10 Thread Jeff Zhang
How do you specify your spark interpreter dependencies? You need to specify it via the property spark.jars or spark.jars.packages for non-local mode. Mark Bidewell wrote on Fri, Oct 11, 2019 at 3:45 AM: > I am running some initial tests of Zeppelin 0.8.2 and I am seeing some weird issues with depend
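
Concretely, the two properties Jeff names go into the spark interpreter setting rather than the UI dependency list (path and coordinates illustrative):

    spark.jars           /path/to/my-dependency.jar
    spark.jars.packages  com.example:my-library:1.0.0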

Zeppelin 0.8.2 New Spark Interpreter

2019-10-10 Thread Mark Bidewell
I am running some initial tests of Zeppelin 0.8.2 and I am seeing some weird issues with dependencies. When I use the old interpreter, everything works as expected. When I use the new interpreter, classes in my interpreter dependencies cannot be resolved when connecting to a master that is not

Re: spark interpreter "master" parameter always resets to yarn-client after restart zeppelin

2019-08-19 Thread Jeff Zhang
ark integration and the > "master" parameter in the spark interpreter configuration always resets its value from "yarn" to "yarn-client" after a zeppelin service reboot. > How can I stop that? > Thank you

Re: python virtual environment on spark interpreter

2019-08-19 Thread Jeff Zhang
have a zeppelin installation connected to a Spark cluster. I set up > Zeppelin to submit jobs in yarn-cluster mode, and impersonation is also enabled. Now I would like to be able to use a python virtual environment instead of the system one. > Is there a way I could specify the python parame

spark interpreter "master" parameter always resets to yarn-client after restart zeppelin

2019-08-19 Thread Manuel Sopena Ballesteros
Dear Zeppelin user community, I have a zeppelin installation with spark integration, and the "master" parameter in the spark interpreter configuration always resets its value from "yarn" to "yarn-client" after a zeppelin service reboot. How can I stop t

python virtual environment on spark interpreter

2019-08-19 Thread Manuel Sopena Ballesteros
specify the python parameter in the spark interpreter settings so it can point to a specific folder under the user home (e.g. /home/{user_home}/python_virt_env/python) instead of a system one? If not, how should I achieve what I want? Thank you, Manuel
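
A sketch of the relevant interpreter property (the property name is the standard Spark one; the per-user templating of the path is the open question in this thread):

    spark.pyspark.python  /home/{user_home}/python_virt_env/python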

Spark Interpreter failing to start: NumberFormat exception

2019-04-18 Thread Krentz
All - I am having an issue with a build I forked from master that is compiled as 0.9. We have another build running 0.8 that works just fine. The Spark interpreter is failing to start, and giving a NumberFormatException. It looks like when Zeppelin runs interpreter.sh

Re: Multi-line scripts in spark interpreter

2018-07-12 Thread Sanjay Dasgupta
behavior is coming from the new spark interpreter. Jeff opened ZEPPELIN-3587 to fix it. In the meantime you can use the old spark interpreter (set zeppelin.spark.useNew to false) to get around this. Hopefully you aren't dependent on the new spark interpreter.

Re: Multi-line scripts in spark interpreter

2018-07-12 Thread Paul Brenner
This behavior is coming from the new spark interpreter. Jeff opened ZEPPELIN-3587 to fix it. In the meantime you can use the old spark interpreter (set zeppelin.spark.useNew to false) to get around this. Hopefully you aren't dependent on the new spark interpreter.

Multi-line scripts in spark interpreter

2018-07-12 Thread Christopher Piggott
Hi, this used to work in 0.7.3: val a = new Something() .someMethod() .someMethod2() but it doesn't in 0.8.0; it says the .someMethod(), etc. are an illegal start of expression. Some of these setups I have are fluently expressed but would be unmanageable on a single long line.
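
A self-contained version of the failing shape (StringBuilder stands in for the message's Something so the sketch compiles anywhere; the leading-dot continuation lines are what the new interpreter rejects):

    // Works in 0.7.3; in 0.8.0's new interpreter each line starting
    // with '.' is reported as an illegal start of expression.
    val a = new StringBuilder()
      .append("some")
      .append("thing")
    println(a.toString)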

Re: illegal start of definition with new spark interpreter

2018-07-05 Thread Jeff Zhang
This is due to different behavior of the new spark interpreter; I have created ZEPPELIN-3587 and will fix it asap. Paul Brenner wrote on Fri, Jul 6, 2018 at 1:11 AM: > Hi all, when I try switching over to the new spark interpreter it seems there is a fundamental difference in how code is interp

illegal start of definition with new spark interpreter

2018-07-05 Thread Paul Brenner
Hi all, when I try switching over to the new spark interpreter it seems there is a fundamental difference in how code is interpreted? Maybe that shouldn't be a surprise, but I'm wondering if other people have experienced it and if there is any workaround or hope for a change in the future

Re: Where Spark home is pick up in the new Spark interpreter

2018-06-06 Thread Jeff Zhang
It is picked up from the interpreter setting. You can define SPARK_HOME in spark's interpreter setting page. Anthony Corbacho wrote on Thu, Jun 7, 2018 at 11:50 AM: > Hi, I am a bit confused about where spark home is picked up in the new Spark interpreter in the 0.8 branch. > Regards, Anthony
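
Per Jeff's pointer, in the interpreter setting page this is just another property (path illustrative):

    SPARK_HOME  /usr/lib/spark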

Where Spark home is pick up in the new Spark interpreter

2018-06-06 Thread Anthony Corbacho
Hi, I am a bit confused about where spark home is picked up in the new Spark interpreter in the 0.8 branch. Regards, Anthony

Spark Interpreter Tutorial in Apache Zeppelin

2018-05-30 Thread Jeff Zhang
Hi folks, I often see users asking on the mailing list how to use the spark interpreter, especially how to configure it. So I wrote this article about how to use the spark interpreter in Apache Zeppelin (it is based on Zeppelin 0.8.0). It is not complete yet; I will continue to add more

Re: Spark Interpreter error: 'not found: type'

2018-03-19 Thread Jeff Zhang
in classpath and I didn't have any issues with import. > Can you give me an idea of how you are loading this datavec-api jar for zeppelin or spark-submit to access? > Best, Karan

Re: Spark Interpreter error: 'not found: type'

2018-03-19 Thread Marcus
api for > zeppelin or spark-submit to access? > Best, Karan > From: Marcus <marcus.hun...@gmail.com> Sent: Saturday, March 10, 2018 10:43:25 AM To: users@zeppelin.apache.org Subject: Spark Interpreter error: 'not found: type'

Re: Spark Interpreter error: 'not found: type'

2018-03-14 Thread Karan Sewani
Best Karan From: Marcus <marcus.hun...@gmail.com> Sent: Saturday, March 10, 2018 10:43:25 AM To: users@zeppelin.apache.org Subject: Spark Interpreter error: 'not found: type' Hi, I am new to Zeppelin and encountered a strange behavior. When copying my r

Spark Interpreter error: 'not found: type'

2018-03-09 Thread Marcus
Hi, I am new to Zeppelin and encountered a strange behavior. When copying my running scala code to a notebook, I got errors from the spark interpreter saying it could not find some types. Strangely, the code worked when I used the FQCN instead of the simple name. But since I want the create

Re: Cannot define UDAF in %spark interpreter

2018-02-27 Thread Vannson, Raphael
(truncated DataFrame output: |some| secret | thing | here | 40|) From: Paul Brenner <pbren...@placeiq.com> Date: Tuesday, February 27, 2018 at 3:31 PM To: Raphael Vannson <raphael.vann...@thinkbiganalytics.com>, "users@zeppelin.apache.org" <

Cannot define UDAF in %spark interpreter

2018-02-27 Thread Paul Brenner
ning a UDAF, using the same code in spark-shell in > :paste mode works fine. > > Environment: > - Amazon EMR > - Apache Zeppelin Version 0.7.3 > - Spark version 2.2.1 > - Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_161) > > 1) Is there a way to co

Jar dependencies are not reloaded when Spark interpreter is restarted?

2018-02-22 Thread Partridge, Lucas (GE Aviation)
the file system path of the jar; it’s not even prefixed with file:///. From: Jhon Anderson Cardenas Diaz [mailto:jhonderson2...@gmail.com] Sent: 22 February 2018 12:18 To: users@zeppelin.apache.org Subject: EXT: Re: Jar dependencies are not reloaded when Spark interpreter is restarted? When you say

Re: Jar dependencies are not reloaded when Spark interpreter is restarted?

2018-02-22 Thread Jhon Anderson Cardenas Diaz
ridge, Lucas (GE Aviation)" <lucas.partri...@ge.com> wrote: > I'm using Zeppelin 0.7.3 against a local standalone Spark 'cluster'. I've added a Scala jar dependency to my Spark interpreter using Zeppelin's UI. I thought if I changed my Scala code and updated the jar (using

Jar dependencies are not reloaded when Spark interpreter is restarted?

2018-02-22 Thread Partridge, Lucas (GE Aviation)
I'm using Zeppelin 0.7.3 against a local standalone Spark 'cluster'. I've added a Scala jar dependency to my Spark interpreter using Zeppelin's UI. I thought if I changed my Scala code and updated the jar (using sbt outside of Zeppelin) then all I'd have to do is restart the interpreter

Re: Custom Spark Interpreter?

2018-01-25 Thread Nick Moeckel
I am beginning work on extending the SparkInterpreter class right now; I would be interested to hear more details about why this idea is not straightforward. Thanks, Nick

Re: Custom Spark Interpreter?

2018-01-25 Thread ankit jain
ail.com> wrote on Thu, Jan 25, 2018 at 3:03 PM: > That method is just reading it from a config defined in interpreter settings called "uiWebUrl", which makes it configurable but still static. > On Wed, Jan 24, 2018 at 10:58 PM, Jeff Zhang <zjf...@gmail.com> wrote:

Re: Custom Spark Interpreter?

2018-01-24 Thread Jeff Zhang
rable but still static. > On Wed, Jan 24, 2018 at 10:58 PM, Jeff Zhang <zjf...@gmail.com> wrote: >> IIRC, the spark interpreter can get the web ui url at runtime instead of a static url. >> https://github.com/apache/zeppelin/blob/master

Re: Custom Spark Interpreter?

2018-01-24 Thread ankit jain
That method is just reading it from a config defined in interpreter settings called "uiWebUrl", which makes it configurable but still static. On Wed, Jan 24, 2018 at 10:58 PM, Jeff Zhang <zjf...@gmail.com> wrote: > IIRC, the spark interpreter can get the web ui url at runtime i

Re: How does user user jar conflict resolved in spark interpreter?

2017-11-15 Thread Jeff Zhang
sers! > I have a question about the dependencies users are using while running notebooks using the spark interpreter. > Imagine I have configured the spark interpreter. > Two users write their spark notebooks. The first user does > z.load("com:best-it-company:0.1")

How does user user jar conflict resolved in spark interpreter?

2017-11-15 Thread Serega Sheypak
Hi zeppelin users! I have a question about the dependencies users are using while running notebooks using the spark interpreter. Imagine I have configured the spark interpreter. Two users write their spark notebooks. The first user does z.load("com:best-it-company:0.1"); the second one user a

Re: Configure spark interpreter setting from environment variables

2017-09-27 Thread benoitdr
That is working. Thanks a lot

Re: Configure spark interpreter setting from environment variables

2017-09-27 Thread Jeff Zhang
Unfortunately it is packaged in the spark interpreter jar, but you can get it from the source code. Benoit Drooghaag <benoit.droogh...@skynet.be> wrote on Wed, Sep 27, 2017 at 5:11 PM: > Thanks for your quick feedback. There is no "interpreter-setting.json" in zeppelin-0.7.3-bin-all.tgz.

Re: Configure spark interpreter setting from environment variables

2017-09-27 Thread Benoit Drooghaag
ember 2017 at 10:56, Jeff Zhang <zjf...@gmail.com> wrote: > Setting an interpreter setting is a one-time effort, so it should not be inconvenient for users. But if you are a zeppelin vendor and want to customize zeppelin, you can edit the interpreter-setting.json of the spark interpreter and c

Re: Configure spark interpreter setting from environment variables

2017-09-27 Thread Jeff Zhang
Setting an interpreter setting is a one-time effort, so it should not be inconvenient for users. But if you are a zeppelin vendor and want to customize zeppelin, you can edit the interpreter-setting.json of the spark interpreter and copy it into $ZEPPELIN_HOME/interpreter/spark. benoit.droogh...@gmail.com
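
A sketch of the flow Jeff describes (the source of the customized file is left abstract; where interpreter-setting.json lives in the source tree depends on the Zeppelin version):

    # place the edited file where Zeppelin reads the spark
    # interpreter's settings:
    cp /path/to/customized/interpreter-setting.json \
       $ZEPPELIN_HOME/interpreter/spark/interpreter-setting.json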

Re: Configuring Zeppelin spark interpreter to work with different hadoop clusters

2017-06-30 Thread Jeff Zhang
rpreter setting would affect that interpreter. > And all the capitalized property names are taken as env variables. > Serega Sheypak <serega.shey...@gmail.com> wrote on Sat, Jul 1, 2017 at 3:20 AM: >> Hi, thanks for your reply. How should I set this variable? I'm looking

Re: Configuring Zeppelin spark interpreter to work with different hadoop clusters

2017-06-30 Thread Serega Sheypak
Hi, thanks for your reply. How should I set this variable? I'm looking at the Spark interpreter config UI, and it doesn't allow me to set an env variable. https://zeppelin.apache.org/docs/latest/interpreter/spark.html#1-export-spark_home says that HADOOP_CONF_DIR should be set once per whole Zeppelin

Re: Configuring Zeppelin spark interpreter to work with different hadoop clusters

2017-06-30 Thread Jeff Zhang
Right, create three spark interpreters for your 3 yarn clusters. Serega Sheypak wrote on Fri, Jun 30, 2017 at 10:33 PM: > Hi, thanks for your reply! What do you mean by that? > I can have only one env variable HADOOP_CONF_DIR... And how can a user pick which env to run? > Or you
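
Combined with Jeff's other reply in this thread (capitalized property names are exported as env variables for that interpreter's process), each of the three interpreter settings carries its own value (paths illustrative):

    # interpreter setting "spark_cluster_a":
    HADOOP_CONF_DIR  /etc/hadoop/conf.cluster-a
    # interpreter setting "spark_cluster_b":
    HADOOP_CONF_DIR  /etc/hadoop/conf.cluster-b
    # interpreter setting "spark_cluster_c":
    HADOOP_CONF_DIR  /etc/hadoop/conf.cluster-c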

Re: Configuring Zeppelin spark interpreter to work with different hadoop clusters

2017-06-30 Thread Serega Sheypak
Hi, thanks for your reply! What do you mean by that? I can have only one env variable HADOOP_CONF_DIR... And how can a user pick which env to run? Or do you mean I have to create three Spark interpreters, each with its own HADOOP_CONF_DIR pointing to a single cluster config? 2017-06-30

Re: java.lang.NullPointerException on adding local jar as dependency to the spark interpreter

2017-05-09 Thread Jongyoul Lee
Can you add your spark interpreter's log file? On Sat, May 6, 2017 at 12:53 AM, shyla deshpande wrote: > Also, my local jar file that I want to add as dependency is a fat jar with > dependencies. Nothing works after I add my local fat jar, I get >

Re: java.lang.NullPointerException on adding local jar as dependency to the spark interpreter

2017-05-05 Thread shyla deshpande
Also, my local jar file that I want to add as a dependency is a fat jar with dependencies. Nothing works after I add my local fat jar; I get java.lang.NullPointerException for everything. Please help. On Thu, May 4, 2017 at 10:18 PM, shyla deshpande wrote: > Adding the

Re: Preconfigure Spark interpreter

2017-04-22 Thread Paul Brenner
wrote: Hi, I need to pre-configure the spark interpreter with my own artifacts and internal repositories. How can I do it?

Re: Preconfigure Spark interpreter

2017-04-22 Thread Serega Sheypak
Aha, thanks. I'm building Zeppelin from source, so can I put my custom settings in directly? BTW, why doesn't the interpreter-list file contain the spark interpreter? 2017-04-22 13:33 GMT+02:00 Fabian Böhnlein <fabian.boehnl...@gmail.com>: > Do it via the UI once and you'll see how interpr

Re: Preconfigure Spark interpreter

2017-04-22 Thread Fabian Böhnlein
Do it via the UI once and you'll see how the interpreter.json of the Zeppelin installation is changed. On Sat, Apr 22, 2017, 11:35 Serega Sheypak <serega.shey...@gmail.com> wrote: > Hi, I need to pre-configure the spark interpreter with my own artifacts and internal repositories. How can I do it?

Preconfigure Spark interpreter

2017-04-22 Thread Serega Sheypak
Hi, I need to pre-configure spark interpreter with my own artifacts and internal repositories. How can I do it?

Re: Spark Interpreter: Change default scheduler pool

2017-04-17 Thread Fabian Böhnlein
Hi moon, exactly, thanks for the pointer. Added the issue: https://issues.apache.org/jira/browse/ZEPPELIN-2413. Best, Fabian. On Tue, 28 Mar 2017 at 15:48, moon soo Lee wrote: > Hi Fabian, thanks for sharing the issue. > SparkSqlInterpreter sets the scheduler to "fair" depends
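
For context, the Spark-side mechanism involved is the standard per-thread pool property (standard Spark API; the pool name is illustrative, and ZEPPELIN-2413 is about which pool Zeppelin selects by default):

    // selects the scheduler pool for jobs submitted from this thread
    sc.setLocalProperty("spark.scheduler.pool", "fair")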

Re: "spark ui" button in spark interpreter does not show Spark web-ui

2017-03-13 Thread Hyung Sung Shim
Hello. Thank you for sharing the problem. Could you file a jira issue for this? Meethu Mathew wrote on Mon, Mar 13, 2017 at 3:18 PM: > Hi, I have noticed the same problem. > Regards, Meethu Mathew > On Mon, Mar 13, 2017 at 9:56 AM, Xiaohui Liu

Re: "spark ui" button in spark interpreter does not show Spark web-ui

2017-03-13 Thread Meethu Mathew
Hi, I have noticed the same problem. Regards, Meethu Mathew. On Mon, Mar 13, 2017 at 9:56 AM, Xiaohui Liu wrote: > Hi, we used 0.7.1-snapshot with our Mesos cluster; almost all our needed features (ldap login, notebook acl control, livy/pyspark/rspark/scala, etc.)

"spark ui" button in spark interpreter does not show Spark web-ui

2017-03-12 Thread Xiaohui Liu
Hi, we used 0.7.1-snapshot with our Mesos cluster, and almost all the features we need (ldap login, notebook acl control, livy/pyspark/rspark/scala, etc.) work pretty well. But one thing that does not work for us is the 'spark ui' button: it does not respond to user clicks. No errors on the browser side. Anyone

Re: Unable to connect with Spark Interpreter

2016-11-29 Thread Felix Cheung
onnect with Spark Interpreter. Your last advice helped me to progress a little bit: - I started the spark interpreter manually: c:\zepp\bin\interpreter.cmd, -d, c:\zepp\interpreter\spark\, -p, 61176, -l, c:\zepp/local-repo/2C2ZNEH5W - I needed to add a '\' into the -d attribute an
