Re: spark interpreter

2016-06-24 Thread moon soo Lee
Hi, thanks for asking the question. It's not a dumb question at all; the Zeppelin docs don't explain this very well. For the Spark interpreter in 'shared' mode, a Spark interpreter setting spawns one interpreter process to serve all notebooks bound to that interpreter setting. In 'scoped' mode, a Spark interpreter s…
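For orientation, the binding mode chosen in the interpreter setting is persisted in conf/interpreter.json; a rough sketch of the relevant fragment in later Zeppelin versions (the exact keys vary between releases, so treat this as illustrative only, not the 0.6-era schema):

```json
{
  "option": {
    "perNote": "scoped",
    "perUser": "shared"
  }
}
```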

Re: spark interpreter

2016-06-28 Thread Chen Song
Thanks for your explanation, Moon. Following up on this, I can see the difference in terms of single or multiple interpreter processes. With respect to Spark drivers, since each interpreter spawns a separate Spark driver in the regular Spark interpreter setting, it is clear to me that the different implic…

Re: spark interpreter

2016-06-29 Thread moon soo Lee
The Livy interpreter internally creates multiple sessions for each user, independently of the 3 binding modes supported in Zeppelin. Therefore, in 'shared' mode the Livy interpreter will create a session per user, while 'scoped' or 'isolated' mode will create sessions per notebook, per user. Notebook is…

Re: spark interpreter

2016-06-29 Thread Benjamin Kim
On a side note… has anyone gotten the Livy interpreter added as an interpreter in the latest build of Zeppelin 0.6.0? By the way, I have Shiro authentication on. Could this interfere? Thanks, Ben

Re: spark interpreter

2016-06-30 Thread moon soo Lee
Hi Ben, the Livy interpreter is included in 0.6.0. If it is not listed when you create an interpreter setting, could you check whether your 'zeppelin.interpreters' property lists the Livy interpreter classes? (conf/zeppelin-site.xml) Thanks, moon
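Concretely, the check moon describes means making sure conf/zeppelin-site.xml includes the Livy classes in that property; a sketch (class list abbreviated to the Livy entries plus Spark, with names as they appear in 0.6.0-era sources):

```xml
<property>
  <name>zeppelin.interpreters</name>
  <value>org.apache.zeppelin.spark.SparkInterpreter,org.apache.zeppelin.livy.LivySparkInterpreter,org.apache.zeppelin.livy.LivyPySparkInterpreter,org.apache.zeppelin.livy.LivySparkSQLInterpreter,org.apache.zeppelin.livy.LivySparkRInterpreter</value>
</property>
```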

Re: spark interpreter

2016-06-30 Thread Benjamin Kim
Moon, that worked! There were quite a few more configuration properties added, so I added those too, in both zeppelin-site.xml and zeppelin-env.sh. But now I'm getting errors starting a Spark context. Thanks, Ben

Re: spark interpreter

2016-06-30 Thread Jongyoul Lee
Hi Ben, I suggest you stop Z, remove conf/interpreter.json, and start Z again. Regards, JL
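JL's steps, spelled out as an ops sketch (paths assume a default binary install with ZEPPELIN_HOME set; backing up rather than deleting is an extra precaution, not part of the original advice):

```shell
cd "$ZEPPELIN_HOME"
bin/zeppelin-daemon.sh stop
# interpreter.json is regenerated from the configured defaults on startup
mv conf/interpreter.json conf/interpreter.json.bak
bin/zeppelin-daemon.sh start
```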

Re: spark interpreter

2016-06-30 Thread Benjamin Kim
Hi, I fixed it. I had to restart the Livy server as the hue user and not as root. Thanks, Ben

Re: spark interpreter

2016-06-30 Thread Leon Katsnelson
What is the expected date for v0.6?

Re: spark interpreter

2016-07-01 Thread moon soo Lee
> Hi Ben, Livy interpreter is included in 0.6.0. If it is not listed when you creat…

Re: spark interpreter

2016-07-01 Thread Benjamin Kim
> Hi Ben, Livy interpreter is included in 0.6.0. If it is not listed when you create interpreter setting, could…

Re: spark interpreter

2016-07-02 Thread moon soo Lee
…he-Zeppelin-release-0-6-0-rc1-tp11505.html Thanks, moon On Thu, Jun 30, 2016 at 1:54 PM Leon Katsnelson wrote: > What is the expected day for v0.6?

Re: spark interpreter

2016-07-03 Thread Benjamin Kim
…505.html > Thanks, > moon On Thu, Jun 30, 2016 at 1:54 PM Leon Katsnelson wrote: >> What is the expected day for v0.6?

Re: Spark interpreter Repl injection

2021-03-06 Thread Jeff Zhang
Why not copy the Scala code into Zeppelin and run the notebook directly? Carlos Diogo wrote on Sat, Mar 6, 2021 at 3:51 PM: > Dear all, I have been trying to find a way to inject Scala code (from a string) into the Spark interpreter. In pyspark it is easy with the exec function. It should not be very difficul…
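The pyspark technique Carlos mentions is just exec() over a string read from wherever the shared code lives; a minimal sketch (the helper function and its source string are made up for illustration):

```python
# Shared helper code that would normally live in a separate file of
# common functions, read in as a string.
common_code = """
def greet(name):
    return "hello, " + name
"""

# Inject it into the current interpreter namespace, as one would do
# inside a Zeppelin %pyspark paragraph.
exec(common_code, globals())

# The injected function is now available as if it had been typed in the note.
print(greet("zeppelin"))  # → hello, zeppelin
```

This is exactly what has no direct Scala-side equivalent in the thread, since the Scala REPL does not expose an exec-like entry point to user code.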

Re: Spark interpreter Repl injection

2021-03-06 Thread Carlos Diogo
That does not work if you want to have Scala code in a file (common functions) which you want to invoke in the note. The alternative is to compile the code and then add the jar, which would be normal for an application. But Zeppelin is about scripting, so this is a request I get very often from the u…

Re: Spark interpreter Repl injection

2021-03-08 Thread moon soo Lee
Hi, how about precode? "zeppelin.SparkInterpreter.precode" can run Scala code. Thanks, moon
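For context, precode is set as an interpreter property in the Spark interpreter setting; the value below is an invented example, not from the thread:

```
# interpreter property (name → value); runs once when the interpreter starts
zeppelin.SparkInterpreter.precode = import java.time.Instant; val sessionStartedAt = Instant.now()
```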

Re: Spark interpreter Repl injection

2021-03-08 Thread Carlos Diogo
Are you able to specify a file in the precode? For now my workaround is, from within the note and with the REST API, to add a paragraph with the code I want to inject (which can come from a file). It works OK, but with run-all or schedule the code gets updated in the note, but the old code stil…

Re: Spark interpreter Repl injection

2021-03-09 Thread moon soo Lee
I see. If you want to specify a file, precode might not be the best option. I found a hacky way to do it: accessing the SparkInterpreter instance object from the PySparkInterpreter. %pyspark sparkIntpField = intp.getClass().getDeclaredField("sparkInterpreter") sparkIntpField.setAccessible(True) sparkIntp = s…
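The snippet above is truncated; to show the idea only, here is a plain-Python analogue of reaching a non-public field of one interpreter to call another (class and field names are stand-ins, not the Zeppelin API — the real code goes through Java reflection via py4j):

```python
# Stand-in for the scala-side SparkInterpreter, which can evaluate code.
class SparkInterpreter:
    def interpret(self, code):
        return "interpreted: " + code

# Stand-in for the PySparkInterpreter, which holds a private reference to
# the SparkInterpreter (the "sparkInterpreter" field in the real class).
class PySparkInterpreter:
    def __init__(self):
        self._spark_interpreter = SparkInterpreter()

intp = PySparkInterpreter()
# Equivalent of getDeclaredField("sparkInterpreter") + setAccessible(True):
spark_intp = getattr(intp, "_spark_interpreter")
print(spark_intp.interpret("val x = 1"))  # → interpreted: val x = 1
```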

Re: Spark interpreter Repl injection

2021-03-09 Thread Carlos Diogo
Looks good, Moon. Is there a specific reason why you needed the pyspark interpreter to access the Spark interpreter? Could the Spark interpreter not programmatically access itself (and the same for the pyspark interpreter)? Would the issue be to expose the z.interpret() method? Best regards, Carlos

Re: Spark interpreter Repl injection

2021-03-09 Thread moon soo Lee
The PySpark interpreter has an 'intp' variable exposed in its REPL environment (for internal use), and we can resolve a reference to the Spark interpreter from the 'intp' variable. However, the Scala REPL environment in the Spark interpreter doesn't expose any variable that is useful for finding the Spark interpreter its…

Re: Spark interpreter Repl injection

2021-03-09 Thread Carlos Diogo
Thanks, I created the issue. Regards, Carlos

Re: Spark Interpreter: Change default scheduler pool

2017-03-28 Thread moon soo Lee
Hi Fabian, thanks for sharing the issue. SparkSqlInterpreter sets the scheduler to "fair" depending on an interpreter property [1]. I think we can do something similar for SparkInterpreter. Do you mind filing a new JIRA issue for it? Regards, moon [1] https://github.com/apache/zeppelin/blob/0e1964877654c56c72473a…
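For background, the Spark-side mechanism behind such a property is fair scheduling plus a per-thread pool selection; a sketch of the pieces involved (the pool name is illustrative):

```
# spark-defaults.conf — enable the fair scheduler
spark.scheduler.mode  FAIR

# then, per thread/job, application code selects a pool, which is what an
# interpreter can do on the user's behalf:
#   sc.setLocalProperty("spark.scheduler.pool", "fair")
```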

Re: Spark Interpreter: Change default scheduler pool

2017-04-17 Thread Fabian Böhnlein
Hi moon, exactly, thanks for the pointer. Added the issue: https://issues.apache.org/jira/browse/ZEPPELIN-2413 Best, Fabian

Re: Spark Interpreter error: 'not found: type'

2018-03-13 Thread Karan Sewani
Hello Marcus, maybe it has something to do with https://stackoverflow.com/questions/13008792/how-to-import-class-using-fully-qualified-name

Re: Spark Interpreter error: 'not found: type'

2018-03-19 Thread Marcus
Hi Karan, thanks for your hint, and sorry for the late response. I've tried the import using _root_ as suggested on Stack Overflow, but it didn't change anything. Also, the import statement runs; the error occurs when using the class name. As for datavec-api, it is a transitive dependency of deeplea…

Re: Spark Interpreter error: 'not found: type'

2018-03-19 Thread Jeff Zhang
I tried it in the master branch; it looks like it fails to download the dependencies, and it also fails when I use spark-submit directly. It should not be a Zeppelin issue; please check these 2 dependencies. Exception in thread "main" java.lang.RuntimeException: problem during retrieve of org.apach…

Re: Spark Interpreter Tutorial in Apache Zeppelin

2018-05-31 Thread moon soo Lee
Thanks Jeff for sharing. It's very helpful. On Wed, May 30, 2018 at 8:47 PM Jeff Zhang wrote: > Hi folks, I often see users asking how to use the spark interpreter on the mailing list, especially how to configure it, so I wrote this article about how to use the spark interpreter in Apac…

Re: Re: Spark Interpreter Tutorial in Apache Zeppelin

2018-05-31 Thread Paul Brenner
This is great! It inspired us to take another crack at using three of the newish features (new spark interpreter, yarn-cluster mode, and impersonation). If we run into problems and then find solutions we will share in case they are worth including in the documentation. ( https://share.polymail

Re: spark interpreter "master" parameter always resets to yarn-client after restart zeppelin

2019-08-19 Thread Jeff Zhang
Do you mean you manually change master to yarn after the Zeppelin service starts, and it resets to yarn-client after restarting Zeppelin? Manuel Sopena Ballesteros wrote on Tue, Aug 20, 2019 at 8:01 AM: > Dear Zeppelin user community, I would like a Zeppelin installation with Spark integration a…