Re: Custom Spark Interpreter?

2018-01-24 Thread Jeff Zhang
But if you don't set it in the interpreter setting, it will get the Spark UI
URL dynamically.
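The configured-vs-dynamic behavior described above can be sketched roughly as follows; `UiUrlResolver` and `resolveUiUrl` are hypothetical names for illustration, not Zeppelin's actual classes (only the "uiWebUrl" setting name comes from this thread):

```java
import java.util.Properties;

// Hypothetical sketch (not Zeppelin's actual implementation): prefer the
// configured "uiWebUrl" interpreter setting, otherwise fall back to
// whatever URL the running SparkContext reports.
public class UiUrlResolver {
    private final Properties settings;

    public UiUrlResolver(Properties settings) {
        this.settings = settings;
    }

    // dynamicUrl stands in for the URL discovered from Spark at runtime.
    public String resolveUiUrl(String dynamicUrl) {
        String configured = settings.getProperty("uiWebUrl");
        if (configured != null && !configured.trim().isEmpty()) {
            return configured; // static override from interpreter settings
        }
        return dynamicUrl;     // dynamic lookup wins when nothing is set
    }
}
```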



ankit jain wrote on Thu, Jan 25, 2018 at 3:03 PM:

> That method just reads it from a setting called "uiWebUrl" defined in the
> interpreter settings, which makes it configurable but still static.
>
> On Wed, Jan 24, 2018 at 10:58 PM, Jeff Zhang  wrote:
>
>>
>> IIRC, the Spark interpreter can get the web UI URL at runtime instead of
>> using a static URL.
>>
>>
>> https://github.com/apache/zeppelin/blob/master/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java#L940
>>
>>
>> ankit jain wrote on Thu, Jan 25, 2018 at 2:55 PM:
>>
>>> The issue with the Spark UI when running on AWS EMR is that it requires
>>> SSH tunneling to be set up, which in turn requires private AWS keys.
>>>
>>> Our team is building an analytics platform on Zeppelin for end users, to
>>> whom we obviously can't hand out these keys.
>>>
>>> Another issue is picking the correct port - Zeppelin tries to use 4040
>>> for Spark, but during an interpreter restart 4040 could still be held by
>>> an old, stuck paragraph. In that case Zeppelin simply tries the next port,
>>> and so on.
>>>
>>> A static URL for Spark can't handle this, hence some dynamic
>>> implementation is required.
>>>
>>> PS - As I write this, a lightbulb goes on in my head. I guess we could
>>> also modify the Zeppelin restart script to kill those rogue processes and
>>> make sure 4040 is always available?
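That restart-script idea needs a way to tell whether 4040 is actually free. A minimal sketch of such a probe, assuming a hypothetical helper class (`PortCheck` is not part of Zeppelin):

```java
import java.io.IOException;
import java.net.ServerSocket;

// Hypothetical helper: report whether Spark's default UI port (4040) is
// free, so a restart script could detect a rogue process still holding it.
public class PortCheck {
    public static boolean isPortFree(int port) {
        try (ServerSocket socket = new ServerSocket(port)) {
            socket.setReuseAddress(true);
            return true;  // bind succeeded, nothing else owns the port
        } catch (IOException e) {
            return false; // bind failed - likely a stuck old process
        }
    }
}
```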
>>>
>>> Thanks
>>> Ankit
>>>
>>> On Wed, Jan 24, 2018 at 6:10 PM, Jeff Zhang  wrote:
>>>

 If the Spark interpreter didn't give you the correct Spark UI, this should
 be a bug; you can file a ticket to fix it. Although you can make a custom
 interpreter by extending the current Spark interpreter, it is not trivial
 work.


 ankit jain wrote on Thu, Jan 25, 2018 at 8:07 AM:

> Hi fellow Zeppelin users,
> Has anyone tried to write a custom Spark interpreter, perhaps extending
> the one that currently ships with Zeppelin -
> spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java?
>
> We are coming across cases where we need the interpreter to do "more",
> e.g. change getSparkUIUrl() to directly load the YARN
> ResourceManager/proxy/application_id123 URL rather than a fixed web UI.
>
> If we directly modify the Zeppelin source code, upgrading to new Zeppelin
> versions will be a mess.
>
> Before we get too deep into it, we wanted to get the thoughts of the
> community.
>
> What is a "clean" way to do such changes?
>
> --
> Thanks & Regards,
> Ankit.
>

>>>
>>>
>>> --
>>> Thanks & Regards,
>>> Ankit.
>>>
>>
>
>
> --
> Thanks & Regards,
> Ankit.
>


Re: Custom Spark Interpreter?

2018-01-24 Thread ankit jain
That method just reads it from a setting called "uiWebUrl" defined in the
interpreter settings, which makes it configurable but still static.
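For the original question quoted below (extending the interpreter rather than patching Zeppelin's source), an override might look roughly like this. Everything here except `SparkInterpreter` and the `getSparkUIUrl()` name mentioned in this thread is an illustrative assumption, not Zeppelin API:

```java
// Hypothetical sketch of the "extend, don't fork" idea from this thread.
// A real version would extend org.apache.zeppelin.spark.SparkInterpreter;
// the class name, constructor, and fields here are illustrative only.
public class YarnProxySparkInterpreter {
    private final String resourceManagerBase; // e.g. http://rm-host:8088
    private final String applicationId;       // e.g. application_1516840000000_0001

    public YarnProxySparkInterpreter(String resourceManagerBase, String applicationId) {
        this.resourceManagerBase = resourceManagerBase;
        this.applicationId = applicationId;
    }

    // Would override SparkInterpreter#getSparkUIUrl to point at the YARN
    // ResourceManager proxy instead of a fixed web UI URL.
    public String getSparkUIUrl() {
        return resourceManagerBase + "/proxy/" + applicationId + "/";
    }
}
```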

On Wed, Jan 24, 2018 at 10:58 PM, Jeff Zhang  wrote:

>
> IIRC, the Spark interpreter can get the web UI URL at runtime instead of
> using a static URL.
>
> https://github.com/apache/zeppelin/blob/master/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java#L940
>
>
> ankit jain wrote on Thu, Jan 25, 2018 at 2:55 PM:
>
>> The issue with the Spark UI when running on AWS EMR is that it requires
>> SSH tunneling to be set up, which in turn requires private AWS keys.
>>
>> Our team is building an analytics platform on Zeppelin for end users, to
>> whom we obviously can't hand out these keys.
>>
>> Another issue is picking the correct port - Zeppelin tries to use 4040 for
>> Spark, but during an interpreter restart 4040 could still be held by an old,
>> stuck paragraph. In that case Zeppelin simply tries the next port, and so on.
>>
>> A static URL for Spark can't handle this, hence some dynamic
>> implementation is required.
>>
>> PS - As I write this, a lightbulb goes on in my head. I guess we could
>> also modify the Zeppelin restart script to kill those rogue processes and
>> make sure 4040 is always available?
>>
>> Thanks
>> Ankit
>>
>> On Wed, Jan 24, 2018 at 6:10 PM, Jeff Zhang  wrote:
>>
>>>
>>> If the Spark interpreter didn't give you the correct Spark UI, this should
>>> be a bug; you can file a ticket to fix it. Although you can make a custom
>>> interpreter by extending the current Spark interpreter, it is not trivial
>>> work.
>>>
>>>
>>> ankit jain wrote on Thu, Jan 25, 2018 at 8:07 AM:
>>>
 Hi fellow Zeppelin users,
 Has anyone tried to write a custom Spark interpreter, perhaps extending
 the one that currently ships with Zeppelin -
 spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java?

 We are coming across cases where we need the interpreter to do "more",
 e.g. change getSparkUIUrl() to directly load the YARN
 ResourceManager/proxy/application_id123 URL rather than a fixed web UI.

 If we directly modify the Zeppelin source code, upgrading to new Zeppelin
 versions will be a mess.

 Before we get too deep into it, we wanted to get the thoughts of the
 community.

 What is a "clean" way to do such changes?

 --
 Thanks & Regards,
 Ankit.

>>>
>>
>>
>> --
>> Thanks & Regards,
>> Ankit.
>>
>


-- 
Thanks & Regards,
Ankit.