If it is a blank page, it is likely some kind of bug.

ankit jain <ankitjain....@gmail.com> wrote on Fri, Jan 26, 2018 at 12:16 AM:

> Don't think that works, it just loads a blank page.
>
> On Wed, Jan 24, 2018 at 11:06 PM, Jeff Zhang <zjf...@gmail.com> wrote:
>
>>
>> But if you don't set it in the interpreter settings, it will get the Spark
>> UI URL dynamically.
>>
>>
>>
>> ankit jain <ankitjain....@gmail.com> wrote on Thu, Jan 25, 2018 at 3:03 PM:
>>
>>> That method just reads it from a config defined in the interpreter
>>> settings called "uiWebUrl", which makes it configurable but still static.
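[Editor's note] The static-vs-dynamic behaviour described above can be sketched as a simple property lookup with a fallback. The class and method names below are illustrative, not Zeppelin's actual API; only the "uiWebUrl" property name comes from the thread.

```java
import java.util.Properties;

public class UiUrlLookup {
    // If "uiWebUrl" is set in the interpreter settings, use it verbatim
    // (configurable but static); otherwise fall back to a dynamically
    // discovered Spark UI address.
    static String resolveUiUrl(Properties settings, String dynamicUrl) {
        String configured = settings.getProperty("uiWebUrl");
        return (configured == null || configured.isEmpty()) ? dynamicUrl : configured;
    }
}
```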
>>>
>>> On Wed, Jan 24, 2018 at 10:58 PM, Jeff Zhang <zjf...@gmail.com> wrote:
>>>
>>>>
>>>> IIRC, the Spark interpreter can get the web UI URL at runtime instead of
>>>> using a static URL.
>>>>
>>>>
>>>> https://github.com/apache/zeppelin/blob/master/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java#L940
>>>>
>>>>
>>>> ankit jain <ankitjain....@gmail.com> wrote on Thu, Jan 25, 2018 at 2:55 PM:
>>>>
>>>>> The issue with the Spark UI when running on AWS EMR is that it requires
>>>>> SSH tunneling to be set up, which requires private AWS keys.
>>>>>
>>>>> Our team is building an analytics platform on Zeppelin for end users, to
>>>>> whom we obviously can't hand out these keys.
>>>>>
>>>>> Another issue is setting up the correct port - Zeppelin tries to use 4040
>>>>> for Spark, but during an interpreter restart 4040 could still be held by
>>>>> an old, stuck paragraph. In that case Zeppelin simply tries the next port
>>>>> and so on.
>>>>>
>>>>> A static URL for Spark can't handle this and hence requires a dynamic
>>>>> implementation.
>>>>>
>>>>> PS - As I write this, a lightbulb goes on in my head. I guess we could
>>>>> also modify the Zeppelin restart script to kill those rogue processes and
>>>>> make sure 4040 is always available?
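[Editor's note] A minimal sketch of the restart-script idea above: before (re)starting the interpreter, check whether port 4040 can be bound. If it cannot, a stale process is still holding it and could be killed before Spark falls through to 4041, 4042, and so on. The class name is illustrative; this is not part of Zeppelin's restart script.

```java
import java.io.IOException;
import java.net.ServerSocket;

public class PortCheck {
    // Try to bind the port: success means no stale process is holding it.
    // The try-with-resources releases the socket immediately afterwards.
    static boolean isPortFree(int port) {
        try (ServerSocket ignored = new ServerSocket(port)) {
            return true;
        } catch (IOException e) {
            return false;
        }
    }
}
```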
>>>>>
>>>>> Thanks
>>>>> Ankit
>>>>>
>>>>> On Wed, Jan 24, 2018 at 6:10 PM, Jeff Zhang <zjf...@gmail.com> wrote:
>>>>>
>>>>>>
>>>>>> If the Spark interpreter didn't give you the correct Spark UI, this
>>>>>> should be a bug; you can file a ticket to fix it. Although you can make
>>>>>> a custom interpreter by extending the current Spark interpreter, it is
>>>>>> not trivial work.
>>>>>>
>>>>>>
>>>>>> ankit jain <ankitjain....@gmail.com> wrote on Thu, Jan 25, 2018 at 8:07 AM:
>>>>>>
>>>>>>> Hi fellow Zeppelin users,
>>>>>>> Has anyone tried to write a custom Spark interpreter, perhaps
>>>>>>> extending the one that currently ships with Zeppelin -
>>>>>>> spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java?
>>>>>>>
>>>>>>> We are coming across cases where we need the interpreter to do
>>>>>>> "more", e.g. change getSparkUIUrl() to directly load the YARN
>>>>>>> ResourceManager/proxy/application_id123 URL rather than a fixed web UI.
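[Editor's note] The override discussed above could look roughly like this: instead of a fixed web UI address, build the YARN ResourceManager proxy URL for the running application. The class name, method name, and the RM address are assumptions for illustration; only the `/proxy/<application id>` path shape comes from the thread.

```java
public class YarnProxyUrl {
    // Hypothetical sketch: compose the RM proxy URL for an application,
    // e.g. http://<rm-host>:8088/proxy/<application id>/
    static String sparkUiUrl(String rmWebAddress, String applicationId) {
        return rmWebAddress + "/proxy/" + applicationId + "/";
    }
}
```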
>>>>>>>
>>>>>>> If we directly modify the Zeppelin source code, upgrading to new
>>>>>>> Zeppelin versions will be a mess.
>>>>>>>
>>>>>>> Before we get too deep into it, we wanted to get the thoughts of the
>>>>>>> community.
>>>>>>>
>>>>>>> What is a "clean" way to do such changes?
>>>>>>>
>>>>>>> --
>>>>>>> Thanks & Regards,
>>>>>>> Ankit.
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Thanks & Regards,
>>>>> Ankit.
>>>>>
>>>>
>>>
>>>
>>> --
>>> Thanks & Regards,
>>> Ankit.
>>>
>>
>
>
> --
> Thanks & Regards,
> Ankit.
>
