I ssh in, cd to the spark bin directory, and run spark-shell --master yarn. Once this spins up I can see that the UI is started at the internal ip on port 4040. If I hit the public dns at 4040 with dynamic port tunneling and foxyproxy, I get a crude UI (the css seems broken); however, the proxy continuously redirects me to the main page, so I cannot drill into anything. So I tried static tunneling instead, but can't seem to get through.
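
For reference, the tunnel commands I'm using look roughly like this (the key path, user, and host names below are placeholders, not my actual values):

  # dynamic (SOCKS) tunnel, paired with foxyproxy pointing at localhost:8157
  ssh -i mykey.pem -N -D 8157 hadoop@ec2-xx-xx-xx-xx.compute-1.amazonaws.com

  # static tunnel attempt: forward local port 4040 to the driver's internal address
  ssh -i mykey.pem -N -L 4040:ip-10-xx-xx-xx.ec2.internal:4040 hadoop@ec2-xx-xx-xx-xx.compute-1.amazonaws.com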
So, how can I access the Spark UI when running spark-shell on YARN in AWS?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-access-Spark-UI-through-AWS-tp24436.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.