Thanks. I just tried and am still having trouble. It seems to still be
using the private address even when I go through the ResourceManager.

On Tue, Aug 25, 2015 at 12:34 PM, Kelly, Jonathan <jonat...@amazon.com>
wrote:

> I'm not sure why the UI appears broken like that either, and I haven't
> investigated it myself yet, but if you instead go to the YARN
> ResourceManager UI (port 8088 if you are using emr-4.x; port 9026 for
> 3.x, I believe), you should be able to click the ApplicationMaster link
> (or the History link for completed applications) to reach the Spark UI
> from there. The ApplicationMaster link uses the YARN Proxy Service (port
> 20888 on emr-4.x; not sure about 3.x) to proxy through to the Spark
> application's UI, regardless of which port it's running on. For completed
> applications, the History link sends you directly to the Spark History
> Server UI on port 18080. Hope that helps!
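The port layout described above can be sketched concretely. This is only an illustration; the master DNS name and YARN application id below are made-up placeholders, not values from this thread:

```shell
#!/bin/sh
# Placeholder values -- substitute your cluster's master public DNS and the
# application id shown by the ResourceManager (e.g. from `yarn application -list`).
MASTER_DNS="ec2-1-2-3-4.compute-1.amazonaws.com"
APP_ID="application_1440000000000_0001"

# Running application: the YARN proxy service (20888 on emr-4.x) fronts the
# Spark UI, whatever port the driver actually bound.
echo "http://$MASTER_DNS:20888/proxy/$APP_ID/"

# Completed application: the Spark History Server UI on port 18080.
echo "http://$MASTER_DNS:18080/"
```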
>
> ~ Jonathan
>
>
>
>
> On 8/24/15, 10:51 PM, "Justin Pihony" <justin.pih...@gmail.com> wrote:
>
> >I am using the steps from this article
> ><https://aws.amazon.com/articles/Elastic-MapReduce/4926593393724923> to
> >get Spark up and running on EMR through YARN. Once it is up and running,
> >I SSH in, cd to the Spark bin directory, and run spark-shell --master
> >yarn. Once this spins up, I can see that the UI is started on port 4040
> >at the internal IP. If I hit the public DNS on port 4040 with dynamic
> >port tunneling and FoxyProxy, I get a crude UI (the CSS seems broken),
> >but the proxy continuously redirects me to the main page, so I cannot
> >drill into anything. So I tried static tunneling instead, but can't seem
> >to get through.
> >
> >So, how can I access the Spark UI when running a spark-shell on YARN in
> >AWS?
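For reference, a minimal sketch of the two tunneling approaches mentioned above: a dynamic (SOCKS) tunnel to pair with FoxyProxy, and a static forward of a single port. The key path and master DNS below are hypothetical placeholders; the block only prints the commands rather than opening a connection:

```shell
#!/bin/sh
# Placeholders -- use your own EC2 key pair and the cluster's master public DNS.
KEY="$HOME/mykey.pem"
MASTER="hadoop@ec2-1-2-3-4.compute-1.amazonaws.com"

# Dynamic (SOCKS) tunnel on local port 8157; configure FoxyProxy to send
# the cluster's hostnames through localhost:8157.
echo "ssh -i $KEY -N -D 8157 $MASTER"

# Static tunnel: forwards the YARN ResourceManager UI (8088 on emr-4.x)
# to http://localhost:8088 with no browser proxy configuration needed.
echo "ssh -i $KEY -N -L 8088:localhost:8088 $MASTER"
```

Note that with a static tunnel, links the UI emits that point at other hosts/ports (e.g. the private address of a worker) will not resolve, which is why the SOCKS-plus-FoxyProxy route is usually smoother for drilling into the UI.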
> >
> >
> >
> >--
> >View this message in context:
> >http://apache-spark-user-list.1001560.n3.nabble.com/How-to-access-Spark-UI-through-AWS-tp24436.html
> >Sent from the Apache Spark User List mailing list archive at Nabble.com.
> >
> >---------------------------------------------------------------------
> >To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> >For additional commands, e-mail: user-h...@spark.apache.org
> >
>
>