I ran into a similar issue. What's happening is that when Spark runs in YARN client mode, YARN automatically launches a Web Application Proxy <http://archive.cloudera.com/cdh5/cdh/5/hadoop/hadoop-yarn/hadoop-yarn-site/WebApplicationProxy.html> to reduce the possibility of web-based attacks. As part of that, the AmIpFilter gets added to the application's web UI so that requests must come through the proxy. You can see this in the example log snippet below:
15/03/20 21:33:14 INFO cluster.YarnClientSchedulerBackend: ApplicationMaster registered as Actor[akka.tcp://sparkyar...@ip-172-31-44-228.us-west-2.compute.internal:53028/user/YarnAM#-1897510590]
15/03/20 21:33:14 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> 172.31.36.22, PROXY_URI_BASES -> http://172.31.36.22:9046/proxy/application_1426881405719_0009), /proxy/application_1426881405719_0009
15/03/20 21:33:14 INFO ui.JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
15/03/20 21:33:15 INFO yarn.Client: Application report for application_1426881405719_0009 (state: RUNNING)
15/03/20 21:33:15 INFO yarn.Client:
    client token: N/A
    diagnostics: N/A
    ApplicationMaster host: ip-172-31-44-228.us-west-2.compute.internal
    ApplicationMaster RPC port: 0
    queue: default
    start time: 1426887190001
    final status: UNDEFINED
    *tracking URL: http://172.31.36.22:9046/proxy/application_1426881405719_0009/*

While I haven't found a way to disable the filter (the Spark security doc <http://spark.apache.org/docs/1.2.1/security.html> may help), you can still view the Web UI by forwarding the proxy's port (9046 in this example) over ssh. Then point your browser at the tracking URL with the host IP replaced by localhost, e.g.: http://localhost:9046/proxy/application_1426881405719_0009

Hope that helps.

Ben

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/WebUI-on-yarn-through-ssh-tunnel-affected-by-AmIpfilter-tp21540p22169.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
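P.S. A rough sketch of the forwarding setup described above. The proxy host and port (172.31.36.22:9046) come from the log snippet; "ec2-user@my-gateway" is a placeholder for whatever host you normally ssh into that can reach the proxy:

```shell
# From your local machine, open an ssh tunnel to the YARN proxy port.
# (ec2-user@my-gateway is a placeholder -- substitute your own login/host):
#
#   ssh -N -L 9046:172.31.36.22:9046 ec2-user@my-gateway
#
# With the tunnel up, rewrite the tracking URL from the logs so it goes
# through localhost instead of the (unreachable) internal proxy IP:
tracking_url="http://172.31.36.22:9046/proxy/application_1426881405719_0009"
local_url=$(echo "$tracking_url" | sed 's#//172\.31\.36\.22:#//localhost:#')
echo "$local_url"   # http://localhost:9046/proxy/application_1426881405719_0009
```

Open the printed URL in your browser; the request travels through the tunnel to the proxy, so the AmIpFilter sees it as coming from an allowed host.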