Yeah, upon running the test locally I receive:

“Pi is roughly 3.139948”

So Spark is working; it’s just the application UI that is not…
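
(By “the test” I mean the same local run as in Robin’s list, with nothing
extra configured:

./bin/run-example SparkPi 10

i.e. no MASTER set, so as far as I know it runs in local mode.)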


> On Jan 13, 2015, at 1:13 PM, Ganon Pierce <ganon.pie...@me.com> wrote:
> 
> My application logs remain stored as .inprogress files, e.g. 
> “app-20150113190025-0004.inprogress”, even after completion. Could this have 
> something to do with what is going on?
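> 
> For reference, a minimal sketch of the event-log properties I believe are 
> involved here, in conf/spark-defaults.conf (the directory is only the 
> documented default, not something I have verified on my cluster):
> 
> spark.eventLog.enabled   true
> spark.eventLog.dir       /tmp/spark-events
> 
> As I understand it, the .inprogress suffix should only be dropped once the 
> application’s SparkContext is stopped cleanly.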
> 
> @Ted Yu
> 
> Where do I find the master log? It’s not obviously labeled in my /tmp/ 
> directory. Sorry if I should know this; I’ve read through the documentation 
> many times, but probably missed it. Do I need to set this in my configuration 
> somehow? Could the problem actually be that I’m just not setting a conf 
> property properly somewhere?
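> 
> Or is it just the file that start-master.sh writes under $SPARK_HOME/logs/, 
> i.e. something like
> 
> tail -n 100 $SPARK_HOME/logs/spark-*-org.apache.spark.deploy.master.Master-*.out
> 
> I’m only assuming the default log location and file naming here.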
> 
> @Robin East
> 
> All of these work fine; the only issue I’m having is viewing the application 
> UI after I submit a job. Further, the application runs and will even compute 
> some of the outputs I want (they get stored to S3, and I’ve used the outputs 
> in other models I’m making). It’s simply when I access the master web UI, 
> click the application that is currently running, and then click 
> “Application Detail UI” that I receive this error. When an application has 
> completed and I click the same link, I receive: 
> 
> “Application history not found (app-201501131190025-004)”
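> 
> In case it helps narrow this down: for a running application that 
> “Application Detail UI” link should just point at the driver’s own web UI, 
> so a direct check would be something like
> 
> curl -i http://<driver-host>:4040/jobs/
> 
> where the host is a placeholder and 4040 is only the default application UI 
> port. For a completed application, as I understand it, the master can only 
> rebuild the detail page from a finalized event log, which may tie back to 
> the .inprogress files above.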
> 
>> On Jan 13, 2015, at 8:07 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>> 
>> Ganon:
>> Can you check the master log to see if there is some clue?
>> 
>> Cheers
>> 
>> 
>> 
>> On Jan 13, 2015, at 2:03 AM, Robin East <robin.e...@xense.co.uk> wrote:
>> 
>>> I’ve just pulled down the latest commits from github, and done the 
>>> following:
>>> 
>>> 1)
>>> mvn clean package -DskipTests
>>> 
>>> builds fine
>>> 
>>> 2)
>>> ./bin/spark-shell works
>>> 
>>> 3)
>>> run SparkPi example with no problems:
>>> 
>>> ./bin/run-example SparkPi 10
>>> 
>>> 4)
>>> Started a master 
>>> 
>>> ./sbin/start-master.sh
>>> 
>>> grabbed the MasterWebUI from the master log - Started MasterWebUI at 
>>> http://x.x.x.x:8080
>>> 
>>> Can view the MasterWebUI from local browser
>>> 
>>> 5)
>>> grabbed the spark url from the master log and started a local slave:
>>> 
>>> ./bin/spark-class org.apache.spark.deploy.worker.Worker 
>>> spark://<hostname>:7077 &
>>> 
>>> 6)
>>> Ran jps to confirm both Master and Worker processes are present.
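>>> 
>>> Expected output is something like the following (the PIDs are made up):
>>> 
>>> 12345 Master
>>> 23456 Worker
>>> 34567 Jps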
>>> 
>>> 7)
>>> Ran SparkPi on the mini-cluster:
>>> 
>>> MASTER=spark://<host>:7077 ./bin/run-example SparkPi 10
>>> 
>>> All worked fine, can see information in the MasterWebUI
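>>> 
>>> For comparison, the equivalent explicit submit would look roughly like:
>>> 
>>> ./bin/spark-submit --master spark://<host>:7077 \
>>>   --class org.apache.spark.examples.SparkPi \
>>>   examples/target/scala-2.10/spark-examples-*.jar 10
>>> 
>>> (the examples jar path here is a guess and depends on how the build laid 
>>> things out).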
>>> 
>>> Which of these steps doesn’t work for you? I presume you’ve tried 
>>> re-pulling from git and doing a clean build again.
>>> 
>>> Robin
>>> On 13 Jan 2015, at 08:07, Ganon Pierce <ganon.pie...@me.com> wrote:
>>> 
>>>> After clean build still receiving the same error.
>>>> 
>>>> 
>>>> 
>>>> On Jan 6, 2015, at 3:59 PM, Sean Owen <so...@cloudera.com> wrote:
>>>> 
>>>>> FWIW I do not see any such error, after a "mvn -DskipTests clean package" 
>>>>> and "./bin/spark-shell" from master. Maybe double-check you have done a 
>>>>> full clean build.
>>>>> 
>>>>> On Tue, Jan 6, 2015 at 9:09 PM, Ganon Pierce <ganon.pie...@me.com> wrote:
>>>>> I’m building from the latest commit on git and receive the following 
>>>>> error when I try to access the application web UI:
>>>>> 
>>>>> HTTP ERROR: 500
>>>>> 
>>>>> Problem accessing /jobs/. Reason:
>>>>> 
>>>>>     Server Error
>>>>> Powered by Jetty://
>>>>> 
>>>>> My driver also prints this error:
>>>>> 
>>>>> java.lang.UnsupportedOperationException: empty.max
>>>>>   at scala.collection.TraversableOnce$class.max(TraversableOnce.scala:216)
>>>>>   at scala.collection.AbstractTraversable.max(Traversable.scala:105)
>>>>>   at 
>>>>> org.apache.spark.ui.jobs.AllJobsPage.org$apache$spark$ui$jobs$AllJobsPage$$makeRow$1(AllJobsPage.scala:46)
>>>>>   at 
>>>>> org.apache.spark.ui.jobs.AllJobsPage$$anonfun$jobsTable$1.apply(AllJobsPage.scala:91)
>>>>>   at 
>>>>> org.apache.spark.ui.jobs.AllJobsPage$$anonfun$jobsTable$1.apply(AllJobsPage.scala:91)
>>>>>   at 
>>>>> scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>>>>>   at 
>>>>> scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>>>>>   at scala.collection.immutable.List.foreach(List.scala:318)
>>>>>   at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
>>>>>   at scala.collection.AbstractTraversable.map(Traversable.scala:105)
>>>>>   at org.apache.spark.ui.jobs.AllJobsPage.jobsTable(AllJobsPage.scala:91)
>>>>>   at org.apache.spark.ui.jobs.AllJobsPage.render(AllJobsPage.scala:106)
>>>>>   at org.apache.spark.ui.WebUI$$anonfun$attachPage$1.apply(WebUI.scala:68)
>>>>>   at org.apache.spark.ui.WebUI$$anonfun$attachPage$1.apply(WebUI.scala:68)
>>>>>   at org.apache.spark.ui.JettyUtils$$anon$1.doGet(JettyUtils.scala:68)
>>>>>   at javax.servlet.http.HttpServlet.service(HttpServlet.java:735)
>>>>>   at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
>>>>>   at 
>>>>> org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)
>>>>>   at 
>>>>> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:501)
>>>>>   at 
>>>>> org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)
>>>>>   at 
>>>>> org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:428)
>>>>>   at 
>>>>> org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)
>>>>>   at 
>>>>> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
>>>>>   at 
>>>>> org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)
>>>>>   at 
>>>>> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
>>>>>   at org.eclipse.jetty.server.Server.handle(Server.java:370)
>>>>>   at 
>>>>> org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)
>>>>>   at 
>>>>> org.eclipse.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:971)
>>>>>   at 
>>>>> org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:1033)
>>>>>   at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:644)
>>>>>   at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
>>>>>   at 
>>>>> org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
>>>>>   at 
>>>>> org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:667)
>>>>>   at 
>>>>> org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:52)
>>>>>   at 
>>>>> org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
>>>>>   at 
>>>>> org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
>>>>>   at java.lang.Thread.run(Thread.java:745)
>>>>> 
>>>>> 
>>>>> Has the UI been disabled intentionally for development purposes, have I 
>>>>> not set something up correctly, or is this a bug?
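>>>>> 
>>>>> For reference, roughly the sequence I’m using to build and run (the 
>>>>> rev-parse line is only there to note which commit I’m on):
>>>>> 
>>>>> git pull
>>>>> git rev-parse HEAD
>>>>> mvn -DskipTests clean package
>>>>> ./bin/spark-shell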
>>>>> 
>>> 
> 
