[
https://issues.apache.org/jira/browse/SPARK-9960?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14696419#comment-14696419
]
Naga commented on SPARK-9960:
-----------------------------
Here's the runtime output (including the stack trace):
{code}
~/spark-1.4.1-bin-without-hadoop $ bin/run-example SparkPi --driver-library-path native
15/08/13 20:42:05 INFO spark.SparkContext: Running Spark version 1.4.1
15/08/13 20:42:06 WARN util.NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
15/08/13 20:42:06 INFO spark.SecurityManager: Changing view acls to: nv
15/08/13 20:42:06 INFO spark.SecurityManager: Changing modify acls to: nv
15/08/13 20:42:06 INFO spark.SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users with view permissions: Set(nv); users with
modify permissions: Set(nv)
15/08/13 20:42:06 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/08/13 20:42:06 INFO Remoting: Starting remoting
15/08/13 20:42:06 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://[email protected]:62576]
15/08/13 20:42:06 INFO util.Utils: Successfully started service 'sparkDriver'
on port 62576.
15/08/13 20:42:06 INFO spark.SparkEnv: Registering MapOutputTracker
15/08/13 20:42:06 INFO spark.SparkEnv: Registering BlockManagerMaster
15/08/13 20:42:07 INFO storage.DiskBlockManager: Created local directory at
/private/var/folders/0j/bkhg_dw17w96qxddkmryz63r0000gn/T/spark-c2b2b947-327a-4e54-aa3b-ae11526aebde/blockmgr-37bc3de6-86da-491a-9a39-a9f44c3a0919
15/08/13 20:42:07 INFO storage.MemoryStore: MemoryStore started with capacity
265.4 MB
15/08/13 20:42:07 INFO spark.HttpFileServer: HTTP File server directory is
/private/var/folders/0j/bkhg_dw17w96qxddkmryz63r0000gn/T/spark-c2b2b947-327a-4e54-aa3b-ae11526aebde/httpd-775eeb2e-20ac-4dd4-92db-e94bd053ebf3
15/08/13 20:42:07 INFO spark.HttpServer: Starting HTTP Server
15/08/13 20:42:07 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/08/13 20:42:07 INFO server.AbstractConnector: Started
[email protected]:62577
15/08/13 20:42:07 INFO util.Utils: Successfully started service 'HTTP file
server' on port 62577.
15/08/13 20:42:07 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/08/13 20:42:07 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/08/13 20:42:07 INFO server.AbstractConnector: Started
[email protected]:4040
15/08/13 20:42:07 INFO util.Utils: Successfully started service 'SparkUI' on
port 4040.
15/08/13 20:42:07 INFO ui.SparkUI: Started SparkUI at http://10.0.0.6:4040
15/08/13 20:42:07 INFO spark.SparkContext: Added JAR
file:/Users/nv/spark-1.4.1-bin-without-hadoop/lib/spark-examples-1.4.1-hadoop2.2.0.jar
at http://10.0.0.6:62577/jars/spark-examples-1.4.1-hadoop2.2.0.jar with
timestamp 1439523727458
15/08/13 20:42:07 INFO executor.Executor: Starting executor ID driver on host
localhost
15/08/13 20:42:07 INFO util.Utils: Successfully started service
'org.apache.spark.network.netty.NettyBlockTransferService' on port 62578.
15/08/13 20:42:07 INFO netty.NettyBlockTransferService: Server created on 62578
15/08/13 20:42:07 INFO storage.BlockManagerMaster: Trying to register
BlockManager
15/08/13 20:42:07 INFO storage.BlockManagerMasterEndpoint: Registering block
manager localhost:62578 with 265.4 MB RAM, BlockManagerId(driver, localhost,
62578)
15/08/13 20:42:07 INFO storage.BlockManagerMaster: Registered BlockManager
15/08/13 20:42:07 INFO spark.SparkContext: Starting job: reduce at
SparkPi.scala:35
15/08/13 20:42:08 INFO scheduler.DAGScheduler: Got job 0 (reduce at
SparkPi.scala:35) with 2 output partitions (allowLocal=false)
15/08/13 20:42:08 INFO scheduler.DAGScheduler: Final stage: ResultStage
0(reduce at SparkPi.scala:35)
15/08/13 20:42:08 INFO scheduler.DAGScheduler: Parents of final stage: List()
15/08/13 20:42:08 INFO scheduler.DAGScheduler: Missing parents: List()
15/08/13 20:42:08 INFO scheduler.DAGScheduler: Submitting ResultStage 0
(MapPartitionsRDD[1] at map at SparkPi.scala:31), which has no missing parents
15/08/13 20:42:08 INFO storage.MemoryStore: ensureFreeSpace(1888) called with
curMem=0, maxMem=278302556
15/08/13 20:42:08 INFO storage.MemoryStore: Block broadcast_0 stored as values
in memory (estimated size 1888.0 B, free 265.4 MB)
15/08/13 20:42:08 INFO storage.MemoryStore: ensureFreeSpace(1202) called with
curMem=1888, maxMem=278302556
15/08/13 20:42:08 INFO storage.MemoryStore: Block broadcast_0_piece0 stored as
bytes in memory (estimated size 1202.0 B, free 265.4 MB)
15/08/13 20:42:08 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in
memory on localhost:62578 (size: 1202.0 B, free: 265.4 MB)
15/08/13 20:42:08 INFO spark.SparkContext: Created broadcast 0 from broadcast
at DAGScheduler.scala:874
15/08/13 20:42:08 INFO scheduler.DAGScheduler: Submitting 2 missing tasks from
ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:31)
15/08/13 20:42:08 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 2
tasks
15/08/13 20:42:08 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0
(TID 0, localhost, PROCESS_LOCAL, 1442 bytes)
15/08/13 20:42:08 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 0.0
(TID 1, localhost, PROCESS_LOCAL, 1442 bytes)
15/08/13 20:42:08 INFO executor.Executor: Running task 1.0 in stage 0.0 (TID 1)
15/08/13 20:42:08 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
15/08/13 20:42:08 INFO executor.Executor: Fetching
http://10.0.0.6:62577/jars/spark-examples-1.4.1-hadoop2.2.0.jar with timestamp
1439523727458
15/08/13 20:43:08 INFO executor.Executor: Fetching
http://10.0.0.6:62577/jars/spark-examples-1.4.1-hadoop2.2.0.jar with timestamp
1439523727458
15/08/13 20:43:08 ERROR executor.Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.net.SocketTimeoutException: connect timed out
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:579)
    at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
    at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
    at sun.net.www.http.HttpClient.New(HttpClient.java:308)
    at sun.net.www.http.HttpClient.New(HttpClient.java:326)
    at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:996)
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:932)
    at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:850)
    at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:639)
    at org.apache.spark.util.Utils$.fetchFile(Utils.scala:453)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:398)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:390)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
    at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:390)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
15/08/13 20:43:08 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.net.SocketTimeoutException: connect timed out
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:579)
    at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
    at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
    at sun.net.www.http.HttpClient.New(HttpClient.java:308)
    at sun.net.www.http.HttpClient.New(HttpClient.java:326)
    at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:996)
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:932)
    at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:850)
    at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:639)
    at org.apache.spark.util.Utils$.fetchFile(Utils.scala:453)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:398)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:390)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
    at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:390)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
15/08/13 20:43:08 ERROR scheduler.TaskSetManager: Task 0 in stage 0.0 failed 1
times; aborting job
15/08/13 20:43:08 INFO scheduler.TaskSchedulerImpl: Cancelling stage 0
15/08/13 20:43:08 INFO executor.Executor: Executor is trying to kill task 1.0
in stage 0.0 (TID 1)
15/08/13 20:43:08 INFO scheduler.TaskSchedulerImpl: Stage 0 was cancelled
15/08/13 20:43:08 INFO scheduler.DAGScheduler: ResultStage 0 (reduce at
SparkPi.scala:35) failed in 60.088 s
15/08/13 20:43:08 INFO scheduler.DAGScheduler: Job 0 failed: reduce at
SparkPi.scala:35, took 60.229834 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to
stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost
task 0.0 in stage 0.0 (TID 0, localhost): java.net.SocketTimeoutException:
connect timed out
at java.net.PlainSocketImpl.socketConnect(Native Method)
at
java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
at
java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
at
java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:579)
at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
at sun.net.www.http.HttpClient.New(HttpClient.java:308)
at sun.net.www.http.HttpClient.New(HttpClient.java:326)
at
sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:996)
at
sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:932)
at
sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:850)
at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:639)
at org.apache.spark.util.Utils$.fetchFile(Utils.scala:453)
at
org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:398)
at
org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:390)
at
scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
at
scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at
scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at
scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
at
scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
at
org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:390)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Driver stacktrace:
at
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1273)
at
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1264)
at
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1263)
at
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at
org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1263)
at
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
at
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
at scala.Option.foreach(Option.scala:236)
at
org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:730)
at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1457)
at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1418)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
15/08/13 20:43:08 INFO spark.SparkContext: Invoking stop() from shutdown hook
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/metrics/json,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/api,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/static,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/executors/threadDump,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/executors/json,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/executors,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/environment/json,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/environment,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/storage/rdd,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/storage/json,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/storage,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/pool/json,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/pool,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/stage/json,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/stage,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/json,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/jobs/job/json,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/jobs/job,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/jobs/json,null}
15/08/13 20:43:08 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/jobs,null}
15/08/13 20:43:08 INFO ui.SparkUI: Stopped Spark web UI at http://10.0.0.6:4040
15/08/13 20:43:08 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/08/13 20:43:08 INFO spark.MapOutputTrackerMasterEndpoint:
MapOutputTrackerMasterEndpoint stopped!
15/08/13 20:43:08 INFO util.Utils: path =
/private/var/folders/0j/bkhg_dw17w96qxddkmryz63r0000gn/T/spark-c2b2b947-327a-4e54-aa3b-ae11526aebde/blockmgr-37bc3de6-86da-491a-9a39-a9f44c3a0919,
already present as root for deletion.
15/08/13 20:43:08 INFO storage.MemoryStore: MemoryStore cleared
15/08/13 20:43:08 INFO storage.BlockManager: BlockManager stopped
15/08/13 20:43:08 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
15/08/13 20:43:08 INFO
scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:
OutputCommitCoordinator stopped!
15/08/13 20:43:08 INFO spark.SparkContext: Successfully stopped SparkContext
15/08/13 20:43:08 INFO util.Utils: Shutdown hook called
15/08/13 20:43:08 INFO util.Utils: Deleting directory
/private/var/folders/0j/bkhg_dw17w96qxddkmryz63r0000gn/T/spark-c2b2b947-327a-4e54-aa3b-ae11526aebde
~/spark-1.4.1-bin-without-hadoop $
{code}
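Reading the trace, the call that fails is a plain HttpURLConnection opened by Utils.doFetchFile against the driver's HTTP file server (http://10.0.0.6:62577), so the timeout can be checked outside Spark. Below is a minimal standalone sketch, not Spark's code: the object name and the 60-second timeouts are just assumptions chosen to match the one-minute gap between the fetch attempt (20:42:08) and the failure (20:43:08) in the log. It attempts the same fetch and fails with the same java.net.SocketTimeoutException when 10.0.0.6:62577 isn't reachable from the machine, e.g. because a local firewall blocks it.
{code}
import java.net.{HttpURLConnection, URL}

// Hypothetical standalone check (not part of Spark): mimics the jar fetch that
// Utils.doFetchFile performs in the stack trace above.
object FetchTimeoutCheck {
  def main(args: Array[String]): Unit = {
    // URL copied from the log: the task fetches the example jar from the
    // driver's HTTP file server advertised at 10.0.0.6:62577.
    val url = new URL("http://10.0.0.6:62577/jars/spark-examples-1.4.1-hadoop2.2.0.jar")
    val conn = url.openConnection().asInstanceOf[HttpURLConnection]
    conn.setConnectTimeout(60000) // assumed 60 s, matching the gap seen in the log
    conn.setReadTimeout(60000)
    conn.connect()                // throws java.net.SocketTimeoutException if the address is unreachable
    println(s"HTTP ${conn.getResponseCode} from ${url.getHost}:${url.getPort}")
    conn.disconnect()
  }
}
{code}
If this standalone fetch also times out, that would point at the Mac's network/firewall handling of the 10.0.0.6 address rather than anything Spark-specific.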
> run-example SparkPi fails on Mac
> --------------------------------
>
> Key: SPARK-9960
> URL: https://issues.apache.org/jira/browse/SPARK-9960
> Project: Spark
> Issue Type: Bug
> Environment: java version "1.7.0_71", Mac OS X 10.9.5
> Reporter: Naga
>