DB Tsai created SPARK-39459:
-------------------------------

             Summary: LocalSchedulerBackend doesn't support IPV6
                 Key: SPARK-39459
                 URL: https://issues.apache.org/jira/browse/SPARK-39459
             Project: Spark
          Issue Type: Sub-task
          Components: Spark Core
    Affects Versions: 3.2.1
            Reporter: DB Tsai

Running {{./bin/spark-shell}} in local mode on a machine whose hostname resolves to an IPv6 address fails during SparkContext initialization: the driver host is an unbracketed IPv6 literal, which {{Utils.checkHost}} rejects.

{code:java}
➜  ./bin/spark-shell
22/06/09 14:52:35 WARN Utils: Your hostname, DBs-Mac-mini-2.local resolves to a loopback address: 127.0.0.1; using 2600:1700:1151:11ef:0:0:0:2000 instead (on interface en1)
22/06/09 14:52:35 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/06/09 14:52:43 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/06/09 14:52:44 ERROR SparkContext: Error initializing SparkContext.
java.lang.AssertionError: assertion failed: Expected hostname or IPv6 IP enclosed in [] but got 2600:1700:1151:11ef:0:0:0:2000
        at scala.Predef$.assert(Predef.scala:223) ~[scala-library-2.12.15.jar:?]
        at org.apache.spark.util.Utils$.checkHost(Utils.scala:1110) ~[spark-core_2.12-3.2.0.jar:3.2.0.37]
        at org.apache.spark.executor.Executor.<init>(Executor.scala:89) ~[spark-core_2.12-3.2.0.jar:3.2.0.37]
        at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:64) ~[spark-core_2.12-3.2.0.jar:3.2.0]
        at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:132) ~[spark-core_2.12-3.2.0.jar:3.2.0]
{code}
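For context, the assertion message shows that {{Utils.checkHost}} accepts either a plain hostname or an IPv6 literal enclosed in {{[]}}. A minimal sketch of that expectation (the {{isValidHost}} helper below is hypothetical, not the actual Spark implementation):

```java
// Hypothetical helper mirroring the shape of the failing checkHost assertion:
// a host string with no ':' is a hostname/IPv4, and an IPv6 literal must be
// bracketed as "[addr]" so a port suffix like "[addr]:port" stays unambiguous.
public class CheckHostSketch {
    static boolean isValidHost(String host) {
        return host.indexOf(':') == -1
            || (host.startsWith("[") && host.endsWith("]"));
    }

    public static void main(String[] args) {
        System.out.println(isValidHost("localhost"));                        // true
        System.out.println(isValidHost("2600:1700:1151:11ef:0:0:0:2000"));   // false: the case that trips the assertion
        System.out.println(isValidHost("[2600:1700:1151:11ef:0:0:0:2000]")); // true
    }
}
```

This suggests LocalSchedulerBackend passes the resolved IPv6 address to the Executor without the {{[]}} wrapping that the check requires.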



