Thomas Dudziak created SPARK-11115:
--------------------------------------

             Summary: IPv6 regression
                 Key: SPARK-11115
                 URL: https://issues.apache.org/jira/browse/SPARK-11115
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.5.1
         Environment: CentOS 6.7, Java 1.8.0_25, dual stack IPv4 + IPv6
            Reporter: Thomas Dudziak
            Priority: Critical


When running Spark with -Djava.net.preferIPv6Addresses=true, I get this error:

15/10/14 14:36:01 ERROR SparkContext: Error initializing SparkContext.
java.lang.AssertionError: assertion failed: Expected hostname
        at scala.Predef$.assert(Predef.scala:179)
        at org.apache.spark.util.Utils$.checkHost(Utils.scala:805)
        at org.apache.spark.storage.BlockManagerId.<init>(BlockManagerId.scala:48)
        at org.apache.spark.storage.BlockManagerId$.apply(BlockManagerId.scala:107)
        at org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:190)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:528)
        at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)

Looking at the code in question, it seems it only works for IPv4 because it
assumes ':' cannot appear in a hostname, which it clearly can for literal
IPv6 addresses.
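
For context, the failing check in Utils.scala appears to be of roughly this
form (paraphrased from the assertion above, not the exact source):

  // Approximate shape of the current check: it rejects any host string
  // containing ':', which breaks for literal IPv6 addresses such as ::1.
  def checkHost(host: String, message: String = "") {
    assert(host.indexOf(':') == -1, message)
  }
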
Instead, the code should probably use Guava's HostAndPort class, i.e.:

  import com.google.common.net.HostAndPort

  // A host-only string (including a bare or bracketed IPv6 literal) must not
  // carry a port.
  def checkHost(host: String, message: String = "") {
    assert(!HostAndPort.fromString(host).hasPort, message)
  }

  // A host:port string (e.g. "10.0.0.1:7077" or "[::1]:7077") must carry a port.
  def checkHostPort(hostPort: String, message: String = "") {
    assert(HostAndPort.fromString(hostPort).hasPort, message)
  }
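
For reference, a quick sketch of how HostAndPort handles the relevant inputs
(assuming Guava's com.google.common.net.HostAndPort is available; hosts and
ports below are made-up examples):

  // Host-only strings, including bare IPv6 literals, report no port:
  HostAndPort.fromString("example.com").hasPort          // false
  HostAndPort.fromString("2001:db8::1").hasPort          // false
  // host:port strings report a port; IPv6 hosts must be bracketed:
  HostAndPort.fromString("example.com:7077").hasPort     // true
  HostAndPort.fromString("[2001:db8::1]:7077").hasPort   // true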



