Github user obermeier commented on the issue:
https://github.com/apache/spark/pull/19408
If Spark runs in YARN cluster mode, this issue still exists.
---
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
GitHub user obermeier reopened a pull request:
https://github.com/apache/spark/pull/19408
[SPARK-22180][CORE] Allow IPv6 address in org.apache.spark.util.Utils.parseHostPort
External applications like Apache Cassandra are able to deal with IPv6
addresses. Libraries like spark-cassandra-connector combine Apache Cassandra
with Apache Spark.
Github user obermeier closed the pull request at:
https://github.com/apache/spark/pull/19408
Github user obermeier commented on the issue:
https://github.com/apache/spark/pull/19408
This issue seems to be fixed in Spark 2.3.2
Github user obermeier commented on the issue:
https://github.com/apache/spark/pull/19408
I totally agree with you.
What do you think about just adding a log message if the given string is
obviously not a valid host name?
Because the given _NumberFormatException_ occurs much later, after
Github user obermeier commented on a diff in the pull request:
https://github.com/apache/spark/pull/19408#discussion_r150909772
--- Diff: core/src/test/scala/org/apache/spark/util/UtilsSuite.scala ---
@@ -1146,6 +1146,20 @@ class UtilsSuite extends SparkFunSuite
Github user obermeier commented on a diff in the pull request:
https://github.com/apache/spark/pull/19408#discussion_r150907312
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -982,7 +982,19 @@ private[spark] object Utils extends Logging {
return
Github user obermeier commented on a diff in the pull request:
https://github.com/apache/spark/pull/19408#discussion_r150798868
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -982,7 +982,19 @@ private[spark] object Utils extends Logging {
return
Github user obermeier commented on the issue:
https://github.com/apache/spark/pull/19408
I chose this function because I had some exceptions like this [1] when I used
IPv6 hosts.
In this example ```org.apache.spark.util.Utils$.parseHostPort``` decided to
use `f904` as the port.
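The failure mode described above can be reproduced with a minimal sketch. This is not Spark's actual `Utils.parseHostPort` implementation; the `HostPort` object and its methods are assumptions for illustration. A naive split on the last colon misreads a bare IPv6 address (the last hex group is taken as the port), while a bracket-aware parse that requires IPv6 literals in the `[host]:port` form stays unambiguous.

```scala
// Minimal sketch (not Spark's actual Utils.parseHostPort):
// contrast a naive last-colon split with bracket-aware parsing.
object HostPort {
  // Naive parse: everything after the last ':' is treated as the port.
  // On a bare IPv6 address this splits inside the address itself.
  def naiveParse(hostPort: String): (String, Int) = {
    val idx = hostPort.lastIndexOf(':')
    if (idx < 0) (hostPort, 0)
    else (hostPort.substring(0, idx), hostPort.substring(idx + 1).toInt)
  }

  // Bracket-aware parse: IPv6 literals must be written as "[addr]:port",
  // so the closing bracket marks where the host ends.
  def parse(hostPort: String): (String, Int) = {
    if (hostPort.startsWith("[")) {
      val end = hostPort.indexOf(']')
      require(end > 0, s"Unclosed IPv6 literal: $hostPort")
      val host = hostPort.substring(1, end)
      val rest = hostPort.substring(end + 1)
      if (rest.startsWith(":")) (host, rest.drop(1).toInt) else (host, 0)
    } else {
      naiveParse(hostPort)
    }
  }
}
```

With the naive split, `2001:db8::7077` parses to host `2001:db8:` and port `7077`, silently treating the last address group as a port; an address ending in a hex group like `f904` would instead throw a `NumberFormatException` on `toInt`. The bracketed form `[2001:db8::f904]:7077` parses unambiguously.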
Github user obermeier commented on the issue:
https://github.com/apache/spark/pull/19408
Done
Github user obermeier commented on a diff in the pull request:
https://github.com/apache/spark/pull/19408#discussion_r142047496
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -981,7 +981,13 @@ private[spark] object Utils extends Logging {
return
GitHub user obermeier opened a pull request:
https://github.com/apache/spark/pull/19408
[SPARK-22180][CORE] Allow IPv6 address in org.apache.spark.util.Utils.parseHostPort
External applications like Apache Cassandra are able to deal with IPv6
addresses. Libraries like spark-cassandra-connector combine Apache Cassandra
with Apache Spark.