[jira] [Commented] (SPARK-43366) Spark Driver Bind Address is off-by-one
    [ https://issues.apache.org/jira/browse/SPARK-43366?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17726727#comment-17726727 ]

Derek Brown commented on SPARK-43366:
-------------------------------------

[~srowen] the issue isn't with the port; the issue is with the IP address. The ports are both 32805.

> Spark Driver Bind Address is off-by-one
> ---------------------------------------
>
>                 Key: SPARK-43366
>                 URL: https://issues.apache.org/jira/browse/SPARK-43366
>             Project: Spark
>          Issue Type: Bug
>          Components: Block Manager
>    Affects Versions: 3.3.3
>            Reporter: Derek Brown
>            Priority: Major
>
> I have the following environment variable set in my driver pod configuration:
> {code:java}
> SPARK_DRIVER_BIND_ADDRESS=10.244.0.53{code}
> However, I see an off-by-one IP address being referred to in the Spark logs:
> {code:java}
> 23/05/04 02:37:03 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.244.0.54:53140) with ID 1, ResourceProfileId 0
> 23/05/04 02:37:03 INFO BlockManagerMasterEndpoint: Registering block manager 10.244.0.54:32805 with 413.9 MiB RAM, BlockManagerId(1, 10.244.0.54, 32805, None){code}
> I am not sure why this might be the case.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-41992) Migrate the official Spark History Server helm chart to a non-archived repository
    [ https://issues.apache.org/jira/browse/SPARK-41992?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17675695#comment-17675695 ]

Derek Brown commented on SPARK-41992:
-------------------------------------

How do we go about creating a new GitHub repo?

> Migrate the official Spark History Server helm chart to a non-archived
> repository
> ----------------------------------------------------------------------
>
>                 Key: SPARK-41992
>                 URL: https://issues.apache.org/jira/browse/SPARK-41992
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes, Web UI
>    Affects Versions: 3.4.0
>            Reporter: Derek Brown
>            Priority: Major
>
> * The Spark History Server Helm chart is currently published in the [`helm/charts` repository|https://github.com/helm/charts/tree/master/stable/spark-history-server]. This repository has been [deprecated|https://github.com/helm/charts#%EF%B8%8F-deprecation-and-archive-notice] in favor of having chart maintainers own their own repositories.
> * We need to create a new GitHub repository to host this chart so that it can be updated, modified, and maintained.
[jira] [Created] (SPARK-41992) Migrate the official Spark History Server helm chart to a non-archived repository
Derek Brown created SPARK-41992:
-----------------------------------

             Summary: Migrate the official Spark History Server helm chart to a non-archived repository
                 Key: SPARK-41992
                 URL: https://issues.apache.org/jira/browse/SPARK-41992
             Project: Spark
          Issue Type: Improvement
          Components: Kubernetes, Web UI
    Affects Versions: 3.4.0
            Reporter: Derek Brown

* The Spark History Server Helm chart is currently published in the [`helm/charts` repository|https://github.com/helm/charts/tree/master/stable/spark-history-server]. This repository has been [deprecated|https://github.com/helm/charts#%EF%B8%8F-deprecation-and-archive-notice] in favor of having chart maintainers own their own repositories.
* We need to create a new GitHub repository to host this chart so that it can be updated, modified, and maintained.
[jira] [Created] (SPARK-43166) Docker images are missing passwd entry for UID 185
Derek Brown created SPARK-43166:
-----------------------------------

             Summary: Docker images are missing passwd entry for UID 185
                 Key: SPARK-43166
                 URL: https://issues.apache.org/jira/browse/SPARK-43166
             Project: Spark
          Issue Type: Bug
          Components: python, R
    Affects Versions: 3.4.0
            Reporter: Derek Brown

Currently, the official Spark Docker images run as UID {{185}}, but no corresponding entry exists in {{/etc/passwd}}. This causes [issues|https://stackoverflow.com/questions/41864985/hadoop-ioexception-failure-to-login] when libraries try to fetch the current Unix username.
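The failure mode described above can be reproduced outside of Spark and Hadoop: a passwd lookup for a UID that has no {{/etc/passwd}} entry simply fails. A minimal Python sketch (the helper name is mine, not part of any Spark or Hadoop API):

```python
import os
import pwd


def current_username(uid=None):
    """Look up the passwd entry for a UID, roughly what username-resolving
    libraries do under the hood; return None when no entry exists
    (the UID-185 case inside the Spark images)."""
    uid = os.getuid() if uid is None else uid
    try:
        return pwd.getpwuid(uid).pw_name
    except KeyError:
        # No /etc/passwd entry for this UID -- this is what surfaces as
        # Hadoop's "failure to login" IOException inside the container.
        return None
```

A common container-side workaround is for the entrypoint to append an entry for the current UID to {{/etc/passwd}} (when the file is writable) before launching the JVM, so that such lookups succeed.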
[jira] [Commented] (SPARK-43166) Docker images are missing passwd entry for UID 185
    [ https://issues.apache.org/jira/browse/SPARK-43166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17713232#comment-17713232 ]

Derek Brown commented on SPARK-43166:
-------------------------------------

I created a PR for this here: https://github.com/apache/spark/pull/40798

> Docker images are missing passwd entry for UID 185
> --------------------------------------------------
>
>                 Key: SPARK-43166
>                 URL: https://issues.apache.org/jira/browse/SPARK-43166
>             Project: Spark
>          Issue Type: Bug
>          Components: python, R
>    Affects Versions: 3.4.0
>            Reporter: Derek Brown
>            Priority: Minor
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> Currently, the official Spark Docker images run as UID {{185}}, but no corresponding entry exists in {{/etc/passwd}}. This causes [issues|https://stackoverflow.com/questions/41864985/hadoop-ioexception-failure-to-login] when libraries try to fetch the current Unix username.
[jira] [Created] (SPARK-43366) Spark Driver Bind Address is off-by-one
Derek Brown created SPARK-43366:
-----------------------------------

             Summary: Spark Driver Bind Address is off-by-one
                 Key: SPARK-43366
                 URL: https://issues.apache.org/jira/browse/SPARK-43366
             Project: Spark
          Issue Type: Bug
          Components: Block Manager
    Affects Versions: 3.3.3
            Reporter: Derek Brown

I have the following environment variable set in my driver pod configuration:

{code:java}
SPARK_DRIVER_BIND_ADDRESS=10.244.0.53{code}

However, I see an off-by-one IP address being referred to in the Spark logs:

{code:java}
23/05/04 02:37:03 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.244.0.54:53140) with ID 1, ResourceProfileId 0
23/05/04 02:37:03 INFO BlockManagerMasterEndpoint: Registering block manager 10.244.0.54:32805 with 413.9 MiB RAM, BlockManagerId(1, 10.244.0.54, 32805, None){code}

I am not sure why this might be the case.
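Given the bind address and the logged address above, one quick sanity check is whether 10.244.0.54 is simply another pod's IP (executor pods receive their own addresses from the same CIDR as the driver) rather than a corrupted driver address. A small illustrative sketch; the pod names and the name-to-IP mapping below are hypothetical, standing in for the output of `kubectl get pods -o wide`:

```python
def owner_of(ip, pod_ips):
    """Return the name of the pod that owns `ip`, or None if no pod matches."""
    return next((name for name, addr in pod_ips.items() if addr == ip), None)


# Hypothetical mapping, mirroring the addresses in the log excerpt above.
pod_ips = {
    "spark-driver": "10.244.0.53",  # matches SPARK_DRIVER_BIND_ADDRESS
    "spark-exec-1": "10.244.0.54",  # the address the logs report
}

owner_of("10.244.0.54", pod_ips)  # -> "spark-exec-1", not the driver
```

If the second address maps cleanly to an executor pod, the log lines describe the executor's block manager registration rather than a mis-derived driver bind address.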