Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22904#discussion_r238782502
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/SparkKubernetesClientFactory.scala
---
@@ -67,8 +66,16 @@ private
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22904#discussion_r238779965
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/SparkKubernetesClientFactory.scala
---
@@ -67,8 +66,16 @@ private
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22904#discussion_r238484145
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/SparkKubernetesClientFactory.scala
---
@@ -67,8 +66,16 @@ private
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22904#discussion_r238483901
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/SparkKubernetesClientFactory.scala
---
@@ -67,8 +66,16 @@ private
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22904#discussion_r237447038
--- Diff: docs/running-on-kubernetes.md ---
@@ -298,6 +298,16 @@ the Spark application.
## Kubernetes Features
+### Configuration File
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22904
@liyinan926 Any chance of rounding up some other folks to get this reviewed
and merged?
---
-
To unsubscribe, e-mail: reviews
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/23017
Rebased to catch up with master and adapt for @vanzin's improvements to
Docker build context from PR #23019
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22904
Rebased to bring up to date with master and adapt to @vanzin's changes from
PR #23019
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/23017#discussion_r237441889
--- Diff: docs/running-on-kubernetes.md ---
@@ -19,9 +19,9 @@ Please see [Spark Security](security.html) and the
specific advice below before
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/23053
Probably also want to update `docs/running-on-kubernetes.md` to make it
clear that you now have to opt into building the additional language bindings
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/23053
@ramaddepally Sorry, clearly not reading straight today!
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/23017
Have now added the doc updates to this, so I think this is ready for final
review and merging
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/23017#discussion_r234970716
--- Diff:
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/ClientModeTestsSuite.scala
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/23053#discussion_r234968357
--- Diff: bin/docker-image-tool.sh ---
@@ -41,6 +41,18 @@ function image_ref {
echo "$image"
}
+function docker_push {
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/23053#discussion_r234967639
--- Diff: bin/docker-image-tool.sh ---
@@ -102,33 +114,37 @@ function build {
error "Failed to build Spark JVM Docker image, please refer to D
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/23053#discussion_r234968048
--- Diff: bin/docker-image-tool.sh ---
@@ -41,6 +41,18 @@ function image_ref {
echo "$image"
}
+function docker_push {
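The diff snippets above show only the `docker_push` function's opening line; its body is truncated. As a hedged sketch only, a push helper with the kind of error checking discussed in this PR might look like the following (the body and error message are assumptions, not the merged code):

```shell
# Hedged sketch: push an image and surface a readable error on failure.
# The actual function body in bin/docker-image-tool.sh is truncated above.
function docker_push {
  local image_ref="$1"
  if ! docker push "$image_ref"; then
    echo "Failed to push Docker image $image_ref" >&2
    return 1
  fi
}
```

Returning non-zero (rather than exiting) lets the caller decide whether a failed push is fatal.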
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/23053#discussion_r234968130
--- Diff: bin/docker-image-tool.sh ---
@@ -102,33 +114,37 @@ function build {
error "Failed to build Spark JVM Docker image, please refer to D
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/23017#discussion_r234146162
--- Diff:
resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile ---
@@ -53,5 +54,9 @@ COPY data /opt/spark/data
ENV SPARK_HOME /opt
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/23013#discussion_r234144540
--- Diff: docs/running-on-kubernetes.md ---
@@ -15,7 +15,19 @@ container images and entrypoints.**
# Security
Security in Spark is OFF
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/23017#discussion_r234143917
--- Diff:
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/ClientModeTestsSuite.scala
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/23013
@srowen I'm happy with it
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/23017
> noted test issue. let's kick off test though
@felixcheung This is now resolved, please kick off a retest when you get the
chance
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/23017
Resolved the issue with the client mode test. The test itself was actually
badly written in that it used the Spark images but overrode the entry point
which avoided the logic that sets up the `/etc
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/23017#discussion_r233560448
--- Diff:
resource-managers/kubernetes/docker/src/main/dockerfiles/spark/entrypoint.sh ---
@@ -30,6 +30,10 @@ set -e
# If there is no passwd entry
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/23017#discussion_r233385383
--- Diff:
resource-managers/kubernetes/docker/src/main/dockerfiles/spark/entrypoint.sh ---
@@ -30,6 +30,10 @@ set -e
# If there is no passwd entry
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/23013
@tgravescs Re: Point 1, I have a separate PR, #22904, which makes some
improvements to the docs around that point
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/23013
@mccheah I have tried to keep it minimal and just point to the official K8S
docs. Obviously there is a balance to be had between high level warnings and
detailed advice. K8S is still a relatively
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/23017
For those with more knowledge of client mode here is the specific error
seen in the integration tests:
```
Exception in thread "main" java.lang.IllegalArgumentException: ba
GitHub user rvesse opened a pull request:
https://github.com/apache/spark/pull/23017
[WIP][SPARK-26015][K8S] Set a default UID for Spark on K8S Images
## What changes were proposed in this pull request?
Adds USER directives to the Dockerfiles which is configurable via build
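The description above is truncated at "configurable via build", presumably build arguments. As a hedged sketch of what building with an overridden UID might look like (the build-arg name `spark_uid` and the image tag are assumptions for illustration; only the Dockerfile path is taken from the diffs in this thread):

```shell
# Build the Spark JVM image with a non-root UID baked in via a build argument.
# The build-arg name (spark_uid) is an assumption based on the PR description.
docker build \
  --build-arg spark_uid=1000 \
  -t my-spark:nonroot \
  -f resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile \
  .
```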
GitHub user rvesse opened a pull request:
https://github.com/apache/spark/pull/23013
[SPARK-25023] More detailed security guidance for K8S
## What changes were proposed in this pull request?
Highlights specific security issues to be aware of with Spark on K8S
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22959#discussion_r231838596
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/KubernetesConf.scala
---
@@ -112,125 +72,139 @@ private[spark] case
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22959
First glance this looks like a lot of nice simplification, will take a
proper look over this tomorrow
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22911#discussion_r231195413
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KerberosConfDriverFeatureStep.scala
---
@@ -126,20 +134,53
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22904
@mccheah I have made the requested changes, can I get another review please?
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22904#discussion_r230318413
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala
---
@@ -23,6 +23,18 @@ import
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22805
@mccheah I have updated the comment to reference the follow-up issue and
opened a PR for that as #22904. Can we go ahead and merge now?
GitHub user rvesse opened a pull request:
https://github.com/apache/spark/pull/22904
[SPARK-25887][K8S] Configurable K8S context support
## What changes were proposed in this pull request?
This enhancement allows for specifying the desired context to use for
the initial
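The PR description is truncated above. As a hedged sketch of what configurable context selection looks like from the user's side (the property name `spark.kubernetes.context` follows the PR title for SPARK-25887; the cluster URL, image name, and jar path are placeholders):

```shell
# List the contexts defined in your kubeconfig
kubectl config get-contexts

# Submit against a named context rather than the kubeconfig's current-context
# (property name assumed from SPARK-25887; URL and paths are placeholders)
spark-submit \
  --master k8s://https://k8s-apiserver.example:6443 \
  --deploy-mode cluster \
  --conf spark.kubernetes.context=dev-cluster \
  --conf spark.kubernetes.container.image=my-spark:latest \
  --class org.apache.spark.examples.SparkPi \
  local:///opt/spark/examples/jars/spark-examples.jar
```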
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22805#discussion_r229630011
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/SparkKubernetesClientFactory.scala
---
@@ -63,6 +66,8 @@ private
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22805
Did a bunch more testing on our internal K8S clusters today after rebasing
this onto master. I am now happy that this is ready for final review and
merging so I have removed the `[WIP]` tag
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22805#discussion_r229400780
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/SparkKubernetesClientFactory.scala
---
@@ -42,6 +42,9 @@ private
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22805#discussion_r229395036
--- Diff:
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/IntegrationTestBackend.scala
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22805#discussion_r228470066
--- Diff:
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/Utils.scala
---
@@ -27,4 +27,36
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22805#discussion_r228470004
--- Diff:
resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh
---
@@ -71,19 +71,36 @@ if [[ $IMAGE_TAG == &q
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22805#discussion_r228469510
--- Diff:
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/Utils.scala
---
@@ -27,4 +27,36
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22805#discussion_r228467937
--- Diff:
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/IntegrationTestBackend.scala
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22805#discussion_r228467705
--- Diff: resource-managers/kubernetes/integration-tests/README.md ---
@@ -41,12 +71,127 @@ The Spark code to test is handed to the integration
test system
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22805#discussion_r228467766
--- Diff: resource-managers/kubernetes/integration-tests/README.md ---
@@ -13,15 +13,45 @@ The simplest way to run the integration tests is to
install
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22805#discussion_r228467591
--- Diff:
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/cloud/KubeConfigBackend.scala
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22805#discussion_r228467650
--- Diff:
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/cloud/KubeConfigBackend.scala
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22805
@liyinan926 I will rebase and squash appropriately once PR #22820 is merged
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22820#discussion_r228458947
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/KubernetesUtils.scala
---
@@ -157,7 +157,9 @@ private[spark] object
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22820#discussion_r228458281
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/KubernetesUtils.scala
---
@@ -157,7 +157,9 @@ private[spark] object
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22805
Ran successfully against one of our dev K8S clusters today:
(attached screenshot: successful integration test run, 2018-10-25)
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22805
@ifilonenko I have done the generalisation today since it was fairly
trivial and it actually resolves a number of concerns about the first pass
implementation
@skonto I have restored
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22805#discussion_r227835021
--- Diff:
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/docker/DockerForDesktopBackend.scala
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22805#discussion_r227833846
--- Diff:
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/docker/DockerForDesktopBackend.scala
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22805
@skonto Yep, I plan to do that tomorrow
At least for my `minikube` instance I found 4g insufficient and a couple of
tests would fail because their pods didn't get scheduled. 8g is probably
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22805
@srowen Yes, there are a lot of assumptions made by the integration tests
that are not documented anywhere and that I figured out by digging into the
code and POMs.
Broadly speaking right now
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22805#discussion_r227400699
--- Diff:
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/docker/DockerForDesktopBackend.scala
GitHub user rvesse opened a pull request:
https://github.com/apache/spark/pull/22805
[WIP][SPARK-25809][K8S][TEST] New K8S integration testing backends
## What changes were proposed in this pull request?
Currently K8S integration tests are hardcoded to use a `minikube
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22782#discussion_r227373891
--- Diff: bin/docker-image-tool.sh ---
@@ -79,7 +79,7 @@ function build {
fi
# Verify that Spark has actually been built/is a runnable
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22782#discussion_r227274978
--- Diff: bin/docker-image-tool.sh ---
@@ -79,7 +79,7 @@ function build {
fi
# Verify that Spark has actually been built/is a runnable
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22782#discussion_r226983785
--- Diff: bin/docker-image-tool.sh ---
@@ -79,7 +79,7 @@ function build {
fi
# Verify that Spark has actually been built/is a runnable
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22748
Rebased onto master, should be ready for merging
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22681
I actually just got bit by this today while trying to run K8S integration
tests with custom images. The integration tests assume the runnable
distribution layout for `/opt/spark/examples
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22681#discussion_r226289881
--- Diff:
resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile ---
@@ -18,6 +18,7 @@
FROM openjdk:8-alpine
ARG
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22748#discussion_r225867655
--- Diff: bin/docker-image-tool.sh ---
@@ -78,20 +91,38 @@ function build {
docker build $NOCACHEARG "${BUILD_ARGS[@]}" \
-t $
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22748#discussion_r225867566
--- Diff: bin/docker-image-tool.sh ---
@@ -44,28 +44,41 @@ function image_ref {
function build {
local BUILD_ARGS
local IMG_PATH
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22748
> There seems to be overlapping logic between this PR and #22681
Yes sorry, I was having issues with the script while working on something
unrelated and hadn't realised your integrat
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22608#discussion_r225614226
--- Diff: bin/docker-image-tool.sh ---
@@ -71,18 +71,29 @@ function build {
--build-arg
base_img=$(image_ref spark)
)
- local
GitHub user rvesse opened a pull request:
https://github.com/apache/spark/pull/22748
[SPARK-25745][K8S] Improve docker-image-tool.sh script
## What changes were proposed in this pull request?
Adds error checking and handling to `docker` invocations ensuring the
script
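The description is truncated above, but the repeated diff snippets in this thread show the script verifying "that Spark has actually been built/is a runnable" distribution. A hedged sketch of that style of check (the helper name and exact layout test are assumptions; a runnable Spark distribution does ship a `jars/` directory):

```shell
# Hedged sketch: fail early, with a readable message, if the given directory
# does not look like a runnable Spark distribution.
function check_runnable_dist {
  local spark_home="$1"
  if [ ! -d "$spark_home/jars" ]; then
    echo "Cannot find $spark_home/jars; build a runnable distribution first" >&2
    return 1
  fi
}
```

Checking before any `docker build` is invoked avoids a confusing failure part-way through the image build.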
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22748
Suggested reviewers: @mccheah @liyinan926 @skonto
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22146#discussion_r223620758
--- Diff: docs/running-on-kubernetes.md ---
@@ -799,4 +815,168 @@ specific to Spark on Kubernetes.
This sets the major Python version of the docker
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22146
@mccheah I was taking that as a given
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/21669#discussion_r221539921
--- Diff: docs/security.md ---
@@ -729,6 +729,15 @@ so that non-local processes can authenticate. These
delegation tokens in Kuberne
shared
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22584#discussion_r221538327
--- Diff: docs/running-on-kubernetes.md ---
@@ -799,7 +799,7 @@ specific to Spark on Kubernetes.
spark.kubernetes.local.dirs.tmpfs
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/21669
@vanzin I think in the current implementation of this PR the Kerberos login
is happening inside the driver pod which is running inside the K8S cluster.
The old design from the Spark on K8S
Github user rvesse closed the pull request at:
https://github.com/apache/spark/pull/22256
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22256
Closed in favour of #22323 which has been merged
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/21669#discussion_r215695909
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala
---
@@ -212,6 +212,60 @@ private[spark] object Config
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/21669#discussion_r215692843
--- Diff: docs/security.md ---
@@ -722,6 +722,62 @@ with encryption, at least.
The Kerberos login will be periodically renewed using the provided
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22215
Think this is pretty much ready to merge; can folks take another look when
they get a chance?
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22323
All comments so far addressed, can we kick off the PR builder on this now?
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22323#discussion_r215625508
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/LocalDirsFeatureStep.scala
---
@@ -22,6 +22,7 @@ import
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22323#discussion_r215625448
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala
---
@@ -225,6 +225,15 @@ private[spark] object Config
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22323#discussion_r215625299
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/LocalDirsFeatureStep.scala
---
@@ -45,6 +47,10 @@ private
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22323#discussion_r215625636
--- Diff: docs/running-on-kubernetes.md ---
@@ -215,6 +215,19 @@
spark.kubernetes.driver.volumes.persistentVolumeClaim.checkpointpvc.options.clai
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22323#discussion_r215625362
--- Diff: docs/running-on-kubernetes.md ---
@@ -215,6 +215,19 @@
spark.kubernetes.driver.volumes.persistentVolumeClaim.checkpointpvc.options.clai
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22323#discussion_r215338145
--- Diff: docs/running-on-kubernetes.md ---
@@ -215,6 +215,19 @@
spark.kubernetes.driver.volumes.persistentVolumeClaim.checkpointpvc.options.clai
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22323#discussion_r215187426
--- Diff: docs/running-on-kubernetes.md ---
@@ -215,6 +215,19 @@
spark.kubernetes.driver.volumes.persistentVolumeClaim.checkpointpvc.options.clai
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/21669
@ifilonenko I think the issue with the `UnixUsername` might possibly be
avoided by exporting `HADOOP_USER_NAME` as an environment variable in the pod
spec set to the same value as `SPARK_USER
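A hedged sketch of the suggested workaround, as it might appear in a container entrypoint (the helper name and guard logic are assumptions; only the variable names `HADOOP_USER_NAME` and `SPARK_USER` come from the comment above):

```shell
# Mirror SPARK_USER into HADOOP_USER_NAME so Hadoop's user lookup does not
# depend on a passwd entry existing for the container's UID.
# Guard logic (only set when unset) is an assumption for illustration.
set_hadoop_user_name() {
  if [ -n "$SPARK_USER" ] && [ -z "$HADOOP_USER_NAME" ]; then
    export HADOOP_USER_NAME="$SPARK_USER"
  fi
}
```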
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22256#discussion_r214623272
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/LocalDirsFeatureStep.scala
---
@@ -37,41 +40,99 @@ private
GitHub user rvesse opened a pull request:
https://github.com/apache/spark/pull/22323
[SPARK-25262][K8S] Allow SPARK_LOCAL_DIRS to be tmpfs backed on K8S
## What changes were proposed in this pull request?
The default behaviour of Spark on K8S currently is to create
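The property name `spark.kubernetes.local.dirs.tmpfs` appears in the docs diffs quoted later in this thread. A hedged usage sketch (the cluster URL, image name, jar path, and overhead value are placeholders; `spark.executor.memoryOverhead` is suggested because, per the discussion below, tmpfs usage counts towards the container's memory limit):

```shell
# Back SPARK_LOCAL_DIRS with tmpfs (RAM-backed emptyDir) rather than on-disk
# emptyDir volumes. tmpfs usage counts towards the container's memory limit,
# so the memory overhead is raised here (value is illustrative).
spark-submit \
  --master k8s://https://k8s-apiserver.example:6443 \
  --deploy-mode cluster \
  --conf spark.kubernetes.local.dirs.tmpfs=true \
  --conf spark.executor.memoryOverhead=2g \
  --conf spark.kubernetes.container.image=my-spark:latest \
  --class org.apache.spark.examples.SparkPi \
  local:///opt/spark/examples/jars/spark-examples.jar
```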
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22215#discussion_r214614277
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/scheduler/cluster/k8s/ExecutorPodsLifecycleManager.scala
---
@@ -151,13 +152,15
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22215
@mccheah Thanks for the review, have made the change you suggested to use
N/A instead of empty string.
I have left indentation as tabs for now, as I said in a previous comment
this was just
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22215#discussion_r214612510
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/KubernetesUtils.scala
---
@@ -60,4 +64,81 @@ private[spark] object
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22256#discussion_r214416634
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/LocalDirsFeatureStep.scala
---
@@ -37,41 +40,99 @@ private
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22256#discussion_r214277672
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/LocalDirsFeatureStep.scala
---
@@ -37,41 +40,99 @@ private
Github user rvesse commented on the issue:
https://github.com/apache/spark/pull/22256
@skonto I haven't done anything specific for the size limit ATM. From the
K8S docs, `tmpfs`-backed `emptyDir` usage counts towards your container's
memory limits so you can just set
Github user rvesse commented on a diff in the pull request:
https://github.com/apache/spark/pull/22256#discussion_r213954156
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/LocalDirsFeatureStep.scala
---
@@ -37,41 +40,99 @@ private