Github user AzureQ commented on the issue:
https://github.com/apache/spark/pull/23037
> ok, I give up on flaky tests.
>
> Merging to master.
Thanks!
Github user AzureQ commented on the issue:
https://github.com/apache/spark/pull/23037
Made the change... not sure why the test build errors out with a Hive-related issue.
Github user AzureQ commented on the issue:
https://github.com/apache/spark/pull/23037
Cou
> When I tried to write automated tests for pyspark in the past it was kind
of a pain. It doesn't work the way you expect unless you have a
pseudo-terminal, apparently.
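The pseudo-terminal issue in the quote above can be worked around by wrapping the REPL in the util-linux `script` utility, which allocates a pty for the child process. A minimal sketch, assuming a Linux image where `script` is installed (BSD/macOS `script` takes different flags); the `python3 -c` one-liner is only a stand-in to show the child sees a tty:

```scala
import scala.sys.process._

// Minimal sketch of the pty workaround, assuming util-linux `script`.
// `script` allocates a pseudo-terminal for the child process, so a REPL
// run under it believes it is running interactively.
object PtyDemo {
  def main(args: Array[String]): Unit = {
    val out = new StringBuilder
    val cmd = Seq("script", "-qec",                      // -q quiet, -e child exit code, -c command
      "python3 -c 'import sys; print(sys.stdin.isatty())'", // prints True under a pty
      "/dev/null")                                       // discard the typescript file
    val exit = cmd.!(ProcessLogger(line => out.append(line).append('\n')))
    println(s"exit=$exit output=${out.toString.trim}")
  }
}
```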
Github user AzureQ commented on the issue:
https://github.com/apache/spark/pull/23037
@vanzin @ifilonenko I'm able to capture stdout of `spark-shell` but not
`pyspark` or `sparkR`; it needs more investigation. Do you happen to know
why?
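For reference, capturing `spark-shell` output works with a plain `ProcessBuilder` pipe because the Scala REPL tolerates a non-tty stdin; the Python and R shells switch to non-interactive, block-buffered mode in the same situation, which is the likely reason `pyspark` and `sparkR` behave differently. A minimal sketch, with `/opt/spark` as an assumed install path:

```scala
import java.io.{BufferedReader, InputStreamReader, PrintWriter}

// Minimal sketch: drive spark-shell over a pipe and capture its stdout.
// /opt/spark is an assumed install path, not taken from the PR.
object ShellCapture {
  def main(args: Array[String]): Unit = {
    val pb = new ProcessBuilder("/opt/spark/bin/spark-shell")
    pb.redirectErrorStream(true) // merge stderr into stdout
    val proc = pb.start()

    // Feed commands over stdin, then close it so the REPL exits.
    val stdin = new PrintWriter(proc.getOutputStream, true)
    stdin.println("""println("smoke-test-ok")""")
    stdin.println(":quit")
    stdin.close()

    // Read everything the REPL printed.
    val reader = new BufferedReader(new InputStreamReader(proc.getInputStream))
    Iterator.continually(reader.readLine()).takeWhile(_ != null).foreach(println)
    proc.waitFor()
  }
}
```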
Github user AzureQ commented on a diff in the pull request:
https://github.com/apache/spark/pull/23037#discussion_r237618024
--- Diff:
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/PythonTestsSuite.scala
---
@@ -89,6
Github user AzureQ commented on the issue:
https://github.com/apache/spark/pull/23037
@ifilonenko I added the test case in `PythonTestsSuite` since it's pyspark-specific.
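The actual diff is elided above; purely as an illustration, a pyspark smoke test of this shape could combine the `script` pty trick with a ScalaTest suite. Everything here (suite name, banner text, the `SPARK_HOME` default) is an assumption, not the code from the PR:

```scala
import org.scalatest.funsuite.AnyFunSuite
import scala.sys.process._

// Rough sketch only -- not the code from the PR. Assumes ScalaTest and a
// Linux environment with util-linux `script` available.
class PySparkReplSmokeTest extends AnyFunSuite {
  test("pyspark REPL starts and prints its banner") {
    val sparkHome = sys.env.getOrElse("SPARK_HOME", "/opt/spark") // assumed default
    val out = new StringBuilder
    // Pipe `exit()` into `script`, which forwards its stdin to the pty,
    // so pyspark still sees an interactive terminal.
    val cmd = Seq("bash", "-c",
      s"""echo "exit()" | script -qec "$sparkHome/bin/pyspark" /dev/null""")
    val exit = cmd.!(ProcessLogger(line => out.append(line).append('\n')))
    assert(exit === 0)
    // Banner text is an assumption about pyspark's startup output.
    assert(out.toString.contains("SparkSession available"))
  }
}
```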
Github user AzureQ commented on the issue:
https://github.com/apache/spark/pull/23037
> > This is fine, but please file a bug.
>
> Okay, as such, @AzureQ could you add an integration test to
`ClientModeTestsSuite`
Github user AzureQ commented on the issue:
https://github.com/apache/spark/pull/23037
> I see this customization as specific to how you build your custom
Docker image. Unless it is relevant to testing, we are trying to keep the
default Docker image as lightweight as possible.
GitHub user AzureQ opened a pull request:
https://github.com/apache/spark/pull/23037
[MINOR][k8s] Add Copy pyspark into corresponding dir cmd in pyspark
Dockerfile
When I try to run the `./bin/pyspark` cmd in a pod in Kubernetes (image built
without changes from the pyspark Dockerfile), I'm