> `pyspark.tests.test_broadcast`
> java -version
openjdk version "11.0.17" 2022-10-18
OpenJDK Runtime Environment Homebrew (build 11.0.17+0)
OpenJDK 64-Bit Server VM Homebrew (build 11.0.17+0, mixed mode)
> OS
macOS Ventura 13.1 (22C65)
Best,
Adam Chhina
> On Jan 18, 2023
---
```
Ran 7 tests in 12.950s
FAILED (errors=7)
sys:1: ResourceWarning: unclosed file <_io.BufferedWriter name=4>
Had test failures in pyspark.tests.test_broadcast with /usr/local/bin/python3;
see logs.
```
Best,
Adam Chhina
> On Jan 18, 2023, at 5:03 PM, Sean Owen wrote:
but the Python executable used for running these
tests is Python 3.10.9, under `/usr/local/bin/python3`.
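If the mismatch between the interpreter on PATH and the one the suite reports is the concern, it may help to confirm up front which `python3` the shell resolves before invoking the suite. A small sketch (the `--python-executables` flag shown in the comment is based on my reading of `python/run-tests --help`, so verify it in your checkout):

```shell
# Sketch: confirm which python3 the shell resolves before running the tests.
py=$(command -v python3)
echo "resolved interpreter: $py"
"$py" --version

# Assumption: python/run-tests accepts --python-executables to pin the
# interpreter explicitly (check `python/run-tests --help` in your checkout):
#   python/run-tests --python-executables="$py" \
#     --testnames 'pyspark.tests.test_broadcast'
```

Pinning the interpreter this way avoids the suite silently picking up a different `python3` than the one you tested with.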
Best,
Adam Chhina
> On Jan 18, 2023, at 3:05 PM, Bjørn Jørgensen wrote:
>
> Replace
> > > git clone git@github.com:apache/spark.git
> > > git checkout -b spark-321
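For what it's worth, `git checkout -b spark-321` on its own branches from whatever HEAD currently is; to test a release candidate you generally want to branch from its tag. A minimal demonstration of the pattern on a throwaway repository (the tag name matches the RC discussed here; in the real Spark checkout you would run only the final two commands):

```shell
set -e
# Throwaway repo to illustrate branching from a tag, as you would for the
# v3.2.1-rc2 tag in the actual spark checkout.
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "init"
git tag v3.2.1-rc2

git checkout -q -b spark-321 v3.2.1-rc2   # new branch pointing at the tag
git branch --show-current                  # prints: spark-321
```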
PR for any
docs if required afterwards); however, if there is a more appropriate
place, please let me know.
Best,
Adam Chhina
> On Dec 27, 2022, at 11:37 AM, Adam Chhina wrote:
>
> As part of an upgrade I was looking to run upstream PySpark unit tests on
> `v3.2.1-rc2` before …
… whether an upstream test is failing for a
specific release?
3. Would it be possible to configure the run-tests script to run all
tests regardless of test failures?
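On question 3: I'm not aware of a built-in flag for this, but one workaround is to drive the modules from a shell loop so a failure in one module doesn't stop the rest. Below is the pattern with a placeholder `run_module` function standing in for a real `python/run-tests --testnames "$m"` invocation (the module names are illustrative, not real PySpark modules):

```shell
# "Keep going on failure" pattern; run_module is a stand-in for a real
# `python/run-tests --testnames "$m"` call.
run_module() { [ "$1" != "mod_bad" ]; }  # pretend only mod_bad fails

fail=0
for m in mod_ok mod_bad mod_ok2; do
  if run_module "$m"; then
    echo "PASS $m"
  else
    echo "FAIL $m"
    fail=1
  fi
done
echo "failures seen: $fail"   # prints: failures seen: 1
```

Exiting with `$fail` at the end preserves an overall pass/fail signal for CI while still letting every module run.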
Any help would be much appreciated!
Best,
Adam Chhina