+1 (non-binding). Ran the k8s tests with Scala 2.12. I also included the
RTestsSuite (mentioned by Ilan), although it is not part of the 2.4 RC tag:

[INFO] --- scalatest-maven-plugin:1.0:test (integration-test) @ spark-kubernetes-integration-tests_2.12 ---
Discovery starting.
Discovery completed in 239 milliseconds.
Run starting. Expected test count is: 15
KubernetesSuite:
- Run SparkPi with no resources
- Run SparkPi with a very long application name.
- Use SparkLauncher.NO_RESOURCE
- Run SparkPi with a master URL without a scheme.
- Run SparkPi with an argument.
- Run SparkPi with custom labels, annotations, and environment variables.
- Run extraJVMOptions check on driver
- Run SparkRemoteFileTest using a remote data file
- Run SparkPi with env and mount secrets.
- Run PySpark on simple pi.py example
- Run PySpark with Python2 to test a pyfiles example
- Run PySpark with Python3 to test a pyfiles example
- Run PySpark with memory customization
- Run in client mode.
- Run SparkR on simple dataframe.R example
Run completed in 6 minutes, 32 seconds.
Total number of tests run: 15
Suites: completed 2, aborted 0
Tests: succeeded 15, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM 2.4.0 ..................... SUCCESS [  4.480 s]
[INFO] Spark Project Tags ................................. SUCCESS [  3.898 s]
[INFO] Spark Project Local DB ............................. SUCCESS [  2.773 s]
[INFO] Spark Project Networking ........................... SUCCESS [  5.063 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  2.651 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [  2.662 s]
[INFO] Spark Project Launcher ............................. SUCCESS [  5.103 s]
[INFO] Spark Project Core ................................. SUCCESS [ 25.703 s]
[INFO] Spark Project Kubernetes Integration Tests 2.4.0 ... SUCCESS [06:51 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 07:44 min
[INFO] Finished at: 2018-10-23T19:09:41Z
[INFO] ------------------------------------------------------------------------

Stavros

On Tue, Oct 23, 2018 at 9:46 PM, Dongjoon Hyun <dongjoon.h...@gmail.com>
wrote:

> BTW, for that integration suite, I saw the related artifacts in the RC4
> staging directory.
>
> Does Spark 2.4.0 need to start releasing these
> `spark-kubernetes-integration-tests` artifacts?
>
>    - https://repository.apache.org/content/repositories/orgapachespark-1290/org/apache/spark/spark-kubernetes-integration-tests_2.11/
>    - https://repository.apache.org/content/repositories/orgapachespark-1290/org/apache/spark/spark-kubernetes-integration-tests_2.12/
>
> Historically, Spark released `spark-docker-integration-tests` artifacts in
> the Spark 1.6.x era and stopped doing so as of Spark 2.0.0.
>
>    - http://central.maven.org/maven2/org/apache/spark/spark-docker-integration-tests_2.10/
>    - http://central.maven.org/maven2/org/apache/spark/spark-docker-integration-tests_2.11/
>
>
> Bests,
> Dongjoon.
>
> On Tue, Oct 23, 2018 at 11:43 AM Stavros Kontopoulos
> <stavros.kontopoulos@lightbend.com> wrote:
>
>> Sean,
>>
>> Ok, makes sense; I'm using a cloned repo. I built with the Scala 2.12
>> profile using the related tag v2.4.0-rc4:
>>
>> ./dev/change-scala-version.sh 2.12
>> ./dev/make-distribution.sh --name test --r --tgz -Pscala-2.12 -Psparkr \
>>   -Phadoop-2.7 -Pkubernetes -Phive
>> I pushed the images to Docker Hub (see my previous email) since I didn't
>> use Minikube's Docker daemon (the default behavior).
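>>
>> Roughly, that image build/push step uses the tool bundled in the
>> distribution (a sketch, run from the unpacked distribution; the repo
>> "skonto" and tag "k8s-scala-12" match my run, adjust to your registry):
>>
>>   ./bin/docker-image-tool.sh -r skonto -t k8s-scala-12 build
>>   ./bin/docker-image-tool.sh -r skonto -t k8s-scala-12 push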
>>
>> Then I ran the tests successfully against Minikube:
>>
>> TGZ_PATH=$(pwd)/spark-2.4.0-bin-test.gz
>> cd resource-managers/kubernetes/integration-tests
>>
>> ./dev/dev-run-integration-tests.sh --spark-tgz $TGZ_PATH \
>>   --service-account default --namespace default \
>>   --image-tag k8s-scala-12 --image-repo skonto
>>
>>
>> [INFO]
>> [INFO] --- scalatest-maven-plugin:1.0:test (integration-test) @ spark-kubernetes-integration-tests_2.12 ---
>> Discovery starting.
>> Discovery completed in 229 milliseconds.
>> Run starting. Expected test count is: 14
>> KubernetesSuite:
>> - Run SparkPi with no resources
>> - Run SparkPi with a very long application name.
>> - Use SparkLauncher.NO_RESOURCE
>> - Run SparkPi with a master URL without a scheme.
>> - Run SparkPi with an argument.
>> - Run SparkPi with custom labels, annotations, and environment variables.
>> - Run extraJVMOptions check on driver
>> - Run SparkRemoteFileTest using a remote data file
>> - Run SparkPi with env and mount secrets.
>> - Run PySpark on simple pi.py example
>> - Run PySpark with Python2 to test a pyfiles example
>> - Run PySpark with Python3 to test a pyfiles example
>> - Run PySpark with memory customization
>> - Run in client mode.
>> Run completed in 5 minutes, 24 seconds.
>> Total number of tests run: 14
>> Suites: completed 2, aborted 0
>> Tests: succeeded 14, failed 0, canceled 0, ignored 0, pending 0
>> All tests passed.
>> [INFO] ------------------------------------------------------------------------
>> [INFO] Reactor Summary:
>> [INFO]
>> [INFO] Spark Project Parent POM 2.4.0 ..................... SUCCESS [  4.491 s]
>> [INFO] Spark Project Tags ................................. SUCCESS [  3.833 s]
>> [INFO] Spark Project Local DB ............................. SUCCESS [  2.680 s]
>> [INFO] Spark Project Networking ........................... SUCCESS [  4.817 s]
>> [INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  2.541 s]
>> [INFO] Spark Project Unsafe ............................... SUCCESS [  2.795 s]
>> [INFO] Spark Project Launcher ............................. SUCCESS [  5.593 s]
>> [INFO] Spark Project Core ................................. SUCCESS [ 25.160 s]
>> [INFO] Spark Project Kubernetes Integration Tests 2.4.0 ... SUCCESS [05:30 min]
>> [INFO] ------------------------------------------------------------------------
>> [INFO] BUILD SUCCESS
>> [INFO] ------------------------------------------------------------------------
>> [INFO] Total time: 06:23 min
>> [INFO] Finished at: 2018-10-23T18:39:11Z
>> [INFO] ------------------------------------------------------------------------
>>
>>
>> but I had to modify this line
>> <https://github.com/apache/spark/blob/master/resource-managers/kubernetes/integration-tests/dev/dev-run-integration-tests.sh#L106>
>> and add -Pscala-2.12, otherwise it fails (these tests inherit from the
>> parent POM, but the profile is not propagated to the mvn command that
>> launches the tests; I can create a PR to fix that).
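>>
>> A sketch of that local change (illustrative only; the script's real mvn
>> invocation carries more profiles and -D properties, elided here as "...",
>> and $TEST_ROOT_DIR is the variable the script itself uses):
>>
>>   # append the Scala profile to the mvn call at the end of the script:
>>   build/mvn integration-test -f "$TEST_ROOT_DIR/pom.xml" -Pscala-2.12 ...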
>>
>>
>> On Tue, Oct 23, 2018 at 7:44 PM, Hyukjin Kwon <gurwls...@gmail.com>
>> wrote:
>>
>>> https://github.com/apache/spark/pull/22514 sounds like a regression
>>> that affects Hive CTAS in the write path (Hive relations are not
>>> replaced with Spark's internal data sources, hence a performance
>>> regression), but I doubt we should block the release on this.
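>>>
>>> For context, a minimal sketch of the kind of statement affected
>>> (assuming a Hive-enabled build like the one above; the table name and
>>> query are made up):
>>>
>>>   # a Hive-serde CTAS, the write path discussed in the PR:
>>>   $SPARK_HOME/bin/spark-sql -e \
>>>     "CREATE TABLE t STORED AS PARQUET AS SELECT 1 AS id"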
>>>
>>> https://github.com/apache/spark/pull/22144 is just being discussed if I
>>> am not mistaken.
>>>
>>> Thanks.
>>>
>>> On Wed, Oct 24, 2018 at 12:27 AM, Xiao Li <gatorsm...@gmail.com> wrote:
>>>
>>>> https://github.com/apache/spark/pull/22144 is also not a blocker of
>>>> Spark 2.4 release, as discussed in the PR.
>>>>
>>>> Thanks,
>>>>
>>>> Xiao
>>>>
>>>> On Tue, Oct 23, 2018 at 9:20 AM, Xiao Li <gatorsm...@gmail.com> wrote:
>>>>
>>>>> Thanks for reporting this. https://github.com/apache/spark/pull/22514
>>>>> is not a blocker. We can fix it in the next minor release, if we are
>>>>> unable to make it in this release.
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Xiao
>>>>>
>>>>> On Tue, Oct 23, 2018 at 9:14 AM, Sean Owen <sro...@gmail.com> wrote:
>>>>>
>>>>>> (I should add, I only observed this with the Scala 2.12 build. It all
>>>>>> seemed to work with 2.11. Therefore I'm not too worried about it. I
>>>>>> don't think it's a Scala version issue, but perhaps something looking
>>>>>> for a Spark 2.11 tarball and not finding it. See
>>>>>> https://github.com/apache/spark/pull/22805#issuecomment-432304622 for
>>>>>> a change that might address this kind of thing.)
>>>>>>
>>>>>> On Tue, Oct 23, 2018 at 11:05 AM Sean Owen <sro...@gmail.com> wrote:
>>>>>> >
>>>>>> > Yeah, that's maybe the issue here. This is a source release, not a
>>>>>> > git checkout, and it still needs to work in this context.
>>>>>> >
>>>>>> > I just added -Pkubernetes to my build and didn't do anything else.
>>>>>> > I think the ideal is that a "mvn -P... -P... install" works from a
>>>>>> > source release; that's a good expectation and consistent with the docs.
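>>>>>> >
>>>>>> > A sketch of the kind of invocation I mean, run from the root of an
>>>>>> > unpacked source release (the profile list here is illustrative):
>>>>>> >
>>>>>> >   ./build/mvn -Phadoop-2.7 -Pkubernetes install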
>>>>>> >
>>>>>> > Maybe these tests simply don't need to run with the normal suite of
>>>>>> > tests, and can be considered tests run manually by developers running
>>>>>> > these scripts? Basically, KubernetesSuite shouldn't run in a normal
>>>>>> > mvn install?
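>>>>>> >
>>>>>> > One way to sketch that today (assuming Maven 3.2.1+, which can
>>>>>> > exclude modules from the reactor with "!"):
>>>>>> >
>>>>>> >   ./build/mvn -Pkubernetes install \
>>>>>> >     -pl '!resource-managers/kubernetes/integration-tests'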
>>>>>> >
>>>>>> > I don't think this has to block the release even if so, just trying
>>>>>> > to get to the bottom of it.
>>>>>>
