GitHub user felixcheung opened a pull request:

    https://github.com/apache/spark/pull/19165

    [SPARKR][BACKPORT-2.1] backporting package and test changes

    ## What changes were proposed in this pull request?
    
    Cherry-pick or manually port the package and test changes to branch-2.1.
    
    ## How was this patch tested?
    
    Jenkins


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/felixcheung/spark rbackportpkg21

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19165.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #19165
    
----
commit b3062ebcf09b37f826449a6f88f3162cb772eea2
Author: Wayne Zhang <actuaryzh...@uber.com>
Date:   2017-05-08T06:16:30Z

    cherrypick 2fdaeb52bbe2ed1a9127ac72917286e505303c85

commit 5e9483871ad34fa8d653b69a77f066bd3183bd47
Author: Felix Cheung <felixcheun...@hotmail.com>
Date:   2017-05-08T06:10:18Z

    cherrypick c24bdaab5a234d18b273544cefc44cc4005bf8fc

commit 15c289477fe8313920f3656734fdd0eba1cfb3dc
Author: Felix Cheung <felixcheun...@hotmail.com>
Date:   2017-09-08T07:48:23Z

    manual port of 0b0be47e7b742d96810c60b19a9aa920242e5224

commit 79c82c773cbd8b1dbb59f8329c621bdbe9e2c287
Author: Felix Cheung <felixcheun...@hotmail.com>
Date:   2017-09-08T07:50:16Z

    port subset of 9f4ff9552470fb97ca38bb56bbf43be49a9a316c

commit a74ba0276705e9ac84f56b57a558b2211cdf947b
Author: hyukjinkwon <gurwls...@gmail.com>
Date:   2017-06-18T18:26:27Z

    [SPARK-21128][R] Remove both "spark-warehouse" and "metastore_db" before listing files in R tests
    
    ## What changes were proposed in this pull request?
    
    This PR proposes to list the files in the test _after_ removing both "spark-warehouse" and "metastore_db", so that the next run of the R tests passes cleanly. Leftover directories from a previous run otherwise make the SPARK_HOME file-list assertions fail.
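    The ordering fix can be sketched in shell terms (a minimal sketch with a made-up temporary directory standing in for SPARK_HOME; the actual change lives in the SparkR test code):

    ```shell
    # Sketch: delete the directories Hive/Derby leave behind BEFORE
    # snapshotting SPARK_HOME, so a second run sees the same file list.
    SPARK_HOME=$(mktemp -d)                 # stand-in for a real checkout
    mkdir -p "$SPARK_HOME/R" "$SPARK_HOME/metastore_db" "$SPARK_HOME/spark-warehouse"
    touch "$SPARK_HOME/README.md"

    # Old order: listing first counts leftovers from the previous run.
    ls "$SPARK_HOME" | wc -l                # 4 entries

    # Fixed order: clean up, then list.
    rm -rf "$SPARK_HOME/metastore_db" "$SPARK_HOME/spark-warehouse"
    ls "$SPARK_HOME" | wc -l                # 2 entries
    ```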
    
    ## How was this patch tested?
    
    Manually running multiple times R tests via `./R/run-tests.sh`.
    
    **Before**
    
    Second run:
    
    ```
    SparkSQL functions: Spark package found in SPARK_HOME: .../spark
    ...............................................................................................................................................................
    ...............................................................................................................................................................
    ...............................................................................................................................................................
    ...............................................................................................................................................................
    ...............................................................................................................................................................
    ....................................................................................................1234.......................

    Failed -------------------------------------------------------------------------
    1. Failure: No extra files are created in SPARK_HOME by starting session and making calls (test_sparkSQL.R#3384)
    length(list1) not equal to length(list2).
    1/1 mismatches
    [1] 25 - 23 == 2

    2. Failure: No extra files are created in SPARK_HOME by starting session and making calls (test_sparkSQL.R#3384)
    sort(list1, na.last = TRUE) not equal to sort(list2, na.last = TRUE).
    10/25 mismatches
    x[16]: "metastore_db"
    y[16]: "pkg"
    
    x[17]: "pkg"
    y[17]: "R"
    
    x[18]: "R"
    y[18]: "README.md"
    
    x[19]: "README.md"
    y[19]: "run-tests.sh"
    
    x[20]: "run-tests.sh"
    y[20]: "SparkR_2.2.0.tar.gz"
    
    x[21]: "metastore_db"
    y[21]: "pkg"
    
    x[22]: "pkg"
    y[22]: "R"
    
    x[23]: "R"
    y[23]: "README.md"
    
    x[24]: "README.md"
    y[24]: "run-tests.sh"
    
    x[25]: "run-tests.sh"
    y[25]: "SparkR_2.2.0.tar.gz"
    
    3. Failure: No extra files are created in SPARK_HOME by starting session and making calls (test_sparkSQL.R#3388)
    length(list1) not equal to length(list2).
    1/1 mismatches
    [1] 25 - 23 == 2
    
    4. Failure: No extra files are created in SPARK_HOME by starting session and making calls (test_sparkSQL.R#3388)
    sort(list1, na.last = TRUE) not equal to sort(list2, na.last = TRUE).
    10/25 mismatches
    x[16]: "metastore_db"
    y[16]: "pkg"
    
    x[17]: "pkg"
    y[17]: "R"
    
    x[18]: "R"
    y[18]: "README.md"
    
    x[19]: "README.md"
    y[19]: "run-tests.sh"
    
    x[20]: "run-tests.sh"
    y[20]: "SparkR_2.2.0.tar.gz"
    
    x[21]: "metastore_db"
    y[21]: "pkg"
    
    x[22]: "pkg"
    y[22]: "R"
    
    x[23]: "R"
    y[23]: "README.md"
    
    x[24]: "README.md"
    y[24]: "run-tests.sh"
    
    x[25]: "run-tests.sh"
    y[25]: "SparkR_2.2.0.tar.gz"
    
    DONE ===========================================================================
    ```
    
    **After**
    
    Second run:
    
    ```
    SparkSQL functions: Spark package found in SPARK_HOME: .../spark
    ...............................................................................................................................................................
    ...............................................................................................................................................................
    ...............................................................................................................................................................
    ...............................................................................................................................................................
    ...............................................................................................................................................................
    ...............................................................................................................................
    ```
    
    Author: hyukjinkwon <gurwls...@gmail.com>
    
    Closes #18335 from HyukjinKwon/SPARK-21128.

commit 129a9798a98ddd86601c876fb83eeb5f5995d814
Author: Felix Cheung <felixcheun...@hotmail.com>
Date:   2017-08-24T04:35:17Z

    [SPARK-21805][SPARKR] Disable R vignettes code on Windows
    
    ## What changes were proposed in this pull request?
    
    Code in the vignettes requires winutils to run on Windows. When publishing to CRAN or building from source, winutils might not be available, so it is better to disable code execution: the resulting vignettes will not contain output from the code, but both the text and the code remain.
    
    This fixes the warning `* checking re-building of vignette outputs ... WARNING` and the error
    > %LOCALAPPDATA% not found. Please define the environment variable or restart and enter an installation path in localDir.
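    In shell terms, the guard being added is roughly the following (a sketch only, under the assumption that winutils.exe is looked for under HADOOP_HOME; the real change is in the vignette setup code):

    ```shell
    # Sketch of the condition: on Windows, vignette code needs winutils;
    # when it is missing (e.g. on CRAN build machines), skip running code.
    if [ "$(uname -s | cut -c1-5)" = "MINGW" ] && [ ! -x "${HADOOP_HOME:-/nonexistent}/bin/winutils.exe" ]; then
      echo "winutils not available: keep vignette text and code, skip execution"
    else
      echo "run vignette code as usual"
    fi
    ```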
    
    ## How was this patch tested?
    
    Jenkins, AppVeyor, r-hub
    
    before: https://artifacts.r-hub.io/SparkR_2.2.0.tar.gz-49cecef3bb09db1db130db31604e0293/SparkR.Rcheck/00check.log
    after: https://artifacts.r-hub.io/SparkR_2.2.0.tar.gz-86a066c7576f46794930ad114e5cff7c/SparkR.Rcheck/00check.log
    
    Author: Felix Cheung <felixcheun...@hotmail.com>
    
    Closes #19016 from felixcheung/rvigwind.

commit b53f812d729bb56929876bde9872cf1f2451e63a
Author: Felix Cheung <felixcheun...@hotmail.com>
Date:   2017-09-08T08:01:02Z

    not cran

----

