[ 
https://issues.apache.org/jira/browse/SPARK-15439?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15297714#comment-15297714
 ] 

Miao Wang commented on SPARK-15439:
-----------------------------------

I am still investigating the errors.
Without using sudo, I now see new errors like the ones below:
Failed -------------------------------------------------------------------------
1. Failure: Check masked functions (@test_context.R#30) ------------------------
length(maskedBySparkR) not equal to length(namesOfMasked).
1/1 mismatches
[1] 22 - 20 == 2


2. Failure: Check masked functions (@test_context.R#31) ------------------------
sort(maskedBySparkR) not equal to sort(namesOfMasked).
Lengths differ: 22 vs 20


3. Failure: Check masked functions (@test_context.R#40) ------------------------
length(maskedCompletely) not equal to length(namesOfMaskedCompletely).
1/1 mismatches
[1] 5 - 3 == 2


4. Failure: Check masked functions (@test_context.R#41) ------------------------
sort(maskedCompletely) not equal to sort(namesOfMaskedCompletely).
Lengths differ: 5 vs 3

I am learning how R works.
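
The four failures are all the same underlying mismatch: SparkR masks more base/stats functions than the hard-coded expected lists in test_context.R. A minimal sketch of how to see what SparkR actually masks, using only base R's `conflicts()` (variable names here are illustrative, not the test's own):

```r
# Sketch: list the functions that an attached package masks, which is
# what the "Check masked functions" test compares against a fixed list.
library(SparkR)

# conflicts() returns objects defined in more than one attached package;
# detail = TRUE groups them by search-path entry.
masked <- conflicts(detail = TRUE)[["package:SparkR"]]
print(sort(masked))
print(length(masked))
```

Since the failures report 22 masked functions (5 masked completely) against expected counts of 20 and 3, the expected-name lists in test_context.R likely need updating to match the current SparkR namespace.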

> Failed to run unit test in SparkR
> ---------------------------------
>
>                 Key: SPARK-15439
>                 URL: https://issues.apache.org/jira/browse/SPARK-15439
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 2.0.0
>            Reporter: Kai Jiang
>
> Failed to run ./R/run-tests.sh around a recent commit (May 19, 2016).
> It might be permission related: running `sudo ./R/run-tests.sh` worked 
> sometimes, so without elevated permissions we may not be able to access 
> the /tmp directory. However, the SparkR unit tests are still brittle.
> [error 
> message|https://gist.github.com/vectorijk/71f4ff34e3d34a628b8a3013f0ca2aa2]



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
