Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/22997
  
    > It is not possible to build a distribution that doesn't contain hadoop dependencies but include SparkR
    
    I wouldn't say that. It seems like it is possible; it just can't be published to CRAN, because it fails tests without further configuration.
    
    Which is kinda the point of hadoop-provided (it requires the user to configure Spark to point at existing Hadoop libraries).
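
    For context, that setup looks roughly like the sketch below (illustrative; the exact build flags and paths depend on the branch and environment):

    ```sh
    # Build a distribution without bundled Hadoop jars but with the SparkR package
    ./dev/make-distribution.sh --name hadoop-provided --r --tgz -Phadoop-provided

    # At runtime, point Spark at an existing Hadoop installation,
    # e.g. in conf/spark-env.sh:
    export SPARK_DIST_CLASSPATH=$(hadoop classpath)
    ```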

