GitHub user markgrover commented on a diff in the pull request:

    https://github.com/apache/spark/pull/11278#discussion_r60290250
  
    --- Diff: docs/building-spark.md ---
    @@ -123,6 +123,21 @@ To produce a Spark package compiled with Scala 2.10, use the `-Dscala-2.10` prop
         ./dev/change-scala-version.sh 2.10
         mvn -Pyarn -Phadoop-2.4 -Dscala-2.10 -DskipTests clean package
     
    +# PySpark Tests with Maven
    +
    +If you are building PySpark and wish to run the PySpark tests you will need to build an assembly JAR and also ensure you have built with hive support.
    --- End diff ---
    
    Yeah, I agree with @JoshRosen, since assembly jars are now gone. I think the Hive support part still applies, but we should take out the part about building an assembly jar.
    
    The ```hadoop-2.4``` profile sets ```hadoop.version```, so I don't think we need to specify the latter explicitly. I am also not sure, off the top of my head, whether the ```hive-thriftserver``` profile is needed for running the Python tests, so it may make sense to check whether specifying just the ```hive``` profile would suffice.

