Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21370#discussion_r194287915
  
    --- Diff: python/pyspark/sql/dataframe.py ---
    @@ -351,8 +354,68 @@ def show(self, n=20, truncate=True, vertical=False):
             else:
                 print(self._jdf.showString(n, int(truncate), vertical))
     
    +    @property
    +    def _eager_eval(self):
    +        """Returns true if the eager evaluation enabled.
    +        """
    +        return self.sql_ctx.getConf(
    +            "spark.sql.repl.eagerEval.enabled", "false").lower() == "true"
    --- End diff ---
    
    In the ongoing release, a nice-to-have refactoring is to move all the Core confs into a single file, just like what we did for Spark SQL's SQLConf: default values, boundary checking, types, and descriptions. Thus, in PySpark, it would be better to start doing the same now, as sketched below.
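    To illustrate the idea, here is a minimal sketch of what such a centralized conf registry could look like on the PySpark side, loosely mirroring how SQLConf declares entries on the Scala side. The names (`ConfEntry`, `EAGER_EVAL_ENABLED`, `read_bool_conf`) are hypothetical, not existing PySpark APIs:

        # Hypothetical centralized conf registry sketch; not an existing PySpark module.
        from collections import namedtuple

        # Each entry carries the key, its default, the expected type, and a description,
        # so every conf is declared in one place instead of as inline string literals.
        ConfEntry = namedtuple("ConfEntry", ["key", "default", "value_type", "doc"])

        EAGER_EVAL_ENABLED = ConfEntry(
            key="spark.sql.repl.eagerEval.enabled",
            default="false",
            value_type=bool,
            doc="Enables eager evaluation of DataFrames in the REPL.")


        def read_bool_conf(sql_ctx, entry):
            """Read a boolean conf entry using its declared default, with boundary checking."""
            value = sql_ctx.getConf(entry.key, entry.default).lower()
            if value not in ("true", "false"):
                raise ValueError("%s must be true or false, got: %s" % (entry.key, value))
            return value == "true"

    With something like this in place, the `_eager_eval` property above would reduce to `return read_bool_conf(self.sql_ctx, EAGER_EVAL_ENABLED)`, and the default, type, and description would live in a single file.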


---
