There is an implicit conversion in scope that wraps the LogicalPlan built in the method body into a DataFrame:

https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/DataFrame.scala#L153


  /**
   * An implicit conversion function internal to this class for us to avoid doing
   * "new DataFrame(...)" everywhere.
   */
  @inline private implicit def logicalPlanToDataFrame(logicalPlan: LogicalPlan): DataFrame = {
    new DataFrame(sqlContext, logicalPlan)
  }
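
So the Filter node that filter's body constructs is a LogicalPlan, and the compiler applies this conversion to wrap it in a new DataFrame. Here is a minimal, self-contained sketch of the same pattern (not Spark's real classes; the LogicalPlan, Filter, and DataFrame definitions below are simplified stand-ins) showing how a private implicit conversion lets a method declared to return DataFrame be written with a LogicalPlan expression as its body:

  import scala.language.implicitConversions

  object ImplicitConversionSketch {

    // Simplified stand-ins for Catalyst's plan nodes.
    trait LogicalPlan
    case class Filter(conditionExpr: String, child: LogicalPlan) extends LogicalPlan

    class DataFrame(val logicalPlan: LogicalPlan) {

      // Same trick as in DataFrame.scala: any LogicalPlan produced inside
      // this class is silently wrapped into a new DataFrame.
      @inline private implicit def logicalPlanToDataFrame(plan: LogicalPlan): DataFrame =
        new DataFrame(plan)

      // Declared to return DataFrame, but the body builds a Filter (a
      // LogicalPlan); the implicit conversion above bridges the gap.
      def filter(conditionExpr: String): DataFrame = Filter(conditionExpr, logicalPlan)
    }

    def main(args: Array[String]): Unit = {
      val df = new DataFrame(new LogicalPlan {})
      val filtered = df.filter("age > 21")
      println(filtered.logicalPlan) // Filter(age > 21, ...)
    }
  }

As the comment in DataFrame.scala says, the conversion is purely a convenience so the class doesn't have to write "new DataFrame(sqlContext, ...)" around every plan it returns.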


On Tue, Sep 22, 2015 at 10:57 PM, qiuhai <986775...@qq.com> wrote:

> Hi,
>   Recently,I am reading source code(1.5 version) about sparksql .
>   In DataFrame.scala, there is a funtion named filter in the 737 row
>
>         *def filter(condition: Column): DataFrame = Filter(condition.expr,
> logicalPlan)*
>
>   The fucntion return  a Filter object,but it require a DataFrame object.
>
>
>   thanks.
>
>
>
> --
> View this message in context:
> http://apache-spark-developers-list.1001551.n3.nabble.com/Why-Filter-return-a-DataFrame-object-in-DataFrame-scala-tp14295.html
> Sent from the Apache Spark Developers List mailing list archive at
> Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org
>
>
