Takeshi Yamamuro created SPARK-26868:
----------------------------------------

             Summary: Duplicate error message for implicit cartesian product in 
verbose explain
                 Key: SPARK-26868
                 URL: https://issues.apache.org/jira/browse/SPARK-26868
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 2.4.0
            Reporter: Takeshi Yamamuro


This is super trivial, but I am reporting it just in case. In verbose explain output, the same AnalysisException message is printed twice, once under the Optimized Logical Plan and once under the Physical Plan; it would be nice if we could print this error message in a cleaner way:
{code:java}
scala> Seq(1).toDF("id").write.saveAsTable("t1")
scala> Seq(1).toDF("id").write.saveAsTable("t2")
scala> sql("SELECT * FROM t1 JOIN t2").explain(true)
== Parsed Logical Plan ==
'Project [*]
+- 'Join Inner
   :- 'UnresolvedRelation `t1`
   +- 'UnresolvedRelation `t2`

== Analyzed Logical Plan ==
id: int, id: int
Project [id#14, id#15]
+- Join Inner
   :- SubqueryAlias `default`.`t1`
   :  +- Relation[id#14] parquet
   +- SubqueryAlias `default`.`t2`
      +- Relation[id#15] parquet

== Optimized Logical Plan ==
org.apache.spark.sql.AnalysisException: Detected implicit cartesian product for 
INNER join between logical plans
Relation[id#14] parquet
and
Relation[id#15] parquet
Join condition is missing or trivial.
Either: use the CROSS JOIN syntax to allow cartesian products between these
relations, or: enable implicit cartesian products by setting the configuration
variable spark.sql.crossJoin.enabled=true;
== Physical Plan ==
org.apache.spark.sql.AnalysisException: Detected implicit cartesian product for 
INNER join between logical plans
Relation[id#14] parquet
and
Relation[id#15] parquet
Join condition is missing or trivial.
Either: use the CROSS JOIN syntax to allow cartesian products between these
relations, or: enable implicit cartesian products by setting the configuration
variable spark.sql.crossJoin.enabled=true;
{code}
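
For reference, either workaround named in the error message avoids the exception, so the optimized and physical plans print normally. A quick sketch in the spark-shell, reusing the tables from the example above (this assumes the same session and is only illustrative):
{code:java}
// Option 1: make the cartesian product explicit with CROSS JOIN syntax
sql("SELECT * FROM t1 CROSS JOIN t2").explain(true)

// Option 2: allow implicit cartesian products via the configuration flag
sql("SET spark.sql.crossJoin.enabled=true")
sql("SELECT * FROM t1 JOIN t2").explain(true)
{code}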



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
