[ https://issues.apache.org/jira/browse/SPARK-33897?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-33897.
----------------------------------
    Fix Version/s: 3.1.0
       Resolution: Fixed

Issue resolved by pull request 30803
[https://github.com/apache/spark/pull/30803]

> Can't set option 'cross' in join method.
> ----------------------------------------
>
>                 Key: SPARK-33897
>                 URL: https://issues.apache.org/jira/browse/SPARK-33897
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.1
>            Reporter: GokiMori
>            Assignee: GokiMori
>            Priority: Minor
>             Fix For: 3.1.0
>
>
> [The PySpark documentation|https://spark.apache.org/docs/3.0.1/api/python/pyspark.sql.html#pyspark.sql.DataFrame.join] says "Must be one of: inner, cross, outer, full, fullouter, full_outer, left, leftouter, left_outer, right, rightouter, right_outer, semi, leftsemi, left_semi, anti, leftanti and left_anti."
> However, I get the following error when I set the 'cross' option:
>  
> {code:java}
> scala> val df1 = spark.createDataFrame(Seq((1,"a"),(2,"b")))
> df1: org.apache.spark.sql.DataFrame = [_1: int, _2: string]
> scala> val df2 = spark.createDataFrame(Seq((1,"A"),(2,"B"), (3, "C")))
> df2: org.apache.spark.sql.DataFrame = [_1: int, _2: string]
> scala> df1.join(right = df2, usingColumns = Seq("_1"), joinType = "cross").show()
> java.lang.IllegalArgumentException: requirement failed: Unsupported using join type Cross
>  at scala.Predef$.require(Predef.scala:281)
>  at org.apache.spark.sql.catalyst.plans.UsingJoin.<init>(joinTypes.scala:106)
>  at org.apache.spark.sql.Dataset.join(Dataset.scala:1025)
>  ... 53 elided
> {code}
>  
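> Since the stack trace shows the require failing inside UsingJoin, a possible workaround on affected versions is to avoid the usingColumns overload: pass an explicit join expression (which does not go through UsingJoin), or take the cartesian product with crossJoin and filter it afterwards. A minimal sketch against the df1/df2 above; the "l"/"r" aliases are only there to disambiguate the duplicate column names:
> {code:scala}
> import org.apache.spark.sql.functions.col
>
> // The failing require is in UsingJoin, so an explicit join expression
> // with the "cross" join type avoids that code path.
> val joined = df1.as("l").join(df2.as("r"), col("l._1") === col("r._1"), "cross")
>
> // Alternatively, build the cartesian product and filter it afterwards.
> val filtered = df1.as("l").crossJoin(df2.as("r")).where(col("l._1") === col("r._1"))
> {code}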


