[ https://issues.apache.org/jira/browse/SPARK-17749?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15542850#comment-15542850 ]

Andreas Damm commented on SPARK-17749:
--------------------------------------

Combining the two ON clauses into one by ANDing the conditions does indeed solve the problem.
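
The exact rewritten query isn't quoted here, but presumably it looks roughly like the
following, with both predicates ANDed into a single ON clause (a sketch only, reusing
the aliases from the report below):

SELECT    0
FROM      `sf_datedconversionrate2` AS `sf_datedconversionrate`
LEFT JOIN `sf_account2`             AS `sf_account`
LEFT JOIN `sf_opportunity2`         AS `sf_opportunity`
ON        `sf_account`.`id` = `sf_opportunity`.`accountid`
      AND `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode`

With a single ON clause all three relations should be in scope when the condition is
resolved, which would explain why this form avoids the resolution failure. As in the
original query, there is no direct join condition between sf_datedconversionrate and
sf_account.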

> Unresolved columns when nesting SQL join clauses
> ------------------------------------------------
>
>                 Key: SPARK-17749
>                 URL: https://issues.apache.org/jira/browse/SPARK-17749
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Andreas Damm
>
> Given the tables
> CREATE TABLE `sf_datedconversionrate2`(`isocode` string)
> CREATE TABLE `sf_opportunity2`(`currencyisocode` string, `accountid` string)
> CREATE TABLE `sf_account2`(`id` string)
> the following SQL will cause an analysis exception (cannot resolve 
> '`sf_opportunity.currencyisocode`' given input columns: [isocode, id])
> SELECT    0 
> FROM      `sf_datedconversionrate2` AS `sf_datedconversionrate` 
> LEFT JOIN `sf_account2`             AS `sf_account` 
> LEFT JOIN `sf_opportunity2`         AS `sf_opportunity` 
> ON        `sf_account`.`id` = `sf_opportunity`.`accountid` 
> ON        `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode`
> even though all columns referred to in the conditions should be in scope.
> Re-ordering the JOIN and ON clauses makes it work:
> SELECT    0 
> FROM      `sf_datedconversionrate2` AS `sf_datedconversionrate` 
> LEFT JOIN `sf_opportunity2`         AS `sf_opportunity` 
> LEFT JOIN `sf_account2`             AS `sf_account` 
> ON        `sf_account`.`id` = `sf_opportunity`.`accountid` 
> ON        `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode`
> but the original query should work as well.


