[jira] [Commented] (SPARK-17749) Unresolved columns when nesting SQL join clauses
[ https://issues.apache.org/jira/browse/SPARK-17749?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15542850#comment-15542850 ]

Andreas Damm commented on SPARK-17749:
--------------------------------------

Combining the two ON clauses into one by ANDing the conditions does indeed solve the problem.

> Unresolved columns when nesting SQL join clauses
> ------------------------------------------------
>
>                 Key: SPARK-17749
>                 URL: https://issues.apache.org/jira/browse/SPARK-17749
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Andreas Damm
>
> Given tables
>
>     CREATE TABLE `sf_datedconversionrate2`(`isocode` string)
>     CREATE TABLE `sf_opportunity2`(`currencyisocode` string, `accountid` string)
>     CREATE TABLE `sf_account2`(`id` string)
>
> the following SQL will cause an analysis exception (cannot resolve
> '`sf_opportunity.currencyisocode`' given input columns: [isocode, id])
>
>     SELECT 0
>     FROM `sf_datedconversionrate2` AS `sf_datedconversionrate`
>     LEFT JOIN `sf_account2` AS `sf_account`
>     LEFT JOIN `sf_opportunity2` AS `sf_opportunity`
>     ON `sf_account`.`id` = `sf_opportunity`.`accountid`
>     ON `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode`
>
> even though all columns referred to in the conditions should be in scope.
>
> Re-ordering the join and on clauses will make it work
>
>     SELECT 0
>     FROM `sf_datedconversionrate2` AS `sf_datedconversionrate`
>     LEFT JOIN `sf_opportunity2` AS `sf_opportunity`
>     LEFT JOIN `sf_account2` AS `sf_account`
>     ON `sf_account`.`id` = `sf_opportunity`.`accountid`
>     ON `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode`
>
> but the original should work also.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
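The workaround described in the comment (and the reordered query in the report) amounts to giving each LEFT JOIN its own join condition instead of nesting two ON clauses. A minimal sketch of that flattened form, using Python's sqlite3 as a stand-in engine (an assumption for illustration only; SQLite's analyzer is not Spark's Catalyst, and the sample rows are invented):

```python
import sqlite3

# In-memory database standing in for the Spark tables from the report.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sf_datedconversionrate2 (isocode TEXT)")
cur.execute("CREATE TABLE sf_opportunity2 (currencyisocode TEXT, accountid TEXT)")
cur.execute("CREATE TABLE sf_account2 (id TEXT)")
cur.executemany("INSERT INTO sf_datedconversionrate2 VALUES (?)", [("USD",), ("EUR",)])
cur.execute("INSERT INTO sf_account2 VALUES ('acc1')")
cur.execute("INSERT INTO sf_opportunity2 VALUES ('USD', 'acc1')")

# Same tables and conditions as the failing query, but written left to
# right so each LEFT JOIN is immediately followed by its own ON clause.
rows = cur.execute("""
    SELECT dcr.isocode, acc.id
    FROM sf_datedconversionrate2 AS dcr
    LEFT JOIN sf_opportunity2 AS opp
        ON dcr.isocode = opp.currencyisocode
    LEFT JOIN sf_account2 AS acc
        ON acc.id = opp.accountid
""").fetchall()
print(sorted(rows))  # [('EUR', None), ('USD', 'acc1')]
```

The unmatched `EUR` row survives with NULLs, as expected for a left join chain, which is the behavior the nested-ON original should also produce.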
[jira] [Created] (SPARK-17750) Cannot create view which includes interval arithmetic
Andreas Damm created SPARK-17750:
------------------------------------

             Summary: Cannot create view which includes interval arithmetic
                 Key: SPARK-17750
                 URL: https://issues.apache.org/jira/browse/SPARK-17750
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.0.0
            Reporter: Andreas Damm

Given table

    create table dates (ts timestamp)

the following view creation SQL fails with "Failed to analyze the canonicalized SQL. It is possible there is a bug in Spark."

    create view test_dates as select ts + interval 1 day from dates
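The computation the failing view is meant to express, timestamp plus a one-day interval inside a view, can be sketched with Python's sqlite3 (an illustrative assumption only: SQLite has no INTERVAL literal, so its `datetime(ts, '+1 day')` modifier stands in for Spark's `ts + interval 1 day`, and the sample timestamp is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dates (ts TIMESTAMP)")
cur.execute("INSERT INTO dates VALUES ('2016-10-01 12:00:00')")

# SQLite spells "ts + interval 1 day" as datetime(ts, '+1 day');
# wrapping it in a view mirrors the CREATE VIEW from the report.
cur.execute("""
    CREATE VIEW test_dates AS
    SELECT datetime(ts, '+1 day') AS ts_plus_one_day FROM dates
""")
result = cur.execute("SELECT ts_plus_one_day FROM test_dates").fetchone()[0]
print(result)  # 2016-10-02 12:00:00
```

This shows the view itself is ordinary SQL; the report is that Spark 2.0.0's view canonicalization rejects the interval expression, not that the expression is invalid.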
[jira] [Created] (SPARK-17749) Unresolved columns when nesting SQL join clauses
Andreas Damm created SPARK-17749:
------------------------------------

             Summary: Unresolved columns when nesting SQL join clauses
                 Key: SPARK-17749
                 URL: https://issues.apache.org/jira/browse/SPARK-17749
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.0.0
            Reporter: Andreas Damm

Given tables

    CREATE TABLE `sf_datedconversionrate2`(`isocode` string)
    CREATE TABLE `sf_opportunity2`(`currencyisocode` string, `accountid` string)
    CREATE TABLE `sf_account2`(`id` string)

the following SQL will cause an analysis exception (cannot resolve '`sf_opportunity.currencyisocode`' given input columns: [isocode, id])

    SELECT 0
    FROM `sf_datedconversionrate2` AS `sf_datedconversionrate`
    LEFT JOIN `sf_account2` AS `sf_account`
    LEFT JOIN `sf_opportunity2` AS `sf_opportunity`
    ON `sf_account`.`id` = `sf_opportunity`.`accountid`
    ON `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode`

even though all columns referred to in the conditions should be in scope.

Re-ordering the join and on clauses will make it work

    SELECT 0
    FROM `sf_datedconversionrate2` AS `sf_datedconversionrate`
    LEFT JOIN `sf_opportunity2` AS `sf_opportunity`
    LEFT JOIN `sf_account2` AS `sf_account`
    ON `sf_account`.`id` = `sf_opportunity`.`accountid`
    ON `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode`

but the original should work also.