[ 
https://issues.apache.org/jira/browse/SPARK-7730?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen updated SPARK-7730:
-----------------------------
    Component/s:     (was: Spark Shell)
                 SQL

> Complex Teradata queries throwing Analysis Exception when running on spark
> --------------------------------------------------------------------------
>
>                 Key: SPARK-7730
>                 URL: https://issues.apache.org/jira/browse/SPARK-7730
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.3.1
>         Environment: development
>            Reporter: pankhuri
>
> Connected Spark with Teradata. Running the Teradata query below in 
> spark-shell throws an AnalysisException:
> select substr(w_warehouse_name, 1, 20) as xx, sm_type, cc_name
>   ,sum(case when (cs_ship_date_sk - cs_sold_date_sk <= 30) then 1 else 0 end) as days
>   ,sum(case when (cs_ship_date_sk - cs_sold_date_sk > 30) and
>                  (cs_ship_date_sk - cs_sold_date_sk <= 60) then 1 else 0 end) as sdays
>   ,sum(case when (cs_ship_date_sk - cs_sold_date_sk > 60) and
>                  (cs_ship_date_sk - cs_sold_date_sk <= 90) then 1 else 0 end) as rdays
>   ,sum(case when (cs_ship_date_sk - cs_sold_date_sk > 90) and
>                  (cs_ship_date_sk - cs_sold_date_sk <= 120) then 1 else 0 end) as ndays
>   ,sum(case when (cs_ship_date_sk - cs_sold_date_sk > 120) then 1 else 0 end) as dfdays
> from test
> where d_month_seq between 1193 and 1193 + 11
>   and cs_ship_date_sk   = d_date_sk
>   and cs_warehouse_sk   = w_warehouse_sk
>   and cs_ship_mode_sk   = sm_ship_mode_sk
>   and cs_call_center_sk = cc_call_center_sk
> group by xx, sm_type, cc_name
> order by xx, sm_type, cc_name
>
> org.apache.spark.sql.AnalysisException: cannot resolve 'xx' given input 
> columns cc_name, sdays, days, sm_type, rdays, xx, ndays, dfdays;
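>
> A minimal sketch of a possible workaround, assuming the analyzer in Spark 
> 1.3.x fails to resolve the select alias xx when it is reused in the group 
> by / order by clauses: repeat the full expression instead of the alias. 
> The snippet below is illustrative only and is shortened to a single 
> aggregate for brevity; paste it into spark-shell.
>
> // Hypothetical workaround sketch for spark-shell on Spark 1.3.x.
> // Grouping/ordering by the expression itself avoids resolving the alias xx.
> sqlContext.sql("""
>   select substr(w_warehouse_name, 1, 20) as xx, sm_type, cc_name,
>          sum(case when (cs_ship_date_sk - cs_sold_date_sk <= 30)
>              then 1 else 0 end) as days
>   from test
>   where d_month_seq between 1193 and 1193 + 11
>     and cs_ship_date_sk   = d_date_sk
>     and cs_warehouse_sk   = w_warehouse_sk
>     and cs_ship_mode_sk   = sm_ship_mode_sk
>     and cs_call_center_sk = cc_call_center_sk
>   group by substr(w_warehouse_name, 1, 20), sm_type, cc_name
>   order by substr(w_warehouse_name, 1, 20), sm_type, cc_name
> """).show()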



