[ 
https://issues.apache.org/jira/browse/SPARK-23837?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shahid K I updated SPARK-23837:
-------------------------------
    Description: 
When a Spark-generated alias name contains a comma, the Hive metastore throws an exception.

0: jdbc:hive2://ha-cluster/default> create table a (col1 decimal(18,3), col2 
decimal(18,5));
+---------+--+
| Result  |
+---------+--+
+---------+--+
No rows selected (0.171 seconds)
0: jdbc:hive2://ha-cluster/default> select col1*col2 from a;
+--------------------------------------------------------------+--+
| (CAST(col1 AS DECIMAL(20,5)) * CAST(col2 AS DECIMAL(20,5)))  |
+--------------------------------------------------------------+--+
+--------------------------------------------------------------+--+
No rows selected (0.168 seconds)
0: jdbc:hive2://ha-cluster/default> create table b as select col1*col2 from a;

Error: org.apache.spark.sql.AnalysisException: Cannot create a table having a 
column whose name contains commas in Hive metastore. Table: `default`.`b`; 
Column: (CAST(col1 AS DECIMAL(20,5)) * CAST(col2 AS DECIMAL(20,5))); 
(state=,code=0)
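A possible workaround until this is fixed, assuming the goal is just to materialize the result, is to give the expression an explicit alias so Spark never has to generate a column name (the alias `prod` below is illustrative, not from the report):

```sql
-- Hypothetical workaround: alias the expression explicitly so the
-- column stored in the Hive metastore contains no commas.
create table b as select col1 * col2 as prod from a;
```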



  was:
When a Spark-generated alias name contains a comma, the Hive metastore throws an exception.

0: jdbc:hive2://ha-cluster/default> create table a (col1 decimal(18,3), col2 
decimal(18,5));
+---------+--+
| Result  |
+---------+--+
+---------+--+
No rows selected (0.171 seconds)
0: jdbc:hive2://ha-cluster/default> select col1*col2 from a;
+--------------------------------------------------------------+--+
| (CAST(col1 AS DECIMAL(20,5)) * CAST(col2 AS DECIMAL(20,5)))  |
+--------------------------------------------------------------+--+
+--------------------------------------------------------------+--+
No rows selected (0.168 seconds)
0: jdbc:hive2://ha-cluster/default> create table b as select col1*col2 from a;
*Error: org.apache.spark.sql.AnalysisException: Cannot create a table having a 
column whose name contains commas in Hive metastore. Table: `default`.`b`; 
Column: (CAST(col1 AS DECIMAL(20,5)) * CAST(col2 AS DECIMAL(20,5))); 
(state=,code=0)*

!image-2018-03-31-19-57-38-496.png!


> Create table as select gives exception if the spark generated alias name 
> contains comma
> ---------------------------------------------------------------------------------------
>
>                 Key: SPARK-23837
>                 URL: https://issues.apache.org/jira/browse/SPARK-23837
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.1, 2.3.0
>            Reporter: Shahid K I
>            Priority: Major
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
