[ https://issues.apache.org/jira/browse/SPARK-21791?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16543291#comment-16543291 ]

Furcy Pin commented on SPARK-21791:
-----------------------------------

Hi, this bug is in fact only half-solved.
Here is a minimal example. The following works fine in Spark 2.3.1:
{code:java}
spark.sql("DROP TABLE IF EXISTS test")
spark.sql("CREATE TABLE test STORED AS ORC as SELECT 'a' as `c.d`")
spark.sql("REFRESH TABLE test")
spark.table("test").count()
{code}
 

But this fails:
{code:java}
spark.sql("DROP TABLE IF EXISTS test")
spark.sql("SELECT 'a' as `c.d`").write.format("orc").saveAsTable("test")
spark.sql("REFRESH TABLE test")
spark.table("test").count()
{code}
with this error message:
{code:java}
mismatched input '.' expecting ':'(line 1, pos 8)

== SQL ==
struct<c.d:string>
{code}
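For what it's worth, a possible workaround until this is fixed (just a sketch, assuming the same `test` table as above): keep the dot out of the physical column name when writing, and rename the column back after reading. `withColumnRenamed` and `saveAsTable` are standard DataFrame APIs; the `c_d` name is only an illustration.
{code:java}
spark.sql("DROP TABLE IF EXISTS test")
// Hypothetical workaround: store the column as `c_d` instead of `c.d` ...
spark.sql("SELECT 'a' as `c_d`").write.format("orc").saveAsTable("test")
// ... then restore the dotted name when reading the table back.
spark.table("test").withColumnRenamed("c_d", "c.d").count()
{code}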
 

Do you want to reopen this ticket, or should I create a new one?

 

> ORC should support column names with dot
> ----------------------------------------
>
>                 Key: SPARK-21791
>                 URL: https://issues.apache.org/jira/browse/SPARK-21791
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0, 2.1.0, 2.2.0
>            Reporter: Dongjoon Hyun
>            Priority: Major
>             Fix For: 2.3.0
>
>
> *PARQUET*
> {code}
> scala> Seq(Some(1), None).toDF("col.dots").write.parquet("/tmp/parquet_dot")
> scala> spark.read.parquet("/tmp/parquet_dot").show
> +--------+
> |col.dots|
> +--------+
> |       1|
> |    null|
> +--------+
> {code}
> *ORC*
> {code}
> scala> Seq(Some(1), None).toDF("col.dots").write.orc("/tmp/orc_dot")
> scala> spark.read.orc("/tmp/orc_dot").show
> org.apache.spark.sql.catalyst.parser.ParseException:
> mismatched input '.' expecting ':'(line 1, pos 10)
> == SQL ==
> struct<col.dots:int>
> ----------^^^
> {code}


