[jira] [Created] (SPARK-24151) CURRENT_DATE, CURRENT_TIMESTAMP incorrectly resolved as column names when caseSensitive is enabled

2018-05-02 Thread James Thompson (JIRA)
James Thompson created SPARK-24151:
--

 Summary: CURRENT_DATE, CURRENT_TIMESTAMP incorrectly resolved as 
column names when caseSensitive is enabled
 Key: SPARK-24151
 URL: https://issues.apache.org/jira/browse/SPARK-24151
 Project: Spark
  Issue Type: Bug
  Components: SQL
Affects Versions: 2.3.0
Reporter: James Thompson


After this change: https://issues.apache.org/jira/browse/SPARK-22333

Running SQL such as "SELECT CURRENT_TIMESTAMP" can fail when spark.sql.caseSensitive has 
been enabled:
{code:java}
org.apache.spark.sql.AnalysisException: cannot resolve '`CURRENT_TIMESTAMP`' 
given input columns: [col1]{code}
This is because the analyzer incorrectly uses a case-sensitive resolver to 
resolve the function name. I will submit a PR with a fix and a test for this.

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-23388) Support for Parquet Binary DecimalType in VectorizedColumnReader

2018-02-11 Thread James Thompson (JIRA)
James Thompson created SPARK-23388:
--

 Summary: Support for Parquet Binary DecimalType in 
VectorizedColumnReader
 Key: SPARK-23388
 URL: https://issues.apache.org/jira/browse/SPARK-23388
 Project: Spark
  Issue Type: Bug
  Components: SQL
Affects Versions: 2.3.0
Reporter: James Thompson


The following commit to Spark removed support for binary decimal types: 
[https://github.com/apache/spark/commit/9c29c557635caf739fde942f53255273aac0d7b1#diff-7bdf5fd0ce0b1ccbf4ecf083611976e6R428]

As per the parquet spec, decimal can be used to annotate binary types, so 
support should be re-added: 
[https://github.com/apache/parquet-format/blob/master/LogicalTypes.md#decimal]
