RussellSpitzer commented on a change in pull request #2543:
URL: https://github.com/apache/iceberg/pull/2543#discussion_r623501203
##########
File path: spark3/src/main/java/org/apache/iceberg/spark/SparkFilters.java
##########
@@ -198,6 +200,10 @@ private static Object convertLiteral(Object value) {
return DateTimeUtils.fromJavaTimestamp((Timestamp) value);
} else if (value instanceof Date) {
return DateTimeUtils.fromJavaDate((Date) value);
+ } else if (value instanceof Instant) {
Review comment:
https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/literals.scala#L81
- Spark's Instant literal is always converted to a timestamp
https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/literals.scala#L83
- Spark's LocalDate literal is always converted to a date
If there is a type mismatch, Spark will do the conversion and we won't
end up with an Instant or LocalDate; we'll get whatever Spark considers
the correct corresponding type for the table column.
> LGTM, though I'm not a Spark expert; some minor questions. Do we know
if this only impacts Spark 3 but not Spark 2?
Yep, the patch adding support for these literals only exists in Spark 3:
https://issues.apache.org/jira/browse/SPARK-26902
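For context, the new branches could look roughly like the sketch below. It uses plain `java.time` arithmetic to mirror what Spark's `DateTimeUtils.instantToMicros` / `localDateToDays` compute (micros and days since the epoch); the class and method names here are illustrative, not the actual patch:

```java
import java.time.Instant;
import java.time.LocalDate;

public class LiteralConversionSketch {

  // Mirrors Spark's DateTimeUtils.instantToMicros: microseconds since 1970-01-01T00:00:00Z.
  static long instantToMicros(Instant instant) {
    return Math.addExact(
        Math.multiplyExact(instant.getEpochSecond(), 1_000_000L),
        instant.getNano() / 1_000L);
  }

  // Mirrors Spark's DateTimeUtils.localDateToDays: days since 1970-01-01.
  static int localDateToDays(LocalDate date) {
    return Math.toIntExact(date.toEpochDay());
  }

  // Hypothetical shape of the extra instanceof branches in convertLiteral.
  static Object convertLiteral(Object value) {
    if (value instanceof Instant) {
      return instantToMicros((Instant) value);
    } else if (value instanceof LocalDate) {
      return localDateToDays((LocalDate) value);
    }
    return value; // other literal types handled by the existing branches
  }

  public static void main(String[] args) {
    // 1 second + 500_000 ns -> 1_000_500 micros
    System.out.println(convertLiteral(Instant.ofEpochSecond(1, 500_000)));
    // 1970-01-02 -> day 1
    System.out.println(convertLiteral(LocalDate.of(1970, 1, 2)));
  }
}
```

This matches the point above: by the time a filter reaches `convertLiteral`, Spark has already normalized the literal, so these branches only fire when the pushed-down value really is an `Instant` or `LocalDate`.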
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]