SparkQA commented on issue #26358: [SPARK-29712][SQL] Take into account the left bound in `fromDayTimeString()`

URL: https://github.com/apache/spark/pull/26358#issuecomment-552030249

**[Test build #113472 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/113472/testReport)** for PR 26358 at commit [`a78dcc3`](https://github.com/apache/spark/commit/a78dcc36248ebcba7b9338f30fbd5065fbf0e713).

* This patch passes all tests.
* This patch merges cleanly.
* This patch adds the following public classes _(experimental)_:
  * `public class DateTimeConstants`
  * `public final class CalendarInterval implements Serializable, Comparable<CalendarInterval>`
  * `.doc("Comma-separated list of class names implementing " +`
  * `sealed abstract class PluginContainer`
  * `class PluginMetricsSource(`
  * `case class PluginMessage(pluginName: String, message: AnyRef)`
  * `abstract class IntervalNumOperation(`
  * `case class MultiplyInterval(interval: Expression, num: Expression)`
  * `case class DivideInterval(interval: Expression, num: Expression)`
  * `case class AlterTableAddPartitionStatement(`
  * `case class AlterTableSerDePropertiesStatement(`
  * `case class ShowCurrentNamespaceStatement() extends ParsedStatement`
  * `case class ShowCurrentNamespace(catalogManager: CatalogManager) extends Command`
  * `case class LocalShuffleReaderExec(child: SparkPlan) extends UnaryExecNode`
  * `case class ShowCurrentNamespaceExec(`
  * `class ContinuousRecordEndpoint(buckets: Seq[Seq[UnsafeRow]], lock: Object)`
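One notable entry in the list above is that `CalendarInterval` now implements `Comparable<CalendarInterval>`. As a minimal, self-contained sketch of what a comparable interval type looks like in Java: the class name `IntervalSketch`, its fields, and the month-then-microseconds ordering below are illustrative assumptions, not Spark's actual `CalendarInterval` implementation or comparison semantics.

```java
// Hypothetical sketch of a Comparable interval type.
// NOT Spark's actual CalendarInterval: field layout and ordering
// semantics here are assumptions for illustration only.
public final class IntervalSketch implements Comparable<IntervalSketch> {
    final int months;
    final long microseconds;

    IntervalSketch(int months, long microseconds) {
        this.months = months;
        this.microseconds = microseconds;
    }

    @Override
    public int compareTo(IntervalSketch other) {
        // Assumed ordering: compare the months field first,
        // then fall back to the microseconds field.
        if (months != other.months) {
            return Integer.compare(months, other.months);
        }
        return Long.compare(microseconds, other.microseconds);
    }

    public static void main(String[] args) {
        IntervalSketch oneMonth = new IntervalSketch(1, 0);
        IntervalSketch twoMonths = new IntervalSketch(2, 0);
        // Under the assumed ordering, one month sorts before two months.
        System.out.println(oneMonth.compareTo(twoMonths) < 0);
    }
}
```

Implementing `Comparable` this way lets interval values participate in sorting and ordered comparisons (e.g. `Collections.sort`) without a separate `Comparator`.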
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at: us...@infra.apache.org

With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org