zhengruifeng commented on code in PR #38686:
URL: https://github.com/apache/spark/pull/38686#discussion_r1027507821


##########
connector/connect/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##########
@@ -523,6 +524,19 @@ class SparkConnectPlanner(session: SparkSession) {
         sameOrderExpressions = Seq.empty)
   }

+  private def transformDrop(rel: proto.Drop): LogicalPlan = {
+    assert(rel.getColsCount > 0, "cols must contain at least 1 item!")
+
+    val cols = rel.getColsList.asScala.toArray.map { expr =>
+      Column(transformExpression(expr))

Review Comment:
   Do you mean verifying for the arrow-based collect? Since we will remove the JSON code path, it will always fail if there are unsupported types.

-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
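For context, a minimal, self-contained sketch of the control flow in the `transformDrop` method quoted in the diff above. The real method lives in `SparkConnectPlanner` and uses Spark's `proto.Drop`, `Column`, and `LogicalPlan` types; the stub types here are assumptions introduced so the two steps visible in the diff (reject an empty column list, then map each proto expression to a column) can run standalone.

```scala
// Hypothetical stand-ins for the Connect proto and Catalyst types.
// These are NOT Spark's actual classes; they only mirror the shape
// the diff operates on.
object DropSketch {
  final case class ProtoExpr(name: String)
  final case class ProtoDrop(cols: Seq[ProtoExpr], input: String)
  final case class Column(expr: String)
  final case class LogicalPlan(description: String)

  // Stand-in for SparkConnectPlanner.transformExpression.
  private def transformExpression(expr: ProtoExpr): String = expr.name

  def transformDrop(rel: ProtoDrop): LogicalPlan = {
    // Mirrors the assert in the diff: dropping zero columns is rejected early.
    require(rel.cols.nonEmpty, "cols must contain at least 1 item!")
    // Mirrors the map over getColsList: each proto expression becomes a Column.
    val cols = rel.cols.map(expr => Column(transformExpression(expr)))
    LogicalPlan(s"Drop(${cols.map(_.expr).mkString(", ")}) <- ${rel.input}")
  }
}
```

The early `require` matches the planner's convention of failing fast on malformed relations rather than letting an empty drop silently produce a no-op plan.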
########## connector/connect/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala: ########## @@ -523,6 +524,19 @@ class SparkConnectPlanner(session: SparkSession) { sameOrderExpressions = Seq.empty) } + private def transformDrop(rel: proto.Drop): LogicalPlan = { + assert(rel.getColsCount > 0, s"cols must contains at least 1 item!") + + val cols = rel.getColsList.asScala.toArray.map { expr => + Column(transformExpression(expr)) Review Comment: Do you mean verify for the arrow-based collect? since we will remove the json code path, it always fails if there are unsupported types. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org --------------------------------------------------------------------- To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org