KyleLi1985 commented on a change in pull request #23271: [SPARK-26318][SQL] Enhance function merge performance in Row
URL: https://github.com/apache/spark/pull/23271#discussion_r240491672
########## File path: sql/catalyst/src/main/scala/org/apache/spark/sql/Row.scala ##########

@@ -58,8 +58,21 @@ object Row {
    * Merge multiple rows into a single row, one after another.
    */
   def merge(rows: Row*): Row = {
-    // TODO: Improve the performance of this if used in performance critical part.
-    new GenericRow(rows.flatMap(_.toSeq).toArray)
+    val size = rows.size
+    var number = 0
+    for (i <- 0 until size) {
+      number = number + rows(i).size
+    }
+    val container = Array.ofDim[Any](number)
+    var n = 0
+    for (i <- 0 until size) {

Review comment:
Using only primitive counters (the row count, each row's size, and a running offset) to fill the container directly improves performance further. In a micro-benchmark:

- 100,000,000 calls of Row.merge(row1) took 18,064 ms
- 100,000,000 calls of Row.merge(rows: _*) took 25,651 ms
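The diff above is truncated, so the sketch below fills in the rest of the two-pass approach it suggests: first sum the field counts, then copy each row's fields into a preallocated array, avoiding the intermediate collections built by `flatMap(_.toSeq).toArray`. The `Row` trait and `GenericRow` class here are minimal stand-ins for the Spark Catalyst types, assumed only to expose `size` and `get(i)`; this is a hedged reconstruction, not the exact code merged in the PR.

```scala
// Minimal stand-ins for org.apache.spark.sql.Row and
// org.apache.spark.sql.catalyst.expressions.GenericRow.
trait Row {
  def size: Int
  def get(i: Int): Any
}

class GenericRow(values: Array[Any]) extends Row {
  def size: Int = values.length
  def get(i: Int): Any = values(i)
}

def merge(rows: Row*): Row = {
  // First pass: compute the total number of fields across all rows.
  var total = 0
  var i = 0
  while (i < rows.size) {
    total += rows(i).size
    i += 1
  }
  // Second pass: copy every field into one preallocated container,
  // tracking the write position with a single running offset.
  val container = new Array[Any](total)
  var n = 0
  i = 0
  while (i < rows.size) {
    val row = rows(i)
    var j = 0
    while (j < row.size) {
      container(n) = row.get(j)
      n += 1
      j += 1
    }
    i += 1
  }
  new GenericRow(container)
}
```

The `while` loops and preallocated `Array[Any]` avoid both the closure allocation of `for`/`flatMap` and the intermediate `Seq` per row, which is where the benchmark numbers quoted in the comment would come from.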