[ 
https://issues.apache.org/jira/browse/SPARK-26318?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16716364#comment-16716364
 ] 

ASF GitHub Bot commented on SPARK-26318:
----------------------------------------

KyleLi1985 commented on a change in pull request #23271: [SPARK-26318][SQL] 
Enhance function merge performance in Row
URL: https://github.com/apache/spark/pull/23271#discussion_r240491672
 
 

 ##########
 File path: sql/catalyst/src/main/scala/org/apache/spark/sql/Row.scala
 ##########
 @@ -58,8 +58,21 @@ object Row {
    * Merge multiple rows into a single row, one after another.
    */
   def merge(rows: Row*): Row = {
-    // TODO: Improve the performance of this if used in performance critical part.
-    new GenericRow(rows.flatMap(_.toSeq).toArray)
+    val size = rows.size
+    var number = 0
+    for (i <- 0 until size) {
+      number = number + rows(i).size
+    }
+    val container = Array.ofDim[Any](number)
+    var n = 0
+    for (i <- 0 until size) {
 
 Review comment:
   Using only primitive size/offset bookkeeping and filling the container directly improves the performance further:
   calling Row.merge(row1) 100000000 times takes 18064 milliseconds
   calling Row.merge(rows: _*) 100000000 times takes 25651 milliseconds
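A self-contained sketch of the two-pass approach the diff starts (count total fields first, allocate the target array once, then copy), operating on plain `Seq[Any]` rather than Spark's `Row` so it runs standalone; the object and method names here are illustrative, not the actual Row.scala code:

```scala
// Sketch of the pre-sized-array merge discussed in the review.
// Avoids flatMap's intermediate collections by allocating the
// result array once and copying fields in with index counters.
object MergeSketch {
  def merge(rows: Seq[Seq[Any]]): Array[Any] = {
    // First pass: total field count, so the container is sized exactly once.
    var total = 0
    var i = 0
    while (i < rows.size) {
      total += rows(i).size
      i += 1
    }
    val container = Array.ofDim[Any](total)
    // Second pass: copy each row's fields into the container in order.
    var n = 0
    i = 0
    while (i < rows.size) {
      val row = rows(i)
      var j = 0
      while (j < row.size) {
        container(n) = row(j)
        n += 1
        j += 1
      }
      i += 1
    }
    container
  }
}
```

The while-loop form mirrors the PR's intent: no intermediate `Seq` from `flatMap` and no growth/resizing of the result, which is where the reported speedup comes from.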

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Enhance function merge performance in Row
> -----------------------------------------
>
>                 Key: SPARK-26318
>                 URL: https://issues.apache.org/jira/browse/SPARK-26318
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Liang Li
>            Priority: Minor
>
> Enhance the performance of the merge function in Row.
> For example, calling Row.merge 100000000 times on the input
> val row1 = Row("name", "work", 2314, "null", 1, "") takes 108458
> milliseconds.
> After the enhancement, it takes only 24967 milliseconds.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
