Github user cxzl25 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21311#discussion_r189997697
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/execution/joins/HashedRelationSuite.scala ---
    @@ -254,6 +254,30 @@ class HashedRelationSuite extends SparkFunSuite with SharedSQLContext {
         map.free()
       }
     
    +  test("LongToUnsafeRowMap with big values") {
    +    val taskMemoryManager = new TaskMemoryManager(
    +      new StaticMemoryManager(
    +        new SparkConf().set(MEMORY_OFFHEAP_ENABLED.key, "false"),
    +        Long.MaxValue,
    +        Long.MaxValue,
    +        1),
    +      0)
    +    val unsafeProj = UnsafeProjection.create(Array[DataType](StringType))
    +    val map = new LongToUnsafeRowMap(taskMemoryManager, 1)
    +
    +    val key = 0L
    +    // the page array is initialized with length 1 << 17 (1M bytes),
    +    // so here we need a value larger than 1 << 18 (2M bytes), to trigger the bug
    +    val bigStr = UTF8String.fromString("x" * (1 << 22))
    --- End diff --
    
    Not necessary.
    I just chose a larger value to make the data loss easier to reproduce.
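For context, the failure mode the test guards against lives in the append path's page-growing logic: the page is a `Long` array that starts at `1 << 17` longs (1 MB) and must be grown, with its contents copied over, when a single appended value does not fit. Below is a minimal, self-contained sketch of that doubling strategy; the object and method names (`PageGrowSketch`, `grow`, `append`) are hypothetical and this is not Spark's actual `LongToUnsafeRowMap` code, only an illustration of the mechanism under discussion.

```scala
// Hypothetical sketch of a doubling page, mirroring only the growth
// mechanism discussed in the review (not Spark's real implementation).
object PageGrowSketch {
  var page: Array[Long] = new Array[Long](1 << 17) // initial 1 MB page
  var cursor: Int = 0 // next free slot, counted in longs

  // Double the page until `needed` more longs fit, copying existing data.
  def grow(needed: Int): Unit = {
    if (cursor + needed > page.length) {
      var newLen = page.length
      while (cursor + needed > newLen) newLen *= 2
      val newPage = new Array[Long](newLen)
      // Forgetting this copy (or keeping a stale reference to the old
      // page) is exactly the kind of bug a regression test like the one
      // in the diff is meant to catch.
      System.arraycopy(page, 0, newPage, 0, cursor)
      page = newPage
    }
  }

  def append(words: Array[Long]): Unit = {
    grow(words.length)
    System.arraycopy(words, 0, page, cursor, words.length)
    cursor += words.length
  }
}
```

A `1 << 22`-byte string, as in the test, occupies `1 << 19` longs (4 MB), which forces the page to grow past both its 1 MB initial size and the 2 MB threshold the comment cites.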


---
