[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-05-14 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r188161280
  
--- Diff: core/src/main/scala/org/apache/spark/util/io/ChunkedByteBuffer.scala ---
@@ -63,10 +63,15 @@ private[spark] class ChunkedByteBuffer(var chunks: Array[ByteBuffer]) {
    */
   def writeFully(channel: WritableByteChannel): Unit = {
     for (bytes <- getChunks()) {
-      while (bytes.remaining() > 0) {
-        val ioSize = Math.min(bytes.remaining(), bufferWriteChunkSize)
-        bytes.limit(bytes.position() + ioSize)
-        channel.write(bytes)
+      val curChunkLimit = bytes.limit()
+      while (bytes.hasRemaining) {
+        try {
+          val ioSize = Math.min(bytes.remaining(), bufferWriteChunkSize)
+          bytes.limit(bytes.position() + ioSize)
+          channel.write(bytes)
+        } finally {
--- End diff --

I get your point: if there is an exception, there is no next loop iteration and we 
don't need to restore the limit, so the try/finally is not needed.


---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-05-14 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r188160813
  
--- Diff: core/src/main/scala/org/apache/spark/util/io/ChunkedByteBuffer.scala ---
@@ -63,10 +63,15 @@ private[spark] class ChunkedByteBuffer(var chunks: Array[ByteBuffer]) {
    */
   def writeFully(channel: WritableByteChannel): Unit = {
     for (bytes <- getChunks()) {
-      while (bytes.remaining() > 0) {
-        val ioSize = Math.min(bytes.remaining(), bufferWriteChunkSize)
-        bytes.limit(bytes.position() + ioSize)
-        channel.write(bytes)
+      val curChunkLimit = bytes.limit()
+      while (bytes.hasRemaining) {
+        try {
+          val ioSize = Math.min(bytes.remaining(), bufferWriteChunkSize)
+          bytes.limit(bytes.position() + ioSize)
+          channel.write(bytes)
+        } finally {
--- End diff --

I think the problem is that `bytes.limit(bytes.position() + ioSize)` changes 
the result of `bytes.hasRemaining`, so we have to restore the limit in each 
loop iteration.
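To make this concrete, here is a minimal, self-contained sketch of the interaction described above, using plain `java.nio` with made-up sizes; `channel.write` is simulated by advancing the position:

```scala
import java.nio.ByteBuffer

val bufferWriteChunkSize = 4          // hypothetical cap, for illustration only
val bytes = ByteBuffer.allocate(10)   // one "chunk" of 10 bytes
val curChunkLimit = bytes.limit()     // 10

// One loop iteration: shrink the limit to position + ioSize before the write.
val ioSize = Math.min(bytes.remaining(), bufferWriteChunkSize)
bytes.limit(bytes.position() + ioSize)  // limit is now 4
bytes.position(bytes.limit())           // simulate channel.write consuming the slice

// Without restoring the limit, the loop condition would already be false,
// leaving 6 bytes unwritten:
assert(!bytes.hasRemaining)

// Restoring the original limit re-exposes the rest of the chunk:
bytes.limit(curChunkLimit)
assert(bytes.hasRemaining && bytes.remaining() == 6)
```

This is why the restore has to happen inside the loop body rather than once per chunk.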


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-05-14 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r188160044
  
--- Diff: core/src/main/scala/org/apache/spark/util/io/ChunkedByteBuffer.scala ---
@@ -63,10 +63,15 @@ private[spark] class ChunkedByteBuffer(var chunks: Array[ByteBuffer]) {
    */
   def writeFully(channel: WritableByteChannel): Unit = {
     for (bytes <- getChunks()) {
-      while (bytes.remaining() > 0) {
-        val ioSize = Math.min(bytes.remaining(), bufferWriteChunkSize)
-        bytes.limit(bytes.position() + ioSize)
-        channel.write(bytes)
+      val curChunkLimit = bytes.limit()
+      while (bytes.hasRemaining) {
+        try {
+          val ioSize = Math.min(bytes.remaining(), bufferWriteChunkSize)
+          bytes.limit(bytes.position() + ioSize)
+          channel.write(bytes)
+        } finally {
--- End diff --

Do you mean this is not a real bug that can cause a real workload to fail?


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-05-14 Thread JoshRosen
Github user JoshRosen commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r188154900
  
--- Diff: core/src/main/scala/org/apache/spark/util/io/ChunkedByteBuffer.scala ---
@@ -63,10 +63,15 @@ private[spark] class ChunkedByteBuffer(var chunks: Array[ByteBuffer]) {
    */
   def writeFully(channel: WritableByteChannel): Unit = {
     for (bytes <- getChunks()) {
-      while (bytes.remaining() > 0) {
-        val ioSize = Math.min(bytes.remaining(), bufferWriteChunkSize)
-        bytes.limit(bytes.position() + ioSize)
-        channel.write(bytes)
+      val curChunkLimit = bytes.limit()
+      while (bytes.hasRemaining) {
+        try {
+          val ioSize = Math.min(bytes.remaining(), bufferWriteChunkSize)
+          bytes.limit(bytes.position() + ioSize)
--- End diff --

The rationale for the `limit()` isn't super-clear, but that was already a problem 
in the original PR which introduced the bug (#18730). I'm commenting here only 
for cross-reference purposes, for folks who come across this patch in the 
future. I believe that the original motivation was 
http://www.evanjones.ca/java-bytebuffer-leak.html
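For future readers: the linked article describes how the JDK, when writing a heap `ByteBuffer` to a channel, copies the buffer's entire *remaining* region into a cached per-thread direct buffer, so capping each `write()` call bounds that hidden allocation. A standalone sketch of the capped-write loop (essentially the patched method, reproduced here against plain `java.nio` with an assumed 64 KB cap rather than Spark's `bufferWriteChunkSize` config):

```scala
import java.io.ByteArrayOutputStream
import java.nio.ByteBuffer
import java.nio.channels.{Channels, WritableByteChannel}

val bufferWriteChunkSize = 64 * 1024  // assumed cap, stands in for the Spark conf

def writeFully(channel: WritableByteChannel, bytes: ByteBuffer): Unit = {
  val curChunkLimit = bytes.limit()
  while (bytes.hasRemaining) {
    try {
      val ioSize = Math.min(bytes.remaining(), bufferWriteChunkSize)
      bytes.limit(bytes.position() + ioSize)  // cap the size of this write() call
      channel.write(bytes)
    } finally {
      bytes.limit(curChunkLimit)              // re-expose the rest of the buffer
    }
  }
}

val out = new ByteArrayOutputStream()
writeFully(Channels.newChannel(out), ByteBuffer.allocate(1024 * 1024))
assert(out.size() == 1024 * 1024)  // all bytes written, 64 KB per write() call
```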


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-05-14 Thread JoshRosen
Github user JoshRosen commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r188154716
  
--- Diff: core/src/main/scala/org/apache/spark/util/io/ChunkedByteBuffer.scala ---
@@ -63,10 +63,15 @@ private[spark] class ChunkedByteBuffer(var chunks: Array[ByteBuffer]) {
    */
   def writeFully(channel: WritableByteChannel): Unit = {
     for (bytes <- getChunks()) {
-      while (bytes.remaining() > 0) {
-        val ioSize = Math.min(bytes.remaining(), bufferWriteChunkSize)
-        bytes.limit(bytes.position() + ioSize)
-        channel.write(bytes)
+      val curChunkLimit = bytes.limit()
+      while (bytes.hasRemaining) {
+        try {
+          val ioSize = Math.min(bytes.remaining(), bufferWriteChunkSize)
+          bytes.limit(bytes.position() + ioSize)
+          channel.write(bytes)
+        } finally {
--- End diff --

I don't think we need the `try` and `finally` here because `getChunks()` 
returns duplicated ByteBuffers which have their own position and limit.
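A quick illustration of that `ByteBuffer.duplicate()` property (sizes here are arbitrary): the copy shares the underlying bytes but carries its own position and limit, so mutating the duplicate cannot disturb the caller's view.

```scala
import java.nio.ByteBuffer

val original = ByteBuffer.allocate(8)
val dup = original.duplicate()  // shared content, independent position/limit

dup.limit(3)      // mutate only the duplicate's limit...
dup.position(2)   // ...and position

assert(original.limit() == 8 && original.position() == 0)  // original untouched
assert(dup.limit() == 3 && dup.position() == 2)
```

This is why failing to restore the limit of a duplicated chunk after an exception would be harmless to later callers.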


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-05-02 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/21175


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-05-02 Thread manbuyun
Github user manbuyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r185415529
  
--- Diff: core/src/test/scala/org/apache/spark/io/ChunkedByteBufferSuite.scala ---
@@ -20,12 +20,12 @@ package org.apache.spark.io
 import java.nio.ByteBuffer
 
 import com.google.common.io.ByteStreams
-
-import org.apache.spark.SparkFunSuite
+import org.apache.spark.{SparkFunSuite, SharedSparkContext}
--- End diff --

I have fixed and committed. Thanks


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-29 Thread xuanyuanking
Github user xuanyuanking commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184882338
  
--- Diff: core/src/test/scala/org/apache/spark/io/ChunkedByteBufferSuite.scala ---
@@ -20,12 +20,12 @@ package org.apache.spark.io
 import java.nio.ByteBuffer
 
 import com.google.common.io.ByteStreams
-
-import org.apache.spark.SparkFunSuite
+import org.apache.spark.{SparkFunSuite, SharedSparkContext}
--- End diff --

Move `SharedSparkContext` before `SparkFunSuite`.


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-29 Thread xuanyuanking
Github user xuanyuanking commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184882396
  
--- Diff: core/src/test/scala/org/apache/spark/io/ChunkedByteBufferSuite.scala ---
@@ -20,12 +20,12 @@ package org.apache.spark.io
 import java.nio.ByteBuffer
 
 import com.google.common.io.ByteStreams
--- End diff --

Add an empty line after line 22 to separate the Spark and third-party import groups.


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-27 Thread manbuyun
Github user manbuyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184728954
  
--- Diff: core/src/main/scala/org/apache/spark/util/io/ChunkedByteBuffer.scala ---
@@ -63,10 +63,12 @@ private[spark] class ChunkedByteBuffer(var chunks: Array[ByteBuffer]) {
    */
   def writeFully(channel: WritableByteChannel): Unit = {
     for (bytes <- getChunks()) {
-      while (bytes.remaining() > 0) {
+      val curChunkLimit = bytes.limit()
+      while (bytes.hasRemaining) {
         val ioSize = Math.min(bytes.remaining(), bufferWriteChunkSize)
         bytes.limit(bytes.position() + ioSize)
         channel.write(bytes)
+        bytes.limit(curChunkLimit)
--- End diff --

I have committed this modification.


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-27 Thread manbuyun
Github user manbuyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184727560
  
--- Diff: core/src/main/scala/org/apache/spark/util/io/ChunkedByteBuffer.scala ---
@@ -63,10 +63,12 @@ private[spark] class ChunkedByteBuffer(var chunks: Array[ByteBuffer]) {
    */
   def writeFully(channel: WritableByteChannel): Unit = {
     for (bytes <- getChunks()) {
-      while (bytes.remaining() > 0) {
+      val curChunkLimit = bytes.limit()
+      while (bytes.hasRemaining) {
         val ioSize = Math.min(bytes.remaining(), bufferWriteChunkSize)
         bytes.limit(bytes.position() + ioSize)
         channel.write(bytes)
+        bytes.limit(curChunkLimit)
--- End diff --

Right, when `channel.write` throws an `IOException`.


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-27 Thread jiangxb1987
Github user jiangxb1987 commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184713695
  
--- Diff: core/src/main/scala/org/apache/spark/util/io/ChunkedByteBuffer.scala ---
@@ -63,10 +63,12 @@ private[spark] class ChunkedByteBuffer(var chunks: Array[ByteBuffer]) {
    */
   def writeFully(channel: WritableByteChannel): Unit = {
     for (bytes <- getChunks()) {
-      while (bytes.remaining() > 0) {
+      val curChunkLimit = bytes.limit()
+      while (bytes.hasRemaining) {
         val ioSize = Math.min(bytes.remaining(), bufferWriteChunkSize)
         bytes.limit(bytes.position() + ioSize)
         channel.write(bytes)
+        bytes.limit(curChunkLimit)
--- End diff --

I would rewrite this using:
```
try {
  val ioSize = Math.min(bytes.remaining(), bufferWriteChunkSize)
  bytes.limit(bytes.position() + ioSize)
  channel.write(bytes)
} finally {
  bytes.limit(curChunkLimit)
}
```
to be safe.


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-27 Thread manbuyun
Github user manbuyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184706100
  
--- Diff: core/src/test/scala/org/apache/spark/io/ChunkedByteBufferSuite.scala ---
@@ -56,6 +56,15 @@ class ChunkedByteBufferSuite extends SparkFunSuite {
     assert(chunkedByteBuffer.getChunks().head.position() === 0)
   }
 
+  test("SPARK-24107: writeFully() write buffer which is larger than bufferWriteChunkSize") {
+    val bufferWriteChunkSize = Option(SparkEnv.get).map(_.conf.get(config.BUFFER_WRITE_CHUNK_SIZE))
+      .getOrElse(config.BUFFER_WRITE_CHUNK_SIZE.defaultValue.get).toInt
--- End diff --

OK, I have added it. Please check.


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-27 Thread maropu
Github user maropu commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184675304
  
--- Diff: core/src/test/scala/org/apache/spark/io/ChunkedByteBufferSuite.scala ---
@@ -56,6 +56,15 @@ class ChunkedByteBufferSuite extends SparkFunSuite {
     assert(chunkedByteBuffer.getChunks().head.position() === 0)
   }
 
+  test("SPARK-24107: writeFully() write buffer which is larger than bufferWriteChunkSize") {
+    val bufferWriteChunkSize = Option(SparkEnv.get).map(_.conf.get(config.BUFFER_WRITE_CHUNK_SIZE))
+      .getOrElse(config.BUFFER_WRITE_CHUNK_SIZE.defaultValue.get).toInt
--- End diff --

How about setting this value via `spark.buffer.write.chunkSize`?


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-27 Thread manbuyun
Github user manbuyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184668664
  
--- Diff: core/src/test/scala/org/apache/spark/io/ChunkedByteBufferSuite.scala ---
@@ -56,6 +56,13 @@ class ChunkedByteBufferSuite extends SparkFunSuite {
     assert(chunkedByteBuffer.getChunks().head.position() === 0)
   }
 
+  test("SPARK-24107: writeFully() write buffer which is larger than bufferWriteChunkSize") {
+    val chunkedByteBuffer = new ChunkedByteBuffer(Array(ByteBuffer.allocate(80 * 1024 * 1024)))
--- End diff --

I have modified it. Please check.


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-27 Thread maropu
Github user maropu commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184645573
  
--- Diff: core/src/main/scala/org/apache/spark/util/io/ChunkedByteBuffer.scala ---
@@ -63,10 +63,12 @@ private[spark] class ChunkedByteBuffer(var chunks: Array[ByteBuffer]) {
    */
   def writeFully(channel: WritableByteChannel): Unit = {
     for (bytes <- getChunks()) {
+      val limit = bytes.limit()
--- End diff --

How about renaming `limit` to `curChunkLimit`?


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-27 Thread maropu
Github user maropu commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184644678
  
--- Diff: core/src/test/scala/org/apache/spark/io/ChunkedByteBufferSuite.scala ---
@@ -56,6 +56,13 @@ class ChunkedByteBufferSuite extends SparkFunSuite {
     assert(chunkedByteBuffer.getChunks().head.position() === 0)
   }
 
+  test("SPARK-24107: writeFully() write buffer which is larger than bufferWriteChunkSize") {
+    val chunkedByteBuffer = new ChunkedByteBuffer(Array(ByteBuffer.allocate(80 * 1024 * 1024)))
--- End diff --

Can you configure `bufferWriteChunkSize` explicitly?


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-27 Thread maropu
Github user maropu commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184644454
  
--- Diff: core/src/main/scala/org/apache/spark/util/io/ChunkedByteBuffer.scala ---
@@ -63,10 +63,12 @@ private[spark] class ChunkedByteBuffer(var chunks: Array[ByteBuffer]) {
    */
   def writeFully(channel: WritableByteChannel): Unit = {
     for (bytes <- getChunks()) {
+      val limit = bytes.limit()
       while (bytes.remaining() > 0) {
--- End diff --

This is not related to this PR, but how about `while (bytes.hasRemaining) {`?


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-27 Thread manbuyun
Github user manbuyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184612119
  
--- Diff: core/src/test/scala/org/apache/spark/io/ChunkedByteBufferSuite.scala ---
@@ -56,6 +56,12 @@ class ChunkedByteBufferSuite extends SparkFunSuite {
     assert(chunkedByteBuffer.getChunks().head.position() === 0)
   }
 
+  test("SPARK-24107: writeFully() write buffer which is larger than bufferWriteChunkSize") {
+    val chunkedByteBuffer = new ChunkedByteBuffer(Array(ByteBuffer.allocate(80 * 1024 * 1024)))
+    chunkedByteBuffer.writeFully(new ByteArrayWritableChannel(chunkedByteBuffer.size.toInt))
+    assert(chunkedByteBuffer.size === (80L * 1024L * 1024L))
--- End diff --

My mistake; it has been fixed. Thanks


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-27 Thread Ngone51
Github user Ngone51 commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184607965
  
--- Diff: core/src/test/scala/org/apache/spark/io/ChunkedByteBufferSuite.scala ---
@@ -56,6 +56,12 @@ class ChunkedByteBufferSuite extends SparkFunSuite {
     assert(chunkedByteBuffer.getChunks().head.position() === 0)
   }
 
+  test("SPARK-24107: writeFully() write buffer which is larger than bufferWriteChunkSize") {
+    val chunkedByteBuffer = new ChunkedByteBuffer(Array(ByteBuffer.allocate(80 * 1024 * 1024)))
+    chunkedByteBuffer.writeFully(new ByteArrayWritableChannel(chunkedByteBuffer.size.toInt))
+    assert(chunkedByteBuffer.size === (80L * 1024L * 1024L))
--- End diff --

Assert `ByteArrayWritableChannel`'s size, not `chunkedByteBuffer`'s size.


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-26 Thread manbuyun
Github user manbuyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184603606
  
--- Diff: core/src/test/scala/org/apache/spark/io/ChunkedByteBufferSuite.scala ---
@@ -56,6 +56,12 @@ class ChunkedByteBufferSuite extends SparkFunSuite {
     assert(chunkedByteBuffer.getChunks().head.position() === 0)
   }
 
+  test("writeFully() can write buffer which is larger than bufferWriteChunkSize correctly") {
+    val chunkedByteBuffer = new ChunkedByteBuffer(Array(ByteBuffer.allocate(80*1024*1024)))
--- End diff --

Done. Thanks


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-26 Thread manbuyun
Github user manbuyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184603588
  
--- Diff: core/src/test/scala/org/apache/spark/io/ChunkedByteBufferSuite.scala ---
@@ -56,6 +56,12 @@ class ChunkedByteBufferSuite extends SparkFunSuite {
     assert(chunkedByteBuffer.getChunks().head.position() === 0)
   }
 
+  test("writeFully() can write buffer which is larger than bufferWriteChunkSize correctly") {
--- End diff --

Done. Thanks


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-26 Thread kiszk
Github user kiszk commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184602233
  
--- Diff: core/src/test/scala/org/apache/spark/io/ChunkedByteBufferSuite.scala ---
@@ -56,6 +56,12 @@ class ChunkedByteBufferSuite extends SparkFunSuite {
     assert(chunkedByteBuffer.getChunks().head.position() === 0)
   }
 
+  test("writeFully() can write buffer which is larger than bufferWriteChunkSize correctly") {
--- End diff --

nit: Would it be possible to add `SPARK-24107: ` at the start of the string? It 
would help us connect a UT with the JIRA entry.


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-26 Thread Ngone51
Github user Ngone51 commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184597197
  
--- Diff: core/src/test/scala/org/apache/spark/io/ChunkedByteBufferSuite.scala ---
@@ -56,6 +56,12 @@ class ChunkedByteBufferSuite extends SparkFunSuite {
     assert(chunkedByteBuffer.getChunks().head.position() === 0)
   }
 
+  test("writeFully() can write buffer which is larger than bufferWriteChunkSize correctly") {
+    val chunkedByteBuffer = new ChunkedByteBuffer(Array(ByteBuffer.allocate(80*1024*1024)))
--- End diff --

nit: add spaces around `*`.


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-26 Thread Ngone51
Github user Ngone51 commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184596199
  
--- Diff: core/src/test/scala/org/apache/spark/io/ChunkedByteBufferSuite.scala ---
@@ -56,6 +56,12 @@ class ChunkedByteBufferSuite extends SparkFunSuite {
     assert(chunkedByteBuffer.getChunks().head.position() === 0)
   }
 
+  test("writeFully() can write buffer which is larger than bufferWriteChunkSize correctly") {
+    val chunkedByteBuffer = new ChunkedByteBuffer(Array(ByteBuffer.allocate(80*1024*1024)))
+    chunkedByteBuffer.writeFully(new ByteArrayWritableChannel(chunkedByteBuffer.size.toInt))
+    assert(chunkedByteBuffer.getChunks().head.position() === 0)
--- End diff --

This assert is unnecessary for this PR's change. Please replace it with an 
assert on the channel's length here.


---




[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-26 Thread manbuyun
Github user manbuyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184594822
  
--- Diff: core/src/test/scala/org/apache/spark/io/ChunkedByteBufferSuite.scala ---
@@ -56,6 +56,12 @@ class ChunkedByteBufferSuite extends SparkFunSuite {
     assert(chunkedByteBuffer.getChunks().head.position() === 0)
   }
 
+  test("writeFully() does not affect original buffer's position") {
--- End diff --

Done. Thanks


---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #21175: [SPARK-24107][CORE] ChunkedByteBuffer.writeFully ...

2018-04-26 Thread Ngone51
Github user Ngone51 commented on a diff in the pull request:

https://github.com/apache/spark/pull/21175#discussion_r184590989
  
--- Diff: core/src/test/scala/org/apache/spark/io/ChunkedByteBufferSuite.scala ---
@@ -56,6 +56,12 @@ class ChunkedByteBufferSuite extends SparkFunSuite {
     assert(chunkedByteBuffer.getChunks().head.position() === 0)
   }
 
+  test("writeFully() does not affect original buffer's position") {
--- End diff --

Hi @manbuyun. You should add a new unit test to cover your own change, for 
example "writeFully() can write buffer which is larger than 
`bufferWriteChunkSize` correctly", and update the test code accordingly.


---
