[GitHub] spark pull request #23225: [SPARK-26287][CORE]Don't need to create an empty ...

2018-12-09 Thread wangjiaochun
Github user wangjiaochun commented on a diff in the pull request:

https://github.com/apache/spark/pull/23225#discussion_r239990587
  
--- Diff: 
core/src/main/java/org/apache/spark/shuffle/sort/ShuffleExternalSorter.java ---
@@ -161,6 +161,10 @@ private void writeSortedFile(boolean isLastFile) {
 final ShuffleInMemorySorter.ShuffleSorterIterator sortedRecords =
   inMemSorter.getSortedIterator();
 
+    // If there are no sorted records, we don't need to create an empty spill file.
+    if (!sortedRecords.hasNext()) {
+      return;
+    }
--- End diff --

Okay, I will make the changes.


---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #23225: [SPARK-26287][CORE]Don't need to create an empty ...

2018-12-09 Thread wangjiaochun
Github user wangjiaochun commented on a diff in the pull request:

https://github.com/apache/spark/pull/23225#discussion_r239990083
  
--- Diff: 
core/src/test/java/org/apache/spark/shuffle/sort/UnsafeShuffleWriterSuite.java 
---
@@ -235,6 +235,7 @@ public void writeEmptyIterator() throws Exception {
     final Option<MapStatus> mapStatus = writer.stop(true);
     assertTrue(mapStatus.isDefined());
     assertTrue(mergedOutputFile.exists());
+    assertEquals(0, spillFilesCreated.size());
--- End diff --

I think it's unnecessary to add a new test case; we can also delete lines 
239~242 of the writeEmptyIterator test, because those assertions always hold.


---




[GitHub] spark issue #23225: [SPARK-26287][CORE]Don't need to create an empty spill f...

2018-12-09 Thread wangjiaochun
Github user wangjiaochun commented on the issue:

https://github.com/apache/spark/pull/23225
  
Okay. @dongjoon-hyun


---




[GitHub] spark pull request #23225: [SPARK-26287][CORE]Don't need to create an empty ...

2018-12-07 Thread wangjiaochun
Github user wangjiaochun commented on a diff in the pull request:

https://github.com/apache/spark/pull/23225#discussion_r239989805
  
--- Diff: 
core/src/main/java/org/apache/spark/shuffle/sort/ShuffleExternalSorter.java ---
@@ -161,6 +161,10 @@ private void writeSortedFile(boolean isLastFile) {
 final ShuffleInMemorySorter.ShuffleSorterIterator sortedRecords =
   inMemSorter.getSortedIterator();
 
+    // If there are no sorted records, we don't need to create an empty spill file.
+    if (!sortedRecords.hasNext()) {
+      return;
+    }
--- End diff --

I think it's better not to do that, because it would change the original code 
style and doesn't make an appreciable difference in readability.


---




[GitHub] spark pull request #23225: [SPARK-26287][CORE]Don't need to create an empty ...

2018-12-06 Thread wangjiaochun
Github user wangjiaochun commented on a diff in the pull request:

https://github.com/apache/spark/pull/23225#discussion_r239711919
  
--- Diff: 
core/src/test/java/org/apache/spark/shuffle/sort/UnsafeShuffleWriterSuite.java 
---
@@ -562,4 +562,18 @@ public void testPeakMemoryUsed() throws Exception {
 }
   }
 
+  @Test
+  public void writeEmptyIteratorNotCreateEmptySpillFile() throws Exception {
+    final UnsafeShuffleWriter<Object, Object> writer = createWriter(true);
+    writer.write(Iterators.emptyIterator());
+    final Option<MapStatus> mapStatus = writer.stop(true);
+    assertTrue(mapStatus.isDefined());
+    assertTrue(mergedOutputFile.exists());
+    assertEquals(0, spillFilesCreated.size());
--- End diff --

That's it.


---




[GitHub] spark pull request #23225: [SPARK-26287][CORE]Don't need to create an empty ...

2018-12-06 Thread wangjiaochun
Github user wangjiaochun commented on a diff in the pull request:

https://github.com/apache/spark/pull/23225#discussion_r239704796
  
--- Diff: 
core/src/test/java/org/apache/spark/shuffle/sort/UnsafeShuffleWriterSuite.java 
---
@@ -562,4 +562,18 @@ public void testPeakMemoryUsed() throws Exception {
 }
   }
 
+  @Test
+  public void writeEmptyIteratorNotCreateEmptySpillFile() throws Exception {
+    final UnsafeShuffleWriter<Object, Object> writer = createWriter(true);
+    writer.write(Iterators.emptyIterator());
+    final Option<MapStatus> mapStatus = writer.stop(true);
+    assertTrue(mapStatus.isDefined());
+    assertTrue(mergedOutputFile.exists());
+    assertEquals(0, spillFilesCreated.size());
--- End diff --

I mean that before adding the code `if (!sortedRecords.hasNext()) { return; }`, 
this assertion would fail. Now, adding assertEquals(0, spillFilesCreated.size()) 
to writeEmptyIterator seems good.


---




[GitHub] spark pull request #23225: [SPARK-26287][CORE]Don't need to create an empty ...

2018-12-06 Thread wangjiaochun
Github user wangjiaochun commented on a diff in the pull request:

https://github.com/apache/spark/pull/23225#discussion_r239700999
  
--- Diff: 
core/src/test/java/org/apache/spark/shuffle/sort/UnsafeShuffleWriterSuite.java 
---
@@ -562,4 +562,18 @@ public void testPeakMemoryUsed() throws Exception {
 }
   }
 
+  @Test
+  public void writeEmptyIteratorNotCreateEmptySpillFile() throws Exception {
+    final UnsafeShuffleWriter<Object, Object> writer = createWriter(true);
+    writer.write(Iterators.emptyIterator());
+    final Option<MapStatus> mapStatus = writer.stop(true);
+    assertTrue(mapStatus.isDefined());
+    assertTrue(mergedOutputFile.exists());
+    assertEquals(0, spillFilesCreated.size());
--- End diff --

The writeEmptyIterator() test creates a spill file even though it writes an 
empty iterator, but the new writeEmptyIteratorNotCreateEmptySpillFile test does 
not create a spill file when there are no records in memory.
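The behavioral difference can be sketched with a tiny self-contained model (the names below are illustrative, not Spark's actual classes): with the guard, an empty record iterator produces zero spill files; without it, one empty file is still created.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;

// Toy model of spill-file creation; illustrative only, not Spark code.
class SpillModel {
    // Returns the number of "spill files" created for the given records.
    static int spillFilesCreated(Iterator<String> records, boolean skipEmpty) {
        List<String> spills = new ArrayList<>();
        if (skipEmpty && !records.hasNext()) {
            return spills.size(); // guarded path: no file is created
        }
        spills.add("spill-0.tmp"); // unguarded path always creates one
        while (records.hasNext()) {
            records.next(); // records would be written to the file here
        }
        return spills.size();
    }
}
```

For an empty iterator the unguarded path yields 1 and the guarded path yields 0, which is what assertEquals(0, spillFilesCreated.size()) pins down.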


---




[GitHub] spark issue #23247: [SPARK-26294][CORE]Delete Unnecessary If statement

2018-12-06 Thread wangjiaochun
Github user wangjiaochun commented on the issue:

https://github.com/apache/spark/pull/23247
  
OK, thanks. I will focus on submitting bug fixes later.


---




[GitHub] spark issue #23247: [SPARK-26294][CORE]Delete Unnecessary If statement

2018-12-06 Thread wangjiaochun
Github user wangjiaochun commented on the issue:

https://github.com/apache/spark/pull/23247
  
> @wangjiaochun, I think you'd better stop fixing trivial stuff in each PR. 
Such things can be fixed when the surrounding code is changed, or other 
people can fix them later.

OK, thanks. I will focus on submitting bug fixes later.



---




[GitHub] spark pull request #23246: [SPARK-26292][CORE]Assert statement of currentPag...

2018-12-06 Thread wangjiaochun
Github user wangjiaochun closed the pull request at:

https://github.com/apache/spark/pull/23246


---




[GitHub] spark pull request #23247: [SPARK-26294][CORE]Delete Unnecessary If statemen...

2018-12-06 Thread wangjiaochun
GitHub user wangjiaochun opened a pull request:

https://github.com/apache/spark/pull/23247

[SPARK-26294][CORE]Delete Unnecessary If statement

## What changes were proposed in this pull request?
Delete an unnecessary if statement: it can never execute when records is less 
than or equal to zero, because the enclosing code only runs when records is 
greater than zero.

## How was this patch tested?
Existing tests

(Please explain how this patch was tested. E.g. unit tests, integration 
tests, manual tests)
(If this patch involves UI changes, please attach a screenshot; otherwise, 
remove this)

Please review http://spark.apache.org/contributing.html before opening a 
pull request.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/wangjiaochun/spark inMemSorter

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/23247.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #23247


commit d5aea03764159914942eac2d3e5565ee9862424f
Author: 10087686 
Date:   2018-12-06T11:38:59Z

Delete Unnecessary If statement




---




[GitHub] spark pull request #23246: [SPARK-26292][CORE]Assert statement of currentPag...

2018-12-06 Thread wangjiaochun
GitHub user wangjiaochun opened a pull request:

https://github.com/apache/spark/pull/23246

[SPARK-26292][CORE]Assert statement of currentPage may not be in the right place

## What changes were proposed in this pull request?
The assert statement on currentPage is not in the right place; it should be 
moved to just after the allocatePage call in the acquireNewPageIfNecessary 
function.
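The proposed placement can be sketched with a simplified, hypothetical version of the method (illustrative names; not the actual Spark code): the assert sits immediately after the allocation that is supposed to establish the invariant it checks.

```java
// Simplified, hypothetical sketch of acquireNewPageIfNecessary;
// illustrative only, not the actual Spark code.
class PageHolder {
    static final int PAGE_SIZE_BYTES = 4096;
    byte[] currentPage = null;
    int pageCursor = 0;

    void acquireNewPageIfNecessary(int required) {
        if (currentPage == null || pageCursor + required > PAGE_SIZE_BYTES) {
            currentPage = new byte[PAGE_SIZE_BYTES]; // stand-in for allocatePage(...)
            pageCursor = 0;
            // Proposed placement: assert right after the allocation that is
            // supposed to establish the invariant.
            assert currentPage != null : "allocatePage returned no page";
        }
        pageCursor += required;
    }
}
```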

## How was this patch tested?
Existing tests



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/wangjiaochun/spark SorterPagePointer

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/23246.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #23246


commit 877cf4cf8d2a0f3c6868980eabaa41b5ee20767b
Author: 10087686 
Date:   2018-12-06T10:49:49Z

assert place is not right




---




[GitHub] spark issue #23229: [MINOR][CORE] Modify some field name because it may be c...

2018-12-05 Thread wangjiaochun
Github user wangjiaochun commented on the issue:

https://github.com/apache/spark/pull/23229
  
OK, closing this PR.


---




[GitHub] spark pull request #23229: [MINOR][CORE] Modify some field name because it m...

2018-12-05 Thread wangjiaochun
Github user wangjiaochun closed the pull request at:

https://github.com/apache/spark/pull/23229


---




[GitHub] spark issue #23225: [MINOR][CORE]Don't need to create an empty spill file wh...

2018-12-05 Thread wangjiaochun
Github user wangjiaochun commented on the issue:

https://github.com/apache/spark/pull/23225
  
1. I think the test case writeEmptyIterator in UnsafeShuffleWriterSuite.java 
covers this scenario.
2. I will propose a JIRA soon.


---




[GitHub] spark pull request #23226: [MINOR][TEST] Add MAXIMUM_PAGE_SIZE_BYTES Excepti...

2018-12-05 Thread wangjiaochun
Github user wangjiaochun commented on a diff in the pull request:

https://github.com/apache/spark/pull/23226#discussion_r239066440
  
--- Diff: 
core/src/test/java/org/apache/spark/unsafe/map/AbstractBytesToBytesMapSuite.java
 ---
@@ -622,6 +622,17 @@ public void initialCapacityBoundsChecking() {
 } catch (IllegalArgumentException e) {
   // expected exception
 }
+
+try {
+  new BytesToBytesMap(
+  taskMemoryManager,
--- End diff --

OK, I will correct this indentation and propose a JIRA.


---




[GitHub] spark pull request #23229: [MINOR][CORE] Modify some field name because it m...

2018-12-05 Thread wangjiaochun
GitHub user wangjiaochun opened a pull request:

https://github.com/apache/spark/pull/23229

[MINOR][CORE] Modify some field names because they may cause confusion

## What changes were proposed in this pull request?
There are different field-name styles for tracking allocated data pages: class 
BytesToBytesMap uses the field name dataPages for its allocated data pages, 
while UnsafeExternalSorter and ShuffleExternalSorter use the field name 
allocatedPages. They are all memory consumers, so I think it is best to use a 
unified name. In addition, the TaskMemoryManager field allocatedPages is 
renamed to pagesBitSet, to indicate that it functions as a bitmap.

## How was this patch tested?
Existing tests



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/wangjiaochun/spark memory_consumer_name

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/23229.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #23229


commit 00fa455a6e145350a2bc5750df54cd0a9d1f0cdc
Author: 10087686 
Date:   2018-12-05T08:48:08Z

  modify field name in MemoryConsumer




---




[GitHub] spark pull request #23226: [MINOR][TEST] Add MAXIMUM_PAGE_SIZE_BYTES Excepti...

2018-12-05 Thread wangjiaochun
GitHub user wangjiaochun opened a pull request:

https://github.com/apache/spark/pull/23226

[MINOR][TEST]  Add MAXIMUM_PAGE_SIZE_BYTES Exception test

## What changes were proposed in this pull request?
Add MAXIMUM_PAGE_SIZE_BYTES Exception test

## How was this patch tested?
Existing tests



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/wangjiaochun/spark BytesToBytesMapSuite

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/23226.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #23226


commit dd3b0e5fe45cd0d9f1bf689a6b4cd3cec41867a1
Author: 10087686 
Date:   2018-12-05T08:10:38Z

  add MAXIMUM_PAGE_SIZE_BYTES Exception test




---




[GitHub] spark pull request #23225: [MINOR][CORE]Don't need to create an empty spill ...

2018-12-05 Thread wangjiaochun
GitHub user wangjiaochun opened a pull request:

https://github.com/apache/spark/pull/23225

[MINOR][CORE]Don't need to create an empty spill file when memory has no records

## What changes were proposed in this pull request?
If there are no records in memory, then we don't need to create an empty temp spill file.
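The change can be sketched with a minimal, self-contained version of the guard (the class and helper names are illustrative; only the hasNext() early return mirrors the actual patch): return before any temp file is allocated when the sorted iterator is empty.

```java
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Minimal sketch of the "skip empty spill" guard; illustrative names,
// not Spark's actual ShuffleExternalSorter.
class SpillWriter {
    final List<File> spillFiles = new ArrayList<>();

    void writeSortedFile(Iterator<String> sortedRecords) throws IOException {
        // If there are no sorted records, we don't need to create an
        // empty spill file: return before touching the filesystem.
        if (!sortedRecords.hasNext()) {
            return;
        }
        File spill = File.createTempFile("spill", ".tmp");
        spill.deleteOnExit();
        spillFiles.add(spill);
        // ... the sorted records would be serialized into `spill` here ...
    }
}
```

With this guard in place, writing an empty iterator leaves spillFiles empty, which is exactly what the new test assertion checks.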

## How was this patch tested?
Existing tests



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/wangjiaochun/spark ShufflSorter

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/23225.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #23225


commit 6f9dcee39131429d9b40df45229149ffdde4fdbd
Author: 10087686 
Date:   2018-12-05T07:52:55Z

"add if"




---




[GitHub] spark issue #18474: [SPARK-21235][TESTS] UTest should clear temp results whe...

2017-08-08 Thread wangjiaochun
Github user wangjiaochun commented on the issue:

https://github.com/apache/spark/pull/18474
  
Thanks, I will resolve all the problems. @kiszk 


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---




[GitHub] spark issue #18867: [SPARK-21663][TESTS]test("remote fetch below max RPC mes...

2017-08-08 Thread wangjiaochun
Github user wangjiaochun commented on the issue:

https://github.com/apache/spark/pull/18867
  
OK, I will update the PR title. @cloud-fan 


---



[GitHub] spark pull request #18867: MapOutputTrackerSuite Utest

2017-08-07 Thread wangjiaochun
GitHub user wangjiaochun opened a pull request:

https://github.com/apache/spark/pull/18867

MapOutputTrackerSuite Utest

Signed-off-by: 10087686 <wang.jiaoc...@zte.com.cn>

## What changes were proposed in this pull request?
After the unit tests end, masterTracker.stop() should be called to free 
resources.
(Please fill in changes proposed in this fix)

## How was this patch tested?
Run Unit tests;


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/wangjiaochun/spark mapout

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/18867.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #18867


commit b0c58c7ff4f0be063e31d9b2f3b0b8b01094c51d
Author: 10087686 <wang.jiaoc...@zte.com.cn>
Date:   2017-08-07T07:56:56Z

mapoutputtrack Utest

Signed-off-by: 10087686 <wang.jiaoc...@zte.com.cn>




---



[GitHub] spark issue #18474: [SPARK-21235][TESTS] UTest should clear temp results whe...

2017-08-06 Thread wangjiaochun
Github user wangjiaochun commented on the issue:

https://github.com/apache/spark/pull/18474
  
Yes, I'm running this on Windows 7.


---



[GitHub] spark issue #18475: [SPARK][Tests] assert messager not right

2017-07-31 Thread wangjiaochun
Github user wangjiaochun commented on the issue:

https://github.com/apache/spark/pull/18475
  
ok!


---



[GitHub] spark pull request #18475: [SPARK][Tests] assert messager not right

2017-07-31 Thread wangjiaochun
Github user wangjiaochun closed the pull request at:

https://github.com/apache/spark/pull/18475


---



[GitHub] spark issue #18474: [SPARK-21235][TESTS] UTest should clear temp results whe...

2017-07-03 Thread wangjiaochun
Github user wangjiaochun commented on the issue:

https://github.com/apache/spark/pull/18474
  
1. Test environment and method: run BlockManagerSuite.scala directly from an 
IDEA project.
2. I stepped through this case again and found that encryptionTest("on-disk 
storage") runs the test twice: when the SparkConf has encryption off (false), 
the disk blocks are cleared; when encryption is on (true), the disk blocks are 
not cleared.


---



[GitHub] spark issue #18474: [SPARK-21235][TESTS] UTest should clear temp results whe...

2017-07-02 Thread wangjiaochun
Github user wangjiaochun commented on the issue:

https://github.com/apache/spark/pull/18474
  
I have run this case many times; the memoryStore temp files are cleared, but 
the disk blocks really are not.


---



[GitHub] spark pull request #18475: [SPARK][Tests] assert messager not right

2017-06-29 Thread wangjiaochun
GitHub user wangjiaochun opened a pull request:

https://github.com/apache/spark/pull/18475

[SPARK][Tests] assert message not right

Signed-off-by: 10087686 <wang.jiaoc...@zte.com.cn>

## What changes were proposed in this pull request?
Modify the assert message;

## How was this patch tested?



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/wangjiaochun/spark master

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/18475.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #18475


commit 9491b11207b4d9b922c700fa3eb3eea9160c1180
Author: 10087686 <wang.jiaoc...@zte.com.cn>
Date:   2017-06-30T02:01:56Z

assert messager not right
Signed-off-by: 10087686 <wang.jiaoc...@zte.com.cn>




---



[GitHub] spark pull request #18474: [SPARK-21235][SPARKR] UTest should clear temp res...

2017-06-29 Thread wangjiaochun
GitHub user wangjiaochun opened a pull request:

https://github.com/apache/spark/pull/18474

[SPARK-21235][SPARKR] UTest should clear temp results when running the case

Signed-off-by: 10087686 <wang.jiaoc...@zte.com.cn>

## What changes were proposed in this pull request?
When the encryptionTest("on-disk storage") case finishes, it leaves temp 
results that are not cleared:

Users\...\AppData\Local\Temp\blockmgr-865114ea-8e5c-4b20-9a25-1224cfe5545b\01\test_a3

Users\...\AppData\Local\Temp\blockmgr-865114ea-8e5c-4b20-9a25-1224cfe5545b\01\test_a2

Users\...\AppData\Local\Temp\blockmgr-865114ea-8e5c-4b20-9a25-1224cfe5545b\01\test_a1

So I think it's best to clear the result files:
store.removeBlock("a1")
store.removeBlock("a2")
store.removeBlock("a3")
## How was this patch tested?

Run encryptionTest("on-disk storage")


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/wangjiaochun/spark commitBlock

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/18474.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #18474


commit 1011d833ab659948dfb1c4dbd51364dbf504c952
Author: 10087686 <wang.jiaoc...@zte.com.cn>
Date:   2017-06-30T01:43:25Z

Utest should clear eniv when run end
Signed-off-by: 10087686 <wang.jiaoc...@zte.com.cn>




---



[GitHub] spark issue #18226: [SPARK-21006][TESTS] Create rpcEnv and run later needs s...

2017-06-07 Thread wangjiaochun
Github user wangjiaochun commented on the issue:

https://github.com/apache/spark/pull/18226
  
OK, I have resubmitted. Thanks for reviewing @srowen 


---



[GitHub] spark pull request #18226: [SPARK-21006][TESTS] Create rpcEnv and run later ...

2017-06-07 Thread wangjiaochun
GitHub user wangjiaochun opened a pull request:

https://github.com/apache/spark/pull/18226

[SPARK-21006][TESTS] Create rpcEnv and run later needs shutdown and 
awaitTermination

Signed-off-by: 10087686 <wang.jiaoc...@zte.com.cn>

## What changes were proposed in this pull request?
When the test("port conflict") case runs, we need to call 
anotherEnv.shutdown() and anotherEnv.awaitTermination() to free resources.
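The cleanup discipline being added can be sketched with a self-contained analogy, using java.util.concurrent.ExecutorService as a stand-in for an RpcEnv-like resource (the names and structure are illustrative, not Spark's actual test code): shut the resource down, then wait for termination so the test does not leak threads.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Analogous cleanup pattern: ExecutorService stands in for an RpcEnv-like
// resource that must be shut down and awaited at the end of a test.
class CleanupPattern {
    static boolean runAndCleanUp() throws InterruptedException {
        ExecutorService anotherEnv = Executors.newSingleThreadExecutor();
        try {
            anotherEnv.submit(() -> { /* the test body would go here */ });
        } finally {
            anotherEnv.shutdown();                              // stop accepting new work
            anotherEnv.awaitTermination(10, TimeUnit.SECONDS);  // wait for threads to exit
        }
        return anotherEnv.isTerminated();
    }
}
```

Placing the shutdown and awaitTermination calls in a finally block ensures the resource is released even if the test body throws.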

## How was this patch tested?
Run the RpcEnvSuite.scala unit tests


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/wangjiaochun/spark master

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/18226.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #18226


commit f6e42a077e4228efd1ff5d742e2e35ade78d212d
Author: 10087686 <wang.jiaoc...@zte.com.cn>
Date:   2017-06-07T07:59:51Z

Create rpcEnv and run later needs shutdown and awaitTermination
Signed-off-by: 10087686 <wang.jiaoc...@zte.com.cn>




---