[GitHub] spark pull request: [SPARK-13746][Tests]stop using deprecated Sync...

2016-03-13 Thread wilson888888888
Github user wilson8 commented on a diff in the pull request:

https://github.com/apache/spark/pull/11580#discussion_r55941313
  
--- Diff: core/src/test/scala/org/apache/spark/ContextCleanerSuite.scala ---
@@ -578,18 +578,27 @@ class CleanerTester(
   }
 
   private def uncleanedResourcesToString = {
+val s1 = {toBeCleanedRDDIds.synchronized {
--- End diff --

@srowen 
Sorry, I overlooked this.  Will change now.  Thanks!
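
For reference, a minimal, self-contained sketch of the snapshot-then-format pattern under discussion (the class and names below are illustrative, not the final patch): take a sorted snapshot of each set while holding its lock, then interpolate the snapshots into the report string.

    import scala.collection.mutable.HashSet

    // Hypothetical sketch only: snapshot each set under its own lock,
    // then build the report from the snapshots.
    class UncleanedReportSketch(rddIds: Seq[Int], shuffleIds: Seq[Int]) {
      private val toBeCleanedRDDIds = new HashSet[Int] ++= rddIds
      private val toBeCleanedShuffleIds = new HashSet[Int] ++= shuffleIds

      def uncleanedResourcesToString: String = {
        val rdds = toBeCleanedRDDIds.synchronized { toBeCleanedRDDIds.toSeq.sorted }
        val shuffles = toBeCleanedShuffleIds.synchronized { toBeCleanedShuffleIds.toSeq.sorted }
        s"""
           |\tRDDs = ${rdds.mkString("[", ", ", "]")}
           |\tShuffles = ${shuffles.mkString("[", ", ", "]")}
         """.stripMargin
      }
    }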





[GitHub] spark pull request: [SPARK-13746][Tests]stop using deprecated Sync...

2016-03-12 Thread wilson888888888
Github user wilson8 commented on a diff in the pull request:

https://github.com/apache/spark/pull/11580#discussion_r55933247
  
--- Diff: core/src/test/scala/org/apache/spark/ContextCleanerSuite.scala ---
@@ -579,17 +579,20 @@ class CleanerTester(
 
   private def uncleanedResourcesToString = {
 s"""
-  |\tRDDs = ${toBeCleanedRDDIds.toSeq.sorted.mkString("[", ", ", "]")}
-  |\tShuffles = ${toBeCleanedShuffleIds.toSeq.sorted.mkString("[", ", ", "]")}
-  |\tBroadcasts = ${toBeCleanedBroadcstIds.toSeq.sorted.mkString("[", ", ", "]")}
+   |\tRDDs = ${toBeCleanedRDDIds.synchronized
--- End diff --

@srowen 
I will change this.  Thanks!





[GitHub] spark pull request: [SPARK-13746][Tests]stop using deprecated Sync...

2016-03-12 Thread wilson888888888
Github user wilson8 commented on a diff in the pull request:

https://github.com/apache/spark/pull/11580#discussion_r55933241
  
--- Diff: core/src/test/scala/org/apache/spark/ContextCleanerSuite.scala ---
@@ -442,25 +442,25 @@ class CleanerTester(
 checkpointIds: Seq[Long] = Seq.empty)
   extends Logging {
 
-  val toBeCleanedRDDIds = new HashSet[Int] with SynchronizedSet[Int] ++= rddIds
-  val toBeCleanedShuffleIds = new HashSet[Int] with SynchronizedSet[Int] ++= shuffleIds
-  val toBeCleanedBroadcstIds = new HashSet[Long] with SynchronizedSet[Long] ++= broadcastIds
-  val toBeCheckpointIds = new HashSet[Long] with SynchronizedSet[Long] ++= checkpointIds
+  val toBeCleanedRDDIds = new HashSet[Int] ++= rddIds
--- End diff --

@srowen 
I tried changing this to val toBeCleanedRDDIds = HashSet(rddIds), but I got a type error at
toBeCleanedRDDIds -= rddId
so I will keep val toBeCleanedRDDIds = new HashSet[Int] ++= rddIds.
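
A small sketch of the type issue, assuming rddIds is a Seq[Int] as in the constructor above (all names here are illustrative):

    import scala.collection.mutable.HashSet

    object HashSetInitSketch {
      def main(args: Array[String]): Unit = {
        val rddIds: Seq[Int] = Seq(1, 2, 3)
        val rddId = 2

        // HashSet(rddIds) infers HashSet[Seq[Int]] holding the whole Seq as
        // a single element, so `set -= rddId` (an Int) is a type error.
        val wrong: HashSet[Seq[Int]] = HashSet(rddIds)

        // Either of these yields a HashSet[Int] of the individual ids and
        // supports `-= rddId`:
        val viaVarargs = HashSet(rddIds: _*)
        val viaAddAll  = new HashSet[Int] ++= rddIds

        viaAddAll -= rddId
        println(wrong)      // one element: List(1, 2, 3)
        println(viaVarargs) // elements 1, 2, 3 (iteration order unspecified)
        println(viaAddAll)  // elements 1, 3
      }
    }

So keeping new HashSet[Int] ++= rddIds (or switching to HashSet(rddIds: _*)) both work; the error comes only from passing the Seq in as a single element.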





[GitHub] spark pull request: [SPARK-13746][Tests]stop using deprecated Sync...

2016-03-09 Thread wilson888888888
Github user wilson8 commented on a diff in the pull request:

https://github.com/apache/spark/pull/11580#discussion_r55634450
  
--- Diff: core/src/test/scala/org/apache/spark/ContextCleanerSuite.scala ---
@@ -586,10 +586,12 @@ class CleanerTester(
   }
 
   private def isAllCleanedUp =
-toBeCleanedRDDIds.isEmpty &&
-toBeCleanedShuffleIds.isEmpty &&
-toBeCleanedBroadcstIds.isEmpty &&
-toBeCheckpointIds.isEmpty
+synchronized {
--- End diff --

@srowen 
Fixed.  Thanks!
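
For context, a hypothetical sketch of the locked emptiness-check pattern shown in the diff (only two of the four sets, names illustrative, not the actual test code):

    import scala.collection.mutable.HashSet

    // Sketch only: mutations and the emptiness check share one lock,
    // so isAllCleanedUp cannot race with concurrent removals.
    class CleanupCheckSketch(rddIds: Seq[Int], shuffleIds: Seq[Int]) {
      private val toBeCleanedRDDIds = new HashSet[Int] ++= rddIds
      private val toBeCleanedShuffleIds = new HashSet[Int] ++= shuffleIds

      def markRDDCleaned(id: Int): Unit = synchronized { toBeCleanedRDDIds -= id }
      def markShuffleCleaned(id: Int): Unit = synchronized { toBeCleanedShuffleIds -= id }

      def isAllCleanedUp: Boolean = synchronized {
        toBeCleanedRDDIds.isEmpty && toBeCleanedShuffleIds.isEmpty
      }
    }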





[GitHub] spark pull request: [SPARK-13746][Tests]stop using deprecated Sync...

2016-03-08 Thread wilson888888888
GitHub user wilson8 opened a pull request:

https://github.com/apache/spark/pull/11580

[SPARK-13746][Tests]stop using deprecated SynchronizedSet

The trait SynchronizedSet in package scala.collection.mutable is deprecated, so this change replaces it with explicit synchronized blocks on plain mutable sets.
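
As a rough sketch of the replacement pattern (hypothetical names, not the actual test code): keep a plain mutable HashSet and guard every access with an explicit synchronized block instead of mixing in SynchronizedSet.

    import scala.collection.mutable.HashSet

    // Deprecated style, for comparison:
    //   val ids = new HashSet[Int] with SynchronizedSet[Int]
    // Replacement: plain HashSet plus explicit locking.
    object SynchronizedSetReplacementSketch {
      private val ids = new HashSet[Int]

      def add(id: Int): Unit = ids.synchronized { ids += id }
      def remove(id: Int): Unit = ids.synchronized { ids -= id }
      def snapshot: Seq[Int] = ids.synchronized { ids.toSeq.sorted }

      def main(args: Array[String]): Unit = {
        (1 to 5).foreach(add)
        remove(3)
        println(snapshot.mkString("[", ", ", "]")) // [1, 2, 4, 5]
      }
    }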



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/wilson8/spark spark-synchronizedset

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/11580.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #11580


commit c0bcba3e9f4426e240b77f7b706aaa8aa643b7f1
Author: Wilson Wu <wilson88...@gmail.com>
Date:   2016-03-08T17:53:08Z

[SPARK-13746][Tests]stop using deprecated SynchronizedSet







[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2015-12-29 Thread wilson888888888
Github user wilson8 closed the pull request at:

https://github.com/apache/spark/pull/10503





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2015-12-29 Thread wilson888888888
Github user wilson8 commented on the pull request:

https://github.com/apache/spark/pull/10503#issuecomment-167742432
  
Used the wrong ID. I will close this for now and open another one. Sorry for the confusion.





[GitHub] spark pull request: [SPARK-12460][SQL]Add ExpressionDescription to...

2015-12-28 Thread wilson888888888
Github user wilson8 commented on the pull request:

https://github.com/apache/spark/pull/10486#issuecomment-167640827
  
@vectorijk 
Thanks. I changed the commit message. 





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2015-12-28 Thread wilson888888888
GitHub user wilson8 opened a pull request:

https://github.com/apache/spark/pull/10503

[SPARK-12506][SQL]push down WHERE clause arithmetic operator to JDBC …

…layer
For an arithmetic operator in a WHERE clause, such as
select * from table where c1 + c2 > 10
the predicate c1 + c2 > 10 is currently evaluated at the Spark layer.
This change pushes it down to the JDBC layer so it is evaluated in the database.
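
A rough, self-contained sketch of the general idea, independent of Spark's actual internals (all names below are hypothetical): represent the predicate as a small expression tree and render it to a SQL fragment that can be appended to the JDBC query's WHERE clause.

    // Hypothetical sketch, not the Spark implementation.
    sealed trait SqlExpr
    case class Col(name: String) extends SqlExpr
    case class Lit(value: Int) extends SqlExpr
    case class Add(left: SqlExpr, right: SqlExpr) extends SqlExpr
    case class GreaterThan(left: SqlExpr, right: SqlExpr) extends SqlExpr

    object PushdownSketch {
      // Render an expression tree to a SQL fragment for the WHERE clause.
      def toSql(e: SqlExpr): String = e match {
        case Col(n)            => n
        case Lit(v)            => v.toString
        case Add(l, r)         => s"(${toSql(l)} + ${toSql(r)})"
        case GreaterThan(l, r) => s"(${toSql(l)} > ${toSql(r)})"
      }

      def main(args: Array[String]): Unit = {
        // c1 + c2 > 10  becomes  ((c1 + c2) > 10), executed by the database.
        val predicate = GreaterThan(Add(Col("c1"), Col("c2")), Lit(10))
        println(s"SELECT * FROM table WHERE ${toSql(predicate)}")
      }
    }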

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/huaxingao/spark spark-12506

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/10503.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #10503


commit 09e2135d6e5abef21328941d73031109e6d4d4b6
Author: Huaxin Gao <huax...@us.ibm.com>
Date:   2015-12-27T19:58:12Z

[SPARK-12506][SQL]push down WHERE clause arithmetic operator to JDBC layer







[GitHub] spark pull request: [SPARK-11460][SQL]Add ExpressionDescription to...

2015-12-26 Thread wilson888888888
GitHub user wilson8 opened a pull request:

https://github.com/apache/spark/pull/10486

[SPARK-11460][SQL]Add ExpressionDescription to aggregate functions



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/wilson8/spark spark_12460

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/10486.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #10486


commit 1a88a15304bc688a724d56b8cef7ff76c51760b3
Author: Wilson Wu <wilson88...@gmail.com>
Date:   2015-12-27T05:21:43Z

[SPARK-11460][SQL]Add ExpressionDescription to aggregate functions



