GitHub user adrian-wang opened a pull request:
https://github.com/apache/spark/pull/6851
[SPARK-8203] [SPARK-8204] [WIP] [SQL] conditional function: least/greatest
@chenghao-intel @zhichao-li @qiansl127
You can merge this pull request into a Git repository by running:
$ git
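The least/greatest conditional functions proposed in the PR above follow null-skipping semantics: null inputs are ignored, and the result is null only when every input is null. A minimal plain-Python sketch of that behavior (an assumption about the intended semantics, not the PR's actual Scala implementation):

```python
def greatest(*args):
    """Largest non-null argument; None only if all arguments are null.
    Sketch of the null-skipping behavior, not Spark's Scala code."""
    non_null = [a for a in args if a is not None]
    return max(non_null) if non_null else None

def least(*args):
    """Smallest non-null argument; None only if all arguments are null."""
    non_null = [a for a in args if a is not None]
    return min(non_null) if non_null else None
```

For example, `greatest(1, None, 3)` yields `3` rather than null, because the null argument is skipped.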
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6775#issuecomment-112719572
@rxin
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6762#discussion_r32591166
--- Diff:
sql/core/src/test/scala/org/apache/spark/sql/DataFrameFunctionsSuite.scala ---
@@ -123,21 +123,145 @@ class DataFrameFunctionsSuite extends
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6762#discussion_r32591224
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -1339,19 +1339,361 @@ object functions
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6843#discussion_r32591920
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringOperations.scala
---
@@ -313,3 +313,131 @@ case class StringLength
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6779#issuecomment-112317630
add a test using `sql()`, and maybe change the filename of misc.scala to
miscFunctions.scala
LGTM otherwise.
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6783#issuecomment-112315090
@zhichao-li if you are using `sql()`, even without those APIs, the query
will still work. These are just for the DataFrame API.
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6775#discussion_r32487610
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -914,6 +914,42 @@ object functions {
def ceil(columnName: String
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6783#issuecomment-112269008
LGTM except the full type APIs
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/5208#issuecomment-112260579
@justmytwospence yes.
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6775#discussion_r32395410
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -914,6 +914,42 @@ object functions {
def ceil(columnName: String
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6775#discussion_r32395319
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringOperations.scala
---
@@ -315,4 +315,65 @@ case class StringLength
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6779#discussion_r32377995
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/misc.scala
---
@@ -0,0 +1,58 @@
+/*
+ * Licensed to the Apache
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6783#discussion_r32378005
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/math.scala
---
@@ -254,3 +255,46 @@ case class Pow(left: Expression
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6783#discussion_r32378007
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/math.scala
---
@@ -254,3 +255,46 @@ case class Pow(left: Expression
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6775#discussion_r32389816
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringOperations.scala
---
@@ -315,4 +315,65 @@ case class StringLength
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6775#discussion_r32389968
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringOperations.scala
---
@@ -315,4 +315,65 @@ case class StringLength
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6775#discussion_r32389929
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringOperations.scala
---
@@ -315,4 +315,65 @@ case class StringLength
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6775#discussion_r32390080
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -914,6 +914,42 @@ object functions {
def ceil(columnName: String
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6775#discussion_r32390057
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringOperations.scala
---
@@ -315,4 +315,65 @@ case class StringLength
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6782#discussion_r32298508
--- Diff:
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/DatetimeFunctionsSuite.scala
---
@@ -0,0 +1,59
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6780#discussion_r32300510
--- Diff:
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/optimizer/LimitPushDownSuit.scala
---
@@ -0,0 +1,72 @@
+/*
+ * Licensed
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6783#discussion_r32299300
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -1205,6 +1205,70 @@ object functions {
def pow(l: Double, rightName
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6783#discussion_r32299333
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -1205,6 +1205,70 @@ object functions {
def pow(l: Double, rightName
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6780#issuecomment-111423603
What's the relationship between this PR and SPARK-7289?
GitHub user adrian-wang opened a pull request:
https://github.com/apache/spark/pull/6782
[SPARK-8185] [SPARK-8186] [SPARK-8187] [SQL] datetime function: date_add,
date_sub, datediff
You can merge this pull request into a Git repository by running:
$ git pull https
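The datetime functions in the PR above have straightforward day-arithmetic semantics: `date_add`/`date_sub` shift a date by a number of days, and `datediff(end, start)` returns the number of days from `start` to `end`. A plain-Python sketch of those semantics (assuming the Hive-compatible argument order; the actual implementation is Catalyst expressions in Scala):

```python
from datetime import date, timedelta

def date_add(start, days):
    # Shift a date forward by `days` whole days.
    return start + timedelta(days=days)

def date_sub(start, days):
    # Shift a date backward by `days` whole days.
    return start - timedelta(days=days)

def datediff(end, start):
    # Number of days from `start` to `end` (negative if end precedes start).
    return (end - start).days
```

So `datediff(date(2015, 6, 15), date(2015, 6, 10))` is `5`.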
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6779#discussion_r32298798
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/misc.scala
---
@@ -0,0 +1,58 @@
+/*
+ * Licensed to the Apache
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6783#discussion_r32299205
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/math.scala
---
@@ -254,3 +255,46 @@ case class Pow(left: Expression
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6783#discussion_r32299255
--- Diff:
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/MathFunctionsSuite.scala
---
@@ -204,4 +204,19 @@ class
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6779#discussion_r32301015
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/misc.scala
---
@@ -0,0 +1,58 @@
+/*
+ * Licensed to the Apache
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/5717#discussion_r32337036
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/joins/SortMergeJoin.scala
---
@@ -36,46 +36,91 @@ import
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/5717#issuecomment-111570564
@JoshRosen Thanks for your comments, I will refine the code accordingly.
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6754#discussion_r32192179
--- Diff:
sql/core/src/test/scala/org/apache/spark/sql/MathExpressionsSuite.scala ---
@@ -211,4 +236,18 @@ class MathExpressionsSuite extends QueryTest
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6733#discussion_r32195271
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetConverter.scala ---
@@ -266,8 +267,8 @@ private[parquet] abstract class
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6733#discussion_r32195542
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateUtils.scala
---
@@ -87,4 +88,33 @@ object DateUtils
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6733#discussion_r32196529
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetConverter.scala ---
@@ -266,8 +267,8 @@ private[parquet] abstract class
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6759#discussion_r32196559
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala
---
@@ -26,7 +26,7 @@ import
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6759#discussion_r32196785
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetConverter.scala ---
@@ -498,69 +493,21 @@ private[parquet] object
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6759#discussion_r32197105
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetConverter.scala ---
@@ -498,69 +493,21 @@ private[parquet] object
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6759#discussion_r32197414
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetConverter.scala ---
@@ -498,69 +493,21 @@ private[parquet] object
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6733#discussion_r32194333
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateUtils.scala
---
@@ -28,6 +28,7 @@ import
GitHub user adrian-wang opened a pull request:
https://github.com/apache/spark/pull/6775
[SPARK-8240] [SPARK-8241] [SQL] string function: concat/concat_ws
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/adrian-wang/spark
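The string functions in the PR above differ in how they treat nulls under Hive-compatible semantics: `concat` propagates null (any null argument makes the result null), while `concat_ws` skips null arguments. A plain-Python sketch of that distinction (an assumption of Hive compatibility, not the PR's actual Scala code):

```python
def concat(*args):
    # Hive-style concat: result is null if ANY argument is null.
    if any(a is None for a in args):
        return None
    return "".join(args)

def concat_ws(sep, *args):
    # concat_ws skips null arguments instead of propagating null.
    return sep.join(a for a in args if a is not None)
```

The asymmetry is deliberate: `concat_ws` is typically used to build delimited output, where dropping missing fields is more useful than nulling the whole result.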
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6759#discussion_r32199472
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetConverter.scala ---
@@ -498,69 +493,21 @@ private[parquet] object
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6718#discussion_r32098163
--- Diff:
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/MathFunctionsSuite.scala
---
@@ -163,6 +163,12 @@ class
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6718#discussion_r32090916
--- Diff:
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/MathFunctionsSuite.scala
---
@@ -163,6 +163,12 @@ class
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6718#discussion_r32091043
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -122,7 +122,7 @@ object functions {
* @group agg_funcs
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6716#discussion_r32092540
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/SQLQuerySuite.scala
---
@@ -145,6 +145,17 @@ class SQLQuerySuite extends QueryTest
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6754#issuecomment-110999758
We are having problems with `If`.
The `checkInputDataTypes` method should enable type coercion.
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6754#discussion_r32191081
--- Diff:
sql/core/src/test/scala/org/apache/spark/sql/MathExpressionsSuite.scala ---
@@ -211,4 +236,18 @@ class MathExpressionsSuite extends QueryTest
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6754#discussion_r32190270
--- Diff:
sql/core/src/test/scala/org/apache/spark/sql/MathExpressionsSuite.scala ---
@@ -211,4 +236,18 @@ class MathExpressionsSuite extends QueryTest
GitHub user adrian-wang opened a pull request:
https://github.com/apache/spark/pull/6718
[SPARK-8217] [SQL] math function log2
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/adrian-wang/spark udflog2
Alternatively you can
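The log2 function added in the PR above is a base-2 logarithm. A plain-Python sketch of one plausible SQL-null convention for it (returning null for null or non-positive input is an assumption here, not necessarily the PR's exact behavior):

```python
import math

def log2(x):
    # Base-2 logarithm; None (SQL null) for null or non-positive input.
    # The null convention is an assumption for illustration.
    if x is None or x <= 0:
        return None
    return math.log2(x)
```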
GitHub user adrian-wang opened a pull request:
https://github.com/apache/spark/pull/6716
[SPARK-8215] [SPARK-8212] [SQL] add leaf math expression for e and pi
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/adrian-wang/spark epi
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6718#discussion_r32088206
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -1106,6 +1106,22 @@ object functions {
def log1p(columnName: String
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6718#discussion_r32088297
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -1106,6 +1106,22 @@ object functions {
def log1p(columnName: String
Github user adrian-wang closed the pull request at:
https://github.com/apache/spark/pull/6697
GitHub user adrian-wang opened a pull request:
https://github.com/apache/spark/pull/6697
[SPARK-8157] [SQL] should use expression.unapply to match in
HiveTypeCoercion
fix bug for #6405
You can merge this pull request into a Git repository by running:
$ git pull https
GitHub user adrian-wang opened a pull request:
https://github.com/apache/spark/pull/6700
[SPARK-8158] [SQL] several fixes for HiveShim
1. explicitly import implicit conversion support.
2. use `.nonEmpty` instead of `.size > 0`
3. use `val` instead of `var`
4. fix comment indentation
You
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6700#issuecomment-109916069
retest this please.
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6434#issuecomment-109917785
Oh, sorry for missing this, I'll update it very soon.
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6700#issuecomment-110053078
retest this please.
GitHub user adrian-wang opened a pull request:
https://github.com/apache/spark/pull/6636
[SPARK-8097] [SQL] add ifExists for dropTempTable
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/adrian-wang/spark dropTT
Alternatively
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6636#discussion_r31702662
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala ---
@@ -679,7 +679,25 @@ class SQLContext(@transient val sparkContext
Github user adrian-wang closed the pull request at:
https://github.com/apache/spark/pull/6636
GitHub user adrian-wang opened a pull request:
https://github.com/apache/spark/pull/6639
[SPARK-4761] [DOC] [SQL] kryo default setting in SQL Thrift server
this is a follow up of #3621
/cc @liancheng @pwendell
You can merge this pull request into a Git repository
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6636#discussion_r31703437
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala ---
@@ -679,7 +679,25 @@ class SQLContext(@transient val sparkContext
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6429#issuecomment-105813936
ok, I'm closing it. Thanks!
Github user adrian-wang closed the pull request at:
https://github.com/apache/spark/pull/6429
GitHub user adrian-wang opened a pull request:
https://github.com/apache/spark/pull/6434
[MINOR] change new Exception to sys.err
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/adrian-wang/spark joinerr
Alternatively you can
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/5717#issuecomment-105867923
retest this please.
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6318#issuecomment-105945257
retest this please.
GitHub user adrian-wang opened a pull request:
https://github.com/apache/spark/pull/6429
[SPARK-6590] [SQL] df.where accept a string conditionExpr
cc @yhuai
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/adrian-wang/spark
GitHub user adrian-wang opened a pull request:
https://github.com/apache/spark/pull/6318
[SPARK-7790] [SQL] date and decimal conversion for dynamic partition key
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/adrian-wang/spark
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6233#issuecomment-103087582
retest this please.
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6233#issuecomment-103141080
cc @rxin @davies
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6230#discussion_r30529447
--- Diff: python/pyspark/sql/context.py ---
@@ -122,6 +122,26 @@ def udf(self):
Returns a :class:`UDFRegistration` for UDF registration
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6233#issuecomment-103143436
but there are bugs there
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6230#discussion_r30568130
--- Diff: python/pyspark/sql/context.py ---
@@ -122,6 +122,26 @@ def udf(self):
Returns a :class:`UDFRegistration` for UDF registration
Github user adrian-wang closed the pull request at:
https://github.com/apache/spark/pull/6233
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6233#issuecomment-10014
I was wrong, the type thing is not a bug. I'm closing this.
GitHub user adrian-wang opened a pull request:
https://github.com/apache/spark/pull/6233
add range() api
This PR is based on #6230, thanks @davies.
Closes #6230
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/adrian
Github user adrian-wang closed the pull request at:
https://github.com/apache/spark/pull/6081
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6081#discussion_r30458848
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala ---
@@ -1126,6 +1126,57 @@ class SQLContext(@transient val sparkContext
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6081#issuecomment-102574463
@davies, if we use parallelize to do this, we have to generate the whole
seq in driver side, which could be a nightmare if the data size is huge...
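The concern voiced above (with `parallelize`, the driver must materialize the entire sequence before distributing it) can be avoided by having the driver compute only per-partition bounds and letting each worker generate its own slice lazily. An illustrative sketch of that idea, not Spark's actual `range()` implementation:

```python
def partition_ranges(start, end, step, num_partitions):
    """Split the sequence start, start+step, ... (< end) into per-partition
    (lo, hi) bounds. The driver only computes these bounds; each worker
    would then generate its slice with range(lo, hi, step) on demand."""
    total = max(0, (end - start + step - 1) // step)  # element count
    ranges = []
    for i in range(num_partitions):
        lo = start + (i * total // num_partitions) * step
        hi = start + ((i + 1) * total // num_partitions) * step
        ranges.append((lo, hi))
    return ranges
```

The driver-side cost is O(num_partitions) regardless of how many elements the sequence contains, which is the point of the comment above.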
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6081#discussion_r30458857
--- Diff: python/pyspark/sql/context.py ---
@@ -427,6 +428,31 @@ def func(iterator):
df = self._ssql_ctx.jsonRDD(jrdd.rdd
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6081#issuecomment-102300561
Will do, Thanks!
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/3820#discussion_r30381187
--- Diff: sql/core/pom.xml ---
@@ -69,6 +69,11 @@
<version>2.3.0</version>
</dependency>
<dependency>
+ <groupId>org.jodd
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6081#discussion_r30380936
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -689,6 +689,58 @@ class SparkContext(config: SparkConf) extends Logging
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6081#discussion_r30299808
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -689,6 +689,64 @@ class SparkContext(config: SparkConf) extends Logging
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6081#discussion_r30301057
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -689,6 +689,64 @@ class SparkContext(config: SparkConf) extends Logging
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6081#discussion_r30301275
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -689,6 +689,64 @@ class SparkContext(config: SparkConf) extends Logging
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6081#discussion_r30302022
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -689,6 +689,64 @@ class SparkContext(config: SparkConf) extends Logging
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6081#discussion_r30300445
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -689,6 +689,64 @@ class SparkContext(config: SparkConf) extends Logging
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6081#discussion_r30300456
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -689,6 +689,64 @@ class SparkContext(config: SparkConf) extends Logging
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6081#discussion_r30299864
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -689,6 +689,64 @@ class SparkContext(config: SparkConf) extends Logging
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6081#discussion_r30299903
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -689,6 +689,64 @@ class SparkContext(config: SparkConf) extends Logging
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6081#discussion_r30300491
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -689,6 +689,64 @@ class SparkContext(config: SparkConf) extends Logging
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6081#discussion_r30300169
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -689,6 +689,64 @@ class SparkContext(config: SparkConf) extends Logging
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/6081#discussion_r30300624
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -689,6 +689,64 @@ class SparkContext(config: SparkConf) extends Logging
Github user adrian-wang commented on a diff in the pull request:
https://github.com/apache/spark/pull/3820#discussion_r30382532
--- Diff: sql/core/pom.xml ---
@@ -69,6 +69,11 @@
<version>2.3.0</version>
</dependency>
<dependency>
+ <groupId>org.jodd
Github user adrian-wang commented on the pull request:
https://github.com/apache/spark/pull/6081#issuecomment-101903785
@rxin for the more general range() api, should I change the API both in
SQLContext and SparkContext, or just generalize it in SparkContext and keep
`range(n: Long