[GitHub] spark issue #22054: [SPARK-24703][SQL]: To add support to multiply CalendarI...

2018-11-01 Thread priyankagargnitk
Github user priyankagargnitk commented on the issue:

https://github.com/apache/spark/pull/22054
  
Please review this PR.


---




[GitHub] spark issue #22054: [SPARK-24703][SQL]: To add support to multiply CalendarI...

2018-09-11 Thread priyankagargnitk
Github user priyankagargnitk commented on the issue:

https://github.com/apache/spark/pull/22054
  
Please review this PR.


---




[GitHub] spark issue #22054: [SPARK-24703][SQL]: To add support to multiply CalendarI...

2018-08-17 Thread priyankagargnitk
Github user priyankagargnitk commented on the issue:

https://github.com/apache/spark/pull/22054
  
Please review this PR.


---




[GitHub] spark pull request #22054: [SPARK-24703][SQL]: To add support to multiply Ca...

2018-08-09 Thread priyankagargnitk
GitHub user priyankagargnitk opened a pull request:

https://github.com/apache/spark/pull/22054

 [SPARK-24703][SQL]: To add support to multiply CalendarInterval with 
Integral Type.

## What changes were proposed in this pull request?

This change adds the capability to multiply a CalendarInterval by an integral type.

Earlier, the multiplication threw an exception, as follows:
spark.sql("select interval '1' day * 3").show()

org.apache.spark.sql.AnalysisException: cannot resolve '(interval 1 days * 3)' due to data type mismatch: differing types in '(interval 1 days) * 3' (int and calendarinterval).; line 1 pos 7;
'Project [unresolvedalias((interval 1 days * 3), None)]
+- OneRowRelation

at org.apache.spark.sql.catalyst.analysis.package$.failAnalysis(package.scala:42)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$...applyOrElse(CheckAnalysis.scala:93)
...

Now, this support has been added.
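
For intuition, the arithmetic reduces to scaling each field of the interval. A minimal sketch in Scala, assuming the Spark 2.x CalendarInterval layout (months: Int, microseconds: Long); the helper is illustrative, not the PR's actual implementation:

    import org.apache.spark.unsafe.types.CalendarInterval

    // Scale both components of the interval. "interval 1 days" * 3 yields the
    // microseconds of 3 days, rendered as "interval 3 days"; 9 days renders as
    // "interval 1 weeks 2 days", matching the shell output below.
    // (A production version would guard against overflow, e.g. Math.multiplyExact.)
    def multiplyInterval(interval: CalendarInterval, factor: Int): CalendarInterval =
      new CalendarInterval(interval.months * factor, interval.microseconds * factor)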

## How was this patch tested?

Added test cases in CalendarIntervalSuite.java, ArithmeticExpressionSuite.scala, and ExpressionTypeCheckingSuite.scala.
Also tested in spark-shell by multiplying a CalendarInterval by an integral type:

scala> spark.sql("select interval '1' day").show()
+---------------+
|interval 1 days|
+---------------+
|interval 1 days|
+---------------+


scala> spark.sql("select interval '1' day * 3").show()
+---------------------+
|(interval 1 days * 3)|
+---------------------+
|      interval 3 days|
+---------------------+


scala> spark.sql("select 3 * interval '1' day * 3").show()
+---------------------------+
|((3 * interval 1 days) * 3)|
+---------------------------+
|       interval 1 weeks ...|
+---------------------------+


scala> spark.sql("select 3 * interval '1' day * 3").collect()
res7: Array[org.apache.spark.sql.Row] = Array([interval 1 weeks 2 days])



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/priyankagargnitk/spark SPARK-24703

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/22054.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #22054


commit 31312c86f83fbd0aaf6d0d4a706d3469792a23df
Author: Priyanka Garg 
Date:   2018-08-09T10:24:27Z

 [SPARK-24703][SQL]: To add support to multiply CalendarInterval with 
Integral Type.
## What changes were proposed in this pull request?

This change adds the capability to multiply a CalendarInterval by an integral type.

Earlier, the multiplication threw an exception, as follows:
spark.sql("select interval '1' day * 3").show()

org.apache.spark.sql.AnalysisException: cannot resolve '(interval 1 days * 3)' due to data type mismatch: differing types in '(interval 1 days) * 3' (int and calendarinterval).; line 1 pos 7;
'Project [unresolvedalias((interval 1 days * 3), None)]
+- OneRowRelation

at org.apache.spark.sql.catalyst.analysis.package$.failAnalysis(package.scala:42)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$...applyOrElse(CheckAnalysis.scala:93)
...

Now, this support has been added.

## How was this patch tested?

Added test cases in CalendarIntervalSuite.java, ArithmeticExpressionSuite.scala, and ExpressionTypeCheckingSuite.scala.
Also tested in spark-shell by multiplying a CalendarInterval by an integral type.




---




[GitHub] spark issue #21679: [SPARK-24695] [SQL]: To add support to return Calendar i...

2018-07-06 Thread priyankagargnitk
Github user priyankagargnitk commented on the issue:

https://github.com/apache/spark/pull/21679
  
What if I make changes to expose it?



---




[GitHub] spark issue #21679: [SPARK-24695] [SQL]: To add support to return Calendar i...

2018-07-02 Thread priyankagargnitk
Github user priyankagargnitk commented on the issue:

https://github.com/apache/spark/pull/21679
  
org.apache.spark.unsafe.types.CalendarInterval is already public; am I missing something?
Also, what if I want to do some computation on any data type and return a CalendarInterval? How should I solve this problem in the current scenario?


---




[GitHub] spark pull request #21679: SPARK-24695: To add support to return Calendar in...

2018-06-30 Thread priyankagargnitk
GitHub user priyankagargnitk opened a pull request:

https://github.com/apache/spark/pull/21679

SPARK-24695: To add support to return Calendar interval from udf.

## What changes were proposed in this pull request?

This change adds the capability to return a CalendarInterval from a UDF.

Earlier, a UDF of type (String => CalendarInterval) threw an exception stating:

Schema for type org.apache.spark.unsafe.types.CalendarInterval is not supported
java.lang.UnsupportedOperationException: Schema for type org.apache.spark.unsafe.types.CalendarInterval is not supported
at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$...apply(ScalaReflection.scala:781)
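
For reference, this is the kind of UDF the change is meant to enable; a sketch assuming the Spark 2.x API (CalendarInterval.fromString), with illustrative session and column values:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{lit, udf}
    import org.apache.spark.unsafe.types.CalendarInterval

    val spark = SparkSession.builder().master("local[*]").getOrCreate()

    // Before this change, schema inference for the CalendarInterval return type
    // failed with the UnsupportedOperationException shown above.
    val toInterval = udf((s: String) => CalendarInterval.fromString(s))
    spark.range(1).select(toInterval(lit("interval 1 days"))).show()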

## How was this patch tested?

Added test cases in ScalaReflectionSuite.scala and ExpressionEncoderSuite.scala.
Also tested by creating a UDF that returns a CalendarInterval.

JIRA entry for details: https://issues.apache.org/jira/browse/SPARK-24695

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/priyankagargnitk/spark SPARK-24695

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/21679.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #21679


commit bd805299cc9802597c165a6de1667a7b02ad48ae
Author: Priyanka Garg 
Date:   2018-06-30T07:23:56Z

SPARK-24695: To add support to return CalendarInterval from UDF.

## What changes were proposed in this pull request?

This change adds the capability to return a CalendarInterval from a UDF.

Earlier, a UDF of type (String => CalendarInterval) threw an exception stating:

Schema for type org.apache.spark.unsafe.types.CalendarInterval is not supported
java.lang.UnsupportedOperationException: Schema for type org.apache.spark.unsafe.types.CalendarInterval is not supported
at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$...apply(ScalaReflection.scala:781)

## How was this patch tested?

Added test cases in ScalaReflectionSuite.scala and ExpressionEncoderSuite.scala.
Also tested by creating a UDF that returns a CalendarInterval.

JIRA entry for details: https://issues.apache.org/jira/browse/SPARK-24695




---




[GitHub] spark pull request #15479: [SPARK-17884][SQL] To resolve Null pointer except...

2016-10-27 Thread priyankagargnitk
Github user priyankagargnitk closed the pull request at:

https://github.com/apache/spark/pull/15479


---



[GitHub] spark issue #15479: [SPARK-17884][SQL] To resolve Null pointer exception whe...

2016-10-27 Thread priyankagargnitk
Github user priyankagargnitk commented on the issue:

https://github.com/apache/spark/pull/15479
  
Yeah, closing this.


---



[GitHub] spark pull request #15609: [SPARK-18048][SQL] To make behaviour of If consis...

2016-10-24 Thread priyankagargnitk
Github user priyankagargnitk closed the pull request at:

https://github.com/apache/spark/pull/15609


---



[GitHub] spark issue #15609: [SPARK-18048][SQL] To make behaviour of If consistent, i...

2016-10-24 Thread priyankagargnitk
Github user priyankagargnitk commented on the issue:

https://github.com/apache/spark/pull/15609
  
Sure, I am closing it. If I still face any issues, I will raise another. Thanks for the help.


---



[GitHub] spark issue #15609: [SPARK-18048][SQL] To make behaviour of If consistent, i...

2016-10-24 Thread priyankagargnitk
Github user priyankagargnitk commented on the issue:

https://github.com/apache/spark/pull/15609
  
Hmm... I think that makes sense. I am trying a couple more things. Thanks.


---



[GitHub] spark issue #15609: [SPARK-18048][SQL] To make behaviour of If consistent, i...

2016-10-24 Thread priyankagargnitk
Github user priyankagargnitk commented on the issue:

https://github.com/apache/spark/pull/15609
  
Actually, our use case is a little different: we are not invoking it with SELECT queries. We let users type expressions in JS, and we have built our own layer on top of Spark that converts that JS into Spark expressions. So if a user writes an expression that returns a date in the true branch and a timestamp in the false branch, the If expression fails, because it internally gets mapped to If(Literal.create(true, BooleanType), Literal.create(identity(1), DateType), Literal.create(identity(2L), TimestampType)), which fails.


---



[GitHub] spark issue #15609: [SPARK-18048][SQL] To make behaviour of If consistent, i...

2016-10-24 Thread priyankagargnitk
Github user priyankagargnitk commented on the issue:

https://github.com/apache/spark/pull/15609
  
Actually, it's not covered by type widening. So if any of my expressions calls the If expression (nested), it fails because of this issue.


---



[GitHub] spark pull request #15609: [SPARK-18048][SQL] To make behaviour of If consis...

2016-10-23 Thread priyankagargnitk
GitHub user priyankagargnitk opened a pull request:

https://github.com/apache/spark/pull/15609

[SPARK-18048][SQL] To make behaviour of If consistent, in case of true 
expression and false expression are of compatible data types.

## What changes were proposed in this pull request?
This change adds a type conversion from the false value's data type to the true value's data type.

Earlier, the expression If(Literal.create(true, BooleanType), Literal.create(identity(1), DateType), Literal.create(identity(2L), TimestampType)) threw an exception, while the expression If(Literal.create(true, BooleanType), Literal.create(identity(1L), TimestampType), Literal.create(identity(2), DateType)) worked fine. So, if we interchange the true and false expressions, the behaviour of the If expression changes.
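
A minimal sketch of the idea behind the fix: coerce the branches to a common type instead of depending on their order. Cast.canCast is catalyst's existing check; the helper itself is illustrative, not the actual patch:

    import org.apache.spark.sql.catalyst.expressions.{Cast, Expression, If}

    // If the branch types differ but the false branch can be cast to the true
    // branch's type, insert the Cast so argument order no longer matters.
    def coerceIf(pred: Expression, t: Expression, f: Expression): Expression =
      if (t.dataType == f.dataType) If(pred, t, f)
      else if (Cast.canCast(f.dataType, t.dataType)) If(pred, t, Cast(f, t.dataType))
      else If(pred, t, f) // leave it to the analyzer to report a real mismatch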

## How was this patch tested?
Added a test case in ConditionalExpressionSuite.scala.

JIRA entry for details: https://issues.apache.org/jira/browse/SPARK-18048

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/priyankagargnitk/spark SPARK-18048

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/15609.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #15609


commit 9a8ef85c9e8924a44b4e7d5639f259c950d03a9c
Author: prigarg 
Date:   2016-10-24T06:34:58Z

[SPARK-18048][SQL] To make behaviour of If consistent, in case of true 
expression and false expression are of compatible data types.

## What changes were proposed in this pull request?
This change adds a type conversion from the false value's data type to the true value's data type.

Earlier, the expression If(Literal.create(true, BooleanType), Literal.create(identity(1), DateType), Literal.create(identity(2L), TimestampType)) threw an exception, while the expression If(Literal.create(true, BooleanType), Literal.create(identity(1L), TimestampType), Literal.create(identity(2), DateType)) worked fine. So, if we interchange the true and false expressions, the behaviour of the If expression changes.

## How was this patch tested?
Added a test case in ConditionalExpressionSuite.scala.

JIRA entry for details: https://issues.apache.org/jira/browse/SPARK-18048




---



[GitHub] spark issue #15449: [SPARK-17884][SQL] To resolve Null pointer exception whe...

2016-10-13 Thread priyankagargnitk
Github user priyankagargnitk commented on the issue:

https://github.com/apache/spark/pull/15449
  
Done. PR #15479


---



[GitHub] spark pull request #15479: [SPARK-17884][SQL] To resolve Null pointer except...

2016-10-13 Thread priyankagargnitk
GitHub user priyankagargnitk opened a pull request:

https://github.com/apache/spark/pull/15479

[SPARK-17884][SQL] To resolve Null pointer exception when casting from  
empty string to interval type

## What changes were proposed in this pull request?
This change adds a check in the castToInterval method of the Cast expression, such that if the converted value is null, the isNull variable is set to true.

Earlier, the expression Cast(Literal(""), CalendarIntervalType) threw a NullPointerException for exactly this reason.
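
The essence of the fix, as a sketch: CalendarInterval.fromString returns null for unparseable input such as the empty string, and the cast has to surface that as SQL NULL instead of dereferencing it (illustrative, not the patch itself):

    import org.apache.spark.unsafe.types.CalendarInterval

    val parsed = CalendarInterval.fromString("") // null: "" is not a valid interval
    // Before the fix, this null flowed onward and was dereferenced (NPE);
    // with the fix, the cast sets its isNull flag and yields SQL NULL instead.
    val result: CalendarInterval = if (parsed == null) null else parsed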

## How was this patch tested?
Added a test case in CastSuite.scala.

JIRA entry for details: https://issues.apache.org/jira/browse/SPARK-17884

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/priyankagargnitk/spark cast_empty_string_bug

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/15479.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #15479


commit 6bdbe7d31f7b01727858fe10593b35ac1b2dafad
Author: prigarg 
Date:   2016-10-12T17:14:45Z

[SPARK-17884][SQL] To resolve Null pointer exception when casting from 
empty string to interval type.

## What changes were proposed in this pull request?
This change adds a check in the castToInterval method of the Cast expression, such that if the converted value is null, the isNull variable is set to true.

Earlier, the expression Cast(Literal(""), CalendarIntervalType) threw a NullPointerException for exactly this reason.

## How was this patch tested?
Added a test case in CastSuite.scala.

JIRA entry for details: https://issues.apache.org/jira/browse/SPARK-17884

Author: prigarg 

Closes #15449 from priyankagargnitk/SPARK-17884.




---



[GitHub] spark issue #15449: [SPARK-17884][SQL] To resolve Null pointer exception whe...

2016-10-13 Thread priyankagargnitk
Github user priyankagargnitk commented on the issue:

https://github.com/apache/spark/pull/15449
  
Hi rxin, can we merge the same change into branch 1.6 as well? We are still using Spark 1.6 and need this change.


---



[GitHub] spark issue #15449: [SPARK-17884][SQL] To resolve Null pointer exception whe...

2016-10-12 Thread priyankagargnitk
Github user priyankagargnitk commented on the issue:

https://github.com/apache/spark/pull/15449
  
Thanks rxin


---



[GitHub] spark pull request #15449: [SPARK-17884][SQL] To resolve Null pointer except...

2016-10-12 Thread priyankagargnitk
GitHub user priyankagargnitk opened a pull request:

https://github.com/apache/spark/pull/15449

[SPARK-17884][SQL] To resolve Null pointer exception when casting from 
empty string to interval type.

## What changes were proposed in this pull request?
This change adds a check in the castToInterval method of the Cast expression, such that if the converted value is null, the isNull variable is set to true.

Earlier, the expression Cast(Literal(""), CalendarIntervalType) threw a NullPointerException for exactly this reason.

## How was this patch tested?
Added a test case in CastSuite.scala.

JIRA entry for details: https://issues.apache.org/jira/browse/SPARK-17884

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/priyankagargnitk/spark SPARK-17884

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/15449.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #15449


commit 9dc99adecfadb60ef402f68e956a0f94734b1226
Author: prigarg 
Date:   2016-10-12T08:51:13Z

[SPARK-17884][SQL] To resolve Null pointer exception when  casting from 
empty string to interval type.

## What changes were proposed in this pull request?
This change adds a check in the castToInterval method of the Cast expression, such that if the converted value is null, the isNull variable is set to true.

Earlier, the expression Cast(Literal(""), CalendarIntervalType) threw a NullPointerException for exactly this reason.

## How was this patch tested?
Added a test case in CastSuite.scala.

JIRA entry for details: https://issues.apache.org/jira/browse/SPARK-17884




---



[GitHub] spark issue #15294: [SPARK-17619][SQL] To add support for pattern matching i...

2016-09-30 Thread priyankagargnitk
Github user priyankagargnitk commented on the issue:

https://github.com/apache/spark/pull/15294
  
Reverted the previous changes that I made in ArrayContains; a new expression, ArrayContainsWithPatternMatch, is added instead.


---



[GitHub] spark pull request #15294: [SPARK-17619][SQL] To add support for pattern mat...

2016-09-29 Thread priyankagargnitk
Github user priyankagargnitk commented on a diff in the pull request:

https://github.com/apache/spark/pull/15294#discussion_r81103845
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
 ---
@@ -191,11 +193,15 @@ case class SortArray(base: Expression, 
ascendingOrder: Expression)
 }
 
 /**
- * Checks if the array (left) has the element (right)
+ * Checks if the array (left) has the element (right) and pattern match in
+ * case left is Array of type string
  */
+
 @ExpressionDescription(
-  usage = "_FUNC_(array, value) - Returns TRUE if the array contains the 
value.",
-  extended = " > SELECT _FUNC_(array(1, 2, 3), 2);\n true")
+  usage = """_FUNC_(array, value) - Returns TRUE if the array contains the 
value or
--- End diff --

So, in that case, can we add one more expression, something like ArrayContainsWithPatternMatch?


---



[GitHub] spark pull request #15294: [SPARK-17619][SQL] To add support for pattern mat...

2016-09-29 Thread priyankagargnitk
GitHub user priyankagargnitk opened a pull request:

https://github.com/apache/spark/pull/15294

[SPARK-17619][SQL] To add support for pattern matching in ArrayContains 
expression

## What changes were proposed in this pull request?
This change adds support for pattern matching in the arrayContains expression for string arrays. For example:
a. arrayContains(Seq("\\d\\d\\s-\\s\\d\\d", null, "", "pattern"), "12 - 20") returns true
b. arrayContains(Seq("\\d\\d\\s-\\s\\d\\d", "", "pattern"), "132 - 20") returns false
c. arrayContains(Seq("\\d\\d\\s-\\s\\d\\d", null, "", "pattern"), "132 - 20") returns null

This change is completely backward compatible.
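
A sketch of the matching semantics implied by the examples above: each string element is tried as a regex against the probe value, with SQL three-valued null handling. The helper name and the Option encoding of SQL NULL are illustrative:

    // Some(true): a pattern matched; Some(false): no match and no nulls;
    // None (SQL NULL): no match, but a null element might have matched.
    def arrayContainsWithPatternMatch(arr: Seq[String], value: String): Option[Boolean] =
      if (arr.exists(p => p != null && value.matches(p))) Some(true)
      else if (arr.contains(null)) None
      else Some(false)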

## How was this patch tested?
Added more test cases for the pattern match use case in the following:
 a. CollectionFunctionsSuite.scala
 b. DataFrameFunctionsSuite.scala
 c. ExpressionToSQLSuite.scala

JIRA entry for details: https://issues.apache.org/jira/browse/SPARK-17619

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/priyankagargnitk/spark 
array_contains_with_pattern_match

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/15294.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #15294


commit 69f63092ddcb04ea10186d71d407b4886dea3245
Author: Priyanka Garg 
Date:   2016-09-29T08:13:08Z

[SPARK-17619][SQL] To add support for pattern matching in ArrayContains 
Expression.

## What changes were proposed in this pull request?
This change adds support for pattern matching in the arrayContains expression for string arrays. For example:
a. arrayContains(Seq("\\d\\d\\s-\\s\\d\\d", null, "", "pattern"), "12 - 20") returns true
b. arrayContains(Seq("\\d\\d\\s-\\s\\d\\d", "", "pattern"), "132 - 20") returns false
c. arrayContains(Seq("\\d\\d\\s-\\s\\d\\d", null, "", "pattern"), "132 - 20") returns null

This change is completely backward compatible.

## How was this patch tested?
Added more test cases for the pattern match use case in the following:
 a. CollectionFunctionsSuite.scala
 b. DataFrameFunctionsSuite.scala
 c. ExpressionToSQLSuite.scala

JIRA entry for details: https://issues.apache.org/jira/browse/SPARK-17619




---