[jira] [Updated] (SPARK-8420) Inconsistent behavior with Dataframe Timestamp between 1.3.1 and 1.4.0

2015-06-22 Thread Michael Armbrust (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-8420?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust updated SPARK-8420:

Labels:   (was: releasenotes)

 Inconsistent behavior with Dataframe Timestamp between 1.3.1 and 1.4.0
 --

 Key: SPARK-8420
 URL: https://issues.apache.org/jira/browse/SPARK-8420
 Project: Spark
  Issue Type: Bug
  Components: SQL
Affects Versions: 1.4.0
Reporter: Justin Yip
Assignee: Michael Armbrust
Priority: Blocker
 Fix For: 1.4.1, 1.5.0


 I am trying out 1.4.0 and noticed some differences in Timestamp behavior 
 between 1.3.1 and 1.4.0.
 In 1.3.1, I can compare a Timestamp with a string.
 {code}
 scala> val df = sqlContext.createDataFrame(Seq(
   (1, Timestamp.valueOf("2015-01-01 00:00:00")),
   (2, Timestamp.valueOf("2014-01-01 00:00:00"))))
 ...
 scala> df.filter($"_2" <= "2014-06-01").show
 ...
 _1 _2
 2  2014-01-01 00:00:...
 {code}
 However, in 1.4.0, the filter is always false:
 {code}
 scala> val df = sqlContext.createDataFrame(Seq(
   (1, Timestamp.valueOf("2015-01-01 00:00:00")),
   (2, Timestamp.valueOf("2014-01-01 00:00:00"))))
 df: org.apache.spark.sql.DataFrame = [_1: int, _2: timestamp]

 scala> df.filter($"_2" <= "2014-06-01").show
 +--+--+
 |_1|_2|
 +--+--+
 +--+--+
 {code}
 I am not sure whether this is intended, but I cannot find any documentation 
 mentioning this inconsistency.
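 A possible workaround (my own sketch, not from the ticket) is to make the 
 comparison type explicit, so the string literal is cast to a timestamp 
 rather than relying on an implicit cast. This assumes the standard 
 Spark SQL `Column` API (`lit`, `cast`) available in 1.4:
 {code}
 import java.sql.Timestamp
 import org.apache.spark.sql.functions.lit

 // Cast the string side explicitly to timestamp before comparing:
 df.filter($"_2" <= lit("2014-06-01 00:00:00").cast("timestamp")).show()

 // Or compare against a java.sql.Timestamp value directly:
 df.filter($"_2" <= Timestamp.valueOf("2014-06-01 00:00:00")).show()
 {code}
 Either form sidesteps the string-vs-timestamp comparison whose semantics 
 changed between the two releases.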



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-8420) Inconsistent behavior with Dataframe Timestamp between 1.3.1 and 1.4.0

2015-06-19 Thread Michael Armbrust (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-8420?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust updated SPARK-8420:

Shepherd: Yin Huai



[jira] [Updated] (SPARK-8420) Inconsistent behavior with Dataframe Timestamp between 1.3.1 and 1.4.0

2015-06-19 Thread Yin Huai (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-8420?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yin Huai updated SPARK-8420:

Fix Version/s: 1.5.0




[jira] [Updated] (SPARK-8420) Inconsistent behavior with Dataframe Timestamp between 1.3.1 and 1.4.0

2015-06-18 Thread Xiangrui Meng (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-8420?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiangrui Meng updated SPARK-8420:
-
Assignee: Yin Huai




[jira] [Updated] (SPARK-8420) Inconsistent behavior with Dataframe Timestamp between 1.3.1 and 1.4.0

2015-06-18 Thread Michael Armbrust (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-8420?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust updated SPARK-8420:

Labels: releasenotes  (was: )




[jira] [Updated] (SPARK-8420) Inconsistent behavior with Dataframe Timestamp between 1.3.1 and 1.4.0

2015-06-17 Thread Yin Huai (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-8420?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yin Huai updated SPARK-8420:

Target Version/s: 1.4.1, 1.5.0  (was: 1.4.1)




[jira] [Updated] (SPARK-8420) Inconsistent behavior with Dataframe Timestamp between 1.3.1 and 1.4.0

2015-06-17 Thread Michael Armbrust (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-8420?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust updated SPARK-8420:


Priority: Blocker  (was: Minor)
Target Version/s: 1.4.0




[jira] [Updated] (SPARK-8420) Inconsistent behavior with Dataframe Timestamp between 1.3.1 and 1.4.0

2015-06-17 Thread Michael Armbrust (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-8420?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust updated SPARK-8420:

Target Version/s: 1.4.1  (was: 1.4.0)
