I have been searching but have not found a solution for querying dates
stored as UTC milliseconds since the epoch. The schema I have pulled in
from a NoSQL datasource (JSON from MongoDB) has the target date as:

 |-- dateCreated: struct (nullable = true)
 |    |-- $date: long (nullable = true)

and my goal is to write queries along the lines of:

SELECT COUNT(*) FROM myTable WHERE dateCreated BETWEEN [dateStoredAsLong0]
AND [dateStoredAsLong1]

Of course this would be wrapped in the Spark-specific
sqlContext.sql("SELECT myStuff BLAH BLAH").collect() call.

I am new to both Scala and Spark, so forgive me if this is an elementary
question, but my searches have turned up empty.

Thank you.


