Could you try casting the timestamp to long?
Internally, timestamps are stored as microseconds in UTC; if you cast one
to long, you will get seconds in UTC.
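To illustrate the behavior described above (this is plain Python, not Spark, and the sample timestamp is made up): Spark keeps the value as microseconds since the epoch in UTC, and the long cast truncates that to whole seconds.

```python
from datetime import datetime, timezone

# A sample timestamp with sub-second precision, interpreted as UTC
ts = datetime(2016, 3, 17, 13, 28, 0, 500000, tzinfo=timezone.utc)

# Microseconds since the epoch -- analogous to Spark's internal
# TimestampType representation
micros = int(ts.timestamp() * 1_000_000)

# Casting to long drops the fractional part, leaving whole UTC seconds
seconds = micros // 1_000_000
print(seconds)  # 1458221280
```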
On Thu, Mar 17, 2016 at 1:28 PM, Andy Davidson <
a...@santacruzintegration.com> wrote:
I am using python spark 1.6 and the --packages
datastax:spark-cassandra-connector:1.6.0-M1-s_2.10
I need to convert a timestamp string into a Unix epoch timestamp. The
unix_timestamp() function assumes the current (local) time zone; however,
my string data is in UTC and encodes the time zone as a zero offset.
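For reference, here is what parsing such a string correctly looks like in plain Python (not Spark; the sample string and its format are assumptions): attaching the explicit zero offset via %z keeps the conversion in UTC, independent of the machine's local time zone.

```python
from datetime import datetime

# Hypothetical sample string with an explicit zero (UTC) offset
s = "2016-03-17T13:28:00+0000"

# %z parses the "+0000" offset, so the resulting datetime is
# timezone-aware and the local time zone is never consulted
dt = datetime.strptime(s, "%Y-%m-%dT%H:%M:%S%z")

# Seconds since the Unix epoch, in UTC
epoch = int(dt.timestamp())
print(epoch)  # 1458221280
```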