Hi Tal,
I'm not sure there is currently a built-in function for it, but you can
easily define a UDF (user-defined function) by extending
org.apache.spark.sql.api.java.UDF1, registering it
(sqlContext.udf().register(...)), and then using it inside your query.
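A minimal sketch of the core logic such a UDF would wrap, assuming the row's field is a java.sql.Timestamp (the method and class names here are illustrative, not from Spark):

```java
import java.sql.Timestamp;
import java.util.concurrent.TimeUnit;

public class DateDiffUdf {
    // Core of a DATEDIFF-style function: whole days between a row's
    // timestamp and a reference instant (e.g. "now").
    public static int daysBetween(Timestamp rowTimeStamp, Timestamp reference) {
        long millis = reference.getTime() - rowTimeStamp.getTime();
        return (int) TimeUnit.MILLISECONDS.toDays(millis);
    }

    // In Spark, this body would go inside a
    // org.apache.spark.sql.api.java.UDF1<Timestamp, Integer>, registered via
    // sqlContext.udf().register("date_diff_days", udf, DataTypes.IntegerType)
    // (the UDF name "date_diff_days" is just an example), and then called as
    // date_diff_days(yourTimestampColumn) inside the SQL query.
    public static void main(String[] args) {
        Timestamp row = Timestamp.valueOf("2015-07-01 00:00:00");
        Timestamp now = Timestamp.valueOf("2015-07-21 12:00:00");
        System.out.println(daysBetween(row, now)); // prints 20
    }
}
```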
RK.
On Tue, Jul 21, 2015 at 7:04
Hi,
I'm running a query with SQLContext where one of the fields is of type
java.sql.Timestamp. I'd like to apply a function similar to DATEDIFF in
MySQL, between the date given in each row and now. If I were able to use
the same syntax as in MySQL, it would be:
val date_diff_df =
(rowTimeStamp): Int = {
  here is something you need to do;
}
------------------ Original message ------------------
From: Tal Rozen <t...@scaleka.com>
Date: Wednesday, July 22, 2015, 0:04
To: user <user@spark.apache.org>
Subject: Timestamp functions for sqlContext