I think once you have your table as a DataFrame, you can filter it with
something like:
val t = sqlContext.sql("select * from table t")
val df = t.filter(t("a").contains(t("b")))
Let us know the results.
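In case it helps, here is a self-contained version of that idea. The table and column names ("a" and "b") are made up for illustration; swap in your own sqlContext and schema:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Hypothetical local setup -- in your job, reuse the sqlContext you already have.
val sc = new SparkContext(
  new SparkConf().setAppName("contains-filter").setMaster("local[*]"))
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._

// Toy rows: we want to keep rows where column "a" contains the value of "b".
val df = sc.parallelize(Seq(("abc_x", "abc"), ("def_x", "xyz"))).toDF("a", "b")

// Column.contains accepts another Column, so this compares row by row.
val matched = df.filter(df("a").contains(df("b")))
matched.show()
```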
2015-09-12 10:45 GMT+01:00 liam :
OK, I got another way. It looks silly and inefficient, but it works.
tradeDF.registerTempTable(tradeTab);
orderDF.registerTempTable(orderTab);
//orderId = tid + "_x"
String sql1 = "select * from " + tradeTab + " a, " + orderTab
    + " b where substr(b.orderId,1,15) = substr(a.tid,1)";
concat and locate are available as of version 1.5.0, according to the
Scaladocs. For earlier versions of Spark, and for the operations that are
still not supported, it's pretty straightforward to define your own
UserDefinedFunctions in either Scala or Java (I don't know about other
languages).
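For example, a UDF mirroring the substr comparison above (true when orderId is tid plus a suffix such as "_x") could look roughly like this in Scala. All names here are made up for illustration; adapt them to your schema:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Hypothetical local setup -- in a real job, reuse your existing sqlContext.
val sc = new SparkContext(
  new SparkConf().setAppName("udf-example").setMaster("local[*]"))
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._

// Register a UDF usable from SQL: does orderId start with tid?
sqlContext.udf.register("starts_with_tid",
  (orderId: String, tid: String) =>
    orderId != null && tid != null && orderId.startsWith(tid))

// Toy tables standing in for tradeDF and orderDF.
val trades = sc.parallelize(Seq("T001")).toDF("tid")
val orders = sc.parallelize(Seq("T001_x", "T999_x")).toDF("orderId")
trades.registerTempTable("trades")
orders.registerTempTable("orders")

sqlContext.sql(
  "select * from trades a, orders b where starts_with_tid(b.orderId, a.tid)"
).show()
```

The same registration works from Java via sqlContext.udf().register with a UDF2.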