I think after you get your table as a DataFrame, you can do a filter over it, something like:
    val t = sqlContext.sql("select * from table t")
    val df = t.filter(t("a").contains(t("b")))

Let us know the results.

2015-09-12 10:45 GMT+01:00 liam <liaml...@gmail.com>:

> OK, I got another way; it looks silly and inefficient, but it works.
>
>     tradeDF.registerTempTable(tradeTab);
>     orderDF.registerTempTable(orderTab);
>
>     // orderId = tid + "_x"
>     String sql1 = "select * from " + tradeTab + " a, " + orderTab + " b where substr(b.orderId,1,15) = substr(a.tid,1) ";
>     String sql2 = "select * from " + tradeTab + " a, " + orderTab + " b where substr(b.orderId,1,16) = substr(a.tid,1) ";
>     String sql3 = "select * from " + tradeTab + " a, " + orderTab + " b where substr(b.orderId,1,17) = substr(a.tid,1) ";
>
>     DataFrame combinDF = sqlContext.sql(sql1).unionAll(sqlContext.sql(sql2)).unionAll(sqlContext.sql(sql3));
>
> As I tried:
>
>     substr(b.orderId,1,length(a.tid)) = a.tid   -> no length available
>     b.orderId like concat(a.tid,'%')            -> no concat available
>     instr(b.orderId,a.tid) > 0                  -> no instr available
>     locate(a.tid,b.orderId) > 0                 -> no locate available
>     ......                                      -> no ......
>
> 2015-09-12 13:49 GMT+08:00 Richard Eggert <richard.egg...@gmail.com>:
>
>> concat and locate are available as of version 1.5.0, according to the
>> Scaladocs. For earlier versions of Spark, and for the operations that are
>> still not supported, it's pretty straightforward to define your own
>> UserDefinedFunctions in either Scala or Java (I don't know about other
>> languages).
>>
>> On Sep 11, 2015 10:26 PM, "liam" <liaml...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> Imagine this: the value of one column is a substring of another column.
>>> In Oracle I have many ways to write the query, like the statement below,
>>> but how do I do it in Spark SQL, since there is no concat(), instr(),
>>> locate()...?
>>>
>>>     select * from table t where t.a like '%'||t.b||'%';
>>>
>>> Thanks.
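
For earlier Spark versions, Richard's UDF suggestion could look something like the sketch below. This is a minimal sketch only, assuming Spark 1.3+ with an existing SQLContext and the temp tables registered as in the thread; the UDF name starts_with is a hypothetical choice, not a built-in, and the code needs a running Spark context to execute.

    import org.apache.spark.sql.DataFrame

    // Register a UDF that tests whether one string starts with another,
    // standing in for the instr()/locate() functions missing before 1.5.0.
    sqlContext.udf.register("starts_with", (s: String, prefix: String) =>
      s != null && prefix != null && s.startsWith(prefix))

    // Then the three unioned substr() queries collapse into a single join:
    val combined: DataFrame = sqlContext.sql(
      "select * from " + tradeTab + " a, " + orderTab +
      " b where starts_with(b.orderId, a.tid)")

Note this is still a Cartesian-product join underneath (the predicate is not an equi-join), so it will not be fast on large tables, but it avoids hard-coding the length of tid.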