Wildcards in Spark SQL

2015-09-02 Thread Hafiz Mujadid
Hi

Does Spark SQL support wildcards for filtering data in SQL queries, just as we
can filter data in an RDBMS with wildcards like % and _? In other words, how
can I write the following query in Spark SQL?

select * from employee where ename like 'a%d'

thanks






Re: Wildcards in Spark SQL

2015-09-02 Thread Michael Armbrust
That query should work.
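
For example, here is a minimal, untested sketch against the Spark 1.x Scala API
(the sample data, app name, and local master are placeholders standing in for a
real employee table):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val sc = new SparkContext(new SparkConf().setAppName("like-example").setMaster("local[*]"))
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._

// Hypothetical sample data standing in for the real employee table.
val employee = Seq("arnold", "ahmad", "bob").map(Tuple1.apply).toDF("ename")
employee.registerTempTable("employee")

// The query from the original post, unchanged; 'a%d' matches names that
// start with 'a' and end with 'd', e.g. 'ahmad'.
sqlContext.sql("select * from employee where ename like 'a%d'").show()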

On Wed, Sep 2, 2015 at 1:50 PM, Hafiz Mujadid <hafizmujadi...@gmail.com>
wrote:

> Hi
>
> Does Spark SQL support wildcards for filtering data in SQL queries, just as
> we can filter data in an RDBMS with wildcards like % and _? In other words,
> how can I write the following query in Spark SQL?
>
> select * from employee where ename like 'a%d'
>
> thanks


Re: Wildcards in Spark SQL

2015-09-02 Thread Anas Sherwani
Yes, Spark SQL does support wildcards. The query you have written should work
as-is, provided the type of ename is string. You can find all the keywords and
a few supported functions at
http://docs.datastax.com/en/datastax_enterprise/4.6/datastax_enterprise/spark/sparkSqlSupportedSyntax.html
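
The same filter can also be expressed through the DataFrame API. A small sketch
(Spark 1.x), assuming an employee DataFrame with a string column ename as in the
question; the standard SQL LIKE wildcards are % (any sequence of characters) and
_ (exactly one character):

import org.apache.spark.sql.functions.col

// 'a%d': starts with 'a', ends with 'd' (any characters in between).
employee.filter(col("ename").like("a%d")).show()

// 'a___': exactly four characters starting with 'a'.
employee.filter(col("ename").like("a___")).show()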
  


