Hi,

Can anyone point me to documentation for the SQL dialect supported by Spark SQL? Unlike
HQL, creating and querying tables currently involves a lot of steps, which is
quite cumbersome. If we have to fire multiple queries against tens or
hundreds of tables, it becomes very difficult at this point. Given that Spark SQL
will replace Shark, it would be awesome to have a clean way to handle DDL and
DML operations. We would like to use it for ETL and BI querying in a
seamless fashion.
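
To illustrate what I mean by cumbersome, below is a rough sketch of the
boilerplate we end up writing today with the SQLContext API, as I understand
it (the file path, case class, and table name are made up for illustration).
In HiveQL a single CREATE EXTERNAL TABLE statement plus a SELECT would cover
the same ground.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    // One case class per table, defined up front just to describe the schema.
    case class Record(id: Int, name: String)

    object TableSetupSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("sketch"))
        val sqlContext = new SQLContext(sc)
        import sqlContext.createSchemaRDD

        // Load, parse, and map the raw file into the case class,
        // then register it explicitly before any SQL can be run.
        val records = sc.textFile("hdfs:///tmp/records.csv")   // illustrative path
          .map(_.split(","))
          .map(p => Record(p(0).trim.toInt, p(1)))
        records.registerAsTable("records")

        // Only now can we issue the actual query.
        sqlContext.sql("SELECT name FROM records WHERE id > 10")
          .collect()
          .foreach(println)
      }
    }

Repeating this for tens or hundreds of tables is where it becomes painful.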

Can anyone please help me with this?


Thanks

Sathish
