Would it be possible to use views to address some of your requirements?
Alternatively it might be better to parse it yourself. There are open source
libraries for it, if you really need a complete SQL parser. Do you want to do
it on subqueries?
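For the narrow case of checking that the partition column is constrained, a full SQL parser may be overkill. Below is a minimal sketch in stdlib Python of the "parse it yourself" idea; the `partition_col` name and the regex-based check are illustrative assumptions, not a robust parser (string literals, comments, and subqueries would defeat it, which is exactly when a real parser library earns its keep):

```python
import re

def constrains_column(sql: str, partition_col: str) -> bool:
    """Crude pre-flight check: does the WHERE clause contain an
    equality predicate on the partition column? NOT a real SQL
    parser -- comments, strings, and subqueries can fool it."""
    # Find the WHERE clause (everything after the first WHERE keyword).
    m = re.search(r"\bwhere\b(.*)", sql, re.IGNORECASE | re.DOTALL)
    if not m:
        return False
    where_clause = m.group(1)
    # Look for e.g. "dt = '2015-11-05'" on the partition column.
    pattern = rf"\b{re.escape(partition_col)}\s*=\s*\S+"
    return re.search(pattern, where_clause, re.IGNORECASE) is not None

# Reject ad-hoc queries that do not pin down a partition.
print(constrains_column("SELECT * FROM t WHERE dt = '2015-11-05'", "dt"))  # True
print(constrains_column("SELECT * FROM t", "dt"))                          # False
```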
> On 05 Nov 2015, at 23:34, Yana Kadiyska
You can hack around this by constructing logical plans yourself and then
creating a DataFrame in order to execute them. Note that this all depends
on internals of the framework and can break when Spark upgrades.
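To illustrate the idea without depending on Spark internals, here is a toy stand-in for a logical plan: a small tree of operator nodes that can be walked before execution to verify that some filter constrains the partition column. The node classes and the `plan_constrains` helper are hypothetical illustrations only; in Spark itself you would be going through the version-dependent internal parser and `LogicalPlan` classes, which is the fragility mentioned above:

```python
from dataclasses import dataclass, field

# Toy logical-plan nodes, loosely mirroring the shape of a query plan.
@dataclass
class Scan:
    table: str
    children: list = field(default_factory=list)

@dataclass
class Filter:
    column: str        # column the predicate constrains (simplified)
    child: object = None

    @property
    def children(self):
        return [self.child] if self.child is not None else []

@dataclass
class Project:
    columns: list
    child: object = None

    @property
    def children(self):
        return [self.child] if self.child is not None else []

def plan_constrains(plan, partition_col: str) -> bool:
    """Walk the plan tree and report whether any Filter node
    constrains the given partition column."""
    if isinstance(plan, Filter) and plan.column == partition_col:
        return True
    return any(plan_constrains(c, partition_col) for c in plan.children)

# SELECT x FROM t WHERE dt = ...  maps to  Project(Filter(Scan))
good = Project(["x"], Filter("dt", Scan("t")))
bad = Project(["x"], Scan("t"))
print(plan_constrains(good, "dt"))  # True
print(plan_constrains(bad, "dt"))   # False
```

The point of inspecting the plan rather than the SQL string is that predicates are explicit nodes, so an analysis like this survives aliases, whitespace, and nesting that would break string matching.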
On Thu, Nov 5, 2015 at 4:18 PM, Yana Kadiyska
wrote:
Hi folks, not sure if this belongs to the dev or user list... sending to dev
as it seems a bit convoluted.
I have a UI in which we allow users to write ad-hoc queries against a (very
large, partitioned) table. I would like to analyze the queries prior to
execution for two purposes:
1. Reject
I don't think a view would help -- in the case of under-constraining, I
want to make sure that the user is constraining a column (e.g. I want to
restrict them to querying a single partition at a time but I don't care
which one)...a view per partition value is not practical due to the fairly
high