As far as I can tell from my tests, the language-integrated query DSL in Spark isn't type safe, i.e.

query.where('cost == "foo")

would compile and return nothing.

If you want type safety, perhaps you want to map the SchemaRDD to an RDD of
Product (your own domain type, not scala.Product).
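A minimal sketch of that mapping idea, using a plain Scala collection in place of the RDD (the RDD version is the same shape, just over `schemaRDD.map(...)`; the `Product` fields here are hypothetical):

```scala
// Hypothetical domain type from the thread (not scala.Product).
case class Product(id: Int, cost: Float)

object TypedFilterSketch {
  def main(args: Array[String]): Unit = {
    // In Spark this would come from the SchemaRDD, e.g.
    //   schemaRDD.map(r => Product(r.getInt(0), r.getFloat(1)))
    // A local Seq stands in for the RDD here.
    val products = Seq(Product(1, 25.0f), Product(2, 75.0f), Product(3, 99.99f))

    // Type-safe filter: `_.cost` is checked by the compiler, so a typo
    // like `_.price`, or comparing cost to a String, won't compile --
    // unlike the symbol-based `where('cost == ...)` DSL.
    val expensive = products.filter(_.cost > 50.00f)

    expensive.foreach(println)
  }
}
```

The trade-off is that you give up the SQL optimizer on the mapped RDD, but mistakes surface at compile time instead of as silently empty results.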

--- Original Message ---

From: "jay vyas" <jayunit100.apa...@gmail.com>
Sent: February 11, 2015 7:29 PM
To: user@spark.apache.org
Subject: Re: Strongly Typed SQL in Spark

Ah, nevermind, I just saw
http://spark.apache.org/docs/1.2.0/sql-programming-guide.html (language
integrated queries) which looks quite similar to what I was thinking
about.  I'll give that a whirl...
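For reference, the DSL from that guide looks roughly like this (a sketch, not runnable standalone — it needs a Spark 1.2 context; the `Person` class and data are assumptions for illustration):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

case class Person(name: String, age: Int)

object DslSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("dsl"))
    val sqlContext = new SQLContext(sc)
    import sqlContext._  // brings createSchemaRDD and the symbol DSL into scope

    val people = sc.parallelize(Seq(Person("alice", 15), Person("bob", 30)))
    // Language-integrated query: symbols ('age, 'name) name columns,
    // so column names are NOT checked by the compiler.
    val teenagers = people.where('age >= 10).where('age <= 19).select('name)
    teenagers.collect().foreach(println)
  }
}
```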

On Wed, Feb 11, 2015 at 7:40 PM, jay vyas <jayunit100.apa...@gmail.com>
wrote:

> Hi Spark.  Is there anything in the works for a typesafe, HQL-like API for
> building Spark queries from case classes?  I.e. where, given a domain
> object "Product" with a "cost" associated with it, we can do something
> like:
>
> query.select(Product).filter({ _.cost > 50.00f
> }).join(ProductMetaData).by(product,meta=>product.id=meta.id).
> toSchemaRDD ?
>
> I know the above snippet is totally wacky but, you get the idea :)
>
>
> --
> jay vyas
>



--
jay vyas
