Creating a front-end for output from Spark/PySpark

2014-11-23 Thread Alaa Ali
to use and I'll dig up the rest. Regards, Alaa Ali

Re: Spark SQL with Apache Phoenix lower and upper Bound

2014-11-22 Thread Alaa Ali
Thanks Alex! I'm actually working with Phoenix views over the HBase tables, because I will never edit the HBase table from Phoenix and I'd hate to accidentally drop it. I'll have to work out how to create the view with the additional ID column. Regards, Alaa Ali On Fri, Nov 21, 2014 at 5:26 PM, Alex Kamil alex.ka
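
A minimal sketch of what such a view could look like. The table, column-family, and column names are placeholders (not from the thread), and it assumes the phoenixdb client with a running Phoenix Query Server; any Phoenix SQL client would do.

    import phoenixdb

    # Assumed Phoenix Query Server endpoint; adjust for your cluster.
    conn = phoenixdb.connect("http://localhost:8765/", autocommit=True)
    cur = conn.cursor()

    # A Phoenix view maps onto an existing HBase table without taking
    # ownership of it, so dropping the view leaves the HBase data intact.
    # The row key is exposed here as an "ID" primary-key column, which can
    # later serve as the split column for partitioned reads.
    cur.execute("""
        CREATE VIEW "my_hbase_table" (
            "ID"         VARCHAR PRIMARY KEY,
            "cf"."VALUE" VARCHAR
        )
    """)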

Spark SQL with Apache Phoenix lower and upper Bound

2014-11-21 Thread Alaa Ali
But this doesn't work, because the SQL expression that the JdbcRDD expects has to contain two '?' placeholders for the lower and upper bounds. How can I run my query through the JdbcRDD? Regards, Alaa Ali
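
JdbcRDD is part of Spark's Scala/Java API, but the query shape it expects can be sketched in a few lines of Python. The helper, table, and column names below are hypothetical; the point is only that the query must carry two '?' placeholders on a numeric split column, which JdbcRDD binds per partition to a slice of [lowerBound, upperBound].

    # Hypothetical helper: append the bound predicates that JdbcRDD will
    # substitute per partition. It assumes base_query already has a WHERE
    # clause, and split_column must be numeric, since the bounds are longs.
    def with_bounds(base_query, split_column):
        return "{q} AND {c} >= ? AND {c} <= ?".format(q=base_query, c=split_column)

    query = with_bounds(
        'SELECT "cf"."VALUE" FROM "my_hbase_table" WHERE "cf"."VALUE" IS NOT NULL',
        '"ROW_ID"',
    )
    print(query)  # ... WHERE "cf"."VALUE" IS NOT NULL AND "ROW_ID" >= ? AND "ROW_ID" <= ?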

Re: Spark SQL with Apache Phoenix lower and upper Bound

2014-11-21 Thread Alaa Ali
question, I still haven't tried this out, but I'll actually be using this with PySpark, so I'm guessing the PhoenixPigConfiguration and newHadoopRDD can be defined in PySpark as well? Regards, Alaa Ali On Fri, Nov 21, 2014 at 4:34 PM, Josh Mahonin jmaho...@interset.com wrote: Hi Alaa Ali
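
A minimal PySpark sketch of that approach. sc.newAPIHadoopRDD does exist in PySpark, but the Phoenix class names and configuration keys below are assumptions to be checked against the Phoenix version in use, and the record value class may need a custom valueConverter before its fields are usable from Python.

    from pyspark import SparkContext

    sc = SparkContext(appName="phoenix-newapihadooprdd-sketch")

    # Assumed configuration keys -- the authoritative names live in Phoenix's
    # PhoenixPigConfiguration / PhoenixConfigurationUtil classes.
    phoenix_conf = {
        "hbase.zookeeper.quorum": "zk-host:2181",
        "phoenix.input.table.name": "MY_VIEW",
        "phoenix.select.query": "SELECT ID, VALUE FROM MY_VIEW",
    }

    # Assumed class names from the phoenix-pig module; Phoenix computes the
    # input splits itself, so no '?' bound placeholders are needed here.
    rdd = sc.newAPIHadoopRDD(
        inputFormatClass="org.apache.phoenix.pig.hadoop.PhoenixInputFormat",
        keyClass="org.apache.hadoop.io.NullWritable",
        valueClass="org.apache.phoenix.pig.PhoenixRecord",
        conf=phoenix_conf,
    )

    print(rdd.take(5))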