Currently we have implemented the External Data Source API and are able to
push filters and projections.
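For readers following along, the "push filters and projections" part can be sketched as follows. This is a minimal illustration, not Spark code: the `EqualTo`/`GreaterThan` classes mimic the filter case classes Spark hands to a `PrunedFilteredScan.buildScan(requiredColumns, filters)` implementation, and `build_query` stands in for the step where the data source translates them into a query the underlying database executes.

```python
# Sketch of what a buildScan-style method does with pushed-down filters and
# required columns: translate them into a single query for the database.
# EqualTo / GreaterThan mimic Spark's org.apache.spark.sql.sources filters.

class EqualTo:
    def __init__(self, attribute, value):
        self.attribute, self.value = attribute, value

class GreaterThan:
    def __init__(self, attribute, value):
        self.attribute, self.value = attribute, value

def _quote(v):
    # String literals need quoting; numbers are emitted as-is.
    return "'%s'" % v if isinstance(v, str) else str(v)

def _compile(f):
    op = "=" if isinstance(f, EqualTo) else ">"
    return "%s %s %s" % (f.attribute, op, _quote(f.value))

def build_query(table, required_columns, filters):
    # Projection pushdown: only the required columns are selected.
    cols = ", ".join(required_columns) if required_columns else "*"
    # Filter pushdown: filters become a WHERE clause run by the source.
    where = " WHERE " + " AND ".join(_compile(f) for f in filters) if filters else ""
    return "SELECT %s FROM %s%s" % (cols, table, where)
```

For example, a scan requesting columns `name, salary` with a `GreaterThan("salary", 50000)` filter would ship `SELECT name, salary FROM employees WHERE salary > 50000` to the database instead of pulling the whole table into Spark.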
Could you provide some info on how joins could be pushed down to the
original data source when both data sources come from the same database?
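One way to frame the join-pushdown question is below. This is a hypothetical sketch, not Spark API: `DbRelation` and `pushed_join_query` are illustrative names. The idea is that if both sides of a join resolve to relations backed by the same database connection, the join can be collapsed into a single query that the database executes, instead of two separate scans joined inside Spark.

```python
# Illustrative only: DbRelation / pushed_join_query are made-up names, not
# Spark classes. Shows the decision a join-pushdown rule would make.

class DbRelation:
    def __init__(self, url, table):
        self.url, self.table = url, table

def pushed_join_query(left, right, join_col):
    """Return a single pushed-down join query when both relations share a
    connection URL, or None when Spark must perform the join itself."""
    if left.url != right.url:
        return None
    return ("SELECT * FROM %s l JOIN %s r ON l.%s = r.%s"
            % (left.table, right.table, join_col, join_col))
```

In Spark itself, wiring this in means adding a custom planning strategy that matches logical `Join` nodes over such relations, which is why the visibility of planner classes like `SparkPlan` (mentioned later in this thread) matters.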
First a disclaimer: This is an
One other caveat: while writing up this example I realized that we made
SparkPlan private, and we are already packaging 1.3-RC3... So you'll need a
custom build of Spark for this to run. We'll fix this in the next release.
On Thu, Mar 5, 2015 at 5:26 PM, Michael Armbrust mich...@databricks.com wrote:
Wouldn't it be possible with .saveAsNewAPIHadoopFile? How are you pushing
the filters and projections currently?
Thanks
Best Regards
On Tue, Mar 3, 2015 at 1:11 AM, Addanki, Santosh Kumar
santosh.kumar.adda...@sap.com wrote:
Hi Colleagues,
Currently we have implemented the External Data Source API and are able to push
filters and projections.
Could you provide some info on how joins could be pushed down to the
original data source when both data sources come from the same database?
Briefly looked at
Hi,
We implemented an External Data Source by extending TableScan, and we added
the classes to the classpath.
The data source works fine when run in the Spark Shell.
But currently we are unable to use this same data source in the Python environment.
So when we execute the following below in an
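For context on the Python question: PySpark never touches the Scala data source class directly; it only passes the class name through SQL (e.g. `sqlContext.sql("CREATE TEMPORARY TABLE t USING com.example.mysource OPTIONS (...)")`, where `com.example.mysource` is a placeholder package name). Spark resolves that name by trying it as a literal class and, failing that, appending `.DefaultSource`. The sketch below mimics that lookup in plain Python (`resolve_provider` and `known_classes` are illustrative stand-ins, not Spark API) to show why the Scala package needs a class named exactly `DefaultSource` for name-based use from Python or SQL.

```python
# Mimics the class-name resolution Spark applies to a USING clause.
# known_classes stands in for the JVM classpath; resolve_provider is an
# illustrative name, not a Spark function.

def resolve_provider(name, known_classes):
    """Try the literal class name first, then the DefaultSource convention."""
    if name in known_classes:
        return name
    fallback = name + ".DefaultSource"
    if fallback in known_classes:
        return fallback
    raise ValueError("Failed to load class for data source: %s" % name)
```

So a data source that works in the Spark Shell when instantiated by its full class name can still fail from Python if the package does not expose a `DefaultSource`, or if the jar is on the driver's classpath but was not passed to the JVM that PySpark launches (e.g. via `--jars`).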