bq. Whether sContext (SQLContext) will help to query in both the DataFrames
and will it decide on which DataFrame to query.

Can you clarify what you were asking?

The queries would be carried out on the respective DataFrames, as shown in
your snippet.
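
One detail worth making explicit: sqlContext.sql() resolves table names
against temp tables registered with the context, not against DataFrame
variables, so each DataFrame has to be registered first. Here is a minimal,
self-contained sketch along the lines of your snippet (Spark 1.x Java API;
the JDBC URL, app name, and the string quoting of empId / deptid are
assumptions taken from your example):

import java.util.List;
import java.util.Properties;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;

public class TwoTableQuery {
  public static void main(String[] args) throws Exception {
    SparkConf conf = new SparkConf().setAppName("TwoTableQuery");
    JavaSparkContext sc = new JavaSparkContext(conf);
    SQLContext sqlContext = new SQLContext(sc);

    Properties prop = new Properties();
    prop.setProperty("user", "user");
    prop.setProperty("password", "password");
    String url = "jdbc:mysql://host:3306/db"; // assumed JDBC URL

    // Load each table into its own DataFrame through the one SQLContext.
    DataFrame employee = sqlContext.read().jdbc(url, "employee", prop);
    DataFrame dept = sqlContext.read().jdbc(url, "Dept", prop);

    // Register both so the SQL parser can resolve the table names.
    employee.registerTempTable("employee");
    dept.registerTempTable("Dept");

    // The context routes each query to the matching registered table.
    DataFrame filteredEmployee =
        sqlContext.sql("SELECT * FROM employee WHERE empId = '1'");
    DataFrame filteredDept =
        sqlContext.sql("SELECT * FROM Dept WHERE deptid = '2'");

    List<Row> listEmployee = filteredEmployee.collectAsList();
    List<Row> listDept = filteredDept.collectAsList();

    sc.stop();
  }
}

Without the registerTempTable() calls, sContext.sql() would fail with a
"table not found" analysis error, since it looks names up in the context's
catalog rather than deciding between DataFrame variables.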

On Thu, Feb 11, 2016 at 8:47 AM, Gaurav Agarwal <gaurav130...@gmail.com>
wrote:

> Thanks
>
> That's what I tried to do, but there is only one sqlContext for these two
> DataFrames.
>
> DataFrame tableA = sqlContext.read().jdbc(url,"tableA",prop);
>
> DataFrame tableB = sqlContext.read().jdbc(url,"tableB",prop);
>
>
> When I say something like this:
>
> SQLContext sContext = new SQLContext(sc);
>
> // employeeOptions / deptOptions are the Map<String, String> option maps
> // (driver, url, and the query wrapped as a subquery in "dbtable",
> // e.g. "(select * from employee) emp").
> DataFrame dfEmployee = sContext.load("jdbc", employeeOptions);
>
> DataFrame dfDept = sContext.load("jdbc", deptOptions);
>
> DataFrame filteredCriteria = sContext.sql("select * from employee where empId = '1'");
>
> DataFrame filteredCriteria2 = sContext.sql("select * from Dept where deptid = '2'");
>
>
> List<Row> listEmployee = filteredCriteria.collectAsList();
>
> List<Row> listDept = filteredCriteria2.collectAsList();
>
>
> Will this work in this scenario? Will sContext (SQLContext) help to query
> both the DataFrames, and will it decide which DataFrame to query?
>
>
> If there are any more questions, let me know.
>
>
> Thanks
>
> On Thu, Feb 11, 2016 at 7:41 PM, Prashant Verma <
> prashant.ve...@ericsson.com> wrote:
>
>> Hi Gaurav,
>>
>> You can try something like this:
>>
>> SparkConf conf = new SparkConf();
>> JavaSparkContext sc = new JavaSparkContext(conf);
>> SQLContext sqlContext = new org.apache.spark.sql.SQLContext(sc);
>>
>> // The JDBC driver class must be on the classpath.
>> Class.forName("com.mysql.jdbc.Driver");
>>
>> String url = "url";
>> Properties prop = new java.util.Properties();
>> prop.setProperty("user", "user");
>> prop.setProperty("password", "password");
>>
>> // One SQLContext can load each table into its own DataFrame.
>> DataFrame tableA = sqlContext.read().jdbc(url, "tableA", prop);
>> DataFrame tableB = sqlContext.read().jdbc(url, "tableB", prop);
>>
>> Hope this helps.
>>
>> Thanks,
>>
>> Prashant
>>
>> *From:* Gaurav Agarwal [mailto:gaurav130...@gmail.com]
>> *Sent:* Thursday, February 11, 2016 7:35 PM
>> *To:* user@spark.apache.org
>> *Subject:* Dataframes
>>
>> Hi,
>>
>> Can we load 5 DataFrames for 5 tables in one Spark context?
>> I am asking because we have to give:
>>
>> Map<String, String> options = new HashMap<>();
>> options.put("driver", "");
>> options.put("url", "");
>> options.put("dbtable", "");
>>
>> I can give only one table query at a time in the "dbtable" option.
>> How will I register multiple queries and DataFrames with all the tables?
>>
>> Thanks
>>
>
>
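
On the original question at the bottom of the thread (loading 5 tables in
one context): a single SQLContext can load any number of JDBC tables, one
read().jdbc() call (or one options map with the older load()) per table.
A minimal sketch, assuming Spark 1.x and hypothetical table names t1..t5
and join columns:

import java.util.Arrays;
import java.util.List;
import java.util.Properties;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

public class MultiTableLoad {
  public static void main(String[] args) {
    SparkConf conf = new SparkConf().setAppName("MultiTableLoad");
    JavaSparkContext sc = new JavaSparkContext(conf);
    SQLContext sqlContext = new SQLContext(sc);

    Properties prop = new Properties();
    prop.setProperty("user", "user");
    prop.setProperty("password", "password");
    String url = "jdbc:mysql://host:3306/db"; // assumed JDBC URL

    // Hypothetical table names; any number of tables works the same way.
    List<String> tables = Arrays.asList("t1", "t2", "t3", "t4", "t5");
    for (String table : tables) {
      DataFrame df = sqlContext.read().jdbc(url, table, prop);
      // Register each under its own name so sqlContext.sql() can see it.
      df.registerTempTable(table);
    }

    // Queries against any of the five registered tables now work,
    // including joins across them (id columns assumed for illustration).
    DataFrame joined = sqlContext.sql(
        "SELECT * FROM t1 JOIN t2 ON t1.id = t2.id");
    joined.show();

    sc.stop();
  }
}

Each registerTempTable() call adds one name to the context's catalog, so
subsequent sqlContext.sql() queries resolve against all five tables.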
