It seems to be failing when I do something like the following in both
sqlContext and hiveContext:


sqlContext.sql("SELECT ssd.savedDate from saveSessionDatesRecs ssd
where ssd.partitioner in (SELECT sr1.partitioner from
sparkSessionRecords1 sr1)")
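For context, Spark SQL before 2.0 did not support IN/EXISTS subqueries in
either sqlContext or hiveContext, which would explain the failure. A possible
workaround (a sketch only, reusing the table and column names from the query
above) is to rewrite the IN predicate as a join:

```scala
// Sketch: rewrite the IN subquery as an inner join. DISTINCT avoids the
// row duplication a plain join introduces when sparkSessionRecords1 has
// repeated partitioner values; drop it if per-row duplicates are wanted.
val result = sqlContext.sql("""
  SELECT DISTINCT ssd.savedDate
  FROM saveSessionDatesRecs ssd
  JOIN sparkSessionRecords1 sr1
    ON ssd.partitioner = sr1.partitioner
""")
result.show()
```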


On Tue, Feb 23, 2016 at 5:57 PM, swetha kasireddy <swethakasire...@gmail.com
> wrote:

> These tables are stored in hdfs as parquet. Can sqlContext be applied for
> the subQueries?
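> In case it helps, one way to make parquet-backed tables visible to
> sqlContext (a sketch; the paths below are hypothetical placeholders, not
> the actual HDFS locations) is to load them and register temp tables:
>
> ```scala
> // Sketch: load each parquet dataset and register it as a temp table so
> // it can be referenced by name in sqlContext.sql (Spark 1.x API).
> val ssd = sqlContext.read.parquet("hdfs:///path/to/saveSessionDatesRecs")
> ssd.registerTempTable("saveSessionDatesRecs")
> val sr1 = sqlContext.read.parquet("hdfs:///path/to/sparkSessionRecords1")
> sr1.registerTempTable("sparkSessionRecords1")
> ```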
>
> On Tue, Feb 23, 2016 at 5:31 PM, Mich Talebzadeh <
> mich.talebza...@cloudtechnologypartners.co.uk> wrote:
>
>> Assuming these are all in Hive, you can either use spark-sql or
>> spark-shell.
>>
>> HiveContext offers richer functionality compared to SQLContext.
>>
>> Have a look at this example of joins among three Hive tables:
>>
>> // sc is an existing SparkContext.
>> val sqlContext  = new org.apache.spark.sql.hive.HiveContext(sc)
>> var sqltext : String = ""
>> sqltext = "use oraclehadoop"
>> sqlContext.sql(sqltext)
>> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy
>> HH:mm:ss.ss') AS StartTime").show()
>> println("\n Running the query \n")
>>
>> sqltext = """
>>
>> SELECT c.country_name AS nation,
>>        s.year,
>>        s.month,
>>        SUM(amount_sold) AS salesamount
>> FROM  countries c
>> INNER JOIN customers cs
>> ON c.country_id = cs.country_id
>> INNER JOIN sales s
>> ON cs.cust_id = s.cust_id
>> GROUP BY country_name, s.year, s.month
>> ORDER BY country_name, s.year, s.month
>> """
>>
>> sqlContext.sql(sqltext).collect.foreach(println)
>>
>> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy
>> HH:mm:ss.ss') AS EndTime").show()
>> sys.exit()
>>
>>
>>
>> HTH
>>
>> --
>>
>> Dr Mich Talebzadeh
>>
>> LinkedIn  
>> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>> http://talebzadehmich.wordpress.com
>>
>>
>>
>>
>> On 24/02/2016 01:01, SRK wrote:
>>
>> Hi,
>>
>> How do I join multiple tables and use subqueries in Spark SQL using
>> sqlContext? Can I do this using sqlContext or do I have to use HiveContext
>> for the same?
>>
>> Thanks!
>>
>>
>>
>>
>>
>> --
>> View this message in context: 
>> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-join-multiple-tables-and-use-subqueries-in-Spark-SQL-using-sqlContext-tp26315.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>
>
