Re: Subqueries

2017-12-30 Thread Lalwani, Jayesh
From: Nicholas Hakobian
Date: Friday, December 29, 2017 at 8:10 PM
To: "Lalwani, Jayesh"
Cc: "user@spark.apache.org"
Subject: Re: Subqueries

This sounds like a perfect example of using windowing functions. Have you tried something like the following: select ACCT_ID, CR_RVKD_S
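The window-function pattern Nicholas is suggesting can be sketched as follows. The original query is truncated, so this is an illustrative reconstruction, not his actual SQL; the table name `accts` and the `bal` column are invented, and only `acct_id`/`instnc_id` come from the thread. It is shown with Python's sqlite3 so it runs standalone, but the same `ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)` syntax is accepted by Spark SQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accts (acct_id INTEGER, bal REAL, instnc_id INTEGER);
INSERT INTO accts VALUES
  (1, 100.0, 1), (1, 110.0, 2), (1, 120.0, 3),
  (2, 200.0, 1), (2, 210.0, 3);
""")

# Rank each account's rows by instnc_id, newest first, then keep rank 1,
# i.e. the latest snapshot of every account in a single pass.
rows = conn.execute("""
SELECT acct_id, bal, instnc_id FROM (
  SELECT acct_id, bal, instnc_id,
         ROW_NUMBER() OVER (PARTITION BY acct_id
                            ORDER BY instnc_id DESC) AS rn
  FROM accts
)
WHERE rn = 1
ORDER BY acct_id
""").fetchall()
print(rows)  # [(1, 120.0, 3), (2, 210.0, 3)]
```

Unlike the scalar-subquery form discussed in this thread, the window function needs only one scan of the table and works per-partition, which is why it is the usual recommendation for "latest record per key" queries.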

Re: Subqueries

2017-12-29 Thread Nicholas Hakobian
wrote:
> I have a table, and I want to find the latest records in the table. The
> table has a column called instnc_id that is incremented every day. So, I
> want to find the records that have the max instnc_id. I am trying to do
> this using subqueries, but it gives

Subqueries

2017-12-29 Thread Lalwani, Jayesh
I have a table, and I want to find the latest records in the table. The table has a column called instnc_id that is incremented every day. So, I want to find the records that have the max instnc_id. I am trying to do this using subqueries, but it gives me an error. For example, when I try this
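The query being described is a scalar subquery in a WHERE clause. The snippet cuts off before the actual SQL, so the sketch below is a guess at the shape of the query, with invented columns beyond instnc_id; it uses Python's sqlite3 so it runs standalone. Spark 2.0+ accepts this same SQL, while the 1.x parser rejected scalar subqueries in predicates, which matches the error reported here:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE t (id INTEGER, val TEXT, instnc_id INTEGER);
INSERT INTO t VALUES (1, 'a', 1), (2, 'b', 1), (1, 'a2', 2), (2, 'b2', 2);
""")

# Keep only the rows belonging to the most recent instnc_id: the scalar
# subquery computes MAX(instnc_id) once and filters the outer scan with it.
latest = conn.execute("""
SELECT id, val FROM t
WHERE instnc_id = (SELECT MAX(instnc_id) FROM t)
ORDER BY id
""").fetchall()
print(latest)  # [(1, 'a2'), (2, 'b2')]
```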

Re: Spark 2.0 - SQL Subqueries.

2016-05-21 Thread Reynold Xin
wrote:
> Hi,
> From the Spark 2.0 release webinar, what I understood is that the newer
> version has significantly expanded the SQL capabilities of Spark, with the
> introduction of a new ANSI SQL parser and support for subqueries. It also
> says Spark 2.0 can run all the 99 TPC

Spark 2.0 - SQL Subqueries.

2016-05-21 Thread Kamalesh Nair
Hi, From the Spark 2.0 release webinar, what I understood is that the newer version has significantly expanded the SQL capabilities of Spark, with the introduction of a new ANSI SQL parser and support for subqueries. It also says Spark 2.0 can run all the 99 TPC-DS queries, which require many

Re: How to join multiple tables and use subqueries in Spark SQL using sqlContext?

2016-02-23 Thread swetha kasireddy
kasireddy wrote:
> These tables are stored in HDFS as Parquet. Can sqlContext be applied for
> the subqueries?
>
> On Tue, Feb 23, 2016 at 5:31 PM, Mich Talebzadeh <
> mich.talebza...@cloudtechnologypartners.co.uk> wrote:
>> Assuming these are all in Hive, you ca

Re: How to join multiple tables and use subqueries in Spark SQL using sqlContext?

2016-02-23 Thread swetha kasireddy
These tables are stored in HDFS as Parquet. Can sqlContext be applied for the subqueries? On Tue, Feb 23, 2016 at 5:31 PM, Mich Talebzadeh < mich.talebza...@cloudtechnologypartners.co.uk> wrote: > Assuming these are all in Hive, you can either use spark-sql or > spark-shell.

How to join multiple tables and use subqueries in Spark SQL using sqlContext?

2016-02-23 Thread SRK
Hi, How do I join multiple tables and use subqueries in Spark SQL using sqlContext? Can I do this using sqlContext or do I have to use HiveContext for the same? Thanks!
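Whichever context registers the tables, the SQL that gets submitted looks the same; the thread's answer is about which context's parser accepts it (HiveContext was needed for some constructs before Spark 2.0 unified them in SparkSession). A sketch of the join-plus-subquery statement in question, with hypothetical `orders`/`customers` tables and shown via Python's sqlite3 purely so it is runnable here:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (order_id INTEGER, cust_id INTEGER, amount REAL);
CREATE TABLE customers (cust_id INTEGER, name TEXT);
INSERT INTO orders VALUES (1, 1, 50.0), (2, 1, 70.0), (3, 2, 20.0);
INSERT INTO customers VALUES (1, 'ann'), (2, 'bob');
""")

# Join two tables and filter with a scalar subquery in one statement:
# orders above the overall average amount, with the customer name attached.
rows = conn.execute("""
SELECT c.name, o.amount
FROM orders o
JOIN customers c ON o.cust_id = c.cust_id
WHERE o.amount > (SELECT AVG(amount) FROM orders)
ORDER BY o.amount
""").fetchall()
print(rows)  # [('ann', 50.0), ('ann', 70.0)]
```

In Spark the equivalent would be registering each DataFrame as a temporary table and passing this string to `sqlContext.sql(...)` (or `spark.sql(...)` from 2.0 on).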

Re: SparkSQL - No support for subqueries in 1.2-snapshot?

2014-11-04 Thread Terry Siu
To: "user@spark.apache.org"
Subject: Re: SparkSQL - No support for subqueries in 1.2-snapshot?

This is not supported yet. It would be great if you could open a JIRA (though I think apache JIRA is down ATM). On Tue, Nov 4, 2014 a

Re: SparkSQL - No support for subqueries in 1.2-snapshot?

2014-11-04 Thread Michael Armbrust
> customerid
> 2
> 3
> TOK_TABLE_OR_COL
> customerid
> " +
> at org.apache.spark.sql.hive.HiveQl$.nodeToExpr(HiveQl.scala:1098)
> at scala.sys.package$.error(package.scala

SparkSQL - No support for subqueries in 1.2-snapshot?

2014-11-04 Thread Terry Siu
at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:49)
at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)

Are subqueries in predicates just not supported in 1.2? I think I'm seeing the same issue as: http://apache-spark-user-list.1001560.n3.nabbl
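Michael's reply above confirms predicate subqueries were unsupported in 1.2, and the usual workaround at the time was to rewrite the IN-subquery as a join. A sketch of that equivalence, reusing the `customerid` column from the parse-tree dump but with invented `sales`/`vips` tables, shown with Python's sqlite3 so both forms can actually be run and compared:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (customerid INTEGER, amount REAL);
CREATE TABLE vips (customerid INTEGER);
INSERT INTO sales VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO vips VALUES (2), (3);
""")

# The predicate-subquery form that Spark 1.2's parser rejected ...
with_subquery = conn.execute("""
SELECT customerid, amount FROM sales
WHERE customerid IN (SELECT customerid FROM vips)
ORDER BY customerid
""").fetchall()

# ... and the semi-join rewrite that worked on old versions. DISTINCT
# guards against row duplication if the join key repeats on the right.
with_join = conn.execute("""
SELECT DISTINCT s.customerid, s.amount
FROM sales s
JOIN vips v ON s.customerid = v.customerid
ORDER BY s.customerid
""").fetchall()

print(with_subquery == with_join)  # True
```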