Hi Nicolas,

Thanks a ton for your kind response; I will surely try this out.

Regards,
Gourav Sengupta

On Sun, Aug 29, 2021 at 11:01 PM Nicolas Paris <nicolas.pa...@riseup.net>
wrote:

> As a workaround, turn off metastore partition pruning:
>
> spark.sql.hive.metastorePartitionPruning false
> spark.sql.hive.convertMetastoreParquet false
>
> see
> https://github.com/awslabs/aws-glue-data-catalog-client-for-apache-hive-metastore/issues/45
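
For anyone landing on this thread later, the two settings above can also go into spark-defaults.conf. This is a sketch only: note that disabling convertMetastoreParquet switches Parquet reads from Spark's native reader to the Hive serde path, which may cost performance, so treat it as a temporary workaround rather than a recommended default.

```
# Stop pushing partition filter expressions down to the (Glue) metastore
spark.sql.hive.metastorePartitionPruning   false
# Read Hive Parquet tables via the Hive serde instead of Spark's native reader
spark.sql.hive.convertMetastoreParquet     false
```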
>
> On Tue Aug 24, 2021 at 9:18 AM CEST, Gourav Sengupta wrote:
> > Hi,
> >
> > I received a response from AWS: this is an issue with EMR, and I
> > believe they are working on resolving it.
> >
> > Thanks and Regards,
> > Gourav Sengupta
> >
> > On Mon, Aug 23, 2021 at 1:35 PM Gourav Sengupta <
> > gourav.sengupta.develo...@gmail.com> wrote:
> >
> > > Hi,
> > >
> > > the query still gives the same error if we write "SELECT * FROM
> > > table_name WHERE data_partition > CURRENT_DATE() - INTERVAL 10 DAYS".
> > >
> > > Also the queries work fine in SPARK 3.0.x, or in EMR 6.2.0.
> > >
> > >
> > > Thanks and Regards,
> > > Gourav Sengupta
> > >
> > > On Mon, Aug 23, 2021 at 1:16 PM Sean Owen <sro...@gmail.com> wrote:
> > >
> > >> Date handling was tightened up in Spark 3. I think you need to
> compare to
> > >> a date literal, not a string literal.
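
A small sketch of what that suggestion could look like when building the query string, using the placeholder table and column names from this thread (table_name, data_partition are not a real schema). The key point is the typed `DATE '...'` literal instead of a bare string:

```python
from datetime import date, timedelta

# Placeholder names from the thread, not a real schema
cutoff = date.today() - timedelta(days=10)
query = (
    "SELECT * FROM table_name "
    f"WHERE data_partition > DATE '{cutoff.isoformat()}'"
)
print(query)
```

The same literal works inline in Spark SQL, e.g. `WHERE data_partition > DATE '2021-03-01'`.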
> > >>
> > >> On Mon, Aug 23, 2021 at 5:12 AM Gourav Sengupta <
> > >> gourav.sengupta.develo...@gmail.com> wrote:
> > >>
> > >>> Hi,
> > >>>
> > >>> while running a simple query such as "SELECT * FROM <table_name>
> > >>> WHERE <date partition field> > '2021-03-01'" on EMR 6.3.0 (SPARK
> > >>> 3.1.1), the query fails with the error:
> > >>>
> > >>>
> ---------------------------------------------------------------------------
> > >>> pyspark.sql.utils.AnalysisException:
> > >>> org.apache.hadoop.hive.metastore.api.InvalidObjectException:
> Unsupported
> > >>> expression '2021 - 03 - 01' (Service: AWSGlue; Status Code: 400;
> Error
> > >>> Code: InvalidInputException; Request ID:
> > >>> dd3549c2-2eeb-4616-8dc5-5887ba43dd22; Proxy: null)
> > >>>
> > >>>
> ---------------------------------------------------------------------------
> > >>>
> > >>> The above query works fine in all previous versions of SPARK.
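
A side note on the error text, for anyone debugging the same thing: the Glue message renders the value as the spaced-out expression '2021 - 03 - 01' rather than a date, which suggests the filter string is being re-parsed as integer subtraction somewhere along the pruning path. That is an inference from the message alone, not from Glue internals, but it shows why an untyped date can go wrong:

```python
# If '2021-03-01' is tokenized as arithmetic instead of a date,
# the expression collapses to a plain integer
value = 2021 - 3 - 1
print(value)  # 2017
```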
> > >>>
> > >>> Is this the expected behaviour in SPARK 3.1.1? If so, can someone
> > >>> please let me know how this query should be written?
> > >>>
> > >>> Also, if this is the expected behaviour, I think that a lot of
> > >>> users will have to change their existing code, making the
> > >>> transition to SPARK 3.1.1 expensive.
> > >>>
> > >>> Regards,
> > >>> Gourav Sengupta
> > >>>
> > >>
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>
