Hi Pradyumn,

I think it’s due to an HMS client backward-compatibility issue described 
here: https://issues.apache.org/jira/browse/HIVE-24608
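
A common workaround (just a rough sketch, not verified against your cluster) 
is to point Spark at a Hive metastore client that matches the HMS version 
your CDH ships, e.g. in spark-defaults.conf. The parcel path below is only a 
placeholder; adjust it to wherever your Hive client jars (and their Hadoop 
dependencies) actually live:

    # assuming CDH 6.x ships HMS 2.1.1; the jar path is a placeholder
    spark.sql.hive.metastore.version   2.1.1
    spark.sql.hive.metastore.jars      /opt/cloudera/parcels/CDH/lib/hive/lib/*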

Thanks,

DB Tsai | ACI Spark Core | Apple, Inc.

> On Jan 9, 2021, at 9:53 AM, Pradyumn Agrawal <coderbond...@gmail.com> wrote:
> 
> Hi Michael, 
> Thanks for the references. I had already gone through the 1st and 2nd ones 
> earlier, but I had a hard time with the 3rd: Google Translate 
> <https://translate.google.com/translate?sl=auto&tl=en&u=https://blog.csdn.net/Young2018/article/details/108871542>
> didn't render the CSDN blog correctly. 
> But I can see the CDH distribution is different in my case; it is CDH-6.3.1.
> 
> <image.png>
> 
> As you can see in the screenshot, it says "Invalid method name: get_table_req".
> I am guessing the CDH distribution has some changes in its Hive Metastore 
> client that conflict with Spark's shim implementations. I couldn't debug it 
> much, though, so this is mostly guesswork.
> I would certainly like to know your views, and others', on this.
> 
> Thanks and Regards
> Pradyumn Agrawal
> Media.net (India)
> 
> On Sat, Jan 9, 2021 at 8:01 PM michael.yang <yangrong.jx...@gmail.com> wrote:
> Hi Pradyumn,
> 
> We integrated Spark 3.0.1 with Hive 2.1.1-cdh6.1.0, and it works fine to use
> spark-sql to query Hive tables.
> 
> Make sure you configure spark-defaults.conf and spark-env.sh properly and
> copy the Hive/Hadoop-related config files to Spark's conf folder, as in the
> sketch below.
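> 
> For example, a minimal sketch of what we mean (the paths and values below
> are just placeholders from a typical CDH layout; adjust them to your
> cluster):
> 
>     # spark-defaults.conf
>     spark.sql.catalogImplementation   hive
>     spark.sql.warehouse.dir           /user/hive/warehouse
> 
>     # copy the Hive/Hadoop client configs into Spark's conf folder
>     cp /etc/hive/conf/hive-site.xml $SPARK_HOME/conf/
>     cp /etc/hadoop/conf/core-site.xml /etc/hadoop/conf/hdfs-site.xml $SPARK_HOME/conf/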
> 
> You can refer to the references below for details.
> 
> https://spark.apache.org/docs/latest/building-spark.html
> https://spark.apache.org/docs/latest/sql-data-sources-hive-tables.html
> https://blog.csdn.net/Young2018/article/details/108871542
> 
> Thanks
> Michael Yang
> 
> 
> 
> --
> Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
> 
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
> 
