Using Custom Version of Hive with Spark

2019-07-18 Thread Valeriy Trofimov
Hi All, I've created test tables in the Hive CLI (druid1, druid2) and test tables in Beeline (beeline1, beeline2). I want to be able to access the Hive tables in Beeline and the Beeline tables in Hive. Is that possible? I've set up hive-site.xml for both Hive and Spark to use the same warehouse thinking
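Sharing only the warehouse directory is usually not enough for tables to be visible on both sides; Hive and Spark also need to point at the same metastore. A minimal hive-site.xml sketch, assuming a local metastore thrift URI and the default warehouse path (both placeholders), copied into both Hive's conf directory and Spark's conf/:

    <!-- Sketch: point Hive and Spark at the same metastore and warehouse.
         The URI and path below are placeholders for your deployment. -->
    <configuration>
      <property>
        <name>hive.metastore.uris</name>
        <value>thrift://localhost:9083</value>
      </property>
      <property>
        <name>hive.metastore.warehouse.dir</name>
        <value>/user/hive/warehouse</value>
      </property>
    </configuration>

With both sides reading the same metastore, tables created in the Hive CLI and in Beeline should show up in either client.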

Creating external Druid table

2019-07-15 Thread Valeriy Trofimov
Hi All, How do you create an external Druid table via Spark? I know that you can do it like this: https://docs.hortonworks.com/HDPDocuments/HDP3/HDP-3.1.0/using-druid/content/druid_anatomy_of_hive_to_druid.html But the issue is that Spark is built against Hive 1.2.1:
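For reference, the pattern that page describes is a Hive CTAS through the Druid storage handler. A minimal sketch, with hypothetical table and column names; note the Druid storage handler arrived in Hive 2.x, after the Hive 1.2.1 that Spark bundles:

    -- Sketch of a Hive-to-Druid CTAS; druid_sales, sales, and the columns
    -- are placeholders. Druid requires a timestamp column named __time.
    CREATE TABLE druid_sales
    STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
    AS
    SELECT CAST(order_ts AS timestamp) AS `__time`, region, amount
    FROM sales;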

Re: Error When Creating External Hive Table Via Thrift Server

2019-07-03 Thread Valeriy Trofimov
requirements are to use the Thrift Server, which for some reason uses Spark SQL instead of HiveQL; HiveQL should be the default behavior, because the Thrift Server uses Hive. Thanks, Val On Tue, Jul 2, 2019 at 4:34 PM Valeriy Trofimov wrote: > Hi All, > > I'm trying to create an external table using Thri
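Spark's Thrift Server parses statements with Spark SQL, which does not accept Hive's STORED BY clause, so one workaround sketch is to submit the storage-handler DDL to HiveServer2 instead (host, port, and script name below are placeholders):

    # Point Beeline at HiveServer2 rather than the Spark Thrift Server,
    # since Spark SQL rejects STORED BY.
    beeline -u jdbc:hive2://localhost:10000/default -f create_druid_table.sql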

Error When Creating External Hive Table Via Thrift Server

2019-07-02 Thread Valeriy Trofimov
Hi All, I'm trying to create an external table using the Thrift Server, to which I'm connected via Beeline. To do this, I run the following Hive SQL query, as described here: https://cwiki.apache.org/confluence/display/Hive/Druid+Integration CREATE EXTERNAL TABLE druid_table_1 STORED BY
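The statement on that wiki page presumably continues along these lines; a sketch, with the Druid datasource name as a placeholder:

    -- Maps an existing Druid datasource as an external Hive table;
    -- "wikiticker" is a placeholder datasource name.
    CREATE EXTERNAL TABLE druid_table_1
    STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
    TBLPROPERTIES ("druid.datasource" = "wikiticker");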

Accessing Multiple DBs via Spark Thrift Server

2019-06-27 Thread Valeriy Trofimov
Hi All, I want to use Spark SQL to access multiple DBs, including running inter-dataset joins. I use the Thrift Server to access the Hive DB, because that is what it's designed to do. How can I use the Thrift Server to access other DBs? Thanks, Val
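One way to do this in Spark SQL is to register the external databases as JDBC tables in the same session, after which they can be joined with Hive tables. A sketch, assuming a hypothetical Postgres database, hypothetical table names (pg_orders, hive_customers), and the JDBC driver already on the classpath:

    -- Register an external Postgres table via Spark's JDBC source;
    -- url, dbtable, and credentials are placeholders.
    CREATE TABLE pg_orders
    USING org.apache.spark.sql.jdbc
    OPTIONS (
      url 'jdbc:postgresql://dbhost:5432/shop',
      dbtable 'public.orders',
      user 'spark',
      password '***'
    );

    -- Join the JDBC-backed table with a Hive table in the same session.
    SELECT h.customer_id, o.total
    FROM hive_customers h
    JOIN pg_orders o ON h.customer_id = o.customer_id;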

Java version for building Spark

2019-06-24 Thread Valeriy Trofimov
Hi All, What Java version should I use to build Spark on Ubuntu, and what are the instructions for installing it? The official doc is missing this info: https://spark.apache.org/docs/latest/building-spark.html If I use the default JDK, I get a build error; Googling it shows that I need
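Spark 2.x builds expect JDK 8, while recent Ubuntu releases default to a newer JDK, which is a common cause of this error. A sketch of the Ubuntu setup, assuming the stock openjdk-8 package path:

    # Install JDK 8 and point the build at it; the JAVA_HOME path is the
    # usual location for Ubuntu's openjdk-8 package.
    sudo apt-get install openjdk-8-jdk
    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
    ./build/mvn -DskipTests clean package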