Re: Weird experience Hive with Spark Transformations

2017-01-17 Thread Chetan Khatri
But Hive 1.2.1 does not ship with a hive-site.xml; I tried adding my own,
which caused several other issues. On the other hand, it works well for me
with Hive 2.0.1, where the hive-site.xml content was as below and was also
copied to spark/conf. That worked.

*5. hive-site.xml configuration setup*


Add the following inside the <configuration> element of conf/hive-site.xml; if the file does not exist, create it.




<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost/metastore?createDatabaseIfNotExist=true</value>
  <description>metadata is stored in a MySQL server</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>MySQL JDBC driver class</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value>
  <description>user name for connecting to mysql server</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hivepassword</value>
  <description>password for connecting to mysql server</description>
</property>
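These four properties assume the metastore database and MySQL user already
exist. A minimal one-time setup sketch, assuming a MySQL server on localhost
and root access (the database, user, and password match the values configured
above; adjust to your environment):

# one-time metastore setup (sketch; adjust credentials to your environment)
mysql -u root -p <<'SQL'
CREATE DATABASE IF NOT EXISTS metastore;
CREATE USER 'hiveuser'@'localhost' IDENTIFIED BY 'hivepassword';
GRANT ALL PRIVILEGES ON metastore.* TO 'hiveuser'@'localhost';
FLUSH PRIVILEGES;
SQL

(The CREATE DATABASE line is optional here, since the JDBC URL above already
has createDatabaseIfNotExist=true.) Also make sure a MySQL JDBC driver jar
(mysql-connector-java) is on Hive's classpath, e.g. in $HIVE_HOME/lib,
otherwise the com.mysql.jdbc.Driver class above will not be found.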




Replace the three properties below with whatever already exists there by
default; otherwise Hive will throw an error like:

"java.net.URISyntaxException: Relative path in absolute URI:
${system:java.io.tmpdir}/${system:user.name}"

Also make sure the iotmp directory actually exists; see the sketch after the
properties below.




<property>
  <name>hive.querylog.location</name>
  <value>$HIVE_HOME/iotmp</value>
  <description>Location of Hive run time structured log file</description>
</property>

<property>
  <name>hive.exec.local.scratchdir</name>
  <value>$HIVE_HOME/iotmp</value>
  <description>Local scratch space for Hive jobs</description>
</property>

<property>
  <name>hive.downloaded.resources.dir</name>
  <value>$HIVE_HOME/iotmp</value>
  <description>Temporary local directory for added resources in the remote
  file system.</description>
</property>
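The iotmp directory referenced three times above is not created automatically.
A quick sketch, assuming HIVE_HOME=/usr/local/hive as in this setup:

# create the scratch/log directory the three properties above point at
mkdir -p "$HIVE_HOME/iotmp"
chmod 775 "$HIVE_HOME/iotmp"

Note that Hive only expands ${...}-style variables in configuration values,
not a bare $HIVE_HOME, so if the URISyntaxException persists it is safer to
put the absolute path (/usr/local/hive/iotmp) into the three values above.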





On Tue, Jan 17, 2017 at 10:01 PM, Dongjoon Hyun  wrote:

> Hi, Chetan.
>
> Did you copy your `hive-site.xml` into Spark conf directory? For example,
>
> cp /usr/local/hive/conf/hive-site.xml /usr/local/spark/conf
>
> If you want to use the existing Hive metastore, you need to provide that
> information to Spark.
>
> Bests,
> Dongjoon.
>
> On 2017-01-16 21:36 (-0800), Chetan Khatri 
> wrote:
> > Hello,
> >
> > I have the following services configured and installed successfully:
> >
> > Hadoop 2.7.x
> > Spark 2.0.x
> > HBase 1.2.4
> > Hive 1.2.1
> >
> > *Installation Directories:*
> >
> > /usr/local/hadoop
> > /usr/local/spark
> > /usr/local/hbase
> >
> > *Hive Environment variables:*
> >
> > #HIVE VARIABLES START
> > export HIVE_HOME=/usr/local/hive
> > export PATH=$PATH:$HIVE_HOME/bin
> > #HIVE VARIABLES END
> >
> > So, I can access Hive from anywhere since the environment variables are
> > configured. Now, if I start spark-shell and hive from /usr/local/hive,
> > both use the Hive metastore correctly; otherwise, Spark creates its own
> > metastore in whatever directory I start spark-shell from.
> >
> > I.e., I am reading from HBase and writing to Hive using Spark, and I
> > don't know why this weird issue occurs.
> >
> >
> >
> >
> > Thanks.
> >


Re: Weird experience Hive with Spark Transformations

2017-01-17 Thread Dongjoon Hyun
Hi, Chetan.

Did you copy your `hive-site.xml` into Spark conf directory? For example,

cp /usr/local/hive/conf/hive-site.xml /usr/local/spark/conf

If you want to use the existing Hive metastore, you need to provide that 
information to Spark.
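A quick way to check which metastore spark-shell picked up (just a sketch;
the database list will differ on your setup):

# run from any directory after copying hive-site.xml
echo 'spark.sql("SHOW DATABASES").show()' | spark-shell

If the databases you created through Hive show up, Spark is talking to the
shared metastore. If you only see `default` and a fresh metastore_db/
directory appears in your current working directory, Spark fell back to its
own embedded Derby metastore.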

Bests,
Dongjoon.

On 2017-01-16 21:36 (-0800), Chetan Khatri  wrote: 
> Hello,
> 
> I have the following services configured and installed successfully:
> 
> Hadoop 2.7.x
> Spark 2.0.x
> HBase 1.2.4
> Hive 1.2.1
> 
> *Installation Directories:*
> 
> /usr/local/hadoop
> /usr/local/spark
> /usr/local/hbase
> 
> *Hive Environment variables:*
> 
> #HIVE VARIABLES START
> export HIVE_HOME=/usr/local/hive
> export PATH=$PATH:$HIVE_HOME/bin
> #HIVE VARIABLES END
> 
> So, I can access Hive from anywhere since the environment variables are
> configured. Now, if I start spark-shell and hive from /usr/local/hive,
> both use the Hive metastore correctly; otherwise, Spark creates its own
> metastore in whatever directory I start spark-shell from.
> 
> I.e., I am reading from HBase and writing to Hive using Spark, and I
> don't know why this weird issue occurs.
> 
> 
> 
> 
> Thanks.
> 




Weird experience Hive with Spark Transformations

2017-01-16 Thread Chetan Khatri
Hello,

I have the following services configured and installed successfully:

Hadoop 2.7.x
Spark 2.0.x
HBase 1.2.4
Hive 1.2.1

*Installation Directories:*

/usr/local/hadoop
/usr/local/spark
/usr/local/hbase

*Hive Environment variables:*

#HIVE VARIABLES START
export HIVE_HOME=/usr/local/hive
export PATH=$PATH:$HIVE_HOME/bin
#HIVE VARIABLES END

So, I can access Hive from anywhere since the environment variables are
configured. Now, if I start spark-shell and hive from /usr/local/hive,
both use the Hive metastore correctly; otherwise, Spark creates its own
metastore in whatever directory I start spark-shell from.

I.e., I am reading from HBase and writing to Hive using Spark, and I
don't know why this weird issue occurs.
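
A quick way to reproduce what I am seeing (sketch; no hive-site.xml in
spark/conf):

cd /tmp
spark-shell        # run e.g. spark.sql("SHOW TABLES"), then :quit
ls                 # metastore_db/ and derby.log now sit in /tmp

Without a hive-site.xml on its classpath, Spark falls back to an embedded
Derby metastore created in whatever directory spark-shell is launched from.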




Thanks.