I would say your hadoop configuration file(s) should have been on your
classpath (core-site.xml in this case). You are not supposed to put
hadoop parameters into hive conf files.
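
For example, a quick sanity check from java (rough sketch, assuming
core-site.xml sits in a directory that is on the classpath, e.g. your hadoop
conf dir):

    // prints null if core-site.xml is not visible on the classpath
    java.net.URL coreSite = Thread.currentThread().getContextClassLoader()
            .getResource("core-site.xml");
    System.out.println("core-site.xml resolved to: " + coreSite);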

-Shrijeet


On Fri, Nov 19, 2010 at 4:57 PM, Stuart Smith <stu24m...@yahoo.com> wrote:
>
> Hello,
>
>  Just wanted to let people know I tracked this one down:
>
> It looks like it was not picking up the *hadoop* core-site.xml configuration 
> file.
>
> - So the variable fs.default.name was never set
>
> - So the warehouse dir became file://[hive.metastore.warehouse.dir] instead 
> of [hdfs location]/[hive.metastore.warehouse.dir]
>
> - So it couldn't find any of the warehouse files.
>
> - So the metastore queries would start to work, but the metastore couldn't 
> find any of the backing files on hdfs.
>
> It was picking up the hive configuration, so I just plopped the 
> fs.default.name property from hdfs-site.xml into the hive configuration.
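>
> Concretely, the property I copied over looks roughly like this (the host and
> port below are placeholders for my namenode, not the real values):
>
>   <property>
>     <name>fs.default.name</name>
>     <value>hdfs://namenode-host:8020</value>
>   </property>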
>
> Should the jdbc wiki:
>
> http://wiki.apache.org/hadoop/Hive/HiveClient#head-fd2d8ae9e17fdc3d9b7048d088b2c23a53a6857d
>
> be updated to include this information?
>
> It could be useful to anyone trying to use an embedded server (vs the example 
> given). I would actually think this would apply to the standalone case as 
> well, but I haven't tried it yet.
>
> My particular use case is using the jdbc connector in a java servlet 
> (specifically, a GWT server-side RPC implementation).
>
> As an aside: is the hive jdbc connector thread-safe, assuming I instantiate
> the Connection within the callback method? (I would think holding a
> Connection as a class member would not be thread-safe.)
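>
> (Concretely, the pattern I have in mind is opening the connection inside the
> RPC method and closing it there, rather than keeping it as a field - rough
> sketch only, table name is a placeholder:)
>
>   // inside the GWT RPC service method (java.sql imports assumed)
>   Connection con = DriverManager.getConnection("jdbc:hive://", "", "");
>   try {
>       Statement stmt = con.createStatement();
>       ResultSet rs = stmt.executeQuery("select * from some_table limit 10");
>       // ... copy rows into the RPC response ...
>   } finally {
>       con.close();
>   }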
>
> I'd be happy to help update the wiki & come up with an example, if that would
> help.
>
> Take care,
>  -stu
>
>
> --- On Thu, 11/18/10, Stuart Smith <stu24m...@yahoo.com> wrote:
>
>> From: Stuart Smith <stu24m...@yahoo.com>
>> Subject: Using jdbc in embedded mode - Can't find warehouse directory
>> To: user@hive.apache.org
>> Date: Thursday, November 18, 2010, 7:46 PM
>>
>> Hello,
>>
>>   I'm trying to connect to hive using the JDBC driver
>> in embedded mode. I can load the driver successfully &
>> connect to it via:
>>
>> Connection hiveConnection =
>>     DriverManager.getConnection("jdbc:hive://", "", "");
>>
>> But when I query a table that I know exists - I can query
>> it via a hive command line running on the same machine - I
>> get a "table does not exist" error. When I go ahead and
>> create the table in my java program, and then query it, I
>> get:
>>
>> ERROR: hive.log java.io.FileNotFoundException: File
>> file:/user/hive/warehouse/[table_name]
>>         at
>> org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:361)
>> ...
>>
>> So it looks like it's trying to use the local filesystem
>> for the warehouse dir. I tried setting the warehouse dir
>> variable in the hive-default.xml file to:
>>
>> hdfs://user/hive/warehouse/
>>
>> But I get the same errors.
>>
>> Any idea what's happening?
>>
>> Am I confused about what an embedded hive server can do? I was under the
>> impression that the cli used an embedded hive server and could connect to my
>> hdfs store, but... it would seem my java program can't do this.
>>
>> I guess my next stop is going through the hive cli source
>> code ?
>>
>> Take care,
>>   -stu
>>
>
