Re: hi users

2012-07-04 Thread shaik ahamed
Hi Nitin,

 How can I check the DFS health? Could you please guide me through the steps?
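
(For reference, a standard way to check DFS health on Hadoop 1.x; a sketch, not
specific to this cluster:

hadoop dfsadmin -report    # lists live and dead datanodes and per-node capacity
hadoop fsck /              # reports missing, corrupt, and under-replicated blocks

The NameNode web UI, typically on port 50070, shows the same live/dead node
counts.)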

On Thu, Jul 5, 2012 at 12:23 PM, Nitin Pawar wrote:

> Can you check DFS health?
>
> I think a few of your nodes are down.
>
>
> On Thu, Jul 5, 2012 at 12:17 PM, shaik ahamed  wrote:
>
>> Hi All,
>>
>>
>> I'm not able to fetch the data from the Hive table; I'm getting
>> the error below:
>>
>>FAILED: Error in semantic analysis:
>>
>> hive> select * from vender;
>> OK
>> Failed with exception java.io.IOException:java.io.IOException: Could not
>> obtain block: blk_-3328791500929854839_1178
>> file=/user/hive/warehouse/vender/bigtest.txt
>> Time taken: 9.129 seconds
>> Please help me with this.
>>
>> Regards
>> shaik.
>>
>
>
>
> --
> Nitin Pawar
>
>


Re: hi users

2012-07-04 Thread shaik ahamed
Thanks for the reply guys,

Last night I was able to fetch the data. Now my second node is down, in the
sense that I'm not able to connect to the second machine: I have 3 machines, 1
master and 2 slaves, and I can't connect to the second slave. Is this the
problem behind the failure to retrieve the data, or is it something else?
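
(A quick way to confirm that, assuming a standard Hadoop 1.x layout; the
commands below are a sketch, not from the thread:

jps                                               # on the slave: DataNode should be listed
$HADOOP_HOME/bin/hadoop-daemon.sh start datanode  # on the slave: restart it if missing
hadoop dfsadmin -report                           # on the master: confirm the node rejoined

A dead DataNode holding the only replica of a block would explain the 'Could
not obtain block' error.)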



On Thu, Jul 5, 2012 at 12:23 PM, Nitin Pawar wrote:

> Can you check DFS health?
>
> I think a few of your nodes are down.
>
>
> On Thu, Jul 5, 2012 at 12:17 PM, shaik ahamed  wrote:
>
>> Hi All,
>>
>>
>> I'm not able to fetch the data from the Hive table; I'm getting
>> the error below:
>>
>>FAILED: Error in semantic analysis:
>>
>> hive> select * from vender;
>> OK
>> Failed with exception java.io.IOException:java.io.IOException: Could not
>> obtain block: blk_-3328791500929854839_1178
>> file=/user/hive/warehouse/vender/bigtest.txt
>> Time taken: 9.129 seconds
>> Please help me with this.
>>
>> Regards
>> shaik.
>>
>
>
>
> --
> Nitin Pawar
>
>


Re: hi users

2012-07-04 Thread Mohammad Tariq
Hello shaik,

   Were you able to fetch the data earlier? I mean, is this
happening for the first time, or have you never been able to fetch the
data even once?

Regards,
Mohammad Tariq


On Thu, Jul 5, 2012 at 12:17 PM, shaik ahamed  wrote:
> Hi All,
>
>
>    I'm not able to fetch the data from the Hive table; I'm getting
> the error below:
>
>FAILED: Error in semantic analysis:
>
> hive> select * from vender;
> OK
> Failed with exception java.io.IOException:java.io.IOException: Could not
> obtain block: blk_-3328791500929854839_1178
> file=/user/hive/warehouse/vender/bigtest.txt
> Time taken: 9.129 seconds
> Please help me with this.
>
> Regards
> shaik.


Re: hi users

2012-07-04 Thread Nitin Pawar
Can you check DFS health?

I think a few of your nodes are down.
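
(To pin down the missing block, something like the following should work on
Hadoop 1.x; a sketch using the file from the error message:

hadoop fsck /user/hive/warehouse/vender/bigtest.txt -files -blocks -locations

This lists each block of the file and the datanodes holding it;
blk_-3328791500929854839 should show up as missing or corrupt if its datanode
is down.)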

On Thu, Jul 5, 2012 at 12:17 PM, shaik ahamed  wrote:

> Hi All,
>
>
>    I'm not able to fetch the data from the Hive table; I'm getting
> the error below:
>
>FAILED: Error in semantic analysis:
>
> hive> select * from vender;
> OK
> Failed with exception java.io.IOException:java.io.IOException: Could not
> obtain block: blk_-3328791500929854839_1178
> file=/user/hive/warehouse/vender/bigtest.txt
> Time taken: 9.129 seconds
>  Please help me with this.
>
> Regards
> shaik.
>



-- 
Nitin Pawar


hi users

2012-07-04 Thread shaik ahamed
Hi All,


   I'm not able to fetch the data from the Hive table; I'm getting
the error below:

   FAILED: Error in semantic analysis:

hive> select * from vender;
OK
Failed with exception java.io.IOException:java.io.IOException: Could not
obtain block: blk_-3328791500929854839_1178
file=/user/hive/warehouse/vender/bigtest.txt
Time taken: 9.129 seconds
Please help me with this.

Regards
shaik.


Re: Hive upload

2012-07-04 Thread Ruslan Al-Fakikh
Hi,

Regarding the Sqoop import: I noticed you wrote -table instead of
--table (one dash instead of two).
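
With the flag corrected, and, per Bejoy's advice, a target dir outside the Hive
warehouse, the import command from later in this thread would look something
like this (a sketch, untested, with a made-up staging dir):

sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 \
  --password SQOOP1 --table newone --hive-table newhive --create-hive-table \
  --hive-import --target-dir /tmp/sqoop/newone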

Ruslan

On Wed, Jul 4, 2012 at 12:41 PM, Bejoy Ks  wrote:
> Hi Yogesh
>
> To add on: it looks like the table definition doesn't match the data either.
>
> Your table definition has 4 columns, with the 4th column as int:
>
> describe formatted letstry;
> OK
> # col_name          data_type       comment
>
> rollno  int None
> namestring  None
> numbr   int None
> sno int None
>
>
> But the data has 5 columns, with the 4th column as a string:
>
> 1,John,123,abc,2
>
>
> Also, when you create the table, make sure to specify the right field
> separator:
>
> 
> ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
>  STORED AS TEXTFILE
>
>
> Regards
> Bejoy KS
>
> 
> From: Bejoy Ks 
> To: "user@hive.apache.org" 
> Sent: Wednesday, July 4, 2012 1:59 PM
> Subject: Re: Hive upload
>
> Hi Yogesh
>
> Looks like the Sqoop import from RDBMS to HDFS is succeeding, but it is
> failing at the Hive create-table step. You are seeing data in the Hive
> warehouse because you specified it as your target dir in the Sqoop import
> (--target-dir /user/hive/warehouse/new). It is recommended to use a target
> dir outside the Hive warehouse dir when doing a Sqoop import.
>
> Can you post the full console log of Sqoop with --verbose logging
> enabled? It may give some clues.
>
>
> On the second issue: you already have your data in
> '/user/hive/warehouse/letstry/', which is the location of the Hive table
> 'letstry'. Why do you still want to do a LOAD DATA into it?
>
> If you are doing a Sqoop import of that, again it is recommended to use a
> target dir other than the Hive warehouse dir. It'll help you avoid some
> confusion as well.
>
>
> 
> From: yogesh dhari 
> To: hive request 
> Sent: Wednesday, July 4, 2012 1:40 PM
> Subject: RE: Hive upload
>
>
> Hi Bejoy,
>
> Thank you very much for your response,
>
> 1)
>
> A) When I run the command 'show tables' it doesn't show the newhive table.
> B) Yes, the newhive directory is present in /user/hive/warehouse and also
> contains the values imported from the RDBMS.
>
> Please advise, and give me an example of the Sqoop import command you would
> use in this case.
>
>
> 2)
>
> A) Here is the command
>
> describe formatted letstry;
> OK
> # col_name          data_type       comment
>
> rollno  int None
> namestring  None
> numbr   int None
> sno int None
>
> # Detailed Table Information
> Database:   default
> Owner:  mediaadmin
> CreateTime: Tue Jul 03 17:06:27 GMT+05:30 2012
> LastAccessTime: UNKNOWN
> Protect Mode:   None
> Retention:  0
> Location:   hdfs://localhost:9000/user/hive/warehouse/letstry
> Table Type: MANAGED_TABLE
> Table Parameters:
>     transient_lastDdlTime    1341315550
>
> # Storage Information
> SerDe Library:  org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
> InputFormat:org.apache.hadoop.mapred.TextInputFormat
> OutputFormat:
> org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
> Compressed: No
> Num Buckets:-1
> Bucket Columns: []
> Sort Columns:   []
> Storage Desc Params:
>     serialization.format    1
> Time taken: 0.101 seconds
>
>
> B) hadoop dfs -ls /user/hive/warehouse/letstry/
> Found 1 items
> -rw-r--r--   1 mediaadmin supergroup 17 2012-07-02 12:05
> /user/hive/warehouse/letstry/part-m-0
>
> hadoop dfs -cat /user/hive/warehouse/letstry/part-m-0
> 1,John,123,abc,2
>
>
>
>
> Here the data is present, but when I load it into Hive it gets deleted from
> HDFS, and in Hive the values appear as NULL instead of (1,John,123,abc,2). I
> also didn't understand your point regarding the correct data format (this
> data was imported from a MySQL table).
> And what kind of configuration is needed in Sqoop?
>
> Please suggest and help
>
>
> Greetings
> Yogesh Kumar
>
>
>
>
>
> 
> Subject: Re: Hive upload
> To: user@hive.apache.org
> From: bejoy...@yahoo.com
> Date: Wed, 4 Jul 2012 05:58:41 +
>
> Hi Yogesh
>
> The first issue (the Sqoop one):
> 1) Does the table newhive appear when you list tables using 'show tables'?
> 2) Do you see a directory 'newhive' in your Hive warehouse dir (usually
> /user/hive/warehouse)?
>
> If not, Sqoop is failing to create the Hive table / load data into it; only
> the Sqoop import to HDFS is succeeding, and the Hive part is failing.
>
> If Hive in standalone mode works as desired, you need to check the Sqoop
> configuration.
>
> Regarding the second issue, can you check the storage location of NewTable
> and whether there are files within? If so, 'cat' those files and see whether
> they have the correct data format.

Re: Hi

2012-07-04 Thread Bejoy KS
Hi Shaik

Updates are not supported in Hive. Still, you can accomplish updates by
overwriting either a whole table or a partition.

In short, updates are not directly supported in Hive, and doing them
indirectly is really expensive as well.
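
For example, a minimal sketch of such an 'update' done by rewriting the whole
table (the table and column names here are made up):

INSERT OVERWRITE TABLE employees
SELECT id,
       CASE WHEN id = 42 THEN 'new_name' ELSE name END AS name
FROM employees;

Hive materializes the SELECT before the overwrite, so reading from and
overwriting the same table in one statement works.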

Regards
Bejoy KS

Sent from handheld, please excuse typos.

-Original Message-
From: shaik ahamed 
Date: Wed, 4 Jul 2012 16:24:05 
To: 
Reply-To: user@hive.apache.org
Subject: Hi

Hi All,

 Can we update records in a Hive table? If so, please tell me
the syntax in Hive.



Regards,
shaik.



Hi

2012-07-04 Thread shaik ahamed
Hi All,

 Can we update records in a Hive table? If so, please tell me
the syntax in Hive.



Regards,
shaik.


Re: Hive upload

2012-07-04 Thread Bejoy Ks
Hi Yogesh

To add on: it looks like the table definition doesn't match the data either.

Your table definition has 4 columns, with the 4th column as int:

describe formatted letstry;
OK
# col_name          data_type       comment

rollno              int             None
name                string          None
numbr               int             None
sno                 int             None



But the data has 5 columns, with the 4th column as a string:

1,John,123,abc,2


Also, when you create the table, make sure to specify the right field separator:


ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
 STORED AS TEXTFILE
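
For instance, a definition matching the five-column sample row might look like
this (the names for the 4th and 5th columns are guesses):

CREATE TABLE letstry (
  rollno INT,
  name   STRING,
  numbr  INT,
  code   STRING,   -- the 4th field in the data ('abc') is a string
  sno    INT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;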

Regards
Bejoy KS




 From: Bejoy Ks 
To: "user@hive.apache.org"  
Sent: Wednesday, July 4, 2012 1:59 PM
Subject: Re: Hive upload
 

Hi Yogesh

Looks like the Sqoop import from RDBMS to HDFS is succeeding, but it is failing
at the Hive create-table step. You are seeing data in the Hive warehouse because
you specified it as your target dir in the Sqoop import (--target-dir
/user/hive/warehouse/new). It is recommended to use a target dir outside the
Hive warehouse dir when doing a Sqoop import.

Can you post the full console log of Sqoop with --verbose logging enabled?
It may give some clues.


On the second issue: you already have your data in
'/user/hive/warehouse/letstry/', which is the location of the Hive table
'letstry'. Why do you still want to do a LOAD DATA into it?

If you are doing a Sqoop import of that, again it is recommended to use a
target dir other than the Hive warehouse dir. It'll help you avoid some
confusion as well.





 From: yogesh dhari 
To: hive request  
Sent: Wednesday, July 4, 2012 1:40 PM
Subject: RE: Hive upload
 

 

Hi Bejoy,

Thank you very much for your response,

1)

A) When I run the command 'show tables' it doesn't show the newhive table.
B) Yes, the newhive directory is present in /user/hive/warehouse and also
contains the values imported from the RDBMS.

Please advise, and give me an example of the Sqoop import command you would
use in this case.


2)

A) Here is the command  

describe formatted letstry;
OK
# col_name          data_type       comment

rollno              int             None
name                string          None
numbr               int             None
sno                 int             None

# Detailed Table Information
Database:           default
Owner:              mediaadmin
CreateTime:         Tue Jul 03 17:06:27 GMT+05:30 2012
LastAccessTime:     UNKNOWN
Protect Mode:       None
Retention:          0
Location:           hdfs://localhost:9000/user/hive/warehouse/letstry
Table Type:         MANAGED_TABLE
Table Parameters:
    transient_lastDdlTime    1341315550

# Storage Information
SerDe Library:      org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
InputFormat:        org.apache.hadoop.mapred.TextInputFormat
OutputFormat:       org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
Compressed:         No
Num Buckets:        -1
Bucket Columns:     []
Sort Columns:       []
Storage Desc Params:
    serialization.format    1
Time taken: 0.101 seconds


B) hadoop dfs -ls /user/hive/warehouse/letstry/
Found 1 items
-rw-r--r--   1 mediaadmin supergroup 17 2012-07-02 12:05 
/user/hive/warehouse/letstry/part-m-0

hadoop dfs -cat /user/hive/warehouse/letstry/part-m-0
1,John,123,abc,2




Here the data is present, but when I load it into Hive it gets deleted from
HDFS, and in Hive the values appear as NULL instead of (1,John,123,abc,2). I
also didn't understand your point regarding the correct data format (this data
was imported from a MySQL table).
And what kind of configuration is needed in Sqoop?

Please suggest and help


Greetings
Yogesh Kumar








Subject: Re: Hive upload
To: user@hive.apache.org
From: bejoy...@yahoo.com
Date: Wed, 4 Jul 2012 05:58:41 +

 Hi Yogesh

The first issue (the Sqoop one):
1) Does the table newhive appear when you list tables using 'show tables'?
2) Do you see a directory 'newhive' in your Hive warehouse dir (usually
/user/hive/warehouse)?

If not, Sqoop is failing to create the Hive table / load data into it; only the
Sqoop import to HDFS is succeeding, and the Hive part is failing.

If Hive in standalone mode works as desired, you need to check the Sqoop
configuration.

Re: Hive upload

2012-07-04 Thread Bejoy Ks
Hi Yogesh

Looks like the Sqoop import from RDBMS to HDFS is succeeding, but it is failing
at the Hive create-table step. You are seeing data in the Hive warehouse because
you specified it as your target dir in the Sqoop import (--target-dir
/user/hive/warehouse/new). It is recommended to use a target dir outside the
Hive warehouse dir when doing a Sqoop import.

Can you post the full console log of Sqoop with --verbose logging enabled?
It may give some clues.


On the second issue: you already have your data in
'/user/hive/warehouse/letstry/', which is the location of the Hive table
'letstry'. Why do you still want to do a LOAD DATA into it?

If you are doing a Sqoop import of that, again it is recommended to use a
target dir other than the Hive warehouse dir. It'll help you avoid some
confusion as well.






 From: yogesh dhari 
To: hive request  
Sent: Wednesday, July 4, 2012 1:40 PM
Subject: RE: Hive upload
 

 

Hi Bejoy,

Thank you very much for your response,

1)

A) When I run the command 'show tables' it doesn't show the newhive table.
B) Yes, the newhive directory is present in /user/hive/warehouse and also
contains the values imported from the RDBMS.

Please advise, and give me an example of the Sqoop import command you would
use in this case.


2)

A) Here is the command  

describe formatted letstry;
OK
# col_name          data_type       comment

rollno              int             None
name                string          None
numbr               int             None
sno                 int             None
      
# Detailed Table Information      
Database:       default      
Owner:      mediaadmin       
CreateTime:     Tue Jul 03 17:06:27 GMT+05:30 2012     
LastAccessTime:     UNKNOWN      
Protect Mode:       None         
Retention:      0        
Location:       hdfs://localhost:9000/user/hive/warehouse/letstry     
Table Type:     MANAGED_TABLE        
Table Parameters:      
    transient_lastDdlTime    1341315550  
      
# Storage Information      
SerDe Library:      org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe     
InputFormat:        org.apache.hadoop.mapred.TextInputFormat     
OutputFormat:       
org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat     
Compressed:     No       
Num Buckets:        -1       
Bucket Columns:     []       
Sort Columns:       []       
Storage Desc Params:      
    serialization.format    1   
Time taken: 0.101 seconds


B) hadoop dfs -ls /user/hive/warehouse/letstry/
Found 1 items
-rw-r--r--   1 mediaadmin supergroup 17 2012-07-02 12:05 
/user/hive/warehouse/letstry/part-m-0

hadoop dfs -cat /user/hive/warehouse/letstry/part-m-0
1,John,123,abc,2




Here the data is present, but when I load it into Hive it gets deleted from
HDFS, and in Hive the values appear as NULL instead of (1,John,123,abc,2). I
also didn't understand your point regarding the correct data format (this data
was imported from a MySQL table).
And what kind of configuration is needed in Sqoop?

Please suggest and help


Greetings
Yogesh Kumar








Subject: Re: Hive upload
To: user@hive.apache.org
From: bejoy...@yahoo.com
Date: Wed, 4 Jul 2012 05:58:41 +

 Hi Yogesh

The first issue (the Sqoop one):
1) Does the table newhive appear when you list tables using 'show tables'?
2) Do you see a directory 'newhive' in your Hive warehouse dir (usually
/user/hive/warehouse)?

If not, Sqoop is failing to create the Hive table / load data into it; only the
Sqoop import to HDFS is succeeding, and the Hive part is failing.

If Hive in standalone mode works as desired, you need to check the Sqoop
configuration.

Regarding the second issue, can you check the storage location of NewTable and
whether there are files within? If so, 'cat' those files and see whether they
have the correct data format.

You can get the location of your table with the following command:
describe formatted NewTable;

Regards
Bejoy KS

Sent from handheld, please excuse typos.


From:  yogesh dhari  
Date: Wed, 4 Jul 2012 11:09:02 +0530
To: hive request
ReplyTo:  user@hive.apache.org 
Subject: Hive upload

Hi all,

I am trying to upload tables from an RDBMS to Hive through Sqoop. The import
completes successfully, but I don't find any table in Hive; the imported table
gets uploaded into the HDFS dir /user/hive/warehouse.
I want it to be present in Hive. I used this command:

sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1
--password SQOOP1 -table newone --hive-table newhive --create-hive-table
--hive-import --target-dir /user/hive/warehouse/new

RE: Hive upload

2012-07-04 Thread yogesh dhari





Hi Bejoy,
Thank you very much for your response,
1)
A) When I run the command 'show tables' it doesn't show the newhive table.
B) Yes, the newhive directory is present in /user/hive/warehouse and also
contains the values imported from the RDBMS.
Please advise, and give me an example of the Sqoop import command you would
use in this case.

2)
A) Here is the command  

describe formatted letstry;
OK
# col_name          data_type       comment

rollno              int             None
name                string          None
numbr               int             None
sno                 int             None
  
# Detailed Table Information  
Database:   default  
Owner:  mediaadmin   
CreateTime: Tue Jul 03 17:06:27 GMT+05:30 2012 
LastAccessTime: UNKNOWN  
Protect Mode:   None 
Retention:  0
Location:   hdfs://localhost:9000/user/hive/warehouse/letstry 
Table Type: MANAGED_TABLE
Table Parameters:  
    transient_lastDdlTime    1341315550
  
# Storage Information  
SerDe Library:  org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe 
InputFormat:org.apache.hadoop.mapred.TextInputFormat 
OutputFormat:   
org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat 
Compressed: No   
Num Buckets:-1   
Bucket Columns: []   
Sort Columns:   []   
Storage Desc Params:  
    serialization.format    1
Time taken: 0.101 seconds


B) hadoop dfs -ls /user/hive/warehouse/letstry/
Found 1 items
-rw-r--r--   1 mediaadmin supergroup 17 2012-07-02 12:05 
/user/hive/warehouse/letstry/part-m-0

hadoop dfs -cat /user/hive/warehouse/letstry/part-m-0
1,John,123,abc,2



Here the data is present, but when I load it into Hive it gets deleted from
HDFS, and in Hive the values appear as NULL instead of (1,John,123,abc,2). I
also didn't understand your point regarding the correct data format (this data
was imported from a MySQL table). And what kind of configuration is needed in
Sqoop?
Please suggest and help

Greetings,
Yogesh Kumar





Subject: Re: Hive upload
To: user@hive.apache.org
From: bejoy...@yahoo.com
Date: Wed, 4 Jul 2012 05:58:41 +




Hi Yogesh

The first issue (the Sqoop one):
1) Does the table newhive appear when you list tables using 'show tables'?
2) Do you see a directory 'newhive' in your Hive warehouse dir (usually
/user/hive/warehouse)?

If not, Sqoop is failing to create the Hive table / load data into it; only the
Sqoop import to HDFS is succeeding, and the Hive part is failing.

If Hive in standalone mode works as desired, you need to check the Sqoop
configuration.

Regarding the second issue, can you check the storage location of NewTable and
whether there are files within? If so, 'cat' those files and see whether they
have the correct data format.

You can get the location of your table with the following command:
describe formatted NewTable;
Regards
Bejoy KS

Sent from handheld, please excuse typos.

From: yogesh dhari
Date: Wed, 4 Jul 2012 11:09:02 +0530
To: hive request
ReplyTo: user@hive.apache.org
Subject: Hive upload

Hi all,
I am trying to upload tables from an RDBMS to Hive through Sqoop. The import
completes successfully, but I don't find any table in Hive; the imported table
gets uploaded into the HDFS dir /user/hive/warehouse. I want it to be present
in Hive. I used this command:

sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 
--password SQOOP1 -table newone --hive-table newhive --create-hive-table 
--hive-import --target-dir /user/hive/warehouse/new

And another thing: if I load any file or table from HDFS or from local, the
load succeeds, but the data doesn't show in the Hive table.
If I run the command 'select * from NewTable;' it returns

NULL    NULL    NULL    NULL

although the real data is

Yogesh    4    Bangalore    1234

Please suggest and help.

Regards,
Yogesh Kumar

Hive 0.8.0 - Add partitions programmtically to non-default db table

2012-07-04 Thread Priya Cheryl Sebastian
I am not using the default database in Hive, and I need to add partitions
programmatically. I've tried the following via the JDBC Thrift client, but
nothing seems to work; it appears to look only at the default database. Any
help is appreciated.


1)  ALTER TABLE test.webstat add if not exists partition(dt='2012_04_19')  
LOCATION '/user/hadoop/logfiles/test/webstat/2012_04_19'

Error: FAILED: Parse Error: line 1:12 cannot recognize input near 'test' '.' 
'webstat' in alter table statement



2)  ALTER TABLE webstat add if not exists partition(dt='2012_04_19')  LOCATION 
'/user/hadoop/logfiles/test/webstat/2012_04_19'

Error:  org.springframework.jdbc.BadSqlGrammarException: StatementCallback; bad 
SQL grammar [ALTER TABLE webstat add if not exists partition(dt='2012_04_19')  
LOCATION '/user/hadoop/logfiles/test/webstat/2012_04_19' ]; nested exception is 
java.sql.SQLException: Query returned non-zero code: 10, cause: FAILED: Error 
in semantic analysis: Table not found webstat

2012-07-04 13:08:18,421 ERROR metadata.Hive (Hive.java:getTable(904)) - 
NoSuchObjectException(message:default.webstat table not found)
at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1218)
at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler$17.run(HiveMetaStore.java:1213)
at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:356)
at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1213)
at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:713)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:901)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:843)
at 
org.apache.hadoop.hive.ql.parse.DDLSemanticAnalyzer.addTablePartsOutputs(DDLSemanticAnalyzer.java:2101)
at 
org.apache.hadoop.hive.ql.parse.DDLSemanticAnalyzer.addTablePartsOutputs(DDLSemanticAnalyzer.java:2079)
at 
org.apache.hadoop.hive.ql.parse.DDLSemanticAnalyzer.analyzeAlterTableAddParts(DDLSemanticAnalyzer.java:1806)
at 
org.apache.hadoop.hive.ql.parse.DDLSemanticAnalyzer.analyzeInternal(DDLSemanticAnalyzer.java:297)
at 
org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:243)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:430)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:889)
at 
org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:191)
at 
org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:629)
at 
org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:617)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:32)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
at 
org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)



3)  use test; ALTER TABLE webstat add if not exists partition(dt='2012_04_19')  
LOCATION '/user/hadoop/logfiles/test/webstat/2012_04_19'

Error: org.springframework.jdbc.BadSqlGrammarException: StatementCallback; bad 
SQL grammar [use test; ALTER TABLE webstat add if not exists 
partition(dt='2012_04_19')  LOCATION 
'/user/hadoop/logfiles/test/webstat/2012_04_19' ]; nested exception is 
java.sql.SQLException: Query returned non-zero code: 11, cause: FAILED: Parse 
Error: line 1:12 mismatched input ';' expecting EOF near 'test'

2012-07-04 13:02:31,118 ERROR ql.Driver (SessionState.java:printError(380)) - 
FAILED: Parse Error: line 1:12 mismatched input ';' expecting EOF near 'test'

org.apache.hadoop.hive.ql.parse.ParseException: line 1:12 mismatched input ';' 
expecting EOF near 'test'

at 
org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:439)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:417)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:889)
at 
org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:191)
at 
org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:629)
at 
org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:617)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:32)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
at 
org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
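
A workaround worth trying (a sketch, assuming a plain JDBC connection to
HiveServer rather than Spring's JdbcTemplate): issue USE as its own statement,
since HiveServer executes exactly one statement per call and keeps the current
database per connection.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class AddPartition {
    public static void main(String[] args) throws Exception {
        // Hive 0.8-era JDBC driver for HiveServer1; host/port are assumptions.
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:hive://localhost:10000/default", "", "");
        Statement stmt = conn.createStatement();
        // Switch databases first, as a separate statement...
        stmt.execute("USE test");
        // ...then add the partition without the db prefix.
        stmt.execute("ALTER TABLE webstat ADD IF NOT EXISTS "
                + "PARTITION (dt='2012_04_19') "
                + "LOCATION '/user/hadoop/logfiles/test/webstat/2012_04_19'");
        conn.close();
    }
}

With a connection pool (as Spring typically uses), both statements must run on
the same underlying connection for the USE to take effect.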