Re: Hive master branch broken?

2018-03-28 Thread Mahesh Kumar Behera
Try rebuilding your metastore DB using the schema tool.
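For a local Derby metastore (the APP schema named in the error), a sketch of the schema tool invocation; adjust -dbType for your metastore database, and note that re-initializing wipes existing metastore contents:

```
# Re-initialize a Derby metastore so it matches the current code's schema
# (removes the old metastore_db directory first -- destroys existing metadata)
rm -rf metastore_db
$HIVE_HOME/bin/schematool -dbType derby -initSchema

# Or, to upgrade an existing schema in place:
$HIVE_HOME/bin/schematool -dbType derby -upgradeSchema
```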

From: Chaoran Yu 
Reply-To: "user@hive.apache.org" 
Date: Wednesday, March 28, 2018 at 10:53 AM
To: "user@hive.apache.org" 
Subject: Hive master branch broken?

Hi,

   I made some changes to a local Hive repo based on the latest master branch. 
But somehow I got the following error when running a "CREATE TABLE" statement 
in Hive shell:

org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Insert 
of object "org.apache.hadoop.hive.metastore.model.MSerDeInfo@ace2408" using 
statement "INSERT INTO SERDES 
(SERDE_ID,DESCRIPTION,DESERIALIZER_CLASS,"NAME",SERDE_TYPE,SLIB,SERIALIZER_CLASS)
 VALUES (?,?,?,?,?,?,?)" failed : 'DESCRIPTION' is not a column in table or VTI 
'APP.SERDES'.)
FAILED: Execution Error, return code 1 from 
org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Insert of object 
"org.apache.hadoop.hive.metastore.model.MSerDeInfo@ace2408" using statement 
"INSERT INTO SERDES 
(SERDE_ID,DESCRIPTION,DESERIALIZER_CLASS,"NAME",SERDE_TYPE,SLIB,SERIALIZER_CLASS)
 VALUES (?,?,?,?,?,?,?)" failed : 'DESCRIPTION' is not a column in table or VTI 
'APP.SERDES'.)

No matter what table name or schema I create, I get the same error. When I 
tested my changes about three weeks ago against the master branch of that 
time, everything worked. It only started failing today, after I merged the 
latest commits from master.

Is anyone aware of recent commits that might break this write code path?
And do we know when the next release of Hive is coming out?

Thank you,
Chaoran Yu


Re: UPDATE in Hive -0.14.0

2014-11-24 Thread Mahesh Kumar
Hi unmesha sreevani,

*Create the metastore in MySQL and create the tables as per
the link below.*
https://github.com/apache/hive/blob/trunk/metastore/scripts/upgrade/mysql/hive-schema-0.14.0.mysql.sql

*And add these properties in hive-site.xml.*

<property>
  <name>hive.support.concurrency</name>
  <value>true</value>
</property>

<property>
  <name>hive.enforce.bucketing</name>
  <value>true</value>
</property>

<property>
  <name>hive.exec.dynamic.partition.mode</name>
  <value>nonstrict</value>
</property>

<property>
  <name>hive.txn.manager</name>
  <value>org.apache.hadoop.hive.ql.lockmgr.DbTxnManager</value>
</property>

<property>
  <name>hive.compactor.initiator.on</name>
  <value>true</value>
</property>

<property>
  <name>hive.compactor.worker.threads</name>
  <value>1</value>
</property>

*Make sure your table creation supports the ACID output format. Create it
like the following.*

create table test(id int, name varchar(128)) clustered by (id) into 2
buckets stored as orc TBLPROPERTIES ('transactional'='true')
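With a table created as above, the ACID DML statements should work; a quick smoke test (the same statements that are confirmed working later in this thread):

```sql
insert into table test values (1, 'Mahesh');
update test set name = 'Raj' where id = 1;
delete from test where name = 'Raj';
```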


Regards,

Mahesh.S


Re: UPDATE in Hive -0.14.0

2014-11-24 Thread Mahesh Kumar
Hi unmesha sreeveni,

  As I said earlier, create the table with ACID output format
support.



On Mon, Nov 24, 2014 at 3:09 PM, unmesha sreeveni unmeshab...@gmail.com
wrote:

 Created a Table in Hive

 create external table HiveTest (EmployeeID Int,FirstName
 String,Designation String,Salary Int,Department String) row format
 delimited fields terminated by , location '/user/aibladmin/Hive';

 And set all the properties in hive-site.xml
 hive.support.concurrency – true
 hive.enforce.bucketing – true
 hive.exec.dynamic.partition.mode – nonstrict
 hive.txn.manager –org.apache.hadoop.hive.ql.lockmgr.DbTxnManager
 hive.compactor.initiator.on – true
 hive.compactor.worker.threads – 1

 Then when I tried:

 hive> UPDATE HiveTest SET salary = 5 WHERE employeeid = 19;
 FAILED: SemanticException [Error 10297]: Attempt to do update or delete on
 table default.HiveTest that does not use an AcidOutputFormat or is not
 bucketed

  Is it because of the hive.enforce.bucketing – true setting in hive-site.xml?
 Is bucketing like partitioning?





 On Mon, Nov 24, 2014 at 2:55 PM, Mahesh Kumar sankarmahes...@gmail.com
 wrote:

 Hi unmesha sreevani,

 *Create the metastore in MySQL and create the tables as
 per the link below.*
 https://github.com/apache/hive/blob/trunk/metastore/scripts/upgrade/mysql/hive-schema-0.14.0.mysql.sql

 *And add these properties in hive-site.xml.*

 <property>
  <name>hive.support.concurrency</name>
  <value>true</value>
 </property>

 <property>
  <name>hive.enforce.bucketing</name>
  <value>true</value>
 </property>

 <property>
  <name>hive.exec.dynamic.partition.mode</name>
  <value>nonstrict</value>
 </property>

 <property>
  <name>hive.txn.manager</name>
  <value>org.apache.hadoop.hive.ql.lockmgr.DbTxnManager</value>
 </property>

 <property>
  <name>hive.compactor.initiator.on</name>
  <value>true</value>
 </property>

 <property>
  <name>hive.compactor.worker.threads</name>
  <value>1</value>
 </property>

 *Make sure your table creation supports the ACID output format. Create it
 like the following.*

 create table test(id int, name varchar(128)) clustered by (id) into 2
 buckets stored as orc TBLPROPERTIES ('transactional'='true')


 Regards,

 Mahesh.S




 --
 *Thanks & Regards*


 *Unmesha Sreeveni U.B*
 *Hadoop, Bigdata Developer*
 *Centre for Cyber Security | Amrita Vishwa Vidyapeetham*
 http://www.unmeshasreeveni.blogspot.in/





Re: Hive 0.14 configuration

2014-11-04 Thread mahesh kumar
Hi Nitin,
   I created a table with the ORC format; when I update it, it shows the
following error.
CREATE TABLE students (name VARCHAR(64), age INT, gpa DECIMAL(3, 2))
CLUSTERED BY (age) INTO 2 BUCKETS STORED AS ORC;

INSERT INTO TABLE students VALUES ('fred flintstone', 35, 1.28), ('barney
rubble', 32, 2.32);

hive> update students set age='12' where name='barney rubble';

FAILED: SemanticException [Error 10122]: Bucketized tables do not support
INSERT INTO: Table: default.students
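The missing piece, as the resolution later in this thread shows, is marking the table transactional at creation time. A sketch reusing the students definition above:

```sql
CREATE TABLE students (name VARCHAR(64), age INT, gpa DECIMAL(3, 2))
CLUSTERED BY (age) INTO 2 BUCKETS STORED AS ORC
TBLPROPERTIES ('transactional' = 'true');
```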

Thanks,
Mahesh.S


On Tue, Nov 4, 2014 at 1:28 PM, Nitin Pawar nitinpawar...@gmail.com wrote:

 Currently only the ORC file format supports AcidOutputFormat.

 So you may want to create a table with the ORC file format and see if you
 are able to do ACID operations.



 On Tue, Nov 4, 2014 at 1:14 PM, mahesh kumar sankarmahes...@gmail.com
 wrote:

 Hi Nitin,

  How do I create a table with AcidOutputFormat? Can you send me
 examples?

 Thanks
 Mahesh

 On Tue, Nov 4, 2014 at 12:21 PM, Nitin Pawar nitinpawar...@gmail.com
 wrote:

 As the error says, your table's file format has to be AcidOutputFormat, or
 the table needs to be bucketed, to perform an update operation.

 You may want to create a new table with AcidOutputFormat, insert the data
 from your current table into it, and then try the update on the new table.

 On Tue, Nov 4, 2014 at 12:11 PM, mahesh kumar sankarmahes...@gmail.com
 wrote:

 Hi,
    Has anyone tried the Hive 0.14 configuration? I built it using Maven
 from GitHub.
 Insert is working fine, but when I use update/delete I get the following
 error. First I created a table and inserted rows.

 CREATE TABLE new (id int, name string) ROW FORMAT DELIMITED FIELDS
 TERMINATED BY ',';
 insert into table new values ('1','Mahesh');

 update new set name='Raj' where id=1;

 FAILED: SemanticException [Error 10297]: Attempt to do update or delete
 on table default.new that does not use an AcidOutputFormat or is not
 bucketed.

 When I update the table I get the above error.

 Can you help me, guys?

 Thanks

 Mahesh.S






 --
 Nitin Pawar





 --
 Nitin Pawar



Re: Hive 0.14 configuration

2014-11-04 Thread mahesh kumar
Hi,
 Finally I got update and delete working on a Hive table. We need to create
the table with transactional = true, like the following.

create table test(id int, name varchar(128)) clustered by (id) into 2
buckets stored as orc TBLPROPERTIES ('transactional'='true');
insert into table test values(1,'Mahesh');
update test set name='Raj' where id=1;
delete from test where name='Raj';

Cheers,

Mahesh.S


On Tue, Nov 4, 2014 at 1:52 PM, mahesh kumar sankarmahes...@gmail.com
wrote:

 Hi Nitin,
   I created a table with the ORC format; when I update it, it shows the
 following error.
 CREATE TABLE students (name VARCHAR(64), age INT, gpa DECIMAL(3, 2))
 CLUSTERED BY (age) INTO 2 BUCKETS STORED AS ORC;

 INSERT INTO TABLE students VALUES ('fred flintstone', 35, 1.28), ('barney
 rubble', 32, 2.32);

 hive> update students set age='12' where name='barney rubble';

 FAILED: SemanticException [Error 10122]: Bucketized tables do not support
 INSERT INTO: Table: default.students

 Thanks,
 Mahesh.S


 On Tue, Nov 4, 2014 at 1:28 PM, Nitin Pawar nitinpawar...@gmail.com
 wrote:

 Currently only the ORC file format supports AcidOutputFormat.

 So you may want to create a table with the ORC file format and see if you
 are able to do ACID operations.



 On Tue, Nov 4, 2014 at 1:14 PM, mahesh kumar sankarmahes...@gmail.com
 wrote:

 Hi Nitin,

  How do I create a table with AcidOutputFormat? Can you send me
 examples?

 Thanks
 Mahesh

 On Tue, Nov 4, 2014 at 12:21 PM, Nitin Pawar nitinpawar...@gmail.com
 wrote:

 As the error says, your table's file format has to be AcidOutputFormat, or
 the table needs to be bucketed, to perform an update operation.

 You may want to create a new table with AcidOutputFormat, insert the data
 from your current table into it, and then try the update on the new table.

 On Tue, Nov 4, 2014 at 12:11 PM, mahesh kumar sankarmahes...@gmail.com
  wrote:

 Hi,
    Has anyone tried the Hive 0.14 configuration? I built it using Maven
 from GitHub.
 Insert is working fine, but when I use update/delete I get the following
 error. First I created a table and inserted rows.

 CREATE TABLE new (id int, name string) ROW FORMAT DELIMITED FIELDS
 TERMINATED BY ',';
 insert into table new values ('1','Mahesh');

 update new set name='Raj' where id=1;

 FAILED: SemanticException [Error 10297]: Attempt to do update or delete
 on table default.new that does not use an AcidOutputFormat or is not
 bucketed.

 When I update the table I get the above error.

 Can you help me, guys?

 Thanks

 Mahesh.S






 --
 Nitin Pawar





 --
 Nitin Pawar





Hive 0.14 configuration

2014-11-03 Thread mahesh kumar
Hi,
   Has anyone tried the Hive 0.14 configuration? I built it using Maven from
GitHub.
Insert is working fine, but when I use update/delete I get the following
error. First I created a table and inserted rows.

CREATE TABLE new (id int, name string) ROW FORMAT DELIMITED FIELDS
TERMINATED BY ',';
insert into table new values ('1','Mahesh');

update new set name='Raj' where id=1;

FAILED: SemanticException [Error 10297]: Attempt to do update or delete on
table default.new that does not use an AcidOutputFormat or is not bucketed.

When I update the table I get the above error.

Can you help me, guys?

Thanks

Mahesh.S


Re: Hive 0.14 configuration

2014-11-03 Thread mahesh kumar
Hi Nitin,

 How do I create a table with AcidOutputFormat? Can you send me
examples?

Thanks
Mahesh

On Tue, Nov 4, 2014 at 12:21 PM, Nitin Pawar nitinpawar...@gmail.com
wrote:

 As the error says, your table's file format has to be AcidOutputFormat, or
 the table needs to be bucketed, to perform an update operation.

 You may want to create a new table with AcidOutputFormat, insert the data
 from your current table into it, and then try the update on the new table.

 On Tue, Nov 4, 2014 at 12:11 PM, mahesh kumar sankarmahes...@gmail.com
 wrote:

 Hi,
    Has anyone tried the Hive 0.14 configuration? I built it using Maven
 from GitHub.
 Insert is working fine, but when I use update/delete I get the following
 error. First I created a table and inserted rows.

 CREATE TABLE new (id int, name string) ROW FORMAT DELIMITED FIELDS
 TERMINATED BY ',';
 insert into table new values ('1','Mahesh');

 update new set name='Raj' where id=1;

 FAILED: SemanticException [Error 10297]: Attempt to do update or delete
 on table default.new that does not use an AcidOutputFormat or is not
 bucketed.

 When I update the table I get the above error.

 Can you help me, guys?

 Thanks

 Mahesh.S






 --
 Nitin Pawar



Hive error with mongodb connector

2014-09-20 Thread mahesh kumar
Hi,
   I got the following error when trying to connect MongoDB with Hive
using the mongo-hadoop connector.

2014-09-16 17:32:24,279 INFO [main]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Created MRAppMaster for
application appattempt_1410858694842_0013_01
*2014-09-16 17:32:24,742 FATAL [main] org.apache.hadoop.conf.Configuration:
error parsing conf job.xml*
org.xml.sax.SAXParseException; systemId:
*file:///tmp/hadoop-hadoop2/nm-local-dir/usercache/hadoop2/appcache/application_1410858694842_0013/container_1410858694842_0013_01_01/job.xml;
lineNumber: 586; columnNumber: 51; Character reference #*
at
com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:257)
at
com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:347)
at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2173)
at
org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2242)
at
org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2195)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2102)
at org.apache.hadoop.conf.Configuration.get(Configuration.java:1068)
at
org.apache.hadoop.mapreduce.v2.util.MRWebAppUtil.initialize(MRWebAppUtil.java:50)
at
org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1377)
2014-09-16 17:32:24,748 FATAL [main]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Error starting MRAppMaster
java.lang.RuntimeException: org.xml.sax.SAXParseException; systemId:
file:///tmp/hadoop-hadoop2/nm-local-dir/usercache/hadoop2/appcache/application_1410858694842_0013/container_1410858694842_0013_01_01/job.xml;
lineNumber: 586; columnNumber: 51; Character reference #
at
org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2338)
at
org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2195)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2102)
at org.apache.hadoop.conf.Configuration.get(Configuration.java:1068)
at
org.apache.hadoop.mapreduce.v2.util.MRWebAppUtil.initialize(MRWebAppUtil.java:50)
at
org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1377)
Caused by: org.xml.sax.SAXParseException; systemId:
file:///tmp/hadoop-hadoop2/nm-local-dir/usercache/hadoop2/appcache/application_1410858694842_0013/container_1410858694842_0013_01_01/job.xml;
lineNumber: 586; columnNumber: 51; Character reference #
at
com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:257)
at
com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:347)
at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2173)
at
org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2242)
... 5 more
2014-09-16 17:32:24,750 INFO [main] org.apache.hadoop.util.ExitUtil:
Exiting with status 1

My working environment is:

hadoop 2.4.1
hive   0.13.1
mongodb 2.6.3
mongodb connector 1.4.0

reference:
https://groups.google.com/forum/#!msg/mongodb-user/lKbha0SzMP8/jvE8ZrJom4AJ
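One likely cause (an assumption, not confirmed in this thread): Hive's default field delimiter is the control character \001, and when Hadoop serializes the job configuration it can emit that byte as a numeric character reference such as &#1;, which is illegal in XML 1.0. That would produce exactly this "Character reference" SAXParseException at the offending line of job.xml. A minimal illustration of the kind of value that breaks the parser:

```xml
<!-- Hypothetical job.xml fragment: a property value containing the \001
     delimiter, serialized as an XML 1.0-illegal character reference -->
<property>
  <name>field.delim</name>
  <value>&#1;</value>  <!-- invalid in XML 1.0; the parser aborts here -->
</property>
```

If this is the cause, defining the table with a printable delimiter (or a Hadoop version that escapes such values safely) should avoid it.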

Please help me.

Thanks

Mahesh S