Error in semantic analysis: Lock manager could not be initialized

2013-07-09 Thread ch huang
here is my hive config file. I do not know why this happens; can anyone help?


<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://192.168.10.118/metastore</value>
  <description>the URL of the MySQL database</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>myhive</value>
</property>
<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>false</value>
</property>
<property>
  <name>datanucleus.fixedDatastore</name>
  <value>true</value>
</property>
<property>
  <name>datanucleus.autoStartMechanism</name>
  <value>SchemaTable</value>
</property>
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://192.168.10.22:9083</value>
</property>
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/hive/warehouse</value>
</property>
<property>
  <name>hive.aux.jars.path</name>
  <value>file:///usr/lib/hive/lib/hive-hbase-handler-0.10.0-cdh4.3.0.jar,file:///usr/lib/hive/lib/hbase.jar,file:///usr/lib/hive/lib/zookeeper.jar</value>
</property>
<property>
  <name>hive.zookeeper.quorum</name>
  <value>192.168.10.22:2281</value>
</property>
<property>
  <name>hive.hwi.listen.host</name>
  <value>0.0.0.0</value>
  <description>This is the host address the Hive Web Interface will listen on</description>
</property>
<property>
  <name>hive.hwi.listen.port</name>
  <value></value>
  <description>This is the port the Hive Web Interface will listen on</description>
</property>
<property>
  <name>hive.hwi.war.file</name>
  <value>lib/hive-hwi-0.7.1-cdh3u4.war</value>
  <description>This is the WAR file with the jsp content for Hive Web Interface</description>
</property>
<property>
  <name>hive.support.concurrency</name>
  <value>true</value>
</property>
<property>
  <name>hive.zookeeper.client.port</name>
  <value>2281</value>
</property>
<property>
  <name>hive.lock.manager</name>
  <value>org.apache.hadoop.hive.ql.lockmgr.zookeeper.ZooKeeperHiveLockManager</value>
</property>

hive> show tables;
FAILED: Error in semantic analysis: Lock manager could not be initialized, check hive.lock.manager Check hive.zookeeper.quorum and hive.zookeeper.client.port
OK
demo_hive
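One thing worth checking here: Hive expects hive.zookeeper.quorum to list hostnames only, with the port given separately in hive.zookeeper.client.port. Embedding the port in the quorum string, as in the config above, can keep the ZooKeeper lock manager from connecting. A minimal sketch of the intended layout, reusing the host and port from the config above:

```xml
<property>
  <name>hive.zookeeper.quorum</name>
  <value>192.168.10.22</value>   <!-- hostnames only; no port here -->
</property>
<property>
  <name>hive.zookeeper.client.port</name>
  <value>2281</value>            <!-- must match ZooKeeper's actual clientPort -->
</property>
```

If ZooKeeper is actually serving on the HBase default of 2181 rather than 2281, the client port value would need to change to match.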


how to configure hive to use yarn?

2013-07-09 Thread ch huang
i installed CDH4.3 and already tried MapReduce v1; it works fine. But when i stop
mapred v1 and start yarn, hive can not use it. why?

  set mapred.reduce.tasks=<number>
java.net.ConnectException: Call From CH22/192.168.10.22 to CH22:9001 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:782)
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:729)
        at org.apache.hadoop.ipc.Client.call(Client.java:1229)
        at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:225)
        at org.apache.hadoop.mapred.$Proxy17.getStagingAreaDir(Unknown Source)
        at org.apache.hadoop.mapred.JobClient.getStagingAreaDir(JobClient.java:1324)
        at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:102)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:951)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:945)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:945)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:919)
        at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:448)
        at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:138)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1374)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1160)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:973)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:893)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:599)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:207)
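The trace shows the Hive client still trying to reach the stopped MRv1 JobTracker (CH22:9001). When switching to YARN, the client-side mapred-site.xml would normally tell Hive's job submission to use the YARN framework instead. A hedged sketch (the ResourceManager host is an assumption; 8032 is the stock ResourceManager RPC port):

```xml
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>
<property>
  <name>yarn.resourcemanager.address</name>
  <value>CH22:8032</value>  <!-- assumption: ResourceManager host/port -->
</property>
```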


export csv, use ',' as split

2013-07-09 Thread kentkong_work
hi here,
   I create a table like this and put a lot of data into it.
   then I export the query result into a csv file like this:
hive -e myquery >> result.csv

   but the csv uses tab as the separator.
   how do I make hive use ','? thanks!

CREATE TABLE if not exists upload_users(
  username string, 
  mobile string, 
  id_type string, 
  id_no string, 
  email string, 
  address string, 
  validate_time string
) partitioned by (fileid string) 
row format delimited fields terminated by "\,";
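The delimiter in the DDL above only controls how the table's files are stored; `hive -e` output to stdout is tab-separated regardless. A common workaround is to post-process the output, sketched here with a sample line standing in for real query output:

```shell
# Hive CLI prints query results tab-separated; translate tabs to commas.
printf 'kent\t13800000000\tID\n' | tr '\t' ','
# prints: kent,13800000000,ID

# In practice, something like (hypothetical query):
#   hive -e "select * from upload_users" | tr '\t' ',' > result.csv
```

Note that commas inside field values would then need quoting, which `tr` does not handle; for clean CSV of messy data, writing the table out with a ','-delimited storage format is safer.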

   

Re: Strange error in hive

2013-07-09 Thread Navis류승우
Attached patch for this in https://issues.apache.org/jira/browse/HIVE-4837

2013/7/10 Navis류승우 :
> Could you try to remove "NULL as FLG_DEM_INC_PRX_CS_VAL"s in the query?
>
> It seemed not related to HIVE-4650 but still a bug (I'll book this)
>
> 2013/7/9 Jérôme Verdier :
>> Hi,
>>
>> Thanks for your help.
>>
>> You can see logs below :
>>
>> java.lang.RuntimeException: Error in configuring object
>>         at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>>         at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>>         at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
>>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
>>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:396)
>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
>> Caused by: java.lang.reflect.InvocationTargetException
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>         at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
>>         ... 9 more
>> Caused by: java.lang.RuntimeException: Error in configuring object
>>         at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>>         at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>>         at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>>         at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
>>         ... 14 more
>> Caused by: java.lang.reflect.InvocationTargetException
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>         at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
>>         ... 17 more
>> Caused by: java.lang.RuntimeException: Map operator initialization failed
>>         at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:121)
>>         ... 22 more
>> Caused by: java.lang.NullPointerException
>>         at org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector.toString(StructObjectInspector.java:64)
>>         at java.lang.String.valueOf(String.java:2826)
>>         at java.lang.StringBuilder.append(StringBuilder.java:115)
>>         at org.apache.hadoop.hive.ql.exec.UnionOperator.initializeOp(UnionOperator.java:110)
>>         at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
>>         at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:451)
>>         at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:407)
>>         at org.apache.hadoop.hive.ql.exec.TableScanOperator.initializeOp(TableScanOperator.java:186)
>>         at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
>>         at org.apache.hadoop.hive.ql.exec.MapOperator.initializeOp(MapOperator.java:563)
>>         at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
>>         at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:100)
>>         ... 22 more
>>
>>
>>
>> 2013/7/8 
>>
>>> Hii Jerome
>>>
>>>
>>> Can you send the error log of the MapReduce task that failed? That should
>>> have some pointers which can help you troubleshoot the issue.
>>> Regards
>>> Bejoy KS
>>>
>>> Sent from remote device, Please excuse typos
>>> 
>>> From: Jérôme Verdier 
>>> Date: Mon, 8 Jul 2013 11:25:34 +0200
>>> To: 
>>> ReplyTo: user@hive.apache.org
>>> Subject: Strange error in hive
>>>
>>> Hi everybody,
>>>
>>> I faced a strange error in hive this morning.
>>>
>>> The error message is this one :
>>>
>>> FAILED: Execution Error, return code 2 from
>>> org.apache.hadoop.hive.ql.exec.MapRedTask
>>>
>>> after a quick search on Google, it appears that this is a Hive bug :
>>>
>>> https://issues.apache.org/jira/browse/HIVE-4650
>>>
>>> Is there a way to pass through this error ?
>>>
>>> Thanks.
>>>
>>> NB : my hive script is in the attachment.
>>>
>>>
>>> --
>>> Jérôme VERDIER
>>> 06.72.19.17.31
>>> verdier.jerom...@gmail.com
>>>
>>
>>
>>
>> --
>> Jérôme VERDIER
>> 06.72.19.17.31
>> verdier.jerom...@gmail.com
>>


Re: Strange error in hive

2013-07-09 Thread Navis류승우
Could you try to remove "NULL as FLG_DEM_INC_PRX_CS_VAL"s in the query?

It seemed not related to HIVE-4650 but still a bug (I'll book this)

2013/7/9 Jérôme Verdier :
> Hi,
>
> Thanks for your help.
>
> You can see logs below :
>
> java.lang.RuntimeException: Error in configuring object
>         at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>         at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>         at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: java.lang.reflect.InvocationTargetException
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
>         ... 9 more
> Caused by: java.lang.RuntimeException: Error in configuring object
>         at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>         at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>         at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>         at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
>         ... 14 more
> Caused by: java.lang.reflect.InvocationTargetException
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
>         ... 17 more
> Caused by: java.lang.RuntimeException: Map operator initialization failed
>         at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:121)
>         ... 22 more
> Caused by: java.lang.NullPointerException
>         at org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector.toString(StructObjectInspector.java:64)
>         at java.lang.String.valueOf(String.java:2826)
>         at java.lang.StringBuilder.append(StringBuilder.java:115)
>         at org.apache.hadoop.hive.ql.exec.UnionOperator.initializeOp(UnionOperator.java:110)
>         at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
>         at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:451)
>         at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:407)
>         at org.apache.hadoop.hive.ql.exec.TableScanOperator.initializeOp(TableScanOperator.java:186)
>         at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
>         at org.apache.hadoop.hive.ql.exec.MapOperator.initializeOp(MapOperator.java:563)
>         at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
>         at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:100)
>         ... 22 more
>
>
>
> 2013/7/8 
>
>> Hii Jerome
>>
>>
>> Can you send the error log of the MapReduce task that failed? That should
>> have some pointers which can help you troubleshoot the issue.
>> Regards
>> Bejoy KS
>>
>> Sent from remote device, Please excuse typos
>> 
>> From: Jérôme Verdier 
>> Date: Mon, 8 Jul 2013 11:25:34 +0200
>> To: 
>> ReplyTo: user@hive.apache.org
>> Subject: Strange error in hive
>>
>> Hi everybody,
>>
>> I faced a strange error in hive this morning.
>>
>> The error message is this one :
>>
>> FAILED: Execution Error, return code 2 from
>> org.apache.hadoop.hive.ql.exec.MapRedTask
>>
>> after a quick search on Google, it appears that this is a Hive bug :
>>
>> https://issues.apache.org/jira/browse/HIVE-4650
>>
>> Is there a way to pass through this error ?
>>
>> Thanks.
>>
>> NB : my hive script is in the attachment.
>>
>>
>> --
>> Jérôme VERDIER
>> 06.72.19.17.31
>> verdier.jerom...@gmail.com
>>
>
>
>
> --
> Jérôme VERDIER
> 06.72.19.17.31
> verdier.jerom...@gmail.com
>


Re: integration issure about hive and hbase

2013-07-09 Thread Sanjay Subramanian
I am attaching portions from a document  I had written last year while 
investigating Hbase and Hive. You may have already crossed that 
bridge….nevertheless…

Please forgive me :-) if some steps seem hacky and not very well explained… I
was on a solo mission to build a Hive data platform from scratch, and QDBW
(Quick and Dirty But Works) was my philosophy to go ahead !!!

Good luck

Sanjay


=

Hive and Hbase integration on local Fedora desktop 
guide

Pre-requisites

  *   Hadoop needs to be installed and HDFS needs to be running (Hadoop HDFS setup on local Fedora desktop guide)
  *   Hive needs to be installed (Hive setup on local Fedora desktop guide)
  *   HBase needs to be installed and running (HBase setup on local Fedora desktop guide)
     *   Make sure ZooKeeper is running on port 2181. If not, stop HBase, change $HBASE_HOME/conf/hbase-site.xml and restart HBase



Copying JARS to HADOOP_CLASSPATH

Before you query tables, copy these jars from $HIVE_HOME/lib to $HADOOP_HOME/lib:

  1.  Make sure zookeeper-3.4.3.jar is not there
 *   ls -latr   $HADOOP_HOME/lib/zookeeper-3.4.3.jar
  2.  Copy zookeeper-3.4.3.jar
 *   sudo cp -av $HIVE_HOME/zookeeper-3.4.3.jar $HADOOP_HOME/lib
  3.  Make sure hive-common-0.9.0.jar is not there
 *   ls -latr   $HADOOP_HOME/lib/hive-common-0.9.0.jar
  4.  Copy hive-common-0.9.0.jar
 *   sudo cp -av $HIVE_HOME/hive-common-0.9.0.jar $HADOOP_HOME/lib
  5.  Make sure hive-hbase-handler-0.9.0.jar is not there
 *   ls -latr   $HADOOP_HOME/lib/hive-hbase-handler-0.9.0.jar
  6.  Copy hive-hbase-handler-0.9.0.jar
 *   sudo cp -av $HIVE_HOME/hive-hbase-handler-0.9.0.jar $HADOOP_HOME/lib
  7.  Exit from Hive Shell (type exit;)

  8.  Exit from HBase shell

  9.  Stop Hbase
 *   $HBASE_HOME/bin/stop-hbase.sh
  10. Stop Hadoop/HDFS
 *   $HADOOP_HOME/bin/stop-all.sh
  11. Check that NO java processes related to Hadoop/HDFS/HBase/Hive exist
 *   ps auxw | grep java
  12. Start Hadoop/HDFS
 *   $HADOOP_HOME/bin/start-all.sh
  13. Start Hbase
 *   $HBASE_HOME/bin/start-hbase.sh
  14. Check ALL  java processes related to Hadoop/HDFS/Hbase/Hive exist
 *   ps auxw | grep java



Create tables in HBase

  *   Refer Hbase setup on local Fedora desktop 
guide
 and create the tables mentioned there
 *   hbase_2_hive_food
 *   hbase_2_hive_names

Create tables in HIVE

To run Hive type

$HIVE_HOME/bin/hive

This will take you to Hive shell. In the shell, create these two tables

  *   CREATE EXTERNAL TABLE hbase_hive_names(hbid INT, id INT, fn STRING, ln STRING, age INT) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,id:id,name:fn,name:ln,age:age") TBLPROPERTIES("hbase.table.name" = "hbase_2_hive_names");
     *   This HIVE table will map to the HBase table hbase_2_hive_names
  *   CREATE EXTERNAL TABLE hbase_hive_food(hbid INT, id INT, name STRING) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,id:id,name:name") TBLPROPERTIES("hbase.table.name" = "hbase_2_hive_food");
     *   This HIVE table will map to the HBase table hbase_2_hive_food


Creating & Loading tables in HBase through Hive


  *   Make sure there is no table in HBase called 'hive2hbase_names_table'
  *   In Hive shell
     *   CREATE TABLE hive2hbase_names_table (hb_id int, fn string, ln string, age_dnq INT) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,student:fn,student:ln,student:age") TBLPROPERTIES ("hbase.table.name" = "hive2hbase_names_table");
  *   Go to HBase shell
     *   Check that table hive2hbase_names_table is created.
  *   In Hive shell
     *   Create a Hive table and populate it with data, which we will use to populate the Hive-HBase table
     *   CREATE TABLE names_tab (hb_id int, fn string, ln string, age_dnq INT) PARTITIONED BY (age INT) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';
     *   LOAD DATA LOCAL INPATH '/data/mycode/impressions/inputfiles/names1.tsv.4fields' OVERWRITE INTO TABLE names_tab PARTITION (age=60);
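The excerpt ends before the final populate step. Presumably the staging table is then copied into the HBase-backed table with something along these lines (an editorial sketch under that assumption, not from the original guide):

```sql
-- sketch: push rows from the staging table into the HBase-backed table
INSERT OVERWRITE TABLE hive2hbase_names_table
SELECT hb_id, fn, ln, age_dnq FROM names_tab WHERE age = 60;
```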
 

Re: Hive CLI

2013-07-09 Thread Sanjay Subramanian
Hi Rahul

Is there a reason why you use the Hive CLI?

I have aliases defined that I use, so I never had to use Hive CLI again

alias hivescript='hive -e '
alias hivescriptd='hive -hiveconf hive.root.logger=INFO,console -e '

So when I want to run hive commands from Linux  I just type

hivescript "select colA, colB, colC from TableN where partitionA='somval1' 
limit 100"

Good luck

sanjay




From: rahul kavale <kavale.ra...@gmail.com>
Reply-To: "user@hive.apache.org" <user@hive.apache.org>
Date: Monday, July 8, 2013 11:26 PM
To: "user@hive.apache.org" <user@hive.apache.org>
Subject: Re: Hive CLI
Subject: Re: Hive CLI

Sorry, my bad: Ctrl-a was a screen key binding, which is why I was not able to
use it to move the cursor. This had nothing to do with Hive or its
configuration.

Thanks & Regards,
Rahul


On 9 July 2013 11:00, rahul kavale 
mailto:kavale.ra...@gmail.com>> wrote:
Hey there,
I have been using HIVE(0.7) for a while now using CLI and bash scripts.
But it's a pain to move the cursor in the CLI, i.e. once you enter a very long
query you can't go back to the start of it (like you can with Ctrl+A/Ctrl+E in
the terminal). Does anyone know how to do it?

Thanks & Regards,
Rahul




Re: start HWI error

2013-07-09 Thread SF Hadoop
Change this:

 /usr/lib/hive/lib/hive-hwi-0.7.1-cdh3u4.war

To this:

 lib/hive-hwi-0.7.1-cdh3u4.war

sf


On Monday, July 8, 2013, ch huang wrote:

> i am testing the hive web interface, can anyone help? thanks
> content in hive-site.xml
>
> <property>
>   <name>hive.hwi.war.file</name>
>   <value>/usr/lib/hive/lib/hive-hwi-0.7.1-cdh3u4.war</value>
>   <description>This is the WAR file with the jsp content for Hive Web Interface</description>
> </property>
>
> # export ANT_LIB=/usr/share/ant/lib
>
> # hive --service hwi
> 13/07/09 13:19:40 INFO hwi.HWIServer: HWI is starting up
> 13/07/09 13:19:41 FATAL hwi.HWIServer: HWI WAR file not found at
> /usr/lib/hive/lib/hive-hwi-0.7.1-cdh3u4.war
>
> # ls /usr/lib/hive/lib/hive-hwi-0.7.1-cdh3u4.war
> /usr/lib/hive/lib/hive-hwi-0.7.1-cdh3u4.war
>


Problem with Upgrading hive

2013-07-09 Thread pradeep T
Hi ,

We are currently using Hive version 0.7.

Our current Hadoop version is 0.20.2 (CDH3). We have successfully upgraded
Hadoop from 0.20.2 to 1.1.2.

But after upgrading Hadoop, our Hive 0.7 had some compatibility problems
with the latest Hadoop 1.1.2.

So we thought of upgrading Hive. After some research on the net I could only
find this document to upgrade from 0.7 to 0.8:

http://docs.aws.amazon.com/ElasticMapReduce/latest/DeveloperGuide/emr-hive-upgrade-metastore-mysql.html

The upgrade procedure is to run a few SQL scripts on the MySQL metastore
DB. We are using MySQL to store our metadata.

I ran the script to upgrade from version 0.7 to 0.8.

Then from 0.8 to 0.9.

Then from 0.9 to 0.10.

After that I took the upgraded SQL dump, loaded it into a newly created DB,
and pointed that as the metastore location.

Then I restarted the Hive server and client. But after restarting, when I
run the "show tables;" command, I see none of my old tables.

When I create a new table, it gets created in the configured HDFS path, but
I cannot see that new table in the MySQL metastore DB's "TBLS" table either.

Logging into MySQL directly, after selecting the required DB, "select * from
TBLS;" shows all my old tables but not the new one I created after the
upgrade.

My questions are :

1) What should I do to make all my old tables get reflected in Hive?

2) Where is my new table getting created, and which metastore is it using to
store new tables?


Thanks,
Pradeep
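A hedged guess at the symptom above (not from the thread itself): when new tables show up in neither the old TBLS nor the upgraded one, the Hive client has often fallen back to an embedded Derby metastore (a metastore_db directory created in the working directory) rather than the intended MySQL database. It would be worth confirming that hive-site.xml on both server and client points at the new DB, e.g.:

```xml
<!-- dbhost and new_metastore_db are placeholders, not values from the thread -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://dbhost/new_metastore_db</value>
</property>
```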


Re: Hive - UDF

2013-07-09 Thread Dean Wampler
The problem might be Java's limitation of a single public top-level class per
file (as opposed to classes nested within a top-level class). You would have
to nest your UDFs inside such a top-level class. That would work fine, but
when you define a TEMPORARY FUNCTION in Hive, I don't know whether the syntax
supports referencing nested UDFs within a class. Perhaps someone else can
comment, or you could do the experiment yourself and see ;) If it doesn't
work, you'll just refactor the code into separate files.

Actually, since you should build a jar with your UDFs anyway, it probably
doesn't matter much how many files you have, other than the slight
inconvenience of managing more than one.

dean

On Tue, Jul 9, 2013 at 7:49 AM, Manickam P  wrote:

> Hi,
>
> Can we write more than one function, like to_upper and to_lower, in the same
> UDF? Or do we need to write a separate UDF for each?
> Please let me know.
>
>
>
> Thanks,
> Manickam P
>



-- 
Dean Wampler, Ph.D.
@deanwampler
http://polyglotprogramming.com
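Following Dean's suggestion, here is a minimal plain-Java sketch of two functions as static nested classes in one file. It deliberately omits the Hive dependency so it stays self-contained: real UDFs would extend org.apache.hadoop.hive.ql.exec.UDF and the class/function names here are invented for illustration. It also shows the '$' binary-name form that a CREATE TEMPORARY FUNCTION declaration would need to reference a nested class.

```java
public class NestedNameDemo {
    // Two "UDF-like" evaluate methods, nested in one top-level class.
    public static class ToUpper {
        public String evaluate(String s) { return s == null ? null : s.toUpperCase(); }
    }
    public static class ToLower {
        public String evaluate(String s) { return s == null ? null : s.toLowerCase(); }
    }

    public static void main(String[] args) throws Exception {
        // Nested classes are addressed by their binary name with '$', which is
        // the form a declaration like
        //   CREATE TEMPORARY FUNCTION to_upper AS 'NestedNameDemo$ToUpper';
        // would have to use.
        Class<?> c = Class.forName("NestedNameDemo$ToUpper");
        System.out.println(c.getSimpleName());                 // prints: ToUpper
        System.out.println(new ToUpper().evaluate("hive"));    // prints: HIVE
        System.out.println(new ToLower().evaluate("HIVE"));    // prints: hive
    }
}
```

If Hive's parser rejects the '$' form, the fallback is Dean's last point: one top-level UDF class per file, bundled into the same jar.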


Re: Hive - UDF

2013-07-09 Thread Nitin Pawar
in same UDF or in same jar?




On Tue, Jul 9, 2013 at 6:19 PM, Manickam P  wrote:

> Hi,
>
> Can we write more than one function, like to_upper and to_lower, in the same
> UDF? Or do we need to write a separate UDF for each?
> Please let me know.
>
>
>
> Thanks,
> Manickam P
>



-- 
Nitin Pawar


Hive - UDF

2013-07-09 Thread Manickam P
Hi,
Can we write more than one function, like to_upper and to_lower, in the same UDF?
Or do we need to write a separate UDF for each? Please let me know.



Thanks,
Manickam P

Re: Strange error in hive

2013-07-09 Thread Jérôme Verdier
Hi,

Thanks for your help.

You can see logs below :

java.lang.RuntimeException: Error in configuring object
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
        ... 9 more
Caused by: java.lang.RuntimeException: Error in configuring object
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
        at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
        ... 14 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
        ... 17 more
Caused by: java.lang.RuntimeException: Map operator initialization failed
        at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:121)
        ... 22 more
Caused by: java.lang.NullPointerException
        at org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector.toString(StructObjectInspector.java:64)
        at java.lang.String.valueOf(String.java:2826)
        at java.lang.StringBuilder.append(StringBuilder.java:115)
        at org.apache.hadoop.hive.ql.exec.UnionOperator.initializeOp(UnionOperator.java:110)
        at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
        at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:451)
        at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:407)
        at org.apache.hadoop.hive.ql.exec.TableScanOperator.initializeOp(TableScanOperator.java:186)
        at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
        at org.apache.hadoop.hive.ql.exec.MapOperator.initializeOp(MapOperator.java:563)
        at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
        at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:100)
        ... 22 more



2013/7/8 

> Hii Jerome
>
>
> Can you send the error log of the MapReduce task that failed? That should
> have some pointers which can help you troubleshoot the issue.
> Regards
> Bejoy KS
>
> Sent from remote device, Please excuse typos
> --
> From: Jérôme Verdier 
> Date: Mon, 8 Jul 2013 11:25:34 +0200
> To: 
> Reply-To: user@hive.apache.org
> Subject: Strange error in hive
>
> Hi everybody,
>
> I faced a strange error in hive this morning.
>
> The error message is this one :
>
> FAILED: Execution Error, return code 2 from
> org.apache.hadoop.hive.ql.exec.MapRedTask
>
> after a quick search on Google, it appears that this is a Hive bug :
>
> https://issues.apache.org/jira/browse/HIVE-4650
>
> Is there a way to pass through this error ?
>
> Thanks.
>
> NB : my hive script is in the attachment.
>
>
> --
> Jérôme VERDIER
> 06.72.19.17.31
> verdier.jerom...@gmail.com
>
>


-- 
Jérôme VERDIER
06.72.19.17.31
verdier.jerom...@gmail.com