Re: create a hive table: always a tab space before each line

2013-01-09 Thread Richard
more information:

if I set the format as textfile, there is no tab.
if I set the format as sequencefile and view the content via hadoop fs -text, I see a tab at the head of each line.


At 2013-01-09 15:44:00, Richard codemon...@163.com wrote:

hi there

I have a problem creating a hive table.
no matter what field delimiter I use, I always get a tab at the head of each line (a line is a record).
something like this:
\t f1 \001 f2 \001 f3 ...
where f1, f2, f3 denote the field values and \001 is the field separator.


here is the clause I used:

create external table if not exists ${HIVETBL_my_table}
(
    nid string,
    userid string,
    spv bigint,
    sipv bigint,
    pay bigint,
    spay bigint,
    ipv bigint,
    sellerid string,
    cate string
)
partitioned by (ds string)
row format delimited fields terminated by '\001' lines terminated by '\n'
stored as sequencefile
location '${HADOOP_PATH_4_MY_HIVE}/${HIVETBL_my_table}';


thanks for the help.


Richard




Re: create a hive table: always a tab space before each line

2013-01-09 Thread Nitin Pawar
you may want to look at the SequenceFile format:
http://my.safaribooksonline.com/book/databases/hadoop/9780596521974/file-based-data-structures/id3555432

that tab separates the key from the value in the record (I may be wrong, but this is how I interpreted it)
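That interpretation can be illustrated without Hadoop at all. A minimal sketch (an assumption about how hadoop fs -text formats its output, not Hadoop code): each SequenceFile record is printed as key, a literal tab, then value; Hive writes the records with an empty key and puts the ^A-delimited fields in the value, so every printed line starts with a tab.

```shell
# Sketch: mimic the key<TAB>value line format of 'hadoop fs -text'.
# Hive's SequenceFile records carry an empty key, so the rendered
# line begins with the tab Richard observed.
SEP=$(printf '\001')                       # Hive's default field separator
render_record() { printf '%s\t%s' "$1" "$2"; }
LINE=$(render_record "" "f1${SEP}f2${SEP}f3")
printf '%s\n' "$LINE" | od -c | head -n 2  # the leading \t is visible here
```

With a textfile table there is no key/value framing, which would explain why the tab disappears in that case.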


On Wed, Jan 9, 2013 at 12:49 AM, Richard codemon...@163.com wrote:

-- 
Nitin Pawar


Re: create a hive table: always a tab space before each line

2013-01-09 Thread Anurag Tangri
Hi Richard,
You should set the format in the create external table command based on the format of your data on HDFS.

Is your data a text file or a seq file on HDFS?

Thanks,
Anurag Tangri

Sent from my iPhone

On Jan 9, 2013, at 12:49 AM, Richard  codemon...@163.com wrote:



Re: Re: create a hive table: always a tab space before each line

2013-01-09 Thread Richard
I am creating the table and then populating it with insert overwrite, so the data is generated by Hive itself.






At 2013-01-09 17:17:06,Anurag Tangri tangri.anu...@gmail.com wrote:



Find out what's causing an InvalidOperationException

2013-01-09 Thread Krishna Rao
Hi all,

On running a statement of the form INSERT INTO TABLE tbl1 PARTITION(p1)
SELECT x1 FROM tbl2, I get the following error:

Failed with exception java.lang.ClassCastException:
org.apache.hadoop.hive.metastore.api.InvalidOperationException cannot be
cast to java.lang.RuntimeException

How can I find out what is causing this error? I.e. which logs should I
look at?

Cheers,

Krishna


Re: Find out what's causing an InvalidOperationException

2013-01-09 Thread Nitin Pawar
can you give the table definitions of both tables?

are the columns of the same type?


On Wed, Jan 9, 2013 at 5:15 AM, Krishna Rao krishnanj...@gmail.com wrote:


-- 
Nitin Pawar


Re: Find out what's causing an InvalidOperationException

2013-01-09 Thread Krishna Rao
The data types are the same. In fact, the statement works the first time, but not the second (I changed a WHERE constraint to select different data).

I presume it is some invalid data, but is there any way to find a clue in a log file?
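One place to look (a sketch assuming the default CLI logging configuration; the real location is whatever hive-log4j.properties says): the Hive CLI writes its session log under /tmp/<user>/hive.log, and the full stack trace usually lands there rather than on the console.

```shell
# Sketch: search the default Hive CLI log for the full stack trace.
# Assumes default hive-log4j.properties; adjust LOG if hive.log.dir differs.
LOG="/tmp/${USER:-$(whoami)}/hive.log"
if [ -f "$LOG" ]; then
  # Show some context around the exception
  grep -n -B 2 -A 20 "InvalidOperationException" "$LOG" | tail -n 50
else
  echo "no hive.log at $LOG - check hive.log.dir in hive-log4j.properties"
fi
```

The metastore side may also log the underlying InvalidOperationException if you run a standalone metastore service.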


On 9 January 2013 13:21, Nitin Pawar nitinpawar...@gmail.com wrote:




Re: FAILED: Error in metadata: MetaException(message:org.apache.hadoop.hbase.MasterNotRunningException: in HBase+Hive integration

2013-01-09 Thread Ted Reynolds
Hi Sagar,

If you do a jps after the error shows, do you still see HMaster in the
list?  It is possible that the Master came up for a bit and then died.
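A quick way to check that (a sketch; assumes jps is on the PATH):

```shell
# Sketch: check whether an HMaster JVM is still alive according to jps.
if jps 2>/dev/null | grep -q HMaster; then
  STATUS="running"
else
  STATUS="not running"
fi
echo "HMaster is $STATUS"
```

If it is not running, the HBase master log should say why it died.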

Ted.



On Tue, Jan 8, 2013 at 5:24 PM, sagar nikam sagarnikam...@gmail.com wrote:

 MasterNotRunningException


Hive performance exception

2013-01-09 Thread 窦晓峰

Hello, everyone:

I am a newbie to hive. I have a bash shell script which runs multiple hive scripts (more than 16*31) simultaneously to export data to text files. When I run this shell script, the following exception occurs; if I run the hive scripts one by one, all are OK.

In the hive scripts, I must create a temp table for some reason.

FAILED: Error in metadata: javax.jdo.JDODataStoreException: Exception thrown obtaining schema column information from data store
NestedThrowables:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'hivedb1.DELETEME1356611646373' doesn't exist
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

ENV:
  Hive: 0.90
  Metastore: MySQL 5.5.30

I have googled and found that the following settings may fix the exception, but I don't know whether I can still create and drop tables in hive scripts with these settings.

<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>false</value>
</property>
<property>
  <name>datanucleus.fixedDatastore</name>
  <value>true</value>
</property>
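Since the failure only shows up when the scripts run simultaneously, one possible workaround (a sketch; the script names here are placeholders) is to bound the parallelism rather than launching all 16*31 exports at once, e.g. with xargs:

```shell
# Sketch: run export scripts with at most 2 concurrent hive processes.
# Script names are placeholders; 'echo' is kept in front of 'hive' so the
# commands are only printed - drop it to actually run them.
printf '%s\n' export_01.hql export_02.hql export_03.hql \
  | xargs -P 2 -I{} echo hive -f {}
```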

The attachments are the hive configuration and the shell script.

Thanks a lot.



unicode character as delimiter

2013-01-09 Thread Ho Kenneth - kennho
Hi all,

I have an input file that has a unicode character as a delimiter: þ (thorn)

For example:

col1þcol2þcol3

þ has the UTF-8 (hex) value 0xC3 0xBE (c3be)

And I have tried the following, but no luck:
create table test(col1 string, col2 string, col3 string) row format delimited 
fields terminated by '\c3be';
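For context on why '\c3be' has no effect (my understanding; worth verifying): the delimited SerDe takes "fields terminated by" as a single byte, written as an octal escape like '\001', while þ occupies two bytes in UTF-8, so no single-byte escape can name it; a genuinely multi-byte delimiter usually needs something like RegexSerDe instead. A quick byte check (plain shell, just to show the widths):

```shell
# Sketch: thorn is one byte in Latin-1 (0xFE, octal \376) but two bytes
# in UTF-8 (0xC3 0xBE), so a one-byte delimiter escape cannot match it
# in UTF-8 encoded data.
THORN=$(printf '\303\276')                        # U+00FE as UTF-8 bytes
NBYTES=$(printf '%s' "$THORN" | wc -c | tr -d ' ')
echo "thorn is $NBYTES bytes in UTF-8"            # prints: thorn is 2 bytes in UTF-8
```

If the data were Latin-1 encoded instead, '\376' would name the delimiter as a single byte.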

I'd appreciate your help! Thanks in advance.

--ken

***
The information contained in this communication is confidential, is
intended only for the use of the recipient named above, and may be legally
privileged.

If the reader of this message is not the intended recipient, you are
hereby notified that any dissemination, distribution or copying of this
communication is strictly prohibited.

If you have received this communication in error, please resend this
communication to the sender and delete the original message or any copy
of it from your computer system.

Thank You.