Hive queries returning all NULL values.

2014-08-17 Thread Tor Ivry
Hi I have a hive (0.11) table with the following create syntax: CREATE EXTERNAL TABLE events( … ) PARTITIONED BY(dt string) ROW FORMAT SERDE 'parquet.hive.serde.ParquetHiveSerDe' STORED AS INPUTFORMAT parquet.hive.DeprecatedParquetInputFormat OUTPUTFORMAT
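The truncated DDL above, filled out as a sketch of Hive 0.11's external Parquet setup (the column names, types, LOCATION, and the OUTPUTFORMAT class are assumptions, not taken from the thread):

```sql
-- Sketch only: columns and LOCATION are hypothetical placeholders.
CREATE EXTERNAL TABLE events (
  event_id   BIGINT,
  event_name STRING
)
PARTITIONED BY (dt STRING)
ROW FORMAT SERDE 'parquet.hive.serde.ParquetHiveSerDe'
STORED AS
  INPUTFORMAT  'parquet.hive.DeprecatedParquetInputFormat'
  OUTPUTFORMAT 'parquet.hive.DeprecatedParquetOutputFormat'
LOCATION '/path/to/events';
```

A table defined this way returns all-NULL rows when the declared column names or types do not line up with the schema inside the Parquet files, which is the theme of the replies that follow.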

Re: Hive queries returning all NULL values.

2014-08-17 Thread hadoop hive
Hi, check the data types you provided while creating the external table; they should match the data in the files. Thanks Vikas Srivastava On Aug 17, 2014 7:07 PM, Tor Ivry tork...@gmail.com wrote: Hi I have a hive (0.11) table with the following create syntax: CREATE EXTERNAL TABLE
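One way to act on this advice, sketched in HiveQL (the table name `events` comes from the thread; the column name and partition value are illustrative): list the types Hive believes, then probe a suspect column on a small sample and compare against the schema your Parquet tooling reports for the files.

```sql
-- Show the column names/types as declared in the metastore.
DESCRIBE events;

-- Probe one column at a time on a small sample; a column whose declared
-- type mismatches the Parquet file schema typically comes back as NULL.
SELECT event_name FROM events WHERE dt = '2014-08-17' LIMIT 5;
```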

Re: Hive queries returning all NULL values.

2014-08-17 Thread Tor Ivry
Is there any way to debug this? We are talking about many fields here. How can I see which field has the mismatch? On Sun, Aug 17, 2014 at 4:30 PM, hadoop hive hadooph...@gmail.com wrote: Hi, You check the data type you have provided while creating external table, it should match with

Re: Hive queries returning all NULL values.

2014-08-17 Thread hadoop hive
Take a small set of data, like 2-5 lines, and insert it... After that you can try inserting the first 10 columns and then the next 10 until you find your problematic column On Aug 17, 2014 8:37 PM, Tor Ivry tork...@gmail.com wrote: Is there any way to debug this? We are talking about many fields here. How
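The column-bisection approach suggested above can be sketched like this (column names and the partition value are placeholders):

```sql
-- Probe columns in batches on a tiny sample; the batch that comes back
-- NULL contains the problematic column.
SELECT col01, col02, col03, col04, col05
FROM events WHERE dt = '2014-08-17' LIMIT 5;

SELECT col06, col07, col08, col09, col10
FROM events WHERE dt = '2014-08-17' LIMIT 5;

-- Repeat with smaller batches inside the failing group until a single
-- column is isolated.
```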

Re: Hive queries returning all NULL values.

2014-08-17 Thread Raymond Lau
Do your field names in your parquet files contain upper case letters by any chance, e.g. userName? Hive will not read the data of external tables whose field names are not completely lower case; it doesn't convert the casing properly in the case of external tables. On Aug 17, 2014 8:00 AM, hadoop hive
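To illustrate the casing pitfall (`userName` is the example from the message; the DDL around it is assumed): Hive lower-cases identifiers, so the declared column no longer matches the mixed-case name stored in the Parquet file schema, and the column reads back as NULL.

```sql
-- The file schema contains `userName`, but Hive stores and looks up the
-- column as `username`, so every value in it reads back as NULL:
CREATE EXTERNAL TABLE events_bad (username STRING)
ROW FORMAT SERDE 'parquet.hive.serde.ParquetHiveSerDe'
STORED AS
  INPUTFORMAT  'parquet.hive.DeprecatedParquetInputFormat'
  OUTPUTFORMAT 'parquet.hive.DeprecatedParquetOutputFormat'
LOCATION '/path/to/events';
-- The workaround in this era was to rewrite the files with all-lower-case
-- field names so the two sides agree.
```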

Re: SerDe errors

2014-08-17 Thread Charles Robertson
Hi Roberto, This got solved with the help of another user - the e-mails don't seem to have made it to the user list. There was a problem with the JSON SerDe: it didn't seem to like deserialising an object nested inside the main object. Changing to the Amazon SerDe fixed it. Thanks,

New lines causing new rows

2014-08-17 Thread Charles Robertson
Hi all, I am loading some data into a Hive table, and one of the fields contains text which I believe contains new line characters. I have a view which reads data from this table, and the new line characters appear to be starting new rows. Doing 'select * from [mytable] limit 10;' in the hive

Re: New lines causing new rows

2014-08-17 Thread Andre Araujo
Hi, Charles, What's the storage format for the raw data source? What's the definition of your view? On 18 August 2014 04:20, Charles Robertson charles.robert...@gmail.com wrote: HI all, I am loading some data into a Hive table, and one of the fields contains text which I believe contains
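A likely explanation, assuming the table is stored as plain text: Hive's default text format uses the newline character as the row terminator, so an embedded newline splits one record into several. A sketch of a load-time workaround (table and column names are made up for illustration):

```sql
-- Strip embedded newlines while the field values are still intact,
-- before they land in a newline-delimited text table.
INSERT OVERWRITE TABLE mytable
SELECT id,
       regexp_replace(description, '\n|\r', ' ')
FROM staging_source;
```

Alternatively, storing the table in a container format such as SequenceFile or Parquet avoids the problem, since field values there are not delimiter-sensitive.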

why webhcat server listens to port 8080

2014-08-17 Thread no...@sina.cn
hi everyone: I installed Hive 0.13 and I found this configuration in HIVE_HOME/hcatalog/etc/webhcat/webhcat-default.xml: <property><name>templeton.port</name><value>50111</value></property> but when I start the WebHCat server, it listens to port 8080. no...@sina.cn
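webhcat-default.xml only documents the defaults; site-level configuration overrides it. A place to look, assuming a standard layout, is webhcat-site.xml, which may carry an overriding entry such as:

```xml
<!-- In webhcat-site.xml: an override like this (possibly written by a
     management tool) would explain the server binding 8080 instead of
     the documented default 50111. -->
<property>
  <name>templeton.port</name>
  <value>8080</value>
</property>
```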

CDH4.5 HiveServer2 InterruptedException

2014-08-17 Thread Ji ZHANG
Hi, I'm using CDH4.5 and its built-in HiveServer2. Sometimes it throws the following exception, and the job cannot be submitted: 2014-08-18 09:16:33,346 INFO org.apache.hadoop.hive.ql.exec.ExecDriver: Making Temp Directory: