Hi,
I am able to parse the input JSON file and load it into Hive. I do not see
any errors with CREATE TABLE, so I am assuming that part worked. But when I
try to read the data, I get null:
hive> select * from jobs;
OK
null
I have validated the JSON with JSONLint and the Notepad++ JSON plugin and it
is valid JSON.
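One common cause of all-NULL rows with line-oriented JSON SerDes is a pretty-printed file: JSONLint accepts a multi-line document, but the SerDe reads one record per line. A minimal sketch (the function name is mine) that checks each line parses as a standalone JSON document:

```python
import json

def invalid_json_lines(lines):
    """Return 1-based indexes of lines that are not standalone JSON documents."""
    bad = []
    for lineno, line in enumerate(lines, start=1):
        stripped = line.strip()
        if not stripped:
            continue  # blank lines are harmless
        try:
            json.loads(stripped)
        except ValueError:
            bad.append(lineno)
    return bad
```

If a file that JSONLint calls valid still reports bad lines here, the records are probably spread over multiple lines and need to be compacted to one document per line.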
select * from table is as good as hdfs -cat.
Are you sure there is any data in the table?
On Tue, Jun 18, 2013 at 11:54 PM, Sunita Arvind sunitarv...@gmail.com wrote:
:58 PM, Sunita Arvind sunitarv...@gmail.com wrote:
I ran some complex queries. Something to the effect of
select jobs from jobs;
which triggers map reduce jobs but does not show errors and produces the
same output, null. If I try referencing the struct elements, I get an error
CPU Time Spent: 880 msec
OK
null
Time taken: 9.591 seconds
regards
Sunita
On Tue, Jun 18, 2013 at 9:35 PM, Sunita Arvind sunitarv...@gmail.com wrote:
Ok.
The data files are quite small. Around 35 KB and 1 KB each.
[sunita@node01 tables]$ hadoop fs -ls /user/sunita/tables/jobs
Found 1 items
!!
Richa
On Wed, Jun 19, 2013 at 7:28 AM, Sunita Arvind sunitarv...@gmail.com wrote:
Having a column name the same as the table name is a problem, due to which
I was not able to reference jobs.values.id from jobs. Changing the table
name to jobs1 resolved the semantic error.
However, the query
as a delimited serialization of what you want to
store and that's it. That would be another way to get your scalars, arrays
and structs to work.
Don't give up on the JsonSerDe yet, though! :) It's probably something very
easy that we just can't see.
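The delimited-serialization fallback Stephen mentions could be sketched like this: flatten each JSON record into a delimited text row that a plain `ROW FORMAT DELIMITED` table can read. The delimiters and function name here are illustrative choices, not anything from the thread:

```python
import json

FIELD_DELIM = "\t"   # matches FIELDS TERMINATED BY '\t' in the DDL
ARRAY_DELIM = ","    # matches COLLECTION ITEMS TERMINATED BY ','

def json_to_delimited(record, columns):
    """Serialize selected top-level keys of a JSON record as one delimited row.

    Lists are joined with ARRAY_DELIM so Hive can read them back as
    array<string>; missing keys become \\N, Hive's default NULL marker
    in text files."""
    out = []
    for col in columns:
        value = record.get(col)
        if isinstance(value, list):
            out.append(ARRAY_DELIM.join(str(v) for v in value))
        elif value is None:
            out.append("\\N")
        else:
            out.append(str(value))
    return FIELD_DELIM.join(out)
```

This sidesteps the SerDe entirely, at the cost of a preprocessing step and of deciding up front how nested structs get flattened into columns.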
On Wed, Jun 19, 2013 at 10:00 AM, Sunita Arvind
?
regards
Sunita
On Wed, Jun 19, 2013 at 3:29 PM, Sunita Arvind sunitarv...@gmail.com wrote:
Thanks Stephen,
Let me explore options. I will let you all know once I am successful.
regards
Sunita
On Wed, Jun 19, 2013 at 3:08 PM, Stephen Sprague sprag...@gmail.com wrote:
try_parsed_json
Your issue seems familiar. Try logging out of the Hive session and logging back in.
Sunita
On Wed, Jun 19, 2013 at 8:53 PM, Mohammad Tariq donta...@gmail.com wrote:
Hello list,
I have a Hive (0.9.0) setup on my Ubuntu box running hadoop-1.0.4.
Everything was going smoothly till now. But today
Hi Praveen / All,
I also have a requirement similar to the one explained (by Praveen) below:
distinct rows on a single column with corresponding data from other columns.
Hi,
I have written a script which generates JSON files, loads them into a
dictionary, adds a few attributes, and uploads the modified files to HDFS.
After the files are generated, if I perform a select * from ...; on the table
which points to this location, I get null, null as the result. I also
in a world of unstructured and un-clean data.
-Marcin
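The workflow Sunita describes (generate JSON, load into a dictionary, add a few attributes, upload) can be sketched roughly as below; the names are mine, and the key point is emitting each enriched record as a single compact line so a line-oriented JSON SerDe does not return NULL:

```python
import json

def add_attributes_and_serialize(records, extra):
    """Merge extra key/values into each record and emit one compact JSON
    document per line, the shape line-oriented Hive JSON SerDes expect."""
    lines = []
    for rec in records:
        enriched = dict(rec)      # copy so the input record is untouched
        enriched.update(extra)
        # separators=(",", ":") keeps each document on a single line,
        # with no pretty-printing
        lines.append(json.dumps(enriched, separators=(",", ":")))
    return "\n".join(lines) + "\n"
```

The resulting string can then be written to a local file and pushed with `hadoop fs -put` to the directory the external table points at.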
From: Sunita Arvind [mailto:sunitarv...@gmail.com]
Sent: Tuesday, July 30, 2013 11:00 AM
To: user@hive.apache.org
Subject: Hive Join with distinct rows
Have you tried restarting the cluster? Hive is a JVM process, and a hang in
the JVM may cause issues like this. I have experienced a similar issue;
however, I did not get any output at all. After typing the query and
hitting enter, the prompt never returned and neither did I see any counter
Hello,
I am using sqoop to import data from oracle into hive. Below is my SQL:
nohup sqoop import --connect jdbc:oracle:thin:@(DESCRIPTION = (ADDRESS =
(PROTOCOL = TCP)(HOST = xxx)(PORT = )) (CONNECT_DATA = (SERVER =
DEDICATED) (SERVICE_NAME = CDWQ.tms.toyota.com) (FAILOVER_MODE=
everything in the table (not feasible in most cases)
However, I still need to know how to get the exact stack trace.
regards
Sunita
On Mon, Nov 11, 2013 at 1:48 PM, Sunita Arvind sunitarv...@gmail.com wrote:
Hello,
I am using sqoop to import data from oracle into hive. Below is my SQL:
nohup sqoop
as that log
usually contains the entire exception, including all the chained exceptions.
Jarcec
Links:
1: http://sqoop.apache.org/mail-lists.html
On Mon, Nov 11, 2013 at 03:01:22PM -0800, Sunita Arvind wrote:
Just in case this acts as a workaround for someone:
The issue is resolved if I eliminate
Thanks David,
Very valuable input. Will update the group with my findings.
Regards
Sunita
On Monday, November 11, 2013, David Morel wrote:
On 12 Nov 2013, at 0:01, Sunita Arvind wrote:
Just in case this acts as a workaround for someone:
The issue is resolved if I eliminate the where
Hi All,
I am trying to load database listener logs into hive tables. I am using
Regex Serde from
https://repository.cloudera.com/artifactory/public/org/apache/hive/hive-contrib/0.10.0-cdh4.2.0-SNAPSHOT/hive-contrib-0.10.0-cdh4.2.0.jar
Below is my create table:
CREATE external TABLE
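When using the Regex SerDe, it can help to prototype the pattern outside Hive before wiring it into `input.regex`, since each capture group must line up with a declared column. A hedged sketch against a listener-log-style line (the pattern and field layout here are my guesses, not the actual DDL from this thread):

```python
import re

# Hypothetical pattern for lines like "24-JUN-2012 05:00:42 * <rest of entry>".
# Each group would map, in order, to a column in the CREATE TABLE.
LISTENER_LINE = re.compile(
    r"^(\d{2}-[A-Z]{3}-\d{4}) (\d{2}:\d{2}:\d{2}) \* (.*)$"
)

def parse_listener_line(line):
    """Return the captured fields, or None if the line does not match
    (the Regex SerDe would surface a non-match as NULL columns)."""
    m = LISTENER_LINE.match(line)
    return m.groups() if m else None
```

A line that returns None here would come back as NULLs from the table, which makes this a quick way to debug the regex before loading data.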
Hello Experts,
I am trying to write a UDF to parse a logline and provide the output in the
form of an array. Basically I want to be able to use LATERAL VIEW explode
subsequently to make it into columns.
This is how a typical log entry looks:
24-JUN-2012 05:00:42 *
Can someone please suggest if this is doable or not? Is a generic UDF the
only option? How would using a generic vs. a simple UDF make any difference,
since I would be returning the same object either way?
Thank you
Sunita
-- Forwarded message --
From: *Sunita Arvind* sunitarv
) and returns an array of structs, which is usually used in a
lateral view.
A good article on how to write a generic UDF is this one:
http://www.baynote.com/2012/11/a-word-from-the-engineers/
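The actual GenericUDF has to be written in Java with ObjectInspectors, but the evaluate logic it would implement can be sketched language-neutrally. Assuming the log fields are separated by " * " (my reading of the sample entry, not confirmed in the thread), the UDF would return something shaped like array<struct<pos:int,field:string>> for LATERAL VIEW explode:

```python
def explode_log_entry(line):
    """Split a '*'-separated log entry into a list of {pos, field} dicts,
    mimicking the array<struct<pos:int,field:string>> a generic UDF
    would return for use with LATERAL VIEW explode."""
    parts = [p.strip() for p in line.split("*") if p.strip()]
    return [{"pos": i, "field": p} for i, p in enumerate(parts)]
```

Each dict plays the role of one struct element; exploding the array in Hive would then turn each field into its own row, ready to pivot into columns.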
On Thu, Jan 30, 2014 at 7:06 AM, Sunita Arvind sunitarv...@gmail.com wrote:
Can someone please suggest