Thanks Roberto for your answer.
It turns out that the class names for the two SerDes were the same... So I just
had to change the name and it works.
By the way thank you for your tutorial on your blog, it helped me a lot to
write my own serde.
Germain.
On 12 Oct 2012, at 16:39, Roberto Congiu
Without more specific info about the two SerDes I can't be sure, but most
likely there is a dependency conflict between them.
Sometimes, for instance, they may use a different version of log4j, and if
packaged with all dependencies they can indeed interfere with each other.
Have a loo
Hi everybody,
I am using Hive-0.9.0 and Hadoop-0.20.2. I have two customized SerDes:
custoSerde1.jar and custoSerde2.jar, added via ~/.hiverc.
In HDFS I have : /input/data_for_custoSerde1/
/input/data_for_custoSerde2/
When I run a query on my tableSerde1 it's w
Not sure about this bug; I don't have a test table right now on which I
can try this, or rather I don't want to try this on a table :-)
Did you lose the entire table? If your HDFS trash policy is the default 24
hours, you should be able to recover and run a quick load command to
restore the metadat
For both simplicity and efficiency, I'd recommend making the
mausummary table partitioned on date and generate the MAU data each
day. There is no reason to generate MAU data for a given day more than
once (unless you find some problems with the source data or
something).
On Fri, Oct 12, 2012 at 1:
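The daily-partition approach described above could be sketched in HiveQL roughly as follows. This is only an illustration: the table and column names (`mausummary`, `events`, `user_id`, `event_date`) and the 30-day MAU window are assumptions, not from the original thread.

```sql
-- Hypothetical sketch: a date-partitioned MAU summary table,
-- populated once per day rather than recomputed in full.
CREATE TABLE mausummary (
  mau BIGINT
)
PARTITIONED BY (summary_date STRING);

-- Each day, overwrite only that day's partition.
INSERT OVERWRITE TABLE mausummary PARTITION (summary_date = '2012-10-12')
SELECT COUNT(DISTINCT user_id)
FROM events
WHERE event_date <= '2012-10-12'
  AND event_date > date_sub('2012-10-12', 30);
```

Because each day lives in its own partition, a bad day can be regenerated in isolation by rerunning the INSERT OVERWRITE for just that partition.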
We are using Hive 8 as part of CDH4.0.1. We noticed a bug in Partition.
Say my Hive table is partitioned by mydate, and to drop a partition I mistype
the name of the partition column, like
Alter table mytable drop partition (yourdate='2012-09-10'); this will drop all
the mytable partitions (mydate
You just need to put the join condition in the WHERE clause. That way Hive
will do a cartesian product followed by a filter.
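A minimal sketch of that pattern, mirroring the comma-join style used elsewhere in this thread; the table and column names (`events`, `days`, `start_date`, `end_date`) are assumptions for illustration:

```sql
-- Hive's JOIN clause only accepts equality conditions, so the
-- range predicate goes in WHERE: Hive performs a cartesian
-- product of the two tables and then filters the result.
SELECT e.user_id, d.day
FROM events e, days d
WHERE e.event_date >= d.start_date
  AND e.event_date <= d.end_date;
```

Note that the cartesian product can be expensive; it is only practical when the dimension table (here `days`) is small.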
On Fri, Oct 12, 2012 at 1:02 PM, Tom Hubina wrote:
> I think I see what you're saying about the temp table with start/end dates
> (30x expansion makes sense) and it sounds
I think I see what you're saying about the temp table with start/end dates
(30x expansion makes sense) and it sounds like it should work. I just need
to figure out a good way to generate the table. Thanks!
Tom
On Wed, Oct 10, 2012 at 11:05 PM, Igor Tatarinov wrote:
> If you have a lot of data,
The problem is that "day" is the value in the for loop.
I've tried doing a join with a table that contains the set of days, but the
problem is that you can't do a join on a range... Hive only supports
equality in the join. For example:
INSERT OVERWRITE TABLE mausummary SELECT day, COUNT(DISTINCT(
Sadu,
I am using JSON as the input format, with the JSON SerDe from
https://github.com/rcongiu/Hive-JSON-Serde.
A sample JSON record is below (in actual use each JSON record must be on one
line only):
{
"field1":"hello",
"field2":123456,
"field3":1234.5678,
"field4":true,
"field5":{"field5a":"embe
Thanks All :-)
It's working now :-)
Regards
Yogesh Kumar
> CC: user@hive.apache.org
> From: abhishek.dod...@gmail.com
> Subject: Re: Need Help in Hive storage format
> Date: Fri, 12 Oct 2012 08:47:41 -0400
> To: user@hive.apache.org
>
> Hi Yogesh,
>
> Try this PigStorage(' \u0001');
>
> Than
Hi Yogesh,
Try this PigStorage(' \u0001');
Thanks
Abhi
Sent from my iPhone
On Oct 12, 2012, at 5:50 AM, MiaoMiao wrote:
> Hi Yogesh:
>
> I think this may help.
>
> https://pig.apache.org/docs/r0.10.0/api/org/apache/pig/builtin/PigStorage.html
>
> Miao
>
> On Fri, Oct 12, 2012 at 5:27 PM
Hi there,
We're having consistent problems creating extracts from Hive in Tableau.
We keep getting errors that state "Index column out of bounds".
I'm pretty sure it has nothing to do with Tableau, as the problem is
resolved by killing the Hive server and starting it again. Then it wor
Hi Praveen
If BETWEEN is not supported in your Hive version, you can replace it using
<= and >=, like:
SELECT * FROM account a, timezone g WHERE a.create_date >= g.start_date AND
a.create_date <= g.end_date;
Regards
Bejoy KS
Sent from handheld, please excuse typos.
-Original Message
Hi Yogesh:
I think this may help.
https://pig.apache.org/docs/r0.10.0/api/org/apache/pig/builtin/PigStorage.html
Miao
On Fri, Oct 12, 2012 at 5:27 PM, yogesh dhari wrote:
> Thanks Bejoy,
>
> Now I want to store these rows in Pig.
>
> like
>
> A = load '/Pig/00_0' using PigStorage()
> as
>
Thanks Bejoy,
Now I want to store these rows in Pig.
like
A = load '/Pig/00_0' using PigStorage()
as
(id:INT, name:chararray, ddate, prim, ignore, ignorecase, activat);
What should be in the delimiter into PigStorage( )?
I have tried PigStorage('/001') but it's showing errors.
What de
Hi Praveen,
It should work; the version I'm using is 0.9.0.
Thanks,
Suneel
On Friday, 12 October 2012, Praveenkumar Ch wrote:
> Hi, I am new to Hive and we have a requirement for converting Teradata
> queries to Hive queries. So I was successful converting them... so far,
> but now I have
Are you familiar with SQL?
I'm sure this manual will help. https://cwiki.apache.org/Hive/
On Fri, Oct 12, 2012 at 2:40 PM, Praveenkumar Ch
wrote:
> Hi, I am new to Hive and we have a requirement for converting Teradata
> queries to Hive queries. So I was successful converting them... so far,
> b