jane.wayne2...@gmail.com wrote:
quick question:
i have a key that already implements WritableComparable. this will be the intermediate key passed from the map to the reducer.
is it necessary to extend RawComparator and set it on Job.setSortComparatorClass(Class<? extends RawComparator> cls)?
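As far as I understand it isn't required: when the key implements WritableComparable, the framework falls back to a generic WritableComparator that deserializes both keys and calls compareTo(); registering a RawComparator via setSortComparatorClass is an optional optimization that compares serialized bytes directly. A pure-Java sketch of the difference (illustrative only, not Hadoop's actual classes):

```java
import java.io.*;

// Two ways to order keys serialized as big-endian longs (the format DataOutput
// produces): deserialize-then-compare (what the default WritableComparator does
// with your compareTo) vs. comparing the raw bytes (what a RawComparator can do).
public class RawCompareSketch {
    // serialize one long the way DataOutput.writeLong does
    static byte[] ser(long v) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            new DataOutputStream(bos).writeLong(v);
            return bos.toByteArray();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // "default" path: deserialize both keys, then compare objects
    static int deserializing(byte[] a, byte[] b) {
        try {
            long x = new DataInputStream(new ByteArrayInputStream(a)).readLong();
            long y = new DataInputStream(new ByteArrayInputStream(b)).readLong();
            return Long.compare(x, y);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // "raw" path: unsigned lexicographic byte compare; for non-negative
    // big-endian longs this yields the same order with no object creation
    static int raw(byte[] a, byte[] b) {
        for (int i = 0; i < 8; i++) {
            int d = (a[i] & 0xff) - (b[i] & 0xff);
            if (d != 0) return d;
        }
        return 0;
    }

    public static void main(String[] args) {
        byte[] a = ser(3L), b = ser(300L);
        System.out.println(deserializing(a, b) < 0);  // true
        System.out.println(raw(a, b) < 0);            // true: same ordering
    }
}
```

The raw path only works when the byte encoding sorts the same way the objects do, which is why it is opt-in.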
All,
Here is what's happening. I have implemented my own WritableComparable
keys
and values.
Inside a reducer I am seeing 'reduce' being invoked with the same key
_twice_.
I have checked that context.getKeyComparator() and
context.getSortComparator() are both WritableComparator which
The Hadoop framework reuses Writable objects for key and value arguments,
so if your code stores a pointer to that object instead of copying it you
can find yourself with mysterious duplicate objects. This has tripped me
up a number of times. Details on what exactly I encountered and how I fixed
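The reuse pitfall described above can be reproduced in plain Java without Hadoop at all; the Holder class below is a hypothetical stand-in for a reused Writable:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrates the Writable-reuse pitfall: the framework refills ONE object per
// call, so storing the reference keeps only the last value it ever held.
public class ReusePitfall {
    static class Holder { int value; }

    // returns the values seen via stored *references* to the reused object
    static List<Integer> viaReferences(int[] inputs) {
        Holder reused = new Holder();              // stand-in for the reused Writable
        List<Holder> stored = new ArrayList<>();
        for (int v : inputs) {
            reused.value = v;                      // framework overwrites in place
            stored.add(reused);                    // aliasing: every slot is the same object
        }
        List<Integer> seen = new ArrayList<>();
        for (Holder h : stored) seen.add(h.value); // all slots show the LAST value
        return seen;
    }

    // the fix: copy the payload out before storing it
    static List<Integer> viaCopies(int[] inputs) {
        Holder reused = new Holder();
        List<Integer> stored = new ArrayList<>();
        for (int v : inputs) {
            reused.value = v;
            stored.add(reused.value);              // defensive copy of the contents
        }
        return stored;
    }

    public static void main(String[] args) {
        System.out.println(viaReferences(new int[]{1, 2, 3})); // [3, 3, 3]
        System.out.println(viaCopies(new int[]{1, 2, 3}));     // [1, 2, 3]
    }
}
```

In a real reducer the copy is typically made with the Writable's copy constructor (e.g. new Text(value)) rather than a field-by-field copy.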
I have properly implemented hashCode(), equals(), and the WritableComparable methods.
Also, not surprisingly, when I use 1 reduce task the output is correct.
On Tue, Jan 10, 2012 at 5:58 PM, W.P. McNeill bill...@gmail.com wrote:
The Hadoop framework reuses Writable objects
and String values.
The job map output is consistent, but the reduce input groups and values for the key vary from one job to the next on the same input. It's like it isn't properly comparing and partitioning the keys.
I have properly implemented hashCode(), equals(), and the WritableComparable methods.
Thanks for all the help on this issue. It turned out to be a very simple
problem with my 'compareTo' implementation.
The ordering was symmetric but _not_ transitive.
stan
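For the archive, a symmetric-but-not-transitive compareTo like the one Stan describes can be reproduced in a few lines; the epsilon-equality comparator here is a hypothetical stand-in for the actual bug:

```java
// A compareTo that treats "close enough" values as equal is symmetric but NOT
// transitive: 0.0 ~ 0.6 and 0.6 ~ 1.2, yet 0.0 < 1.2. Sorting with such a
// comparator is undefined behavior, which shows up as duplicate reduce keys.
public class NonTransitive {
    static final double EPS = 0.7;

    static int broken(double a, double b) {
        if (Math.abs(a - b) < EPS) return 0;  // symmetric, but breaks transitivity
        return Double.compare(a, b);
    }

    static int fixed(double a, double b) {
        return Double.compare(a, b);          // a total order: transitive
    }

    public static void main(String[] args) {
        System.out.println(broken(0.0, 0.6)); // 0   ("equal")
        System.out.println(broken(0.6, 1.2)); // 0   ("equal")
        System.out.println(broken(0.0, 1.2)); // -1  (suddenly "less than")
    }
}
```

A quick sanity check for any custom key: for a handful of sample keys, assert that compareTo is reflexive, symmetric (sgn(a,b) == -sgn(b,a)), and transitive, as the Comparable contract requires.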
On Tue, Aug 16, 2011 at 4:47 PM, Chris White chriswhite...@gmail.com wrote:
Can you copy the contents of your parent
On Tue, Aug 16, 2011 at 6:14 AM, Chris White chriswhite...@gmail.com wrote:
Are you using a hash partitioner? If so, make sure the hash value of the writable is not calculated using the hashCode value of the enum - use the ordinal value instead. The hashCode value of an enum is different for each JVM.
Can you copy the contents of your parent Writable readFields and write methods (not the ones you've already posted)?
Another thing you could try: if you know you have two identical keys, can you write a unit test to examine the result of compareTo for two instances to confirm the correct behavior?
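To make Chris's enum point concrete, here is a hedged sketch of a stable hash for a hypothetical key with an enum field (the Color enum and field layout are made up for illustration):

```java
// Enum.hashCode() is the identity hash and differs between JVM runs, so a
// HashPartitioner keyed on it can send the same key to different reducers in
// different tasks. ordinal() (or name().hashCode()) is stable across JVMs,
// as are the spec-defined hashes of Long and String.
public class StableHash {
    enum Color { RED, GREEN, BLUE }   // hypothetical enum field of the key

    static int stableHash(Color c, long timestamp) {
        int h = c.ordinal();                    // stable across JVMs
        h = 31 * h + Long.hashCode(timestamp);  // Long.hashCode is spec-defined
        return h;
    }

    public static void main(String[] args) {
        // BLUE.ordinal() == 2, Long.hashCode(42L) == 42, so 31*2 + 42
        System.out.println(stableHash(Color.BLUE, 42L));
    }
}
```

The same rule applies to any hashCode used for partitioning: every ingredient must be deterministic across JVMs, never identity-based.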
Hi Folks,
After much poking around I am still unable to determine why I am seeing 'reduce' being called twice with the same key.
Recall from my previous email that sameness is determined by 'compareTo' of my custom key type.
AFAIK, the default WritableComparator invokes 'compareTo' for any two keys
Does your compareTo() method test object pointer equality? If so, you could
be getting burned by Hadoop reusing Writable objects.
-Joey
On Aug 14, 2011 9:20 PM, Stan Rosenberg srosenb...@proclivitysystems.com
wrote:
Hi Folks,
After much poking around I am still unable to determine why I am
On Sun, Aug 14, 2011 at 9:33 PM, Joey Echeverria j...@cloudera.com wrote:
Does your compareTo() method test object pointer equality? If so, you could
be getting burned by Hadoop reusing Writable objects.
Yes, but only the equality between enum values. Interestingly, when
'reduce' is called
On Sun, Aug 14, 2011 at 10:25 PM, Joey Echeverria j...@cloudera.com wrote:
What are the types of key1 and key2? What does the readFields() method
look like?
The type of key1 is essentially a wrapper for java.util.UUID.
Here is its readFields:
public void readFields(DataInput in) throws
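Stan's readFields is cut off above; here is a hedged sketch of how a UUID wrapper is typically serialized (plain java.io, method names matching the Writable contract; the roundTrip helper is mine, not from the thread):

```java
import java.io.*;
import java.util.UUID;

// A UUID is two longs, so the natural Writable-style serialization writes and
// reads them in a fixed order. In Hadoop this class would also implement
// WritableComparable; here only the serialization is sketched with java.io.
public class UuidKey {
    private UUID uuid;

    public void write(DataOutput out) throws IOException {
        out.writeLong(uuid.getMostSignificantBits());
        out.writeLong(uuid.getLeastSignificantBits());
    }

    public void readFields(DataInput in) throws IOException {
        long msb = in.readLong();   // must read in the same order as write()
        long lsb = in.readLong();
        uuid = new UUID(msb, lsb);  // replace ALL state: no stale fields may
                                    // survive when the framework reuses the object
    }

    // serialize-then-deserialize helper so the round trip is easy to check
    static UUID roundTrip(UUID original) {
        try {
            UuidKey k = new UuidKey();
            k.uuid = original;
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            k.write(new DataOutputStream(bos));
            UuidKey back = new UuidKey();
            back.readFields(new DataInputStream(new ByteArrayInputStream(bos.toByteArray())));
            return back.uuid;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        UUID u = UUID.fromString("123e4567-e89b-12d3-a456-426614174000");
        System.out.println(roundTrip(u).equals(u));  // true
    }
}
```

A readFields that fails to overwrite every field is a classic source of the duplicate-key symptom, because leftover state from the previously deserialized key leaks into the next one.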
On Thu, Nov 11, 2010 at 4:29 PM, Aaron Baff aaron.b...@telescope.tv wrote:
I'm having a problem with a custom WritableComparable that I created to use as a Key object. I basically have a number of identifiers with a timestamp, and I'm wanting to group the identifiers together in the reducer
I'm having a problem with a custom WritableComparable that I created to use as a Key object. I basically have a number of identifiers with a timestamp, and I'm wanting to group the identifiers together in the reducer, and order the records by the timestamp (oldest to newest). When I used
writablecomparable
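The usual answer to "group by identifier, order by timestamp" is a composite key plus secondary sort: compareTo orders on both fields, while grouping compares only the identifier (in Hadoop, via a separate comparator set with Job.setGroupingComparatorClass). A hedged plain-Java sketch of the key's ordering; EventKey and its fields are illustrative, not from the thread:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Composite key for secondary sort: primary order by identifier, secondary
// order by timestamp (oldest first). A grouping comparator would compare
// only `id`, so one reduce call sees all timestamps for an identifier in order.
public class EventKey implements Comparable<EventKey> {
    final String id;   // the identifier to group on
    final long ts;     // the timestamp to order on

    EventKey(String id, long ts) { this.id = id; this.ts = ts; }

    @Override public int compareTo(EventKey o) {
        int c = id.compareTo(o.id);                  // primary: identifier
        return c != 0 ? c : Long.compare(ts, o.ts);  // secondary: oldest to newest
    }

    @Override public String toString() { return id + "@" + ts; }

    public static void main(String[] args) {
        List<EventKey> keys = new ArrayList<>(List.of(
            new EventKey("b", 5), new EventKey("a", 9), new EventKey("a", 1)));
        Collections.sort(keys);
        System.out.println(keys);  // [a@1, a@9, b@5]
    }
}
```

In the real job, hashCode (used by the partitioner) must also depend only on `id`, so that all records for one identifier land on the same reducer.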
The code for Cloud9 is available on GitHub.
On Thu, Jul 1, 2010 at 12:27 AM, Oded Rotem oded.rotem...@gmail.com wrote:
Thanks. I've seen this site, but there's only discussion of it there, not an implementation...
Look here (and navigate around to org.json for the specifics):
http
Interesting:
http://www.umiacs.umd.edu/~jimmylin/Cloud9/docs/content/data-types.html
You need to define your own custom comparator.
On Tue, Jun 29, 2010 at 10:41 PM, Oded Rotem oded.rotem...@gmail.com wrote:
Hi,
Is there a json writablecomparable implementation anywhere?
Thanks,
Oded
Thanks. I've seen this site, but there's only discussion of it there, not an implementation...
-Original Message-
From: Ted Yu [mailto:yuzhih...@gmail.com]
Sent: Wednesday, June 30, 2010 7:03 PM
To: common-user@hadoop.apache.org
Subject: Re: json writablecomparable
Interesting:
http://www.umiacs.umd.edu/~jimmylin/Cloud9/docs/content/data-types.html
https://github.com/lintool/Cloud9/blob/master/src/dist/edu/umd/cloud9/io/JSONObjectWritable.java
...@gmail.com]
Sent: Thursday, September 17, 2009 11:17 AM
To: common-user@hadoop.apache.org
Subject: reg : my OutputFormat class cannot recognize my WritableComparable class
Hi,
I'm writing my own OutputFormat class and I faced the following errors.
-- public class KeyDoc implements