Dennis Kubes wrote:
Yes. This seems to have fixed the problem. All, do we want to create
a JIRA and commit this for the 0.9 release?
FYI, this looks like NUTCH-333:
http://issues.apache.org/jira/browse/NUTCH-333.
St.Ack
Dennis
Andrzej Bialecki wrote:
Doğacan Güney wrote:
Hi,
On
Hey Shay.
Some friendly advice: cross-posting a question will make you unpopular
fast. It's best to start on the most appropriate-seeming list and only
move on from there if you are getting no satisfaction. The question
below looks most at home over on the archive-access list. Let me
Stefan Groschupf wrote:
Hi,
I'm wondering why the plugins are in the job file, since it looks like
the plugins are never loaded from the job file but from the outside
(plugin folder).
Should they?
If you run your job jar on a pure Hadoop platform, there are no plugins
on local disk. The
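The point above (a job jar on a plain Hadoop node has no local plugin folder, so plugins bundled in the jar would have to be found via the classpath) can be sketched roughly as follows. This is a hypothetical illustration, not Nutch's actual plugin API; the class name, method name, and path conventions are all assumptions:

```java
import java.io.File;
import java.io.IOException;
import java.net.URL;

// Hypothetical sketch: prefer the local plugin folder, fall back to the
// classpath (where the job jar's contents live on a task node).
public class PluginLocator {

    // Returns a URL for the plugin resource, or null if it is nowhere to be found.
    static URL locate(String pluginPath, File pluginDir) throws IOException {
        File local = new File(pluginDir, pluginPath);
        if (local.isFile()) {
            // Local disk: the "plugins" folder case described in the thread.
            return local.toURI().toURL();
        }
        // Pure-Hadoop case: no plugin folder on disk, but anything packed
        // into the job jar is visible through the task's classloader.
        return PluginLocator.class.getClassLoader().getResource(pluginPath);
    }
}
```

The fallback returns null when the resource is in neither place, which a caller could treat as a missing-plugin error.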
Andrzej Bialecki wrote:
(redirected to nutch-dev)
Pardon me. I intended to send nutch-dev, not hadoop-dev.
...
How weird, This Should Never Happen(tm) ... ;) Lost map output should
show up in logs, or perhaps even should've killed the job, isn't that so?
Yes. I'd have thought.
I'll
I'm getting the same as Anton below trying to launch a new job with the
latest from TRUNK.
The logic in ObjectWritable#readObject seems a little off. On the way in
we test for a null instance. If null, we set it to NullWritable.
Next we test declaredClass to see if it's an array. We then try to do an
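The null-substitution step described above (a null instance is replaced by a NullWritable-style marker so that writes and reads stay symmetric) can be sketched roughly as follows. This is a simplified illustration of the pattern, not Hadoop's actual ObjectWritable code; the class name and the string-only payload are assumptions:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.DataOutput;
import java.io.DataOutputStream;
import java.io.IOException;

// Simplified sketch of null substitution: on write, a null instance is
// replaced by a sentinel class name, so readObject can map it back to null.
public class NullSubstitution {
    static final String NULL_MARKER = "null";

    static void writeObject(DataOutput out, String instance) throws IOException {
        if (instance == null) {
            // Substitute a marker for the null instance (the NullWritable idea).
            out.writeUTF(NULL_MARKER);
        } else {
            out.writeUTF("java.lang.String");  // declaredClass
            out.writeUTF(instance);            // payload
        }
    }

    static String readObject(DataInput in) throws IOException {
        String declaredClass = in.readUTF();
        if (NULL_MARKER.equals(declaredClass)) {
            return null;  // symmetric: the marker maps back to a null instance
        }
        return in.readUTF();
    }
}
```

If the read side tested declaredClass for the array case before handling the marker, a null written this way would be misinterpreted, which is the kind of ordering bug the message describes.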