+1 for a bug (tested two days ago - was not sure if I simply missed something)
2007-01-17 12:03:07,691 WARN util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2007-01-17 12:03:07,722 WARN mapred.LocalJobRunner - job_6cexok
java.io.EOFException
        at java.io.DataInputStream.readFully(DataInputStream.java:178)
        at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:57)
        at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:91)
        at org.apache.hadoop.io.UTF8.readChars(UTF8.java:212)
        at org.apache.hadoop.io.UTF8.readString(UTF8.java:204)
        at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:173)
        at org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:61)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.spill(MapTask.java:427)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpillToDisk(MapTask.java:385)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.access$200(MapTask.java:239)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:188)
        at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:109)
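
For reference, here is a minimal stand-alone sketch (my own, not Nutch code; it only assumes the Hadoop 0.10.x io classes on the classpath) of how the Text vs. UTF8 mismatch discussed below can surface as exactly this kind of EOFException: Text prefixes strings with a VInt length, while the deprecated UTF8.readString expects a two-byte length, so a UTF8 reader mis-parses the length and runs off the end of the buffer.

import java.io.EOFException;

import org.apache.hadoop.io.DataInputBuffer;
import org.apache.hadoop.io.DataOutputBuffer;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.UTF8;

public class Utf8MismatchDemo {
  public static void main(String[] args) throws Exception {
    // Write a string the way current code does: VInt length + UTF-8 bytes.
    DataOutputBuffer out = new DataOutputBuffer();
    Text.writeString(out, "http://www.example.com/");

    // Read it back the old way: UTF8.readString expects a 2-byte length prefix,
    // so it misreads the length and readFully runs past the end of the data.
    DataInputBuffer in = new DataInputBuffer();
    in.reset(out.getData(), out.getLength());
    try {
      UTF8.readString(in);
    } catch (EOFException e) {
      System.out.println("Same failure shape as the trace above: " + e);
    }
  }
}

If an old UTF8-based reader ends up on the classpath while the map output was written with Text (or the other way around), the spill code could plausibly hit the same decode mismatch.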
On 18.01.2007, at 23:09, Brian Whitman wrote:
>
> On Jan 18, 2007, at 4:44 PM, Andrzej Bialecki wrote:
>
>>
>>> java.io.EOFException
>>> at java.io.DataInputStream.readFully(DataInputStream.java:178)
>>> at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:57)
>>> at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:91)
>>> at org.apache.hadoop.io.UTF8.readChars(UTF8.java:212)
>>> at org.apache.hadoop.io.UTF8.readString(UTF8.java:204)
>>> at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:173)
>>
>> UTF8? How weird - recent versions of Nutch tools, such as Crawl,
>> Generate et al (and SegmentMerger) do NOT use UTF8, they use Text.
>> It seems this data was created with older versions. Please check
>> that you don't have older versions of Hadoop or Nutch classes on
>> your classpath.
>
> I printed my CLASSPATH in the bin/nutch script before it calls
> anything, and all the jars and jobs are local to the nightly
> directory which I downloaded today except for
> /usr/local/java/lib/tools.jar. All are dated 2007-01-17 19:42.
>
> hadoop-0.10.1-core is in there.
>
> And the data is brand new (I delete the crawl dir before doing my
> test run.)
>
> -Brian
>
>
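
Following up on Andrzej's suggestion to check the classpath: a quick way to see which jar the JVM actually resolves those io classes from is to ask the class loader. This is just my own throwaway sketch (the class name WhichJar is made up); compile and run it with the exact CLASSPATH that bin/nutch prints.

public class WhichJar {
  public static void main(String[] args) throws Exception {
    String[] names = {
      "org.apache.hadoop.io.UTF8",
      "org.apache.hadoop.io.Text",
      "org.apache.hadoop.io.ObjectWritable"
    };
    for (String name : names) {
      Class<?> c = Class.forName(name);
      // CodeSource is null for bootstrap classes; jar-loaded classes report their jar URL.
      java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
      System.out.println(name + " -> " + (src == null ? "(bootstrap)" : src.getLocation()));
    }
  }
}

If any of these resolve to something other than the hadoop-0.10.1-core jar in the nightly directory, that stray jar would explain the old UTF8 code path showing up in the trace.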