I've found the problem. The separator should be a comma. When I used a space
as the separator, I got those errors.
Thanks, everyone, for helping me.
I will pay attention to the separator next time.
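
For anyone who hits the same thing: my understanding is that ItemIDIndexMapper
splits each input line on a comma or tab, so with a space separator the whole
line becomes a single token, and reading token 1 throws the
ArrayIndexOutOfBoundsException: 1 shown in the quoted log below. A quick way to
convert a space-separated file such as rere2 into comma-separated form
(assuming single spaces and no spaces inside the IDs) is:

[hadoop@localhost test]$ sed 's/ /,/g' rere2 > rec5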


[Successful Log]

===============================================================

[hadoop@localhost test]$ cat rec5

1,101
1,102
1,103
2,101
2,102
2,103
2,104
3,101
3,104
3,105
3,107
4,101
4,103
4,104
4,106
5,101
5,102
5,103
5,104
5,105
5,106

[hadoop@localhost test]$ hadoop fs -cat rec5/rec5

1,101
1,102
1,103
2,101
2,102
2,103
2,104
3,101
3,104
3,105
3,107
4,101
4,103
4,104
4,106
5,101
5,102
5,103
5,104
5,105
5,106

[hadoop@localhost test]$ hadoop jar \
/usr/lib/mahout/mahout-core-0.7-cdh4.2.1-job.jar \
org.apache.mahout.cf.taste.hadoop.item.RecommenderJob \
-i rec5 -o rec_result5 -s SIMILARITY_LOGLIKELIHOOD

[hadoop@localhost test]$ hadoop fs -ls rec_result5
Found 2 items
-rw-r--r--   1 hadoop supergroup          0 2013-05-12 14:01
rec_result5/_SUCCESS
-rw-r--r--   1 hadoop supergroup        108 2013-05-12 14:01
rec_result5/part-r-00000
[hadoop@localhost test]$ hadoop fs -cat rec_result5/part-r-00000

1       [106:1.0,105:1.0,104:1.0]
2       [106:1.0,105:1.0]
3       [106:1.0,103:1.0,102:1.0]
4       [105:1.0,102:1.0]
5       [107:1.0]



===============================================================
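
By the way, if you are ever unsure which separator a file actually contains,
od makes the raw bytes visible. This is just a generic check, not part of the
run above:

[hadoop@localhost test]$ head -1 rec5 | od -c
0000000   1   ,   1   0   1  \n
0000006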




2013/5/10 滝口倫理 <rinri1...@gmail.com>

> I would like to get recommended items by using RecommenderJob.
> The input data I made is shown below. There are no preference values, on
> purpose.
>
> When I ran RecommenderJob, I got some errors.
>
> Does this mean I have to add preference values to the input file?
> I want to run RecommenderJob without preference values.
>
> Regards
> Takiguchi
>
>
> [mahout command]
> ===========================================================
>
> [hadoop@localhost test]$ cat rere2
> 1 101
> 1 102
> 1 103
> 2 101
> 2 102
> 2 103
> 2 104
> 3 101
> 3 104
> 3 105
> 3 107
> 4 101
> 4 103
> 4 104
> 4 106
> 5 101
> 5 102
> 5 103
> 5 104
> 5 105
> 5 106
>
> [hadoop@localhost test]$ hadoop fs -mkdir recommend2_in
>
> [hadoop@localhost test]$ hadoop fs -put rere2 recommend2_in
>
> [hadoop@localhost test]$ hadoop jar
> /usr/lib/mahout/mahout-core-0.7-cdh4.2.1-job.jar \
> org.apache.mahout.cf.taste.hadoop.item.RecommenderJob \
> -i recommend2_in -o rec_out -s SIMILARITY_LOGLIKELIHOOD \
> -b true
> ===========================================================
>
>
> [Error]
>
> =============================================================================
>
>
> 13/05/10 20:15:54 INFO mapreduce.Job: Task Id :
> attempt_1368183830239_0002_m_000000_0, Status : FAILED
> Error: java.lang.ArrayIndexOutOfBoundsException: 1
>         at org.apache.mahout.cf.taste.hadoop.item.ItemIDIndexMapper.map(ItemIDIndexMapper.java:47)
>         at org.apache.mahout.cf.taste.hadoop.item.ItemIDIndexMapper.map(ItemIDIndexMapper.java:31)
>
> 13/05/10 20:16:02 INFO mapreduce.Job: Task Id :
> attempt_1368183830239_0002_m_000000_1, Status : FAILED
> Error: java.lang.ArrayIndexOutOfBoundsException: 1
>
> 13/05/10 20:16:10 INFO mapreduce.Job: Task Id :
> attempt_1368183830239_0002_m_000000_2, Status : FAILED
> Error: java.lang.ArrayIndexOutOfBoundsException: 1
>
> 13/05/10 20:16:18 INFO mapreduce.Job: Counters: 6
>         Job Counters
>                 Failed map tasks=4
>                 Launched map tasks=4
>                 Other local map tasks=3
>                 Rack-local map tasks=1
>                 Total time spent by all maps in occupied slots (ms)=26266
>                 Total time spent by all reduces in occupied slots (ms)=0
> Exception in thread "main" java.io.FileNotFoundException: File does not exist: /user/hadoop/temp/preparePreferenceMatrix/numUsers.bin
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsUpdateTimes(FSNamesystem.java:1312)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1258)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1231)
>
>
>
>
> =============================================================================
>
