Sebastian,

Are we still using SplitInputJob? It seems to have been replaced by the much
newer SplitInput.
Do you think it should be purged from the codebase for 0.9? It has been marked
as deprecated anyway.





On Wednesday, December 11, 2013 2:08 PM, Suneel Marthi 
<suneel_mar...@yahoo.com> wrote:
 
A quick search through the codebase shows the following still using the old mapred API:

DistributedRowMatrix
SplitInputJob
MatrixMultiplicationJob
BtJob
TransposeJob
TimesSquaredJob
ABtJob
ABtDenseOutJob
QJob
QRFirstStep
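The list above can be regenerated mechanically. A minimal sketch of that kind of search (the `demo` tree and file names below are made up for illustration; run the real thing from the Mahout source root):

```shell
# Recreate, in miniature, the search described above: find source files
# that still import the old org.apache.hadoop.mapred API.
mkdir -p demo/src
cat > demo/src/OldApiJob.java <<'EOF'
import org.apache.hadoop.mapred.JobConf;      // old API
public class OldApiJob {}
EOF
cat > demo/src/NewApiJob.java <<'EOF'
import org.apache.hadoop.mapreduce.Job;       // new API
public class NewApiJob {}
EOF
# The trailing \. keeps "mapreduce" from matching "mapred".
grep -rl 'org\.apache\.hadoop\.mapred\.' --include='*.java' demo
# → demo/src/OldApiJob.java
```

Against the actual tree, `grep -rl 'org\.apache\.hadoop\.mapred\.' --include='*.java' .` lists the offending files directly.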








On Wednesday, December 11, 2013 2:01 PM, Sebastian Schelter 
<ssc.o...@googlemail.com> wrote:
 
I think there are still parts of the code (e.g. in DistributedRowMatrix)
that use the old API.

--sebastian


On 11.12.2013 19:56, Suneel Marthi wrote:
> Mahout is using the newer mapreduce API and not the older mapred API.
> Was that what you were looking for?
> 
> 
> 
> 
> 
> On Wednesday, December 11, 2013 1:53 PM, Zoltan Prekopcsak 
> <preko1...@gmail.com> wrote:
>  
> Hi Gokhan,
> 
> Thank you for the clarification.
> Does it mean that Mahout is using the mapred API everywhere and there is
> no mapreduce API left? As far as I know, code built against the mapreduce
> API needs to be recompiled, and I remember needing to recompile Mahout for
> CDH4 when it first came out.
> 
> Thanks, Zoltan
> 
> 
> 
> On 12/10/13 10:02 PM, Gokhan Capan wrote:
>> I meant that you shouldn't need to modify Mahout's dependencies; just run mvn
>> package and it should work against Hadoop 2.2.0 (and yes, 2.2.0 is not an alpha)
>>
>> Quoting from 
>> http://hadoop.apache.org/docs/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapReduce_Compatibility_Hadoop1_Hadoop2.html
>> "First, we ensure binary compatibility to the applications that use old 
>> mapred APIs. This means that applications which were built against MRv1 
>> mapred APIs can run directly on YARN without recompilation, merely by 
>> pointing them to an Apache Hadoop 2.x cluster via configuration."
>>
>> If you encounter any problems, just let the list know.
>>
>> Best
>>
>>
>>> On Dec 9, 2013, at 9:40 PM, Hi There <srudamas...@yahoo.com> wrote:
>>>
>>> Hi Gokhan,
>>>
>>> My project currently fetches every dependency through Maven -- is there any
>>> way I can grab the version you mentioned that way?
>>>
>>> In that vein, I am using the following version of hadoop:
>>> <dependency>
>>>          <groupId>org.apache.hadoop</groupId>
>>>          <artifactId>hadoop-client</artifactId>
>>>          <version>2.2.0</version>
>>> </dependency>
>>>
>>>
>>> That's not alpha, right?
>>>
>>> Thanks!
>>>
>>>
>>>
>>>
>>>
>>> On Monday, December 9, 2013 10:05 AM, Gokhan Capan <gkhn...@gmail.com> 
>>> wrote:
>>>
>>> Mahout actually should work with the hadoop-2 stable release without
>>> recompiling, though not with the hadoop-2 alphas.
>>>
>>> We're, by the way, currently in the process of adding support to build
>>> mahout with hadoop-2.
>>>
>>> Please see MAHOUT-1354 for the relevant issue.
>>>
>>> Sent from my iPhone
>>>
>>>
>>>> On Dec 9, 2013, at 19:54, Hi There <srudamas...@yahoo.com> wrote:
>>>>
>>>> Is Dec 2013 still the intended release date of the next mahout release 
>>>> that will be compatible with Hadoop 2.2.0?
>>>>
>>>>
>>>>
>>>>
>>>> On Thursday, November 21, 2013 12:36 PM, Suneel Marthi 
>>>> <suneel_mar...@yahoo.com> wrote:
>>>>
>>>> Targeted for Dec 2013.
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> On Thursday, November 21, 2013 3:26 PM, Hi There <srudamas...@yahoo.com> 
>>>> wrote:
>>>>
>>>> Thanks for the reply! Is there a timeline for when the next release will
>>>> be?
>>>>
>>>>
>>>> Thanks,
>>>> Victor
>>>>
>>>>
>>>>
>>>>
>>>> On Tuesday, November 19, 2013 7:30 PM, Suneel Marthi 
>>>> <suneel_mar...@yahoo.com> wrote:
>>>>
>>>> Hi Victor,
>>>>
>>>> Future releases of Mahout will support Hadoop 2.x; the present codebase
>>>> still only supports Hadoop 1.x.
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> On Tuesday, November 19, 2013 1:42 PM, Hi There <srudamas...@yahoo.com> 
>>>> wrote:
>>>>
>>>>
>>>>
>>>> Hello,
>>>>
>>>> I recently upgraded to Hadoop's newest release, and it seems one of its
>>>> interfaces has changed: when I try to create sparse vectors from sequence
>>>> files, I get the following exception:
>>>>
>>>> java.lang.IncompatibleClassChangeError: Found interface 
>>>> org.apache.hadoop.mapreduce.Counter, but class was expected
>>>>
>>>> I can include more of the stack trace if necessary.
>>>>
>>>> Are there any plans in the immediate future to upgrade mahout to be 
>>>> compatible with the newest hadoop release?
>>>>
>>>> Thanks,
>>>> Victor
