Either way, I don't believe there is anything specific to 1.0.1, 1.0.2, or
1.1.0 that is causing (or not causing) the classpath errors. Jars are
picked up by an explicitly hardcoded artifact "opt-in" policy, not the
other way around.

It is not enough to just modify the pom for something to appear in the
task classpath.
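
To illustrate with a sketch (not Mahout's actual code; the artifact paths
are made up): only jars explicitly handed to the SparkContext are shipped
to the executors, so a dependency added to the pom stays invisible to the
workers until something also adds its jar to that list:

    import org.apache.spark.{SparkConf, SparkContext}

    // Only the jars listed here reach the executors. Editing the pom
    // changes what the driver compiles against, not this list.
    val jars = Seq(
      "/path/to/mahout-math-x.y.jar",
      "/path/to/mahout-spark_2.10-x.y.jar")

    val conf = new SparkConf()
      .setMaster("spark://localhost:7077")
      .setAppName("classpath-check")
      .setJars(jars)
    val sc = new SparkContext(conf)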

On Mon, Oct 20, 2014 at 9:35 AM, Dmitriy Lyubimov <dlie...@gmail.com> wrote:

> Note that the classpath for the "cluster" environment can be tested
> trivially by starting 1-2 workers and a standalone Spark manager process
> locally. No need to build anything "real". The workers would not know
> anything about Mahout, so unless the proper jars are exposed in the
> context, they would have no way of "faking" access to the classes.
>
> On Mon, Oct 20, 2014 at 9:28 AM, Pat Ferrel <p...@occamsmachete.com> wrote:
>
>> Yes, asap.
>>
>> To test this right it has to run on a cluster so I’m upgrading. When
>> ready it will just be a “mvn clean install" if you already have Spark 1.1.0
>> running.
>>
>> I would only have expected errors in the CLI drivers, so if anyone else
>> sees runtime errors please let us know. Some errors are very hard to
>> unit test since the environment is different for local (unit test) and
>> cluster execution.
>>
>>
>> On Oct 20, 2014, at 9:14 AM, Mahesh Balija <balijamahesh....@gmail.com>
>> wrote:
>>
>> Hi Pat,
>>
>> Can you please give detailed steps for building Mahout against Spark
>> 1.1.0? I built against 1.1.0 but still had class-not-found errors,
>> which is why I reverted to Spark 1.0.2. Even there, the first few steps
>> succeed, but I am still facing issues running the Mahout spark-shell
>> sample commands: drmData throws some errors even on 1.0.2.
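>>
>> For reference, the sample I am running is along these lines (from the
>> Mahout spark-shell tutorial; the matrix values here are illustrative):
>>
>>     // inside `mahout spark-shell`, where the Mahout scalabindings and
>>     // sparkbindings imports are already in scope
>>     val drmData = drmParallelize(dense(
>>       (2, 2, 10.5, 10, 29.509541),
>>       (1, 2, 12.0, 12, 18.042851),
>>       (1, 1, 12.0, 13, 22.736446)),
>>       numPartitions = 2)
>>
>>     drmData.collect  // materialize the distributed matrix in-core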
>>
>> Best,
>> Mahesh.B.
>>
>> On Mon, Oct 20, 2014 at 1:46 AM, peng <pc...@uowmail.edu.au> wrote:
>>
>> > From my experience 1.1.0 is quite stable, plus it has some
>> > performance improvements that are totally worth the effort.
>> >
>> >
>> > On 10/19/2014 06:30 PM, Ted Dunning wrote:
>> >
>> >> On Sun, Oct 19, 2014 at 1:49 PM, Pat Ferrel <p...@occamsmachete.com>
>> >> wrote:
>> >>
>> >>> Getting off the dubious Spark 1.0.1 version is turning out to be a
>> >>> bit of work. Does anyone object to upgrading our Spark dependency?
>> >>> I’m not sure if Mahout built for Spark 1.1.0 will run on 1.0.1 so
>> >>> it may mean upgrading your Spark cluster.
>> >>>
>> >>
>> >> It is going to have to happen sooner or later.
>> >>
>> >> Sooner may actually be less total pain.
>> >>
>> >>
>> >
>>
>>
>
