You can always try. But Hadoop 3 is not yet supported by Spark.
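
For reference, the Hadoop-free build approach discussed below can be sketched roughly as follows. This is a minimal example, assuming a Hadoop 3 installation is already present on the machine and the `hadoop` command is on the PATH; the distribution name is arbitrary:

```shell
# Build a Spark distribution without bundled Hadoop jars
./dev/make-distribution.sh --name hadoop-free --tgz -Phadoop-provided

# Then, in conf/spark-env.sh of the resulting distribution, point Spark
# at the locally installed Hadoop's jars:
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
```

With this setup, Spark picks up whatever Hadoop version the `hadoop` command resolves to, rather than a version fixed at build time.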

On Fri, Apr 5, 2019 at 11:13 AM Anton Kirillov
<[email protected]> wrote:
>
> Marcelo, Sean, thanks for the clarification. So in order to support Hadoop 3+ 
> the preferred way would be to use Hadoop-free builds and provide Hadoop 
> dependencies in the classpath, is that correct?
>
> On Fri, Apr 5, 2019 at 10:57 AM Marcelo Vanzin <[email protected]> wrote:
>>
>> The hadoop-3 profile doesn't really work yet, not even on master.
>> That's still being worked on.
>>
>> On Fri, Apr 5, 2019 at 10:53 AM akirillov <[email protected]> 
>> wrote:
>> >
>> > Hi there! I'm trying to run Spark unit tests with the following profiles:
>> >
>> > The 'core' module fails, with the following test throwing a
>> > NoClassDefFoundError:
>> >
>> > In the meantime, building a distribution works fine when running:
>> >
>> > Also, there are no problems running tests with the Hadoop 2.7 profile.
>> > Does this issue look familiar? Any help appreciated!
>> >
>> >
>> >
>> > --
>> > Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
>> >
>> > ---------------------------------------------------------------------
>> > To unsubscribe e-mail: [email protected]
>> >
>>
>>
>> --
>> Marcelo



-- 
Marcelo
