In yarn/.classpath , I see:
  <classpathentry kind="src" path="/spark-core_2.11"/>

Here is the command I used:

build/mvn clean -Phive -Phive-thriftserver -Pyarn -Phadoop-2.6 -Dhadoop.version=2.7.0 package -DskipTests eclipse:eclipse

FYI
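For anyone whose generated .classpath entries are still missing the suffix, one possible workaround (a sketch only; the project-name pattern and the fixed _2.11 suffix are my assumptions, not something verified against every Spark module) is to patch the generated files in place:

```shell
# Sketch of a workaround: append the Scala binary-version suffix to
# project-style classpath entries that lack it.  The "spark-..." name
# pattern and the _2.11 suffix are assumptions; sed keeps a .bak copy
# of each file it touches, so you can diff/revert before trusting this.
find . -name .classpath -exec sed -i.bak \
  -e 's|path="/\(spark-[a-z-]*\)"|path="/\1_2.11"|g' {} +
```

Entries that already carry the suffix are left untouched, because the pattern only matches project names that end immediately before the closing quote.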

On Thu, Feb 25, 2016 at 6:13 AM, Łukasz Gieroń <lgie...@gmail.com> wrote:

> I've just checked, and "mvn eclipse:eclipse" generates incorrect projects
> as well.
>
>
> On Thu, Feb 25, 2016 at 3:04 PM, Allen Zhang <allenzhang...@126.com>
> wrote:
>
>> Why not use maven?
>>
>> At 2016-02-25 21:55:49, "lgieron" <lgie...@gmail.com> wrote:
>> >The Spark projects generated by the sbt eclipse plugin have incorrect
>> >dependent projects (visible under Properties -> Java Build Path ->
>> >Projects). All dependent projects are missing the "_2.11" suffix (for
>> >example, "spark-core" instead of the correct "spark-core_2.11"), which
>> >of course causes the build to fail.
>> >
>> >I am using sbteclipse-plugin version 4.0.0.
>> >
>> >Has anyone encountered this problem and found a fix?
>> >
>> >Thanks,
>> >Lukasz
>> >--
>> >View this message in context: 
>> >http://apache-spark-developers-list.1001551.n3.nabble.com/Eclipse-Wrong-project-dependencies-in-generated-by-sbt-eclipse-tp16436.html
>> >Sent from the Apache Spark Developers List mailing list archive at 
>> >Nabble.com.
>> >
>> >---------------------------------------------------------------------
>> >To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>> >For additional commands, e-mail: dev-h...@spark.apache.org
>> >
