We're going to be upgrading from Spark 1.0.2 and are still on hadoop-1.2.1, so we need to build by hand. (Yes, I know: use hadoop-2.x. Standard resource constraints apply.) I want to build against Scala 2.11 and publish to our artifact repository, but finding build/scala-2.10.4 and tracing down what build/mvn was doing had me concerned that I was missing something. I'll hold the course and build it as instructed.
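For reference, here's roughly the sequence I plan to run. This is only a sketch: the deploy repository id and URL below are placeholders and would need to match whatever is configured in our settings.xml.

    # Rewrite the _2.10 artifact ids in the poms for the 2.11 cross-build.
    ./dev/change-version-to-2.11.sh

    # Build against hadoop-1.2.1 and Scala 2.11, installing to the local repo.
    build/mvn -Phadoop-1 -Dhadoop.version=1.2.1 -Dscala-2.11 -DskipTests clean install

    # Publishing would swap install for deploy; the repo id/URL here are placeholders.
    build/mvn -Phadoop-1 -Dhadoop.version=1.2.1 -Dscala-2.11 -DskipTests clean deploy \
        -DaltDeploymentRepository=internal::default::https://repo.example.com/releases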
Thanks for the info, all.

PS - Since asked --

    PATH=./build/apache-maven-3.2.5/bin:$PATH; build/mvn -Phadoop-1 -Dhadoop.version=1.2.1 -Dscala-2.11 -DskipTests package

On Mon, Aug 24, 2015 at 2:49 PM, Jonathan Coveney <jcove...@gmail.com> wrote:

> I've used the instructions and it worked fine.
>
> Can you post exactly what you're doing, and what it fails with? Or are you
> just trying to understand how it works?
>
> 2015-08-24 15:48 GMT-04:00 Lanny Ripple <la...@spotright.com>:
>
>> Hello,
>>
>> The instructions for building Spark against Scala 2.11 indicate using
>> -Dspark-2.11. When I look in the pom.xml, I find a profile named
>> 'spark-2.11' but nothing that would indicate I should set a property. The
>> sbt build seems to need the -Dscala-2.11 property set. Finally, build/mvn
>> does a simple grep of scala.version (which doesn't change after running
>> dev/change-version-to-2.11.sh), so the build seems to be grabbing the
>> 2.10.4 Scala library.
>>
>> Does anyone know (from having done it and used it in production) whether
>> the build instructions for spark-1.4.1 against Scala 2.11 are correct?
>>
>> Thanks.
>> -Lanny
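For anyone who hits the same profile-vs-property confusion later: you can ask Maven directly which profiles the pom defines and how the Scala one is activated. The profile id 'scala-2.11' and its property-based activation are my reading of the 1.4.1 pom, so verify against your own checkout.

    # List every profile the build knows about and pick out the Scala ones.
    build/mvn help:all-profiles | grep -i scala

    # Show the scala-2.11 profile's activation block; if it activates on a
    # property named 'scala-2.11', then plain -Dscala-2.11 switches it on.
    grep -A 4 '<id>scala-2.11</id>' pom.xml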