Have you tried Maven instead of SBT? This looks like a Java dependency problem, e.g. a wrong version of Avro being picked up.
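One quick way to confirm which Avro actually wins on the compile classpath is a small reflection probe. This is only a sketch: `MethodCheck` is an illustrative name, not part of the Spark build, and the idea is simply that the failing line needs `GenericData.createDatumWriter`, so probing for that method tells you whether the resolved Avro is new enough.

```scala
// Hypothetical diagnostic: report whether a class loaded from the current
// classpath exposes a given method, e.g. to see which Avro release won
// dependency resolution.
object MethodCheck {
  def hasMethod(className: String, methodName: String): Boolean =
    try Class.forName(className).getMethods.exists(_.getName == methodName)
    catch { case _: ClassNotFoundException => false }

  def main(args: Array[String]): Unit = {
    // Sanity check against a stdlib class: always true.
    println(hasMethod("java.lang.String", "isEmpty"))
    // True only if the Avro on the classpath provides the method the failing
    // line needs; false for older Avro releases (or when Avro is absent).
    println(hasMethod("org.apache.avro.generic.GenericData", "createDatumWriter"))
  }
}
```

Alternatively, `mvn dependency:tree -Dincludes=org.apache.avro` (or sbt's dependencyTree task) shows which Avro version each resolution picks, which you can then compare between the centos and ubuntu workers.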
On Tue, Nov 6, 2018 at 8:30 AM shane knapp <skn...@berkeley.edu> wrote:
> i'm really close (for real: really close!) on the ubuntu port... but one
> build has been a thorn in my side and i was wondering if i could get some
> extra eyes on this as i grind through the remaining few pieces of my own
> personal system dependency hell. :)
>
> the job in question is:
>
> https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.7-ubuntu-testing
>
> it's identical to the regular spark-master-test-sbt-hadoop-2.7 job, except
> i'm building against a newer version of java (1.8.0_171 vs 1.8.0_60).
>
> the centos job always passes on every worker.
>
> the ubuntu job fails on every ubuntu worker during the scala unidoc
> generation w/the following error:
>
> """
> [error] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7-ubuntu-testing/core/src/main/scala/org/apache/spark/serializer/GenericAvroSerializer.scala:123:
> value createDatumWriter is not a member of org.apache.avro.generic.GenericData
> [error] writerCache.getOrElseUpdate(schema, GenericData.get.createDatumWriter(schema))
> [error]                                                     ^
> [info] No documentation generated with unsuccessful compiler run
> [error] one error found
> """
>
> an example job w/this failure is here:
>
> https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.7-ubuntu-testing/30/consoleFull
>
> thoughts? am i missing something obvious? i've checked and there are no
> avro system packages installed on any of the workers (centos or ubuntu).
>
> thanks in advance,
>
> shane
> --
> Shane Knapp
> UC Berkeley EECS Research / RISELab Staff Technical Lead
> https://rise.cs.berkeley.edu