Right, well I don’t think the issue is with how you’re compiling the Scala. I 
think it’s a conflict between different versions of several libs.
I had similar issues with my Spark modules. You need to make sure you’re not 
loading a different version of the same lib that is clobbering another 
dependency. It’s very frustrating, but with patience you can weed them out.
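A quick way to spot them is to print the dependency tree and look for the same 
artifact showing up with different versions (this uses the standard 
maven-dependency-plugin, which ships with any recent Maven):

mvn dependency:tree -Dverbose

The -Dverbose flag also shows the conflicting versions Maven omitted. In 
particular, make sure every Scala artifact agrees on a single Scala binary 
version (all _2.10 or all _2.11); mixing them produces exactly this kind of 
NoSuchMethodError.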
Once you’ve found the offending libs, put them into an <exclusions> block 
under the associated dependency. I am still working with Spark 1.5 and Scala 
2.10, and for me the presence of scalap was the problem; this resolved it:
<dependency>
 <groupId>org.apache.spark</groupId>
 <artifactId>spark-core_2.10</artifactId>
 <version>1.5.1</version>
 <exclusions>
  <exclusion>
   <groupId>org.json4s</groupId>
   <artifactId>json4s-core_2.10</artifactId>
  </exclusion>
 </exclusions>
</dependency>
<dependency>
 <groupId>org.json4s</groupId>
 <artifactId>json4s-core_2.10</artifactId>
 <version>3.2.10</version>
 <exclusions>
  <exclusion>
   <groupId>org.scala-lang</groupId>
   <artifactId>scalap</artifactId>
  </exclusion>
 </exclusions>
</dependency>

Unfortunately scalap is a dependency of json4s, which I want to keep. So what I 
do is exclude json4s from spark-core, then add it back in, but with its 
troublesome scalap dependency removed.
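
Afterwards you can verify the exclusion took effect by filtering the tree for 
scalap:

mvn dependency:tree -Dincludes=org.scala-lang:scalap

If nothing shows up under your module, the troublesome copy is gone.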


> On Mar 11, 2016, at 6:34 PM, Vasu Parameswaran <vas...@gmail.com> wrote:
> 
> Added these to the pom and still the same error :-(. I will look into sbt as 
> well.
> 
> On Fri, Mar 11, 2016 at 2:31 PM, Tristan Nixon <st...@memeticlabs.org> wrote:
> You must be relying on IntelliJ to compile your Scala, because you haven’t 
> set up any Scala plugin to compile it from Maven.
> You should have something like this in your plugins:
> 
> <plugins>
>  <plugin>
>   <groupId>net.alchim31.maven</groupId>
>   <artifactId>scala-maven-plugin</artifactId>
>   <executions>
>    <execution>
>     <id>scala-compile-first</id>
>     <phase>process-resources</phase>
>     <goals>
>      <goal>compile</goal>
>     </goals>
>    </execution>
>    <execution>
>     <id>scala-test-compile</id>
>     <phase>process-test-resources</phase>
>     <goals>
>      <goal>testCompile</goal>
>     </goals>
>    </execution>
>   </executions>
>  </plugin>
> </plugins>
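> 
> One note: Maven will warn if the plugin version isn’t pinned, so you may also 
> want an explicit <version> in the <plugin> block, e.g. 
> <version>3.2.2</version> (adjust to whatever scala-maven-plugin release fits 
> your setup).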
> 
> PS - I use maven to compile all my scala and haven’t had a problem with it. I 
> know that sbt has some wonderful things, but I’m just set in my ways ;)
> 
>> On Mar 11, 2016, at 2:02 PM, Jacek Laskowski <ja...@japila.pl> wrote:
>> 
>> Hi,
>> 
>> Doh! My eyes are bleeding from going through all this XML... 😁
>> 
>> Where did you specify the Scala version? Dunno how it's done in Maven.
>> 
>> p.s. I *strongly* recommend sbt.
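>> 
>> For reference, the same Spark dependency in sbt looks roughly like this (the 
>> %% operator appends the Scala binary version to the artifact name for you):
>> 
>> scalaVersion := "2.11.7"
>> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"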
>> 
>> Jacek
>> 
>> 11.03.2016 8:04 PM "Vasu Parameswaran" <vas...@gmail.com> wrote:
>> Thanks Jacek.  Pom is below (currently set to Spark 1.6.1, but I started out 
>> with 1.6.0 and hit the same problem).
>> 
>> 
>> <?xml version="1.0" encoding="UTF-8"?>
>> <project xmlns="http://maven.apache.org/POM/4.0.0"
>>          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
>>          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
>>                              http://maven.apache.org/xsd/maven-4.0.0.xsd">
>>     <parent>
>>         <artifactId>spark</artifactId>
>>         <groupId>com.test</groupId>
>>         <version>1.0-SNAPSHOT</version>
>>     </parent>
>>     <modelVersion>4.0.0</modelVersion>
>> 
>>     <artifactId>sparktest</artifactId>
>> 
>>     <properties>
>>         <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>>     </properties>
>> 
>>     <dependencies>
>>         <dependency>
>>             <groupId>junit</groupId>
>>             <artifactId>junit</artifactId>
>>         </dependency>
>> 
>>         <dependency>
>>             <groupId>commons-cli</groupId>
>>             <artifactId>commons-cli</artifactId>
>>         </dependency>
>>         <dependency>
>>             <groupId>com.google.code.gson</groupId>
>>             <artifactId>gson</artifactId>
>>             <version>2.3.1</version>
>>             <scope>compile</scope>
>>         </dependency>
>>         <dependency>
>>             <groupId>org.apache.spark</groupId>
>>             <artifactId>spark-core_2.11</artifactId>
>>             <version>1.6.1</version>
>>         </dependency>
>>     </dependencies>
>> 
>>     <build>
>>         <plugins>
>>             <plugin>
>>                 <groupId>org.apache.maven.plugins</groupId>
>>                 <artifactId>maven-shade-plugin</artifactId>
>>                 <version>2.4.2</version>
>>                 <executions>
>>                     <execution>
>>                         <phase>package</phase>
>>                         <goals>
>>                             <goal>shade</goal>
>>                         </goals>
>>                     </execution>
>>                 </executions>
>>                 <configuration>
>>                     <finalName>${project.artifactId}-${project.version}-with-dependencies</finalName>
>>                 </configuration>
>>             </plugin>
>>         </plugins>
>>     </build>
>> 
>> </project>
>> 
>> 
>> 
>> On Fri, Mar 11, 2016 at 10:46 AM, Jacek Laskowski <ja...@japila.pl> wrote:
>> Hi,
>> 
>> Why do you use maven not sbt for Scala?
>> 
>> Can you show the entire pom.xml and the command to execute the app?
>> 
>> Jacek
>> 
>> 11.03.2016 7:33 PM "vasu20" <vas...@gmail.com> wrote:
>> Hi
>> 
>> Any help appreciated on this.  I am trying to write a Spark program using
>> IntelliJ.  I get a runtime error as soon as new SparkConf() is called from
>> main.  The top few lines of the exception are pasted below.
>> 
>> These are the following versions:
>> 
>> Spark jar:  spark-assembly-1.6.0-hadoop2.6.0.jar
>> pom:        <artifactId>spark-core_2.11</artifactId>
>>             <version>1.6.0</version>
>> 
>> I have installed the Scala plugin in IntelliJ and added a dependency.
>> 
>> I have also added a library dependency in the project structure.
>> 
>> Thanks for any help!
>> 
>> Vasu
>> 
>> 
>> Exception in thread "main" java.lang.NoSuchMethodError:
>> scala.Predef$.augmentString(Ljava/lang/String;)Ljava/lang/String;
>>         at org.apache.spark.util.Utils$.<init>(Utils.scala:1682)
>>         at org.apache.spark.util.Utils$.<clinit>(Utils.scala)
>>         at org.apache.spark.SparkConf.<init>(SparkConf.scala:59)