You must be relying on IntelliJ to compile your Scala, because you haven't set
up any Scala plugin to compile it from Maven.
You should have something like this in your plugins:

<plugins>
    <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <!-- pin the plugin version so builds are reproducible; 3.2.2 is current as of this writing -->
        <version>3.2.2</version>
        <executions>
            <execution>
                <id>scala-compile-first</id>
                <phase>process-resources</phase>
                <goals>
                    <goal>compile</goal>
                </goals>
            </execution>
            <execution>
                <id>scala-test-compile</id>
                <phase>process-test-resources</phase>
                <goals>
                    <goal>testCompile</goal>
                </goals>
            </execution>
        </executions>
    </plugin>
</plugins>
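
If you also want Maven to pin the Scala version itself (so the build does not depend on whatever the IDE happens to use), one approach is a property plus an explicit scala-library dependency. A minimal sketch, assuming Scala 2.11.7 to match the spark-core_2.11 artifact in your pom:

<properties>
    <!-- assumed version; any 2.11.x works, but it must match the _2.11 suffix on spark-core -->
    <scala.version>2.11.7</scala.version>
</properties>

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
</dependency>

The scala-maven-plugin detects the compiler version from the scala-library dependency, so this single property keeps the compiler and the Spark artifact on the same Scala line.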

PS - I use Maven to compile all my Scala and haven't had a problem with it. I
know that sbt has some wonderful things, but I'm just set in my ways ;)
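
PPS - The NoSuchMethodError on scala.Predef.augmentString quoted further down is the classic sign of mixed Scala binary versions: the pom pulls in spark-core_2.11, while spark-assembly-1.6.0-hadoop2.6.0.jar from the default 1.6.x download is most likely built against Scala 2.10. A quick way to check which Scala artifacts Maven actually resolves (this is just the standard dependency plugin, nothing project-specific):

mvn dependency:tree -Dincludes=org.scala-lang

If that shows a 2.10 scala-library, or the assembly jar is still on the IntelliJ classpath next to the _2.11 artifacts, that mismatch alone explains the error.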

> On Mar 11, 2016, at 2:02 PM, Jacek Laskowski <ja...@japila.pl> wrote:
> 
> Hi,
> 
> Doh! My eyes are bleeding from going through all this XML... 😁
> 
> Where did you specify the Scala version? Dunno how it's done in Maven.
> 
> p.s. I *strongly* recommend sbt.
> 
> Jacek
> 
> On 11.03.2016 8:04 PM, "Vasu Parameswaran" <vas...@gmail.com> wrote:
> Thanks Jacek.  The pom is below (currently set to Spark 1.6.1, but I started out
> with 1.6.0 and had the same problem).
> 
> 
> <?xml version="1.0" encoding="UTF-8"?>
> <project xmlns="http://maven.apache.org/POM/4.0.0"
>          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
>          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
>     <parent>
>         <artifactId>spark</artifactId>
>         <groupId>com.test</groupId>
>         <version>1.0-SNAPSHOT</version>
>     </parent>
>     <modelVersion>4.0.0</modelVersion>
> 
>     <artifactId>sparktest</artifactId>
> 
>     <properties>
>         <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>     </properties>
> 
>     <dependencies>
>         <dependency>
>             <groupId>junit</groupId>
>             <artifactId>junit</artifactId>
>         </dependency>
> 
>         <dependency>
>             <groupId>commons-cli</groupId>
>             <artifactId>commons-cli</artifactId>
>         </dependency>
>         <dependency>
>             <groupId>com.google.code.gson</groupId>
>             <artifactId>gson</artifactId>
>             <version>2.3.1</version>
>             <scope>compile</scope>
>         </dependency>
>         <dependency>
>             <groupId>org.apache.spark</groupId>
>             <artifactId>spark-core_2.11</artifactId>
>             <version>1.6.1</version>
>         </dependency>
>     </dependencies>
> 
>     <build>
>         <plugins>
>             <plugin>
>                 <groupId>org.apache.maven.plugins</groupId>
>                 <artifactId>maven-shade-plugin</artifactId>
>                 <version>2.4.2</version>
>                 <executions>
>                     <execution>
>                         <phase>package</phase>
>                         <goals>
>                             <goal>shade</goal>
>                         </goals>
>                     </execution>
>                 </executions>
>                 <configuration>
>                     <finalName>${project.artifactId}-${project.version}-with-dependencies</finalName>
>                 </configuration>
>             </plugin>
>         </plugins>
>     </build>
> 
> </project>
> 
> 
> 
> On Fri, Mar 11, 2016 at 10:46 AM, Jacek Laskowski <ja...@japila.pl> wrote:
> Hi,
> 
> Why do you use Maven and not sbt for Scala?
> 
> Can you show the entire pom.xml and the command to execute the app?
> 
> Jacek
> 
> On 11.03.2016 7:33 PM, "vasu20" <vas...@gmail.com> wrote:
> Hi
> 
> Any help appreciated on this.  I am trying to write a Spark program using
> IntelliJ.  I get a runtime error as soon as new SparkConf() is called from
> main.  The top few lines of the exception are pasted below.
> 
> I am using the following versions:
> 
> Spark jar:  spark-assembly-1.6.0-hadoop2.6.0.jar
> pom:  <artifactId>spark-core_2.11</artifactId>
>       <version>1.6.0</version>
> 
> I have installed the Scala plugin in IntelliJ and added a dependency.
> 
> I have also added a library dependency in the project structure.
> 
> Thanks for any help!
> 
> Vasu
> 
> 
> Exception in thread "main" java.lang.NoSuchMethodError:
> scala.Predef$.augmentString(Ljava/lang/String;)Ljava/lang/String;
>         at org.apache.spark.util.Utils$.<init>(Utils.scala:1682)
>         at org.apache.spark.util.Utils$.<clinit>(Utils.scala)
>         at org.apache.spark.SparkConf.<init>(SparkConf.scala:59)
> 
> 
> 
> 
> 
> 
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Newbie-question-Help-with-runtime-error-on-augmentString-tp26462.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 
> 
