Hi Ashok,
  this is not really a Spark-related question, so I would not use this
mailing list...
Anyway, my 2 cents here.
As outlined by earlier replies, if the class you are referencing is in a
different jar, then at compile time you will need to add that dependency to
your build.sbt.
I'd personally leave $CLASSPATH alone...
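A minimal build.sbt sketch of that approach (the jar name comes from your message; `lib/` is sbt's default directory for unmanaged jars, so the path here is an assumption about your project layout):

```scala
// build.sbt -- a sketch, not a drop-in file.
// Easiest: copy utilities-assembly-0.1-SNAPSHOT.jar into lib/;
// sbt picks up every jar in lib/ as an unmanaged dependency automatically.
// Or reference it explicitly:
unmanagedJars in Compile += file("lib/utilities-assembly-0.1-SNAPSHOT.jar")
```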

AT RUN TIME, you have two options:
1 - as suggested by Ted, when you launch your app via spark-submit you
can use '--jars utilities-assembly-0.1-SNAPSHOT.jar' to pass the jar.
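For example (a sketch: the main class and application jar below are placeholders, only the --jars value comes from your message):

```shell
# Sketch -- substitute your own main class and application jar.
spark-submit \
  --class com.example.MyApp \
  --jars utilities-assembly-0.1-SNAPSHOT.jar \
  myapp.jar
```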
2 - Use the sbt-assembly plugin to package your classes and jars into a 'fat
jar', and then at runtime all you need to do is

 spark-submit --class <your spark app class name> <path to your fat jar>

I'd personally go for 1 as it is the easiest option. (FYI, for 2 you
might encounter situations where different dependencies contain the same
classes, and that will require you to define an assemblyMergeStrategy....)
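If you do go for 2, a merge strategy sketch might look like this (assuming sbt-assembly is enabled in project/plugins.sbt; the cases shown are just common defaults, adjust them to your actual conflicts):

```scala
// build.sbt -- a sketch of assemblyMergeStrategy for sbt-assembly.
assemblyMergeStrategy in assembly := {
  // Duplicate META-INF entries are usually safe to discard.
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  // Otherwise keep the first copy seen on the classpath.
  case x => MergeStrategy.first
}
```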

hth




On Mon, Jun 6, 2016 at 8:52 AM, Ashok Kumar <ashok34...@yahoo.com.invalid>
wrote:

> Can anyone help me with this, please?
>
>
> On Sunday, 5 June 2016, 11:06, Ashok Kumar <ashok34...@yahoo.com> wrote:
>
>
> Hi all,
>
> Appreciate any advice on this. It is about Scala.
>
> I have created a very basic Utilities.scala that contains a test class and
> method. I intend to add my own classes and methods as I expand, and to make
> references to these classes and methods in my other apps.
>
> class getCheckpointDirectory {
>   def CheckpointDirectory (ProgramName: String) : String  = {
>      var hdfsDir = "hdfs://host:9000/user/user/checkpoint/"+ProgramName
>      return hdfsDir
>   }
> }
> I have used sbt to create a jar file for it. It is created as a jar file
>
> utilities-assembly-0.1-SNAPSHOT.jar
>
> Now I want to make a call to that method CheckpointDirectory in my app
> code myapp.scala to return the value for hdfsDir:
>
>    val ProgramName = this.getClass.getSimpleName.trim
>    val getCheckpointDirectory =  new getCheckpointDirectory
>    val hdfsDir = getCheckpointDirectory.CheckpointDirectory(ProgramName)
>
> However, I am getting a compilation error as expected
>
> not found: type getCheckpointDirectory
> [error]     val getCheckpointDirectory =  new getCheckpointDirectory
> [error]                                       ^
> [error] one error found
> [error] (compile:compileIncremental) Compilation failed
>
> So a basic question: in order for compilation to work, do I need to create
> a package for my jar file, or add a dependency like the following I do in sbt:
>
> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
> libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1"
> libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.1"
>
>
> Or add the jar file to $CLASSPATH?
>
> Any advice will be appreciated.
>
> Thanks
>
