At compile time, you need to declare a dependency on the jar that contains
getCheckpointDirectory so the compiler can resolve the class.
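
A minimal sketch of how that can be done, assuming a standard sbt layout:
drop utilities-assembly-0.1-SNAPSHOT.jar into your application project's
lib/ directory, where sbt treats it as an unmanaged dependency, or point at
it explicitly in build.sbt (the path here is a placeholder, adjust to where
the jar actually lives):

    // add the utilities jar to the compile classpath as an unmanaged jar
    unmanagedJars in Compile += file("/path/to/utilities-assembly-0.1-SNAPSHOT.jar")

Once the jar is on the compile classpath, the class should resolve and the
code below should compile.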

At runtime, you can pass the jar to spark-submit with
'--jars utilities-assembly-0.1-SNAPSHOT.jar'.
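
For example (the main class and application jar names below are
placeholders, adjust them to your own build), the invocation would look
something like:

    spark-submit \
      --class myapp \
      --jars utilities-assembly-0.1-SNAPSHOT.jar \
      myapp-assembly-0.1-SNAPSHOT.jar

--jars puts the utilities jar on the driver and executor classpaths at
runtime.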

Cheers

On Sun, Jun 5, 2016 at 3:06 AM, Ashok Kumar <ashok34...@yahoo.com.invalid>
wrote:

> Hi all,
>
> Appreciate any advice on this. It is about Scala.
>
> I have created a very basic Utilities.scala that contains a test class and
> method. I intend to add my own classes and methods as I expand, and to make
> references to these classes and methods in my other apps.
>
> class getCheckpointDirectory {
>   def CheckpointDirectory (ProgramName: String) : String  = {
>      var hdfsDir = "hdfs://host:9000/user/user/checkpoint/"+ProgramName
>      return hdfsDir
>   }
> }
> I have used sbt to create a jar file for it. It is created as a jar file
>
> utilities-assembly-0.1-SNAPSHOT.jar
>
> Now I want to make a call to that method CheckpointDirectory in my app
> code myapp.scala to return the value for hdfsDir
>
>    val ProgramName = this.getClass.getSimpleName.trim
>    val getCheckpointDirectory =  new getCheckpointDirectory
>    val hdfsDir = getCheckpointDirectory.CheckpointDirectory(ProgramName)
>
> However, I am getting a compilation error as expected
>
> not found: type getCheckpointDirectory
> [error]     val getCheckpointDirectory =  new getCheckpointDirectory
> [error]                                       ^
> [error] one error found
> [error] (compile:compileIncremental) Compilation failed
>
> So a basic question: in order for compilation to work, do I need to create
> a package for my jar file, or add a dependency like the following I do in sbt?
>
> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
> libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1"
> libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.1"
>
>
> Any advice will be appreciated.
>
> Thanks
>
