I think you can place the jar in lib/ under SPARK_HOME and then compile without 
any change to your classpath. This could be a temporary way to include your 
jar. You can also declare it as a dependency in your pom.xml.
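For an sbt build, the same two options apply: drop the jar into the project's lib/ directory (sbt picks up unmanaged jars there automatically), or declare it as a managed dependency so that `sbt assembly` bundles it into the fat jar. A minimal sketch of the managed route, where the group/artifact/version coordinates are placeholders standing in for your actual jar:

```scala
// build.sbt -- minimal sketch; "com.xxx" % "yyy-zzz" % "1.0.0"
// is a placeholder for the real coordinates of your jar.
name := "my-spark-app"

scalaVersion := "2.10.4"

// Option 1 (unmanaged): copy the jar into this project's lib/
// directory; sbt adds everything in lib/ to the classpath with
// no further configuration.

// Option 2 (managed): declare the dependency so sbt resolves it
// and the assembly plugin includes it in the assembled jar.
libraryDependencies += "com.xxx" % "yyy-zzz" % "1.0.0"
```

With either option, `import com.xxx.yyy.zzz._` should then resolve at compile time instead of failing with "object xxx is not a member of package com".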

Thanks,
Daoyuan

-----Original Message-----
From: flyson [mailto:m_...@msn.com] 
Sent: Wednesday, December 03, 2014 11:23 AM
To: d...@spark.incubator.apache.org
Subject: object xxx is not a member of package com

Hello everyone,

Could anybody tell me how to import and call third-party Java classes from 
inside Spark?
Here's my case:
I have a jar file (the package layout is com.xxx.yyy.zzz) that contains some 
Java classes, and I need to call some of them from Spark code.
I put the statement "import com.xxx.yyy.zzz._" at the top of the affected Spark 
file, set the location of the jar file in the CLASSPATH environment variable, 
and ran ".sbt/sbt assembly" to build the project. As a result, I got an error 
saying "object xxx is not a member of package com".

I thought this might be related to the library dependencies, but I couldn't 
figure it out. Any suggestion/solution would be appreciated!

By the way, in the Scala console, if :cp is used to add the jar file, I can 
import the classes from it.

Thanks! 



--
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org


