Re: Local spark jars not being detected

2015-06-20 Thread Ritesh Kumar Singh
Yes, finally solved. It was there in front of my eyes all the time.

Thanks a lot, Pete.


Re: Local spark jars not being detected

2015-06-20 Thread Pete Zybrick
It looks like you are using parens instead of curly braces on scala.version.
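
A Maven property reference has to use ${...}; $(...) is passed through as a
literal, which is why the artifact name in the error still contains
$(scala.version). A minimal corrected sketch, assuming scala.version and
spark.version are defined in the POM's <properties> (note the artifact suffix
is the Scala binary version 2.11, not the full 2.11.4):

    <properties>
        <scala.version>2.11</scala.version>
        <spark.version>1.3.0</spark.version>
    </properties>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.version}</artifactId>
        <version>${spark.version}</version>
        <scope>provided</scope>
    </dependency>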



> On Jun 20, 2015, at 8:38 AM, Ritesh Kumar Singh wrote:
> 
> Hi,
> 
> I'm using the IntelliJ IDE for my spark project.
> I've compiled Spark 1.3.0 for Scala 2.11.4 and here's one of the compiled
> jars installed in my m2 folder:
> 
> ~/.m2/repository/org/apache/spark/spark-core_2.11/1.3.0/spark-core_2.11-1.3.0.jar
> 
> But when I add this dependency in my pom file for the project:
> 
> <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-core_$(scala.version)</artifactId>
>     <version>${spark.version}</version>
>     <scope>provided</scope>
> </dependency>
> 
> I'm getting Dependency "org.apache.spark:spark-core_$(scala.version):1.3.0" 
> not found.
> Why is this happening and what's the workaround?

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Local spark jars not being detected

2015-06-20 Thread Akhil Das
Not sure, but try removing the provided scope, or create a lib directory in
the project home and copy that jar over there.
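
If you go the lib-directory route, one way to wire the jar in is a
system-scoped dependency; a rough sketch, assuming the jar is copied to lib/
under the project root (the path and scope here are illustrative, not from
the thread):

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>1.3.0</version>
        <scope>system</scope>
        <systemPath>${project.basedir}/lib/spark-core_2.11-1.3.0.jar</systemPath>
    </dependency>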
On 20 Jun 2015 18:08, "Ritesh Kumar Singh" wrote:

> Hi,
>
> I'm using the IntelliJ IDE for my spark project.
> I've compiled Spark 1.3.0 for Scala 2.11.4 and here's one of the
> compiled jars installed in my m2 folder:
>
>
> ~/.m2/repository/org/apache/spark/spark-core_2.11/1.3.0/spark-core_2.11-1.3.0.jar
>
> But when I add this dependency in my pom file for the project:
>
> <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-core_$(scala.version)</artifactId>
>     <version>${spark.version}</version>
>     <scope>provided</scope>
> </dependency>
>
> I'm getting Dependency
> "org.apache.spark:spark-core_$(scala.version):1.3.0" not found.
> Why is this happening and what's the workaround?
>