where is the org.apache.spark.util package?

2014-11-07 Thread ll
I'm trying to compile some of the Spark code directly from the source
(https://github.com/apache/spark). It complains about the missing package
org.apache.spark.util, and it doesn't look like this package is part of the
source code on GitHub.

Where can I find this package?
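
The package lives in the spark-core module (under core/src/main/scala/org/apache/spark/util
in the GitHub tree), so it only becomes visible once spark-core has been built or added as a
dependency. A minimal sketch that exercises StatCounter, one of the public classes in that
package, assuming a built spark-core jar is on the classpath:

// Minimal sketch: confirms org.apache.spark.util is visible once the
// spark-core jar (built from https://github.com/apache/spark) is on the
// classpath. StatCounter is one of the public classes in that package.
import org.apache.spark.util.StatCounter

object UtilPackageCheck {
  def main(args: Array[String]): Unit = {
    // StatCounter.apply accepts a varargs of Doubles and tracks running statistics.
    val stats = StatCounter(1.0, 2.0, 3.0, 4.0)
    println(s"count=${stats.count}, mean=${stats.mean}, stdev=${stats.stdev}")
  }
}

If that compiles and runs, the package itself is present; the UtilPackageCheck object is just
an illustrative name, not part of Spark.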





Re: where is the org.apache.spark.util package?

2014-11-07 Thread ll
I found the util package under the Spark core module, but now I get this error:

  Symbol Utils is inaccessible from this place

What does this error mean? Both org.apache.spark.util and
org.apache.spark.util.Utils are there now.

Thanks.
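
That error generally means the symbol is access-restricted: in the Spark source,
org.apache.spark.util.Utils is declared private[spark], so it can only be referenced from
code that itself sits inside the org.apache.spark package. Referencing it from your own
package produces "Symbol Utils is inaccessible from this place." A minimal sketch of one
common workaround, assuming you accept a dependency on Spark internals and that helpers such
as createTempDir/bytesToString exist in the version you built (the exact method set varies
between releases), is to place a small bridge object inside the org.apache.spark namespace:

// Sketch of a workaround: because Utils is private[spark], the file that
// references it must itself be declared inside the org.apache.spark package.
// UtilsBridge and its package name are illustrative, not part of Spark.
package org.apache.spark.myextensions

import org.apache.spark.util.Utils

object UtilsBridge {
  // Re-expose a couple of Utils helpers to code outside org.apache.spark.
  def tempDir(): java.io.File = Utils.createTempDir()
  def bytesString(size: Long): String = Utils.bytesToString(size)
}

The safer alternative is to avoid Utils entirely and stick to Spark's public API, since
private[spark] members can change or disappear between releases.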


