Unable to import SharedSparkContext

2015-11-18 Thread njoshi
Hi,

Doesn't *SharedSparkContext* come with spark-core? Do I need to include any
special package in the library dependencies to use SharedSparkContext?

I am trying to write a test suite similar to the *LogisticRegressionSuite*
test in Spark ML. Unfortunately, I am unable to import any of the
following packages:

import org.apache.spark.SparkFunSuite
import org.apache.spark.ml.param.ParamsSuite
import org.apache.spark.ml.util.{DefaultReadWriteTest, MLTestingUtils}
import org.apache.spark.mllib.util.MLlibTestSparkContext
import org.apache.spark.mllib.util.TestingUtils._

Thanks in advance,
Nikhil






Re: Unable to import SharedSparkContext

2015-11-18 Thread Sourigna Phetsarath
Nikhil,

Please take a look at: https://github.com/holdenk/spark-testing-base
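For example, with sbt you would add it as a test dependency along these
lines (the version string here is a guess on my part; pick the
spark-testing-base release that matches your Spark version):

libraryDependencies +=
  "com.holdenkarau" %% "spark-testing-base" % "1.5.2_0.3.1" % "test"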

On Wed, Nov 18, 2015 at 2:12 PM, Marcelo Vanzin wrote:

> On Wed, Nov 18, 2015 at 11:08 AM, njoshi wrote:
> > Doesn't *SharedSparkContext* come with spark-core? Do I need to include
> > any special package in the library dependencies to use
> > SharedSparkContext?
>
> That's a test class. It's not part of the Spark API.
>
> --
> Marcelo


-- 

*Gna Phetsarath*
System Architect // AOL Platforms // Data Services // Applied Research Chapter
770 Broadway, 5th Floor, New York, NY 10003
o: 212.402.4871 // m: 917.373.7363
vvmr: 8890237 aim: sphetsarath20 t: @sourigna


Re: Unable to import SharedSparkContext

2015-11-18 Thread Sourigna Phetsarath
Plus this article:
http://blog.cloudera.com/blog/2015/09/making-apache-spark-testing-easy-with-spark-testing-base/
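A minimal suite using the library's SharedSparkContext trait might look
like this (suite and test names are just placeholders; the trait supplies
the `sc` context and tears it down for you after the suite runs):

import com.holdenkarau.spark.testing.SharedSparkContext
import org.scalatest.FunSuite

class WordCountSuite extends FunSuite with SharedSparkContext {
  test("counts values across the shared context") {
    // `sc` is provided by SharedSparkContext and reused between tests
    val counts = sc.parallelize(Seq("a", "b", "a")).countByValue()
    assert(counts("a") === 2L)
  }
}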



Re: Unable to import SharedSparkContext

2015-11-18 Thread Nikhil Joshi
Thanks Marcelo and Sourigna. I see it now: the SharedSparkContext class I
was trying to import is part of Spark's own sources, but it sits under the
test package of spark-core, so it never makes it onto the regular compile
classpath. That caused the trouble :(.
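In case anyone else hits this: since these classes live in test sources,
one workaround (untested on my end, so treat it as a sketch) is to depend
on Spark's published "tests" artifacts directly, e.g. in sbt:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.5.2" % "test" classifier "tests",
  "org.apache.spark" %% "spark-mllib" % "1.5.2" % "test" classifier "tests"
)

Otherwise spark-testing-base, as suggested above, avoids the issue
altogether.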




-- 

*Nikhil Joshi*
Principal Data Scientist
*Aol* PLATFORMS
395 Page Mill Rd, Palo Alto, CA 94306-2024
vvmr: 8894737