Re: Dependency on TestingUtils in a Spark package

2016-01-12 Thread Robert Dodier
On Tue, Jan 12, 2016 at 12:55 PM, Reynold Xin wrote:

> If you need it, just copy it over to your own package. That's probably the
> safest option.

OK, not a big deal. I was just hoping to avoid that, in part because the code
I'm working on is also proposed as a pull request, and it seems like a good
idea to keep the testing environment uniform between the PR and the Spark
package.

best,

Robert Dodier




Re: Dependency on TestingUtils in a Spark package

2016-01-12 Thread Reynold Xin
If you need it, just copy it over to your own package. That's probably the
safest option.
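
In practice, "copying it over" might amount to reproducing only the small
approximate-comparison helpers a package actually uses. The sketch below is
illustrative rather than a copy of the Spark source: the object name
ApproxEquality is made up, and it assumes only absolute-tolerance comparison
of doubles and MLlib vectors is needed.

import org.apache.spark.mllib.linalg.Vector

// Hypothetical stand-in for the approximate-comparison helpers in
// org.apache.spark.mllib.util.TestingUtils. Names and behavior here are
// illustrative, not copied from the Spark source.
object ApproxEquality {

  // True when x and y differ by at most eps (absolute tolerance).
  def almostEqual(x: Double, y: Double, eps: Double): Boolean =
    math.abs(x - y) <= eps

  // Element-wise absolute-tolerance comparison of two MLlib vectors.
  def almostEqual(x: Vector, y: Vector, eps: Double): Boolean =
    x.size == y.size &&
      x.toArray.zip(y.toArray).forall { case (a, b) => almostEqual(a, b, eps) }
}

A test suite could then assert, for example,
assert(ApproxEquality.almostEqual(model.intercept, 0.5, 1e-6))
without depending on Spark's test sources at all.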



On Tue, Jan 12, 2016 at 12:50 PM, Ted Yu wrote:

> There is no annotation on the TestingUtils class indicating whether it is
> suitable for consumption by external projects.
>
> You should assume the class is not public since its methods may change in
> future Spark releases.
>
> Cheers
>
> On Tue, Jan 12, 2016 at 12:36 PM, Robert Dodier wrote:
>
>> Hi,
>>
>> I'm putting together a Spark package (in the spark-packages.org sense)
>> and I'd like to make use of the class
>> org.apache.spark.mllib.util.TestingUtils which appears in
>> mllib/src/test. Can I declare a dependency in my build.sbt to pull in
>> a suitable jar? I have searched around but I have not been able to
>> identify a jar which contains TestingUtils. I suppose I could cut 'n'
>> paste the relevant bits from the source code but I'd really rather
>> just declare a dependency. I looked at a few other packages at
>> spark-packages.org but I couldn't find an example of a project which
>> was doing something similar.
>>
>> Thanks in advance for any light you can shed on this problem.
>>
>> Robert Dodier
>>
>>
>>
>


Re: Dependency on TestingUtils in a Spark package

2016-01-12 Thread Ted Yu
There is no annotation on the TestingUtils class indicating whether it is
suitable for consumption by external projects.

You should assume the class is not public since its methods may change in
future Spark releases.

Cheers

On Tue, Jan 12, 2016 at 12:36 PM, Robert Dodier wrote:

> Hi,
>
> I'm putting together a Spark package (in the spark-packages.org sense)
> and I'd like to make use of the class
> org.apache.spark.mllib.util.TestingUtils which appears in
> mllib/src/test. Can I declare a dependency in my build.sbt to pull in
> a suitable jar? I have searched around but I have not been able to
> identify a jar which contains TestingUtils. I suppose I could cut 'n'
> paste the relevant bits from the source code but I'd really rather
> just declare a dependency. I looked at a few other packages at
> spark-packages.org but I couldn't find an example of a project which
> was doing something similar.
>
> Thanks in advance for any light you can shed on this problem.
>
> Robert Dodier
>
>
>
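
For reference, the usual way a tests-classifier artifact is declared in
build.sbt is sketched below. The coordinates and version are illustrative
only; as the thread notes, no published jar containing TestingUtils was
identified, so this syntax only helps if such an artifact exists for the
Spark version in question.

// build.sbt sketch: declaring a dependency on a tests-classifier jar.
// Coordinates and version are illustrative; no published test jar
// containing TestingUtils was identified in this thread.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-mllib" % "1.6.0" % "provided",
  "org.apache.spark" %% "spark-mllib" % "1.6.0" % "test" classifier "tests"
)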