It is actually in the JavaPairRDD class:
http://spark.incubator.apache.org/docs/latest/api/core/index.html#org.apache.spark.api.java.JavaPairRDD
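For illustration, a minimal sketch of what those overloads look like in use. This assumes a JavaSparkContext named `sc` and Spark's Java API on the classpath; the method signatures (groupByKey(int), reduceByKey(Function2, int), sortByKey(boolean, int)) are the numPartitions overloads the docs above list.

```java
import java.util.Arrays;
import scala.Tuple2;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class NumPartitionsExample {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext("local", "example");

        JavaPairRDD<String, Integer> pairs = sc.parallelizePairs(Arrays.asList(
            new Tuple2<String, Integer>("a", 1),
            new Tuple2<String, Integer>("b", 2),
            new Tuple2<String, Integer>("a", 3)));

        // Each shuffle operation takes an optional numPartitions argument,
        // which sets the number of reduce tasks for that operation alone
        // (instead of falling back to spark.default.parallelism):
        JavaPairRDD<String, Integer> summed =
            pairs.reduceByKey((x, y) -> x + y, 8);          // 8 reduce tasks
        JavaPairRDD<String, Iterable<Integer>> grouped =
            pairs.groupByKey(8);                            // 8 reduce tasks
        JavaPairRDD<String, Integer> sorted =
            pairs.sortByKey(true, 8);                       // ascending, 8 tasks

        sc.stop();
    }
}
```

(Lambda syntax requires Java 8; on older JDKs, pass an anonymous `Function2<Integer, Integer, Integer>` instead.)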



On Mon, Dec 9, 2013 at 10:42 PM, Matt Cheah <mch...@palantir.com> wrote:

>  Was this introduced recently? JavaRDD's function signatures don't seem
> to take that parameter:
>
>
> http://spark.incubator.apache.org/docs/latest/api/core/index.html#org.apache.spark.api.java.JavaRDD
>  ------------------------------
> *From:* Ashish Rangole [arang...@gmail.com]
> *Sent:* Monday, December 09, 2013 7:41 PM
> *To:* user@spark.incubator.apache.org
> *Subject:* Re: JavaRDD, Specify number of tasks
>
>   AFAIK yes.  IIRC, there is a 2nd parameter numPartitions that one can
> provide to these operations.
> On Dec 9, 2013 8:19 PM, "Matt Cheah" <mch...@palantir.com> wrote:
>
>>  Hi
>>
>>  When I use a JavaPairRDD's groupByKey(), reduceByKey(), or sortByKey(),
>> is there a way for me to specify the number of reduce tasks, as there is in
>> a scala RDD? Or do I have to set them all to use spark.default.parallelism?
>>
>>  Thanks,
>>
>>  -Matt Cheah
>>
>>  (feels like I've been asking a lot of questions as of late…)
>>
>