AFAIK yes. IIRC, these operations each take an optional second parameter,
numPartitions, that sets the number of reduce tasks.
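A minimal sketch of what that looks like in the Java API (the class name and the partition count of 4 are just for illustration; this assumes the `groupByKey(int)`, `reduceByKey(Function2, int)`, and `sortByKey(boolean, int)` overloads on `JavaPairRDD`):

```java
import java.util.Arrays;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

import scala.Tuple2;

public class NumPartitionsExample {

    // Reduce by key with an explicit number of reduce tasks,
    // instead of falling back on spark.default.parallelism.
    public static Map<String, Integer> reduceWithPartitions() {
        SparkConf conf = new SparkConf().setMaster("local").setAppName("example");
        JavaSparkContext sc = new JavaSparkContext(conf);
        try {
            JavaPairRDD<String, Integer> pairs = sc.parallelizePairs(Arrays.asList(
                    new Tuple2<>("a", 1), new Tuple2<>("b", 2), new Tuple2<>("a", 3)));

            // Second argument is the number of partitions (reduce tasks).
            JavaPairRDD<String, Integer> sums = pairs.reduceByKey((x, y) -> x + y, 4);

            // groupByKey and sortByKey accept the same kind of argument:
            pairs.groupByKey(4);
            pairs.sortByKey(true, 4);

            return sums.collectAsMap();
        } finally {
            sc.stop();
        }
    }
}
```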
On Dec 9, 2013 8:19 PM, "Matt Cheah" <mch...@palantir.com> wrote:

>  Hi
>
>  When I use a JavaPairRDD's groupByKey(), reduceByKey(), or sortByKey(),
> is there a way for me to specify the number of reduce tasks, as there is in
> a scala RDD? Or do I have to set them all to use spark.default.parallelism?
>
>  Thanks,
>
>  -Matt Cheah
>
>  (feels like I've been asking a lot of questions as of late…)
>