Hi

I think a range partitioner is not available in PySpark, so we would need to
create one ourselves. How should we do that? That is my question.
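One way to approximate range partitioning in PySpark without any Partitioner
class: rdd.partitionBy accepts a numPartitions and a plain partitionFunc, so we
can sample the keys, derive split points, and bucket each key with bisect. A
minimal sketch (the helper names here are mine, not a Spark API):

```python
from bisect import bisect_left

def make_range_partition_func(sample_keys, num_partitions):
    """Build a partitionFunc that mimics a range partitioner:
    derive num_partitions - 1 split points from a sample of the keys,
    then route each key to the bucket its value falls into."""
    keys = sorted(sample_keys)
    # pick evenly spaced boundaries from the sorted sample
    step = len(keys) / num_partitions
    bounds = [keys[int(step * i)] for i in range(1, num_partitions)]

    def partition_func(key):
        # bisect_left counts the boundaries strictly below the key,
        # which is exactly a partition id in [0, num_partitions)
        return bisect_left(bounds, key)

    return partition_func

# Untested usage sketch against an RDD of (key, value) pairs:
# sample = rdd.keys().sample(False, 0.1).collect()
# func = make_range_partition_func(sample, 4)
# partitioned = rdd.partitionBy(4, func)
```

This only approximates Scala's RangePartitioner (boundaries come from a sample,
so skewed data can still produce uneven partitions), but keys end up ordered
across partitions, which is usually the point of range partitioning.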

On Tue, Sep 1, 2015 at 3:57 PM, Jem Tucker <jem.tuc...@gmail.com> wrote:

> Ah, sorry, I misread your question. In PySpark it looks like you just need
> to instantiate the Partitioner class with numPartitions and partitionFunc.
>
> On Tue, Sep 1, 2015 at 11:13 AM shahid ashraf <sha...@trialx.com> wrote:
>
>> Hi
>>
>> I did not get this. For example, what if I need to create a custom
>> partitioner such as a range partitioner?
>>
>> On Tue, Sep 1, 2015 at 3:22 PM, Jem Tucker <jem.tuc...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> You just need to extend Partitioner and override the numPartitions and
>>> getPartition methods, see below
>>>
>>> class MyPartitioner extends Partitioner {
>>>   override def numPartitions: Int = ??? // return the number of partitions
>>>   override def getPartition(key: Any): Int = ??? // partition for a key
>>> }
>>>
>>> On Tue, Sep 1, 2015 at 10:15 AM shahid qadri <shahidashr...@icloud.com>
>>> wrote:
>>>
>>>> Hi Sparkians
>>>>
>>>> How can we create a custom partitioner in PySpark?
>>>>
>>>> ---------------------------------------------------------------------
>>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>>
>>>>
>>
>>
>> --
>> with Regards
>> Shahid Ashraf
>>
>


-- 
with Regards
Shahid Ashraf
