I have an RDD with 30 records (key/value pairs) and am running 30 executors. I want to repartition this RDD into 30 partitions so that every partition gets exactly one record and is assigned to one executor.

When I used rdd.repartition(30), it repartitioned my RDD into 30 partitions, but some partitions got 2 records, some got 1, and some got none.

Is there any way in Spark to evenly distribute my records across all partitions?

Regards
Prateek
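One common workaround (a sketch, not something stated in the thread): repartition(30) distributes records by hashing, so collisions between partitions are expected. To force exactly one record per partition, you can give each record an explicit index with zipWithIndex and then use partitionBy with a partitioner that sends index i to partition i. The core idea, with plain Python standing in for the RDD so the logic is easy to check:

```python
# Sketch of the zipWithIndex + partitionBy approach, with plain Python
# lists standing in for the RDD. The names below are illustrative.

def index_partitioner(num_partitions):
    """Send index i to partition i % num_partitions."""
    return lambda i: i % num_partitions

def distribute(records, num_partitions):
    """Mimic rdd.zipWithIndex().partitionBy(n): one list per partition."""
    part = index_partitioner(num_partitions)
    partitions = [[] for _ in range(num_partitions)]
    for i, rec in enumerate(records):  # enumerate plays the role of zipWithIndex
        partitions[part(i)].append(rec)
    return partitions

records = [("k%d" % i, i) for i in range(30)]  # 30 key/value pairs
partitions = distribute(records, 30)
assert all(len(p) == 1 for p in partitions)  # exactly one record per partition
```

In PySpark this would look roughly like rdd.zipWithIndex().map(lambda kv: (kv[1], kv[0])).partitionBy(30, lambda i: i % 30).values(). Note that even with one record per partition, whether each partition runs on a distinct executor is up to the scheduler, not the partitioner.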
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/how-can-evenly-distribute-my-records-in-all-partition-tp25394.html