I need to tune the Spark data partition size for input data that is stored in Tachyon (the default is 512MB), but the above method can't work for Tachyon data.
Do you have any suggestions? Thanks very much!
Best Regards,
Jia
-- Forwarded message --
From: Jia Zou <jacqueline...@gmail.com>
Date: Thu, Jan 21, 2016 at 10:05 PM
Subject: Spark partition size tuning
To: "user @spark" <user@spark.apache.org>
Dear all!
When using Spark to read from the local file system, the default partition size
is 32MB. How can I increase the partition size to 128MB, to reduce the
number of tasks?
Thank you very much!
Best Regards,
Jia