Thanks all for your answers. Much appreciated.

On Thu, Jun 23, 2022 at 6:07 AM Yong Walt <yongw...@gmail.com> wrote:

> We have many cases like this; it won't cause OOM.
>
> Thanks
>
> On Wed, Jun 22, 2022 at 8:28 PM Sid <flinkbyhe...@gmail.com> wrote:
>
>> I have a 150 TB CSV file.
>>
>> I have a total of 100 TB of RAM and 100 TB of disk. So if I do something
>> like this:
>>
>> spark.read.option("header","true").csv(filepath).show(false)
>>
>> Will it lead to an OOM error since there isn't enough memory, or will it
>> spill the data onto the disk and process it?
>>
>> Thanks,
>> Sid
>>
>
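
For completeness, a minimal sketch of why the call above does not need the
whole file in memory (Scala; the object name, app name, and file location
are illustrative assumptions, not taken from the thread). show() is an
action, but it only requests the first 20 rows, so Spark scans just enough
input partitions to produce them:

  import org.apache.spark.sql.SparkSession

  object ShowLargeCsv {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("show-large-csv")        // illustrative app name
        .getOrCreate()

      // Hypothetical location; substitute the real path to the CSV.
      val filepath = "hdfs:///data/large.csv"

      // csv() is lazy: nothing is read until an action runs.
      val df = spark.read
        .option("header", "true")
        .csv(filepath)

      // show(false) only disables column truncation; it still asks for
      // the default 20 rows, so only a few partitions are actually read.
      df.show(false)

      spark.stop()
    }
  }

Jobs that do touch the whole dataset (a full count, a wide join, a sort)
are processed partition by partition on the executors, and shuffle or sort
data that does not fit in executor memory spills to local disk, so the
full 150 TB is never held in memory at once either.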
