I'm trying to load a 3.5 GB file with 60 million rows using CsvBulkLoadTool. It hangs while loading the HFiles. The job completes successfully if I split the input into two files, but I'd like to avoid doing that. This is on Amazon EMR; could this be an issue with disk space or memory? I have one master and two region servers, with 16 GB of memory on each node.
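For context, I'm invoking the tool roughly like this (the table name, input path, and ZooKeeper quorum below are placeholders, not my exact values):

    hadoop jar phoenix-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool \
        --table MY_TABLE \
        --input /data/input.csv \
        --zookeeper zk-host:2181

The MapReduce phase that writes the HFiles finishes; it's the subsequent step of handing the HFiles over to the region servers that never completes.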