Hi:
If we are using CarbonData + Spark to load data, we can set
carbon.number.of.cores.while.loading to the number of executor cores.
For example, when the number of executor cores is set to 6, that means there
are at least 6 cores per node available for loading data, so we can set
carbon.number.of.cores.while.loading to 6 as well.
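As a minimal sketch of wiring the two values together (the 6-core figure
matches the example above; assuming the executors were launched with
--executor-cores 6):

import org.apache.carbondata.core.util.CarbonProperties

// Match CarbonData's loading parallelism to the executor core count.
CarbonProperties.getInstance()
  .addProperty("carbon.number.of.cores.while.loading", "6")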
Hi, dev:
I am using Spark 2.1 + CarbonData 1.2, and have found that if
enable.unsafe.sort=true and the UTF-8 byte length of a column value exceeds
32768, data loading fails.
My test code:
val sb = new StringBuilder
// Build a value whose UTF-8 encoding exceeds 32768 bytes
// (assuming sb was a StringBuilder in the original test).
(1 to 33000).foreach(_ => sb.append('a'))
val longStr = sb.toString()
println(longStr.length())
println(longStr.getBytes("UTF-8").length)
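For reference, a minimal sketch of pushing such a value through a load with
unsafe sort enabled (the table name, CSV path, and the `spark` session are my
assumptions, not from the original report):

import org.apache.carbondata.core.util.CarbonProperties

// Enable the unsafe sort path under which the failure was observed.
CarbonProperties.getInstance()
  .addProperty("enable.unsafe.sort", "true")

// Assumption: `spark` is a CarbonSession-backed SparkSession, and
// /tmp/long_str.csv contains longStr as a single column value.
spark.sql("CREATE TABLE IF NOT EXISTS t_long (c1 STRING) STORED BY 'carbondata'")
spark.sql("LOAD DATA INPATH '/tmp/long_str.csv' INTO TABLE t_long")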