Hi,
What is your cluster setup? How much memory do you have? How much space
does one row consisting only of the 3 columns consume? Do you run other
stuff in the background?
Best regards
On 04.12.2014 at 23:57, bonnahu bonn...@gmail.com wrote:
I am trying to load a large HBase table into Spark
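For context, the usual way to read an HBase table into an RDD in Spark of this vintage is `newAPIHadoopRDD` with HBase's `TableInputFormat`. A sketch, assuming PySpark with the Spark examples jar on the classpath; the ZooKeeper host and table name are placeholders:

```python
from pyspark import SparkContext

sc = SparkContext(appName="hbase-load")

# Placeholders: point these at your ZooKeeper quorum and table.
conf = {
    "hbase.zookeeper.quorum": "zk-host",
    "hbase.mapreduce.inputtable": "my_table",
}

# The converter classes ship with the Spark examples; without them the
# keys/values come back as opaque Writables.
rdd = sc.newAPIHadoopRDD(
    "org.apache.hadoop.hbase.mapreduce.TableInputFormat",
    "org.apache.hadoop.hbase.io.ImmutableBytesWritable",
    "org.apache.hadoop.hbase.client.Result",
    keyConverter="org.apache.spark.examples.pythonconverters."
                 "ImmutableBytesWritableToStringConverter",
    valueConverter="org.apache.spark.examples.pythonconverters."
                   "HBaseResultToStringConverter",
    conf=conf)

print(rdd.count())
```

This only runs against a live cluster; it is the pattern from Spark's bundled `hbase_inputformat.py` example, not code from this thread.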
Hi,
Here is the configuration of the cluster:
Workers: 2
For each worker,
Cores: 24 Total, 0 Used
Memory: 69.6 GB Total, 0.0 B Used
For spark.executor.memory, I didn't set it, so it should be the default
value of 512 MB.
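A 512 MB default executor heap is small for caching a large table, and it can be raised at submit time. A sketch, assuming a standalone-mode submit; the master URL, memory size, and jar name are placeholders, not values from this thread:

```shell
# Hypothetical submit command: adjust master URL, memory, and jar to your cluster.
# --executor-memory overrides the 512m default of spark.executor.memory.
spark-submit \
  --master spark://master:7077 \
  --executor-memory 16g \
  your-app.jar
```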
How much space does one row only consisting of the 3 columns consume?
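The per-row footprint can be estimated from HBase's KeyValue layout: each cell stores its row key, family, qualifier, timestamp, and type alongside the value. A rough sketch; the key, family, qualifier, and value bytes below are illustrative assumptions, not data from this thread:

```python
# Per-cell size of an HBase KeyValue:
#   4B key length + 4B value length
#   + key (2B row length + row + 1B family length + family + qualifier
#          + 8B timestamp + 1B key type)
#   + value
def keyvalue_size(row_key: bytes, family: bytes,
                  qualifier: bytes, value: bytes) -> int:
    key_len = 2 + len(row_key) + 1 + len(family) + len(qualifier) + 8 + 1
    return 4 + 4 + key_len + len(value)

# One row with 3 columns is the sum of its 3 KeyValues.
row_bytes = sum(keyvalue_size(b"row-0001", b"cf", q, b"some-value")
                for q in (b"col1", b"col2", b"col3"))
print(row_bytes)  # 132
```

On-heap caching in Spark adds further Java object overhead on top of this raw serialized size.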
Hi Ted,
Here is the information about the Regions:
Region Server                  Region Count
http://regionserver1:60030/    44
http://regionserver2:60030/    39
http://regionserver3:60030/    55
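The counts above sum to 138 regions across the three servers, which gives a quick way to bound the table size against the cluster's executor memory. A back-of-envelope sketch; the per-region size is a purely illustrative assumption, not a figure from this thread:

```python
# Region counts as reported in the thread.
region_counts = {"regionserver1": 44, "regionserver2": 39, "regionserver3": 55}
total_regions = sum(region_counts.values())

# Assumption: 1 GB per region is illustrative only; real regions can be
# much larger or smaller.
assumed_gb_per_region = 1
table_size_gb = total_regions * assumed_gb_per_region

# Two workers, each with one executor at the default 512 MB heap.
executors, executor_gb = 2, 0.5
cache_gb = executors * executor_gb

print(total_regions, table_size_gb, cache_gb)  # 138 138 1.0
```

Even under this conservative per-region estimate, the table dwarfs the 1 GB of total executor memory available at the defaults, which points at raising spark.executor.memory.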