What's the canonical way to find out the number of physical machines in a
cluster at runtime in Spark? I believe SparkContext.defaultParallelism gives
the total number of cores, but I'm interested in the number of machines
(i.e., NICs).

I'm writing a Spark Streaming application that ingests from Kafka via the
Receiver API, and I want to create one DStream per physical machine for read
parallelism. How can I determine at runtime how many machines there are, so
I know how many DStreams to create?
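For what it's worth, one approach I've seen is to derive the host count from
SparkContext.getExecutorMemoryStatus, whose keys are "host:port" strings (and
include the driver), then create that many receiver streams and union them.
This is only a sketch under assumptions: the ZooKeeper quorum, consumer group,
and topic names below are placeholders, and it assumes the driver does not
share a host with a worker.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object PerHostReceivers {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("per-host-receivers")
    val ssc  = new StreamingContext(conf, Seconds(5))
    val sc   = ssc.sparkContext

    // Keys of getExecutorMemoryStatus look like "host:port" and include the
    // driver, so strip the port, dedupe, and subtract one for the driver.
    val hosts = sc.getExecutorMemoryStatus.keys
      .map(_.split(":")(0))
      .toSet
    val numMachines = math.max(hosts.size - 1, 1)

    // One receiver DStream per machine, unioned for downstream processing.
    // "zkhost:2181", "my-group", and "my-topic" are placeholder names.
    val streams = (1 to numMachines).map { _ =>
      KafkaUtils.createStream(ssc, "zkhost:2181", "my-group", Map("my-topic" -> 1))
    }
    val unified = ssc.union(streams)
    unified.count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

A caveat: there is a race at startup, since executors may register with the
driver after this code runs, so the host count can be low if it is read too
early.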
