Forcing Spark to send exactly one element to each worker node

2014-05-12 Thread NevinLi158
Is there any way to force Spark to send exactly one element to each worker node? I've tried coalescing and repartitioning the RDD so that the number of partitions equals the number of elements in the RDD, but that hasn't worked. Thanks!
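A minimal sketch of one way to get this in Spark's Scala API, assuming the element count is known up front (the object name and data are illustrative): repartition(n) assigns elements to partitions by hash, so some partitions can end up with zero or two elements; keying each element by its index and supplying a custom Partitioner that sends index i to partition i guarantees exactly one element per partition.

    import org.apache.spark.{Partitioner, SparkConf, SparkContext}

    object OnePerPartition {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("one-per-partition"))
        val data = Seq("a", "b", "c", "d")
        val n = data.length

        // Key each element by its index (0 .. n-1), then partition so that
        // index i lands in partition i: exactly one element per partition.
        val indexed = sc.parallelize(data).zipWithIndex().map(_.swap)
        val onePerPartition = indexed.partitionBy(new Partitioner {
          override def numPartitions: Int = n
          override def getPartition(key: Any): Int = key.asInstanceOf[Long].toInt
        }).values

        // Verify: every partition holds at most one element.
        onePerPartition.foreachPartition(it => assert(it.length <= 1))
        sc.stop()
      }
    }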

Re: Forcing Spark to send exactly one element to each worker node

2014-05-12 Thread NevinLi158
I was under the impression that pipe() would just run the C++ application on the remote node: is the application supposed to run slower if you use pipe() to execute it?
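For context, a minimal sketch of how pipe() behaves, assuming a C++ binary at the hypothetical path /opt/app/process_element that reads lines on stdin and writes result lines to stdout: Spark forks the process once per partition and streams that partition's elements through it, so the binary itself runs at native speed; any slowdown usually comes from process startup and line-by-line serialization rather than from pipe() itself.

    import org.apache.spark.{SparkConf, SparkContext}

    object PipeSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("pipe-sketch"))

        // Three partitions -> pipe() launches the external process three
        // times, once per partition, feeding that partition's elements to
        // its stdin one per line; its stdout lines become the result RDD.
        val input = sc.parallelize(Seq("1", "2", "3"), numSlices = 3)

        // /opt/app/process_element is a placeholder for the C++ binary.
        val results = input.pipe("/opt/app/process_element")
        results.collect().foreach(println)
        sc.stop()
      }
    }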