Hi Ted,
Thanks for your response. Perhaps this might help: I am trying to
access/read binary files stored across a series of servers.
Line used to build the RDD:
val BIN_pairRDD: RDD[(BIN_Key, BIN_Value)] =
  spark.newAPIHadoopFile("not.used", classOf[BIN_InputFormat],
    classOf[BIN_Key], classOf[BIN_Value])
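For context on what I believe the driver is doing: it places each task according to the hostnames returned by the split's getLocations(), so an InputFormat for remotely stored binary blocks has to report those hosts itself. Below is a self-contained sketch of that idea; BinSplit and its fields are hypothetical names, and the local SplitLike trait stands in for the real contract (actual code would extend org.apache.hadoop.mapreduce.InputSplit with org.apache.hadoop.io.Writable):

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, DataInputStream, DataOutputStream}

// Local stand-in for the parts of Hadoop's InputSplit contract that matter
// for scheduling (real code extends org.apache.hadoop.mapreduce.InputSplit).
trait SplitLike {
  def getLength: Long
  def getLocations: Array[String] // preferred hostnames for this task
}

// Hypothetical split describing one remote binary block.
class BinSplit(var path: String, var hosts: Array[String]) extends SplitLike {
  def this() = this("", Array.empty[String]) // no-arg ctor for deserialization

  override def getLength: Long = 0L
  // The driver consults these hostnames when choosing a worker for the task.
  override def getLocations: Array[String] = hosts

  // Writable-style serialization, as Hadoop ships splits to tasks this way.
  def write(out: DataOutputStream): Unit = {
    out.writeUTF(path)
    out.writeInt(hosts.length)
    hosts.foreach(out.writeUTF)
  }

  def readFields(in: DataInputStream): Unit = {
    path = in.readUTF()
    hosts = Array.fill(in.readInt())(in.readUTF())
  }
}

object LocalityDemo {
  def main(args: Array[String]): Unit = {
    val split = new BinSplit("block-0007", Array("worker-3.internal"))

    // Round-trip through the Writable-style serialization, as the
    // framework would before handing the split to a task.
    val buf = new ByteArrayOutputStream()
    split.write(new DataOutputStream(buf))
    val copy = new BinSplit()
    copy.readFields(new DataInputStream(new ByteArrayInputStream(buf.toByteArray)))

    println(copy.getLocations.mkString(",")) // prints "worker-3.internal"
  }
}
```

One thing worth checking: the hostnames a split reports have to match the workers' hostnames exactly as Spark sees them; in my experience a hostname-vs-IP mismatch makes the locality preference silently fall back to arbitrary workers.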
>
> What else can I check, or change, to force the driver to send these tasks to
> the right workers?
>
> Thanks!
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-driver-assigning-splits-to-incorrect-workers-tp27261.html
Sent from the Apache Spark User List mailing list