Hello, I am reading data from HDFS in a Spark application, and as far as I understand, each HDFS block becomes one Spark partition by default. Is there any way to read only a single block from HDFS in my Spark application?
Thank you, Thodoris
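For reference, one common workaround (not an official single-block API) is to filter by partition index with `mapPartitionsWithIndex`, relying on the default mapping of one HDFS block per partition. A minimal Scala sketch, where the file path and partition index `0` are placeholders:

```scala
import org.apache.spark.sql.SparkSession

object SingleBlockExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("single-block").getOrCreate()
    val sc = spark.sparkContext

    // Hypothetical path; with default settings each HDFS block
    // maps to one input partition.
    val rdd = sc.textFile("hdfs:///path/to/file")

    val targetPartition = 0 // index of the block/partition to keep

    // Keep only the records of the target partition; every other
    // partition returns an empty iterator. Note that all partitions
    // are still scheduled as tasks, so this skips processing but not
    // task launch overhead.
    val oneBlock = rdd.mapPartitionsWithIndex { (idx, iter) =>
      if (idx == targetPartition) iter else Iterator.empty
    }

    oneBlock.take(5).foreach(println)
    spark.stop()
  }
}
```

To truly read only one block's bytes (avoiding tasks for the other blocks entirely), you would need to drop down to the Hadoop `FileSystem` API and read the block's byte range directly, which bypasses Spark's input-split machinery.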