Hi Maria,

Files whose names end in 1.0.0 indicate Spark 1.0.0, not Hadoop 1.0. You can check your Hadoop version by running "$HADOOP_HOME/bin/hadoop version", where HADOOP_HOME points to your Hadoop installation. On clusters started by the Spark EC2 scripts, this should be "/root/ephemeral-hdfs".
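As a quick sketch (assuming the spark-ec2 default location; adjust HADOOP_HOME for your own install):

```shell
# Default HADOOP_HOME on a spark-ec2 master; an assumption for other setups.
HADOOP_HOME=/root/ephemeral-hdfs

# Print the Hadoop version if the binary exists; otherwise report where we looked.
if [ -x "$HADOOP_HOME/bin/hadoop" ]; then
  "$HADOOP_HOME/bin/hadoop" version
else
  echo "hadoop binary not found at $HADOOP_HOME/bin/hadoop"
fi
```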
Andrew

2014-07-22 7:07 GMT-07:00 mrm <ma...@skimlinks.com>:

> Hi,
>
> Where can I find the version of Hadoop my cluster is using? I launched my
> ec2 cluster using the spark-ec2 script with the "--hadoop-major-version=2"
> option. However, the folder "hadoop-native/lib" in the master node only
> contains files that end in 1.0.0. Does that mean that I have Hadoop version
> 1?
>
> Thanks!
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/hadoop-version-tp10405.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.