During spark-submit when running Hive on Spark I get:

    Exception in thread "main" java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.hdfs.HftpFileSystem could not be instantiated
    Caused by: java.lang.IllegalAccessError: tried to access method org.apache.hadoop.fs.DelegationTokenRenewer.<init>(Ljava/lang/Class;)V from class org.apache.hadoop.hdfs.HftpFileSystem

I managed to get Hive on Spark working on a staging cluster I have, and now I'm trying to do the same on a production cluster, where this happened. Both clusters are CDH 5.4.3. I read that this error comes from something not being compiled against the correct Hadoop version. My main question is: which binary/jar/file can cause this? I tried replacing the binaries and jars with the ones used by the staging cluster (where Hive on Spark worked), and it didn't help.

Thank you to anyone reading this, and thank you for any pointers on where to look.

Ophir
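In case it helps with suggestions on where to look, here is a small diagnostic sketch I'm thinking of running on the production cluster to see which jar each of the classes from the stack trace is actually loaded from (nothing cluster-specific in it, just the class names taken from the error above; the class/file name JarCheck is my own):

    import java.net.URL;
    import java.security.CodeSource;

    public class JarCheck {
        public static void main(String[] args) throws Exception {
            String[] names = {
                "org.apache.hadoop.fs.FileSystem",
                "org.apache.hadoop.fs.DelegationTokenRenewer",
                "org.apache.hadoop.hdfs.HftpFileSystem"
            };
            for (String name : names) {
                // Load without initializing, so the IllegalAccessError itself is not triggered here.
                Class<?> c = Class.forName(name, false, JarCheck.class.getClassLoader());
                CodeSource cs = c.getProtectionDomain().getCodeSource();
                URL location = (cs != null) ? cs.getLocation() : null;
                // Prints which jar on the classpath supplied each class.
                System.out.println(name + " -> " + location);
            }
        }
    }

If I run this with the same classpath that spark-submit ends up using, I assume mismatched locations (e.g. hadoop-common from one version and hadoop-hdfs from another) would point to the offending jar. Does that sound like the right way to narrow it down?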