[ https://issues.apache.org/jira/browse/HDT-24?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13805000#comment-13805000 ]

Rahul Sharma commented on HDT-24:
---------------------------------

I haven't thought along these lines; for now, connecting to multiple versions 
simultaneously looks like a nice-to-have! I think it is quite important that we 
first sort out the connection to one particular version of the cluster, IMO 
that will put a good capability in the project. Also I am not sure how we could 
support connecting to different versions of Hadoop: somehow embed the different 
Hadoop versions (awesome !), or ship a different connector for each version 
(sketched below)?

+1 to the idea of having a few sessions and connecting to faces!
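
For what it's worth, a rough sketch of what the connector-per-version idea could 
look like. All names here are hypothetical; nothing like this exists in HDT yet:

    // Hypothetical sketch only -- illustrates the "one connector per Hadoop
    // version" option mentioned above, not an existing HDT API.
    import java.io.Closeable;
    import java.io.IOException;
    import java.net.URI;
    import java.util.List;

    interface HadoopVersionConnector {
        /** Version line this connector handles, e.g. "1.0" or "2.0" (CDH4). */
        String supportedVersion();

        /** Opens a DFS session against the given namenode URI. */
        DfsSession connect(URI namenodeUri) throws IOException;
    }

    interface DfsSession extends Closeable {
        List<String> listDirectory(String path) throws IOException;
    }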

> No connection to testcluster
> ----------------------------
>
>                 Key: HDT-24
>                 URL: https://issues.apache.org/jira/browse/HDT-24
>             Project: Hadoop Development Tools
>          Issue Type: Bug
>          Components: HDFS
>         Environment: Cluster with CDH4.0 and CDH4.2 installation
>            Reporter: Mirko Kaempf
>            Priority: Blocker
>
> It is not possible to connect to the testcluster.
> !MESSAGE An internal error occurred during: "Connecting to DFS MyResearchCluster".
> !STACK 0
> java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
>       at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:37)
>       at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:34)
>       at org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
>       at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:216)
>       at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:184)
>       at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:236)
>       at org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:79)
>       at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:209)
>       at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:184)
>       at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:236)
>       at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:466)
>       at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:452)
>       at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1494)
>       at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1395)
>       at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>       at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>       at org.apache.hdt.core.cluster.HadoopCluster.getDFS(HadoopCluster.java:474)
>       at org.apache.hdt.dfs.core.DFSPath.getDFS(DFSPath.java:146)
>       at org.apache.hdt.dfs.core.DFSFolder.loadDFSFolderChildren(DFSFolder.java:61)
>       at org.apache.hdt.dfs.core.DFSFolder$1.run(DFSFolder.java:178)
>       at org.eclipse.core.internal.jobs.Worker.run(Worker.java:53)
> Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
>       at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:501)
>       at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:421)
>       at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:412)
>       at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.loadClass(DefaultClassLoader.java:107)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>       ... 21 more
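
The root cause in the trace is the ClassNotFoundException at the bottom: the 
Eclipse bundle loader cannot find org.apache.commons.configuration.Configuration, 
which Hadoop's metrics system needs as soon as FileSystem.get() initializes 
UserGroupInformation, so the commons-configuration jar appears to be missing from 
the plugin's classpath. A minimal sketch of that code path, not the actual HDT 
code (the namenode URI and class name below are illustrative):

    // Minimal sketch of the call that fails above; running it with Hadoop
    // client jars of that vintage but without commons-configuration on the
    // classpath should hit the same NoClassDefFoundError.
    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    public class DfsConnectSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // FileSystem.get() -> UserGroupInformation -> DefaultMetricsSystem,
            // which is where org.apache.commons.configuration is first loaded.
            FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:8020/"), conf);
            System.out.println(fs.getUri());
        }
    }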



--
This message was sent by Atlassian JIRA
(v6.1#6144)
