[
https://issues.apache.org/jira/browse/HADOOP-1437?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12520034
]
Doug Cutting commented on HADOOP-1437:
--------------------------------------
When this patch is committed, it would be best to also commit the fully built
plugin, so that when releases are built (using ant, perhaps without Eclipse
installed) the plugin is included. This way, committers who commit patches to
the plugin must have Eclipse installed, but committers who build releases need
not.
If ECLIPSE_HOME is set correctly, can ant build this without launching
Eclipse? That would be very convenient; then we could add a target in the
top-level build.xml to build the plugin.
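If that is possible, the target might look roughly like the sketch below. This
is only a guess at what could work: the target name, the properties
${build.dir} and ${hadoop.jar}, and the assumption that plugin.xml, META-INF,
and a resources directory sit directly under MapReduceTools would all need to
be checked against the actual patch.

  <property environment="env"/>

  <target name="eclipse-plugin"
          description="Build the MapReduce Eclipse plugin against ECLIPSE_HOME">
    <fail unless="env.ECLIPSE_HOME"
          message="ECLIPSE_HOME must be set to build the Eclipse plugin"/>

    <!-- Compile the plugin sources against the jars of the local Eclipse install. -->
    <mkdir dir="${build.dir}/eclipse-plugin/classes"/>
    <javac srcdir="MapReduceTools/src"
           destdir="${build.dir}/eclipse-plugin/classes"
           source="1.5" target="1.5" debug="on">
      <classpath>
        <fileset dir="${env.ECLIPSE_HOME}/plugins" includes="*.jar"/>
        <pathelement location="MapReduceTools/lib/jsch-0.1.33.jar"/>
        <pathelement location="${hadoop.jar}"/>
      </classpath>
    </javac>

    <!-- Package the classes plus the plugin metadata into a single plugin jar. -->
    <jar destfile="${build.dir}/eclipse-plugin/hadoop-eclipse-plugin.jar"
         manifest="MapReduceTools/META-INF/MANIFEST.MF">
      <fileset dir="${build.dir}/eclipse-plugin/classes"/>
      <fileset dir="MapReduceTools" includes="plugin.xml,resources/**"/>
    </jar>
  </target>

With something like that in place, the release build could simply depend on
the eclipse-plugin target whenever ECLIPSE_HOME is present, and skip it
otherwise.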
I am not an Eclipse user. I installed Eclipse and tried to build this without
success. I have Eclipse 3.2 installed on Ubuntu 7.04. I applied the patch and
added jsch-0.1.33.jar to MapReduceTools/lib. (This file should be attached to
this issue, with a copy of its license.) I then started Eclipse and imported
the MapReduceTools directory as an existing project. This caused the project
to be built, which resulted in 85 errors, such as:
Override cannot be resolved to a type
    MapReduceTools/src/org/apache/hadoop/eclipse/Activator.java, line 50

The method append(String) is undefined for the type BufferedWriter
    MapReduceTools/src/org/apache/hadoop/eclipse/servers/ServerRegistry.java, line 158
What am I doing wrong?
> Eclipse plugin for developing and executing MapReduce programs on Hadoop
> ------------------------------------------------------------------------
>
> Key: HADOOP-1437
> URL: https://issues.apache.org/jira/browse/HADOOP-1437
> Project: Hadoop
> Issue Type: New Feature
> Affects Versions: 0.13.1
> Environment: Eclipse 3.2.0+, Java 1.5.0+
> Reporter: Eugene Hung
> Attachments: eclipse-plugin-20070813d.patch,
> eclipse-plugin-20070813e.patch, eclipse-plugin-20070815a.patch,
> eclipse-plugin.patch, mrt-eclipse-1.0.4.zip
>
>
> An Eclipse plugin for developing and executing MapReduce programs on remote
> Hadoop servers. Automatically provides templates for creating Map/Reduce
> classes, transparently bundles the classes into JAR files and sends them to a
> remote server for execution. Allows the user to easily view status of Hadoop
> jobs and browse/upload/delete files from the Hadoop DFS within the Eclipse
> IDE.
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.