Hi guys,
Can I just install the HDFS project and debug it? (assuming I am running a
put command through the command line). If so, which project should I
download (hadoop project that has hdfs)?
--
Best Regards,
Karim Ahmed Awara
On Sun, Oct 6, 2013 at 5:40 PM, Ted Yu yuzhih...@gmail.com wrote:
Instead of Ted's approach, it's also useful to use the surefire plugin
when you debug tests.
mvn test -Dmaven.surefire.debug -Dtest=TestClassName
This command accepts a debugger attach on port 5005 by default, so you
can attach via Eclipse's debugger. Then the test runs and you can use the
debugger.
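For what it's worth, the surefire debug property can also carry an explicit JDWP agent string if you want a port other than 5005. A hedged sketch, assuming the standard surefire behavior (the test class name TestDFSShell is only an example):

```shell
# Attach on the default port 5005:
mvn test -Dmaven.surefire.debug -Dtest=TestDFSShell

# Or spell out the JDWP options to pick a different port (e.g. 8000).
# suspend=y makes the forked JVM wait until the debugger attaches:
mvn test -Dtest=TestDFSShell \
  -Dmaven.surefire.debug="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8000"
```

Then in Eclipse, create a Run > Debug Configurations > Remote Java Application entry pointing at localhost and the chosen port.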
The thing is, when I downloaded the source code and compiled it with
Maven, there were no configuration files to configure. So I assume Maven
has its own way of running the unit tests... or am I missing something?
--
Best Regards,
Karim Ahmed Awara
On Mon, Nov 4, 2013 at 7:53 AM, Tsuyoshi OZAWA wrote:
Also, my problem when importing the compiled Hadoop into Eclipse is that it
does not identify certain types, such as AvroRecord, which is used in the
hadoop-common project and is defined by the schema at
/hadoop-common-project/hadoop-common/src/test/avro/avroRecord.avsc.. so
somehow Eclipse does not have an
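AvroRecord is not a checked-in Java class: it is generated from that .avsc schema by the avro-maven-plugin during the build, so Eclipse only sees it once the generated sources exist and are on the build path. A hedged sketch, assuming a typical hadoop-common layout (the exact output directory may differ by version):

```shell
# Generate the Avro test sources that Eclipse is missing:
cd hadoop-common-project/hadoop-common
mvn generate-test-sources

# Then add the generated folder (e.g. target/generated-test-sources/java)
# to the Eclipse project's build path, or regenerate the project metadata:
mvn eclipse:eclipse
```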
Karim,
I am not an experienced Hadoop programmer, but what I found was that building
and debugging Hadoop under Eclipse was very difficult, and I was never able to
make it work correctly. I suggest using the well-documented command-line Maven
build, installing Hadoop from that build, and running
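For reference, the command-line build David mentions is documented in Hadoop's BUILDING.txt; a short sketch (exact profiles and flags may vary by release):

```shell
# Compile and install all modules into the local Maven repo, skipping tests:
mvn clean install -DskipTests

# Build a distributable tarball (requires protoc on the PATH):
mvn package -Pdist -DskipTests -Dtar
```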
Karim:
If you want to debug unit tests, using Eclipse is a viable approach.
Here is what I did this past week debugging a certain part of Hadoop
(JobSubmitter in particular) through an HBase unit test.
Run 'mvn install -DskipTests' to install hadoop locally
Open the class you want to debug and place