Oh, I see. Right, the injection framework was introduced early in 0.21 and has
never been backported to 0.20.
On 11/23/09 19:38 , Thanh Do wrote:
The reason I changed the *build.xml* is that *build.xml* in the
hadoop-common trunk release (0.20.1) does not contain the *injectfaults* target
(I want to use AspectJ in the Hadoop release that contains both hdfs and
mapred). I just added the following two targets.
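(The targets themselves were cut from the archived message. Purely as an illustration of what an *injectfaults*-style target tends to look like, here is a hypothetical sketch that weaves aspects into compiled classes with AspectJ's `iajc` Ant task; all property names and paths below are assumptions, not the actual targets from this email:)

```xml
<!-- Hypothetical sketch only: weave aspects into already-compiled classes.
     Property names (aspectj.home, build.classes, src.fi.dir, build.fi.classes)
     are illustrative placeholders. -->
<target name="injectfaults" depends="compile">
  <taskdef resource="org/aspectj/tools/ant/taskdefs/aspectjTaskdefs.properties"
           classpath="${aspectj.home}/lib/aspectjtools.jar"/>
  <iajc inpath="${build.classes}"
        sourceroots="${src.fi.dir}"
        destdir="${build.fi.classes}"
        classpathref="classpath"
        source="1.6"/>
</target>
```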
Generally, the idea was to provide everything needed for injection via what the
current build.xml has in Common and Hdfs. Would you mind sharing what
extra changes you needed, and why?
Cos
On 11/20/09 12:32 , Thanh Do wrote:
Thank you folks!
Finally, I am able (really) to run FI with HADOOP. I added some aspects into
the source code, changed the build.xml, and that's it.
AspectJ is awesome!
Have a nice weekend!
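(An aspect of the kind mentioned above might look roughly like the following code-style AspectJ sketch; the aspect name, pointcut target, and failure rate are illustrative guesses, not the actual aspects that were added:)

```aspectj
// Illustrative sketch: inject an IOException into executions of a
// hypothetical writeBlock(..) method somewhere under the HDFS packages.
public aspect InjectedWriteFault {

    // Match executions of any writeBlock(..) method in the hdfs packages.
    pointcut writeBlockExec() :
        execution(* org.apache.hadoop.hdfs..*.writeBlock(..));

    // Before the real method runs, fire the injected fault some of the time.
    before() throws java.io.IOException : writeBlockExec() {
        if (Math.random() < 0.1) {  // fire on roughly 10% of invocations
            throw new java.io.IOException("FI: injected writeBlock fault");
        }
    }
}
```

Weaving this in (e.g. via `iajc`) leaves the production code untouched, which is the main appeal of AspectJ-based fault injection.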
On Fri, Nov 20, 2009 at 1:08 PM, Konstantin Boudnik wrote:
Hi Thanh.
hmm, it sounds like you have some issue with compiling your code.
addDeprecation() was added to Configuration in 0.21, I believe. And it is
there no matter how you compile your code (with FI or without).
Cos
On 11/19/09 10:12 , Thanh Do wrote:
Sorry to dig this thread again!
I am expecting the release of 0.21 so that I don't have to manually play
around with AspectJ FI any more.
I still have a problem running HDFS with instrumented code (with aspects).
Here is what I did:
In the root directory of HDFS:
$ ant injectfaults
$ ant ja
Thanks for your very useful advice. I am able to play with fault injection
now.
On Thu, Oct 8, 2009 at 11:41 AM, Konstantin Boudnik wrote:
Thanks for looking into fault injection - it's very interesting and useful
technique based on AspectJ.
Currently, it is fully integrated into HDFS only. There's a JIRA (HADOOP-6204)
which tracks the same effort for Common and then all Hadoop's components will
have injection (as well as fault i
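(The framework described above gates each injected fault on a configured probability, so runs can range from fault-free to fault-heavy. A minimal self-contained sketch of that idea follows; the class and method names are hypothetical, not Hadoop's actual fi API:)

```java
import java.util.Random;

// Minimal sketch of a probability-gated fault trigger (hypothetical names,
// loosely modeled on the idea behind probability-driven fault injection).
public class FaultProbability {
    private final double probability; // chance in [0.0, 1.0] that a fault fires
    private final Random random;

    public FaultProbability(double probability, long seed) {
        this.probability = probability;
        this.random = new Random(seed);
    }

    /** Returns true when the configured fault should be injected. */
    public boolean shouldInject() {
        // nextDouble() is in [0.0, 1.0), so 0.0 never fires and 1.0 always does.
        return random.nextDouble() < probability;
    }

    public static void main(String[] args) {
        FaultProbability never  = new FaultProbability(0.0, 42L);
        FaultProbability always = new FaultProbability(1.0, 42L);
        System.out.println(never.shouldInject());   // false
        System.out.println(always.shouldInject());  // true
    }
}
```

Advice like the aspects discussed in this thread would consult such a gate before throwing, so the same woven build can behave normally when the probability is set to zero.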
Thank you so much, Jakob.
Could you please explain the fault injection running procedure in detail?
My goal is to run HDFS in a cluster (with a namenode and several datanodes)
and see how fault injection techniques affect HDFS's behavior. Also, I would
like to define some new aspects/faults to t
Thanh-
If you would like to execute the tests that have been instrumented
to use the fault injection framework, the ant target is
run-test-hdfs-fault-inject. These were used extensively in the recent
append work, and there are quite a few append-related tests. Was there
something more spe
Hi everyone,
Could anybody show me how to run the fault injection framework mentioned in
the following links?
http://issues.apache.org/jira/browse/HDFS-435
and
https://issues.apache.org/jira/browse/HDFS-436
Thanks,
Thanh