RE: [DISCUSS] KIP-25 System test improvements

2015-06-11 Thread Aditya Auradkar
Hi Geoffrey,

Thanks for the writeup. A couple of questions:
- Is it possible to configure suites using ducktape? For example, assume all
the tests in system_tests have been migrated to ducktape. Can I run a subset of
all tests grouped by functional area, e.g. replication, broker failure, etc.?

- Ducktape allows us to run tests on a vagrant cluster or on a static cluster 
configured via JSON. Once ported to ducktape, can we very easily run the 
existing system tests in both flavors?

Thanks,
Aditya


Re: [DISCUSS] KIP-25 System test improvements

2015-06-11 Thread Geoffrey Anderson
Hi Aditya,

(1) There are currently a few different ways to target a specific test or
subset of tests.

If for example tests were organized like the current system tests, where
suites are grouped by directory, you could run

cd system_test_dir
ducktape replication_testsuite/

You can also target tests in a particular file (ducktape path_to_file),
all tests in a test class (ducktape path_to_file::test_class), or a
particular test method in a class (ducktape
path_to_file::test_class.test_method).

(2) ducktape runs on a vagrant cluster by parsing the information returned
by the vagrant ssh-config command into JSON configuration used by the
JsonCluster class, so in that sense we are already using the JSON flavor.
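
To make that mechanism concrete, here is a rough sketch of turning the output
of vagrant ssh-config into the kind of JSON node list a JsonCluster-style
class could load. The key names ("hostname", "user", "identity_file") and the
exact shape ducktape expects are assumptions for illustration, not the real
format:

    import json
    import subprocess

    def vagrant_ssh_config_to_nodes():
        # Parse `vagrant ssh-config` output into a list of node dicts.
        # Sketch only: the keys below are illustrative assumptions, not
        # necessarily what ducktape's JsonCluster actually consumes.
        output = subprocess.check_output(["vagrant", "ssh-config"]).decode("utf-8")
        nodes, current = [], {}
        for line in output.splitlines():
            line = line.strip()
            if line.startswith("Host "):
                if current:
                    nodes.append(current)
                current = {"hostname": line.split()[1]}
            elif line.startswith("HostName "):
                current["ssh_hostname"] = line.split()[1]
            elif line.startswith("User "):
                current["user"] = line.split()[1]
            elif line.startswith("IdentityFile "):
                current["identity_file"] = line.split()[1]
        if current:
            nodes.append(current)
        return nodes

    if __name__ == "__main__":
        # Emit something a static-cluster config file could contain.
        print(json.dumps({"nodes": vagrant_ssh_config_to_nodes()}, indent=2))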

I see a few potential challenges, but nothing too bad.
- There may be some work involved in getting ssh configs right
- A couple assumptions about where files are deployed on remote machines
are baked into some of the Service classes, so some minor refactoring may
be needed to make this a little more general. This would be a good thing.

In any case, we're happy to help out/take pull requests on ducktape etc.

Best,
Geoff

Re: [DISCUSS] KIP-25 System test improvements

2015-06-08 Thread Geoffrey Anderson
Hi Gwen,

I don't see any problem with this as long as we're convinced there's a good
use case, which seems to be true.

Cheers,
Geoff

Re: [DISCUSS] KIP-25 System test improvements

2015-06-08 Thread Geoffrey Anderson
Hi KIP-25 thread,

I consolidated some of the questions from this thread and elsewhere.

Q: Can we see a map of what system-test currently tests, which ones we want
to replace, and JIRAs for the replacement work?
A: Initial draft here:
https://cwiki.apache.org/confluence/display/KAFKA/Roadmap+-+port+existing+system+tests

Q: Will ducktape be maintained separately as a github repo?
A: Yes https://github.com/confluentinc/ducktape

Q: How easy is it to view the test results and logs, and how will test output
be structured?
A: Hierarchical structure as outlined here:
https://github.com/confluentinc/ducktape/wiki/Design-overview#output

Q: Does it support code coverage? If not, how easy or difficult would it be
to add support?
A: It does not, and we have no immediate plans to support this. The
difficulty is unclear.

Q: Could each Kafka release also ship a separate tests artifact that users
can download, untar, and easily run against a Kafka cluster of the same
version?
A: This seems reasonable and not too much extra work. Definitely open to
discussion on this.

Q: Why not share running services across multiple tests?
A: Prefer to optimize for simplicity and correctness over what might be a
questionable improvement in run-time.

Q: Are regression tests in the road map?
A: Yes.

Q: Are Jepsen-style tests involving network failures in the road map?
A: Yes.

Thanks much,
Geoff



Re: [DISCUSS] KIP-25 System test improvements

2015-06-04 Thread Gwen Shapira
Not completely random places :)
People may use Cloudera / HWX distributions which include Kafka, but want
to verify that these bits match a specific upstream release.

I think having the tests separately will be useful for this. In this case,
finding the tests is not a big issue - we'll add a download link :)

Re: [DISCUSS] KIP-25 System test improvements

2015-06-04 Thread Jiangjie Qin
Hey Gwen,

Currently the tests and code are downloaded at the same time. Supposedly
the tests in the same repository should match the code.
Are you saying people downloaded a release from some random place and want
to verify it? If that is the case, does that mean people still need to
find the correct place to download the right test artifact?

Thanks,

Jiangjie (Becket) Qin



Re: [DISCUSS] KIP-25 System test improvements

2015-06-04 Thread Gwen Shapira
Hi,

Reviving the discussion a bit :)

I think it would be nice if each Kafka version that we release also had a
separate tests artifact that users can download, untar, and easily run
against a Kafka cluster of the same version.

The idea is that if someone downloads packages that claim to contain a
specific Kafka version (e.g. Kafka 0.8.2.0 + patches), they can easily
download the tests and verify that the package indeed passes the tests for
that version and therefore behaves the way that version is expected to
behave.

Does it make sense?

Gwen

Re: [DISCUSS] KIP-25 System test improvements

2015-05-21 Thread Gwen Shapira
Love this idea :)

I took a look at the Ducktape API and it looks like a good fit - a clean API,
extensible, easy to use, and powerful enough for our use case.

Something I'd like to see as part of the KIP is a map of what system-test
currently tests, which ones we want to replace, and a JIRA for the
replacement (possibly one for each group of tests).
Basically, I know we all want to use the new system for new test cases
(upgrades, etc), but I really want to make sure we don't get stuck with
both systems forever.

Gwen

Re: [DISCUSS] KIP-25 System test improvements

2015-05-21 Thread Geoffrey Anderson
Great, I'll work on putting together a more detailed map of this
replacement process.

Re: [DISCUSS] KIP-25 System test improvements

2015-05-21 Thread Ashish Singh
Geoffrey,

This looks great!

A few questions:
1. Will ducktape be maintained separately as a github repo?
2. How easy is it to view the test results and logs? The link in the KIP,
http://testing.confluent.io/confluent_platform/latest/, lists a bunch of
files and dirs. Could you add to the KIP how the results and logs for the
tests will be organized?
3. Does it support code coverage? If not, how easy or difficult would it be
to add?

-- 

Regards,
Ashish


Re: [DISCUSS] KIP-25 System test improvements

2015-05-21 Thread Ewen Cheslack-Postava
Ashish,

1. That was the plan. We put some effort into cleanly separating the
framework so it would be reusable across many projects.
2. I think you're seeing a test in progress where the final report hasn't
been created yet. If you visit one of the older ones you'll see it has a
landing page with links:
http://testing.confluent.io/confluent_platform/2015-05-20--001/ Apparently
we need to adjust when we update the 'latest' symlink. The logs that are
collected for tests are configurable, and service implementations include
sane defaults (so, e.g., you will always get the normal log file for Kafka,
but only get the data files if the test asks for them).
3. No code coverage support. Haven't looked into it, so I couldn't comment
on how hard it would be to add.
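
As a rough illustration of the configurable log collection in point 2, this
is one way a service might declare which logs it exposes and which are
collected by default. The class layout and the shape of the 'logs' attribute
are assumptions about ducktape's service API, for illustration only:

    from ducktape.services.service import Service

    class ExampleKafkaService(Service):
        # Illustration only: the attribute shape is an assumption, not the real API.
        # Each entry names a log, says where it lives on the remote node, and
        # whether it is gathered by default or only when a test asks for it.
        logs = {
            "kafka_log":  {"path": "/mnt/kafka.log",  "collect_default": True},
            "kafka_data": {"path": "/mnt/kafka-data", "collect_default": False},
        }

        def __init__(self, context, num_nodes=5):
            super(ExampleKafkaService, self).__init__(context, num_nodes)

A test that actually needs the data directories would then ask for them
explicitly; everything else gets just the broker log by default.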

-Ewen



Re: [DISCUSS] KIP-25 System test improvements

2015-05-21 Thread Geoffrey Anderson
Hi Ashish,

Looks like Ewen already hit the main points, but a few additions:

1. ducktape repo is here: https://github.com/confluentinc/ducktape
ducktape itself will be pip installable in the near future, and Kafka
system tests will be able to depend on a particular version of ducktape.

2.  The reporting is nothing fancy. We're definitely open to feedback, but
it consists of:
- top level summary of the test run (simple PASS/FAIL for each test)
- top level info and debug logs
- per-test info and debug logs
- per-test service logs gathered from each service used in the test. For
example, if your test pulls up a Kafka cluster with 5 brokers, the end
result will have the Kafka logs from each of those 5 machines.
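
To make the shape of such a test concrete, here is a minimal hedged sketch of
a ducktape-style test that brings up a five-node service. The service below
is a stand-in invented for the example (it is not the real Kafka service from
the system tests), and the base-class details are assumptions where not
documented:

    from ducktape.services.service import Service
    from ducktape.tests.test import Test

    class FakeBrokerService(Service):
        # Stand-in for a real Kafka broker service; illustration only.
        logs = {"broker_log": {"path": "/mnt/broker.log", "collect_default": True}}

        def start_node(self, node):
            # A real service would launch a broker; this sketch just creates
            # the log file so there is something for ducktape to collect.
            node.account.ssh("touch /mnt/broker.log")

        def stop_node(self, node):
            pass

        def clean_node(self, node):
            node.account.ssh("rm -f /mnt/broker.log")

    class BrokerSmokeTest(Test):
        def __init__(self, test_context):
            super(BrokerSmokeTest, self).__init__(test_context=test_context)
            self.brokers = FakeBrokerService(test_context, num_nodes=5)

        def test_startup(self):
            self.brokers.start()
            # When the test finishes (pass or fail), the runner gathers this
            # test's info/debug logs plus broker_log from all 5 machines.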

Cheers,
Geoff


[DISCUSS] KIP-25 System test improvements

2015-05-21 Thread Geoffrey Anderson
Hi,

Just kicking off the discussion thread on KIP-25

https://cwiki.apache.org/confluence/display/KAFKA/KIP+-+25+System+test+improvements

Thanks,
Geoff