Mathieu,

follow-up question:  Are you also doing, or considering, integration testing
by spawning a local Kafka cluster and then reading/writing to that cluster
(often called an embedded or in-memory cluster)?  This approach would sit
midway between ProcessorTopologyTestDriver (which does not spawn a Kafka
cluster) and your system-level testing (which I suppose is running against
a "real" test Kafka cluster).
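For reference, here is a rough sketch of what such an embedded-cluster test
can look like, modeled on the utilities Kafka's own integration tests use.
Note that EmbeddedKafkaCluster and IntegrationTestUtils live in Kafka's test
sources, not the public API, and their names/signatures vary between
versions; the buildTopology(), producerConfig(), and consumerConfig()
helpers below are illustrative placeholders standing in for your
application's real wiring:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
// Test-source utilities from Kafka's own integration tests -- not public API,
// and subject to change between versions.
import org.apache.kafka.streams.integration.utils.EmbeddedKafkaCluster;
import org.apache.kafka.streams.integration.utils.IntegrationTestUtils;

public class EmbeddedClusterSketchTest {

    public void shouldProcessRecordsEndToEnd() throws Exception {
        // Spin up an in-process broker (plus ZooKeeper) for the test's lifetime.
        EmbeddedKafkaCluster cluster = new EmbeddedKafkaCluster(1);
        cluster.start();
        cluster.createTopic("input-topic");
        cluster.createTopic("output-topic");

        Properties config = new Properties();
        config.put(StreamsConfig.APPLICATION_ID_CONFIG, "embedded-cluster-test");
        config.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, cluster.bootstrapServers());

        // buildTopology() is a placeholder for the application's real
        // TopologyBuilder composition under test.
        KafkaStreams streams = new KafkaStreams(buildTopology(), config);
        streams.start();
        try {
            // Produce synthetic input, then block until the expected number
            // of output records appears (or a timeout elapses).
            List<String> input = Arrays.asList("hello", "world");
            IntegrationTestUtils.produceValuesSynchronously(
                "input-topic", input, producerConfig(cluster));
            IntegrationTestUtils.waitUntilMinValuesRecordsReceived(
                consumerConfig(cluster), "output-topic", input.size());
        } finally {
            streams.close();
        }
    }
}
```

The trade-off is what you'd expect: slower and more setup than the test
driver (broker startup, real network I/O, timing-dependent waits), but it
exercises serialization, partitioning, and consumer-group behavior that the
driver cannot.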

-Michael

On Mon, Aug 15, 2016 at 3:44 PM, Mathieu Fenniak <
mathieu.fenn...@replicon.com> wrote:

> Hey all,
>
> At my workplace, we have a real focus on software automated testing.  I'd
> love to be able to test the composition of a TopologyBuilder with
> org.apache.kafka.test.ProcessorTopologyTestDriver
> <https://github.com/apache/kafka/blob/14934157df7aaf5e9c37a302ef9fd9317b95efa4/streams/src/test/java/org/apache/kafka/test/ProcessorTopologyTestDriver.java>;
> has there ever been any thought given to making this part of the public API
> of Kafka Streams?
>
> For some background, here are some details on the automated testing plan
> that I have in mind for a Kafka Streams application.  Our goal is to enable
> continuous deployment of any new development we do, so, it has to be
> rigorously tested with complete automation.
>
> As part of our pre-commit testing, we'd first have these gateways; no code
> would reach our master branch without passing these tests:
>
>    - At the finest level, unit tests covering individual pieces like a
>    Serde, ValueMapper, ValueJoiner, aggregate adder/subtractor, etc.  These
>    pieces are very isolated, very easy to unit test.
>    - At a higher level, I'd like to have component tests of the composition
>    of the TopologyBuilder; this is where ProcessorTopologyTestDriver would
>    be valuable.  There'd be far fewer of these tests than the lower-level
>    tests.  There are no external dependencies in these tests, so they'd be
>    very fast.
>
> Having passed that level of testing, we'd deploy the Kafka Streams
> application to an integration testing area where the rest of our
> application is kept up-to-date, and proceed with these integration tests:
>
>    - System-level tests where we synthesize inputs to the Kafka topics,
>    wait for the Streams app to process the data, and then inspect the
>    output that it pushes into other Kafka topics.  These tests will be
>    fewer in number than the above tests, but they serve to ensure that the
>    application is well-configured, executing, and handling inputs & outputs
>    as expected.
>    - UI-level tests where we verify behaviors expected from the system as
>    a whole.  As our application is a web app, we'd use Selenium to drive a
>    web browser, verifying the interactions and outputs expected from the
>    Streams application in our real-world use-cases.  These tests are even
>    fewer in number than the above.
>
> This is an adaptation of the automated testing scaffold that we currently
> use for microservices; I'd love any input on the plan as a whole.
>
> Thanks,
>
> Mathieu
>
