Re: How to spin up Kafka using docker and use for Spark Streaming Integration tests

2016-07-10 Thread Lars Albertsson
Let us assume that you want to build an integration test setup where you run all participating components in Docker. You create a docker-compose.yml with four Docker images, something like this:

    # Start docker-compose.yml
    version: '2'
    services:
      myapp:
        build: myapp_dir
        links:
          -
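[The snippet above is cut off mid-file in the archive. As a hypothetical completion in the same spirit, not the original author's file: the service names, images, ports, and the choice of Cassandra as a downstream sink are all assumptions. The original mentions four images; spotify/kafka bundles ZooKeeper, so a fourth service (for example a Spark container) would depend on the particular setup.]

```yaml
# Hypothetical sketch of a compose file along the lines described above.
version: '2'
services:
  myapp:
    build: myapp_dir        # the application under test
    links:
      - kafka
      - cassandra
  kafka:
    image: spotify/kafka    # bundles ZooKeeper and a single Kafka broker
    ports:
      - "2181:2181"
      - "9092:9092"
    environment:
      ADVERTISED_HOST: kafka
      ADVERTISED_PORT: "9092"
  cassandra:
    image: cassandra:3
    ports:
      - "9042:9042"
```

With this in place, `docker-compose up -d` starts the whole fixture and `docker-compose down` tears it down after the test run.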

Re: How to spin up Kafka using docker and use for Spark Streaming Integration tests

2016-07-06 Thread swetha kasireddy
Can this Docker image be used to spin up a Kafka cluster in a CI/CD pipeline like Jenkins to run the integration tests? Or can it be done only on a local machine that has Docker installed? I assume that the box where the CI/CD pipeline runs should have Docker installed, correct? On Mon, Jul 4,
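[Yes: the CI agent needs Docker installed, and a common pattern is to start the Kafka container in a setup step, then poll until the broker port accepts connections before running the tests. A minimal sketch of such a readiness check; the host and port values are assumptions to be matched to your container setup.]

```python
import socket
import time


def wait_for_port(host: str, port: int, timeout: float = 30.0) -> bool:
    """Poll until a TCP port accepts connections, or the timeout expires.

    Intended to run after `docker-compose up -d` (or `docker run`) in a CI
    step, so tests do not start before the broker is listening.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True  # something is accepting connections on host:port
        except OSError:
            time.sleep(0.5)  # not up yet; retry until the deadline
    return False


if __name__ == "__main__":
    # In CI you might gate the test suite on, e.g.:
    # assert wait_for_port("localhost", 9092), "Kafka broker never came up"
    pass
```

The same helper works locally and on a Jenkins agent; the only requirement in either place is a Docker daemon to start the container in the first place.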

Re: How to spin up Kafka using docker and use for Spark Streaming Integration tests

2016-07-06 Thread swetha kasireddy
The application's output is that it inserts data into Cassandra at the end of every batch. On Mon, Jul 4, 2016 at 5:20 AM, Lars Albertsson wrote: > I created such a setup for a client a few months ago. It is pretty > straightforward, but it can take some work to get all the wires
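[Since the observable output is rows written to Cassandra at the end of each batch, an integration test typically feeds input through Kafka and then polls the table until the expected rows appear or a deadline passes. A generic polling helper, sketched below; the predicate that actually queries Cassandra is an assumption, to be filled in with your driver's query call.]

```python
import time
from typing import Callable


def poll_until(predicate: Callable[[], bool],
               timeout: float = 30.0,
               interval: float = 0.5) -> bool:
    """Repeatedly evaluate `predicate` until it returns True or `timeout` expires.

    Useful for asserting on eventually-consistent sinks, such as a Cassandra
    table that is only written at the end of each streaming batch.
    """
    deadline = time.monotonic() + timeout
    while True:
        if predicate():
            return True
        if time.monotonic() >= deadline:
            return False
        time.sleep(interval)
```

In a real test the predicate would run something like a `SELECT count(*)` through the Cassandra driver and compare the result with the expected row count for the batch that was fed in.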

Re: How to spin up Kafka using docker and use for Spark Streaming Integration tests

2016-07-04 Thread Lars Albertsson
I created such a setup for a client a few months ago. It is pretty straightforward, but it can take some work to get all the wires connected. I suggest that you start with the spotify/kafka (https://github.com/spotify/docker-kafka) Docker image, since it includes a bundled zookeeper. The
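[As a concrete starting point, the spotify/kafka image can be launched with a single `docker run`. Per that image's README, the ADVERTISED_HOST and ADVERTISED_PORT environment variables control what the broker advertises to clients; `localhost` here is an assumption that suits tests running on the same machine as the container. This fragment requires a running Docker daemon.]

```
docker run -d --name test-kafka \
    -p 2181:2181 -p 9092:9092 \
    --env ADVERTISED_HOST=localhost \
    --env ADVERTISED_PORT=9092 \
    spotify/kafka
```

After the test run, `docker rm -f test-kafka` tears the fixture down.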

Re: How to spin up Kafka using docker and use for Spark Streaming Integration tests

2016-07-01 Thread Akhil Das
You can use this https://github.com/wurstmeister/kafka-docker to spin up a Kafka cluster and then point your Spark Streaming job at it to consume from it. On Fri, Jul 1, 2016 at 1:19 AM, SRK wrote: > Hi, > > I need to do integration tests using Spark Streaming. My idea is

How to spin up Kafka using docker and use for Spark Streaming Integration tests

2016-06-30 Thread SRK
Hi, I need to do integration tests using Spark Streaming. My idea is to spin up Kafka using Docker locally and use it to feed the stream to my Streaming Job. Any suggestions on how to do this would be of great help. Thanks, Swetha