Re: Kafka-Spark Integration - build failing with sbt

2017-06-17 Thread karan alang
Hey Jozef, thanks for the quick response. Yes, you are right, the spark-sql dependency was missing; I added it and it worked fine. Regards, Karan Alang
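
For reference, the missing dependency is a single line in build.sbt; the version below is an assumption based on the Spark 2.1.0 build mentioned later in this thread:

```scala
// Pulls in Spark SQL (Datasets/DataFrames), which Spark streaming/Kafka code often needs at compile time
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.0"
```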

Re: Kafka-Spark Integration - build failing with sbt

2017-06-17 Thread Jozef.koval
Hey Karan, I believe you are missing the spark-sql dependency. Jozef. Sent from [ProtonMail](https://protonmail.ch), encrypted email based in Switzerland.

Scala type mismatch after upgrade to 0.10.2.1

2017-06-17 Thread Björn Häuser
Hi! I am maintaining an application which is written in Scala and uses the kafka-streams library. As the subject says, after trying to upgrade from 0.10.1.1 to 0.10.2.1, I am getting the following compilation error: [error] found : service.streams.transformers.FilterMainCoverSupplier [erro
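
The preview cuts the error off, but this is a common symptom of the 0.10.2 generics change (the transform/process supplier parameters gained `? super` wildcards, which Scala does not always infer cleanly). Below is a minimal sketch of one usual workaround, pinning the output types explicitly; the supplier name is taken from the error message, and the key/value types are assumptions:

```scala
import org.apache.kafka.streams.KeyValue
import org.apache.kafka.streams.kstream.{KStream, Transformer, TransformerSupplier}
import org.apache.kafka.streams.processor.ProcessorContext

// Hypothetical supplier modelled on the class in the error; the real key/value types may differ.
class FilterMainCoverSupplier
    extends TransformerSupplier[String, String, KeyValue[String, String]] {

  override def get(): Transformer[String, String, KeyValue[String, String]] =
    new Transformer[String, String, KeyValue[String, String]] {
      override def init(context: ProcessorContext): Unit = ()

      // Pass the record through unchanged; returning null drops it.
      override def transform(key: String, value: String): KeyValue[String, String] =
        new KeyValue(key, value)

      override def punctuate(timestamp: Long): KeyValue[String, String] = null

      override def close(): Unit = ()
    }
}

object Wiring {
  // Spelling out the output type parameters often resolves the
  // "found ... / required ..." mismatch that appears after the upgrade.
  def apply(stream: KStream[String, String]): KStream[String, String] =
    stream.transform[String, String](new FilterMainCoverSupplier)
}
```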

Re: Kafka-Spark Integration - build failing with sbt

2017-06-17 Thread karan alang
Thanks, I was able to get this working. Here is what I added to the build.sbt file -- scalaVersion := "2.11.7" val sparkVers = "2.1.0" // Base Spark-provided dependencies libraryDependencies ++= Seq( "org
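
The archive truncates the snippet; here is a sketch of what the completed build.sbt might look like for this setup (the exact dependency list and the "provided" scopes are assumptions):

```scala
scalaVersion := "2.11.7"

val sparkVers = "2.1.0"

// Base Spark-provided dependencies
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % sparkVers % "provided",
  "org.apache.spark" %% "spark-streaming" % sparkVers % "provided",
  "org.apache.spark" %% "spark-sql"       % sparkVers % "provided",
  // Kafka 0.10+ integration for Spark 2.x (bundled with the application jar)
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVers
)
```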

Re: [DISCUSS] KIP-163: Lower the Minimum Required ACL Permission of OffsetFetch

2017-06-17 Thread Viktor Somogyi
Got it, thanks Hans!

Re: [DISCUSS] KIP-163: Lower the Minimum Required ACL Permission of OffsetFetch

2017-06-17 Thread Hans Jespersen
Offset commit is something that is done in the act of consuming (or reading) Kafka messages. Yes, technically it is a write to the Kafka consumer offsets topic, but it's much easier for administrators to think of ACLs in terms of whether the user is allowed to write (Produce) or read (Consume) mes
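
As a concrete illustration of that Read-for-consumers convention, granting a consumer everything it needs (topic Read/Describe plus group Read, which covers offset commits and fetches) looks roughly like the following with the 0.10/0.11-era ACL CLI; the principal, topic, and group names are made up:

```
bin/kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 \
  --add --allow-principal User:alice \
  --consumer --topic page-views --group analytics-group
```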

Re: [DISCUSS] KIP-163: Lower the Minimum Required ACL Permission of OffsetFetch

2017-06-17 Thread Viktor Somogyi
Hi Vahid, +1 for OffsetFetch from me too. I also wanted to ask about the strangeness of the permissions: why is OffsetCommit a Read operation instead of Write, which would intuitively make more sense to me? Perhaps an expert could shed some light on this? :) Viktor

Re: Single Key Aggregation

2017-06-17 Thread Sameer Kumar
Continued from my last mail... The code snippet that I shared was after joining impression and notification logs. Here I am picking the line item and concatenating it with the date. You can also see there is a check for a TARGETED_LINE_ITEM; I am not emitting the data otherwise. -Sameer.
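
The snippet itself is not in the archive, so here is a rough Scala reconstruction against the 0.10.x Streams Java API; the joined value type, its fields, and the TARGETED_LINE_ITEM constant are all invented for illustration:

```scala
import org.apache.kafka.streams.kstream.{KStream, KeyValueMapper, Predicate}

// Hypothetical shape of an impression record after it has been joined with its notification.
case class ImpressionNotification(lineItem: String, date: String, clicks: Long)

object SingleKeyAggregation {
  val TargetedLineItem = "LI-42" // stand-in for TARGETED_LINE_ITEM

  def keyByLineItemAndDate(
      joined: KStream[String, ImpressionNotification]
  ): KStream[String, ImpressionNotification] =
    joined
      // Only the targeted line item is emitted; everything else is dropped.
      .filter(new Predicate[String, ImpressionNotification] {
        override def test(key: String, value: ImpressionNotification): Boolean =
          value.lineItem == TargetedLineItem
      })
      // Re-key as "<lineItem>_<date>"; with one targeted line item this yields a single
      // key per day, so all of its records are aggregated on a single partition.
      .selectKey[String](new KeyValueMapper[String, ImpressionNotification, String] {
        override def apply(key: String, value: ImpressionNotification): String =
          s"${value.lineItem}_${value.date}"
      })
}
```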

Re: Single Key Aggregation

2017-06-17 Thread Sameer Kumar
The example I gave was just for illustration. I have impression logs and notification logs. Notification logs are essentially tied to impressions served. An impression would serve multiple items. I was just trying to aggregate across a single line item, which means I am always generating a single k

Multiple consumers for the same topic/group in different threads of the same JVM

2017-06-17 Thread Cédric Chantepie
Hi, I am doing some benchmarks with multiple consumers for the same topic/group in different threads of the same JVM, and it seems that the throughput observed with a single consumer in the group is divided once there are two consumers in the same group. Since there is no memory or CPU issue, I am wondering whether there could b
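
For context, here is a minimal sketch of that setup (broker, topic, and group names are hypothetical); each member of the group is assigned only a share of the topic's partitions, so per-consumer throughput drops even when total throughput stays roughly constant:

```scala
import java.util.{Collections, Properties}
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.common.serialization.StringDeserializer

object SameGroupBenchmark {
  // One KafkaConsumer per thread: the client itself is not thread-safe.
  def startConsumer(id: Int): Thread = {
    val t = new Thread(new Runnable {
      override def run(): Unit = {
        val props = new Properties()
        props.put("bootstrap.servers", "localhost:9092")
        props.put("group.id", "bench-group") // same group => partitions are split between members
        props.put("key.deserializer", classOf[StringDeserializer].getName)
        props.put("value.deserializer", classOf[StringDeserializer].getName)

        val consumer = new KafkaConsumer[String, String](props)
        consumer.subscribe(Collections.singletonList("bench-topic"))
        try {
          for (_ <- 1 to 1000) {
            val records = consumer.poll(100L)
            if (records.count() > 0) println(s"consumer-$id fetched ${records.count()} records")
          }
        } finally consumer.close()
      }
    })
    t.start()
    t
  }

  def main(args: Array[String]): Unit =
    (1 to 2).map(startConsumer).foreach(_.join())
}
```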

Re: Kafka-Spark Integration - build failing with sbt

2017-06-17 Thread Jozef.koval
Hi Karan, spark-streaming-kafka is for old Spark (versions up to 1.6.3); spark-streaming-kafka-0-8 is for current Spark (2.0 and later). Jozef. N.b. there is also a version for Kafka 0.10+, see [this](https://spark.apache.org/docs/latest/streaming-kafka-integration.html).
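
In sbt coordinates the choice looks roughly like this (the versions shown are assumptions; pick the artifact matching both your Spark version and the Kafka consumer API you target):

```scala
// Spark 1.x (Kafka 0.8 consumer API)
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.6.3"

// Spark 2.x: choose one, depending on the Kafka consumer API
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-8"  % "2.1.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.1.0"
```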