> From: deanwamp...@gmail.com
> Date: Wed, 31 Aug 2016 10:53:49 -0500
> Subject: Re: handling generics in Kafka Scala
> To: users@kafka.apache.org
> 
> Okay, the type parameters with their upper bounds need to come after the
> method name, like this:
> 
> private def createNewConsumer[K <: java.util.ArrayList[Byte], V <:
> java.util.ArrayList[Byte]](): KafkaConsumer[K, V] = {...}
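> 
> In case it helps, here is a rough sketch of the whole method with that
> signature -- just an illustration, not tested: the body is copied from your
> snippet, the imports are my guess at what ConsumerGroupCommand.scala already
> pulls in, opts is the field of the surrounding class, and the asInstanceOf
> cast stays unchecked exactly as in your version:
> 
> import org.apache.kafka.clients.consumer.{ConsumerConfig, KafkaConsumer}
> import org.apache.kafka.common.serialization.StringDeserializer
> import org.apache.kafka.common.utils.Utils
> 
> private def createNewConsumer[K <: java.util.ArrayList[Byte],
>                               V <: java.util.ArrayList[Byte]](): KafkaConsumer[K, V] = {
>   val properties = new java.util.Properties()
>   // the consumer is configured with String deserializers, as in your code
>   val deserializer = (new StringDeserializer).getClass.getName
>   val brokerUrl = opts.options.valueOf(opts.bootstrapServerOpt)
>   properties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, brokerUrl)
>   properties.put(ConsumerConfig.GROUP_ID_CONFIG, opts.options.valueOf(opts.groupOpt))
>   properties.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false")
>   properties.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "30000")
>   properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, deserializer)
>   properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, deserializer)
>   if (opts.options.has(opts.commandConfigOpt))
>     properties.putAll(Utils.loadProps(opts.options.valueOf(opts.commandConfigOpt)))
>   // unchecked cast, kept from your original snippet
>   new KafkaConsumer(properties).asInstanceOf[KafkaConsumer[K, V]]
> }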
> 
> Dean Wampler, Ph.D.
> Author: Programming Scala, 2nd Edition
> <http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
> Lightbend <http://lightbend.com>
> @deanwampler <http://twitter.com/deanwampler>
> http://polyglotprogramming.com
> 
> On Wed, Aug 31, 2016 at 8:08 AM, Martin Gainty <mgaint...@gmail.com> wrote:
> 
> > supposedly gmail won't strip \n so we'll try again with the non-html mail
> > composer
> >
> >  noob with Scala, so I'm looking for an experienced answer
> >
> >  ConsumerGroupCommand.scala
> >
> >  //private def createNewConsumer(): KafkaConsumer[String, String] = {
> >  //private def createNewConsumer(): KafkaConsumer[K extends java.util.ArrayList[Byte],V extends java.util.ArrayList[Byte]] = {
> >
> >      private def createNewConsumer(): KafkaConsumer[K <: java.util.ArrayList[Byte],V <: java.util.ArrayList[Byte]] = {
> >
> >        val properties = new java.util.Properties()
> >
> >        val deserializer = (new StringDeserializer).getClass.getName
> >
> >        val brokerUrl = opts.options.valueOf(opts.bootstrapServerOpt)
> >
> >      properties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, brokerUrl)
> >
> >        properties.put(ConsumerConfig.GROUP_ID_CONFIG, opts.options.valueOf(opts.groupOpt))
> >
> >        properties.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false")
> >
> >        properties.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "30000")
> >
> >        properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, deserializer)
> >
> >        properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, deserializer)
> >
> >        if (opts.options.has(opts.commandConfigOpt))
> >          properties.putAll(Utils.loadProps(opts.options.valueOf(opts.commandConfigOpt)))
> >
> >        new KafkaConsumer(properties).asInstanceOf[KafkaConsumer[K,V]]
> >      }
> >
> > scala-compiler displays:
> >
> >  [ERROR] \kafka\kafka-trunk\core\src\main\scala\kafka\admin\ConsumerGroupCommand.scala:309:
> >  error: ']' expected but '<:' found.
> >
> >  [ERROR]     private def createNewConsumer(): KafkaConsumer[? <: java.util.ArrayList[Byte],? <: java.util.ArrayList[Byte]] = {
> >  [ERROR]                                                      ^
> >
> >  [ERROR] \kafka\kafka-trunk\core\src\main\scala\kafka\admin\ConsumerGroupCommand.scala:309:
> >  error: '=' expected but ',' found.
> >
> >  [ERROR]     private def createNewConsumer(): KafkaConsumer[? <: java.util.ArrayList[Byte],? <: java.util.ArrayList[Byte]] = {
> >  [ERROR]                                                                                   ^
> >
> >  [ERROR] \kafka\kafka-trunk\core\src\main\scala\kafka\admin\ConsumerGroupCommand.scala:322:
> >  error: illegal start of simple expression
> >
> >  I need two type parameters, each extending java.util.ArrayList<Byte>.
> >  In regular Java this would be:
> >
> >  public <K extends java.util.ArrayList<Byte>, V extends java.util.ArrayList<Byte>>
> >  KafkaConsumer<K, V> createNewConsumer() { ....}
> >
> >  This works in Java, but
> >  how do I set up a function or class declaration whose K, V type parameters
> >  extend java.util.ArrayList<Byte> in Scala?
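> >
> >  (For comparison, a rough Scala spelling of that Java declaration -- Java's
> >  "extends" bound becomes "<:", and the type parameters go after the method
> >  or class name rather than on the return type; this assumes KafkaConsumer is
> >  already imported as in ConsumerGroupCommand.scala, and ConsumerFactory is a
> >  made-up name used only for illustration.)
> >
> >  // method-level bounds
> >  def createNewConsumer[K <: java.util.ArrayList[Byte], V <: java.util.ArrayList[Byte]](): KafkaConsumer[K, V] = ???
> >
> >  // class-level bounds (hypothetical wrapper class, illustration only)
> >  class ConsumerFactory[K <: java.util.ArrayList[Byte], V <: java.util.ArrayList[Byte]] {
> >    def newConsumer(props: java.util.Properties): KafkaConsumer[K, V] =
> >      new KafkaConsumer[K, V](props)
> >  }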
> >
> >  Martin
> >
> > >>
> > >> From: mgai...@hotmail.com
> > >> To: mathieu.fenn...@replicon.com; users@kafka.apache.org
> > >> Subject: RE: handling generics in Kafka Scala
> > >> Date: Tue, 30 Aug 2016 23:00:29 -0400
> > >>
> > >>
> > >>
> > >>
> > >>> From: mathieu.fenn...@replicon.com
> > >>> Date: Wed, 17 Aug 2016 18:06:38 -0600
> > >>> Subject: Re: DLL Hell
> > >>> To: mgai...@hotmail.com
> > >>>
> > >>> Hi Martin,
> > >>>
> > >>> I'm sorry, this is way outside my Kafka knowledge.  I'm just a new
> > >>> Kafka user who wanted to help with your Windows questions because I
> > >>> had just faced the same hurdle. :-)  Wish I could help, but I wouldn't
> > >>> know where to start with this.
> > >>>
> > >>> Mathieu
> > >>>
> > >>>
> > >>> On Wed, Aug 17, 2016 at 6:00 PM, Martin Gainty <mgai...@hotmail.com>
> > >>> wrote:
> > >>> > Hi Matthieu
> > >>> > Many Thanks for attaching the binary
> > >>> >
> > >>> > running scala->java generator plugin I see:
> > >>> >
> > >>> > [ERROR] C:\Maven-plugin\kafka\kafka-trunk\core\src\main\scala\kafka\admin\AdminUtils.scala:639:
> > >>> > error: type PartitionMetadata is not a member of object org.apache.kafka.common.requests.MetadataResponse
> > >>> >
> > >>> > yet when I look at org.apache.kafka.common.requests.MetadataResponse.java
> > >>> > I see inner class
> > >>> >
> > >>> > public static class PartitionMetadata {
> > >>> >
> > >>> > inner static java classes are not visible to the converter for some
> > >>> > reason
> > >>> > the workaround seems to be to break inner static classes (e.g.
> > >>> > PartitionMetadata) out into standalone classes;
> > >>> > treating the inner class as standalone works
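> > >>> >
> > >>> > (For what it's worth, hand-written Scala can normally refer to a Java
> > >>> > static nested class as a member of the outer type; a minimal sketch,
> > >>> > assuming only that the nested class exists as shown above:
> > >>> >
> > >>> >   import org.apache.kafka.common.requests.MetadataResponse
> > >>> >
> > >>> >   // the nested type is written as MetadataResponse.PartitionMetadata
> > >>> >   def firstPartition(ms: java.util.List[MetadataResponse.PartitionMetadata])
> > >>> >       : Option[MetadataResponse.PartitionMetadata] =
> > >>> >     if (ms.isEmpty) None else Some(ms.get(0))
> > >>> >
> > >>> > so the problem looks specific to the scala->java generator plugin rather
> > >>> > than to scalac itself.)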
> > >>> >
> > >>> > Advice?
> > >>> > Martin
> > >>> > ______________________________________________
> > >>> >
> > >>> >
> > >>> >
> > >>> >
> > >>> > ________________________________
> > >>> > From: mathieu.fenn...@replicon.com
> > >>> > Date: Tue, 16 Aug 2016 08:04:52 -0600
> > >>> > Subject: Re: DLL Hell
> > >>> > To: mgai...@hotmail.com
> > >>> >
> > >>> >
> > >>> > Hey Martin,
> > >>> >
> > >>> > Attached is the native .dll that I was able to build for rocksdb.  If you
> > >>> > unzip this, and include the contained .dll into your rocksdbjni-4.8.0.jar at
> > >>> > the root, it should be possible to use Kafka Streams in Windows.  But this
> > >>> > is just a minimal debug build; wouldn't be appropriate for production use.
> > >>> > Might save you some time if you're just trying to get a dev environment
> > >>> > working though.
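> > >>> >
> > >>> > (A quick way to check the repacked jar is a minimal sketch like the one
> > >>> > below -- assuming only that the rocksdbjni jar is on the classpath;
> > >>> > RocksDB.loadLibrary() is the same call that fails in the stack trace
> > >>> > further down when the dll is missing.)
> > >>> >
> > >>> >   import org.rocksdb.RocksDB
> > >>> >
> > >>> >   object RocksCheck extends App {
> > >>> >     // extracts and loads the bundled native library; throws if the
> > >>> >     // platform-specific .dll/.so is not inside the jar
> > >>> >     RocksDB.loadLibrary()
> > >>> >     println("rocksdb native library loaded")
> > >>> >   }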
> > >>> >
> > >>> > Mathieu
> > >>> >
> > >>> >
> > >>> > On Tue, Aug 16, 2016 at 7:40 AM, Martin Gainty <mgai...@hotmail.com>
> > >>> > wrote:
> > >>> >
> > >>> >
> > >>> >
> > >>> >
> > >>> >> From: mathieu.fenn...@replicon.com
> > >>> >> Date: Tue, 16 Aug 2016 06:57:16 -0600
> > >>> >> Subject: Re: DLL Hell
> > >>> >> To: users@kafka.apache.org
> > >>> >>
> > >>> >> Hey Martin,
> > >>> >>
> > >>> >> I had to modify the -G argument to that command to include the visual
> > >>> >> studio year.  If you run "cmake /?", it will output all the available
> > >>> >> generators.  My cmake looked like:
> > >>> >>
> > >>> >>     cmake -G "Visual Studio 12 2013 Win64" -DJNI=1 ..
> > >>> >>
> > >>> >> I think this is probably a change in cmake since the rocksdb doc was
> > >>> >> written (
> > >>> >>
> > >>> >> https://cmake.org/cmake/help/v3.0/generator/Visual%20Studio%2012%202013.html
> > >>> >> ).
> > >>> >> MG>same "informative error"
> > >>> >>C:\cygwin64\bin\cmake -G "Visual Studio 12 2013 Win64" -DJNI=1
> > >>> > CMake Error: Could not create named generator Visual Studio 12 2013 Win64
> > >>> > Generators
> > >>> >   Unix Makefiles                  = Generates standard UNIX makefiles.
> > >>> >   Ninja                           = Generates build.ninja files.
> > >>> >   CodeBlocks - Ninja              = Generates CodeBlocks project files.
> > >>> >   CodeBlocks - Unix Makefiles     = Generates CodeBlocks project files.
> > >>> >   CodeLite - Ninja                = Generates CodeLite project files.
> > >>> >   CodeLite - Unix Makefiles       = Generates CodeLite project files.
> > >>> >   Eclipse CDT4 - Ninja            = Generates Eclipse CDT 4.0 project files.
> > >>> >   Eclipse CDT4 - Unix Makefiles   = Generates Eclipse CDT 4.0 project files.
> > >>> >   KDevelop3                       = Generates KDevelop 3 project files.
> > >>> >   KDevelop3 - Unix Makefiles      = Generates KDevelop 3 project files.
> > >>> >   Kate - Ninja                    = Generates Kate project files.
> > >>> >   Kate - Unix Makefiles           = Generates Kate project files.
> > >>> >   Sublime Text 2 - Ninja          = Generates Sublime Text 2 project files.
> > >>> >   Sublime Text 2 - Unix Makefiles = Generates Sublime Text 2 project files.
> > >>> > MG>I am thinking, if I want to automate this native build, I could more
> > >>> > easily create the binary through maven-nar-plugin?
> > >>> > MG>as I do not have any MS VS or DotNet installed..maybe I need to install
> > >>> > many gigs of MS-specific VS?
> > >>> > MG>Please advise
> > >>> >> Mathieu
> > >>> >>
> > >>> >>
> > >>> >> On Tue, Aug 16, 2016 at 5:03 AM, Martin Gainty <mgai...@hotmail.com
> > >
> > >>> >> wrote:
> > >>> >>
> > >>> >> > haven't used cmake in over 10 years so I'm a bit lost..
> > >>> >> > cmake -G "Visual Studio 12 Win64" -DGFLAGS=1 -DSNAPPY=1 -DJEMALLOC=1 -DJNI=1
> > >>> >> > CMake Error: Could not create named generator Visual Studio 12 Win64
> > >>> >> > Please advise?
> > >>> >> > Martin
> > >>> >> > ______________________________________________
> > >>> >> >
> > >>> >> >
> > >>> >> >
> > >>> >> > > From: mathieu.fenn...@replicon.com
> > >>> >> > > Date: Mon, 15 Aug 2016 13:43:47 -0600
> > >>> >> > > Subject: Re: DLL Hell
> > >>> >> > > To: users@kafka.apache.org
> > >>> >> > >
> > >>> >> > > Hi Martin,
> > >>> >> > >
> > >>> >> > > rocksdb does not currently distribute a Windows-compatible build of their
> > >>> >> > > rocksdbjni library.  I recently wrote up some instructions on how to
> > >>> >> > > produce a local build, which you can find here:
> > >>> >> > > http://mail-archives.apache.org/mod_mbox/kafka-users/201608.mbox/%3CCAHoiPjweo-xSj3TiodcDVf4wNnnJ8u6PcwWDPF7LT5ps%2BxQ3eA%40mail.gmail.com%3E
> > >>> >> > >
> > >>> >> > > I'd also suggest tracking this issue in GitHub, which is likely to be
> > >>> >> > > updated if this ever changes: https://github.com/facebook/rocksdb/issues/703
> > >>> >> > >
> > >>> >> > > Mathieu
> > >>> >> > >
> > >>> >> > >
> > >>> >> > > On Mon, Aug 15, 2016 at 1:34 PM, Martin Gainty
> > >>> >> > > <mgai...@hotmail.com>
> > >>> >> > wrote:
> > >>> >> > >
> > >>> >> > > > kafka-trunk\streams>gradle build
> > >>> >> > > > Caused by: java.lang.RuntimeException: librocksdbjni-win64.dll was not found inside JAR.
> > >>> >> > > >         at org.rocksdb.NativeLibraryLoader.loadLibraryFromJarToTemp(NativeLibraryLoader.java:106)
> > >>> >> > > >         at org.rocksdb.NativeLibraryLoader.loadLibraryFromJar(NativeLibraryLoader.java:78)
> > >>> >> > > >         at org.rocksdb.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:56)
> > >>> >> > > >         at org.rocksdb.RocksDB.loadLibrary(RocksDB.java:47)
> > >>> >> > > >         at org.rocksdb.RocksDB.<clinit>(RocksDB.java:23)
> > >>> >> > > > any idea where I can locate librocksdbjni-win64.dll ?
> > >>> >> > > > /thanks/
> > >>> >> > > > Martin
> > >>> >> > > > ______________________________________________
> > >>> >> > > >
> > >>> >> > > >
> > >>> >> >
> > >>> >> >
> > >>> >
> > >>> >
> > >>> >
> > >>
> > >
> >