Sorry, no clue about that - anyone else know?

On Mon, Oct 28, 2013 at 10:41 AM, Roger Hoover <roger.hoo...@gmail.com> wrote:
> Joel,
>
> Thank you!  This is very helpful.
>
> What I notice now is that it works for test classes that
> extend org.scalatest.junit.JUnit3Suite.  There are other tests in the
> codebase that use an @Test annotation, as in the example below.  Any idea
> how to run those?
>
> import org.junit.{Test, After, Before}
>
> class JsonTest {
>
>   @Test
>   def testJsonEncoding() {
> ...
>
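> One thing I was considering trying (just a sketch, since I'm not sure
> whether the sbt setup here would pick it up): the same ScalaTest package
> also has a JUnitSuite class, so extending it might let the existing
> ScalaTest-based discovery run the @Test methods, e.g.
>
> import org.junit.{Test, After, Before}
> import org.scalatest.junit.JUnitSuite
>
> // Extending JUnitSuite wraps the JUnit-annotated methods in a ScalaTest
> // suite, so sbt's ScalaTest discovery should treat this class like the
> // existing JUnit3Suite-based tests.
> class JsonTest extends JUnitSuite {
>
>   @Test
>   def testJsonEncoding() {
>     // ...
>   }
> }
>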
> Thanks!
>
> Roger
>
> On Fri, Oct 25, 2013 at 6:18 PM, Joel Koshy <jjkosh...@gmail.com> wrote:
>
>> In the sbt shell:
>>
>> > projects (to see the available projects)
>> > project core
>> > test-only <unit test>
>>
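>> For example, something like this (the project name and the fully-qualified
>> class name are just guesses; use whatever `projects` and your test sources
>> actually show):
>>
>> > project core
>> > test-only kafka.utils.JsonTest
>> > test-only kafka.utils.*          (test-only also accepts wildcards)
>>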
>> Although lately, if there is a test failure, the output isn't very
>> helpful in saying exactly where the test failed; my environment is
>> probably messed up, but I know of one or two others who are having
>> similar issues.
>>
>> Joel
>>
>>
>> On Fri, Oct 25, 2013 at 4:24 PM, Roger Hoover <roger.hoo...@gmail.com>
>> wrote:
>> > Hi,
>> >
>> > I'm new to Scala but working on a simple patch for a configuration change
>> > and want to run just my unit tests.  When I run ./sbt test-only, it
>> > executes all sorts of other tests but not the one I want.  Is there an
>> > easy way to run a single test?  Any help is appreciated.
>> >
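>> > One thing I'm not sure about is whether the whole thing needs to be
>> > quoted on the command line, so that the class name is passed as an
>> > argument to test-only instead of being read as a separate sbt command:
>> >
>> > $ ./sbt "test-only kafka.utils.JsonTest"
>> >
>> > In any case, here is what I see:
>> >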
>> > $ ./sbt test-only kafka.utils.JsonTest
>> > [info] Loading project definition from /Users/rhoover/Work/kafka/project
>> > [warn] Multiple resolvers having different access mechanism configured
>> > with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate
>> > project resolvers (`resolvers`) or rename publishing resolver
>> > (`publishTo`).
>> > [info] Set current project to Kafka (in build
>> > file:/Users/rhoover/Work/kafka/)
>> > [info] No tests to run for Kafka/test:test-only
>> > [info] No tests to run for contrib/test:test-only
>> > [info] No tests to run for java-examples/test:test-only
>> > [info] No tests to run for perf/test:test-only
>> > [info] No tests to run for hadoop-producer/test:test-only
>> > [info] No tests to run for hadoop-consumer/test:test-only
>> > [info] Test Starting: testFetcher(kafka.integration.FetcherTest)
>> > [info] Test Passed: testFetcher(kafka.integration.FetcherTest)
>> > [2013-10-25 16:19:55,067] ERROR Error in cleaner thread 0:
>> > (kafka.log.LogCleaner:103)
>> > java.lang.IllegalArgumentException: inconsistent range
>> >   at java.util.concurrent.ConcurrentSkipListMap$SubMap.<init>(ConcurrentSkipListMap.java:2506)
>> >   at java.util.concurrent.ConcurrentSkipListMap.subMap(ConcurrentSkipListMap.java:1984)
>> >   at kafka.log.Log.logSegments(Log.scala:604)
>> >   at kafka.log.LogToClean.<init>(LogCleaner.scala:596)
>> >   at kafka.log.LogCleaner$$anonfun$5.apply(LogCleaner.scala:137)
>> >   at kafka.log.LogCleaner$$anonfun$5.apply(LogCleaner.scala:137)
>> >   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
>> >   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
>> >   at scala.collection.LinearSeqOptimized$class.foreach(LinearSeqOptimized.scala:61)
>> >   at scala.collection.immutable.List.foreach(List.scala:45)
>> >   at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
>> >   at scala.collection.immutable.List.map(List.scala:45)
>> >   at kafka.log.LogCleaner.kafka$log$LogCleaner$$grabFilthiestLog(LogCleaner.scala:137)
>> >   at kafka.log.LogCleaner$CleanerThread.cleanOrSleep(LogCleaner.scala:203)
>> >   at kafka.log.LogCleaner$CleanerThread.run(LogCleaner.scala:189)
>> >
>> >
>> > #THIS DOESN'T WORK EITHER BUT EXECUTES OTHER TESTS
>> >
>> > $ ./sbt test-only unit.kafka.utils.JsonTest
>> > [info] Loading project definition from /Users/rhoover/Work/kafka/project
>> > [warn] Multiple resolvers having different access mechanism configured
>> > with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate
>> > project resolvers (`resolvers`) or rename publishing resolver
>> > (`publishTo`).
>> > [info] Set current project to Kafka (in build
>> > file:/Users/rhoover/Work/kafka/)
>> > [info] No tests to run for contrib/test:test-only
>> > [info] No tests to run for Kafka/test:test-only
>> > [info] No tests to run for java-examples/test:test-only
>> > [info] No tests to run for perf/test:test-only
>> > [info] No tests to run for hadoop-consumer/test:test-only
>> > [info] No tests to run for hadoop-producer/test:test-only
>> > [info] Test Starting: truncate
>> > [info] Test Passed: truncate
>> > [info] Test Starting: randomLookupTest
>> > [info] Test Passed: randomLookupTest
>> > [info] Test Starting: lookupExtremeCases
>> > [info] Test Passed: lookupExtremeCases
>> > [info] Test Starting: appendTooMany
>> > [info] Test Passed: appendTooMany
>> > [info] Test Starting: appendOutOfOrder
>> > [info] Test Passed: appendOutOfOrder
>> > [info] Test Starting: testReopen
>> > [info] Test Passed: testReopen
>> > [2013-10-25 16:21:44,004] ERROR [KafkaApi-0] Error when processing fetch
>> > request for partition [test2,0] offset -1 from consumer with correlation
>> > id 0 (kafka.server.KafkaApis:103)
>> > kafka.common.OffsetOutOfRangeException: Request for offset -1 but we only
>> > have log segments in the range 0 to 2.
>> >   at kafka.log.Log.read(Log.scala:380)
>> >   at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSet(KafkaApis.scala:401)
>> >   at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:347)
>> >   at kafka.server.KafkaApis$$anonfun$kafka$server$KafkaApis$$readMessageSets$1.apply(KafkaApis.scala:343)
>> >   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
>> >   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206)
>> >   at scala.collection.immutable.Map$Map4.foreach(Map.scala:180)
>> >   at scala.collection.TraversableLike$class.map(TraversableLike.scala:206)
>> >   at scala.collection.immutable.Map$Map4.map(Map.scala:157)
>> >   at kafka.server.KafkaApis.kafka$server$KafkaApis$$readMessageSets(KafkaApis.scala:343)
>> >   at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:309)
>> >   at kafka.server.KafkaApis.handle(KafkaApis.scala:71)
>> >   at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:42)
>> >   at java.lang.Thread.run(Thread.java:744)
>>
