Build failed in Jenkins: Kafka » kafka-trunk-jdk8 #559

2021-03-10 Thread Apache Jenkins Server
See 


Changes:

[github] KAFKA-12287: Add WARN logging on consumer-groups when reset-offsets by 
timestamp or duration can't find an offset and defaults to latest (#10092)


--
[...truncated 3.67 MB...]
KafkaZkClientTest > testUpdateBrokerInfo() STARTED

KafkaZkClientTest > testUpdateBrokerInfo() PASSED

KafkaZkClientTest > testCreateRecursive() STARTED

KafkaZkClientTest > testCreateRecursive() PASSED

KafkaZkClientTest > testGetConsumerOffsetNoData() STARTED

KafkaZkClientTest > testGetConsumerOffsetNoData() PASSED

KafkaZkClientTest > testDeleteTopicPathMethods() STARTED

KafkaZkClientTest > testDeleteTopicPathMethods() PASSED

KafkaZkClientTest > testSetTopicPartitionStatesRaw() STARTED

KafkaZkClientTest > testSetTopicPartitionStatesRaw() PASSED

KafkaZkClientTest > testAclManagementMethods() STARTED

KafkaZkClientTest > testAclManagementMethods() PASSED

KafkaZkClientTest > testPreferredReplicaElectionMethods() STARTED

KafkaZkClientTest > testPreferredReplicaElectionMethods() PASSED

KafkaZkClientTest > testPropagateLogDir() STARTED

KafkaZkClientTest > testPropagateLogDir() PASSED

KafkaZkClientTest > testGetDataAndStat() STARTED

KafkaZkClientTest > testGetDataAndStat() PASSED

KafkaZkClientTest > testReassignPartitionsInProgress() STARTED

KafkaZkClientTest > testReassignPartitionsInProgress() PASSED

KafkaZkClientTest > testCreateTopLevelPaths() STARTED

KafkaZkClientTest > testCreateTopLevelPaths() PASSED

KafkaZkClientTest > testGetAllTopicsInClusterDoesNotTriggerWatch() STARTED

KafkaZkClientTest > testGetAllTopicsInClusterDoesNotTriggerWatch() PASSED

KafkaZkClientTest > testIsrChangeNotificationGetters() STARTED

KafkaZkClientTest > testIsrChangeNotificationGetters() PASSED

KafkaZkClientTest > testLogDirEventNotificationsDeletion() STARTED

KafkaZkClientTest > testLogDirEventNotificationsDeletion() PASSED

KafkaZkClientTest > testGetLogConfigs() STARTED

KafkaZkClientTest > testGetLogConfigs() PASSED

KafkaZkClientTest > testBrokerSequenceIdMethods() STARTED

KafkaZkClientTest > testBrokerSequenceIdMethods() PASSED

KafkaZkClientTest > testAclMethods() STARTED

KafkaZkClientTest > testAclMethods() PASSED

KafkaZkClientTest > testCreateSequentialPersistentPath() STARTED

KafkaZkClientTest > testCreateSequentialPersistentPath() PASSED

KafkaZkClientTest > testConditionalUpdatePath() STARTED

KafkaZkClientTest > testConditionalUpdatePath() PASSED

KafkaZkClientTest > testGetAllTopicsInClusterTriggersWatch() STARTED

KafkaZkClientTest > testGetAllTopicsInClusterTriggersWatch() PASSED

KafkaZkClientTest > testDeleteTopicZNode() STARTED

KafkaZkClientTest > testDeleteTopicZNode() PASSED

KafkaZkClientTest > testDeletePath() STARTED

KafkaZkClientTest > testDeletePath() PASSED

KafkaZkClientTest > testGetBrokerMethods() STARTED

KafkaZkClientTest > testGetBrokerMethods() PASSED

KafkaZkClientTest > testCreateTokenChangeNotification() STARTED

KafkaZkClientTest > testCreateTokenChangeNotification() PASSED

KafkaZkClientTest > testGetTopicsAndPartitions() STARTED

KafkaZkClientTest > testGetTopicsAndPartitions() PASSED

KafkaZkClientTest > testRegisterBrokerInfo() STARTED

KafkaZkClientTest > testRegisterBrokerInfo() PASSED

KafkaZkClientTest > testRetryRegisterBrokerInfo() STARTED

KafkaZkClientTest > testRetryRegisterBrokerInfo() PASSED

KafkaZkClientTest > testConsumerOffsetPath() STARTED

KafkaZkClientTest > testConsumerOffsetPath() PASSED

KafkaZkClientTest > testDeleteRecursiveWithControllerEpochVersionCheck() STARTED

KafkaZkClientTest > testDeleteRecursiveWithControllerEpochVersionCheck() PASSED

KafkaZkClientTest > testTopicAssignments() STARTED

KafkaZkClientTest > testTopicAssignments() PASSED

KafkaZkClientTest > testControllerManagementMethods() STARTED

KafkaZkClientTest > testControllerManagementMethods() PASSED

KafkaZkClientTest > testTopicAssignmentMethods() STARTED

KafkaZkClientTest > testTopicAssignmentMethods() PASSED

KafkaZkClientTest > testConnectionViaNettyClient() STARTED

KafkaZkClientTest > testConnectionViaNettyClient() PASSED

KafkaZkClientTest > testPropagateIsrChanges() STARTED

KafkaZkClientTest > testPropagateIsrChanges() PASSED

KafkaZkClientTest > testControllerEpochMethods() STARTED

KafkaZkClientTest > testControllerEpochMethods() PASSED

KafkaZkClientTest > testDeleteRecursive() STARTED

KafkaZkClientTest > testDeleteRecursive() PASSED

KafkaZkClientTest > testGetTopicPartitionStates() STARTED

KafkaZkClientTest > testGetTopicPartitionStates() PASSED

KafkaZkClientTest > testCreateConfigChangeNotification() STARTED

KafkaZkClientTest > testCreateConfigChangeNotification() PASSED

KafkaZkClientTest > testDelegationTokenMethods() STARTED

KafkaZkClientTest > testDelegationTokenMethods() PASSED

LiteralAclStoreTest > shouldHaveCorrectPaths() STARTED

LiteralAclStoreTest > shouldHaveCorrectPaths() PASSED

LiteralAclStoreTest > sh

Jenkins build is back to normal : Kafka » kafka-trunk-jdk11 #588

2021-03-10 Thread Apache Jenkins Server
See 




Build failed in Jenkins: Kafka » kafka-trunk-jdk15 #615

2021-03-10 Thread Apache Jenkins Server
See 


Changes:

[github] KAFKA-12287: Add WARN logging on consumer-groups when reset-offsets by 
timestamp or duration can't find an offset and defaults to latest (#10092)


--
[...truncated 3.68 MB...]
AuthorizerIntegrationTest > 
shouldThrowTransactionalIdAuthorizationExceptionWhenNoTransactionAccessOnEndTransaction()
 STARTED

AuthorizerIntegrationTest > 
shouldThrowTransactionalIdAuthorizationExceptionWhenNoTransactionAccessOnEndTransaction()
 PASSED

AuthorizerIntegrationTest > 
shouldThrowTransactionalIdAuthorizationExceptionWhenNoTransactionAccessOnSendOffsetsToTxn()
 STARTED

AuthorizerIntegrationTest > 
shouldThrowTransactionalIdAuthorizationExceptionWhenNoTransactionAccessOnSendOffsetsToTxn()
 PASSED

AuthorizerIntegrationTest > testCommitWithNoGroupAccess() STARTED

AuthorizerIntegrationTest > testCommitWithNoGroupAccess() PASSED

AuthorizerIntegrationTest > 
testTransactionalProducerInitTransactionsNoDescribeTransactionalIdAcl() STARTED

AuthorizerIntegrationTest > 
testTransactionalProducerInitTransactionsNoDescribeTransactionalIdAcl() PASSED

AuthorizerIntegrationTest > testAuthorizeByResourceTypeDenyTakesPrecedence() 
STARTED

AuthorizerIntegrationTest > testAuthorizeByResourceTypeDenyTakesPrecedence() 
PASSED

AuthorizerIntegrationTest > testUnauthorizedDeleteRecordsWithDescribe() STARTED

AuthorizerIntegrationTest > testUnauthorizedDeleteRecordsWithDescribe() PASSED

AuthorizerIntegrationTest > testCreateTopicAuthorizationWithClusterCreate() 
STARTED

AuthorizerIntegrationTest > testCreateTopicAuthorizationWithClusterCreate() 
PASSED

AuthorizerIntegrationTest > testOffsetFetchWithTopicAndGroupRead() STARTED

AuthorizerIntegrationTest > testOffsetFetchWithTopicAndGroupRead() PASSED

AuthorizerIntegrationTest > testCommitWithTopicDescribe() STARTED

AuthorizerIntegrationTest > testCommitWithTopicDescribe() PASSED

AuthorizerIntegrationTest > testAuthorizationWithTopicExisting() STARTED

AuthorizerIntegrationTest > testAuthorizationWithTopicExisting() PASSED

AuthorizerIntegrationTest > testUnauthorizedDeleteRecordsWithoutDescribe() 
STARTED

AuthorizerIntegrationTest > testUnauthorizedDeleteRecordsWithoutDescribe() 
PASSED

AuthorizerIntegrationTest > testMetadataWithTopicDescribe() STARTED

AuthorizerIntegrationTest > testMetadataWithTopicDescribe() PASSED

AuthorizerIntegrationTest > testProduceWithTopicDescribe() STARTED

AuthorizerIntegrationTest > testProduceWithTopicDescribe() PASSED

AuthorizerIntegrationTest > testDescribeGroupApiWithNoGroupAcl() STARTED

AuthorizerIntegrationTest > testDescribeGroupApiWithNoGroupAcl() PASSED

AuthorizerIntegrationTest > testPatternSubscriptionMatchingInternalTopic() 
STARTED

AuthorizerIntegrationTest > testPatternSubscriptionMatchingInternalTopic() 
PASSED

AuthorizerIntegrationTest > testSendOffsetsWithNoConsumerGroupDescribeAccess() 
STARTED

AuthorizerIntegrationTest > testSendOffsetsWithNoConsumerGroupDescribeAccess() 
PASSED

AuthorizerIntegrationTest > testListTransactionsAuthorization() STARTED

AuthorizerIntegrationTest > testListTransactionsAuthorization() PASSED

AuthorizerIntegrationTest > testOffsetFetchTopicDescribe() STARTED

AuthorizerIntegrationTest > testOffsetFetchTopicDescribe() PASSED

AuthorizerIntegrationTest > testCommitWithTopicAndGroupRead() STARTED

AuthorizerIntegrationTest > testCommitWithTopicAndGroupRead() PASSED

AuthorizerIntegrationTest > 
testIdempotentProducerNoIdempotentWriteAclInInitProducerId() STARTED

AuthorizerIntegrationTest > 
testIdempotentProducerNoIdempotentWriteAclInInitProducerId() PASSED

AuthorizerIntegrationTest > testSimpleConsumeWithExplicitSeekAndNoGroupAccess() 
STARTED

AuthorizerIntegrationTest > testSimpleConsumeWithExplicitSeekAndNoGroupAccess() 
PASSED

SslProducerSendTest > testSendNonCompressedMessageWithCreateTime() STARTED

SslProducerSendTest > testSendNonCompressedMessageWithCreateTime() PASSED

SslProducerSendTest > testClose() STARTED

SslProducerSendTest > testClose() PASSED

SslProducerSendTest > testFlush() STARTED

SslProducerSendTest > testFlush() PASSED

SslProducerSendTest > testSendToPartition() STARTED

SslProducerSendTest > testSendToPartition() PASSED

SslProducerSendTest > testSendOffset() STARTED

SslProducerSendTest > testSendOffset() PASSED

SslProducerSendTest > testSendCompressedMessageWithCreateTime() STARTED

SslProducerSendTest > testSendCompressedMessageWithCreateTime() PASSED

SslProducerSendTest > testCloseWithZeroTimeoutFromCallerThread() STARTED

SslProducerSendTest > testCloseWithZeroTimeoutFromCallerThread() PASSED

SslProducerSendTest > testCloseWithZeroTimeoutFromSenderThread() STARTED

SslProducerSendTest > testCloseWithZeroTimeoutFromSenderThread() PASSED

SslProducerSendTest > testSendBeforeAndAfterPartitionExpansion() STARTED

SslProducerSendTest > testSendBeforeAndAfterPartitionExpansion() PASSED

ProducerCompres

Build failed in Jenkins: Kafka » kafka-trunk-jdk8 #558

2021-03-10 Thread Apache Jenkins Server
See 


Changes:

[github] KAFKA-12441: remove deprecated method StreamsBuilder#addGlobalStore 
(#10284)


--
[...truncated 7.32 MB...]
GssapiAuthenticationTest > testReLogin() STARTED

GssapiAuthenticationTest > testReLogin() PASSED

PermissiveControllerMutationQuotaTest > testControllerMutationQuotaViolation() 
STARTED

PermissiveControllerMutationQuotaTest > testControllerMutationQuotaViolation() 
PASSED

AssignmentStateTest > [1] isr=List(101, 102, 103), replicas=List(101, 102, 
103), adding=List(), removing=List(), original=List(), isUnderReplicated=false 
STARTED

AssignmentStateTest > [1] isr=List(101, 102, 103), replicas=List(101, 102, 
103), adding=List(), removing=List(), original=List(), isUnderReplicated=false 
PASSED

AssignmentStateTest > [2] isr=List(101, 102), replicas=List(101, 102, 103), 
adding=List(), removing=List(), original=List(), isUnderReplicated=true STARTED

AssignmentStateTest > [2] isr=List(101, 102), replicas=List(101, 102, 103), 
adding=List(), removing=List(), original=List(), isUnderReplicated=true PASSED

AssignmentStateTest > [3] isr=List(101, 102, 103), replicas=List(101, 102, 
103), adding=List(104, 105), removing=List(102), original=List(101, 102, 103), 
isUnderReplicated=false STARTED

AssignmentStateTest > [3] isr=List(101, 102, 103), replicas=List(101, 102, 
103), adding=List(104, 105), removing=List(102), original=List(101, 102, 103), 
isUnderReplicated=false PASSED

AssignmentStateTest > [4] isr=List(101, 102, 103), replicas=List(101, 102, 
103), adding=List(104, 105), removing=List(), original=List(101, 102, 103), 
isUnderReplicated=false STARTED

AssignmentStateTest > [4] isr=List(101, 102, 103), replicas=List(101, 102, 
103), adding=List(104, 105), removing=List(), original=List(101, 102, 103), 
isUnderReplicated=false PASSED

AssignmentStateTest > [5] isr=List(101, 102, 103), replicas=List(101, 102, 
103), adding=List(), removing=List(102), original=List(101, 102, 103), 
isUnderReplicated=false STARTED

AssignmentStateTest > [5] isr=List(101, 102, 103), replicas=List(101, 102, 
103), adding=List(), removing=List(102), original=List(101, 102, 103), 
isUnderReplicated=false PASSED

AssignmentStateTest > [6] isr=List(102, 103), replicas=List(102, 103), 
adding=List(101), removing=List(), original=List(102, 103), 
isUnderReplicated=false STARTED

AssignmentStateTest > [6] isr=List(102, 103), replicas=List(102, 103), 
adding=List(101), removing=List(), original=List(102, 103), 
isUnderReplicated=false PASSED

AssignmentStateTest > [7] isr=List(103, 104, 105), replicas=List(101, 102, 
103), adding=List(104, 105, 106), removing=List(), original=List(101, 102, 
103), isUnderReplicated=false STARTED

AssignmentStateTest > [7] isr=List(103, 104, 105), replicas=List(101, 102, 
103), adding=List(104, 105, 106), removing=List(), original=List(101, 102, 
103), isUnderReplicated=false PASSED

AssignmentStateTest > [8] isr=List(103, 104, 105), replicas=List(101, 102, 
103), adding=List(104, 105, 106), removing=List(), original=List(101, 102, 
103), isUnderReplicated=false STARTED

AssignmentStateTest > [8] isr=List(103, 104, 105), replicas=List(101, 102, 
103), adding=List(104, 105, 106), removing=List(), original=List(101, 102, 
103), isUnderReplicated=false PASSED

AssignmentStateTest > [9] isr=List(103, 104), replicas=List(101, 102, 103), 
adding=List(104, 105, 106), removing=List(), original=List(101, 102, 103), 
isUnderReplicated=true STARTED

AssignmentStateTest > [9] isr=List(103, 104), replicas=List(101, 102, 103), 
adding=List(104, 105, 106), removing=List(), original=List(101, 102, 103), 
isUnderReplicated=true PASSED

BrokerEndPointTest > testFromJsonV4WithNoFeatures() STARTED

BrokerEndPointTest > testFromJsonV4WithNoFeatures() PASSED

BrokerEndPointTest > testEndpointFromUri() STARTED

BrokerEndPointTest > testEndpointFromUri() PASSED

BrokerEndPointTest > testHashAndEquals() STARTED

BrokerEndPointTest > testHashAndEquals() PASSED

BrokerEndPointTest > testFromJsonV4WithNoRack() STARTED

BrokerEndPointTest > testFromJsonV4WithNoRack() PASSED

BrokerEndPointTest > testFromJsonFutureVersion() STARTED

BrokerEndPointTest > testFromJsonFutureVersion() PASSED

BrokerEndPointTest > testFromJsonV4WithNullRack() STARTED

BrokerEndPointTest > testFromJsonV4WithNullRack() PASSED

BrokerEndPointTest > testBrokerEndpointFromUri() STARTED

BrokerEndPointTest > testBrokerEndpointFromUri() PASSED

BrokerEndPointTest > testFromJsonV1() STARTED

BrokerEndPointTest > testFromJsonV1() PASSED

BrokerEndPointTest > testFromJsonV2() STARTED

BrokerEndPointTest > testFromJsonV2() PASSED

BrokerEndPointTest > testFromJsonV3() STARTED

BrokerEndPointTest > testFromJsonV3() PASSED

BrokerEndPointTest > testFromJsonV5() STARTED

BrokerEndPointTest > testFromJsonV5() PASSED

ZooKeeperClientTest > testZNodeChangeHandlerForDataChange() STARTED

ZooKeeperClien

[jira] [Created] (KAFKA-12453) Guidance on whether a topology is eligible for optimisation

2021-03-10 Thread Patrick O'Keeffe (Jira)
Patrick O'Keeffe created KAFKA-12453:


 Summary: Guidance on whether a topology is eligible for 
optimisation
 Key: KAFKA-12453
 URL: https://issues.apache.org/jira/browse/KAFKA-12453
 Project: Kafka
  Issue Type: Improvement
  Components: streams
Reporter: Patrick O'Keeffe


Since the introduction of KStream.toTable() in Kafka 2.6.x, the decision about 
whether a topology is eligible for optimisation is no longer a simple one; it 
depends on whether toTable() operations are preceded by key-changing operators.

This decision requires expert-level knowledge, and getting it wrong has serious 
implications for fault tolerance.

Some ideas spring to mind:
 # Topology.describe() could indicate whether this topology is eligible for 
optimisation.
 # Topologies could be automatically optimised. Note this may have an impact 
at deployment time, in that an application reset may be required; the developer 
would need to be made aware of this and adjust the deployment instructions 
accordingly.
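
As an illustration only (not code from the ticket), a minimal Kafka Streams sketch 
of the situation described above: a toTable() preceded by a key-changing selectKey(), 
with optimisation enabled. The topic names, application id, and bootstrap server are 
hypothetical.

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.KTable;

import java.util.Properties;

public class OptimizationEligibilitySketch {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // toTable() is preceded by a key-changing operation, which is the case
        // the ticket says requires expert judgement about optimisation.
        KTable<String, String> table = builder
            .<String, String>stream("input-topic")
            .selectKey((key, value) -> value)
            .toTable();
        table.toStream().to("output-topic");

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "optimisation-eligibility-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.TOPOLOGY_OPTIMIZATION_CONFIG, StreamsConfig.OPTIMIZE);

        // describe() prints the resulting topology but does not currently state
        // whether optimisation was safe or beneficial; that is the gap the ticket
        // asks to close.
        Topology topology = builder.build(props);
        System.out.println(topology.describe());
    }
}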

 

 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


Build failed in Jenkins: Kafka » kafka-trunk-jdk15 #614

2021-03-10 Thread Apache Jenkins Server
See 


Changes:

[github] KAFKA-12441: remove deprecated method StreamsBuilder#addGlobalStore 
(#10284)


--
[...truncated 3.68 MB...]

AuthorizerIntegrationTest > 
testTransactionalProducerInitTransactionsNoDescribeTransactionalIdAcl() PASSED

AuthorizerIntegrationTest > testAuthorizeByResourceTypeDenyTakesPrecedence() 
STARTED

AuthorizerIntegrationTest > testAuthorizeByResourceTypeDenyTakesPrecedence() 
PASSED

AuthorizerIntegrationTest > testUnauthorizedDeleteRecordsWithDescribe() STARTED

AuthorizerIntegrationTest > testUnauthorizedDeleteRecordsWithDescribe() PASSED

AuthorizerIntegrationTest > testCreateTopicAuthorizationWithClusterCreate() 
STARTED

AuthorizerIntegrationTest > testCreateTopicAuthorizationWithClusterCreate() 
PASSED

AuthorizerIntegrationTest > testOffsetFetchWithTopicAndGroupRead() STARTED

AuthorizerIntegrationTest > testOffsetFetchWithTopicAndGroupRead() PASSED

AuthorizerIntegrationTest > testCommitWithTopicDescribe() STARTED

AuthorizerIntegrationTest > testCommitWithTopicDescribe() PASSED

AuthorizerIntegrationTest > testAuthorizationWithTopicExisting() STARTED

AuthorizerIntegrationTest > testAuthorizationWithTopicExisting() PASSED

AuthorizerIntegrationTest > testUnauthorizedDeleteRecordsWithoutDescribe() 
STARTED

AuthorizerIntegrationTest > testUnauthorizedDeleteRecordsWithoutDescribe() 
PASSED

AuthorizerIntegrationTest > testMetadataWithTopicDescribe() STARTED

AuthorizerIntegrationTest > testMetadataWithTopicDescribe() PASSED

AuthorizerIntegrationTest > testProduceWithTopicDescribe() STARTED

AuthorizerIntegrationTest > testProduceWithTopicDescribe() PASSED

AuthorizerIntegrationTest > testDescribeGroupApiWithNoGroupAcl() STARTED

AuthorizerIntegrationTest > testDescribeGroupApiWithNoGroupAcl() PASSED

AuthorizerIntegrationTest > testPatternSubscriptionMatchingInternalTopic() 
STARTED

AuthorizerIntegrationTest > testPatternSubscriptionMatchingInternalTopic() 
PASSED

AuthorizerIntegrationTest > testSendOffsetsWithNoConsumerGroupDescribeAccess() 
STARTED

AuthorizerIntegrationTest > testSendOffsetsWithNoConsumerGroupDescribeAccess() 
PASSED

AuthorizerIntegrationTest > testListTransactionsAuthorization() STARTED

AuthorizerIntegrationTest > testListTransactionsAuthorization() PASSED

AuthorizerIntegrationTest > testOffsetFetchTopicDescribe() STARTED

AuthorizerIntegrationTest > testOffsetFetchTopicDescribe() PASSED

AuthorizerIntegrationTest > testCommitWithTopicAndGroupRead() STARTED

AuthorizerIntegrationTest > testCommitWithTopicAndGroupRead() PASSED

AuthorizerIntegrationTest > 
testIdempotentProducerNoIdempotentWriteAclInInitProducerId() STARTED

AuthorizerIntegrationTest > 
testIdempotentProducerNoIdempotentWriteAclInInitProducerId() PASSED

AuthorizerIntegrationTest > testSimpleConsumeWithExplicitSeekAndNoGroupAccess() 
STARTED

AuthorizerIntegrationTest > testSimpleConsumeWithExplicitSeekAndNoGroupAccess() 
PASSED

SslProducerSendTest > testSendNonCompressedMessageWithCreateTime() STARTED

SslProducerSendTest > testSendNonCompressedMessageWithCreateTime() PASSED

SslProducerSendTest > testClose() STARTED

SslProducerSendTest > testClose() PASSED

SslProducerSendTest > testFlush() STARTED

SslProducerSendTest > testFlush() PASSED

SslProducerSendTest > testSendToPartition() STARTED

SslProducerSendTest > testSendToPartition() PASSED

SslProducerSendTest > testSendOffset() STARTED

SslProducerSendTest > testSendOffset() PASSED

SslProducerSendTest > testSendCompressedMessageWithCreateTime() STARTED

SslProducerSendTest > testSendCompressedMessageWithCreateTime() PASSED

SslProducerSendTest > testCloseWithZeroTimeoutFromCallerThread() STARTED

SslProducerSendTest > testCloseWithZeroTimeoutFromCallerThread() PASSED

SslProducerSendTest > testCloseWithZeroTimeoutFromSenderThread() STARTED

SslProducerSendTest > testCloseWithZeroTimeoutFromSenderThread() PASSED

SslProducerSendTest > testSendBeforeAndAfterPartitionExpansion() STARTED

SslProducerSendTest > testSendBeforeAndAfterPartitionExpansion() PASSED

ProducerCompressionTest > [1] compression=none STARTED

ProducerCompressionTest > [1] compression=none PASSED

ProducerCompressionTest > [2] compression=gzip STARTED

ProducerCompressionTest > [2] compression=gzip PASSED

ProducerCompressionTest > [3] compression=snappy STARTED

ProducerCompressionTest > [3] compression=snappy PASSED

ProducerCompressionTest > [4] compression=lz4 STARTED

ProducerCompressionTest > [4] compression=lz4 PASSED

ProducerCompressionTest > [5] compression=zstd STARTED

ProducerCompressionTest > [5] compression=zstd PASSED

MetricsTest > testMetrics() STARTED

MetricsTest > testMetrics() PASSED

ProducerFailureHandlingTest > testCannotSendToInternalTopic() STARTED

ProducerFailureHandlingTest > testCannotSendToInternalTopic() PASSED

ProducerFailureHandlingTest > testTooLargeRecordWithAckOne() STARTED

Pro

Build failed in Jenkins: Kafka » kafka-trunk-jdk11 #587

2021-03-10 Thread Apache Jenkins Server
See 


Changes:

[github] KAFKA-12441: remove deprecated method StreamsBuilder#addGlobalStore 
(#10284)


--
[...truncated 3.66 MB...]
BrokerCompressionTest > [11] messageCompression=none, brokerCompression=lz4 
PASSED

BrokerCompressionTest > [12] messageCompression=gzip, brokerCompression=lz4 
STARTED

BrokerCompressionTest > [12] messageCompression=gzip, brokerCompression=lz4 
PASSED

BrokerCompressionTest > [13] messageCompression=snappy, brokerCompression=lz4 
STARTED

BrokerCompressionTest > [13] messageCompression=snappy, brokerCompression=lz4 
PASSED

BrokerCompressionTest > [14] messageCompression=lz4, brokerCompression=lz4 
STARTED

BrokerCompressionTest > [14] messageCompression=lz4, brokerCompression=lz4 
PASSED

BrokerCompressionTest > [15] messageCompression=zstd, brokerCompression=lz4 
STARTED

BrokerCompressionTest > [15] messageCompression=zstd, brokerCompression=lz4 
PASSED

BrokerCompressionTest > [16] messageCompression=none, brokerCompression=snappy 
STARTED

BrokerCompressionTest > [16] messageCompression=none, brokerCompression=snappy 
PASSED

BrokerCompressionTest > [17] messageCompression=gzip, brokerCompression=snappy 
STARTED

BrokerCompressionTest > [17] messageCompression=gzip, brokerCompression=snappy 
PASSED

BrokerCompressionTest > [18] messageCompression=snappy, 
brokerCompression=snappy STARTED

BrokerCompressionTest > [18] messageCompression=snappy, 
brokerCompression=snappy PASSED

BrokerCompressionTest > [19] messageCompression=lz4, brokerCompression=snappy 
STARTED

BrokerCompressionTest > [19] messageCompression=lz4, brokerCompression=snappy 
PASSED

BrokerCompressionTest > [20] messageCompression=zstd, brokerCompression=snappy 
STARTED

BrokerCompressionTest > [20] messageCompression=zstd, brokerCompression=snappy 
PASSED

BrokerCompressionTest > [21] messageCompression=none, brokerCompression=gzip 
STARTED

BrokerCompressionTest > [21] messageCompression=none, brokerCompression=gzip 
PASSED

BrokerCompressionTest > [22] messageCompression=gzip, brokerCompression=gzip 
STARTED

BrokerCompressionTest > [22] messageCompression=gzip, brokerCompression=gzip 
PASSED

BrokerCompressionTest > [23] messageCompression=snappy, brokerCompression=gzip 
STARTED

BrokerCompressionTest > [23] messageCompression=snappy, brokerCompression=gzip 
PASSED

BrokerCompressionTest > [24] messageCompression=lz4, brokerCompression=gzip 
STARTED

BrokerCompressionTest > [24] messageCompression=lz4, brokerCompression=gzip 
PASSED

BrokerCompressionTest > [25] messageCompression=zstd, brokerCompression=gzip 
STARTED

BrokerCompressionTest > [25] messageCompression=zstd, brokerCompression=gzip 
PASSED

BrokerCompressionTest > [26] messageCompression=none, 
brokerCompression=producer STARTED

BrokerCompressionTest > [26] messageCompression=none, 
brokerCompression=producer PASSED

BrokerCompressionTest > [27] messageCompression=gzip, 
brokerCompression=producer STARTED

BrokerCompressionTest > [27] messageCompression=gzip, 
brokerCompression=producer PASSED

BrokerCompressionTest > [28] messageCompression=snappy, 
brokerCompression=producer STARTED

BrokerCompressionTest > [28] messageCompression=snappy, 
brokerCompression=producer PASSED

BrokerCompressionTest > [29] messageCompression=lz4, brokerCompression=producer 
STARTED

BrokerCompressionTest > [29] messageCompression=lz4, brokerCompression=producer 
PASSED

BrokerCompressionTest > [30] messageCompression=zstd, 
brokerCompression=producer STARTED

BrokerCompressionTest > [30] messageCompression=zstd, 
brokerCompression=producer PASSED

DefaultMessageFormatterTest > [1] name=print nothing, 
record=ConsumerRecord(topic = someTopic, partition = 9, leaderEpoch = null, 
offset = 9876, CreateTime = 1234, serialized key size = 0, serialized value 
size = 0, headers = RecordHeaders(headers = [RecordHeader(key = h1, value = 
[118, 49]), RecordHeader(key = h2, value = [118, 50])], isReadOnly = false), 
key = [B@164d1e7e, value = [B@b909aed), properties=Map(print.value -> false), 
expected= STARTED

DefaultMessageFormatterTest > [1] name=print nothing, 
record=ConsumerRecord(topic = someTopic, partition = 9, leaderEpoch = null, 
offset = 9876, CreateTime = 1234, serialized key size = 0, serialized value 
size = 0, headers = RecordHeaders(headers = [RecordHeader(key = h1, value = 
[118, 49]), RecordHeader(key = h2, value = [118, 50])], isReadOnly = false), 
key = [B@164d1e7e, value = [B@b909aed), properties=Map(print.value -> false), 
expected= PASSED

DefaultMessageFormatterTest > [2] name=print key, record=ConsumerRecord(topic = 
someTopic, partition = 9, leaderEpoch = null, offset = 9876, CreateTime = 1234, 
serialized key size = 0, serialized value size = 0, headers = 
RecordHeaders(headers = [RecordHeader(key = h1, value = [118, 49]), 
RecordHeader(key = h2, value = [118, 50])], isReadOnly = false), key = 
[B@3863

[jira] [Resolved] (KAFKA-12287) Add WARN logging on consumer-groups when reset-offsets by timestamp or duration can't find an offset and defaults to latest.

2021-03-10 Thread Matthias J. Sax (Jira)


 [ 
https://issues.apache.org/jira/browse/KAFKA-12287?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matthias J. Sax resolved KAFKA-12287.
-
Fix Version/s: 3.0.0
   Resolution: Fixed

> Add WARN logging on consumer-groups when reset-offsets by timestamp or 
> duration can't find an offset and defaults to latest.
> 
>
> Key: KAFKA-12287
> URL: https://issues.apache.org/jira/browse/KAFKA-12287
> Project: Kafka
>  Issue Type: Improvement
>Reporter: Jorge Esteban Quilcate Otoya
>Assignee: Jorge Esteban Quilcate Otoya
>Priority: Minor
> Fix For: 3.0.0
>
>
> From https://issues.apache.org/jira/browse/KAFKA-9527
>  
> Warn when resetting offsets by timestamp in a topic partition that does not 
> have records and returns null, to explicitly say that we are resetting to the 
> latest offset available (e.g. zero in the case where no records have been 
> stored in a TP yet).
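
For illustration (this is not the consumer-groups tool's code), a minimal 
consumer-based sketch of the fallback that the new WARN message calls out; the 
bootstrap server, group id, topic, and timestamp are hypothetical.

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.OffsetAndTimestamp;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

import java.util.Collections;
import java.util.Map;
import java.util.Properties;

public class ResetByTimestampSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group");

        try (KafkaConsumer<byte[], byte[]> consumer =
                 new KafkaConsumer<>(props, new ByteArrayDeserializer(), new ByteArrayDeserializer())) {
            TopicPartition tp = new TopicPartition("someTopic", 0);
            long resetTimestamp = System.currentTimeMillis() - 3_600_000L; // e.g. one hour ago

            // offsetsForTimes() maps the partition to null when it holds no record
            // at or after the requested timestamp.
            Map<TopicPartition, OffsetAndTimestamp> byTime =
                consumer.offsetsForTimes(Collections.singletonMap(tp, resetTimestamp));
            OffsetAndTimestamp found = byTime.get(tp);

            long target;
            if (found == null) {
                // This is the silent fallback the ticket wants surfaced with a WARN.
                target = consumer.endOffsets(Collections.singletonList(tp)).get(tp);
                System.out.printf("WARN: no offset for timestamp %d on %s; defaulting to latest (%d)%n",
                                  resetTimestamp, tp, target);
            } else {
                target = found.offset();
            }
            System.out.printf("Would reset %s to offset %d%n", tp, target);
        }
    }
}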



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


Build failed in Jenkins: Kafka » kafka-trunk-jdk15 #613

2021-03-10 Thread Apache Jenkins Server
See 


Changes:

[github] KAFKA-10062: Add a methods to retrieve the current timestamps as known 
by the Streams app (#9744)


--
[...truncated 3.68 MB...]
AuthorizerIntegrationTest > 
shouldThrowTransactionalIdAuthorizationExceptionWhenNoTransactionAccessOnEndTransaction()
 STARTED

AuthorizerIntegrationTest > 
shouldThrowTransactionalIdAuthorizationExceptionWhenNoTransactionAccessOnEndTransaction()
 PASSED

AuthorizerIntegrationTest > 
shouldThrowTransactionalIdAuthorizationExceptionWhenNoTransactionAccessOnSendOffsetsToTxn()
 STARTED

AuthorizerIntegrationTest > 
shouldThrowTransactionalIdAuthorizationExceptionWhenNoTransactionAccessOnSendOffsetsToTxn()
 PASSED

AuthorizerIntegrationTest > testCommitWithNoGroupAccess() STARTED

AuthorizerIntegrationTest > testCommitWithNoGroupAccess() PASSED

AuthorizerIntegrationTest > 
testTransactionalProducerInitTransactionsNoDescribeTransactionalIdAcl() STARTED

AuthorizerIntegrationTest > 
testTransactionalProducerInitTransactionsNoDescribeTransactionalIdAcl() PASSED

AuthorizerIntegrationTest > testAuthorizeByResourceTypeDenyTakesPrecedence() 
STARTED

AuthorizerIntegrationTest > testAuthorizeByResourceTypeDenyTakesPrecedence() 
PASSED

AuthorizerIntegrationTest > testUnauthorizedDeleteRecordsWithDescribe() STARTED

AuthorizerIntegrationTest > testUnauthorizedDeleteRecordsWithDescribe() PASSED

AuthorizerIntegrationTest > testCreateTopicAuthorizationWithClusterCreate() 
STARTED

AuthorizerIntegrationTest > testCreateTopicAuthorizationWithClusterCreate() 
PASSED

AuthorizerIntegrationTest > testOffsetFetchWithTopicAndGroupRead() STARTED

AuthorizerIntegrationTest > testOffsetFetchWithTopicAndGroupRead() PASSED

AuthorizerIntegrationTest > testCommitWithTopicDescribe() STARTED

AuthorizerIntegrationTest > testCommitWithTopicDescribe() PASSED

AuthorizerIntegrationTest > testAuthorizationWithTopicExisting() STARTED

AuthorizerIntegrationTest > testAuthorizationWithTopicExisting() PASSED

AuthorizerIntegrationTest > testUnauthorizedDeleteRecordsWithoutDescribe() 
STARTED

AuthorizerIntegrationTest > testUnauthorizedDeleteRecordsWithoutDescribe() 
PASSED

AuthorizerIntegrationTest > testMetadataWithTopicDescribe() STARTED

AuthorizerIntegrationTest > testMetadataWithTopicDescribe() PASSED

AuthorizerIntegrationTest > testProduceWithTopicDescribe() STARTED

AuthorizerIntegrationTest > testProduceWithTopicDescribe() PASSED

AuthorizerIntegrationTest > testDescribeGroupApiWithNoGroupAcl() STARTED

AuthorizerIntegrationTest > testDescribeGroupApiWithNoGroupAcl() PASSED

AuthorizerIntegrationTest > testPatternSubscriptionMatchingInternalTopic() 
STARTED

AuthorizerIntegrationTest > testPatternSubscriptionMatchingInternalTopic() 
PASSED

AuthorizerIntegrationTest > testSendOffsetsWithNoConsumerGroupDescribeAccess() 
STARTED

AuthorizerIntegrationTest > testSendOffsetsWithNoConsumerGroupDescribeAccess() 
PASSED

AuthorizerIntegrationTest > testListTransactionsAuthorization() STARTED

AuthorizerIntegrationTest > testListTransactionsAuthorization() PASSED

AuthorizerIntegrationTest > testOffsetFetchTopicDescribe() STARTED

AuthorizerIntegrationTest > testOffsetFetchTopicDescribe() PASSED

AuthorizerIntegrationTest > testCommitWithTopicAndGroupRead() STARTED

AuthorizerIntegrationTest > testCommitWithTopicAndGroupRead() PASSED

AuthorizerIntegrationTest > 
testIdempotentProducerNoIdempotentWriteAclInInitProducerId() STARTED

AuthorizerIntegrationTest > 
testIdempotentProducerNoIdempotentWriteAclInInitProducerId() PASSED

AuthorizerIntegrationTest > testSimpleConsumeWithExplicitSeekAndNoGroupAccess() 
STARTED

AuthorizerIntegrationTest > testSimpleConsumeWithExplicitSeekAndNoGroupAccess() 
PASSED

SslProducerSendTest > testSendNonCompressedMessageWithCreateTime() STARTED

SslProducerSendTest > testSendNonCompressedMessageWithCreateTime() PASSED

SslProducerSendTest > testClose() STARTED

SslProducerSendTest > testClose() PASSED

SslProducerSendTest > testFlush() STARTED

SslProducerSendTest > testFlush() PASSED

SslProducerSendTest > testSendToPartition() STARTED

SslProducerSendTest > testSendToPartition() PASSED

SslProducerSendTest > testSendOffset() STARTED

SslProducerSendTest > testSendOffset() PASSED

SslProducerSendTest > testSendCompressedMessageWithCreateTime() STARTED

SslProducerSendTest > testSendCompressedMessageWithCreateTime() PASSED

SslProducerSendTest > testCloseWithZeroTimeoutFromCallerThread() STARTED

SslProducerSendTest > testCloseWithZeroTimeoutFromCallerThread() PASSED

SslProducerSendTest > testCloseWithZeroTimeoutFromSenderThread() STARTED

SslProducerSendTest > testCloseWithZeroTimeoutFromSenderThread() PASSED

SslProducerSendTest > testSendBeforeAndAfterPartitionExpansion() STARTED

SslProducerSendTest > testSendBeforeAndAfterPartitionExpansion() PASSED

ProducerCompressionTest > [1] compression=none STARTED

Produce

[jira] [Created] (KAFKA-12452) Remove deprecated overloads for ProcessorContext#forward

2021-03-10 Thread Matthias J. Sax (Jira)
Matthias J. Sax created KAFKA-12452:
---

 Summary: Remove deprecated overloads for ProcessorContext#forward
 Key: KAFKA-12452
 URL: https://issues.apache.org/jira/browse/KAFKA-12452
 Project: Kafka
  Issue Type: Sub-task
  Components: streams
Reporter: Matthias J. Sax
 Fix For: 3.0.0


[https://cwiki.apache.org/confluence/display/KAFKA/KIP-251%3A+Allow+timestamp+manipulation+in+Processor+API]
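
For context, a minimal sketch (mine, not from the ticket) contrasting one of the 
forward() overloads deprecated by KIP-251 with its To-based replacement; the child 
node name "child-node" is hypothetical.

import org.apache.kafka.streams.processor.Processor;
import org.apache.kafka.streams.processor.ProcessorContext;
import org.apache.kafka.streams.processor.To;

public class ForwardSketch implements Processor<String, String> {
    private ProcessorContext context;

    @Override
    public void init(ProcessorContext context) {
        this.context = context;
    }

    @Override
    public void process(String key, String value) {
        // Deprecated overload of the kind this ticket removes (routing by child name directly):
        // context.forward(key, value, "child-node");

        // KIP-251 replacement: route through To, which can also carry a timestamp.
        context.forward(key, value, To.child("child-node"));
    }

    @Override
    public void close() { }
}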
 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


Build failed in Jenkins: Kafka » kafka-trunk-jdk11 #586

2021-03-10 Thread Apache Jenkins Server
See 


Changes:

[github] KAFKA-10062: Add a methods to retrieve the current timestamps as known 
by the Streams app (#9744)


--
[...truncated 3.68 MB...]

KafkaZkClientTest > testSetTopicPartitionStatesRaw() PASSED

KafkaZkClientTest > testAclManagementMethods() STARTED

KafkaZkClientTest > testAclManagementMethods() PASSED

KafkaZkClientTest > testPreferredReplicaElectionMethods() STARTED

KafkaZkClientTest > testPreferredReplicaElectionMethods() PASSED

KafkaZkClientTest > testPropagateLogDir() STARTED

KafkaZkClientTest > testPropagateLogDir() PASSED

KafkaZkClientTest > testGetDataAndStat() STARTED

KafkaZkClientTest > testGetDataAndStat() PASSED

KafkaZkClientTest > testReassignPartitionsInProgress() STARTED

KafkaZkClientTest > testReassignPartitionsInProgress() PASSED

KafkaZkClientTest > testCreateTopLevelPaths() STARTED

KafkaZkClientTest > testCreateTopLevelPaths() PASSED

KafkaZkClientTest > testGetAllTopicsInClusterDoesNotTriggerWatch() STARTED

KafkaZkClientTest > testGetAllTopicsInClusterDoesNotTriggerWatch() PASSED

KafkaZkClientTest > testIsrChangeNotificationGetters() STARTED

KafkaZkClientTest > testIsrChangeNotificationGetters() PASSED

KafkaZkClientTest > testLogDirEventNotificationsDeletion() STARTED

KafkaZkClientTest > testLogDirEventNotificationsDeletion() PASSED

KafkaZkClientTest > testGetLogConfigs() STARTED

KafkaZkClientTest > testGetLogConfigs() PASSED

KafkaZkClientTest > testBrokerSequenceIdMethods() STARTED

KafkaZkClientTest > testBrokerSequenceIdMethods() PASSED

KafkaZkClientTest > testAclMethods() STARTED

KafkaZkClientTest > testAclMethods() PASSED

KafkaZkClientTest > testCreateSequentialPersistentPath() STARTED

KafkaZkClientTest > testCreateSequentialPersistentPath() PASSED

KafkaZkClientTest > testConditionalUpdatePath() STARTED

KafkaZkClientTest > testConditionalUpdatePath() PASSED

KafkaZkClientTest > testGetAllTopicsInClusterTriggersWatch() STARTED

KafkaZkClientTest > testGetAllTopicsInClusterTriggersWatch() PASSED

KafkaZkClientTest > testDeleteTopicZNode() STARTED

KafkaZkClientTest > testDeleteTopicZNode() PASSED

KafkaZkClientTest > testDeletePath() STARTED

KafkaZkClientTest > testDeletePath() PASSED

KafkaZkClientTest > testGetBrokerMethods() STARTED

KafkaZkClientTest > testGetBrokerMethods() PASSED

KafkaZkClientTest > testCreateTokenChangeNotification() STARTED

KafkaZkClientTest > testCreateTokenChangeNotification() PASSED

KafkaZkClientTest > testGetTopicsAndPartitions() STARTED

KafkaZkClientTest > testGetTopicsAndPartitions() PASSED

KafkaZkClientTest > testRegisterBrokerInfo() STARTED

KafkaZkClientTest > testRegisterBrokerInfo() PASSED

KafkaZkClientTest > testRetryRegisterBrokerInfo() STARTED

KafkaZkClientTest > testRetryRegisterBrokerInfo() PASSED

KafkaZkClientTest > testConsumerOffsetPath() STARTED

KafkaZkClientTest > testConsumerOffsetPath() PASSED

KafkaZkClientTest > testDeleteRecursiveWithControllerEpochVersionCheck() STARTED

KafkaZkClientTest > testDeleteRecursiveWithControllerEpochVersionCheck() PASSED

KafkaZkClientTest > testTopicAssignments() STARTED

KafkaZkClientTest > testTopicAssignments() PASSED

KafkaZkClientTest > testControllerManagementMethods() STARTED

KafkaZkClientTest > testControllerManagementMethods() PASSED

KafkaZkClientTest > testTopicAssignmentMethods() STARTED

KafkaZkClientTest > testTopicAssignmentMethods() PASSED

KafkaZkClientTest > testConnectionViaNettyClient() STARTED

KafkaZkClientTest > testConnectionViaNettyClient() PASSED

KafkaZkClientTest > testPropagateIsrChanges() STARTED

KafkaZkClientTest > testPropagateIsrChanges() PASSED

KafkaZkClientTest > testControllerEpochMethods() STARTED

KafkaZkClientTest > testControllerEpochMethods() PASSED

KafkaZkClientTest > testDeleteRecursive() STARTED

KafkaZkClientTest > testDeleteRecursive() PASSED

KafkaZkClientTest > testGetTopicPartitionStates() STARTED

KafkaZkClientTest > testGetTopicPartitionStates() PASSED

KafkaZkClientTest > testCreateConfigChangeNotification() STARTED

KafkaZkClientTest > testCreateConfigChangeNotification() PASSED

KafkaZkClientTest > testDelegationTokenMethods() STARTED

KafkaZkClientTest > testDelegationTokenMethods() PASSED

LiteralAclStoreTest > shouldHaveCorrectPaths() STARTED

LiteralAclStoreTest > shouldHaveCorrectPaths() PASSED

LiteralAclStoreTest > shouldRoundTripChangeNode() STARTED

LiteralAclStoreTest > shouldRoundTripChangeNode() PASSED

LiteralAclStoreTest > shouldThrowFromEncodeOnNoneLiteral() STARTED

LiteralAclStoreTest > shouldThrowFromEncodeOnNoneLiteral() PASSED

LiteralAclStoreTest > shouldWriteChangesToTheWritePath() STARTED

LiteralAclStoreTest > shouldWriteChangesToTheWritePath() PASSED

LiteralAclStoreTest > shouldHaveCorrectPatternType() STARTED

LiteralAclStoreTest > shouldHaveCorrectPatternType() PASSED

LiteralAclStoreTest > shouldDecodeResourceUsingTwoPartLogic() 

Build failed in Jenkins: Kafka » kafka-trunk-jdk8 #557

2021-03-10 Thread Apache Jenkins Server
See 


Changes:

[github] KAFKA-10062: Add a methods to retrieve the current timestamps as known 
by the Streams app (#9744)


--
[...truncated 3.66 MB...]

LogValidatorTest > testOffsetAssignmentAfterDownConversionV2ToV1Compressed() 
PASSED

LogValidatorTest > testOffsetAssignmentAfterDownConversionV1ToV0Compressed() 
STARTED

LogValidatorTest > testOffsetAssignmentAfterDownConversionV1ToV0Compressed() 
PASSED

LogValidatorTest > testOffsetAssignmentAfterUpConversionV0ToV2Compressed() 
STARTED

LogValidatorTest > testOffsetAssignmentAfterUpConversionV0ToV2Compressed() 
PASSED

LogValidatorTest > testNonCompressedV1() STARTED

LogValidatorTest > testNonCompressedV1() PASSED

LogValidatorTest > testNonCompressedV2() STARTED

LogValidatorTest > testNonCompressedV2() PASSED

LogValidatorTest > testOffsetAssignmentAfterUpConversionV1ToV2NonCompressed() 
STARTED

LogValidatorTest > testOffsetAssignmentAfterUpConversionV1ToV2NonCompressed() 
PASSED

LogValidatorTest > testInvalidCreateTimeCompressedV1() STARTED

LogValidatorTest > testInvalidCreateTimeCompressedV1() PASSED

LogValidatorTest > testInvalidCreateTimeCompressedV2() STARTED

LogValidatorTest > testInvalidCreateTimeCompressedV2() PASSED

LogValidatorTest > testNonIncreasingOffsetRecordBatchHasMetricsLogged() STARTED

LogValidatorTest > testNonIncreasingOffsetRecordBatchHasMetricsLogged() PASSED

LogValidatorTest > testRecompressionV1() STARTED

LogValidatorTest > testRecompressionV1() PASSED

LogValidatorTest > testRecompressionV2() STARTED

LogValidatorTest > testRecompressionV2() PASSED

ProducerStateManagerTest > testSkipEmptyTransactions() STARTED

ProducerStateManagerTest > testSkipEmptyTransactions() PASSED

ProducerStateManagerTest > testControlRecordBumpsProducerEpoch() STARTED

ProducerStateManagerTest > testControlRecordBumpsProducerEpoch() PASSED

ProducerStateManagerTest > testProducerSequenceWithWrapAroundBatchRecord() 
STARTED

ProducerStateManagerTest > testProducerSequenceWithWrapAroundBatchRecord() 
PASSED

ProducerStateManagerTest > testCoordinatorFencing() STARTED

ProducerStateManagerTest > testCoordinatorFencing() PASSED

ProducerStateManagerTest > testLoadFromTruncatedSnapshotFile() STARTED

ProducerStateManagerTest > testLoadFromTruncatedSnapshotFile() PASSED

ProducerStateManagerTest > testTruncateFullyAndStartAt() STARTED

ProducerStateManagerTest > testTruncateFullyAndStartAt() PASSED

ProducerStateManagerTest > testRemoveExpiredPidsOnReload() STARTED

ProducerStateManagerTest > testRemoveExpiredPidsOnReload() PASSED

ProducerStateManagerTest > testRecoverFromSnapshotFinishedTransaction() STARTED

ProducerStateManagerTest > testRecoverFromSnapshotFinishedTransaction() PASSED

ProducerStateManagerTest > testOutOfSequenceAfterControlRecordEpochBump() 
STARTED

ProducerStateManagerTest > testOutOfSequenceAfterControlRecordEpochBump() PASSED

ProducerStateManagerTest > testFirstUnstableOffsetAfterTruncation() STARTED

ProducerStateManagerTest > testFirstUnstableOffsetAfterTruncation() PASSED

ProducerStateManagerTest > testTakeSnapshot() STARTED

ProducerStateManagerTest > testTakeSnapshot() PASSED

ProducerStateManagerTest > testRecoverFromSnapshotUnfinishedTransaction() 
STARTED

ProducerStateManagerTest > testRecoverFromSnapshotUnfinishedTransaction() PASSED

ProducerStateManagerTest > testDeleteSnapshotsBefore() STARTED

ProducerStateManagerTest > testDeleteSnapshotsBefore() PASSED

ProducerStateManagerTest > testAppendEmptyControlBatch() STARTED

ProducerStateManagerTest > testAppendEmptyControlBatch() PASSED

ProducerStateManagerTest > testNoValidationOnFirstEntryWhenLoadingLog() STARTED

ProducerStateManagerTest > testNoValidationOnFirstEntryWhenLoadingLog() PASSED

ProducerStateManagerTest > testRemoveStraySnapshotsKeepCleanShutdownSnapshot() 
STARTED

ProducerStateManagerTest > testRemoveStraySnapshotsKeepCleanShutdownSnapshot() 
PASSED

ProducerStateManagerTest > testRemoveAllStraySnapshots() STARTED

ProducerStateManagerTest > testRemoveAllStraySnapshots() PASSED

ProducerStateManagerTest > testLoadFromEmptySnapshotFile() STARTED

ProducerStateManagerTest > testLoadFromEmptySnapshotFile() PASSED

ProducerStateManagerTest > testProducersWithOngoingTransactionsDontExpire() 
STARTED

ProducerStateManagerTest > testProducersWithOngoingTransactionsDontExpire() 
PASSED

ProducerStateManagerTest > testBasicIdMapping() STARTED

ProducerStateManagerTest > testBasicIdMapping() PASSED

ProducerStateManagerTest > updateProducerTransactionState() STARTED

ProducerStateManagerTest > updateProducerTransactionState() PASSED

ProducerStateManagerTest > testPrepareUpdateDoesNotMutate() STARTED

ProducerStateManagerTest > testPrepareUpdateDoesNotMutate() PASSED

ProducerStateManagerTest > testSequenceNotValidatedForGroupMetadataTopic() 
STARTED

ProducerStateManagerTest > testSequenceNotValidatedForGroupM

[jira] [Resolved] (KAFKA-12448) STATE_DIR_CONFIG path cannot be found in windows due to usage of setPosixFilePermissions

2021-03-10 Thread Matthias J. Sax (Jira)


 [ 
https://issues.apache.org/jira/browse/KAFKA-12448?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matthias J. Sax resolved KAFKA-12448.
-
Resolution: Duplicate

This should already be fixed via 
https://issues.apache.org/jira/browse/KAFKA-12190 – the 2.6.2 bug-fix release 
should be out soon.

Closing this as a duplicate. Feel free to reopen if the issue persists in the 
2.6.2 release.

> STATE_DIR_CONFIG path cannot be found in windows due to usage of 
> setPosixFilePermissions
> 
>
> Key: KAFKA-12448
> URL: https://issues.apache.org/jira/browse/KAFKA-12448
> Project: Kafka
>  Issue Type: Bug
>  Components: streams
>Affects Versions: 2.6.1
> Environment: Windows 10, Intellij, Spring-Kafka.2.6.6
>Reporter: Ahmet Yortanlı
>Priority: Minor
>
> STATE_DIR_CONFIG path cannot be found on Windows due to usage of 
> setPosixFilePermissions.
> I am trying to develop a small application using the Spring-Kafka library. 
> My application cannot be started on Windows because in the 
> org.apache.kafka.streams.processor.internals.StateDirectory class, file 
> permission operations are performed assuming the underlying OS supports a 
> PosixFileSystem. WindowsFileSystemProvider returns null when the 
> getFileAttributeView method is called.
> It is not a big problem; however, it blocks working on Windows machines.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


Re: Gradle error - aggregatedJavadoc depending on "compileJava"

2021-03-10 Thread Ismael Juma
Can you please verify if the issue occurs with `./gradlew`? That's the only
supported mechanism.

Ismael

On Wed, Mar 10, 2021 at 1:31 PM Alexandre Dupriez <
alexandre.dupr...@gmail.com> wrote:

> Hi Ismael,
>
> Thanks for your quick response. I used the system-installed gradle.
> Surprisingly, the same command works successfully as well on my laptop
> but not on the target environment - both use the same version of
> Gradle (6.8.3) and JDK (1.8), and start with an empty Gradle cache.
> Note the error is given before any compilation actually happens, at
> the "configure" stage of the build.
>
> Thanks,
> Alexandre
>
> On Wed, Mar 10, 2021 at 20:58, Ismael Juma  wrote:
> >
> > Hi Alexandre,
> >
> > Did you use `./gradlew releaseTarGz` (you should never use the system
> > installed gradle)? That works for me. Also `./gradlew clean releaseTarGz`
> > works for me.
> >
> > Ismael
> >
> > On Wed, Mar 10, 2021 at 8:42 AM Ismael Juma  wrote:
> >
> > > Interesting, I didn't have the same problem. I'll try to reproduce.
> > >
> > > Ismael
> > >
> > > On Wed, Mar 10, 2021, 6:07 AM Alexandre Dupriez <
> > > alexandre.dupr...@gmail.com> wrote:
> > >
> > >> Hi Community,
> > >>
> > >> I tried to build Kafka from trunk on my environment today (2021, March
> > >> 10th) and it failed with the following Gradle error at the beginning
> > >> of the build, while Gradle configures project from build.gradle:
> > >>
> > >>   "Could not get unknown property 'compileJava' for root project
> > >> '' of type org.gradle.api.Project."
> > >>
> > >> The command used is "gradle releaseTarGz". Removing "dependsOn:
> > >> compileJava" from the task "aggregatedJavadoc" (added on March 9th
> > >> [1]) made the problem disappear - I wonder if anyone else encountered
> > >> the same problem?
> > >>
> > >> [1] https://github.com/apache/kafka/pull/10272
> > >>
> > >> Many thanks,
> > >> Alexandre
> > >>
> > >
>


Build failed in Jenkins: Kafka » kafka-2.8-jdk8 #60

2021-03-10 Thread Apache Jenkins Server
See 


Changes:

[Colin McCabe] MINOR: Disable transactional/idempotent system tests for Raft 
quorums (#10224)


--
[...truncated 3.61 MB...]
SocketServerTest > testConnectionRatePerIp() PASSED

SocketServerTest > processCompletedSendException() STARTED

SocketServerTest > processCompletedSendException() PASSED

SocketServerTest > processDisconnectedException() STARTED

SocketServerTest > processDisconnectedException() PASSED

SocketServerTest > closingChannelWithBufferedReceives() STARTED

SocketServerTest > closingChannelWithBufferedReceives() PASSED

SocketServerTest > sendCancelledKeyException() STARTED

SocketServerTest > sendCancelledKeyException() PASSED

SocketServerTest > processCompletedReceiveException() STARTED

SocketServerTest > processCompletedReceiveException() PASSED

SocketServerTest > testControlPlaneAsPrivilegedListener() STARTED

SocketServerTest > testControlPlaneAsPrivilegedListener() PASSED

SocketServerTest > closingChannelSendFailure() STARTED

SocketServerTest > closingChannelSendFailure() PASSED

SocketServerTest > idleExpiryWithBufferedReceives() STARTED

SocketServerTest > idleExpiryWithBufferedReceives() PASSED

SocketServerTest > testSocketsCloseOnShutdown() STARTED

SocketServerTest > testSocketsCloseOnShutdown() PASSED

SocketServerTest > 
testNoOpActionResponseWithThrottledChannelWhereThrottlingAlreadyDone() STARTED

SocketServerTest > 
testNoOpActionResponseWithThrottledChannelWhereThrottlingAlreadyDone() PASSED

SocketServerTest > pollException() STARTED

SocketServerTest > pollException() PASSED

SocketServerTest > closingChannelWithBufferedReceivesFailedSend() STARTED

SocketServerTest > closingChannelWithBufferedReceivesFailedSend() PASSED

SocketServerTest > remoteCloseWithBufferedReceives() STARTED

SocketServerTest > remoteCloseWithBufferedReceives() PASSED

SocketServerTest > testThrottledSocketsClosedOnShutdown() STARTED

SocketServerTest > testThrottledSocketsClosedOnShutdown() PASSED

SocketServerTest > closingChannelWithCompleteAndIncompleteBufferedReceives() 
STARTED

SocketServerTest > closingChannelWithCompleteAndIncompleteBufferedReceives() 
PASSED

SocketServerTest > testInterBrokerListenerAsPrivilegedListener() STARTED

SocketServerTest > testInterBrokerListenerAsPrivilegedListener() PASSED

SocketServerTest > testSslSocketServer() STARTED

SocketServerTest > testSslSocketServer() PASSED

SocketServerTest > testDisabledRequestIsRejected() STARTED

SocketServerTest > testDisabledRequestIsRejected() PASSED

SocketServerTest > tooBigRequestIsRejected() STARTED

SocketServerTest > tooBigRequestIsRejected() PASSED

SocketServerTest > 
testNoOpActionResponseWithThrottledChannelWhereThrottlingInProgress() STARTED

SocketServerTest > 
testNoOpActionResponseWithThrottledChannelWhereThrottlingInProgress() PASSED

InterBrokerSendThreadTest > shutdownThreadShouldNotCauseException() STARTED

InterBrokerSendThreadTest > shutdownThreadShouldNotCauseException() PASSED

InterBrokerSendThreadTest > shouldCreateClientRequestAndSendWhenNodeIsReady() 
STARTED

InterBrokerSendThreadTest > shouldCreateClientRequestAndSendWhenNodeIsReady() 
PASSED

InterBrokerSendThreadTest > testFailingExpiredRequests() STARTED

InterBrokerSendThreadTest > testFailingExpiredRequests() PASSED

InterBrokerSendThreadTest > 
shouldCallCompletionHandlerWithDisconnectedResponseWhenNodeNotReady() STARTED

InterBrokerSendThreadTest > 
shouldCallCompletionHandlerWithDisconnectedResponseWhenNodeNotReady() PASSED

InterBrokerSendThreadTest > shouldNotSendAnythingWhenNoRequests() STARTED

InterBrokerSendThreadTest > shouldNotSendAnythingWhenNoRequests() PASSED

DefaultMessageFormatterTest > [1] name=print nothing, 
record=ConsumerRecord(topic = someTopic, partition = 9, leaderEpoch = null, 
offset = 9876, CreateTime = 1234, serialized key size = 0, serialized value 
size = 0, headers = RecordHeaders(headers = [RecordHeader(key = h1, value = 
[118, 49]), RecordHeader(key = h2, value = [118, 50])], isReadOnly = false), 
key = [B@3dabb17d, value = [B@e610e3), properties=Map(print.value -> false), 
expected= STARTED

DefaultMessageFormatterTest > [1] name=print nothing, 
record=ConsumerRecord(topic = someTopic, partition = 9, leaderEpoch = null, 
offset = 9876, CreateTime = 1234, serialized key size = 0, serialized value 
size = 0, headers = RecordHeaders(headers = [RecordHeader(key = h1, value = 
[118, 49]), RecordHeader(key = h2, value = [118, 50])], isReadOnly = false), 
key = [B@3dabb17d, value = [B@e610e3), properties=Map(print.value -> false), 
expected= PASSED

DefaultMessageFormatterTest > [2] name=print key, record=ConsumerRecord(topic = 
someTopic, partition = 9, leaderEpoch = null, offset = 9876, CreateTime = 1234, 
serialized key size = 0, serialized value size = 0, headers = 
RecordHeaders(headers = [RecordHeader(key = h1, value = [118, 49]), 
RecordHeader(key = h2

Re: Gradle error - aggregatedJavadoc depending on "compileJava"

2021-03-10 Thread Alexandre Dupriez
Hi Ismael,

Thanks for your quick response. I used the system-installed gradle.
Surprisingly, the same command works successfully as well on my laptop
but not on the target environment - both use the same version of
Gradle (6.8.3) and JDK (1.8), and start with an empty Gradle cache.
Note the error is given before any compilation actually happens, at
the "configure" stage of the build.

Thanks,
Alexandre

On Wed, Mar 10, 2021 at 20:58, Ismael Juma  wrote:
>
> Hi Alexandre,
>
> Did you use `./gradlew releaseTarGz` (you should never use the system
> installed gradle)? That works for me. Also `./gradlew clean releaseTarGz`
> works for me.
>
> Ismael
>
> On Wed, Mar 10, 2021 at 8:42 AM Ismael Juma  wrote:
>
> > Interesting, I didn't have the same problem. I'll try to reproduce.
> >
> > Ismael
> >
> > On Wed, Mar 10, 2021, 6:07 AM Alexandre Dupriez <
> > alexandre.dupr...@gmail.com> wrote:
> >
> >> Hi Community,
> >>
> >> I tried to build Kafka from trunk on my environment today (2021, March
> >> 10th) and it failed with the following Gradle error at the beginning
> >> of the build, while Gradle configures project from build.gradle:
> >>
> >>   "Could not get unknown property 'compileJava' for root project
> >> '' of type org.gradle.api.Project."
> >>
> >> The command used is "gradle releaseTarGz". Removing "dependsOn:
> >> compileJava" from the task "aggregatedJavadoc" (added on March 9th
> >> [1]) made the problem disappear - I wonder if anyone else encountered
> >> the same problem?
> >>
> >> [1] https://github.com/apache/kafka/pull/10272
> >>
> >> Many thanks,
> >> Alexandre
> >>
> >


Re: Gradle error - aggregatedJavadoc depending on "compileJava"

2021-03-10 Thread Ismael Juma
Hi Alexandre,

Did you use `./gradlew releaseTarGz` (you should never use the system
installed gradle)? That works for me. Also `./gradlew clean releaseTarGz`
works for me.

Ismael

On Wed, Mar 10, 2021 at 8:42 AM Ismael Juma  wrote:

> Interesting, I didn't have the same problem. I'll try to reproduce.
>
> Ismael
>
> On Wed, Mar 10, 2021, 6:07 AM Alexandre Dupriez <
> alexandre.dupr...@gmail.com> wrote:
>
>> Hi Community,
>>
>> I tried to build Kafka from trunk on my environment today (2021, March
>> 10th) and it failed with the following Gradle error at the beginning
>> of the build, while Gradle configures project from build.gradle:
>>
>>   "Could not get unknown property 'compileJava' for root project
>> '' of type org.gradle.api.Project."
>>
>> The command used is "gradle releaseTarGz". Removing "dependsOn:
>> compileJava" from the task "aggregatedJavadoc" (added on March 9th
>> [1]) made the problem disappear - I wonder if anyone else encountered
>> the same problem?
>>
>> [1] https://github.com/apache/kafka/pull/10272
>>
>> Many thanks,
>> Alexandre
>>
>


Re: [DISCUSS] Apache Kafka 3.0.0 release

2021-03-10 Thread Konstantine Karantasis
Thank you and hi again.

I just published a release plan at:
https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=177046466

I have included all the currently approved KIPs. I'm expecting this list to
grow as we approach KIP freeze.

The KIP freeze date for Apache Kafka 3.0 is June 9th, 2021.

Please let me know if you have any objections.

Regards,
Konstantine


On Wed, Feb 24, 2021 at 9:45 AM Chia-Ping Tsai  wrote:

> Thanks for taking over this hard job! +1
>
> On 2021/02/23 08:02:09, Konstantine Karantasis 
> wrote:
> > Hi all,
> >
> > Given that we seem to reach an agreement that the feature release after
> the
> > upcoming 2.8.0 will be 3.0.0, I'd like to volunteer to be the release
> > manager for the Apache Kafka 3.0.0 release.
> >
> > It's a major release, so I thought it'd be helpful to start planning a
> bit
> > in advance.
> >
> > If there are no objections, I'll start working on a release plan in the
> > next few days.
> >
> > Best,
> > Konstantine
> >
>


[jira] [Created] (KAFKA-12450) Remove deprecated methods from ReadOnlyWindowStore

2021-03-10 Thread Jorge Esteban Quilcate Otoya (Jira)
Jorge Esteban Quilcate Otoya created KAFKA-12450:


 Summary: Remove deprecated methods from ReadOnlyWindowStore
 Key: KAFKA-12450
 URL: https://issues.apache.org/jira/browse/KAFKA-12450
 Project: Kafka
  Issue Type: Sub-task
Reporter: Jorge Esteban Quilcate Otoya






--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (KAFKA-12451) Remove deprecation annotation on long-based read operations in WindowStore

2021-03-10 Thread Jorge Esteban Quilcate Otoya (Jira)
Jorge Esteban Quilcate Otoya created KAFKA-12451:


 Summary: Remove deprecation annotation on long-based read 
operations in WindowStore 
 Key: KAFKA-12451
 URL: https://issues.apache.org/jira/browse/KAFKA-12451
 Project: Kafka
  Issue Type: Sub-task
Reporter: Jorge Esteban Quilcate Otoya






--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (KAFKA-12449) Remove deprecated WindowStore#put

2021-03-10 Thread Jorge Esteban Quilcate Otoya (Jira)
Jorge Esteban Quilcate Otoya created KAFKA-12449:


 Summary: Remove deprecated WindowStore#put
 Key: KAFKA-12449
 URL: https://issues.apache.org/jira/browse/KAFKA-12449
 Project: Kafka
  Issue Type: Sub-task
Reporter: Jorge Esteban Quilcate Otoya






--
This message was sent by Atlassian Jira
(v8.3.4#803005)


Re: [VOTE] KIP-708: Rack awareness for Kafka Streams

2021-03-10 Thread Bruno Cadonna

Hi Levani,

Thanks for the KIP!

+1 (non-binding)

Best,
Bruno

On 10.03.21 12:13, Levani Kokhreidze wrote:

Hello all,

I’d like to start the voting on KIP-708 [1]

Best,
Levani

[1] - 
https://cwiki.apache.org/confluence/display/KAFKA/KIP-708%3A+Rack+awareness+for+Kafka+Streams



Re: Gradle error - aggregatedJavadoc depending on "compileJava"

2021-03-10 Thread Ismael Juma
Interesting, I didn't have the same problem. I'll try to reproduce.

Ismael

On Wed, Mar 10, 2021, 6:07 AM Alexandre Dupriez 
wrote:

> Hi Community,
>
> I tried to build Kafka from trunk in my environment today (2021, March
> 10th) and it failed with the following Gradle error at the beginning
> of the build, while Gradle configures the project from build.gradle:
>
>   "Could not get unknown property 'compileJava' for root project
> '' of type org.gradle.api.Project."
>
> The command used is "gradle releaseTarGz". Removing "dependsOn:
> compileJava" from the task "aggregatedJavadoc" (added on March 9th
> [1]) made the problem disappear. I wonder if anyone else has encountered
> the same problem?
>
> [1] https://github.com/apache/kafka/pull/10272
>
> Many thanks,
> Alexandre
>


[jira] [Created] (KAFKA-12448) STATE_DIR_CONFIG path cannot be found in windows due to usage of setPosixFilePermissions

2021-03-10 Thread Jira
Ahmet Yortanlı created KAFKA-12448:
--

 Summary: STATE_DIR_CONFIG path cannot be found in windows due to 
usage of setPosixFilePermissions
 Key: KAFKA-12448
 URL: https://issues.apache.org/jira/browse/KAFKA-12448
 Project: Kafka
  Issue Type: Bug
  Components: streams
Affects Versions: 2.6.1
 Environment: Windows 10, Intellij, Spring-Kafka.2.6.6
Reporter: Ahmet Yortanlı


STATE_DIR_CONFIG path cannot be found on Windows due to the usage of 
setPosixFilePermissions.

I am trying to develop a small application using the Spring-Kafka library. 
My application cannot be started on Windows because the file permission 
operations in the org.apache.kafka.streams.processor.internals.StateDirectory 
class assume that the underlying OS supports a POSIX file system. 
WindowsFileSystemProvider returns null when the getFileAttributeView method 
is called.

It is not a big problem; however, it blocks working on Windows machines.
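
A minimal sketch of a portable alternative (this is not the actual StateDirectory code; the class and method names below are made up for illustration): probe whether the default file system supports the POSIX attribute view and only then call setPosixFilePermissions, otherwise fall back to the java.io.File permission flags.

import java.io.IOException;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.attribute.PosixFilePermissions;

public class PortableStateDirSetup {

    static void prepareStateDir(Path stateDir) throws IOException {
        Files.createDirectories(stateDir);

        // NTFS exposes no POSIX view, so setPosixFilePermissions would throw
        // UnsupportedOperationException there; check for support first.
        boolean posixSupported = FileSystems.getDefault()
                .supportedFileAttributeViews()
                .contains("posix");

        if (posixSupported) {
            Files.setPosixFilePermissions(stateDir,
                    PosixFilePermissions.fromString("rwxr-x---"));
        } else {
            // Best-effort fallback for Windows using owner-only flags.
            stateDir.toFile().setReadable(true, true);
            stateDir.toFile().setWritable(true, true);
            stateDir.toFile().setExecutable(true, true);
        }
    }

    public static void main(String[] args) throws IOException {
        prepareStateDir(Paths.get(args.length > 0 ? args[0] : "kafka-streams-state"));
    }
}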



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


Gradle error - aggregatedJavadoc depending on "compileJava"

2021-03-10 Thread Alexandre Dupriez
Hi Community,

I tried to build Kafka from trunk in my environment today (2021, March
10th) and it failed with the following Gradle error at the beginning
of the build, while Gradle configures the project from build.gradle:

  "Could not get unknown property 'compileJava' for root project
'' of type org.gradle.api.Project."

The command used is "gradle releaseTarGz". Removing "dependsOn:
compileJava" from the task "aggregatedJavadoc" (added on March 9th
[1]) made the problem disappear. I wonder if anyone else has encountered
the same problem?

[1] https://github.com/apache/kafka/pull/10272

Many thanks,
Alexandre


Re: Re: [DISCUSS] KIP-706: Add method "Producer#produce" to return CompletionStage instead of Future

2021-03-10 Thread Chia-Ping Tsai
Making noise again. I hope this KIP can join the 3.0 party :)

On 2021/01/31 05:39:17, Chia-Ping Tsai  wrote: 
> It seems to me changing the input type might complicate the migration 
> from the deprecated send method to the new API.
> 
> Personally, I prefer to introduce an interface called “SendRecord” to replace 
> ProducerRecord. Hence, the new API/classes are shown below.
> 
> 1. CompletionStage send(SendRecord)
> 2. class ProducerRecord implements SendRecord
> 3. Introduce builder pattern for SendRecord
> 
> That brings the following benefits.
> 
> 1. Kafka users who use neither the return type nor the callback do not need to 
> change code even if we remove the deprecated send methods. (Of course, they 
> still need to compile their code against the new Kafka.)
> 
> 2. Kafka users who need a Future can easily migrate to the new API by regex 
> replacement. (Cast ProducerRecord to SendRecord and add toCompletableFuture.)
> 
> 3. It is easy to support topic ids in the future. We can add new methods to 
> the SendRecord builder. For example:
> 
> Builder topicName(String)
> Builder topicId(UUID)
> 
> 4. The builder pattern can make code more readable, especially since 
> ProducerRecord has a lot of fields which can be defined by users.
> —
> Chia-Ping
> 
> On 2021/01/30 22:50:36 Ismael Juma wrote:
> > Another thing to think about: the consumer api currently has
> > `subscribe(String|Pattern)` and a number of methods that accept
> > `TopicPartition`. A similar approach could be used for the Consumer to work
> > with topic ids or topic names. The consumer side also has to support
> > regexes so it probably makes sense to have a separate interface.
> > 
> > Ismael
> > 
> > On Sat, Jan 30, 2021 at 2:40 PM Ismael Juma  wrote:
> > 
> > > I think this is a promising idea. I'd personally avoid the overload and
> > > simply have a `Topic` type that implements `SendTarget`. It's a mix of 
> > > both
> > > proposals: strongly typed, no overloads and general class names that
> > > implement `SendTarget`.
> > >
> > > Ismael
> > >
> > > On Sat, Jan 30, 2021 at 2:22 PM Jason Gustafson 
> > > wrote:
> > >
> > >> Giving this a little more thought, I imagine sending to a topic is the
> > >> most
> > >> common case, so maybe it's an overload worth having. Also, if 
> > >> `SendTarget`
> > >> is just a marker interface, we could let `TopicPartition` implement it
> > >> directly. Then we have:
> > >>
> > >> interface SendTarget;
> > >> class TopicPartition implements SendTarget;
> > >>
> > >> CompletionStage send(String topic, Record record);
> > >> CompletionStage send(SendTarget target, Record record);
> > >>
> > >> The `SendTarget` would give us a lot of flexibility in the future. It
> > >> would
> > >> give us a couple options for topic ids. We could either have an overload
> > >> of
> > >> `send` which accepts `Uuid`, or we could add a `TopicId` type which
> > >> implements `SendTarget`.
> > >>
> > >> -Jason
> > >>
> > >>
> > >> On Sat, Jan 30, 2021 at 1:11 PM Jason Gustafson 
> > >> wrote:
> > >>
> > >> > Yeah, good question. I guess we always tend to regret using lower-level
> > >> > types in these APIs. Perhaps there should be some kind of interface:
> > >> >
> > >> > interface SendTarget
> > >> > class TopicIdTarget implements SendTarget
> > >> > class TopicTarget implements SendTarget
> > >> > class TopicPartitionTarget implements SendTarget
> > >> >
> > >> > Then we just have:
> > >> >
> > >> > CompletionStage send(SendTarget target, Record record);
> > >> >
> > >> > Not sure if we could reuse `Record` in the consumer though. We do have
> > >> > some state in `ConsumerRecord` which is not present in `ProducerRecord`
> > >> > (e.g. offset). Perhaps we could provide a `Record` view from
> > >> > `ConsumerRecord` for convenience. That would be useful for use cases
> > >> which
> > >> > involve reading from one topic and writing to another.
> > >> >
> > >> > -Jason
> > >> >
> > >> > On Sat, Jan 30, 2021 at 12:29 PM Ismael Juma  wrote:
> > >> >
> > >> >> Interesting idea. A couple of things to consider:
> > >> >>
> > >> >> 1. Would we introduce the Message concept to the Consumer too? I think
> > >> >> that's what .NET does.
> > >> >> 2. If we eventually allow a send to a topic id instead of topic name,
> > >> >> would
> > >> >> that result in two additional overloads?
> > >> >>
> > >> >> Ismael
> > >> >>
> > >> >> On Sat, Jan 30, 2021 at 11:38 AM Jason Gustafson 
> > >> >> wrote:
> > >> >>
> > >> >> > For the sake of having another option to shoot down, we could take a
> > >> >> page
> > >> >> > from the .net client and separate the message data from the
> > >> destination
> > >> >> > (i.e. topic or partition). This would get around the need to use a
> > >> new
> > >> >> > verb. For example:
> > >> >> >
> > >> >> > CompletionStage send(String topic, Message message);
> > >> >> > CompletionStage send(TopicPartition topicPartition,
> > >> >> Message
> > >> >> > message);
> > >> >> >
> > >> >> > -Jason
> > >> >> >
> > >> >> >
> > >> >> >
> > >> >> > On Sat, Jan 30, 2021 at 11:
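
For readers skimming this thread, here is a minimal, self-contained Java sketch of the API shapes floated above. Every name except RecordMetadata is illustrative (SendTarget, Topic, TopicId, SendRecord, AsyncSender are not committed API); the sketch only assumes the existing kafka-clients RecordMetadata class for the return type.

import java.util.UUID;
import java.util.concurrent.CompletionStage;

import org.apache.kafka.clients.producer.RecordMetadata;

// Marker interface describing "where to send"; in the real code base
// TopicPartition could implement it directly, as suggested in the thread.
interface SendTarget { }

final class Topic implements SendTarget {
    final String name;
    Topic(String name) { this.name = name; }
}

final class TopicId implements SendTarget {
    final UUID id;
    TopicId(UUID id) { this.id = id; }
}

// Builder-style record so that new fields (for example a topic id) can be
// added later without introducing more send() overloads.
final class SendRecord {
    final byte[] key;
    final byte[] value;

    private SendRecord(byte[] key, byte[] value) {
        this.key = key;
        this.value = value;
    }

    static Builder newBuilder() { return new Builder(); }

    static final class Builder {
        private byte[] key;
        private byte[] value;

        Builder key(byte[] key) { this.key = key; return this; }
        Builder value(byte[] value) { this.value = value; return this; }
        SendRecord build() { return new SendRecord(key, value); }
    }
}

// The CompletionStage-returning send() discussed in KIP-706, combined with the
// SendTarget overload floated above.
interface AsyncSender {
    CompletionStage<RecordMetadata> send(SendTarget target, SendRecord record);
}

A caller that still needs a Future could then write something like asyncSender.send(new Topic("events"), SendRecord.newBuilder().value(payload).build()).toCompletableFuture(), which matches the migration path sketched earlier in the thread (asyncSender and payload are placeholder names).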

[VOTE] KIP-708: Rack awareness for Kafka Streams

2021-03-10 Thread Levani Kokhreidze
Hello all,

I’d like to start the voting on KIP-708 [1]

Best,
Levani

[1] - 
https://cwiki.apache.org/confluence/display/KAFKA/KIP-708%3A+Rack+awareness+for+Kafka+Streams



[jira] [Resolved] (KAFKA-12447) placeholder

2021-03-10 Thread Ben Ellis (Jira)


 [ 
https://issues.apache.org/jira/browse/KAFKA-12447?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ben Ellis resolved KAFKA-12447.
---
Resolution: Fixed

> placeholder
> ---
>
> Key: KAFKA-12447
> URL: https://issues.apache.org/jira/browse/KAFKA-12447
> Project: Kafka
>  Issue Type: Improvement
>  Components: streams
>Reporter: Ben Ellis
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (KAFKA-12447) placeholder

2021-03-10 Thread Ben Ellis (Jira)
Ben Ellis created KAFKA-12447:
-

 Summary: placeholder
 Key: KAFKA-12447
 URL: https://issues.apache.org/jira/browse/KAFKA-12447
 Project: Kafka
  Issue Type: Improvement
  Components: streams
Reporter: Ben Ellis






--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (KAFKA-12446) Make

2021-03-10 Thread Ben Ellis (Jira)
Ben Ellis created KAFKA-12446:
-

 Summary: Make 
 Key: KAFKA-12446
 URL: https://issues.apache.org/jira/browse/KAFKA-12446
 Project: Kafka
  Issue Type: Improvement
  Components: streams
Reporter: Ben Ellis


Currently, when an update is processed by KGroupedTable#aggregate, the 
subtractor is called first, then the adder. But per the docs, the order of 
execution is not defined (i.e., it could change in future releases).

[https://kafka.apache.org/26/documentation/streams/developer-guide/dsl-api.html#streams-developer-guide-dsl-aggregating]
{quote}When subsequent non-null values are received for a key (e.g., UPDATE), 
then (1) the subtractor is called with the old value as stored in the table and 
(2) the adder is called with the new value of the input record that was just 
received. The order of execution for the subtractor and adder is not defined.
{quote}
This ticket proposes making the current order of execution part of the public 
contract.

That would allow Kafka Streams DSL users the freedom to use aggregates such as: 

aggregate(
  HashMap::new,
  (aggKey, newValue, aggValue) -> { // adder
    aggValue.put(newValue.getKey(), newValue.getValue());
    return aggValue;
  },
  (aggKey, oldValue, aggValue) -> { // subtractor
    aggValue.remove(oldValue.getKey());
    return aggValue;
  }
)

and handle updates where the key remains the same but the value changes.
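
To make the ordering concern concrete, here is a small standalone Java illustration (no Kafka dependency; the sample values and the mapKey helper are made up). It mimics what the aggregate does on an update when the old and new values land on the same map entry: with the subtractor first, the entry ends up holding the new value, while the reverse order would delete it entirely.

import java.util.HashMap;
import java.util.Map;

public class AdderSubtractorOrder {

    public static void main(String[] args) {
        Map<String, String> aggregate = new HashMap<>();

        String oldValue = "user-1:free";
        String newValue = "user-1:premium";

        // Initial insert of the old value.
        aggregate.put(mapKey(oldValue), oldValue);

        // Update, subtractor first (the order Kafka Streams has used since 0.10.0).
        aggregate.remove(mapKey(oldValue));        // subtractor
        aggregate.put(mapKey(newValue), newValue); // adder
        System.out.println("subtractor then adder: " + aggregate); // {user-1=user-1:premium}

        // The reverse order removes the entry entirely, because the old and new
        // values share the same map key.
        aggregate.put(mapKey(newValue), newValue); // adder
        aggregate.remove(mapKey(oldValue));        // subtractor
        System.out.println("adder then subtractor: " + aggregate); // {}
    }

    // Made-up helper: derives the map key from the value, like the getKey()
    // calls in the example above.
    private static String mapKey(String value) {
        return value.split(":")[0];
    }
}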

The Kafka Music Example at

[https://github.com/confluentinc/kafka-streams-examples/blob/6.0.1-post/src/main/java/io/confluent/examples/streams/interactivequeries/kafkamusic/KafkaMusicExample.java#L345]

relies on the subtractor being called first.

 

See discussion at 
[https://github.com/confluentinc/kafka-streams-examples/issues/380]

See also the more general point made at 
[https://stackoverflow.com/questions/65888756/clarify-the-order-of-execution-for-the-subtractor-and-adder-is-not-defined]
 
{quote}If the adder and subtractor are non-commutative operations and the order 
in which they are executed can vary, you can end up with different results 
depending on the order of execution of adder and subtractor. An example of a 
useful non-commutative operation would be something like if we’re aggregating 
records into a Set:
{quote}
 

.aggregate[Set[Animal]](Set.empty)(
  adder = (zooKey, animalValue, setOfAnimals) => setOfAnimals + animalValue,
  subtractor = (zooKey, animalValue, setOfAnimals) => setOfAnimals - animalValue
)
{quote}In this example, for duplicated events, if the adder is called before 
the subtractor you would end up removing the value entirely from the set (which 
would be problematic for most use-cases I imagine).
{quote}
As [~mjsax] notes on 
[https://github.com/confluentinc/kafka-streams-examples/issues/380]

 
{quote}the implementation used the same order since 0.10.0 release and it was 
never changed
{quote}
so making this behavior part of the standard amounts to making official what 
has already been stable for a long time.

Cost:
 *  Limits your options for the future. If you ever needed Kafka Streams to 
change the order of execution (or make that order indeterminate instead of its 
current hard coded order), you would have to make that a breaking change.

Benefit:
 * Encourages wider use of the KGroupedTable#aggregate method (current lack of 
a defined order prevents using aggregate with non-commutative adder/subtractor 
functions)
 * Simplifies reasoning about how to use KGroupedTable#aggregate (knowing that 
a given order can be relied upon makes the method itself easier to understand)

 

 

 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


Build failed in Jenkins: Kafka » kafka-trunk-jdk15 #612

2021-03-10 Thread Apache Jenkins Server
See 


Changes:

[github] MINOR: Remove unused variables, methods, parameters, unthrown 
exceptions, and fix typos (#9457)

[github] MINOR: Add entityType for metadata record definitions (#10116)


--
[...truncated 3.69 MB...]

AclAuthorizerTest > testChangeListenerTiming() PASSED

AclAuthorizerTest > 
testWritesLiteralWritesLiteralAclChangeEventWhenInterBrokerProtocolLessThanKafkaV2eralAclChangesForOlderProtocolVersions()
 STARTED

AclAuthorizerTest > 
testWritesLiteralWritesLiteralAclChangeEventWhenInterBrokerProtocolLessThanKafkaV2eralAclChangesForOlderProtocolVersions()
 PASSED

AclAuthorizerTest > testAuthorzeByResourceTypeSuperUserHasAccess() STARTED

AclAuthorizerTest > testAuthorzeByResourceTypeSuperUserHasAccess() PASSED

AclAuthorizerTest > testAuthorizeByResourceTypePrefixedResourceDenyDominate() 
STARTED

AclAuthorizerTest > testAuthorizeByResourceTypePrefixedResourceDenyDominate() 
PASSED

AclAuthorizerTest > testAuthorizeByResourceTypeMultipleAddAndRemove() STARTED

AclAuthorizerTest > testAuthorizeByResourceTypeMultipleAddAndRemove() PASSED

AclAuthorizerTest > 
testThrowsOnAddPrefixedAclIfInterBrokerProtocolVersionTooLow() STARTED

AclAuthorizerTest > 
testThrowsOnAddPrefixedAclIfInterBrokerProtocolVersionTooLow() PASSED

AclAuthorizerTest > testAccessAllowedIfAllowAclExistsOnPrefixedResource() 
STARTED

AclAuthorizerTest > testAccessAllowedIfAllowAclExistsOnPrefixedResource() PASSED

AclAuthorizerTest > testAuthorizeByResourceTypeDenyTakesPrecedence() STARTED

AclAuthorizerTest > testAuthorizeByResourceTypeDenyTakesPrecedence() PASSED

AclAuthorizerTest > testHighConcurrencyModificationOfResourceAcls() STARTED

AclAuthorizerTest > testHighConcurrencyModificationOfResourceAcls() PASSED

AclAuthorizerTest > testAuthorizeByResourceTypeWithAllPrincipalAce() STARTED

AclAuthorizerTest > testAuthorizeByResourceTypeWithAllPrincipalAce() PASSED

AclAuthorizerTest > testAuthorizeWithEmptyResourceName() STARTED

AclAuthorizerTest > testAuthorizeWithEmptyResourceName() PASSED

AclAuthorizerTest > testAuthorizeThrowsOnNonLiteralResource() STARTED

AclAuthorizerTest > testAuthorizeThrowsOnNonLiteralResource() PASSED

AclAuthorizerTest > testDeleteAllAclOnPrefixedResource() STARTED

AclAuthorizerTest > testDeleteAllAclOnPrefixedResource() PASSED

AclAuthorizerTest > testAddAclsOnLiteralResource() STARTED

AclAuthorizerTest > testAddAclsOnLiteralResource() PASSED

AclAuthorizerTest > testGetAclsPrincipal() STARTED

AclAuthorizerTest > testGetAclsPrincipal() PASSED

AclAuthorizerTest > 
testWritesExtendedAclChangeEventIfInterBrokerProtocolNotSet() STARTED

AclAuthorizerTest > 
testWritesExtendedAclChangeEventIfInterBrokerProtocolNotSet() PASSED

AclAuthorizerTest > testAccessAllowedIfAllowAclExistsOnWildcardResource() 
STARTED

AclAuthorizerTest > testAccessAllowedIfAllowAclExistsOnWildcardResource() PASSED

AclAuthorizerTest > testLoadCache() STARTED

AclAuthorizerTest > testLoadCache() PASSED

AuthorizerInterfaceDefaultTest > testAuthorizeByResourceTypeWithAllHostAce() 
STARTED

AuthorizerInterfaceDefaultTest > testAuthorizeByResourceTypeWithAllHostAce() 
PASSED

AuthorizerInterfaceDefaultTest > 
testAuthorizeByResourceTypeIsolationUnrelatedDenyWontDominateAllow() STARTED

AuthorizerInterfaceDefaultTest > 
testAuthorizeByResourceTypeIsolationUnrelatedDenyWontDominateAllow() PASSED

AuthorizerInterfaceDefaultTest > 
testAuthorizeByResourceTypeWildcardResourceDenyDominate() STARTED

AuthorizerInterfaceDefaultTest > 
testAuthorizeByResourceTypeWildcardResourceDenyDominate() PASSED

AuthorizerInterfaceDefaultTest > 
testAuthorizeByResourceTypeWithAllOperationAce() STARTED

AuthorizerInterfaceDefaultTest > 
testAuthorizeByResourceTypeWithAllOperationAce() PASSED

AuthorizerInterfaceDefaultTest > testAuthorzeByResourceTypeSuperUserHasAccess() 
STARTED

AuthorizerInterfaceDefaultTest > testAuthorzeByResourceTypeSuperUserHasAccess() 
PASSED

AuthorizerInterfaceDefaultTest > 
testAuthorizeByResourceTypePrefixedResourceDenyDominate() STARTED

AuthorizerInterfaceDefaultTest > 
testAuthorizeByResourceTypePrefixedResourceDenyDominate() PASSED

AuthorizerInterfaceDefaultTest > 
testAuthorizeByResourceTypeMultipleAddAndRemove() STARTED

AuthorizerInterfaceDefaultTest > 
testAuthorizeByResourceTypeMultipleAddAndRemove() PASSED

AuthorizerInterfaceDefaultTest > 
testAuthorizeByResourceTypeDenyTakesPrecedence() STARTED

AuthorizerInterfaceDefaultTest > 
testAuthorizeByResourceTypeDenyTakesPrecedence() PASSED

AuthorizerInterfaceDefaultTest > 
testAuthorizeByResourceTypeWithAllPrincipalAce() STARTED

AuthorizerInterfaceDefaultTest > 
testAuthorizeByResourceTypeWithAllPrincipalAce() PASSED

AclEntryTest > testAclJsonConversion() STARTED

AclEntryTest > testAclJsonConversion() PASSED

AuthorizerWrapperTest > 
testAuthorizeByResourceTypeDisableAllowEveryoneOverride() STARTED

AuthorizerWrapp

[VOTE] KIP-717: Deprecate batch-size config from console producer

2021-03-10 Thread Kamal Chandraprakash
Hi,

I'd like to start a vote on KIP-717 to remove batch-size config from the
console producer.

https://cwiki.apache.org/confluence/x/DB1RCg

Thanks,
Kamal