I am trying to use Confluent Platform 3.3.0 and the S3 connector, and I am getting a StackOverflowError:
java.lang.StackOverflowError
at java.util.HashMap.hash(HashMap.java:338)
at java.util.LinkedHashMap.get(LinkedHashMap.java:440)
at org.apache.avro.JsonProperties.getJsonProp(JsonProperties.java:141)
at org.apache.avro.JsonProperties.getProp(JsonProperties.java:130)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1258)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1239)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1348)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1381)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1359)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1239)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1348)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1381)
at io.confluent.connect.avro.AvroData.toConnectSchema(AvroData.java:1359)
The schema structure (Avro IDL) is:
@namespace("com.test.avro")
protocol TestError {
  record TestError {
    union { null, string } type = null;
    union { null, array<TestError> } errors = null;
  }
}
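
For reference, here is the same record as a plain .avsc JSON schema (my own hand translation, so treat the exact JSON as an assumption). It makes the self-reference explicit: the items of the errors array point back to the TestError record by full name.

{
  "type": "record",
  "name": "TestError",
  "namespace": "com.test.avro",
  "fields": [
    { "name": "type", "type": ["null", "string"], "default": null },
    { "name": "errors",
      "type": ["null", { "type": "array", "items": "com.test.avro.TestError" }],
      "default": null }
  ]
}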
Doing some debugging, it looks like the recursive method toConnectSchema does not break the recursion for schemas that reference themselves; the trace just cycles through the same toConnectSchema frames until the stack is exhausted.
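
Below is a minimal standalone sketch of what I believe reproduces the overflow, assuming the public AvroData(int cacheSize) constructor and toConnectSchema(org.apache.avro.Schema) method from kafka-connect-avro-converter 3.3.0 (I have not run this exact snippet):

import org.apache.avro.Schema;
import io.confluent.connect.avro.AvroData;

public class RecursiveSchemaRepro {
    public static void main(String[] args) {
        // Self-referential schema: TestError.errors is an array of TestError.
        String avsc = "{"
            + "\"type\":\"record\",\"name\":\"TestError\",\"namespace\":\"com.test.avro\","
            + "\"fields\":["
            + "{\"name\":\"type\",\"type\":[\"null\",\"string\"],\"default\":null},"
            + "{\"name\":\"errors\",\"type\":[\"null\","
            + "{\"type\":\"array\",\"items\":\"com.test.avro.TestError\"}],\"default\":null}"
            + "]}";
        Schema avroSchema = new Schema.Parser().parse(avsc);

        // toConnectSchema recurses into the array's element type, which is the
        // record itself, so the conversion never terminates and overflows the stack.
        AvroData avroData = new AvroData(16); // 16 = schema cache size
        avroData.toConnectSchema(avroSchema);
    }
}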
Is this a known issue, and is there a workaround to bypass it?
Thanks,
Catalin