rdblue commented on a change in pull request #1096:
URL: https://github.com/apache/iceberg/pull/1096#discussion_r438271764



##########
File path: build.gradle
##########
@@ -235,6 +235,38 @@ project(':iceberg-data') {
   }
 }
 
+project(':iceberg-flink') {
+  apply plugin: 'scala'
+
+  dependencies {
+    compile project(':iceberg-api')
+    compile project(':iceberg-common')
+    compile project(':iceberg-core')
+    compile project(':iceberg-data')
+    compile project(':iceberg-orc')
+    compile project(':iceberg-parquet')
+    compile project(':iceberg-arrow')
+    compile "org.apache.flink:flink-streaming-java_2.11::tests"

Review comment:
       I didn't intend to ask you to add another module to the build. Sorry 
about not being clear.
   
In many cases, compiled binaries can work with both 2.11 and 2.12. We maintain
two internal versions of Spark this way, although Spark defines the catalog and
table APIs in Java, so we have less to worry about. Ideally, we would have one
module that works with both Scala versions, along with a plan to test that.
Otherwise, it's probably a good idea to move this code into Flink as soon as
possible, because it makes more sense to maintain only source compatibility
there.
   
    For now, I think we should consider options for testing this against both
2.11 and 2.12. Maybe we can add a build property to switch between the two and
test both in CI. Until then, I think we should build only for 2.12: Scala 2.11
hasn't been supported since 2017, so it makes sense for new development to
target 2.12.
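    As a rough sketch of the build-property idea (hypothetical property name
`scalaVersion`; not part of this PR), the module could pick the Scala binary
version at configuration time and interpolate it into the dependency
coordinates:

    ```groovy
    // Hypothetical sketch: select the Scala binary version via a Gradle
    // project property, e.g. ./gradlew build -PscalaVersion=2.11,
    // defaulting to 2.12 when the property is not set.
    project(':iceberg-flink') {
      def scalaVersion = project.hasProperty('scalaVersion') ? project.property('scalaVersion') : '2.12'

      dependencies {
        // Same coordinates as in this diff, but with the suffix parameterized
        // instead of hard-coding _2.11.
        compile "org.apache.flink:flink-streaming-java_${scalaVersion}::tests"
      }
    }
    ```

    CI could then run the build twice, once per property value, to cover both
Scala versions with one module.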
   
   




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


