[ https://issues.apache.org/jira/browse/DATAFU-148?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16642885#comment-16642885 ]
Eyal Allweil commented on DATAFU-148:
-------------------------------------

I pushed the suggested changes into the Gradle build. However, after looking at a few other examples (spark-testing-base, spark-daria), it seems that Spark libraries are typically made available in a number of different versions - in some cases a cross product of Scala and Spark versions. Which versions do we want to make available for our package? I would forego Spark 1.x, for example.

I don't actually know how to make a Gradle build that compiles multiple artifacts from the same source code in an elegant way, but I'll see if I can figure it out.

> Setup Spark sub-project
> -----------------------
>
>                 Key: DATAFU-148
>                 URL: https://issues.apache.org/jira/browse/DATAFU-148
>             Project: DataFu
>          Issue Type: New Feature
>            Reporter: Eyal Allweil
>            Assignee: Eyal Allweil
>            Priority: Major
>
> Create a skeleton Spark sub-project for Spark code to be contributed to DataFu

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
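[Editor's note] One common Gradle pattern for cross-building against several Scala/Spark versions is to parameterize the build with project properties and run it once per combination, rather than producing all artifacts in a single invocation. The sketch below is illustrative only, not DataFu's actual build; the default version numbers and the `datafu-spark` artifact name are assumptions.

```groovy
// Hypothetical build.gradle fragment: select Scala/Spark versions via -P properties,
// e.g.  ./gradlew build -PscalaVersion=2.11 -PsparkVersion=2.3.0
// Each invocation produces one artifact, named with the Scala binary suffix.
def scalaVersion = project.hasProperty('scalaVersion') ? project.scalaVersion : '2.11'
def sparkVersion = project.hasProperty('sparkVersion') ? project.sparkVersion : '2.3.0'

// Cross-built Spark libraries conventionally encode the Scala version in the
// artifact name, e.g. datafu-spark_2.11-x.y.z.jar (artifact name is assumed here).
archivesBaseName = "datafu-spark_${scalaVersion}"

dependencies {
    // compileOnly: Spark is provided by the cluster at runtime, not bundled.
    compileOnly "org.apache.spark:spark-sql_${scalaVersion}:${sparkVersion}"
}
```

A CI script (or a small wrapper task) can then loop over the desired Scala x Spark combinations and invoke the build once per pair, which sidesteps the need for multiple source sets while still publishing a full version matrix.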