[ https://issues.apache.org/jira/browse/SPARK-6320?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14368683#comment-14368683 ]
Santiago M. Mola edited comment on SPARK-6320 at 3/19/15 8:12 AM:
------------------------------------------------------------------

[~marmbrus] We could change strategies so that they take a SparkPlanner in their constructor. This should provide enough flexibility for [~H.Youssef]'s use case and might improve the code organization of the core strategies in the future.

> Adding new query plan strategy to SQLContext
> --------------------------------------------
>
>                 Key: SPARK-6320
>                 URL: https://issues.apache.org/jira/browse/SPARK-6320
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.3.0
>            Reporter: Youssef Hatem
>            Priority: Minor
>
> Hi,
> I would like to add a new strategy to {{SQLContext}}. To do this, I created a new class which extends {{Strategy}}. In my new class I need to call the {{planLater}} function. However, this method is defined in {{SparkPlanner}} (which itself inherits it from {{QueryPlanner}}).
> To my knowledge, the only way to make {{planLater}} visible to my new strategy is to define the strategy inside another class that extends {{SparkPlanner}} and thereby inherits {{planLater}}. By doing so, I also have to extend {{SQLContext}} so that I can override the {{planner}} field with the new planner class I created.
> This seems to be a design problem, because adding a new strategy requires extending {{SQLContext}} (unless I am doing it wrong and there is a better way to do it).
> Thanks a lot,
> Youssef

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
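The visibility problem described in the issue, and the subclassing workaround it forces, can be illustrated with a small, Spark-free sketch. The names ({{QueryPlanner}}, {{SparkPlanner}}, {{Strategy}}, {{planLater}}) mirror Spark's, but the bodies are simplified stand-ins, not the real 1.3.0 API: the real {{planLater}} defers planning of a subtree, while here it just tries each strategy in order. {{MyPlanner}}, {{MyStrategy}}, and {{MySpecialExec}} are hypothetical names for the user's additions.

```scala
// Simplified stand-ins for Spark's plan nodes: real plans are operator
// trees, reduced here to a single name.
case class LogicalPlan(name: String)
case class SparkPlan(name: String)

abstract class QueryPlanner {
  // A strategy maps a logical plan to zero or more physical plans;
  // an empty result means "this strategy does not apply".
  abstract class Strategy {
    def apply(plan: LogicalPlan): Seq[SparkPlan]
  }

  def strategies: Seq[Strategy]

  // planLater is protected, so it is visible only inside QueryPlanner
  // subclasses -- and therefore only to strategies nested within them.
  // This is exactly the coupling the issue complains about.
  protected def planLater(plan: LogicalPlan): SparkPlan =
    strategies.iterator.flatMap(s => s(plan)).next()

  def plan(p: LogicalPlan): SparkPlan = planLater(p)
}

class SparkPlanner extends QueryPlanner {
  object BasicOperators extends Strategy {
    def apply(plan: LogicalPlan): Seq[SparkPlan] = Seq(SparkPlan(plan.name))
  }
  def strategies: Seq[Strategy] = Seq(BasicOperators)
}

// The workaround from the issue: a new planner subclass, whose nested
// strategy can see planLater. Installing it in Spark then means extending
// SQLContext to override its planner field (omitted in this sketch).
class MyPlanner extends SparkPlanner {
  object MyStrategy extends Strategy {
    def apply(plan: LogicalPlan): Seq[SparkPlan] = plan match {
      case LogicalPlan("special") => Seq(SparkPlan("MySpecialExec"))
      case _                      => Nil
    }
  }
  override def strategies: Seq[Strategy] = MyStrategy +: super.strategies
}
```

Under the change proposed in the comment, a strategy would instead receive the SparkPlanner in its constructor, so it could recurse through the planner without having to be nested inside a planner subclass.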