[ https://issues.apache.org/jira/browse/SPARK-1209?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14169835#comment-14169835 ]
Sean Owen commented on SPARK-1209:
----------------------------------

Yes, I wonder too: do SparkHadoopMapRedUtil and SparkHadoopMapReduceUtil need to live in {{org.apache.hadoop}} anymore? I assume they may have in the past, to access some package-private Hadoop code. But I've tried moving them under {{org.apache.spark}} and compiling against a few Hadoop versions, and it all seems fine. Am I missing something, or is this worth changing? It's private to Spark (well, org.apache right now by necessity), so I think it's fair game to move. See https://github.com/srowen/spark/tree/SPARK-1209

> SparkHadoopUtil should not use package org.apache.hadoop
> --------------------------------------------------------
>
>                 Key: SPARK-1209
>                 URL: https://issues.apache.org/jira/browse/SPARK-1209
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 0.9.0
>            Reporter: Sandy Pérez González
>            Assignee: Mark Grover
>
> It's private, so the change won't break compatibility
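For anyone following along, here is a minimal, self-contained sketch of the visibility rule behind the original package choice (the packages and class names below are hypothetical, not actual Hadoop or Spark code): a package-private member is only reachable from code compiled into the same package, so a helper that needed such a member had to sit under {{org.apache.hadoop}}. Once the helper touches only public Hadoop API, it can live under {{org.apache.spark}}, which is what recompiling against several Hadoop versions is meant to confirm.

{code:scala}
// Illustrative only: hypothetical packages and class names, not real Hadoop/Spark code.

// A stand-in for Hadoop code that exposes a package-private member.
package org.apache.hadoop.example {
  private[example] class InternalHelper {
    // Visible only to code compiled into org.apache.hadoop.example.
    def internalValue: String = "package-private detail"
  }

  // A utility declared in the same package may use it directly.
  object UtilInHadoopPackage {
    def read(): String = new InternalHelper().internalValue
  }
}

// The same utility relocated under org.apache.spark must stick to public API.
package org.apache.spark.example {
  object UtilInSparkPackage {
    // new org.apache.hadoop.example.InternalHelper()  // would not compile from here
    def read(publicValue: String): String = publicValue
  }
}
{code}

If the real utilities compile cleanly from a Spark-side package like this against multiple Hadoop versions, that suggests they no longer depend on anything package-private and the move is safe.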