Thank you for your reply,
    I understand your explanation, but I wonder what the correct usage of the API `new SparkContext(config: SparkConf, preferredNodeLocationData: Map[String, Set[SplitInfo]])` is. How do I construct the second parameter, preferredNodeLocationData? Hope for your reply!


qinwei
From: Shao, Saisai
Sent: 2014-09-28 14:42
To: qinwei
Cc: user
Subject: RE: problem with data locality api
Hi,

The first conf is used by Hadoop to determine the locality distribution of the HDFS file. The second conf is used by Spark; although the two have the same name in the example, they are two different classes.

Thanks,
Jerry
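To make the two confs explicit, here is a minimal sketch of how the pieces fit together, assuming a Spark 1.x build on YARN (this SparkContext constructor was removed in later releases, and the exact InputFormatInfo signature has varied across versions, so treat this as illustrative rather than definitive):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.hadoop.mapred.TextInputFormat
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.scheduler.InputFormatInfo

// Hadoop-side conf: used only to look up the HDFS block/split locations.
val hadoopConf = new Configuration()

// Spark-side conf: configures the SparkContext itself.
val sparkConf = new SparkConf().setAppName("locality-demo")

// Build the Map[String, Set[SplitInfo]] (keyed by host) from the input splits.
val locData = InputFormatInfo.computePreferredLocations(
  Seq(new InputFormatInfo(hadoopConf, classOf[TextInputFormat], new Path("myfile.txt"))))

// Pass the precomputed locality data as the second parameter.
val sc = new SparkContext(sparkConf, locData)
```

The point is that the `conf` in `InputFormatInfo` and the `conf` in `SparkContext` are unrelated objects; giving them distinct names makes the example unambiguous.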


From: qinwei [mailto:wei....@dewmobile.net]


Sent: Sunday, September 28, 2014 2:05 PM

To: user

Subject: problem with data locality api



Hi, everyone


    I have come across a problem with data locality. I found this example code in "Spark-on-YARN-A-Deep-Dive-Sandy-Ryza.pdf":


    val locData = InputFormatInfo.computePreferredLocations(
      Seq(new InputFormatInfo(conf, classOf[TextInputFormat], new Path("myfile.txt"))))

    val sc = new SparkContext(conf, locData)

    But I found that the two confs above are of different types: the conf in the first line is of type org.apache.hadoop.conf.Configuration, and the conf in the second line is of type SparkConf. Can anyone explain this to me or give me some example code?








qinwei







