Sorry, I meant: I tried this command
    ./sbt/sbt clean
and now it works.

Is it because cached components were not recompiled?
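For anyone hitting the same thing, a minimal recovery sketch (the command names are from this thread; the explanation of why it helps is my assumption): sbt's incremental compiler can keep stale .class files under the target/ directories after a refactor, and `clean` deletes that cached output so the next build recompiles everything from source.

```shell
# Assumed recovery sequence after a refactor confuses incremental compilation:
# `clean` removes cached build output under */target, forcing a full recompile.
./sbt/sbt clean
./sbt/sbt assembly
```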

On 8/4/14, 4:44 PM, Larry Xiao wrote:
I guessed
    ./sbt/sbt clean
and it works fine now.

On 8/4/14, 11:48 AM, Larry Xiao wrote:
On the latest pull today (6ba6c3ebfe9a47351a50e45271e241140b09bf10) I hit an assembly problem.

$ ./sbt/sbt assembly
Using /usr/lib/jvm/java-7-oracle as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
[info] Loading project definition from ~/spark/project/project
[info] Loading project definition from ~/.sbt/0.13/staging/ec3aa8f39111944cc5f2/sbt-pom-reader/project
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[info] Loading project definition from ~/spark/project
[info] Set current project to spark-parent (in build file:~/spark/)
[info] Compiling 372 Scala sources and 35 Java sources to ~/spark/core/target/scala-2.10/classes...
[error] ~/spark/core/src/main/scala/org/apache/spark/ui/jobs/JobProgressListener.scala:116: type mismatch;
[error]  found   : org.apache.spark.ui.jobs.TaskUIData
[error]  required: org.apache.spark.ui.jobs.UIData.TaskUIData
[error] stageData.taskData.put(taskInfo.taskId, new TaskUIData(taskInfo))
[error]                                               ^
[error] ~/spark/core/src/main/scala/org/apache/spark/ui/jobs/JobProgressListener.scala:134: type mismatch;
[error]  found   : org.apache.spark.ui.jobs.ExecutorSummary
[error]  required: org.apache.spark.ui.jobs.UIData.ExecutorSummary
[error] val execSummary = execSummaryMap.getOrElseUpdate(info.executorId, new ExecutorSummary)
[error] ^
[error] ~/spark/core/src/main/scala/org/apache/spark/ui/jobs/JobProgressListener.scala:163: type mismatch;
[error]  found   : org.apache.spark.ui.jobs.TaskUIData
[error]  required: org.apache.spark.ui.jobs.UIData.TaskUIData
[error] val taskData = stageData.taskData.getOrElseUpdate(info.taskId, new TaskUIData(info))
[error] ^
[error] ~/spark/core/src/main/scala/org/apache/spark/ui/jobs/JobProgressListener.scala:180: type mismatch;
[error]  found   : org.apache.spark.ui.jobs.ExecutorSummary
[error]  required: org.apache.spark.ui.jobs.UIData.ExecutorSummary
[error] val execSummary = stageData.executorSummary.getOrElseUpdate(execId, new ExecutorSummary)
[error] ^
[error] ~/spark/core/src/main/scala/org/apache/spark/ui/jobs/StagePage.scala:109: type mismatch;
[error]  found   : org.apache.spark.ui.jobs.TaskUIData => Seq[scala.xml.Node]
[error]  required: org.apache.spark.ui.jobs.UIData.TaskUIData => Seq[scala.xml.Node]
[error] Error occurred in an application involving default arguments.
[error] taskHeaders, taskRow(hasInput, hasShuffleRead, hasShuffleWrite, hasBytesSpilled), tasks)
[error]                             ^
[error] ~/spark/core/src/main/scala/org/apache/spark/ui/jobs/StagePage.scala:119: constructor cannot be instantiated to expected type;
[error]  found   : org.apache.spark.ui.jobs.TaskUIData
[error]  required: org.apache.spark.ui.jobs.UIData.TaskUIData
[error] val serializationTimes = validTasks.map { case TaskUIData(_, metrics, _) =>
[error] ^
[error] ~/spark/core/src/main/scala/org/apache/spark/ui/jobs/StagePage.scala:120: not found: value metrics
[error]             metrics.get.resultSerializationTime.toDouble
[error]             ^

I think the compiled code doesn't reference the updated structure correctly.

"core/src/main/scala/org/apache/spark/ui/jobs/UIData.scala" was introduced in commit 72e9021eaf26f31a82120505f8b764b18fbe8d48.
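For context, here is a minimal sketch of what the errors above imply about that refactor (the class shapes below are hypothetical simplifications, not the real UIData.scala): the UI data classes now appear to be nested inside a UIData object, so their fully qualified names changed from org.apache.spark.ui.jobs.TaskUIData to org.apache.spark.ui.jobs.UIData.TaskUIData. Stale .class files compiled before the move would then produce exactly these "found / required" type mismatches until a clean rebuild.

```scala
// Hypothetical, simplified shapes illustrating the move: nesting classes
// inside an object changes their fully qualified names, so old call sites
// (or stale compiled classes) that expect top-level TaskUIData no longer match.
object UIData {
  class TaskUIData(val taskId: Long)
  class ExecutorSummary
}

object Demo extends App {
  // Call sites now have to go through the nested path:
  val t = new UIData.TaskUIData(42L)
  println(t.taskId)
}
```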

Larry

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
