Hi All,
I am trying to build Spark 1.3.0 on a standalone Ubuntu 14.04 machine, using the
sbt command "sbt/sbt assembly". This command works fine with Spark 1.1;
however, it gives the following error with Spark 1.3.0. Any help or suggestions
to resolve this would be highly appreciated.
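
For reference, the steps I run are roughly the following (the clone and
checkout commands here are illustrative of my setup; the build step itself is
the "sbt/sbt assembly" mentioned above):

    git clone https://github.com/apache/spark.git
    cd spark
    git checkout v1.3.0
    sbt/sbt assembly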

[info] Done updating.
[info] Updating {file:/home/roott/aamirTest/spark/}network-shuffle...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-network-common_2.10;1.3.0: configuration not public in org.apache.spark#spark-network-common_2.10;1.3.0: 'test'. It was required from org.apache.spark#spark-network-shuffle_2.10;1.3.0 test
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn]  Note: Unresolved dependencies path:
[warn]          org.apache.spark:spark-network-common_2.10:1.3.0 ((com.typesafe.sbt.pom.MavenHelper) MavenHelper.scala#L76)
[warn]            +- org.apache.spark:spark-network-shuffle_2.10:1.3.0
sbt.ResolveException: unresolved dependency: org.apache.spark#spark-network-common_2.10;1.3.0: configuration not public in org.apache.spark#spark-network-common_2.10;1.3.0: 'test'. It was required from org.apache.spark#spark-network-shuffle_2.10;1.3.0 test
        at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:278)
        at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:175)
        at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:157)
        at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)
        at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)
        at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:128)
        at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:56)
        at sbt.IvySbt$$anon$4.call(Ivy.scala:64)
        at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)
        at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78)
        at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97)
        at xsbt.boot.Using$.withResource(Using.scala:10)
        at xsbt.boot.Using$.apply(Using.scala:9)
        at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)
        at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)
        at xsbt.boot.Locks$.apply0(Locks.scala:31)
        at xsbt.boot.Locks$.apply(Locks.scala:28)
        at sbt.IvySbt.withDefaultLogger(Ivy.scala:64)
        at sbt.IvySbt.withIvy(Ivy.scala:123)
        at sbt.IvySbt.withIvy(Ivy.scala:120)
        at sbt.IvySbt$Module.withModule(Ivy.scala:151)
        at sbt.IvyActions$.updateEither(IvyActions.scala:157)
        at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1318)
        at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1315)
        at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$85.apply(Defaults.scala:1345)
        at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$85.apply(Defaults.scala:1343)
        at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:35)
        at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1348)
        at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1342)
        at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:45)
        at sbt.Classpaths$.cachedUpdate(Defaults.scala:1360)
        at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1300)
        at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1275)
        at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
        at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
        at sbt.std.Transform$$anon$4.work(System.scala:63)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.Execute.work(Execute.scala:235)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
        at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
[error] (network-shuffle/*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-network-common_2.10;1.3.0: configuration not public in org.apache.spark#spark-network-common_2.10;1.3.0: 'test'. It was required from org.apache.spark#spark-network-shuffle_2.10;1.3.0 test
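
In case it helps to reproduce, the failing update can be triggered on just that
module (the project name here is taken from the [error] line above; I have not
tried other invocations):

    sbt/sbt network-shuffle/update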



