I was annoyed by this as well.
It appears that simply permuting the order of dependency inclusion solves this
problem:
first Spark, then your CDH Hadoop distro.
HTH,
Pierre
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/SparkContext-startup-time-out
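For anyone applying this fix: assuming a Maven build (the thread does not say which build tool is in use), the reordering might look like the sketch below. The versions and CDH coordinates are illustrative assumptions, not taken from the thread. When two transitive dependencies conflict at equal depth, Maven takes the first declaration, which is why listing Spark first lets its transitive dependencies (e.g. its netty version) win.

```xml
<!-- pom.xml fragment: illustrative sketch only; versions/coordinates are assumed. -->
<!-- The point is the ordering: declare Spark before the CDH Hadoop client so that -->
<!-- Spark's transitive dependencies win Maven's "nearest/first declared" ties.    -->
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>0.9.1</version>            <!-- assumed version -->
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.0.0-cdh4.6.0</version>   <!-- assumed CDH version -->
  </dependency>
</dependencies>
```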
How did you deal with this problem? I have been hitting it these days as well.
Best regards,
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/SparkContext-startup-time-out-tp1753p5738.html
Sent from the Apache Spark User List mailing list archive
How did you deal with this problem in the end? I have also run into it.
Best regards,
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/SparkContext-startup-time-out-tp1753p5739.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
I don't quite understand, though: I ran the dependency-graph plugin and netty
appears nowhere in my dependency chain.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/SparkContext-startup-time-out-tp1753p2669.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
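For anyone checking the same thing: two standard ways to inspect where netty enters the chain (these are common tooling commands; whether either matches the "dependency-graph plugin" mentioned above is an assumption). Note that netty has shipped under two groupIds, `org.jboss.netty` (3.x) and `io.netty`, so filtering on only one of them can miss it.

```shell
# Maven: print the dependency tree, filtered to both netty groupIds;
# -Dverbose also shows versions omitted due to conflict resolution.
mvn dependency:tree -Dincludes=io.netty,org.jboss.netty -Dverbose

# sbt, with the sbt-dependency-graph plugin installed:
sbt dependencyTree
```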