Hi,

First of all, I would highly recommend using a Spark build compiled against Scala
2.11 instead of 2.10, because the Ignite artifacts you use are built with Scala 2.11.
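
For reference, a minimal build.sbt sketch aligned on Scala 2.11 could look like
the following (the Spark version and the "provided" scope are my assumptions here,
adjust them to whatever your cluster actually runs):

  scalaVersion := "2.11.8"

  // Spark built for Scala 2.11; the version is an assumption, match your cluster
  libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"

  // Ignite Spark integration and the ZooKeeper IP finder module
  libraryDependencies += "org.apache.ignite" % "ignite-spark" % "1.9.0"
  libraryDependencies += "org.apache.ignite" % "ignite-zookeeper" % "1.9.0"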

Please double-check that you moved all ZooKeeper module libs from
"{apache_build}/libs/optional/ignite-zookeeper" into "{apache_build}/libs/"
and that you start the nodes with ignite.sh using your configuration.
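
As a rough sketch of those two steps (the paths are illustrative, assuming a plain
Apache Ignite binary distribution unpacked at {apache_build}):

  # copy the ZooKeeper IP finder libs into the main libs folder
  cp {apache_build}/libs/optional/ignite-zookeeper/*.jar {apache_build}/libs/

  # start the node with your XML configuration
  {apache_build}/bin/ignite.sh /path/to/your-ignite-config.xml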

If you start the nodes from IDEA, please make sure that no conflicting Curator
libs are pulled onto the classpath by your IDE.
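
If in doubt, one quick way to see which jar actually provides the Curator API at
runtime is to print its code source; a small sketch (hypothetical, you can drop it
into the job or run it in a Scala console on the node):

  // Print the jar that provides Curator's CreateBuilder on the current classpath.
  // Running this inside the Spark job also shows what the executors see.
  val curatorJar = classOf[org.apache.curator.framework.api.CreateBuilder]
    .getProtectionDomain.getCodeSource.getLocation
  println(s"CreateBuilder loaded from: $curatorJar")

If that points at an older Curator jar shipped with your cluster rather than the one
pulled in by ignite-zookeeper, that is the conflict behind the NoSuchMethodError.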

2017-05-03 2:53 GMT+03:00 baozipu <job.lei.zh...@gmail.com>:

> Hello, I got this error but I couldn't solve it.
>
> =======================
> java.lang.NoSuchMethodError: org.apache.curator.framework.api.CreateBuilder.creatingParentContainersIfNeeded()Lorg/apache/curator/framework/api/ProtectACLCreateModePathAndBytesable;
>         at org.apache.curator.x.discovery.details.ServiceDiscoveryImpl.internalRegisterService(ServiceDiscoveryImpl.java:222)
>         at org.apache.curator.x.discovery.details.ServiceDiscoveryImpl.registerService(ServiceDiscoveryImpl.java:188)
>         at org.apache.ignite.spi.discovery.tcp.ipfinder.zk.TcpDiscoveryZookeeperIpFinder.registerAddresses(TcpDiscoveryZookeeperIpFinder.java:237)
>         at org.apache.ignite.spi.discovery.tcp.ipfinder.TcpDiscoveryIpFinderAdapter.initializeLocalAddresses(TcpDiscoveryIpFinderAdapter.java:61)
>         at org.apache.ignite.spi.discovery.tcp.TcpDiscoveryImpl.registerLocalNodeAddress(TcpDiscoveryImpl.java:294)
>         at org.apache.ignite.spi.discovery.tcp.ServerImpl.spiStart(ServerImpl.java:334)
>         at org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi.spiStart(TcpDiscoverySpi.java:1850)
>         at org.apache.ignite.internal.managers.GridManagerAdapter.startSpi(GridManagerAdapter.java:268)
>         at org.apache.ignite.internal.managers.discovery.GridDiscoveryManager.start(GridDiscoveryManager.java:685)
>         at org.apache.ignite.internal.IgniteKernal.startManager(IgniteKernal.java:1626)
>         at org.apache.ignite.internal.IgniteKernal.start(IgniteKernal.java:924)
>         at org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance.start0(IgnitionEx.java:1799)
>         at org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance.start(IgnitionEx.java:1602)
>         at org.apache.ignite.internal.IgnitionEx.start0(IgnitionEx.java:1042)
>         at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:569)
>         at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:530)
>         at org.apache.ignite.Ignition.getOrStart(Ignition.java:414)
>         at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:143)
>         at org.apache.ignite.spark.IgniteContext$$anonfun$1.apply(IgniteContext.scala:54)
>         at org.apache.ignite.spark.IgniteContext$$anonfun$1.apply(IgniteContext.scala:54)
>         at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:925)
>         at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:925)
>         at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944)
>         at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944)
>         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
>         at org.apache.spark.scheduler.Task.run(Task.scala:99)
>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:745)
>
> ===============================
>
> My SBT file only includes the Ignite-related dependencies:
>
> ====== SBT   ============
> libraryDependencies += ("org.apache.ignite" % "ignite-spark" % "1.9.0")
>   .exclude("org.scalatest", "scalatest_2.10")
>   .exclude("com.twitter", "chill_2.10")
>   .exclude("org.apache.spark", "spark-unsafe_2.10")
>   .exclude("org.apache.spark", "spark-tags_2.10")
>
> libraryDependencies += "org.apache.ignite" % "ignite-zookeeper" % "1.9.0"
> =======================
>
>
> I have no problem launching it locally through IDEA. However, when I
> submit the jar file to my server cluster, it reports this error.
>
> ===== submit command =============
>
> spark-submit --packages org.apache.ignite:ignite-spark:1.9.0,org.apache.ignite:ignite-zookeeper:1.9.0 \
>   --conf 'spark.mesos.coarse=true' ignitemaster_2.11-1.0.jar
>
> ==================
>
> I think this command includes everything, but I don't know where the
> conflict is.
>
> Could you please provide steps that I should check one by one? Thanks,
>
