Thank you for your advice. I hope it will work; at least the compilation did
not throw any exceptions.
But I got another error when I submitted the jar. The cluster does not seem
to be the problem, since I've tested it using the official guide:
https://apacheignite-fs.readme.io/docs/ignitecontext-igniterdd#section-running-sql-queries-against-ignite-cache
and the simple use case there works well. The exception tells me that I
should call Ignition.start() before any other code, but I never had to do
that in the use case given by the guide, so there must be something else
wrong.
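For context, this is roughly the shape of the driver code the guide uses (a
minimal sketch, not my actual StreamingJoin code; the app name and the cache
name "sharedRDD" are placeholders):

import org.apache.ignite.configuration.IgniteConfiguration
import org.apache.ignite.spark.IgniteContext
import org.apache.spark.{SparkConf, SparkContext}

object IgniteRddSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("ignite-rdd-sketch"))
    // The configuration closure below is shipped to every executor, and
    // IgniteContext starts an embedded Ignite node there by itself, so the
    // application never calls Ignition.start() explicitly.
    val ic = new IgniteContext[Integer, Integer](sc, () => new IgniteConfiguration())
    // IgniteRDD view over the cache; savePairs writes the RDD into Ignite.
    val sharedRdd = ic.fromCache("sharedRDD")
    sharedRdd.savePairs(sc.parallelize(1 to 10).map(i => (Integer.valueOf(i), Integer.valueOf(i))))
  }
}

Here is the full stack trace: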
-------------------------------------------------------------------------------------------------------------------
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, nobida145): class org.apache.ignite.IgniteIllegalStateException: Ignite instance with provided name doesn't exist. Did you call Ignition.start(..) to start an Ignite instance? [name=null]
    at org.apache.ignite.internal.IgnitionEx.grid(IgnitionEx.java:1235)
    at org.apache.ignite.Ignition.ignite(Ignition.java:516)
    at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:150)
    at org.apache.ignite.spark.IgniteContext$$anonfun$1.apply$mcVI$sp(IgniteContext.scala:58)
    at org.apache.ignite.spark.IgniteContext$$anonfun$1.apply(IgniteContext.scala:58)
    at org.apache.ignite.spark.IgniteContext$$anonfun$1.apply(IgniteContext.scala:58)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at org.apache.spark.InterruptibleIterator.foreach(InterruptibleIterator.scala:28)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$32.apply(RDD.scala:912)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$32.apply(RDD.scala:912)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:912)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:910)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    at org.apache.spark.rdd.RDD.foreach(RDD.scala:910)
    at org.apache.ignite.spark.IgniteContext.<init>(IgniteContext.scala:58)
    at main.scala.StreamingJoin$.main(StreamingJoin.scala:180)
    at main.scala.StreamingJoin.main(StreamingJoin.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: class org.apache.ignite.IgniteIllegalStateException: Ignite instance with provided name doesn't exist. Did you call Ignition.start(..) to start an Ignite instance? [name=null]
    at org.apache.ignite.internal.IgnitionEx.grid(IgnitionEx.java:1235)
    at org.apache.ignite.Ignition.ignite(Ignition.java:516)
    at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:150)
    at org.apache.ignite.spark.IgniteContext$$anonfun$1.apply$mcVI$sp(IgniteContext.scala:58)
    at org.apache.ignite.spark.IgniteContext$$anonfun$1.apply(IgniteContext.scala:58)
    at org.apache.ignite.spark.IgniteContext$$anonfun$1.apply(IgniteContext.scala:58)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at org.apache.spark.InterruptibleIterator.foreach(InterruptibleIterator.scala:28)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$32.apply(RDD.scala:912)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$32.apply(RDD.scala:912)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
./igparkStr.sh: line 16: --repositories: command not found
-----------------------------------------------------------------------------------------------------------------
igsparkStr.sh is a shell script I wrote to submit the Spark app; its content
is listed below:
-----------------------------------------------------------------------------------------------------------------
/opt/spark-1.6.1/bin/spark-submit --class main.scala.StreamingJoin
--properties-file conf/spark.conf
./myapp-1.0-SNAPSHOT-jar-with-dependencies.jar --packages
org.apache.ignite:ignite-spark:1.5.0.final
--repositories
http://www.gridgainsystems.com/nexus/content/repositories/external
-----------------------------------------------------------------------------------------------------------------
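Incidentally, I suspect the script itself is at least part of the problem:
spark-submit treats everything after the application jar as arguments to the
application, so the --packages and --repositories options above were probably
never seen by spark-submit at all, and the line break before --repositories
(line 16 of the script, with no continuation backslash) made bash try to run
it as a separate command, which would explain the "command not found" message.
A sketch of how I think the invocation should look, with every option before
the jar (assuming my option values are otherwise correct):

/opt/spark-1.6.1/bin/spark-submit \
  --class main.scala.StreamingJoin \
  --properties-file conf/spark.conf \
  --packages org.apache.ignite:ignite-spark:1.5.0.final \
  --repositories http://www.gridgainsystems.com/nexus/content/repositories/external \
  ./myapp-1.0-SNAPSHOT-jar-with-dependencies.jar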