How were the figures on the Apache Ignite website drawn?

2016-05-22 Thread F7753
Hi, does anyone know how the figures on the Apache Ignite website were drawn, and with which tool? They are very beautiful!

Re: Cross cache query on spark: schema not found

2016-04-10 Thread F7753
Any ideas?

Re: Cross cache query on spark: schema not found

2016-04-07 Thread F7753
Must the two CacheConfigurations be in the same Ignite instance? And why can I not get the sqlSchema name after I create a CacheConfiguration? Either way, I always get the "schema not found" exception.

Re: How to solve the 22 parameters' limit under scala 2.10 in the case class?

2016-04-07 Thread F7753
Hi Val, yes, it is very weird, and I am quite sure that the exception is: "nested exception is org.xml.sax.SAXParseException; systemId: http://www.springframework.org/schema/beans/spring-beans.xs

Re: Cross cache query on spark: schema not found

2016-04-07 Thread F7753
Thanks Alexey. I looked into the Ignite source code and found that if I do not set the sqlSchema, the SQL schema name is the same as cacheCfg.cacheName. But when I try to get the sqlSchema name of a cacheCfg before setting a sqlSchema name manually, I get a null pointer exception.
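
A minimal sketch of the behavior described above (the Person value type and cache names are hypothetical): getSqlSchema() returns null until a schema is set explicitly, which would explain the NPE when the unset value is used, while SQL queries fall back to the cache name as the schema.

    import org.apache.ignite.configuration.CacheConfiguration

    object SqlSchemaDemo extends App {
      // "Person" is a hypothetical value type used only for this illustration.
      class Person(val name: String) extends Serializable

      val personCfg = new CacheConfiguration[Long, Person]("PersonCache")
      personCfg.setIndexedTypes(classOf[Long], classOf[Person])

      // getSqlSchema() stays null until a schema is set explicitly; SQL queries
      // then use the cache name ("PersonCache") as the schema by default.
      println(personCfg.getSqlSchema)        // null

      personCfg.setSqlSchema("PersonSchema") // set an explicit SQL schema instead
      println(personCfg.getSqlSchema)        // PersonSchema
    }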

Re: Cross cache query on spark: schema not found

2016-04-07 Thread F7753
My question is: how do I configure a schema name in a cross cache query? I found that in Scala the expression "className.class.getSimpleName" differs from "className.getClass().getSimpleName()": the former appends a "$" to the end of the class name, so I wrote a hard-coded value like "val pa
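
A small sketch of where the trailing "$" usually comes from (the case class here is hypothetical): calling getClass on the companion object yields the object's class name, which ends in "$", whereas the class literal does not.

    // Hypothetical case class used only to illustrate the naming difference.
    case class TableSmall(id: Long)

    object NameDemo extends App {
      // Simple name of the class itself: "TableSmall"
      println(classOf[TableSmall].getSimpleName)

      // Simple name of the companion object's class: "TableSmall$" (trailing '$')
      println(TableSmall.getClass.getSimpleName)
    }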

Re: How to solve the 22 parameters' limit under scala 2.10 in the case class?

2016-04-07 Thread F7753
Hi Val, this issue happens whenever I use the XML file to initialize an IgniteContext while I cannot connect to the internet. The two ways to avoid the issue on my cluster are to use an IgniteConfiguration instead of the XML file, or to make sure the node is already connected to the internet. I'm sure that t
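
A minimal sketch of the XML-free approach mentioned above (the static IP finder and the discovery addresses are assumptions, not something stated in the thread): building the IgniteConfiguration in code avoids Spring XML parsing and therefore the XSD download entirely.

    import java.util.Arrays

    import org.apache.ignite.configuration.IgniteConfiguration
    import org.apache.ignite.spark.IgniteContext
    import org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi
    import org.apache.ignite.spi.discovery.tcp.ipfinder.vm.TcpDiscoveryVmIpFinder
    import org.apache.spark.{SparkConf, SparkContext}

    object NoXmlDemo extends App {
      val sc = new SparkContext(new SparkConf().setAppName("ignite-no-xml"))

      // Build the configuration in code so no Spring XML (and no XSD fetch) is involved.
      // The discovery addresses below are placeholders for the cluster's own hosts.
      def igniteCfg(): IgniteConfiguration = {
        val ipFinder = new TcpDiscoveryVmIpFinder()
        ipFinder.setAddresses(Arrays.asList("host1:47500..47509", "host2:47500..47509"))

        val discoSpi = new TcpDiscoverySpi()
        discoSpi.setIpFinder(ipFinder)

        val cfg = new IgniteConfiguration()
        cfg.setDiscoverySpi(discoSpi)
        cfg
      }

      val ic = new IgniteContext[Long, String](sc, () => igniteCfg())
    }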

Cross cache query on spark: schema not found

2016-04-07 Thread F7753
First, I created two caches:

    val smallTableCacheCfg = new CacheConfiguration[Long, TableSchema.table_small](SMALL)
    smallTableCacheCfg.setIndexedTypes(classOf[Long], classOf[TableSchema.table_small])
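
A hedged sketch of how the two caches could then be joined in one query; smallTableCacheCfg comes from the snippet above, while igniteContext, the large-table cache name ("LARGE"), and the column names are assumptions for illustration only.

    val smallRdd = igniteContext.fromCache(smallTableCacheCfg)

    // The other cache is addressed by quoting its cache name as the schema
    // of its table inside the SQL text.
    val joined = smallRdd.sql(
      """select s.id, l.value
        |from table_small s
        |join "LARGE".table_large l on l.id = s.id""".stripMargin)

    joined.show()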

Re: How to config the "query-indexing" module in the node?

2016-04-06 Thread F7753
Thank you, Val. The "query-indexing" exception disappeared. Here is how I start the app:

    /opt/spark-1.6.1/bin/spark-submit --class main.scala.StreamingJoin --properties-file conf/spark.conf --packages c

Re: How to config the "query-indexing" module in the node?

2016-04-06 Thread F7753
I added "export CLASSPATH=.:$IGNITE_HOME/libs" to /etc/profile, then source it , but of no use ... -- View this message in context: http://apache-ignite-users.70518.x6.nabble.com/How-to-config-the-query-indexing-module-in-the-node-tp3962p3973.html Sent from the Apache Ignite Users mailing list

Re: How to config the "query-indexing" module in the node?

2016-04-06 Thread F7753
Yes, I do have this folder, and it contains all the jars you listed above; in fact that folder was there originally. Here is the folder and its contents:

Re: How to config the "query-indexing" module in the node?

2016-04-06 Thread F7753
I added this config to my spark-env.sh:

    # Optionally set IGNITE_HOME here.
    # IGNITE_HOME=/path/to/my-ignite-home
    IGNITE_LIBS="${IGNITE_HOME}/libs/*"
    for file in ${IGNITE_HOME}/libs/*
    do
        if [ -d ${file} ] && [ "${file}" != "${IGNITE_HOME}"/libs/optional ]; then
            IGNITE_LIBS=${IGNITE_LIBS}:${file}/*
        fi
    done

How to config the "query-indexing" module in the node?

2016-04-06 Thread F7753
Well, I have already set the classpath on my nodes, but when I run my app it still complains about the missing "ignite-indexing" module:

    Exception in thread "main" javax.cache.CacheException: Failed to execute query. Add
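
Not stated in the thread, but one common way to make sure the indexing module travels with the application (rather than relying only on each node's libs folder) is to declare it as a build dependency; a minimal sbt sketch, where the artifact version is an assumption chosen for illustration.

    // build.sbt (sketch) -- the Ignite version below is an assumption, not from the thread.
    val igniteVersion = "1.5.0.final"

    libraryDependencies ++= Seq(
      "org.apache.ignite" % "ignite-core"     % igniteVersion,
      "org.apache.ignite" % "ignite-indexing" % igniteVersion
    )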

Re: How to solve the 22 parameters' limit under scala 2.10 in the case class?

2016-04-06 Thread F7753
Sorry vkulichenko, which one do you want to reproduce? I have made so many mistakes in this topic.

Re: How to end up the GC overhead problem in the IgniteRDD?

2016-04-06 Thread F7753
Thanks to Alexey and vkulichenko. I modified ignite.sh and re-configured the JVM parameters, and that worked. May I ask another question? I get "Failed to execute query. Add module 'ignite-indexing' to the classpath of all Ignite nodes". I compiled the code with Maven with all dependencies added, and it runs su

Re: Why the client and server behaves like that?

2016-04-05 Thread F7753
Thanks a lot for letting me know. I will take some time to read the Ignite docs more carefully. I also created another topic about the GC OOM in my cluster: http://apache-ignite-users.70518.x6.nabble.com/How-to-end-up-the-GC-overhead-problem-in-the-IgniteRDD-tc3945.html

Re: How to end up the GC overhead problem in the IgniteRDD?

2016-04-05 Thread F7753
Is there some configuration I can use to monitor GC behavior in Ignite? It is curious that an OOM happens on a node with more than 100 GB of RAM.

Re: Should Ignition.start() method to be called in a spark-igniteRDD app?

2016-04-05 Thread F7753
Hi agura, I looked at the source code of IgniteContext.scala and noticed that its ignite() method calls "Ignition.start()" inside a try-catch block. The code I use is listed in this topic: http://apache-ignite-users.70518.x6.nabble.com/How-to-solve-the-22-parameters-limit-under-scala-2-10-in-the-c
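
A minimal sketch of what that implies for application code (the cache name and types are placeholders): the application does not need its own Ignition.start(), because creating the IgniteContext and touching a cache starts or re-uses an Ignite node internally.

    import org.apache.ignite.configuration.IgniteConfiguration
    import org.apache.ignite.spark.IgniteContext
    import org.apache.spark.{SparkConf, SparkContext}

    object IgniteContextStartDemo extends App {
      val sc = new SparkContext(new SparkConf().setAppName("ignite-context-start"))

      // No explicit Ignition.start() here: IgniteContext calls it internally
      // (inside a try-catch) on the driver and on each executor as needed.
      val ic = new IgniteContext[Long, String](sc, () => new IgniteConfiguration())

      val rdd = ic.fromCache("demoCache") // placeholder cache name
      println(rdd.count())
    }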

Re: How to solve the 22 parameters' limit under scala 2.10 in the case class?

2016-04-05 Thread F7753
There is nothing different from the code I showed in the 6th comment in this topic except the initialization of the IgniteContext; I also listed the way I create it in the comments above: val sma

How to end up the GC overhead problem in the IgniteRDD?

2016-04-05 Thread F7753
I run a Spark Streaming app on an Ignite cluster that overlaps with a Spark cluster (the Ignite server nodes are also the Spark worker nodes). The monitoring page shows that only one task succeeds, and after a while an OOM error is thrown

Re: Why the client and server behaves like that?

2016-04-05 Thread F7753
I found that GC was the main problem in my case; each of my nodes throws the GC exception: Exception in thread "shmem-worker-#175%null%" java.lang.OutOfMemoryError: GC overhead li

Why the client and server behaves like that?

2016-04-05 Thread F7753
Here I launched 3 Ignite nodes using ${IGNITE_HOME}/bin/ignite.sh, but the console output looks like this: [18:51:09] Topology snapshot [ver=4, servers=3, clients=1, CPUs=96, heap=53.0GB]

Re: How to solve the 22 parameters' limit under scala 2.10 in the case class?

2016-04-05 Thread F7753
I use a constructor like "val smallTableContext = new IgniteContext[BigInt, zb_test](ssc.sparkContext, () => new IgniteConfiguration())" instead of giving a Spring XML file, since there is no need to put all the parameters in the XML file.

Re: How to solve the 22 parameters' limit under scala 2.10 in the case class?

2016-04-04 Thread F7753
In fact, I do not use the XML file to set the cache configuration. It seems that I have to connect to the internet each time I launch the application; there is no local XML file cache (I remember that Spring seems to fetch a local XML file if there is one; the usage of Spring in Ignite differs from th

Re: How to solve the 22 parameters' limit under scala 2.10 in the case class?

2016-04-04 Thread F7753
I copied the config file to all the other nodes, and then it throws: Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 0.0 failed 4 times, most recent fai

Re: How to solve the 22 parameters' limit under scala 2.10 in the case class?

2016-04-04 Thread F7753
main.scala:

    /**
     * Created by F7753 on 2016/3/30.
     */
    import kafka.serializer.StringDecoder
    import org.apache.ignite.cache.CacheMode
    import org.apache.ignite.cache.query.annotations.QuerySqlField
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.streaming.kafka._
    import

Re: Should Ignition.start() method to be called in a spark-igniteRDD app?

2016-04-01 Thread F7753
What confuses me is that when I use Ignite to run a Spark word count app, it does not ask me to call 'Ignition.start()', but when I use IgniteRDD to cache a table and run a Spark Streaming app, the console throws the exception. What's the difference?

Re: Should Ignition.start() method to be called in a spark-igniteRDD app?

2016-04-01 Thread F7753
I added 'Ignition.start()' as the first line of the 'main' method but still got the same error. I also noticed that the approach I use to start Ignite in the Spark cluster seems wrong: there are more Ignite instances than I expected, while there are 3 worker nodes in my cluster. I use '${IGNITE_HOME}/bi

Re: Should Ignition.start() method to be called in a spark-igniteRDD app?

2016-04-01 Thread F7753
I have no idea how the Ignition.start() method is called in the Ignite kernel. Is there some documentation I can refer to?

Should Ignition.start() method to be called in a spark-igniteRDD app?

2016-04-01 Thread F7753
I got an error when I submit a jar. The cluster does not seem to be at fault, since I have tested it using the official guide: https://apacheignite-fs.readme.io/docs/ignitecontext-igniterdd#section-running-sql-queries-against-ignite-cache

Re: How to solve the 22 parameters' limit under scala 2.10 in the case class?

2016-04-01 Thread F7753
Thank you for your advice; I hope it will work, and at least the compilation did not throw any exception. But I got another error when I submit the jar. The cluster does not seem to be at fault, since I have tested it using the official guide: https://apacheignite-fs.readme.io/docs/ignitecontext-igniterd

Re: How to solve the 22 parameters' limit under scala 2.10 in the case class?

2016-04-01 Thread F7753
Here are some details: I created case class A, which has fewer than 22 parameters, and case class B, which has more than 22 parameters. I want Ignite to create indexes on the fields, so the case class looks like: ---
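
The code itself is cut off above; as a hedged illustration of one workaround consistent with what the thread is attempting: the 22-parameter limit in Scala 2.10 applies to case classes, while a plain class can take more constructor parameters, and Ignite only needs @QuerySqlField annotations on the fields it should index. Field names below are hypothetical.

    import org.apache.ignite.cache.query.annotations.QuerySqlField
    import scala.annotation.meta.field

    // A plain (non-case) class is not subject to the 22-parameter limit of
    // Scala 2.10 case classes; annotate every field that should become a SQL column.
    class WideRow(
        @(QuerySqlField @field)(index = true) val id: Long,
        @(QuerySqlField @field) val col1: String,
        @(QuerySqlField @field) val col2: String
        // ... the remaining 30+ columns declared the same way ...
    ) extends Serializable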

Re: A question of ignite on spark streaming

2016-04-01 Thread F7753
vkulichenko: thanks for your attention to detail. I put the query string into the 'sql' method and hope it works well, but when I compile this Maven project I meet another problem, and I have posted another question in this forum:

How to solve the 22 parameters' limit under scala 2.10 in the case class?

2016-04-01 Thread F7753
*Description:* I have two tables to join in a combined Spark and Ignite environment. One has 20 fields and the other has 40 fields, and I *cannot create more than 22 parameters in a case class under Scala 2.10* (this is solved in Scala 2.11, but for some reason *I cannot use 2.11 in my environment*). The ign

Re: A question of ignite on spark streaming

2016-03-31 Thread F7753
Here comes another question: I use two IgniteContexts in my Spark Streaming application, but the IgniteContext.fromCache() method returns an IgniteRDD, which does not have a query(SqlFieldsQuery) method for running cross cache SQL. What should I do? Implement a query method in IgniteRDD, or some
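
Not an answer given in the thread, just a sketch of one option consistent with it (cache, table, and column names are placeholders, and "ic" stands for one of the IgniteContexts mentioned above): a SqlFieldsQuery can be run on the underlying IgniteCache obtained through IgniteContext.ignite(), so no new method on IgniteRDD is required.

    import org.apache.ignite.cache.query.SqlFieldsQuery
    import scala.collection.JavaConverters._

    // Run the cross cache SQL directly on the underlying cache; the second cache's
    // table is referenced by quoting its cache name as the schema.
    val cache = ic.ignite().cache[Long, AnyRef]("SMALL")

    val qry = new SqlFieldsQuery(
      """select s.id, l.value
        |from table_small s
        |join "LARGE".table_large l on l.id = s.id""".stripMargin)

    cache.query(qry).getAll.asScala.foreach(row => println(row))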

Re: A question of ignite on spark streaming

2016-03-31 Thread F7753
Subject: How to use ignite on spark streaming to join two tables in ignite cache?