Hi Stan,

Thanks. I am evaluating IgniteRDD, and it is cool! I am new to Spark; I can get a sqlContext via ds.sqlContext.
If I want to query everything with a select, what is the table name?
See the spark-shell session below. I want to do something like
> sc.sql("select * from "). Where does the table name come from?
================================================
scala> val rdd = ic.fromCache[String,BinaryObject]("testCache")
rdd: org.apache.ignite.spark.IgniteRDD[String,org.apache.ignite.binary.BinaryObject] = IgniteRDD[0] at RDD at IgniteAbstractRDD.scala:32
scala> val ds = rdd.sql("select site,timestamp,product from testCache where site=?",("site1"))
ds: org.apache.spark.sql.DataFrame = [SITE: string, TIMESTAMP: bigint ... 1 more field]
scala> ds.schema
res1: org.apache.spark.sql.types.StructType = StructType(StructField(SITE,StringType,true), StructField(TIMESTAMP,LongType,true), StructField(PRODUCT,StringType,true))
scala> val sc = ds.sqlContext
sc: org.apache.spark.sql.SQLContext = org.apache.spark.sql.SQLContext@78682201
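One way to get a usable table name, sketched below under the assumption of Spark 2.x: rdd.sql(...) returns a plain DataFrame that has no registered name, so you can register a temporary view yourself and pick any name you like (the view name "ignite_test" below is just an illustration, not something Ignite registers for you; on Spark 1.x the equivalent call is registerTempTable).

```scala
// Continuing the spark-shell session above: `ds` is the DataFrame
// returned by rdd.sql(...). Register it under a name of your choosing
// (the view name "ignite_test" here is arbitrary, not Ignite-provided).
ds.createOrReplaceTempView("ignite_test")

// Now that name works in plain Spark SQL through the SQLContext:
val all = sc.sql("select * from ignite_test")
all.show()
```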

Thanks
Shawn

On 2/26/2018 18:25, Stanislav Lukyanov <stanlukya...@gmail.com> wrote:

Hi Shawn,


You can use Ignite standalone and you can also use it together with Spark.

Please take a look at these SO question and an article:

https://stackoverflow.com/questions/36036910/apache-spark-vs-apache-ignite

https://insidebigdata.com/2016/06/20/apache-ignite-and-apache-spark-complementary-in-memory-computing-solutions/


Stan


From: shawn.du
Sent: February 24, 2018 9:56
To: user
Subject: compute ignite data with spark


Hi,


Spark is a compute engine. Ignite also provides a compute feature, and Ignite can integrate with Spark.

We are using Ignite's compute map-reduce feature now. It is very fast.

I am just curious how Spark compares with Ignite for computing.

Is it possible to use the Spark API to compute on Ignite cache data?
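For reference, the last question can be sketched in code: IgniteRDD implements Spark's RDD interface, so ordinary Spark transformations apply directly to cache data. A minimal sketch, assuming a running Ignite cluster, an existing SparkContext named sparkContext, a placeholder config path, and a cache "testCache" that holds Int values:

```scala
import org.apache.ignite.spark.IgniteContext

// Assumptions: a running Ignite cluster, an existing SparkContext
// `sparkContext`, and a Spring XML config at a placeholder path.
val ic = new IgniteContext(sparkContext, "config/example-cache.xml")

// fromCache exposes the Ignite cache as a Spark pair RDD of (key, value).
val rdd = ic.fromCache[String, Int]("testCache")

// From here the ordinary Spark API applies: map, filter, reduce, etc.
val total = rdd.map(_._2).filter(_ > 0).reduce(_ + _)
println(s"sum of positive values: $total")
```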


Thanks

Shawn



