Spark ANSI SQL Support

2017-01-17 Thread Rishabh Bhardwaj
Hi All, Does Spark 2.0 SQL support the full ANSI SQL query standard? Thanks, Rishabh.

Re: Time-Series Analysis with Spark

2017-01-11 Thread Rishabh Bhardwaj
…017 at 10:11 AM Rishabh Bhardwaj <rbnex...@gmail.com> wrote: Hi All, I am exploring time-series forecasting with Spark. I have some questions regarding this: 1. Is there any library/package out there in community of *Se…

Time-Series Analysis with Spark

2017-01-11 Thread Rishabh Bhardwaj
Hi All, I am exploring time-series forecasting with Spark. I have some questions regarding this: 1. Is there any library/package in the community that implements *Seasonal ARIMA* on Spark? 2. Is there any implementation of the Dynamic Linear Model (*DLM*) on Spark? 3. What are the…

Re: write and call UDF in spark dataframe

2016-07-20 Thread Rishabh Bhardwaj
Hi Divya, "from_unixtime" already exists in org.apache.spark.sql.functions; Rabin has used it in the SQL query. If you want to use it in the DataFrame DSL, you can try it like this: val new_df = df.select(from_unixtime($"time").as("newtime")) Thanks, Rishabh. On Wed, Jul 20, 2016 at 4:21…
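A self-contained sketch of the suggestion above, assuming a DataFrame with a `time` column holding Unix epoch seconds (the column name comes from the thread; the surrounding setup and sample values are illustrative, and `SparkSession` assumes Spark 2.x):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.from_unixtime

val spark = SparkSession.builder()
  .appName("from_unixtime-demo")
  .master("local[*]")
  .getOrCreate()
import spark.implicits._

// A toy DataFrame with Unix timestamps (seconds since the epoch)
val df = Seq(1469000000L, 1469086400L).toDF("time")

// from_unixtime converts epoch seconds into a formatted timestamp string
val new_df = df.select(from_unixtime($"time").as("newtime"))
new_df.show()
```

The same function is also callable from SQL ("SELECT from_unixtime(time) FROM t"), which is the route Rabin took in the thread.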

Deploying ML Pipeline Model

2016-07-01 Thread Rishabh Bhardwaj
Hi All, I am looking for ways to deploy an ML Pipeline model in production. Spark has already proved to be one of the best frameworks for model training and creation, but once the ML Pipeline model is ready, how can I deploy it outside a Spark context? MLlib models have a toPMML method, but today…
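For context on the toPMML method mentioned above, a minimal sketch of spark.mllib PMML export, assuming Spark 1.4+ where KMeansModel (among a few other spark.mllib models) mixes in PMML export; the training data and parameters here are illustrative, and the newer ml Pipeline API did not offer this at the time of the thread:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.mllib.clustering.KMeans
import org.apache.spark.mllib.linalg.Vectors

val sc = new SparkContext("local[*]", "pmml-export-demo")

// Toy training data: a few 2-D points in two clusters
val data = sc.parallelize(Seq(
  Vectors.dense(0.0, 0.0), Vectors.dense(0.1, 0.1),
  Vectors.dense(9.0, 9.0), Vectors.dense(9.1, 9.1)
))

// Train a k-means model, then export it as PMML XML; the XML
// can be scored outside Spark by any PMML-compatible engine
val model = KMeans.train(data, 2, 10)
println(model.toPMML())
```

The PMML string can also be written to a path or an OutputStream via the overloads of toPMML.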

Re: Unable to register UDF with StructType

2015-11-06 Thread Rishabh Bhardwaj
…ruct as input, use Row as the type. On Thu, Nov 5, 2015 at 9:53 PM, Rishabh Bhardwaj <rbnex...@gmail.com> wrote: Hi all, I am unable to register a UDF with return type as StructType: scala> def test(r:StructType):StructT…

Unable to register UDF with StructType

2015-11-05 Thread Rishabh Bhardwaj
Hi all, I am unable to register a UDF with return type as StructType: scala> def test(r:StructType):StructType = { r } test: (r: org.apache.spark.sql.types.StructType)org.apache.spark.sql.types.StructType scala> sqlContext.udf.register("test", test _) scala.MatchError: …
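The reply above points at the underlying issue: `StructType` describes a *schema*, while struct *values* are passed to a UDF as a `Row`. A minimal sketch of the suggested fix, assuming Spark 1.x's `sqlContext` and an illustrative struct column whose first field is a string (the UDF name and table are hypothetical):

```scala
import org.apache.spark.sql.Row

// Struct values reach a UDF as Row, not StructType;
// register the UDF against Row and pull fields out by position
sqlContext.udf.register("firstField", (r: Row) => r.getString(0))

// Hypothetical usage against a table t with a struct column s:
// sqlContext.sql("SELECT firstField(s) FROM t")
```

To *return* a struct from a UDF, the usual pattern is to return a case class (or a tuple), which Spark maps to a struct column automatically.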

DataFrame column structure change

2015-08-07 Thread Rishabh Bhardwaj
Hi all, I want to have some nesting structure from the existing columns of the dataframe. For that, I am trying to transform a DF in the following way, but couldn't do it. scala> df.printSchema root |-- a: string (nullable = true) |-- b: string (nullable = true) |-- c: string (nullable = true)…

Re: DataFrame column structure change

2015-08-07 Thread Rishabh Bhardwaj
I am doing it by creating a new DataFrame out of the fields to be nested and then joining it with the original DF. Looking for a more optimized solution here. On Fri, Aug 7, 2015 at 2:06 PM, Rishabh Bhardwaj <rbnex...@gmail.com> wrote: Hi all, I want to have some nesting structure from the existing…
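A common way to nest existing columns without the separate-DataFrame-plus-join approach described above is the `struct` function; a minimal sketch assuming the `a`/`b`/`c` string schema printed in the original post (the nested column name `bc` is illustrative):

```scala
import org.apache.spark.sql.functions.struct

// Pack b and c into a single nested struct column in one select,
// so no second DataFrame or join is needed
val nested = df.select($"a", struct($"b", $"c").as("bc"))
nested.printSchema()
```

The resulting schema has `a` at the top level and a `bc` struct containing `b` and `c`; individual fields remain reachable as `$"bc.b"`.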

Cast Error DataFrame/RDD doing group by and case class

2015-07-30 Thread Rishabh Bhardwaj
Hi, I have just started learning DF in Spark and encountered the following error: I am creating the following: *case class A(a1:String,a2:String,a3:String)* *case class B(b1:String,b2:String,b3:String)* *case class C(key:A,value:Seq[B])* Now I have to do a DF with struct (key: {..}, value: {..}…
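The error text is truncated in the preview, but as a sketch of the general shape of the problem (grouping records back into the nested case class `C`), under the assumption that the data starts as an RDD of `(A, B)` pairs — the sample values are illustrative:

```scala
import org.apache.spark.SparkContext

case class A(a1: String, a2: String, a3: String)
case class B(b1: String, b2: String, b3: String)
case class C(key: A, value: Seq[B])

val sc = new SparkContext("local[*]", "groupby-case-class-demo")

val pairs = sc.parallelize(Seq(
  (A("x", "y", "z"), B("1", "2", "3")),
  (A("x", "y", "z"), B("4", "5", "6"))
))

// groupByKey yields (A, Iterable[B]); materialize the values as a Seq
// before constructing C, since Iterable is not the declared field type
val grouped = pairs.groupByKey().map { case (k, vs) => C(k, vs.toSeq) }
grouped.collect().foreach(println)
```

One frequent source of cast/match errors in this situation is keeping the grouped values as an `Iterable` (or a shell-defined anonymous type) rather than converting to the `Seq[B]` the case class declares.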