Hi All,
Does Spark 2.0 SQL support the full ANSI SQL query standard?
Thanks,
Rishabh.
Hi All,
I am exploring time-series forecasting with Spark.
I have some questions regarding this:
1. Is there any library/package out there in community of *Seasonal ARIMA*
implementation in Spark?
2. Is there any implementation of Dynamic Linear Model (*DLM*) on Spark?
3. What are the
Hi Divya,
There is already a "from_unixtime" function in org.apache.spark.sql.functions;
Rabin has used it in the SQL query. If you want to use it in the DataFrame DSL,
you can try it like this:
val new_df = df.select(from_unixtime($"time").as("newtime"))
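For completeness, a minimal self-contained sketch of that DSL variant using the Spark 2.x API (the DataFrame, the column name "time", and the epoch values are illustrative assumptions, not from the original thread):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.from_unixtime

val spark = SparkSession.builder().master("local[*]").appName("from-unixtime-demo").getOrCreate()
import spark.implicits._

// Illustrative data: epoch seconds in a column named "time"
val df = Seq(1469000000L, 1469000060L).toDF("time")

// from_unixtime renders epoch seconds as a "yyyy-MM-dd HH:mm:ss" string column
val new_df = df.select(from_unixtime($"time").as("newtime"))
new_df.show(false)
```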
Thanks,
Rishabh.
On Wed, Jul 20, 2016 at 4:21
Hi All,
I am looking for ways to deploy an ML Pipeline model in production.
Spark has already proved to be one of the best frameworks for model
training and creation, but once the ML pipeline model is ready, how can I
deploy it outside the Spark context?
The MLlib model has a toPMML method, but today
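One option I'm aware of (a sketch, not an answer from this thread): Spark 2.x ML pipelines can be persisted with write/save and reloaded with PipelineModel.load, e.g. in a separate lightweight serving process. The path and the trivial one-stage pipeline below are illustrative:

```scala
import org.apache.spark.ml.{Pipeline, PipelineModel}
import org.apache.spark.ml.feature.Tokenizer
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").appName("persist-demo").getOrCreate()
import spark.implicits._

val train = Seq((0L, "spark rocks")).toDF("id", "text")

// A trivial one-stage pipeline, just to show the save/load round trip
val pipeline = new Pipeline().setStages(Array(
  new Tokenizer().setInputCol("text").setOutputCol("words")))
val model = pipeline.fit(train)

// Persist to a path, then reload (e.g. in a serving process)
model.write.overwrite().save("/tmp/demo-pipeline-model")
val reloaded = PipelineModel.load("/tmp/demo-pipeline-model")
```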
Hi all,
I am unable to register a UDF with return type as StructType:
scala> def test(r:StructType):StructType = { r }
test: (r: org.apache.spark.sql.types.StructType)org.apache.spark.sql.types.StructType

scala> sqlContext.udf.register("test", test _)
scala.MatchError:
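A workaround I'm aware of (a sketch, not taken from this thread): instead of declaring StructType as the UDF's return type, have the Scala function return a case class, and Spark infers the corresponding struct type. The case class and column names below are illustrative:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf

val spark = SparkSession.builder().master("local[*]").appName("struct-udf-demo").getOrCreate()
import spark.implicits._

// Returning a case class makes the UDF's result a struct<value:string,length:int>
case class Wrapped(value: String, length: Int)
val wrap = udf((s: String) => Wrapped(s, s.length))

val df = Seq("hello", "spark").toDF("s")
val out = df.select(wrap($"s").as("wrapped"))
out.printSchema()
```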
Hi all,
I want to have some nesting structure from the existing columns of
the dataframe.
For that, I am trying to transform a DF in the following way, but couldn't
do it.
scala> df.printSchema
root
|-- a: string (nullable = true)
|-- b: string (nullable = true)
|-- c: string (nullable = true)
I am doing it by creating a new data frame out of the fields to be nested
and then joining it with the original DF.
Looking for an optimized solution here.
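One way to avoid the extra join (a sketch using the built-in struct function; the sample data is made up, column names follow the schema above):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.struct

val spark = SparkSession.builder().master("local[*]").appName("nest-demo").getOrCreate()
import spark.implicits._

val df = Seq(("a1", "b1", "c1")).toDF("a", "b", "c")

// struct() nests existing columns in a single projection, no join needed;
// "ab" becomes a struct<a:string,b:string> column
val nested = df.select(struct($"a", $"b").as("ab"), $"c")
nested.printSchema()
```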
Hi,
I have just started learning DataFrames in Spark and encountered the following
error.
I am creating the following:
case class A(a1:String,a2:String,a3:String)
case class B(b1:String,b2:String,b3:String)
case class C(key:A,value:Seq[B])
Now I have to do a DF with struc
(key :{..},value:{..}
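In case it helps, a minimal sketch of getting such a nested schema from those case classes (the data values are made up; this assumes the case classes are defined at the top level, e.g. pasted as one block in spark-shell):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").appName("nested-cc-demo").getOrCreate()
import spark.implicits._

case class A(a1: String, a2: String, a3: String)
case class B(b1: String, b2: String, b3: String)
case class C(key: A, value: Seq[B])

// A Seq of the outer case class converts directly to a DataFrame:
// "key" becomes a struct column, "value" an array of structs
val df = Seq(C(A("x", "y", "z"), Seq(B("p", "q", "r")))).toDF()
df.printSchema()
```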