Re: [Spark 2.0] Error during codegen for Java POJO

2016-08-05 Thread Andy Grove
I tracked this down in the end. It turns out the POJO was not actually defined as 'public' for some reason. It seems like this should be detected as an error prior to generating code though? Thanks, Andy. -- Andy Grove Chief Architect www.agildata.com On Fri, Aug 5, 2016 at 8:28 AM, Andy
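The fix Andy describes (making the POJO public) matters because the bean encoder generates bytecode that calls the class's getters and setters at runtime, so visibility problems surface at codegen time rather than compile time. A minimal sketch, assuming Spark 2.x in local mode; the `Person` class and its fields are illustrative, not from the original thread:

```scala
import org.apache.spark.sql.{Encoders, SparkSession}

// Encoders.bean requires a public class with a no-arg constructor and
// getter/setter pairs; a non-public class fails during code generation.
class Person() {
  @scala.beans.BeanProperty var name: String = _
  @scala.beans.BeanProperty var age: Int = _
}

object BeanEncoderExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("bean-encoder-sketch")
      .master("local[*]")
      .getOrCreate()

    val p = new Person()
    p.setName("test")
    p.setAge(42)

    // The generated code reflects over Person's public bean methods.
    val ds = spark.createDataset(Seq(p))(Encoders.bean(classOf[Person]))
    ds.show()
    spark.stop()
  }
}
```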

Re: Java and SparkSession

2016-08-05 Thread Andy Grove
Ah, you still have to use the JavaSparkContext rather than using the sparkSession.sparkContext ... that makes sense. Thanks for your help. Thanks, Andy. -- Andy Grove Chief Architect www.agildata.com On Fri, Aug 5, 2016 at 12:03 PM, Everett Anderson <ever...@nuna.com> wrote: > Hi
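The wrapping step Andy describes can be sketched as follows; `JavaSparkContext.fromSparkContext` wraps the session's existing context rather than starting a second one (app name and master are illustrative):

```scala
import org.apache.spark.api.java.JavaSparkContext
import org.apache.spark.sql.SparkSession

object WrapContextExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("wrap-context-sketch")
      .master("local[*]")
      .getOrCreate()

    // SparkSession exposes the Scala SparkContext; Java-flavoured RDD
    // APIs still expect a JavaSparkContext, which can wrap it directly.
    val jsc: JavaSparkContext =
      JavaSparkContext.fromSparkContext(spark.sparkContext)

    println(jsc.appName)
    spark.stop()
  }
}
```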

[Spark 2.0] Error during codegen for Java POJO

2016-08-05 Thread Andy Grove
…file a JIRA for. Thanks, Andy. -- Andy Grove Chief Architect www.agildata.com

Java and SparkSession

2016-08-04 Thread Andy Grove
From some brief experiments using Java with Spark 2.0, it looks like Java developers should stick to SparkContext and SQLContext rather than using the new SparkSession API? It would be great if someone could confirm whether that is the intention. Thanks, Andy. -- Andy Grove Chief Archit

Re: Regression in Java RDD sortBy() in Spark 2.0

2016-08-04 Thread Andy Grove
…is better for sure. Thanks, Andy. -- Andy Grove Chief Architect AgilData - Simple Streaming SQL that Scales www.agildata.com On Thu, Aug 4, 2016 at 10:25 PM, Andy Grove <andy.gr...@agildata.com> wrote: > Hi, > > I have some working Java code with Spark 1.6 that I am upgrading to Sp

Regression in Java RDD sortBy() in Spark 2.0

2016-08-04 Thread Andy Grove
…ing wrong or is this a regression? Thanks, Andy. -- Andy Grove Chief Architect www.agildata.com

Re: Inferring schema from GenericRowWithSchema

2016-05-17 Thread Andy Grove
On Tue, May 17, 2016 at 11:48 AM, Andy Grove <andy.gr...@agildata.com> > wrote: > >> >> Hi, >> >> I have a requirement to create types dynamically in Spark and then >> instantiate those types from Spark SQL via a UDF. >> >> I tried doing

Inferring schema from GenericRowWithSchema

2016-05-17 Thread Andy Grove
Hi, I have a requirement to create types dynamically in Spark and then instantiate those types from Spark SQL via a UDF. I tried doing the following:

val addressType = StructType(List(
  new StructField("state", DataTypes.StringType),
  new StructField("zipcode", DataTypes.IntegerType)
))
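The truncated snippet above can be reconstructed as a runnable sketch. The UDF wiring is an assumption, since the original message is cut off: registering the function through `functions.udf` with an explicit return type is one way to hand Catalyst a schema it cannot infer from a generic `Row`:

```scala
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.functions.udf
import org.apache.spark.sql.types.{DataTypes, StructField, StructType}

object DynamicTypeExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("dynamic-type-sketch")
      .master("local[*]")
      .getOrCreate()

    // The dynamically built struct type from the original message.
    val addressType = StructType(List(
      StructField("state", DataTypes.StringType),
      StructField("zipcode", DataTypes.IntegerType)
    ))

    // Assumption: return a generic Row and declare the schema explicitly,
    // because Catalyst cannot infer a schema from Row by reflection.
    val makeAddress = udf((state: String, zip: Int) => Row(state, zip), addressType)

    import spark.implicits._
    val df = Seq(("CO", 80301)).toDF("state", "zip")
      .withColumn("address", makeAddress($"state", $"zip"))
    df.show()
    spark.stop()
  }
}
```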

Re: Use case for RDD and Data Frame

2016-02-16 Thread Andy Grove
This blog post should be helpful http://www.agildata.com/apache-spark-rdd-vs-dataframe-vs-dataset/ Thanks, Andy. -- Andy Grove Chief Architect AgilData - Simple Streaming SQL that Scales www.agildata.com On Tue, Feb 16, 2016 at 9:05 AM, Ashok Kumar <ashok34...@yahoo.com.invalid>
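A minimal sketch of the three APIs the linked post compares, over the same data (values and column names are illustrative):

```scala
import org.apache.spark.sql.SparkSession

object ThreeApisExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("rdd-df-ds-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // RDD: functional, low-level API with no Catalyst optimisation.
    val rdd = spark.sparkContext.parallelize(Seq(("a", 1), ("b", 2)))

    // DataFrame: Row-based and schema-aware, planned by Catalyst.
    val df = rdd.toDF("key", "value")

    // Dataset: a typed view over the same optimised engine.
    val ds = df.as[(String, Int)]

    ds.show()
    spark.stop()
  }
}
```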

Re: Scala MatchError in Spark SQL

2016-01-20 Thread Andy Grove
…-dataframe-vs-dataset/ Thanks, Andy. -- Andy Grove Chief Architect AgilData - Simple Streaming SQL that Scales www.agildata.com On Wed, Jan 20, 2016 at 7:07 AM, raghukiran <raghuki...@gmail.com> wrote: > Hi, > > I created a custom UserDefinedType in Java as follows: >

Re: Scala MatchError in Spark SQL

2016-01-20 Thread Andy Grove
I'm talking about implementing CustomerRecord as a scala case class, rather than as a Java class. Scala case classes implement the scala.Product trait, which Catalyst is looking for. Thanks, Andy. -- Andy Grove Chief Architect AgilData - Simple Streaming SQL that Scales www.agildata.com
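The suggestion can be sketched as follows; `CustomerRecord`'s fields are illustrative, since the original class definition is not in the thread. A case class implements `scala.Product` automatically, which is what Catalyst's reflection-based encoder matches on, so the `MatchError` disappears:

```scala
import org.apache.spark.sql.SparkSession

// Case classes implement scala.Product for free; a plain Java class
// does not, which is what triggers the MatchError in Catalyst.
case class CustomerRecord(id: Long, name: String)

object CaseClassExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("case-class-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val df = Seq(CustomerRecord(1L, "a"), CustomerRecord(2L, "b")).toDF()
    df.printSchema()
    spark.stop()
  }
}
```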

Re: Scala MatchError in Spark SQL

2016-01-20 Thread Andy Grove
…some pointers. > > Thanks, > Raghu > > On Wed, Jan 20, 2016 at 12:25 PM, Andy Grove <andy.gr...@agildata.com> > wrote: > >> I'm talking about implementing CustomerRecord as a scala case class, >> rather than as a Java class. Scala case classes implement th

Re: Scala MatchError in Spark SQL

2016-01-20 Thread Andy Grove
Honestly, moving to Scala and using case classes is the path of least resistance in the long term. Thanks, Andy. -- Andy Grove Chief Architect AgilData - Simple Streaming SQL that Scales www.agildata.com On Wed, Jan 20, 2016 at 10:19 AM, Raghu Ganti <raghuki...@gmail.com> wrote: >