Here's a fragment of code that intends to convert a Dataset of features
into a Vector of Doubles for use as the features column for SparkML's
DecisionTree algorithm. My current problem is the .map() operation, which
refuses to compile, with the Eclipse error "The method map(Function1,
Encoder) in
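For context, the Java Dataset.map is overloaded: one variant takes Scala's Function1, the other takes a MapFunction plus an Encoder, and a bare lambda is ambiguous between them. Casting the lambda to MapFunction and passing an explicit Encoder usually resolves the error. A minimal, self-contained sketch (the toy column names and the Kryo encoder choice are my assumptions, not from the original mail):

```java
import java.util.Arrays;

import org.apache.spark.api.java.function.MapFunction;
import org.apache.spark.ml.linalg.Vector;
import org.apache.spark.ml.linalg.Vectors;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.StructType;

public class MapToVectorSketch {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("map-to-vector")   // placeholder app name
                .master("local[*]")
                .getOrCreate();

        // Toy frame standing in for the real feature Dataset.
        Dataset<Row> rows = spark.createDataFrame(
                Arrays.asList(
                        RowFactory.create(1.0, 2.0),
                        RowFactory.create(3.0, 4.0)),
                new StructType().add("x", "double").add("y", "double"));

        // Cast the lambda to MapFunction so the overload is unambiguous,
        // and supply the Encoder explicitly. ml.linalg.Vector has no
        // built-in encoder, so fall back to Kryo here.
        Dataset<Vector> features = rows.map(
                (MapFunction<Row, Vector>) r ->
                        Vectors.dense(r.getDouble(0), r.getDouble(1)),
                Encoders.kryo(Vector.class));

        features.show(false);
        spark.stop();
    }
}
```

For feeding DecisionTree, the more common route is VectorAssembler, which builds the features column from numeric input columns without a hand-written map.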
I'm experimenting with Spark 2.0.1 for the first time and hitting a problem
right out of the gate.
My main routine starts with this, which I think is the standard idiom:
SparkSession sparkSession = SparkSession
.builder()
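The snippet is cut off above; for reference, the builder chain usually continues as below. This is a sketch of the standard idiom only, and the app name and local master are placeholders, not taken from the original mail:

```java
import org.apache.spark.sql.SparkSession;

public class SessionSketch {
    public static void main(String[] args) {
        SparkSession sparkSession = SparkSession
                .builder()
                .appName("example")   // placeholder name
                .master("local[*]")   // in-process mode; usually omitted when submitting to a cluster
                .getOrCreate();

        sparkSession.stop();
    }
}
```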
Mitch: could you elaborate on: "You can practically run most of your unit
testing with local mode, and deploy a variety of options including running
SQL queries, reading data from CSV files, writing to HDFS, creating Hive
tables (including ORC tables), and doing Spark Streaming."
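To make the claim concrete: a minimal sketch of what local-mode use looks like, reading a CSV file and querying it with SQL. The file path, app name, and header option here are assumptions for illustration; local[*] runs Spark entirely inside the JVM, which is what makes it usable from plain unit tests:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class LocalModeSketch {
    public static void main(String[] args) {
        // No cluster needed: the driver and executors share this JVM.
        SparkSession spark = SparkSession.builder()
                .appName("local-test")   // placeholder name
                .master("local[*]")
                .getOrCreate();

        // Read a CSV file (hypothetical path) and query it with SQL.
        Dataset<Row> df = spark.read()
                .option("header", "true")
                .csv("src/test/resources/events.csv");
        df.createOrReplaceTempView("events");
        spark.sql("SELECT COUNT(*) FROM events").show();

        spark.stop();
    }
}
```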
In particular, are you
chema() { return SCHEMA$; }. Does anybody have any idea what's causing this
and how to get around it?
Dr. Brad J. CoxCell: 703-594-1883 Skype: dr.brad.cox
> On Apr 10, 2016, at 12:51 PM, Brad Cox wrote:
I'm getting a StackOverflowError from inside the createDataFrame call in this
example. It originates in Scala code involving Java type inference, which
calls itself in an infinite loop.
final EventParser parser = new EventParser();
JavaRDD eventRDD = sc.textFile(path)
.map(new Function(
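The recursion is typically triggered by Spark's Java bean introspection walking a self-referential type; the getSchema()/SCHEMA$ method visible in the trace suggests an Avro-generated class, whose Schema field refers back into itself. One workaround is to map each parsed record onto a plain bean with only simple fields before calling createDataFrame. A sketch under that assumption (the EventBean fields are hypothetical stand-ins for the real Event's data):

```java
import java.io.Serializable;
import java.util.Arrays;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class CreateDataFrameSketch {
    // Plain bean with only simple fields -- nothing self-referential
    // for Spark's Java type inference to recurse into.
    public static class EventBean implements Serializable {
        private String name;
        private long timestamp;
        public EventBean() {}
        public EventBean(String name, long timestamp) {
            this.name = name;
            this.timestamp = timestamp;
        }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
        public long getTimestamp() { return timestamp; }
        public void setTimestamp(long t) { this.timestamp = t; }
    }

    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("bean-df")   // placeholder name
                .master("local[*]")
                .getOrCreate();

        // In the original code this list would come from eventRDD.map(...)
        // converting each parsed record into an EventBean first.
        Dataset<Row> df = spark.createDataFrame(
                Arrays.asList(new EventBean("a", 1L), new EventBean("b", 2L)),
                EventBean.class);
        df.show();
        spark.stop();
    }
}
```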