Oh, no worries at all. If you want, I'd be glad to make updates and submit a
PR for anything I find, eh?!
On Fri, Feb 20, 2015 at 12:18 Michael Armbrust <mich...@databricks.com>
wrote:

> Yeah, sorry.  The programming guide has not been updated for 1.3.  I'm
> hoping to get to that this weekend / next week.
>
> On Fri, Feb 20, 2015 at 9:55 AM, Denny Lee <denny.g....@gmail.com> wrote:
>
>> Quickly reviewing the latest SQL Programming Guide
>> <https://github.com/apache/spark/blob/master/docs/sql-programming-guide.md>
>> (on GitHub), I had a couple of quick questions:
>>
>> 1) Do we still need to instantiate the SQLContext as per
>> // sc is an existing SparkContext.
>> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
>>
>> Within Spark 1.3 the sqlContext is already available in the shell, so we
>> probably do not need to make this call.
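>>
>> (For what it's worth, a rough sketch of the standalone-app case, where the
>> call is still needed; the SparkConf settings below are just placeholders:)
>>
>> import org.apache.spark.{SparkConf, SparkContext}
>> import org.apache.spark.sql.SQLContext
>>
>> // Only needed outside the shell; in the 1.3 shell, sc and sqlContext
>> // are already created for you.
>> val conf = new SparkConf().setAppName("example")  // placeholder app name
>> val sc = new SparkContext(conf)
>> val sqlContext = new SQLContext(sc)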
>>
>> 2) Importing org.apache.spark.sql._ should bring in the SQL data types,
>> struct types, and Row:
>> // Import Spark SQL data types and Row.
>> import org.apache.spark.sql._
>>
>> Currently with Spark 1.3 RC1, it appears org.apache.spark.sql._ only
>> brings in Row:
>>
>> scala> import org.apache.spark.sql._
>> import org.apache.spark.sql._
>>
>> scala> val schema =
>>      |   StructType(
>>      |     schemaString.split(" ").map(fieldName => StructField(fieldName, StringType, true)))
>> <console>:25: error: not found: value StructType
>>          StructType(
>>
>> But if I also import org.apache.spark.sql.types._, it works:
>>
>> scala> import org.apache.spark.sql.types._
>> import org.apache.spark.sql.types._
>>
>> scala> val schema =
>>      |   StructType(
>>      |     schemaString.split(" ").map(fieldName => StructField(fieldName, StringType, true)))
>> schema: org.apache.spark.sql.types.StructType =
>> StructType(StructField(DeviceMake,StringType,true),
>> StructField(Country,StringType,true))
>>
>> Wondering if this is by design, or whether a quick documentation / package
>> update is warranted.
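>>
>> For reference, here's a minimal end-to-end sketch of what worked for me on
>> 1.3 RC1 with both imports (the schemaString value is just the field list
>> inferred from my test output above):
>>
>> import org.apache.spark.sql._        // brings in Row
>> import org.apache.spark.sql.types._  // brings in StructType, StructField, StringType
>>
>> val schemaString = "DeviceMake Country"  // field list from my test
>> val schema = StructType(
>>   schemaString.split(" ").map(fieldName =>
>>     StructField(fieldName, StringType, nullable = true)))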
