Thank you Jakob, you were bang on the money. Jorge, apologies, my snippet
was partial and I hadn't made it equivalent to my failing test.
For reference, and for all who pass this way, here is a working
solution with passing tests without inferring a schema; it was the second
test that had been failing.
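The solution itself is not reproduced in this excerpt, so here is only a hedged sketch of the approach described: derive the schema from the case class so no inference pass is needed. Names like `Purchase` and `purchases.json` are illustrative, and the sketch uses the Spark 2.x `SparkSession` API (the 2016-era equivalent would go through `SQLContext`):

```scala
import org.apache.spark.sql.{Encoders, SparkSession}

// Illustrative case class matching the JSON fields shown later in the thread.
case class Purchase(customer_id: String, product_id: Long)

val spark = SparkSession.builder().master("local[*]").appName("no-infer").getOrCreate()
import spark.implicits._

// Derive the schema from the case class encoder and hand it to the reader,
// so read.json does not have to scan the input to infer one.
val purchases = spark.read
  .schema(Encoders.product[Purchase].schema)
  .json("purchases.json")   // illustrative path
  .as[Purchase]
```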
I think the issue is that the `read.json` function has no idea of the
underlying schema; in fact, the documentation
(http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.DataFrameReader)
says:
> Unless the schema is specified using schema function, this function goes
> through the input once to determine the input schema.
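Concretely, that inference pass can be skipped by handing the reader a schema up front via the `schema` function the documentation mentions. A sketch, where the field names follow the JSON shown later in the thread and `spark` is assumed to be an existing SparkSession:

```scala
import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}

// Spell out the schema by hand so read.json skips its inference scan.
val purchaseSchema = StructType(Seq(
  StructField("customer_id", StringType, nullable = true),
  StructField("product_id", LongType, nullable = true)
))

val df = spark.read.schema(purchaseSchema).json("purchases.json")  // illustrative path
```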
Hi Anthony,
I tried the code myself. I think the problem is in the jsonStr:
I made it work with: val jsonStr = """{"customer_id":
"3ee066ab571e03dd5f3c443a6c34417a","product_id": 3}"""
Or is it the "," after your 3, or the "\n"?
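For what it's worth, a trailing "," after the 3 would make the line invalid JSON. With the reader's default PERMISSIVE mode, Spark surfaces such a line under a `_corrupt_record` column instead of the expected fields, which is usually what breaks a later `.as[...]` call. A sketch, assuming an existing SparkSession named `spark` and the `json(Dataset[String])` overload from Spark 2.2+:

```scala
import spark.implicits._

val good = Seq("""{"customer_id": "3ee066ab571e03dd5f3c443a6c34417a","product_id": 3}""")
val bad  = Seq("""{"customer_id": "3ee066ab571e03dd5f3c443a6c34417a","product_id": 3,}""")

// The well-formed line infers the two expected columns...
spark.read.json(good.toDS).printSchema()
// ...while the trailing comma leaves only a _corrupt_record column.
spark.read.json(bad.toDS).printSchema()
```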
Regards
> On 22/02/2016, at 15:42, Anthony Brew wrote:
>
> Hi,
>
Hi,
I'm trying to parse JSON data into a case class using the
DataFrame.as[] function, but I am hitting an unusual error and the interweb
isn't solving my pain, so I thought I would reach out for help. I've truncated
my code a little here to make it readable, but the error is in full.
My case class lo