This sounds like a bug. What version of Spark are you on, and can you provide
the stack trace?
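
In the meantime, a possible workaround (just a rough sketch, assuming Spark 1.x
with a SQLContext, newline-delimited JSON input, your predefined `schema`, and
the json4s library that ships with Spark; the path below is a placeholder)
would be to pre-filter lines that aren't parseable JSON before handing the RDD
to the reader:

```scala
import scala.util.Try
import org.json4s._
import org.json4s.jackson.JsonMethods._

// Read the raw log files as plain text, one JSON record per line.
val raw = sc.textFile("/path/to/logs/*.json")  // placeholder path

// Keep only lines that parse as JSON at all. This is a well-formedness
// check, not a schema check; it simply drops lines that aren't valid JSON.
val parseable = raw.filter(line => Try(parse(line)).isSuccess)

// Hand the cleaned RDD[String] to the JSON reader with your schema.
val df = sqlContext.read.schema(schema).json(parseable)
// ...then df.selectExpr(...) as before.
```

That doesn't explain the MatchError itself, though, so the version and stack
trace would still help.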

On Sun, Aug 2, 2015 at 11:27 AM, fuellee lee <lifuyu198...@gmail.com> wrote:

> I'm trying to process a bunch of large JSON log files with Spark, but it
> fails every time with `scala.MatchError`, whether I give it a schema or not.
>
> I just want to skip lines that don't match the schema, but I can't find
> how to do that in the Spark docs.
>
> I know that writing a JSON parser and mapping it over the file RDD would
> get the job done, but I want to use
> `sqlContext.read.schema(schema).json(fileNames).selectExpr(...)` because
> it's much easier to maintain.
>
> thanks
>