I haven’t seen this but it may be a bug in Typesafe Config, since this is
serializing a Config object. We don’t actually use Typesafe Config ourselves.
Do you have any nulls in the data itself by any chance? And do you know how
that Config object is getting there?
Matei
On Apr 9, 2014, at
Ok, I thought it might be closing over the config object. I am using config
for job configuration, but extracting vals from it, so I'm not sure why, as I
thought I'd avoided closing over it. Will go back to the source and see where
it is creeping in.
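The fix Nick describes can be sketched as follows. This is a minimal illustration, not code from the thread: `AppConfig` stands in for a non-serializable Typesafe Config instance, and all names (`threshold`, `make`, `serializable`) are hypothetical. The point is that copying a val out of the config object *before* building the closure means the closure captures only the primitive, not the whole object.

```scala
// Hedged sketch: why closing over a config object breaks Spark's closure
// serialization, and how extracting vals first avoids it. AppConfig is an
// illustrative stand-in for a non-serializable Typesafe Config instance.
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

class AppConfig(val threshold: Int) // deliberately NOT Serializable

object ClosureDemo {
  // True if `obj` survives Java serialization, which is roughly what Spark
  // does to every closure it ships to executors.
  def serializable(obj: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(obj)
      true
    } catch {
      case _: NotSerializableException => false
    }

  def make(): (Int => Boolean, Int => Boolean) = {
    val conf = new AppConfig(10)

    // Bad: the lambda references `conf`, so the whole AppConfig is captured
    // and serialization throws NotSerializableException.
    val bad: Int => Boolean = x => x > conf.threshold

    // Good: extract the val first; the closure captures only the Int.
    val threshold = conf.threshold
    val good: Int => Boolean = x => x > threshold

    (bad, good)
  }
}
```

The same pattern applies inside a Spark job: pull every value you need out of the config into local vals before the `map`/`filter` lambda, and never mention the config object itself inside the lambda body.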
On Thu, Apr 10, 2014 at 8:42 AM, Matei Zaharia
There was a closure over the config object lurking around, but in any case
upgrading to Typesafe Config 1.2.0 did the trick, as it seems to have been a
bug in Typesafe Config.
Thanks Matei!
On Thu, Apr 10, 2014 at 8:46 AM, Nick Pentreath nick.pentre...@gmail.com wrote:
Hi,
I'm using Spark 0.9.0.
When calling saveAsTextFile on an RDD loaded from a custom Hadoop InputFormat
(via newAPIHadoopRDD), I get the error below.
If I call count, I get the correct number of records, so the InputFormat is
being read correctly... the issue only appears when trying to