>>> Split the unparsed date column out into separate fields and try to see
>>> whether a year field isn't matching the expected values.
>>>
>>> Thanks
>>>
>>> Mike
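Mike's check can be sketched in plain Python (the `-` separator and the 1900–2100 "expected" range are illustrative assumptions, not from the thread):

```python
def check_year(date_str, low=1900, high=2100):
    """Split a raw date string into fields and flag suspicious years.

    Returns (year, month, day), or None when the string is unparseable
    or the year falls outside the expected range -- the kind of value
    that later breaks the per-row conversion inside PySpark.
    """
    try:
        year, month, day = [int(p) for p in date_str.split("-")]
    except (ValueError, AttributeError):
        return None  # unparseable row
    if not (low <= year <= high):
        return None  # out-of-range year -- collect and inspect these rows
    return (year, month, day)

# Rows like "0016-09-08" (a two-digit year mis-parsed as year 16) are a
# typical cause of "ValueError: year out of range".
print(check_year("2016-09-08"))  # (2016, 9, 8)
print(check_year("0016-09-08"))  # None
```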
On Thu, Sep 8, 2016 at 8:15 AM, Daniel Lopes <dan...@onematch.com.br> wrote:
Thanks,

I *tested* the function offline and it works.
I also tested with select * from after converting the data, and the new data
looks good, *but* if I *register it as a temp table* to *join another table* it
still shows *the same error*:

ValueError: year out of range

Best,

*Daniel Lopes*
Chief Data and Analytics
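One plausible reason the UDF "works offline" but fails after registering the temp table: Spark evaluates lazily, so the conversion only actually runs over every row when the join executes, and a single malformed row is enough to fail the task. A defensive wrapper along these lines (function name, date format, and year bounds are illustrative, not Daniel's actual code) returns None, i.e. SQL NULL, instead of raising:

```python
import datetime

def safe_to_date(raw):
    """Convert a raw string to datetime.date, returning None on bad input.

    Returning None (-> NULL in Spark SQL) instead of raising keeps one
    malformed row from killing the whole stage; the NULL rows can then
    be filtered out and inspected separately.
    """
    try:
        d = datetime.datetime.strptime(raw, "%Y-%m-%d").date()
    except (ValueError, TypeError):
        return None  # unparseable or non-string input
    # Guard against years the per-row conversion in the traceback
    # (time.mktime on the row's timetuple) cannot represent.
    if not (1902 <= d.year <= 2037):
        return None
    return d

print(safe_to_date("2016-09-08"))  # 2016-09-08
print(safe_to_date("0016-09-08"))  # None
```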
  File "/usr/local/src/spark160master/spark-1.6.0-bin-2.6.0/python/lib/pyspark.zip/pyspark/worker.py", line 106, in process
    serializer.dump_stream(func(split_index, iterator), outfile)
  File "/usr/local/src/spark160master/spark-1.6.0-bin-2.6.0/python/lib/pyspark.zip/pyspa…", line 1563, in
    func = lambda _, it: map(lambda x: returnType.toInternal(f(*x)), it)
  File "/usr/local/src/spark160master/spark-1.6.0-bin-2.6.0/python/lib/pyspark.zip/pyspark/sql/types.py", line 191, in toInternal
    else time.mktime(dt.timetuple()))
*ValueError: year out of range*