>>> [...] check the date fields and try to see if a year field isn't
>>> matching the expected values.
>>>
>>> Thanks
>>>
>>> Mike
>>>
>>>
>>> On Thu, Sep 8, 2016 at 8:15 AM, Daniel Lopes
>>> wrote:
>>>
>>>> Thanks,
>>>>
>>>> I *tested* the function offline and it works.
>>>> I also tested with select * from after converting the data, and the new
>>>> data looks good,
>>>> *but* if I *register as temp table* to *join another table* it still shows *the
>>>> same error*:
>>>>
>>>> ValueError: year out of range
>>>>
>>>> Best,
>>>>
>>>> *Daniel Lopes*
>>>> Chief Data and Analytics
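
For what it's worth, the scan Mike suggests could look roughly like this in PySpark 1.6. Everything in the sketch (the Parquet source, the event_date_str column, the date format, and the 1900-2100 window) is an assumed example, not something taken from the thread:

# Sketch only: scan a raw date column for years outside an expected window
# before converting it with a UDF. Names ("events.parquet", "event_date_str")
# and the format string are hypothetical; adapt them to the real schema.
from datetime import datetime

from pyspark import SparkContext
from pyspark.sql import SQLContext
from pyspark.sql.functions import udf
from pyspark.sql.types import IntegerType

sc = SparkContext(appName="year-range-check")
sqlContext = SQLContext(sc)

df = sqlContext.read.parquet("events.parquet")  # assumed source

def extract_year(s):
    """Return the parsed year, or None if the value does not parse."""
    try:
        return datetime.strptime(s, "%Y-%m-%d").year  # assumed format
    except (TypeError, ValueError):
        return None

year_udf = udf(extract_year, IntegerType())
with_year = df.withColumn("year", year_udf(df["event_date_str"]))

# Rows whose year is missing or extreme are the likely source of the
# "year out of range" failure seen during conversion.
suspicious = with_year.filter(
    with_year["year"].isNull() |
    (with_year["year"] < 1900) |
    (with_year["year"] > 2100))
print(suspicious.count())
suspicious.show(20)

Because Spark evaluates lazily, a handful of malformed rows may only be touched once a join forces the UDF to run over the whole dataset, which would be consistent with the conversion looking fine under select * but failing after the temp table is joined. For reference, the full traceback quoted earlier in the thread is: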
Traceback (most recent call last):
  File "/usr/local/src/spark160master/spark-1.6.0-bin-2.6.0/python/lib/pyspark.zip/pyspark/worker.py", line 111, in main
    process()
  File "/usr/local/src/spark160master/spark-1.6.0-bin-2.6.0/python/lib/pyspark.zip/pyspark/serializers.py", line 263, in dump_stream
    vs = list(itertools.islice(iterator, batch))
  File "/usr/local/src/spark160master/spark/python/pyspark/sql/functions.py", line 1563, in <lambda>
    func = lambda _, it: map(lambda x: returnType.toInternal(f(*x)), it)
  File "/usr/local/src/spark160master/spark-1.6.0-bin-2.6.0/python/lib/pyspark.zip/pyspark/sql/types.py", line 191, in toInternal
    else time.mktime(dt.timetuple()))
*ValueError: year out of range*
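
The frame in pyspark/sql/types.py shows the conversion going through time.mktime(dt.timetuple()), and on Python 2 time.mktime can raise exactly this ValueError for years outside the range it accepts; years below 1900 (for example from a mis-parsed two- or three-digit year) are a common trigger. One way to keep a single bad row from failing the whole job is a defensive parsing UDF that returns None for dates outside a sane window. This is only a sketch: the format string, the column and table names in the usage comment, and the 1900-2100 bounds are illustrative assumptions, not details from the thread.

# Sketch of a defensive timestamp-parsing UDF: return None instead of a
# datetime whose year the later time.mktime() conversion cannot handle.
from datetime import datetime

from pyspark.sql.functions import udf
from pyspark.sql.types import TimestampType

def parse_ts_safe(s):
    """Parse a timestamp string; return None for unparseable or extreme years."""
    try:
        dt = datetime.strptime(s, "%Y-%m-%d %H:%M:%S")  # assumed format
    except (TypeError, ValueError):
        return None
    if not (1900 <= dt.year <= 2100):
        # time.mktime(dt.timetuple()) would reject years outside the
        # platform-supported range, which is what the traceback shows.
        return None
    return dt

parse_ts_udf = udf(parse_ts_safe, TimestampType())

# Usage (DataFrame, column, and table names are assumed):
# converted = df.withColumn("event_ts", parse_ts_udf(df["event_ts_str"]))
# converted.registerTempTable("events_converted")
# joined = sqlContext.sql(
#     "SELECT * FROM events_converted e JOIN dim_table d ON e.key = d.key")

Alternatively, filtering the offending rows out up front, as in the earlier sketch, keeps the conversion UDF unchanged; either way the year values themselves are the thing to inspect, as Mike suggests.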