GitHub user maver1ck opened a pull request:
https://github.com/apache/spark/pull/19234
[SPARK-22010] Change fromInternal method of TimestampType
## What changes were proposed in this pull request?
This PR changes how PySpark converts timestamps from Spark's internal
representation (microseconds since the epoch) to the Python representation.
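For context, a minimal sketch of the kind of conversion `TimestampType.fromInternal` performs (this is an illustration of the technique, not the exact patched code; the function name `from_internal` here is stand-alone):

```python
import datetime

def from_internal(ts):
    """Convert Spark's internal timestamp value (an int of microseconds
    since the epoch, or None) to a Python datetime.

    Sketch only -- the actual PySpark method lives on TimestampType.
    """
    if ts is None:
        return None
    # Split into whole seconds and leftover microseconds, then attach
    # the microsecond part to the datetime built from the seconds.
    seconds, micros = divmod(ts, 1000000)
    return datetime.datetime.fromtimestamp(seconds).replace(microsecond=micros)
```

The conversion is called once per timestamp value crossing the JVM/Python boundary, which is why shaving microseconds off it matters for wide scans.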
**Benchmarks**
Before change:
4.58 µs ± 558 ns per loop (mean ± std. dev. of 7 runs, 100000 loops each)
After change:
System with UTC timezone
1.49 µs ± 142 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)
Other timezones:
3.15 µs ± 388 ns per loop (mean ± std. dev. of 7 runs, 100000 loops each)
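The figures above read like IPython `%timeit` output. A hedged sketch of reproducing that kind of measurement with the stdlib `timeit` module (the timestamp literal and loop count are illustrative, not the PR's actual benchmark harness):

```python
import timeit

# Time a microseconds->datetime conversion, mimicking what %timeit measures.
setup = "import datetime; ts = 1505400000123456"  # arbitrary example timestamp
stmt = ("datetime.datetime.fromtimestamp(ts // 1000000)"
        ".replace(microsecond=ts % 1000000)")

n = 100000  # loops per run
total = timeit.timeit(stmt, setup=setup, number=n)
print(f"{total / n * 1e6:.2f} µs per loop ({n} loops)")
```

Absolute numbers will vary by machine and, as the PR notes, by the system timezone, since `fromtimestamp` does a local-time conversion.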
## How was this patch tested?
Existing tests.
Performance benchmarks.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/maver1ck/spark spark_22010
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/19234.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #19234
----
commit 238b5563e444b6b936f2e2771ec7876f648af1e9
Author: Maciej Bryński <[email protected]>
Date: 2017-09-14T14:56:52Z
Change internal Timestamp conversion
commit 0cb2a482a41711531a9367b88bf1558f5c87ac4c
Author: Maciej Bryński <[email protected]>
Date: 2017-09-14T14:58:50Z
Typo fix
commit 02301eb4aa8686fcafdeba3b13ec772be8938ed6
Author: Maciej Bryński <[email protected]>
Date: 2017-09-14T15:07:22Z
Import fix
----
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]