Thank you so much for the quick response and great help.
@jeff, I will use the library if the 3.1 release is delayed. Thank you so
much.
On Fri, Jan 29, 2021 at 1:23 PM Jeff Evans wrote:
If you need to do this in 2.x, this library does the trick:
https://github.com/fqaiser94/mse
On Fri, Jan 29, 2021 at 12:15 PM Adam Binford wrote:
I think they're voting on the next release candidate starting sometime next
week. So, barring any other major hurdles, hopefully within the next few
weeks.
On Fri, Jan 29, 2021, 1:01 PM Felix Kizhakkel Jose <
felixkizhakkelj...@gmail.com> wrote:
Wow, that's really great to know. Thank you so much Adam. Do you know when
the 3.1 release is scheduled?
Regards,
Felix K Jose
On Fri, Jan 29, 2021 at 12:35 PM Adam Binford wrote:
As of 3.0, the only way to do it is something that will recreate the whole
struct:
df.withColumn('timingPeriod',
    f.struct(
        f.col('timingPeriod.start').cast('timestamp').alias('start'),
        f.col('timingPeriod.end').cast('timestamp').alias('end')))
There's a new method coming in 3.1 on the Column class.
Hello All,
I am using PySpark Structured Streaming and I am getting timestamp fields
as plain longs (milliseconds), so I have to convert these fields into a
timestamp type.
A sample JSON object:
{
  "id": {
    "value": "f40b2e22-4003-4d90-afd3-557bc013b05e",
    "type": "UUID",
    "sy
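[One wrinkle worth noting for this conversion: Spark's cast from a numeric to timestamp interprets the value as epoch seconds, so millisecond longs need a divide by 1000 first. The arithmetic can be checked in plain Python; the sample value below is made up:

```python
from datetime import datetime, timezone

# hypothetical epoch-milliseconds value, like the longs from the stream
ms = 1611943380000

# dividing by 1000 yields (fractional) epoch seconds, which is what a
# numeric-to-timestamp cast expects
dt = datetime.fromtimestamp(ms / 1000, tz=timezone.utc)
print(dt.isoformat())  # 2021-01-29T18:03:00+00:00
```

Casting the raw millisecond value directly would instead land tens of thousands of years in the future.]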