Apologies, this is Spark 3.2.0.
~ Shawn
From: Lavelle, Shawn
Sent: Monday, February 21, 2022 5:39 PM
To: 'user@spark.apache.org'
Subject: Spark-SQL : Getting current user name in UDF
Hello Spark Users,
I have a UDF I wrote for use with Spark-SQL that performs a look up. In
that look up, I need to get the current sql user so I can validate their
permissions. I was using org.apache.spark.sql.util.Utils.getCurrentUserName()
to retrieve the current active user from within
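For context, Utils.getCurrentUserName() is an internal (private[spark]) helper, so it isn't a stable public API. Roughly, it checks the SPARK_USER environment variable first and otherwise falls back to Hadoop's UserGroupInformation. A minimal plain-Java sketch of that lookup order (using the JVM's user.name property in place of the Hadoop call, purely so the sketch has no Hadoop dependency):

```java
public class CurrentUser {
    // Sketch of the lookup order used by Spark's internal
    // Utils.getCurrentUserName(): the SPARK_USER environment variable first,
    // then the current user. The real method resolves the fallback through
    // Hadoop's UserGroupInformation.getCurrentUser().getShortUserName();
    // System.getProperty("user.name") stands in for that here.
    static String currentUserName() {
        String sparkUser = System.getenv("SPARK_USER");
        if (sparkUser != null && !sparkUser.isEmpty()) {
            return sparkUser;
        }
        return System.getProperty("user.name");
    }

    public static void main(String[] args) {
        System.out.println("current user: " + currentUserName());
    }
}
```

Note that on a shared Thrift Server this resolves the user running the JVM, not the connected SQL session's user, which matters for a per-user permission check. Spark 3.2 also added a current_user() SQL builtin, which may be closer to the session user you actually want.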
Thanks for following up, I will give this a go!
~ Shawn
-----Original Message-----
From: roizaig
Sent: Thursday, April 29, 2021 7:42 AM
To: user@spark.apache.org
Subject: Re: DataSource API v2 & Spark-SQL
You can create a custom data source following this blog
Hello Spark Community,
I have a Spark-SQL problem where I am receiving a NoClassDefFoundError
for org.apache.spark.sql.catalyst.util.RebaseDateTime$. This happens for any
query with a filter on a Timestamp column when the query is first run
programmatically but not when the query is
Thanks for clarifying, Russell. Is a Spark-native catalog reference on the
roadmap for DSv2, or should I be trying to use something else?
~ Shawn
From: Russell Spitzer [mailto:russell.spit...@gmail.com]
Sent: Monday, August 3, 2020 8:27 AM
To: Lavelle, Shawn
Cc: user
Subject: Re: DataSource API
Hello Spark community,
I have a custom datasource in v1 API that I'm trying to port to v2 API, in
Java. Currently I have a DataSource registered via catalog.createTable(name,
schema, options map). When trying to do this in data source API v2,
I get an error saying my class (package)
Jacek,
Thanks for your help. I didn’t want to file a bug/enhancement report unless
warranted.
~ Shawn
From: Jacek Laskowski [mailto:ja...@japila.pl]
Sent: Thursday, April 27, 2017 8:39 AM
To: Lavelle, Shawn <shawn.lave...@osii.com>
Cc: user <user@spark.apache.org>
Subject: Re: Spa
of thing. We’re probably going to write our own
org.apache.spark.sql.catalyst.rules.Rule to handle it.
~ Shawn
From: Jacek Laskowski [mailto:ja...@japila.pl]
Sent: Wednesday, April 26, 2017 2:55 AM
To: Lavelle, Shawn <shawn.lave...@osii.com>
Cc: user <user@spark.apache.org>
Subject: Re: Spa
Hello Spark Users!
Does the Spark Optimization engine reduce overlapping column ranges? If so,
should it push this down to a Data Source?
Example:
This: Select * from table where col between 3 and 7 OR col between 5 and 9
Reduces to: Select * from table where col between 3 and 9
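For what it's worth, the reduction above is only valid because the two ranges overlap (5 ≤ 7); disjoint ranges cannot be collapsed. A minimal sketch of the interval merge involved, independent of Spark:

```java
import java.util.Arrays;

public class RangeMerge {
    // Merge [lo1, hi1] OR [lo2, hi2] into a single range when they overlap;
    // returns null when the ranges are disjoint and no reduction is possible.
    static int[] merge(int lo1, int hi1, int lo2, int hi2) {
        if (Math.max(lo1, lo2) > Math.min(hi1, hi2)) {
            return null; // disjoint: e.g. [3,4] OR [6,9] cannot be merged
        }
        return new int[] { Math.min(lo1, lo2), Math.max(hi1, hi2) };
    }

    public static void main(String[] args) {
        // col BETWEEN 3 AND 7 OR col BETWEEN 5 AND 9 => col BETWEEN 3 AND 9
        System.out.println(Arrays.toString(merge(3, 7, 5, 9))); // [3, 9]
    }
}
```

Whether Spark's optimizer actually performs this rewrite before pushdown is the question being asked; the sketch only shows the reduction itself.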
Minnesota 55340-9457
Phone: 763 551 0559
Fax: 763 551 0750
Email: shawn.lave...@osii.com
Website: www.osii.com
From: Lavelle, Shawn
Sent: Tuesday, February 28, 2017 10:25 AM
To: user@spark.apache.org
Subject: Register Spark UDF fo
Hello all,
I’m trying to make my custom UDFs available from a beeline session via the Hive
Thrift Server. I’ve been successful in registering them via my DataSource API, as
it provides the current sqlContext. However, the UDFs are not accessible at
initial connection, meaning a query won’t parse
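One possible workaround (sketched here with hypothetical names, and assuming the UDF jar is reachable by the Thrift Server) is to register the UDF as a permanent function in the metastore, so that new beeline sessions can resolve it at parse time instead of relying on a per-session registration:

```sql
-- Hypothetical function, class, and jar names, for illustration only.
-- A permanent function survives across sessions, unlike
-- CREATE TEMPORARY FUNCTION, which is scoped to the registering session.
CREATE FUNCTION my_lookup AS 'com.example.MyLookupUDF'
    USING JAR 'hdfs:///libs/my-udfs.jar';
```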
Hello Spark Users,
I have a Hive UDF that I'm trying to use with Spark-SQL. It's showing up a
bit awkwardly:
I can load it into the Hive Thrift Server with a "Create function..." query
against the hive context. I can then use the UDF in queries. However, a "desc
function " says the
Hello,
I am looking to upgrade a Hive 0.11 external storage handler that was run on
Shark 0.9.2 to work on spark-sql 1.6.1. I’ve run into a snag: it seems
that the storage handler is not receiving predicate pushdown information.
Being fairly new to Spark’s development, would