RE: Spark-SQL : Getting current user name in UDF

2022-02-22 Thread Lavelle, Shawn
Apologies, this is Spark 3.2.0. ~ Shawn From: Lavelle, Shawn Sent: Monday, February 21, 2022 5:39 PM To: 'user@spark.apache.org' Subject: Spark-SQL : Getting current user name in UDF Hello Spark Users, I have a UDF I wrote for use with Spark-SQL that performs a lookup. In that lookup

Spark-SQL : Getting current user name in UDF

2022-02-21 Thread Lavelle, Shawn
Hello Spark Users, I have a UDF I wrote for use with Spark-SQL that performs a lookup. In that lookup, I need to get the current SQL user so I can validate their permissions. I was using org.apache.spark.sql.util.Utils.getCurrentUserName() to retrieve the current active user from within
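
For context, Spark's internal Utils.getCurrentUserName() (which is private to Spark) essentially prefers the SPARK_USER environment variable and otherwise asks Hadoop's UserGroupInformation. A minimal Scala sketch of a UDF doing that lookup directly, without touching Spark internals; the UDF name and the "permission check" are placeholders, not from the thread:

    import org.apache.hadoop.security.UserGroupInformation
    import org.apache.spark.sql.SparkSession

    object CurrentUserUdf {
      // Same fallback chain as Spark's internal helper: SPARK_USER first,
      // then Hadoop's UserGroupInformation.
      private def currentUserName(): String =
        sys.env.getOrElse("SPARK_USER", UserGroupInformation.getCurrentUser.getShortUserName)

      def register(spark: SparkSession): Unit = {
        // Hypothetical UDF; real code would validate permissions before returning.
        // Note: the body runs on the executors, so on a shared Thrift Server this
        // reports the user the executor JVM runs as, not necessarily the JDBC user.
        spark.udf.register("lookup_with_auth", (key: String) => {
          val user = currentUserName()
          s"$user looked up $key"
        })
      }
    }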

RE: DataSource API v2 & Spark-SQL

2021-07-02 Thread Lavelle, Shawn
Thanks for following up, I will give this a go! ~ Shawn -Original Message- From: roizaig Sent: Thursday, April 29, 2021 7:42 AM To: user@spark.apache.org Subject: Re: DataSource API v2 & Spark-SQL You can create a custom data source following this blog

Odd NoClassDefFoundError exception

2021-01-26 Thread Lavelle, Shawn
Hello Spark Community, I have a Spark-SQL problem where I am receiving a NoClassDefFoundError for org.apache.spark.sql.catalyst.util.RebaseDateTime$. This happens for any query with a filter on a Timestamp column when the query is first run programmatically, but not when the query is
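
RebaseDateTime lives in the spark-catalyst module, so the symptom described here is usually chased by looking at how the application classpath is assembled. A minimal sketch of the kind of programmatic timestamp-filter query the thread describes (table and column names are made up):

    import org.apache.spark.sql.SparkSession

    object TimestampFilterRepro {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("ts-filter-repro").getOrCreate()
        // Hypothetical table/column; the point is a filter on a Timestamp column
        // issued programmatically, which is where the error was reported.
        val df = spark.sql(
          "SELECT * FROM events WHERE event_time > timestamp '2021-01-01 00:00:00'")
        df.show()
      }
    }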

RE: DataSource API v2 & Spark-SQL

2020-08-03 Thread Lavelle, Shawn
Thanks for clarifying, Russell. Is Spark-native catalog reference on the roadmap for DSv2, or should I be trying to use something else? ~ Shawn From: Russell Spitzer [mailto:russell.spit...@gmail.com] Sent: Monday, August 3, 2020 8:27 AM To: Lavelle, Shawn Cc: user Subject: Re: DataSource API

DataSource API v2 & Spark-SQL

2020-08-03 Thread Lavelle, Shawn
Hello Spark community, I have a custom datasource in v1 API that I'm trying to port to v2 API, in Java. Currently I have a DataSource registered via catalog.createTable(name, , schema, options map). When trying to do this in data source API v2, I get an error saying my class (package)
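
For reference, the DataSource V2 entry point in Spark 3.x is a TableProvider that hands back a Table declaring its schema and capabilities. Below is a bare-bones Scala sketch of that shape (the thread's code is Java, but the interfaces are the same); class names are placeholders and the ScanBuilder is stubbed out:

    import java.util
    import org.apache.spark.sql.connector.catalog.{SupportsRead, Table, TableCapability, TableProvider}
    import org.apache.spark.sql.connector.expressions.Transform
    import org.apache.spark.sql.connector.read.ScanBuilder
    import org.apache.spark.sql.types.StructType
    import org.apache.spark.sql.util.CaseInsensitiveStringMap

    // Placeholder V2 provider; this is the class Spark instantiates for the source.
    class MyV2Source extends TableProvider {
      override def inferSchema(options: CaseInsensitiveStringMap): StructType =
        new StructType() // real code would derive the schema from the options

      override def getTable(schema: StructType,
                            partitioning: Array[Transform],
                            properties: util.Map[String, String]): Table =
        new MyV2Table(schema)
    }

    class MyV2Table(tableSchema: StructType) extends Table with SupportsRead {
      override def name(): String = "my_v2_table"
      override def schema(): StructType = tableSchema
      override def capabilities(): util.Set[TableCapability] =
        util.EnumSet.of(TableCapability.BATCH_READ)
      override def newScanBuilder(options: CaseInsensitiveStringMap): ScanBuilder =
        ??? // real code returns a ScanBuilder producing the actual Scan/partitions
    }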

RE: Spark-SQL Query Optimization: overlapping ranges

2017-05-01 Thread Lavelle, Shawn
Jacek, Thanks for your help. I didn’t want to write a bug/enhancement unless warranted. ~ Shawn From: Jacek Laskowski [mailto:ja...@japila.pl] Sent: Thursday, April 27, 2017 8:39 AM To: Lavelle, Shawn <shawn.lave...@osii.com> Cc: user <user@spark.apache.org> Subject: Re: Spa

RE: Spark-SQL Query Optimization: overlapping ranges

2017-04-27 Thread Lavelle, Shawn
of thing. We’re probably going to write our own org.apache.spark.sql.catalyst.rules.Rule to handle it. ~ Shawn From: Jacek Laskowski [mailto:ja...@japila.pl] Sent: Wednesday, April 26, 2017 2:55 AM To: Lavelle, Shawn <shawn.lave...@osii.com> Cc: user <user@spark.apache.org> Subject: Re: Spa
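
A rough Scala sketch of what a rule along those lines might look like — illustration only, not from the thread. It only handles closed integer ranges on the same column (as produced when BETWEEN is desugared to >= / <=) and leaves everything else alone; in Spark 2.x it could be plugged in via spark.experimental.extraOptimizations.

    import org.apache.spark.sql.catalyst.expressions._
    import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
    import org.apache.spark.sql.catalyst.rules.Rule
    import org.apache.spark.sql.types.IntegerType

    // Merges (col >= l1 AND col <= u1) OR (col >= l2 AND col <= u2) into one range
    // when both sides refer to the same column and the integer ranges overlap.
    object MergeOverlappingRanges extends Rule[LogicalPlan] {

      private object IntRange {
        def unapply(e: Expression): Option[(Expression, Int, Int)] = e match {
          case And(GreaterThanOrEqual(col, Literal(lo: Int, IntegerType)),
                   LessThanOrEqual(col2, Literal(hi: Int, IntegerType)))
              if col.semanticEquals(col2) =>
            Some((col, lo, hi))
          case _ => None
        }
      }

      override def apply(plan: LogicalPlan): LogicalPlan = plan transformAllExpressions {
        case Or(IntRange(c1, lo1, hi1), IntRange(c2, lo2, hi2))
            if c1.semanticEquals(c2) && lo2 <= hi1 + 1 && lo1 <= hi2 + 1 =>
          And(GreaterThanOrEqual(c1, Literal(math.min(lo1, lo2))),
              LessThanOrEqual(c1, Literal(math.max(hi1, hi2))))
      }
    }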

Spark-SQL Query Optimization: overlapping ranges

2017-04-24 Thread Lavelle, Shawn
Hello Spark Users! Does the Spark optimization engine reduce overlapping column ranges? If so, should it push this down to a Data Source? For example, this: Select * from table where col between 3 and 7 OR col between 5 and 9 reduces to: Select * from table where col between 3 and 9
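
A quick way to see what Catalyst actually does with such a predicate is to inspect the optimized plan. A small sketch, assuming an active SparkSession named spark and a registered table; names are placeholders:

    val df = spark.sql(
      "SELECT * FROM my_table WHERE col BETWEEN 3 AND 7 OR col BETWEEN 5 AND 9")
    // Optimized logical plan: shows whether the two ranges were merged.
    println(df.queryExecution.optimizedPlan.numberedTreeString)
    // Full explain output also lists which filters are pushed to the data source.
    df.explain(true)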

RE: Register Spark UDF for use with Hive Thriftserver/Beeline

2017-02-28 Thread Lavelle, Shawn
Minnesota 55340-9457 Phone: 763 551 0559 Fax: 763 551 0750 Email: shawn.lave...@osii.com Website: www.osii.com From: Lavelle, Shawn Sent: Tuesday, February 28, 2017 10:25 AM To: user@spark.apache.org Subject: Register Spark UDF fo

Register Spark UDF for use with Hive Thriftserver/Beeline

2017-02-28 Thread Lavelle, Shawn
Hello all, I'm trying to make my custom UDFs available from a beeline session via the Hive ThriftServer. I've been successful in registering them via my DataSource API, as it provides the current sqlContext. However, the UDFs are not accessible at initial connection, meaning a query won't parse
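
One way to have UDFs visible from the very first beeline connection is to register them on the session before starting the Thrift server programmatically with HiveThriftServer2.startWithContext. A hedged Scala sketch (UDF name and logic are placeholders, and this assumes the deployment can start the Thrift server from application code rather than the stock start-thriftserver.sh script):

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

    object ThriftServerWithUdfs {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("thriftserver-with-udfs")
          .enableHiveSupport()
          .getOrCreate()

        // Register UDFs up front so every beeline session sees them immediately.
        spark.udf.register("my_lookup", (key: String) => s"value-for-$key")

        // Start the Thrift server on the same session/context.
        HiveThriftServer2.startWithContext(spark.sqlContext)
      }
    }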

Spark-SQL 1.6.2 w/Hive UDF @Description

2016-12-23 Thread Lavelle, Shawn
Hello Spark Users, I have a Hive UDF that I'm trying to use with Spark-SQL. It's showing up a bit awkwardly: I can load it into the Hive Thrift Server with a "Create function..." query against the hive context. I can then use the UDF in queries. However, a "desc function " says the
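
For reference, the text Hive prints for DESC FUNCTION / DESC FUNCTION EXTENDED normally comes from the @Description annotation on the UDF class; whether Spark-SQL 1.6 surfaces it is exactly the open question in this thread. A minimal sketch, written in Scala to match the other examples; class name and text are placeholders:

    import org.apache.hadoop.hive.ql.exec.{Description, UDF}

    // Hive reads this annotation when describing the function.
    @Description(
      name = "my_upper",
      value = "_FUNC_(str) - returns str upper-cased",
      extended = "Example:\n  > SELECT my_upper('spark');\n  SPARK")
    class MyUpper extends UDF {
      def evaluate(s: String): String =
        if (s == null) null else s.toUpperCase
    }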

Upgrading a Hive External Storage Handler...

2016-07-21 Thread Lavelle, Shawn
Hello, I am looking to upgrade a Hive 0.11 external storage handler that ran on Shark 0.9.2 to work on spark-sql 1.6.1. I've run into a snag: the storage handler does not seem to be receiving predicate pushdown information. Being fairly new to Spark's development, would
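
In Spark SQL 1.6 the usual place to receive pushed-down predicates is the Data Sources API rather than the Hive storage-handler path: a relation that mixes in PrunedFilteredScan gets the filters handed to buildScan. A minimal sketch with the relation internals stubbed out (names are placeholders):

    import org.apache.spark.rdd.RDD
    import org.apache.spark.sql.{Row, SQLContext}
    import org.apache.spark.sql.sources.{BaseRelation, Filter, PrunedFilteredScan}
    import org.apache.spark.sql.types.StructType

    // Sketch of a 1.6-era relation that receives column pruning and filter pushdown.
    class MyRelation(override val sqlContext: SQLContext, val tableSchema: StructType)
      extends BaseRelation with PrunedFilteredScan {

      override def schema: StructType = tableSchema

      // Spark hands the required columns and the pushable filters here; any filter
      // this method cannot handle is still re-evaluated by Spark afterwards.
      override def buildScan(requiredColumns: Array[String],
                             filters: Array[Filter]): RDD[Row] = {
        // real code would translate `filters` into the external system's query
        sqlContext.sparkContext.emptyRDD[Row]
      }
    }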