[jira] [Updated] (HUDI-4244) Support common Spark transformations w/in Spark SQL "partitioned by" clause

2022-10-01 Thread Zhaojing Yu (Jira)


[ https://issues.apache.org/jira/browse/HUDI-4244?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Zhaojing Yu updated HUDI-4244:
--
Fix Version/s: 0.13.0
   (was: 0.12.1)

> Support common Spark transformations w/in Spark SQL "partitioned by" clause
> ---
>
> Key: HUDI-4244
> URL: https://issues.apache.org/jira/browse/HUDI-4244
> Project: Apache Hudi
>  Issue Type: Improvement
>Reporter: Alexey Kudinkin
>Priority: Major
> Fix For: 0.13.0
>
>
> Currently if you create a Hudi table from Spark SQL:
> {code:java}
> CREATE TABLE test_create(
>   f1 STRING,
>   f2 STRING,
>   f3 STRING,
>   ts TIMESTAMP
> ) USING hudi
> PARTITIONED BY (hours(ts))
> OPTIONS (
>   type = 'mor'
> )
> TBLPROPERTIES (
>   primaryKey = 'f1',
>   preCombineField = 'ts'
> );
> {code}
> You'll get the following error:
> {code:java}
> java.util.concurrent.ExecutionException: java.lang.RuntimeException: 
> org.apache.spark.sql.AnalysisException: Transforms cannot be converted to 
> partition columns: hours(ts){code}
>  
> Originally reported in:
> [https://github.com/apache/hudi/issues/5810]
>  
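For reference, a minimal sketch of the usual workaround until transform expressions are supported: pre-compute the partition value as an ordinary column and partition by that column directly. The ts_hour column, its format, and the source_table name below are illustrative assumptions, not part of the ticket.

{code:sql}
-- Workaround sketch (assumed): materialize the hour as a regular column and
-- use it as the partition column, since PARTITIONED BY (hours(ts)) is rejected.
CREATE TABLE test_create(
  f1 STRING,
  f2 STRING,
  f3 STRING,
  ts TIMESTAMP,
  ts_hour STRING  -- hypothetical derived column, e.g. 'yyyy-MM-dd-HH'
) USING hudi
PARTITIONED BY (ts_hour)
OPTIONS (
  type = 'mor'
)
TBLPROPERTIES (
  primaryKey = 'f1',
  preCombineField = 'ts'
);

-- Writers then populate ts_hour explicitly (source_table is a placeholder):
INSERT INTO test_create
SELECT f1, f2, f3, ts, date_format(ts, 'yyyy-MM-dd-HH') AS ts_hour
FROM source_table;
{code}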





[jira] [Updated] (HUDI-4244) Support common Spark transformations w/in Spark SQL "partitioned by" clause

2022-08-17 Thread Sagar Sumit (Jira)


[ https://issues.apache.org/jira/browse/HUDI-4244?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sagar Sumit updated HUDI-4244:
--
Epic Link: HUDI-1658  (was: HUDI-1297)

> Support common Spark transformations w/in Spark SQL "partitioned by" clause
> ---
>
> Key: HUDI-4244
> URL: https://issues.apache.org/jira/browse/HUDI-4244
> Project: Apache Hudi
>  Issue Type: Improvement
>Reporter: Alexey Kudinkin
>Priority: Major
> Fix For: 0.12.1
>
>
> Currently if you create a Hudi table from Spark SQL:
> {code:java}
> CREATE TABLE test_create(
>   f1 STRING,
>   f2 STRING,
>   f3 STRING,
>   ts TIMESTAMP
> ) USING hudi
> PARTITIONED BY (hours(ts))
> OPTIONS (
>   type = 'mor'
> )
> TBLPROPERTIES (
>   primaryKey = 'f1',
>   preCombineField = 'ts'
> );
> {code}
> You'll get the following error:
> {code:java}
> java.util.concurrent.ExecutionException: java.lang.RuntimeException: 
> org.apache.spark.sql.AnalysisException: Transforms cannot be converted to 
> partition columns: hours(ts){code}
>  
> Originally reported in:
> [https://github.com/apache/hudi/issues/5810]
>  





[jira] [Updated] (HUDI-4244) Support common Spark transformations w/in Spark SQL "partitioned by" clause

2022-08-17 Thread Sagar Sumit (Jira)


[ https://issues.apache.org/jira/browse/HUDI-4244?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sagar Sumit updated HUDI-4244:
--
Issue Type: Improvement  (was: Bug)

> Support common Spark transformations w/in Spark SQL "partitioned by" clause
> ---
>
> Key: HUDI-4244
> URL: https://issues.apache.org/jira/browse/HUDI-4244
> Project: Apache Hudi
>  Issue Type: Improvement
>Reporter: Alexey Kudinkin
>Priority: Major
> Fix For: 0.12.1
>
>
> Currently if you create a Hudi table from Spark SQL:
> {code:java}
> CREATE TABLE test_create(
>   f1 STRING,
>   f2 STRING,
>   f3 STRING,
>   ts TIMESTAMP
> ) USING hudi
> PARTITIONED BY (hours(ts))
> OPTIONS (
>   type = 'mor'
> )
> TBLPROPERTIES (
>   primaryKey = 'f1',
>   preCombineField = 'ts'
> );
> {code}
> You'll get the following error:
> {code:java}
> java.util.concurrent.ExecutionException: java.lang.RuntimeException: 
> org.apache.spark.sql.AnalysisException: Transforms cannot be converted to 
> partition columns: hours(ts){code}
>  
> Originally reported in:
> [https://github.com/apache/hudi/issues/5810]
>  





[jira] [Updated] (HUDI-4244) Support common Spark transformations w/in Spark SQL "partitioned by" clause

2022-08-16 Thread Sagar Sumit (Jira)


[ https://issues.apache.org/jira/browse/HUDI-4244?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sagar Sumit updated HUDI-4244:
--
Fix Version/s: 0.12.1
   (was: 0.12.0)

> Support common Spark transformations w/in Spark SQL "partitioned by" clause
> ---
>
> Key: HUDI-4244
> URL: https://issues.apache.org/jira/browse/HUDI-4244
> Project: Apache Hudi
>  Issue Type: Bug
>Reporter: Alexey Kudinkin
>Priority: Major
> Fix For: 0.12.1
>
>
> Currently if you create a Hudi table from Spark SQL:
> {code:java}
> CREATE TABLE test_create(
>   f1 STRING,
>   f2 STRING,
>   f3 STRING,
>   ts TIMESTAMP
> ) USING hudi
> PARTITIONED BY (hours(ts))
> OPTIONS (
>   type = 'mor'
> )
> TBLPROPERTIES (
>   primaryKey = 'f1',
>   preCombineField = 'ts'
> );
> {code}
> You'll get the following error:
> {code:java}
> java.util.concurrent.ExecutionException: java.lang.RuntimeException: 
> org.apache.spark.sql.AnalysisException: Transforms cannot be converted to 
> partition columns: hours(ts){code}
>  
> Originally reported in:
> [https://github.com/apache/hudi/issues/5810]
>  





[jira] [Updated] (HUDI-4244) Support common Spark transformations w/in Spark SQL "partitioned by" clause

2022-06-13 Thread Alexey Kudinkin (Jira)


[ https://issues.apache.org/jira/browse/HUDI-4244?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Alexey Kudinkin updated HUDI-4244:
--
Description: 
Currently if you create a Hudi table from Spark SQL:
{code:java}
CREATE TABLE test_create(
  f1 STRING,
  f2 STRING,
  f3 STRING,
  ts TIMESTAMP
) USING hudi
PARTITIONED BY (hours(ts))
OPTIONS (
  type = 'mor'
)
TBLPROPERTIES (
  primaryKey = 'f1',
  preCombineField = 'ts'
);
{code}
You'll get the following error:
{code:java}
java.util.concurrent.ExecutionException: java.lang.RuntimeException: 
org.apache.spark.sql.AnalysisException: Transforms cannot be converted to 
partition columns: hours(ts){code}
 

Originally reported in:

[https://github.com/apache/hudi/issues/5810]

 

  was:
Currently if you create a Hudi table from Spark SQL:
{code:java}
CREATE TABLE test_create(
  f1 STRING,
  f2 STRING,
  f3 STRING,
  ts TIMESTAMP
) USING hudi
PARTITIONED BY (hours(ts))
OPTIONS (
  type = 'mor'
)
TBLPROPERTIES (
  primaryKey = 'f1',
  preCombineField = 'ts'
);
{code}

You'll get the following error:
{code:java}
java.util.concurrent.ExecutionException: java.lang.RuntimeException: 
org.apache.spark.sql.AnalysisException: Transforms cannot be converted to 
partition columns: hours(ts){code}


> Support common Spark transformations w/in Spark SQL "partitioned by" clause
> ---
>
> Key: HUDI-4244
> URL: https://issues.apache.org/jira/browse/HUDI-4244
> Project: Apache Hudi
>  Issue Type: Bug
>Reporter: Alexey Kudinkin
>Priority: Major
> Fix For: 0.12.0
>
>
> Currently if you create a Hudi table from Spark SQL:
> {code:java}
> CREATE TABLE test_create(
>   f1 STRING,
>   f2 STRING,
>   f3 STRING,
>   ts TIMESTAMP
> ) USING hudi
> PARTITIONED BY (hours(ts))
> OPTIONS (
>   type = 'mor'
> )
> TBLPROPERTIES (
>   primaryKey = 'f1',
>   preCombineField = 'ts'
> );
> {code}
> You'll get the following error:
> {code:java}
> java.util.concurrent.ExecutionException: java.lang.RuntimeException: 
> org.apache.spark.sql.AnalysisException: Transforms cannot be converted to 
> partition columns: hours(ts){code}
>  
> Originally reported in:
> [https://github.com/apache/hudi/issues/5810]
>  
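For context, the transforms in question are the DataSource V2 partition transforms that Spark SQL's parser already accepts in PARTITIONED BY (for example years, months, days, hours, and bucket). A couple of illustrative statements of the syntax this ticket asks Hudi to support are sketched below; the table and column names are placeholders, and how each transform would map onto Hudi partitioning is not specified here.

{code:sql}
-- Syntax sketch (assumed names): today such statements fail for Hudi tables
-- with "Transforms cannot be converted to partition columns".
CREATE TABLE events (id BIGINT, ts TIMESTAMP) USING hudi
  PARTITIONED BY (days(ts))
  TBLPROPERTIES (primaryKey = 'id');

CREATE TABLE buckets_demo (id BIGINT, name STRING) USING hudi
  PARTITIONED BY (bucket(16, id))
  TBLPROPERTIES (primaryKey = 'id');
{code}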





[jira] [Updated] (HUDI-4244) Support common Spark transformations w/in Spark SQL "partitioned by" clause

2022-06-13 Thread Alexey Kudinkin (Jira)


[ https://issues.apache.org/jira/browse/HUDI-4244?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Alexey Kudinkin updated HUDI-4244:
--
Priority: Major  (was: Critical)

> Support common Spark transformations w/in Spark SQL "partitioned by" clause
> ---
>
> Key: HUDI-4244
> URL: https://issues.apache.org/jira/browse/HUDI-4244
> Project: Apache Hudi
>  Issue Type: Bug
>Reporter: Alexey Kudinkin
>Priority: Major
> Fix For: 0.12.0
>
>
> Currently if you create a Hudi table from Spark SQL:
> {code:java}
> CREATE TABLE test_create(
>   f1 STRING,
>   f2 STRING,
>   f3 STRING,
>   ts TIMESTAMP
> ) USING hudi
> PARTITIONED BY (hours(ts))
> OPTIONS (
>   type = 'mor'
> )
> TBLPROPERTIES (
>   primaryKey = 'f1',
>   preCombineField = 'ts'
> );
> {code}
> You'll get the following error:
> {code:java}
> java.util.concurrent.ExecutionException: java.lang.RuntimeException: 
> org.apache.spark.sql.AnalysisException: Transforms cannot be converted to 
> partition columns: hours(ts){code}


