[jira] [Commented] (SPARK-11623) Sparksql-1.4.1 DataFrameWrite.jdbc() bug

2015-11-11 Thread Nguyen Van Nghia (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-11623?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15001648#comment-15001648 ]

Nguyen Van Nghia commented on SPARK-11623:
--

I am using Spark to update an RDB table in Oracle 11g. Checking the new Spark 
source code, I found that Spark still does not support Oracle, because no 
Oracle dialect case class definition exists in JdbcDialects.scala.
The issue is still open in my case.
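For reference, Spark's public JdbcDialects API (available since 1.4) lets user 
code register a custom dialect, which is one possible workaround until an 
Oracle dialect ships with Spark. The sketch below is only illustrative and 
rests on that assumption: the concrete type mappings are my own guesses, and 
it only addresses the column-type side of the problem, not the tableExists 
check discussed in the issue description.

  // Minimal sketch of a user-registered Oracle dialect (a workaround, not the
  // official fix). Uses the public Spark 1.4/1.5 JDBC dialect API; the
  // mappings below are illustrative assumptions.
  import java.sql.Types
  import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects, JdbcType}
  import org.apache.spark.sql.types._

  object OracleDialectWorkaround extends JdbcDialect {
    // Claim any JDBC URL that points at Oracle.
    override def canHandle(url: String): Boolean = url.startsWith("jdbc:oracle")

    // Map Catalyst types to Oracle column definitions; the generic defaults
    // (e.g. TEXT for StringType) are not valid Oracle column types.
    override def getJDBCType(dt: DataType): Option[JdbcType] = dt match {
      case StringType  => Some(JdbcType("VARCHAR2(4000)", Types.VARCHAR))
      case BooleanType => Some(JdbcType("NUMBER(1)", Types.NUMERIC))
      case LongType    => Some(JdbcType("NUMBER(19)", Types.NUMERIC))
      case DoubleType  => Some(JdbcType("NUMBER", Types.NUMERIC))
      case _           => None // fall back to the built-in defaults
    }
  }

  // Register once, before calling DataFrameWriter.jdbc(...):
  JdbcDialects.registerDialect(OracleDialectWorkaround)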

> Sparksql-1.4.1 DataFrameWrite.jdbc() bug
> 
>
> Key: SPARK-11623
> URL: https://issues.apache.org/jira/browse/SPARK-11623
> Project: Spark
>  Issue Type: Bug
>  Components: Java API, Spark Submit, SQL
>Affects Versions: 1.4.1, 1.5.1
> Environment: Spark stand alone cluster
>Reporter: Nguyen Van Nghia
>
> I am running spark-submit on Windows 8.1 against a Spark standalone cluster 
> (01 worker and 01 master), and the job throws an exception inside the 
> DataFrameWriter.jdbc(...) Scala function.
> We found that the following test:
>   var tableExists = JdbcUtils.tableExists(conn, table)
> always returns false even if we have already created the table.
> That drives the function to create the table from the specified DataFrame, 
> and the generated CREATE TABLE statement then fails with a SQL syntax error. 
> We located the SQL execution statement hereafter:
>   if (!tableExists) {
>     val schema = JDBCWriteDetails.schemaString(df, url)
>     val sql = s"CREATE TABLE $table ($schema)"
>     conn.prepareStatement(sql).executeUpdate() // this execution causes the SQL syntax error
>   }
> This happens with Spark 1.4.1 and Spark 1.5.1 (our dev environment).
> Please help!
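A note on the existence check: as far as I can tell from the 1.4/1.5 sources, 
JdbcUtils.tableExists probes the table with a query that uses a LIMIT clause, 
which Oracle does not accept, so the probe always throws and the method 
returns false. A minimal, database-agnostic sketch of an alternative check 
through JDBC metadata is shown below; the helper name is hypothetical and this 
is not the code Spark actually runs.

  // Hypothetical helper (illustrative sketch, not Spark's implementation):
  // check table existence through JDBC metadata instead of a probe query
  // that may not be valid on every database.
  import java.sql.Connection

  def tableExistsViaMetadata(conn: Connection, table: String): Boolean = {
    // Oracle stores unquoted identifiers in upper case and some drivers
    // match getTables case-sensitively, so try both spellings.
    val meta = conn.getMetaData
    Seq(table, table.toUpperCase) exists { name =>
      val rs = meta.getTables(null, null, name, Array("TABLE"))
      try rs.next() finally rs.close()
    }
  }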






[jira] [Comment Edited] (SPARK-11623) Sparksql-1.4.1 DataFrameWrite.jdbc() bug

2015-11-11 Thread Nguyen Van Nghia (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-11623?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15001648#comment-15001648 ]

Nguyen Van Nghia edited comment on SPARK-11623 at 11/12/15 3:34 AM:


I am using Spark to update an RDB table in Oracle 11g. Checking the newest 
Spark source code, I found that Spark still does not support Oracle, because 
no Oracle dialect case class definition exists in JdbcDialects.scala.
The issue is still open in my case.


was (Author: nghia.n.v2...@gmail.com):
I am using Spark to update an RDB table in Oracle 11g. Checking the new Spark 
source code, I found that Spark still does not support Oracle, because no 
Oracle dialect case class definition exists in JdbcDialects.scala.
The issue is still open in my case.

> Sparksql-1.4.1 DataFrameWrite.jdbc() bug
> 
>
> Key: SPARK-11623
> URL: https://issues.apache.org/jira/browse/SPARK-11623
> Project: Spark
>  Issue Type: Bug
>  Components: Java API, Spark Submit, SQL
>Affects Versions: 1.4.1, 1.5.1
> Environment: Spark stand alone cluster
>Reporter: Nguyen Van Nghia
>
> I am running spark-submit on Windows 8.1 against a Spark standalone cluster 
> (01 worker and 01 master), and the job throws an exception inside the 
> DataFrameWriter.jdbc(...) Scala function.
> We found that the following test:
>   var tableExists = JdbcUtils.tableExists(conn, table)
> always returns false even if we have already created the table.
> That drives the function to create the table from the specified DataFrame, 
> and the generated CREATE TABLE statement then fails with a SQL syntax error. 
> We located the SQL execution statement hereafter:
>   if (!tableExists) {
>     val schema = JDBCWriteDetails.schemaString(df, url)
>     val sql = s"CREATE TABLE $table ($schema)"
>     conn.prepareStatement(sql).executeUpdate() // this execution causes the SQL syntax error
>   }
> This happens with Spark 1.4.1 and Spark 1.5.1 (our dev environment).
> Please help!






[jira] [Updated] (SPARK-11623) Sparksql-1.4.1 DataFrameWrite.jdbc() bug

2015-11-09 Thread Nguyen Van Nghia (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-11623?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nguyen Van Nghia updated SPARK-11623:
-
Description: 
I am running spark-submit on Windows 8.1 against a Spark standalone cluster 
(01 worker and 01 master), and the job throws an exception inside the 
DataFrameWriter.jdbc(...) Scala function.
We found that the following test:
  var tableExists = JdbcUtils.tableExists(conn, table)
always returns false even if we have already created the table.
That drives the function to create the table from the specified DataFrame, 
and the generated CREATE TABLE statement then fails with a SQL syntax error. 
We located the SQL execution statement hereafter:
  if (!tableExists) {
    val schema = JDBCWriteDetails.schemaString(df, url)
    val sql = s"CREATE TABLE $table ($schema)"
    conn.prepareStatement(sql).executeUpdate() // this execution causes the SQL syntax error
  }
This happens with Spark 1.4.1 and Spark 1.5.1 (our dev environment).
Please help!



  was:
I am running spark-submit on Windows 8.1 against a Spark standalone cluster 
(01 worker and 01 master), and the job throws an exception inside the 
DataFrameWriter.jdbc(...) Scala function.
We found that the following test:
  var tableExists = JdbcUtils.tableExists(conn, table)
always returns false even if we have already created the table.
That drives the function to create the table from the specified DataFrame, 
and the generated CREATE TABLE statement then fails with a SQL syntax error. 
We located the SQL execution statement hereafter:
  if (!tableExists) {
    val schema = JDBCWriteDetails.schemaString(df, url)
    val sql = s"CREATE TABLE $table ($schema)"
    conn.prepareStatement(sql).executeUpdate() // this execution causes the SQL syntax error
  }
This happens with Spark 1.4.1 and Spark 1.5.1 (our dev environment).
Please help!




> Sparksql-1.4.1 DataFrameWrite.jdbc() bug
> 
>
> Key: SPARK-11623
> URL: https://issues.apache.org/jira/browse/SPARK-11623
> Project: Spark
>  Issue Type: Bug
>  Components: Java API, Spark Submit, SQL
>Affects Versions: 1.4.1, 1.5.1
> Environment: Spark stand alone cluster
>Reporter: Nguyen Van Nghia
>
> I am running spark-submit on Windows 8.1 against a Spark standalone cluster 
> (01 worker and 01 master), and the job throws an exception inside the 
> DataFrameWriter.jdbc(...) Scala function.
> We found that the following test:
>   var tableExists = JdbcUtils.tableExists(conn, table)
> always returns false even if we have already created the table.
> That drives the function to create the table from the specified DataFrame, 
> and the generated CREATE TABLE statement then fails with a SQL syntax error. 
> We located the SQL execution statement hereafter:
>   if (!tableExists) {
>     val schema = JDBCWriteDetails.schemaString(df, url)
>     val sql = s"CREATE TABLE $table ($schema)"
>     conn.prepareStatement(sql).executeUpdate() // this execution causes the SQL syntax error
>   }
> This happens with Spark 1.4.1 and Spark 1.5.1 (our dev environment).
> Please help!






[jira] [Created] (SPARK-11623) Sparksql-1.4.1 DataFrameWrite.jdbc() bug

2015-11-09 Thread Nguyen Van Nghia (JIRA)
Nguyen Van Nghia created SPARK-11623:


 Summary: Sparksql-1.4.1 DataFrameWrite.jdbc() bug
 Key: SPARK-11623
 URL: https://issues.apache.org/jira/browse/SPARK-11623
 Project: Spark
  Issue Type: Bug
  Components: Java API, Spark Submit, SQL
Affects Versions: 1.5.1, 1.4.1
 Environment: Spark stand alone cluster
Reporter: Nguyen Van Nghia


I am running spark-submit on Windows 8.1 against a Spark standalone cluster 
(01 worker and 01 master), and the job throws an exception inside the 
DataFrameWriter.jdbc(...) Scala function.
We found that the following test:
  var tableExists = JdbcUtils.tableExists(conn, table)
always returns false even if we have already created the table.
That drives the function to create the table from the specified DataFrame, 
and the generated CREATE TABLE statement then fails with a SQL syntax error. 
We located the SQL execution statement hereafter:
  if (!tableExists) {
    val schema = JDBCWriteDetails.schemaString(df, url)
    val sql = s"CREATE TABLE $table ($schema)"
    conn.prepareStatement(sql).executeUpdate() // this execution causes the SQL syntax error
  }
This happens with Spark 1.4.1 and Spark 1.5.1 (our dev environment).
Please help!
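To make the failure concrete: the statement is assembled by interpolating a 
generated column list into CREATE TABLE, as shown in the snippet above. The 
standalone sketch below approximates that interpolation under an assumed 
generic type mapping (StringType falling back to TEXT); it is not a verified 
copy of JDBCWriteDetails.schemaString, but it illustrates why Oracle rejects 
the generated DDL.

  // Illustrative sketch only: approximates how the CREATE TABLE text is built.
  // The StringType -> TEXT fallback is an assumption about the generic mapping;
  // TEXT is not a valid Oracle column type, hence the reported syntax error.
  import org.apache.spark.sql.types._

  def sketchSchemaString(schema: StructType): String =
    schema.fields.map { f =>
      val typ = f.dataType match {
        case IntegerType => "INTEGER"
        case LongType    => "BIGINT"
        case DoubleType  => "DOUBLE PRECISION"
        case StringType  => "TEXT"   // rejected by Oracle
        case _           => "TEXT"
      }
      s"${f.name} $typ"
    }.mkString(", ")

  val schema = StructType(Seq(
    StructField("id", LongType),       // hypothetical columns for illustration
    StructField("name", StringType)))
  val sql = s"CREATE TABLE MY_ORACLE_TABLE (${sketchSchemaString(schema)})"
  // => CREATE TABLE MY_ORACLE_TABLE (id BIGINT, name TEXT)
  //    Oracle accepts neither BIGINT nor TEXT as column types.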




