[ https://issues.apache.org/jira/browse/SPARK-36802?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Vladislav updated SPARK-36802:
------------------------------
    Description: 
After writing strings containing symbols like "\" to Hive, the resulting 
record in the Hive table no longer contains that symbol. This happens both with the 
standard pyspark.sql.readwriter.DataFrameWriter.saveAsTable method and with 
insertInto.

For example, running the query

spark.sql("select '\d{4}' as code").write.saveAsTable('db.table')

I get the following result in Hive:

spark.table('db.table').collect()[0][0]

>> "d{4}"

whereas the expected result is

>> "\d{4}"

Spark version: '2.3.0.2.6.5.0-292'
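
A self-contained version of the reproduction (the SparkSession setup, the overwrite mode, and the database/table names below are illustrative additions; only the two statements above come from the report):

from pyspark.sql import SparkSession

# Illustrative setup; the report assumes an already running Hive-enabled session.
spark = (SparkSession.builder
         .appName("SPARK-36802-repro")
         .enableHiveSupport()
         .getOrCreate())

# Escaping the backslash ("\\d") keeps the Python string unambiguous, so the
# Spark SQL parser receives the literal '\d{4}', as in the report.
spark.sql("select '\\d{4}' as code").write.mode("overwrite").saveAsTable("db.table")

# Read the value back from the Hive table.
print(spark.table("db.table").collect()[0][0])
# reported output:  d{4}   (the leading backslash is lost)
# expected output:  \d{4}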

 


> Strings containing symbols like "\" are written to Hive incorrectly
> --------------------------------------------------------------------
>
>                 Key: SPARK-36802
>                 URL: https://issues.apache.org/jira/browse/SPARK-36802
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Vladislav
>            Priority: Major
>
> After writing strings containing symbols like "\" to Hive, the resulting 
> record in the Hive table no longer contains that symbol. This happens both with the 
> standard pyspark.sql.readwriter.DataFrameWriter.saveAsTable method as well 
> as insertInto.
> For example, running the query
> spark.sql("select '\d{4}' as code").write.saveAsTable('db.table')
> I get the following result in Hive:
> spark.table('db.table').collect()[0][0]
> >> "d{4}"
> whereas the expected result is
> >> "\d{4}"
> Spark version: '2.3.0.2.6.5.0-292'
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
