[ 
https://issues.apache.org/jira/browse/SPARK-19312?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15832072#comment-15832072
 ] 

Sean Owen commented on SPARK-19312:
-----------------------------------

Hive on Spark is part of Hive, not Spark.

> Spark gives wrong error message when it fails to create a file due to HDFS 
> quota limit
> -----------------------------------------------------------------------------------
>
>                 Key: SPARK-19312
>                 URL: https://issues.apache.org/jira/browse/SPARK-19312
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.6.0
>         Environment: CDH 5.8
>            Reporter: Rivkin Andrey
>            Priority: Minor
>              Labels: hdfs, quota, spark
>
> If we set a quota on the user space and then try to create a table through 
> Hive on Spark that needs more space than is available, Spark fails with: 
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: 
> org.apache.hadoop.hive.ql.metadata.HiveException: 
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException):
>  failed to create file 
> /user/xxxx/hive_db/.hive-staging_hive_..../_task_tmp.-ext-10003/_tmp.000030_0 
> for DFSClient_NONMAPREDUCE_-27052423_230 for client 192.168.x.x because 
> current leaseholder is trying to recreate file.
> If we change the Hive execution engine to MR and execute the same command 
> (create table), we get:
> Caused by: org.apache.hadoop.hdfs.protocol.DSQuotaExceededException: The 
> DiskSpace quota of /user/xxxx is exceeded: quota = 10737418240 B = 10 GB but 
> diskspace consumed = 11098812438 B = 10.34 GB
> After increasing the quota, Hive on Spark works. 
> The problem is the error message, which is inaccurate and unhelpful.
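For context, the quota scenario described above can be reproduced with standard HDFS admin commands; the path and sizes below are illustrative placeholders, and this sketch assumes admin access to a running HDFS cluster:

```shell
# Set a 10 GB space quota on a user directory (illustrative path).
hdfs dfsadmin -setSpaceQuota 10g /user/xxxx

# Verify the quota and current consumption (shows remaining space quota).
hdfs dfs -count -q -h /user/xxxx

# Any write exceeding the quota should surface DSQuotaExceededException,
# e.g. copying in a file larger than the remaining quota:
#   hdfs dfs -put large_file.dat /user/xxxx/

# Remove the quota again when finished.
hdfs dfsadmin -clrSpaceQuota /user/xxxx
```

With MR as the execution engine the DSQuotaExceededException above is reported directly; the bug is that the Hive-on-Spark path masks it behind an AlreadyBeingCreatedException.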



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
