[jira] [Created] (HIVE-15256) Closing one session deletes resourceDir, causing other session opens to fail

2016-11-21 Thread meiyoula (JIRA)
meiyoula created HIVE-15256:
---

 Summary: Closing one session deletes resourceDir, causing other session opens to fail
 Key: HIVE-15256
 URL: https://issues.apache.org/jira/browse/HIVE-15256
 Project: Hive
  Issue Type: Bug
  Components: Clients
Reporter: meiyoula


*resourceDir* is shared across clients. When one connected client closes, it deletes *resourceDir*. Any other clients that are opening a session at the same time will then fail.

Exception is below:
{quote}
Error opening session: | org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:536)
java.lang.RuntimeException: ExitCodeException exitCode=1: chmod: cannot access `/opt/huawei/Bigdata/tmp/spark/dlresources': No such file or directory
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:528)
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:477)
	at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:229)
	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:191)
	at org.apache.spark.sql.hive.client.ClientWrapper.newSession(ClientWrapper.scala:1053)
	at org.apache.spark.sql.hive.HiveContext.newSession(HiveContext.scala:93)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLSessionManager.openSession(SparkSQLSessionManager.scala:89)
	at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:189)
	at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:654)
	at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:522)
	at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1257)
	at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1242)
	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
	at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:690)
	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: ExitCodeException exitCode=1: chmod: cannot access `/opt/huawei/Bigdata/tmp/spark/dlresources': No such file or directory
	at org.apache.hadoop.util.Shell.runCommand(Shell.java:561)
	at org.apache.hadoop.util.Shell.run(Shell.java:472)
	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:738)
	at org.apache.hadoop.util.Shell.execCommand(Shell.java:831)
	at org.apache.hadoop.util.Shell.execCommand(Shell.java:814)
	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:744)
	at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:502)
	at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:542)
	at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:520)
	at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:340)
	at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:656)
	at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:584)
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:514)
	... 18 more
{quote}
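One way to avoid the race described above is to stop sharing a single directory for session-scoped files. The sketch below is a hypothetical illustration (not the actual Hive patch): each session works in its own subdirectory under the shared root, so closing one session removes only that session's files and can never delete the directory another session is opening. The class and method names here are invented for the example.

```java
import java.io.File;

// Hypothetical sketch: per-session subdirectory under the shared resource root.
public class SessionResourceDir {
    private final File sessionDir;

    public SessionResourceDir(File sharedRoot, String sessionId) {
        this.sessionDir = new File(sharedRoot, sessionId);
        // mkdirs() also creates the shared root if it is missing; re-checking
        // isDirectory() tolerates another session creating it concurrently.
        if (!sessionDir.mkdirs() && !sessionDir.isDirectory()) {
            throw new IllegalStateException("cannot create " + sessionDir);
        }
    }

    public File dir() {
        return sessionDir;
    }

    // Close deletes only this session's subdirectory; the shared root
    // (the equivalent of resourceDir) stays in place for other sessions.
    public void close() {
        File[] files = sessionDir.listFiles();
        if (files != null) {
            for (File f : files) {
                f.delete();
            }
        }
        sessionDir.delete();
    }
}
```

With this layout, a `close()` on one session leaves every other session's directory, and the shared root itself, untouched.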



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (HIVE-11166) HiveHBaseTableOutputFormat can't be passed to getFileExtension(JobConf jc, boolean isCompressed, HiveOutputFormat hiveOutputFormat)

2015-07-01 Thread meiyoula (JIRA)
meiyoula created HIVE-11166:
---

 Summary: HiveHBaseTableOutputFormat can't be passed to getFileExtension(JobConf jc, boolean isCompressed, HiveOutputFormat hiveOutputFormat)
 Key: HIVE-11166
 URL: https://issues.apache.org/jira/browse/HIVE-11166
 Project: Hive
  Issue Type: Bug
  Components: HBase Handler, Spark
Reporter: meiyoula


 I create an HBase table with HBaseStorageHandler in the JDBCServer of Spark, then execute an *insert into* SQL statement, and a ClassCastException occurs.
{quote}
Error: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 3.0 failed 4 times, most recent failure: Lost task 1.3 in stage 3.0 (TID 12, vm-17): java.lang.ClassCastException: org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat cannot be cast to org.apache.hadoop.hive.ql.io.HiveOutputFormat
	at org.apache.spark.sql.hive.SparkHiveWriterContainer.outputFormat$lzycompute(hiveWriterContainers.scala:72)
	at org.apache.spark.sql.hive.SparkHiveWriterContainer.outputFormat(hiveWriterContainers.scala:71)
	at org.apache.spark.sql.hive.SparkHiveWriterContainer.getOutputName(hiveWriterContainers.scala:91)
	at org.apache.spark.sql.hive.SparkHiveWriterContainer.initWriters(hiveWriterContainers.scala:115)
	at org.apache.spark.sql.hive.SparkHiveWriterContainer.executorSideSetup(hiveWriterContainers.scala:84)
	at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.org$apache$spark$sql$hive$execution$InsertIntoHiveTable$$writeToFile$1(InsertIntoHiveTable.scala:112)
	at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$saveAsHiveFile$3.apply(InsertIntoHiveTable.scala:93)
	at org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$saveAsHiveFile$3.apply(InsertIntoHiveTable.scala:93)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
	at org.apache.spark.scheduler.Task.run(Task.scala:56)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:197)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
{quote}

This is caused by the Spark code below. For an HBase table, the outputFormat is HiveHBaseTableOutputFormat, which is not an instance of HiveOutputFormat, so the unconditional cast fails.
{quote}
@transient private lazy val outputFormat =
  conf.value.getOutputFormat.asInstanceOf[HiveOutputFormat[AnyRef, Writable]]
val extension = Utilities.getFileExtension(conf.value,
  fileSinkConf.getCompressed, outputFormat)
{quote}
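The shape of a possible guard can be sketched in isolation: check `instanceof` before casting and fall back to no extension for output formats that don't implement HiveOutputFormat, rather than letting a ClassCastException abort the insert job. The interfaces and stub classes below are stand-ins invented for this sketch, not the real Hive/Hadoop types, and ".deflate" is just a placeholder extension.

```java
// Stand-in types mirroring the relevant hierarchy: HiveHBaseTableOutputFormat
// implements only the generic OutputFormat contract, not HiveOutputFormat.
interface OutputFormat {}
interface HiveOutputFormat extends OutputFormat {}

class HBaseTableOutputFormatStub implements OutputFormat {}
class TextHiveOutputFormatStub implements HiveOutputFormat {}

public class ExtensionGuard {
    // Guarded lookup: cast only after an instanceof check, and return the
    // empty extension for non-Hive output formats such as the HBase one.
    static String fileExtension(OutputFormat of, boolean compressed) {
        if (compressed && of instanceof HiveOutputFormat) {
            return ".deflate"; // placeholder extension for the sketch
        }
        return "";
    }
}
```

With a guard of this shape, an HBase-backed table simply gets no compressed-file extension instead of crashing the task.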



