Hi Ravindra,

Thank you for your reply.
I tried renaming the table several times, but the exception is still
there. I also tried deleting the related lockfile; unfortunately that did
not work either.
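
For reference, here is a minimal sketch (run in the spark-shell) of the
cleanup I tried, following your <databasename>/<tablename>/lockfile hint.
The "default" and "my_table" names, and the assumption that the lock lives
under java.io.tmpdir, are placeholders, not confirmed CarbonData internals:

import java.io.File

// Assumed layout per the hint: <system temp dir>/<databasename>/<tablename>/lockfile
// "default" and "my_table" are placeholders for the real database and table names.
val tmpDir = System.getProperty("java.io.tmpdir")
val lockFile = new File(s"$tmpDir/default/my_table/lockfile")

if (lockFile.exists()) {
  println(s"Deleting stale lock: ${lockFile.getAbsolutePath}")
  lockFile.delete()
} else {
  println("No lock file found; the lock may be held by another live driver instead.")
}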

2016-07-26 18:49 GMT+08:00 Ravindra Pesala <ravi.pes...@gmail.com>:

> Hi,
>
> Are you getting this exception continuously for every load? It usually
> occurs when you try to load data concurrently into the same table. So
> please make sure that no other instance of carbon is running and that no
> other data load on the same table is in progress.
> Check whether any locks were created under the system temp folder at
> <databasename>/<tablename>/lockfile; if the file exists, please delete it.
>
> Thanks & Regards,
> Ravindra.
>
> On 26 July 2016 at 15:23, Zen Wellon <ustc...@gmail.com> wrote:
>
> > Hi, today I used the latest carbondata to create a table with a
> > "dictionary_exclude" declaration. When I try to load data from a CSV,
> > carbondata throws "java.lang.RuntimeException: Table is locked for
> > updation. Please try after some time". Any help? Thanks.
> >
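> > For reference, a minimal sketch of the kind of statements involved (the
> > table schema, CSV path, and STORED BY clause below are illustrative
> > placeholders, not my exact DDL):
> >
> > // cc is the CarbonContext from the spark-shell
> > cc.sql("""CREATE TABLE IF NOT EXISTS my_table (id Int, name String)
> >           STORED BY 'carbondata'
> >           TBLPROPERTIES ('DICTIONARY_EXCLUDE'='name')""")
> > cc.sql("LOAD DATA INPATH 'hdfs://path/to/data.csv' INTO TABLE my_table")
> >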
> > Below is the full stack trace:
> >
> > java.lang.RuntimeException: Table is locked for updation. Please try after some time
> >         at scala.sys.package$.error(package.scala:27)
> >         at org.apache.spark.sql.execution.command.LoadTable.run(carbonTableSchema.scala:1045)
> >         at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
> >         at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
> >         at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
> >         at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
> >         at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
> >         at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
> >         at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
> >         at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:55)
> >         at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:55)
> >         at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:145)
> >         at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
> >         at org.carbondata.spark.rdd.CarbonDataFrameRDD.<init>(CarbonDataFrameRDD.scala:23)
> >         at org.apache.spark.sql.CarbonContext.sql(CarbonContext.scala:131)
> >         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
> >         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
> >         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
> >         at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
> >         at $iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
> >         at $iwC$$iwC$$iwC.<init>(<console>:48)
> >         at $iwC$$iwC.<init>(<console>:50)
> >         at $iwC.<init>(<console>:52)
> >         at <init>(<console>:54)
> >         at .<init>(<console>:58)
> >         at .<clinit>(<console>)
> >         at .<init>(<console>:7)
> >         at .<clinit>(<console>)
> >         at $print(<console>)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:606)
> >         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
> >         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
> >         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
> >         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
> >         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
> >         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
> >         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
> >         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
> >         at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
> >         at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
> >         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
> >         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
> >         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
> >         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
> >         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
> >         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
> >         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
> >         at org.apache.spark.repl.Main$.main(Main.scala:31)
> >         at org.apache.spark.repl.Main.main(Main.scala)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:606)
> >         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> >         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> >         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> >         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> >         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> >
> > Best regards,
> > William Zen
> >
>
>
>
> --
> Thanks & Regards,
> Ravi
>



-- 
Best regards,
William Zen
