I don't think it's caused by a lockfile, because I've tried to recreate a
new table with a totally different name. However, I'll check it tomorrow.
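
For reference, here is a rough Scala sketch of the check I have in mind (the
database/table names are just placeholders based on the
<databasename>/<tablename>/lockfile layout you described, not something I've
confirmed against the carbon source):

  import java.io.File

  object LockFileCheck {
    def main(args: Array[String]): Unit = {
      // System temp folder, as seen by the JVM (java.io.tmpdir).
      val tmpDir = System.getProperty("java.io.tmpdir")
      // Placeholder database/table names; adjust to the real table.
      val lockFile = new File(new File(tmpDir, "mydatabase/mytable"), "lockfile")

      if (lockFile.exists()) {
        println(s"Found stale lock: ${lockFile.getAbsolutePath}")
        if (lockFile.delete()) println("Deleted it")
        else println("Could not delete it, please check permissions")
      } else {
        println("No lock file found")
      }
    }
  }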

2016-08-29 23:09 GMT+08:00 Ravindra Pesala <ravi.pes...@gmail.com>:

> Hi,
>
> Did you check if any locks are created under the system temp folder at
> <databasename>/<tablename>/lockfile? If it exists, please delete it and try
> again.
>
> Thanks,
> Ravi.
>
> On 29 August 2016 at 20:29, Zen Wellon <ustc...@gmail.com> wrote:
>
> > Hi Ravi,
> >
> > After I upgraded carbon to 0.1.0, this problem occurs every time I try
> > to load data, and I'm sure no other carbon instance is running because I
> > use my personal dev spark cluster. I've also tried to recreate a new
> > table, but it's still there..
> >
> > 2016-08-29 18:11 GMT+08:00 Ravindra Pesala <ravi.pes...@gmail.com>:
> >
> > > Hi,
> > >
> > > Are you getting this exception continuously for every load? Usually it
> > > occurs when you try to load data concurrently into the same table, so
> > > please make sure that no other instance of carbon is running and that no
> > > data load on the same table is in progress.
> > > Check if any locks are created under the system temp folder at
> > > <databasename>/<tablename>/lockfile; if it exists, please delete it.
> > >
> > > Thanks & Regards,
> > > Ravi
> > >
> > > On Mon, 29 Aug 2016 1:27 pm Zen Wellon, <ustc...@gmail.com> wrote:
> > >
> > > > Hi guys,
> > > > When I tried to load some data into a carbondata table with carbon
> > > > 0.1.0, I hit the problem below.
> > > >
> > > > WARN  29-08 15:40:17,535 - Lost task 10.0 in stage 2.1 (TID 365,
> > > > amlera-30-6.gtj): java.lang.RuntimeException: Dictionary file
> > > > ***(sensitive column) is locked for updation. Please try after some time
> > > >         at scala.sys.package$.error(package.scala:27)
> > > >         at org.apache.carbondata.spark.rdd.CarbonGlobalDictionaryGenerateRDD$$anon$1.<init>(CarbonGlobalDictionaryRDD.scala:354)
> > > >         at org.apache.carbondata.spark.rdd.CarbonGlobalDictionaryGenerateRDD.compute(CarbonGlobalDictionaryRDD.scala:294)
> > > >         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> > > >         at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> > > >         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> > > >         at org.apache.spark.scheduler.Task.run(Task.scala:89)
> > > >         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
> > > >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> > > >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> > > >         at java.lang.Thread.run(Thread.java:745)
> > > >
> > > > --
> > > >
> > > >
> > > > Best regards,
> > > > William Zen
> > > >
> > >
> >
> >
> >
> > --
> >
> >
> > Best regards,
> > William Zen
> >
>
>
>
> --
> Thanks & Regards,
> Ravi
>



-- 


Best regards,
William Zen
