Hi Sounak,

It works: the records are deleted once I add an action such as show(), as
you suggested. The affected segments now show up as Marked for Delete (a
sketch of the full sequence follows the listing):

+-----------------+-----------------+--------------------+--------------------+
|SegmentSequenceId|           Status|     Load Start Time|       Load End Time|
+-----------------+-----------------+--------------------+--------------------+
|                2|Marked for Delete|2017-03-30 10:26:...|2017-03-30 10:26:...|
|                1|Marked for Delete|2017-03-30 09:01:...|2017-03-30 09:01:...|
+-----------------+-----------------+--------------------+--------------------+
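
For reference, here is roughly the sequence I ran (a minimal sketch; the
show segments call is my assumption about how the listing above was
produced, and cc is the CarbonContext from my first mail):

    // Delete is lazy in Spark: cc.sql(...) only builds a DataFrame, so
    // nothing executes until an action such as show() forces evaluation.
    cc.sql("delete from accountentity").show()

    // Re-running the count now reflects the deletion.
    cc.sql("select * from accountentity").count

    // The deleted segments stay visible, flagged "Marked for Delete".
    cc.sql("show segments for table accountentity").show()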


This looks like an issue to me; if it is the intended behaviour, the
documentation may need an update to call it out. Can I raise a JIRA?


Thanks

On Thu, Mar 30, 2017 at 1:29 PM, sounak <soun...@gmail.com> wrote:

> Hi Sanoj,
>
> Can you please try
>
> sql("delete from accountentity").show()
>
> Thanks
> Sounak
>
> On Thu, Mar 30, 2017 at 2:31 PM, Sanoj MG <sanoj.george....@gmail.com>
> wrote:
>
> > Hi All,
> >
> > Is delete records from carbondata table supported?
> >
> >
> > As per below doc I am trying to delete entries from the table :
> > https://github.com/apache/incubator-carbondata/blob/master/docs/dml-operation-on-carbondata.md
> >
> >
> > scala> cc.sql("select * from accountentity").count
> > res10: Long = 391351
> >
> > scala> cc.sql("delete from accountentity")
> > INFO  30-03 09:03:03,099 - main Query [DELETE FROM ACCOUNTENTITY]
> > INFO  30-03 09:03:03,104 - Parsing command: select tupleId from accountentity
> > INFO  30-03 09:03:03,104 - Parse Completed
> > INFO  30-03 09:03:03,105 - Parsing command: select tupleId from accountentity
> > INFO  30-03 09:03:03,105 - Parse Completed
> > res11: org.apache.spark.sql.DataFrame = []
> >
> > scala> cc.sql("select * from accountentity").count
> > res12: Long = 391351
> >
> > Is deletion some sort of lazy operation?
> >
>
>
>
> --
> Thanks
> Sounak
>
