Below is the complete stack trace:

INFO [main] org.apache.phoenix.iterate.BaseResultIterators: Failed to execute task during cancel
java.util.concurrent.ExecutionException: java.lang.ArrayIndexOutOfBoundsException: 45
        at java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.ut
I am new to the HBase and Phoenix world.
I have designed and executed a MapReduce job which writes around 2.4 billion
cells (rows * columns) into HBase via Phoenix in about 80 minutes. I reduced
"mapreduce.input.fileinputformat.split.maxsize" to 8 MB to increase the number
of mappers, which helped me.
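For reference, a minimal sketch of how that property can be lowered in a job driver. The property name is the standard Hadoop one; the class name, job name, and everything after the configuration lines are hypothetical placeholders, not the poster's actual job:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class LoadDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Cap each input split at 8 MB; smaller splits mean more splits,
        // and hence more map tasks running in parallel.
        conf.setLong("mapreduce.input.fileinputformat.split.maxsize", 8L * 1024 * 1024);
        Job job = Job.getInstance(conf, "phoenix-bulk-write"); // hypothetical job name
        // ... set InputFormat, Mapper, output committer, etc., as in the real job
    }
}
```

The same setting can also be passed on the command line with `-D mapreduce.input.fileinputformat.split.maxsize=8388608` if the driver uses `ToolRunner`.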
Not sure if this works for the view use case you have, but it works for
a Phoenix table.
The CREATE TABLE statement should contain just the stable columns.
CREATE TABLE IF NOT EXISTS TESTC (
    TIMESTAMP BIGINT NOT NULL,
    NAME VARCHAR NOT NULL,
    CONSTRAINT PK PRIMARY KEY (TIMESTAMP, NAME)
);
-- insert
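As a note on the insert step: Phoenix has no INSERT statement; rows are written with UPSERT, which inserts a new row or updates an existing one with the same primary key. A minimal example against the table above, with illustrative values:

```sql
-- UPSERT acts as insert-or-update on the primary key (TIMESTAMP, NAME).
-- The values here are made up for illustration.
UPSERT INTO TESTC (TIMESTAMP, NAME) VALUES (1700000000000, 'host-1');
```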