Was there any error in the log of CsvBulkLoadTool?

Which HBase release do you use?

BTW, Phoenix 4.4 is a pretty old release. Please consider upgrading to a
newer release.
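Also, for reference: a quick way to tell whether the second load actually changed the table is to compare row counts before and after it runs (a sketch, assuming you can connect with sqlline.py and that T3 is the table from your schema below):

```sql
-- Run in sqlline.py before and after the second bulk load.
SELECT COUNT(*) FROM T3;

-- If the count is unchanged, note that DeviceName is the entire
-- primary key: rows in the new files whose DeviceName matches an
-- existing row are upserted (overwritten), not appended. Spot-check
-- a key you expect from the second file:
SELECT * FROM T3 WHERE DeviceName = 'some-device-from-file-2';  -- hypothetical key
```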

On Mon, Aug 8, 2016 at 3:07 PM, spark4plug <spark4p...@gmail.com> wrote:

> Hi folks, looking for help with bulk loading about 10 txt files into
> HBase using Apache Phoenix.
> After I create the table, I upload the first file, about 6GB worth. When I
> run the next bulkload command, the table does not change. This is the
> command:
>
> HADOOP_CLASSPATH=$(hbase mapredcp):/etc/hbase/conf  hadoop jar
> /usr/hdp/current/phoenix-client/phoenix-4.4.0.2.4.2.4-5-client.jar
> org.apache.phoenix.mapreduce.CsvBulkLoadTool --table T3 -g  --input /load/
> --zookeeper myzookpt:2181:/hbase-unsecure
>
> my table schema is here
>
> CREATE TABLE T3 (
> DeviceName varchar NOT NULL PRIMARY KEY,
> Timestamp  varchar,
> value decimal,
> validity INTEGER,
> location varchar);
>
> --
> View this message in context: http://apache-hbase.679495.n3.nabble.com/bulkload-does-not-update-existing-table-tp4081652.html
> Sent from the HBase User mailing list archive at Nabble.com.
>
