Sorry, I meant to say table names are case sensitive.
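
To spell it out: because the table was created as "dummny" — sorry, "dummy" — in
double quotes, Phoenix stores the name in lower case, whereas an unquoted name is
folded to upper case, hence the "Table DUMMY not found" error further down. The
catch on the command line is that the shell strips unescaped double quotes, so
they have to be escaped to reach Phoenix at all. Something along these lines
should work, assuming the tool feeds the name through Phoenix's normal identifier
handling (untested sketch; the jar and paths are copied from your command):

# Escape the inner quotes so Phoenix sees "dummy" with its case preserved,
# rather than a bare dummy, which it would normalize to DUMMY.
HADOOP_CLASSPATH=/home/hduser/jars/hbase-protocol-1.2.3.jar:/usr/lib/hbase/conf \
hadoop jar phoenix-4.8.1-HBase-1.2-client.jar \
  org.apache.phoenix.mapreduce.CsvBulkLoadTool \
  --table '"dummy"' \
  --input /data/prices/2016-10-23/prices.1477228923115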

On Sun, Oct 23, 2016 at 9:06 AM, Ravi Kiran <[email protected]>
wrote:

> Hi Mich,
>    Apparently, table names are case sensitive. Since you enclosed the table
> name in double quotes when creating it, please pass it the same way when
> running the bulk load job.
>
> HADOOP_CLASSPATH=/home/hduser/jars/hbase-protocol-1.2.3.jar:/usr/lib/hbase/conf
> hadoop jar phoenix-4.8.1-HBase-1.2-client.jar
> org.apache.phoenix.mapreduce.CsvBulkLoadTool --table "dummy" --input
> /data/prices/2016-10-23/prices.1477228923115
>
> Regards
>
>
> On Sun, Oct 23, 2016 at 8:39 AM, Mich Talebzadeh <
> [email protected]> wrote:
>
>> I am not sure whether phoenix-4.8.1-HBase-1.2-client.jar is the correct
>> jar file.
>>
>> Thanks
>>
>> Dr Mich Talebzadeh
>>
>>
>>
>> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>
>>
>>
>> http://talebzadehmich.wordpress.com
>>
>>
>> Disclaimer: Use it at your own risk. Any and all responsibility for
>> any loss, damage or destruction of data or any other property which may
>> arise from relying on this email's technical content is explicitly
>> disclaimed. The author will in no case be liable for any monetary damages
>> arising from such loss, damage or destruction.
>>
>>
>>
>> On 23 October 2016 at 15:39, Mich Talebzadeh <[email protected]>
>> wrote:
>>
>>> Hi,
>>>
>>> My stack
>>>
>>> HBase: hbase-1.2.3
>>> Phoenix: apache-phoenix-4.8.1-HBase-1.2-bin
>>>
>>>
>>> As suggested, I tried to load a CSV file into HBase via
>>> org.apache.phoenix.mapreduce.CsvBulkLoadTool.
>>>
>>> So
>>>
>>> I created a dummy table in HBase as below:
>>>
>>> create 'dummy', 'price_info'
>>>
>>> Then in Phoenix I created a table on top of the HBase table:
>>>
>>>
>>> create table "dummy" (PK VARCHAR PRIMARY KEY, "price_info"."ticker"
>>> VARCHAR,"price_info"."timecreated" VARCHAR, "price_info"."price"
>>> VARCHAR);
>>>
>>> And then used the following command to load the CSV file:
>>>
>>>  
>>> HADOOP_CLASSPATH=/home/hduser/jars/hbase-protocol-1.2.3.jar:/usr/lib/hbase/conf
>>> hadoop jar phoenix-4.8.1-HBase-1.2-client.jar
>>> org.apache.phoenix.mapreduce.CsvBulkLoadTool --table dummy --input
>>> /data/prices/2016-10-23/prices.1477228923115
>>>
>>> However, it does not seem to find the table dummy!
>>>
>>> 2016-10-23 14:38:39,442 INFO  [main] metrics.Metrics: Initializing metrics system: phoenix
>>> 2016-10-23 14:38:39,479 INFO  [main] impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
>>> 2016-10-23 14:38:39,529 INFO  [main] impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
>>> 2016-10-23 14:38:39,529 INFO  [main] impl.MetricsSystemImpl: phoenix metrics system started
>>> Exception in thread "main" java.lang.IllegalArgumentException: Table DUMMY not found
>>>         at org.apache.phoenix.util.SchemaUtil.generateColumnInfo(SchemaUtil.java:873)
>>>         at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.buildImportColumns(AbstractBulkLoadTool.java:377)
>>>         at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:214)
>>>         at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:183)
>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>>>         at org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:101)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:498)
>>>         at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
>>>
>>> I tried putting it inside double quotes etc., but no joy, I am afraid!
>>>
>>> Dr Mich Talebzadeh
>>>
>>>
>>>
>>> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>>
>>>
>>>
>>> http://talebzadehmich.wordpress.com
>>>
>>>
>>> Disclaimer: Use it at your own risk. Any and all responsibility for
>>> any loss, damage or destruction of data or any other property which may
>>> arise from relying on this email's technical content is explicitly
>>> disclaimed. The author will in no case be liable for any monetary damages
>>> arising from such loss, damage or destruction.
>>>
>>>
>>>
>>
>>
>
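
As a side note on the steps quoted above: if in doubt about how Phoenix has
actually stored the table name, it can be checked in SYSTEM.CATALOG from
sqlline. A minimal sketch, assuming a default local install (the sqlline path
and the ZooKeeper quorum below are placeholders, not taken from your setup):

# Run a one-off query through sqlline; a case-sensitive name shows up exactly
# as created (dummy), while an unquoted one would appear upper-cased (DUMMY).
cat > /tmp/check_table.sql <<'EOF'
SELECT DISTINCT TABLE_SCHEM, TABLE_NAME FROM SYSTEM.CATALOG;
EOF
/usr/lib/phoenix/bin/sqlline.py localhost:2181:/hbase /tmp/check_table.sql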
