hbase-site.xml is in the /home/ubuntu/phoenix/bin directory.

I am using a Java program to run it, and I have tried three ways to specify
the classpath:

1. CLASSPATH environment variable in .bashrc:

export CLASSPATH=$CLASSPATH:/home/ubuntu/phoenix/bin:/home/ubuntu/phoenix/phoenix-4.8.1-HBase-1.2-client.jar

2. specifying the classpath on the command line:

java -Xmx12g -cp "/home/ubuntu/phoenix/phoenix-4.8.1-HBase-1.2-client.jar:/home/ubuntu/phoenix/bin" execQuery

3. directly copying the Phoenix client jar and hbase-site.xml into the same
directory as the Java driver program
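
For reference, here is a minimal sketch I can run to confirm which copy of
hbase-site.xml (if any) the JVM actually sees; the class name is just for
illustration:

public class ClasspathCheck {
    public static void main(String[] args) {
        // getResource returns null if hbase-site.xml is not visible
        // anywhere on the effective classpath.
        java.net.URL conf = ClasspathCheck.class.getClassLoader()
                .getResource("hbase-site.xml");
        System.out.println("hbase-site.xml resolved to: " + conf);
    }
}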

Thanks,
Ashish


On Mon, Oct 31, 2016 at 5:34 PM, Josh Elser <josh.el...@gmail.com> wrote:

> Is the directory containing hbase-site.xml where you have made the
> modification included on your overridden CLASSPATH? How are you running
> this query -- is it on the classpath for that program?
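>
> A quick one-line check from inside the program (just a sketch) is to
> print the effective classpath:
>
> System.out.println(System.getProperty("java.class.path"));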
>
>
> ashish tapdiya wrote:
>
>> Query:
>>
>> SELECT /*+ NO_STAR_JOIN */ IP, RANK, TOTAL
>> FROM (SELECT SOURCEIPADDR AS IP, AVG(PAGERANK) AS RANK, SUM(ADREVENUE) AS TOTAL
>>       FROM uservisits_sf1 AS UV
>>       INNER JOIN rankings_sf1 AS R ON R.PAGEURL = UV.DESTINATIONURL
>>       WHERE UV.VISITDATE >= TO_DATE('2000-01-15')
>>         AND UV.VISITDATE < TO_DATE('2000-01-22')
>>       GROUP BY UV.SOURCEIPADDR)
>> ORDER BY TOTAL DESC LIMIT 1
>>
>> Execution plan:
>>
>> CLIENT 2-CHUNK PARALLEL 2-WAY FULL SCAN OVER USERVISITS_SF1
>>     SERVER FILTER BY (VISITDATE >= DATE '2000-01-15 00:00:00.000' AND VISITDATE < DATE '2000-01-22 00:00:00.000')
>>     SERVER AGGREGATE INTO DISTINCT ROWS BY [UV.SOURCEIPADDR]
>> CLIENT MERGE SORT
>> CLIENT TOP 1 ROW SORTED BY [SUM(UV.ADREVENUE) DESC]
>>     PARALLEL INNER-JOIN TABLE 0
>>         CLIENT 2-CHUNK PARALLEL 2-WAY ROUND ROBIN FULL SCAN OVER RANKINGS_SF1
>>     DYNAMIC SERVER FILTER BY UV.DESTINATIONURL IN (R.PAGEURL)
>>
>> Error in the client:
>>
>> java.sql.SQLException: Encountered exception in sub plan [0] execution.
>>         at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:199)
>>         at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:143)
>>         at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:138)
>>         at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:281)
>>         at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:266)
>>         at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>>         at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:265)
>>         at org.apache.phoenix.jdbc.PhoenixPreparedStatement.executeQuery(PhoenixPreparedStatement.java:186)
>>         at execQuery.execQueryHJ3(execQuery.java:152)
>>         at execQuery.main(execQuery.java:25)
>> Caused by: java.lang.OutOfMemoryError
>>         at java.io.ByteArrayOutputStream.hugeCapacity(ByteArrayOutputStream.java:123)
>>         at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:117)
>>         at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
>>         at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:153)
>>         at java.io.DataOutputStream.write(DataOutputStream.java:107)
>>         at org.apache.phoenix.util.TupleUtil.write(TupleUtil.java:153)
>>         at org.apache.phoenix.join.HashCacheClient.serialize(HashCacheClient.java:108)
>>         at org.apache.phoenix.join.HashCacheClient.addHashCache(HashCacheClient.java:83)
>>         at org.apache.phoenix.execute.HashJoinPlan$HashSubPlan.execute(HashJoinPlan.java:385)
>>         at org.apache.phoenix.execute.HashJoinPlan$1.call(HashJoinPlan.java:167)
>>         at org.apache.phoenix.execute.HashJoinPlan$1.call(HashJoinPlan.java:163)
>>         at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>         at org.apache.phoenix.job.JobManager$InstrumentedJobFutureTask.run(JobManager.java:183)
>>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>         at java.lang.Thread.run(Thread.java:745)
>>
>> The rankings_sf1 table (which gets serialized on the client) is 2.9 GB.
>> The Java client is started with the -Xmx12g flag. The Phoenix version is
>> 4.8.1, and the cluster has 3 nodes (2 slaves). The relevant
>> hbase-site.xml configuration on both the client and the servers is as
>> follows:
>>
>> <property>
>>   <name>phoenix.query.maxServerCacheBytes</name>
>>   <value>14088576000</value>
>> </property>
>> <property>
>>   <name>phoenix.query.maxGlobalMemoryPercentage</name>
>>   <value>90</value>
>> </property>
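>>
>> As an alternative to relying on the classpath, my understanding is that
>> these client-side properties can also be passed per connection via the
>> Properties object given to the JDBC driver. A minimal sketch, assuming
>> the usual java.sql and java.util imports, where "zk-host" is a
>> placeholder for the ZooKeeper quorum:
>>
>> // Override the client-side cache limit for this connection only.
>> Properties props = new Properties();
>> props.setProperty("phoenix.query.maxServerCacheBytes", "14088576000");
>> Connection conn = DriverManager.getConnection("jdbc:phoenix:zk-host", props);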
>>
>>
>> The bin directory contains hbase-site.xml, and its path is specified in
>> .bashrc as follows:
>>
>> export CLASSPATH=$CLASSPATH:/home/ubuntu/phoenix/bin:/home/ubuntu/phoenix/phoenix-4.8.1-HBase-1.2-client.jar
>>
>>
>> Is this a bug, or am I missing some configuration that is causing this
>> error?
>>
>> Thanks,
>> Ashish
>>
>
