Hey Thomas,

I made those changes in hbase-site.xml on each region server. I have
cross-checked, and the file does appear to be on sqlline.py's classpath, but
the updated config still isn't being picked up. Is there any way to apply
these config values (via the command line, if possible) in Phoenix sqlline?
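For reference, a minimal sketch of the hbase-site.xml entries in question (values taken from this thread; one assumption worth checking is that the value must be the raw byte count, so 1 GB is 1073741824, not the string "1GB"):

```xml
<!-- hbase-site.xml sketch: phoenix.query.maxServerCacheBytes is read on the
     client side, so this file must be on the classpath of the machine
     running sqlline.py, not only on the region servers. -->
<property>
  <name>phoenix.query.maxServerCacheBytes</name>
  <value>1073741824</value> <!-- 1 GB, expressed in bytes -->
</property>
<property>
  <name>phoenix.query.maxGlobalMemoryPercentage</name>
  <value>40</value>
</property>
```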

Thanks,

_Nipur

-----Original Message-----
From: Thomas D'Silva [mailto:tdsi...@salesforce.com] 
Sent: Wednesday, August 12, 2015 1:20 AM
To: user@phoenix.apache.org
Subject: Re: Issue while joining data using phoenix

Nipur,

Are you sure the config change is getting picked up? The exception says the
maximum allowed size is 104857600 bytes (~0.1 GB), not 1 GB.
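A quick sanity check on the numbers (plain arithmetic; the only Phoenix-specific assumption is that 100 MB is the default for phoenix.query.maxServerCacheBytes):

```python
# The limit in the exception matches a 100 MB default, not the intended 1 GB.
mb = 1024 * 1024
print(100 * mb)   # 104857600  -> the "maximum allowed size" in the exception
print(1024 * mb)  # 1073741824 -> what a 1 GB limit would look like
```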

Thanks,
Thomas

On Tue, Aug 11, 2015 at 12:43 AM, Nipur Patodi <er.nipur.pat...@gmail.com> 
wrote:
> Hi All,
>
> I am trying to join data across Phoenix tables on HBase; however, I am
> getting this exception.
>
> Error: Encountered exception in sub plan [0] execution. (state=,code=0)
> java.sql.SQLException: Encountered exception in sub plan [0] execution.
>     at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:157)
>     at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:251)
>     at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:241)
>     at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>     at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:240)
>     at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1250)
>     at sqlline.Commands.execute(Commands.java:822)
>     at sqlline.Commands.sql(Commands.java:732)
>     at sqlline.SqlLine.dispatch(SqlLine.java:808)
>     at sqlline.SqlLine.begin(SqlLine.java:681)
>     at sqlline.SqlLine.start(SqlLine.java:398)
>     at sqlline.SqlLine.main(SqlLine.java:292)
> Caused by: org.apache.phoenix.join.MaxServerCacheSizeExceededException: Size of hash cache (104857664 bytes) exceeds the maximum allowed size (104857600 bytes)
>     at org.apache.phoenix.join.HashCacheClient.serialize(HashCacheClient.java:109)
>     at org.apache.phoenix.join.HashCacheClient.addHashCache(HashCacheClient.java:82)
>     at org.apache.phoenix.execute.HashJoinPlan$HashSubPlan.execute(HashJoinPlan.java:339)
>     at org.apache.phoenix.execute.HashJoinPlan$1.call(HashJoinPlan.java:136)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>     at org.apache.phoenix.job.JobManager$InstrumentedJobFutureTask.run(JobManager.java:172)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>     at java.lang.Thread.run(Thread.java:745)
>
> I also tried setting phoenix.query.maxServerCacheBytes to 1 GB and
> phoenix.query.maxGlobalMemoryPercentage to 40 in hbase-site.xml (via
> Ambari, which I am using), but it didn't work.
>
> Any suggestion would be extremely helpful.
>
> Thanks,
> _Nipur