I'll take care of it ;)
On 25 Oct 2017 23:45, "Sergey Soldatov" wrote:
Hi Flavio,
It looks like you need to ask the vendor, not the community about their
plan for further releases.
Thanks,
Sergey
On Wed, Oct 25, 2017 at 2:21 PM, Flavio Pompermaier wrote:
> Hi to all,
> the latest Phoenix Cloudera parcel I can see is 4.7... any plan to
I don't know why running it inside Spark would cause issues.
I would double-check the classpath of your application when running in
Spark, as well as look at the PQS log (HTTP/500 is a server error).
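As a rough illustration of the classpath check suggested above (the jar and file names here are assumptions; match them to your actual Phoenix version and build):

```shell
# Hedged sketch: confirm the PQS thin-client jar is on the classpath that the
# Spark driver/executors actually see. Jar names below are placeholders.
CLASSPATH="app.jar:phoenix-queryserver-client-4.12.0.jar:guava-13.0.1.jar"
case "$CLASSPATH" in
  *phoenix-queryserver-client*) echo "thin-client jar present" ;;
  *)                            echo "thin-client jar MISSING" ;;
esac
# In a real job you would ship the jar with the application, e.g.:
#   spark-submit --jars /path/to/phoenix-queryserver-client.jar ...
```

If the jar is present but you still get HTTP/500, the server-side stack trace in the PQS log is the next place to look.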
On 10/25/17 6:39 AM, cmbendre wrote:
I am trying to connect to Phoenix queryserver from
Since you're deploying onto a vendor's platform, I suggest asking this
question on the vendor's forum.
Cheers
On Wed, Oct 25, 2017 at 3:59 AM, Sumanta Gh wrote:
Hi,
I am trying to install phoenix-4.12.0 (HBase-1.1) on HDP 2.6.2.0. As per the
installation guide, I have copied phoenix-4.12.0-HBase-1.1-server.jar
into the HBase lib directory. After restarting HBase using Ambari and connecting
through SqlLine, I can see the Phoenix system tables are getting
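For reference, the install step described above amounts to something like the following (the lib path is an assumption for an HDP layout; the restart itself is done from Ambari):

```shell
# Hedged sketch of the server-jar install step; adjust paths to your cluster.
JAR="phoenix-4.12.0-HBase-1.1-server.jar"
HBASE_LIB="/usr/hdp/current/hbase-regionserver/lib"   # assumed HDP path
echo "copy $JAR to $HBASE_LIB on every HBase node, then restart HBase via Ambari"
# i.e., on each node:  cp "$JAR" "$HBASE_LIB/"
```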
I am trying to connect to Phoenix queryserver from Spark. The following Scala
code works perfectly fine when I run it without Spark.

import java.sql.{Connection, DriverManager, PreparedStatement, ResultSet, Statement}
Class.forName("org.apache.phoenix.queryserver.client.Driver")
val connection=
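For context, the thin-client driver in the snippet above connects through a JDBC URL of the following shape (the host is a placeholder; 8765 is the Query Server's default port):

```shell
# Hedged sketch: the Phoenix Query Server (thin client) JDBC URL shape.
PQS_HOST="pqs-host.example.com"   # placeholder host
PQS_URL="jdbc:phoenix:thin:url=http://${PQS_HOST}:8765;serialization=PROTOBUF"
echo "$PQS_URL"
# The Scala snippet would then pass this string to DriverManager.getConnection.
```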
Hi Rafa,
The DFS nameservice issue for Phoenix got resolved after putting the
Hadoop configuration and HBase configuration on the classpath. This can be done
by setting environment variables named HADOOP_HOME and HBASE_HOME on the
respective machines.
Thanks for your support.
Regards,
Mallieswari
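The fix described above, sketched as shell (the HDP-style paths are assumptions; in practice these exports would go in hbase-env.sh or the shell profile on each machine):

```shell
# Hedged sketch: point HADOOP_HOME/HBASE_HOME at the client installs so the
# Hadoop and HBase configuration directories end up on the classpath.
export HADOOP_HOME=/usr/hdp/current/hadoop-client   # assumed HDP path
export HBASE_HOME=/usr/hdp/current/hbase-client     # assumed HDP path
echo "HADOOP_HOME=$HADOOP_HOME HBASE_HOME=$HBASE_HOME"
```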
On Thu,