1. What Spark version are you using? After taking a brief look, there seem
to have been changes in how classpaths are passed in the latest version of
Spark.
2. When pio-shell starts, it should echo the classpath being used. Does
that include a directory that contains your hbase-site.xml?
3. In your pio-shell, do
`sys.env.filterKeys(_.startsWith("PIO_")).foreach(println)`. Do you see
your configuration?
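
For reference, both checks can be pasted straight into pio-shell (a Scala
REPL). This is only a sketch; the resource name `hbase-site.xml` is the
standard HBase client config file, and a null result from `getResource`
means HBase will fall back to its local defaults:

```scala
// 1. Is hbase-site.xml visible on the shell's classpath?
//    Prints a file: URL if found, null otherwise.
println(getClass.getClassLoader.getResource("hbase-site.xml"))

// 2. Which PIO_* settings did the shell's JVM actually inherit?
sys.env.filterKeys(_.startsWith("PIO_")).foreach(println)
```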

Regards,
Donald

On Mon, Jun 11, 2018 at 1:34 PM Miller, Clifford <
clifford.mil...@phoenix-opsgroup.com> wrote:

> We are using PIO 0.12.1.  Are there additional parameters that I should be
> sending in?
>
> Thanks for the response.
>
> --Cliff.
>
>
> On Mon, Jun 11, 2018 at 8:29 PM, Donald Szeto <don...@apache.org> wrote:
>
>> Hi Cliff,
>>
>> What PIO version are you using? We fixed a similar issue in
>> https://issues.apache.org/jira/browse/PIO-72. If it is still happening
>> in 0.12.0+, we will need to investigate.
>>
>> Regards,
>> Donald
>>
>> On Tue, Jun 5, 2018 at 1:35 PM Miller, Clifford <
>> clifford.mil...@phoenix-opsgroup.com> wrote:
>>
>>> I'm running PIO with all remote dependencies.  I have the following:
>>>
>>>    - PIO event server
>>>    - Elasticsearch cluster
>>>    - Spark cluster
>>>    - HBase cluster
>>>
>>> I would like to be able to use pio-shell to count events in the event
>>> store.  I submit the following command:
>>>
>>> pio-shell --with-spark -- --master
>>> spark://ip-10-0-1-88.us-gov-west-1.compute.internal:7077
>>>
>>> This appears to connect me to the Spark cluster.  Then I run:
>>>
>>> import org.apache.predictionio.data.store.PEventStore
>>> PEventStore.find(appName="MyAppName")(sc).count
>>>
>>> This gives me connection errors in the pio-shell.  It appears to be
>>> attempting to connect to a local HBase and not the remote one.
>>>
>>> Stacktrace is attached.
>>>
>>> Thanks,
>>>
>>> --Cliff.
>>>
>>>
>>>
>>>
>
>
> --
> Clifford Miller
> Mobile | 321.431.9089
>