We added support for Elasticsearch 5 last year, so current PredictionIO
uses the HTTP/REST protocol on port 9200, not the native transport protocol
on port 9300.

Here's the local dev config I have working with PIO 0.12.0 and
Elasticsearch 5:

    PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
    PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=$PIO_HOME/vendors/elasticsearch
    PIO_STORAGE_SOURCES_ELASTICSEARCH_SCHEMES=http
    PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost
    PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9200
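
Adapted to your pio-env.sh (keeping your ELASTICSEARCH_HOME path, and
assuming Elasticsearch runs on the same host as PIO), the Elasticsearch
source section would look roughly like this:

    PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
    PIO_STORAGE_SOURCES_ELASTICSEARCH_SCHEMES=http
    # HOSTS assumes Elasticsearch runs locally; adjust if it lives on another machine
    PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost
    PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9200
    PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/elasticsearch-5.5.2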

I believe Elasticsearch 1 is only supported via the native transport protocol
on port 9300, and that support will eventually be removed in a future PIO
release.
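
To sanity-check that Elasticsearch is reachable over REST before re-running
pio status, hitting the HTTP endpoint should return cluster info as JSON
(this is a quick sketch, assuming a default local install listening on 9200):

    # Basic reachability check; prints the cluster name and version as JSON
    curl -s http://localhost:9200

    # Cluster health; "status" should come back green or yellow on a single node
    curl -s http://localhost:9200/_cluster/health?pretty

The transport port (9300) speaks Elasticsearch's binary protocol, so it will
not answer these HTTP requests.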

On Thu, Mar 8, 2018 at 2:50 AM, Pawan Agnihotri <pawan.agniho...@gmail.com>
wrote:

> Hello Donald and Team,
>
> I am working on a POC and would like to use PredictionIO. I know this is a
> configuration issue with Elasticsearch, but I am stuck on the error below,
> so I am reaching out for help.
>
> I need some quick help here, as time is running out. Any steps or
> suggestions you think I could try would be much appreciated.
>
>
> 2018-03-07 21:36:15,602 ERROR org.apache.predictionio.tools.console.Console$ [main] - None of the configured nodes are available: [] (org.elasticsearch.client.transport.NoNodeAvailableException)
> org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
>
>
> I am using the steps from
> http://predictionio.apache.org/install/install-sourcecode/
>
> Below are my pio-env.sh and the error logs.
>
> [mapr@valvcshad004vm conf]$ cat pio-env.sh
> #!/usr/bin/env bash
> #
> # Copy this file as pio-env.sh and edit it for your site's configuration.
> #
> # Licensed to the Apache Software Foundation (ASF) under one or more
> # contributor license agreements.  See the NOTICE file distributed with
> # this work for additional information regarding copyright ownership.
> # The ASF licenses this file to You under the Apache License, Version 2.0
> # (the "License"); you may not use this file except in compliance with
> # the License.  You may obtain a copy of the License at
> #
> #    http://www.apache.org/licenses/LICENSE-2.0
> #
> # Unless required by applicable law or agreed to in writing, software
> # distributed under the License is distributed on an "AS IS" BASIS,
> # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
> # See the License for the specific language governing permissions and
> # limitations under the License.
> #
>
> # PredictionIO Main Configuration
> #
> # This section controls core behavior of PredictionIO. It is very likely that
> # you need to change these to fit your site.
>
> # SPARK_HOME: Apache Spark is a hard dependency and must be configured.
> #SPARK_HOME=$PIO_HOME/vendors/spark-1.5.1-bin-hadoop2.6
> SPARK_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/spark-2.1.1-bin-hadoop2.6
> POSTGRES_JDBC_DRIVER=$PIO_HOME/lib/postgresql-42.2.1.jar
> MYSQL_JDBC_DRIVER=$PIO_HOME/lib/mysql-connector-java-5.1.37.jar
>
> # ES_CONF_DIR: You must configure this if you have advanced configuration for
> #              your Elasticsearch setup.
> # ES_CONF_DIR=/opt/elasticsearch
>
> # HADOOP_CONF_DIR: You must configure this if you intend to run PredictionIO
> #                  with Hadoop 2.
> # HADOOP_CONF_DIR=/opt/hadoop
>
> # HBASE_CONF_DIR: You must configure this if you intend to run PredictionIO
> #                 with HBase on a remote cluster.
> # HBASE_CONF_DIR=$PIO_HOME/vendors/hbase-1.0.0/conf
>
> # Filesystem paths where PredictionIO uses as block storage.
> PIO_FS_BASEDIR=$HOME/.pio_store
> PIO_FS_ENGINESDIR=$PIO_FS_BASEDIR/engines
> PIO_FS_TMPDIR=$PIO_FS_BASEDIR/tmp
>
> # PredictionIO Storage Configuration
> #
> # This section controls programs that make use of PredictionIO's built-in
> # storage facilities. Default values are shown below.
> #
> # For more information on storage configuration please refer to
> # http://predictionio.incubator.apache.org/system/anotherdatastore/
>
> # Storage Repositories
>
> # Default is to use PostgreSQL
> #PIO_STORAGE_REPOSITORIES_METADATA_NAME=pio_meta
> #PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=PGSQL
>
> #PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=pio_event
> #PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=PGSQL
>
> #PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model
> #PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=PGSQL
>
> PIO_STORAGE_REPOSITORIES_METADATA_NAME=predictionio_metadata
> PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=ELASTICSEARCH
>
> PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=predictionio_eventdata
> PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=HBASE
>
> PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_
> PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=LOCALFS
>
> # Storage Data Sources
>
>
> # PostgreSQL Default Settings
> # Please change "pio" to your database name in PIO_STORAGE_SOURCES_PGSQL_URL
> # Please change PIO_STORAGE_SOURCES_PGSQL_USERNAME and
> # PIO_STORAGE_SOURCES_PGSQL_PASSWORD accordingly
> #PIO_STORAGE_SOURCES_PGSQL_TYPE=jdbc
> #PIO_STORAGE_SOURCES_PGSQL_URL=jdbc:postgresql://localhost/pio
> #PIO_STORAGE_SOURCES_PGSQL_USERNAME=pio
> #$PIO_STORAGE_SOURCES_PGSQL_PASSWORD=pio
>
> # MySQL Example
> # PIO_STORAGE_SOURCES_MYSQL_TYPE=jdbc
> # PIO_STORAGE_SOURCES_MYSQL_URL=jdbc:mysql://localhost/pio
> # PIO_STORAGE_SOURCES_MYSQL_USERNAME=pio
> # PIO_STORAGE_SOURCES_MYSQL_PASSWORD=pio
>
> # Elasticsearch Example
> PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
> #PIO_STORAGE_SOURCES_ELASTICSEARCH_CLUSTERNAME=elasticsearch
> PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost
> PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9300
> PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/elasticsearch-5.5.2
>
> # Local File System Example
> PIO_STORAGE_SOURCES_LOCALFS_TYPE=localfs
> PIO_STORAGE_SOURCES_LOCALFS_PATH=$PIO_FS_BASEDIR/models
>
> # HBase Example
> PIO_STORAGE_SOURCES_HBASE_TYPE=hbase
> PIO_STORAGE_SOURCES_HBASE_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/hbase-1.2.6
>
>
>
> ERROR in the logs:
>
> 2018-03-07 21:36:14,491 INFO  org.apache.predictionio.data.storage.Storage$ [main] - Verifying Meta Data Backend (Source: ELASTICSEARCH)...
> 2018-03-07 21:36:15,601 ERROR org.apache.predictionio.tools.console.Console$ [main] - Unable to connect to all storage backends successfully. The following shows the error message from the storage backend.
> 2018-03-07 21:36:15,602 ERROR org.apache.predictionio.tools.console.Console$ [main] - None of the configured nodes are available: [] (org.elasticsearch.client.transport.NoNodeAvailableException)
> org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
>         at org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(TransportClientNodesService.java:305)
>         at org.elasticsearch.client.transport.TransportClientNodesService.execute(TransportClientNodesService.java:200)
>         at org.elasticsearch.client.transport.support.InternalTransportIndicesAdminClient.execute(InternalTransportIndicesAdminClient.java:86)
>         at org.elasticsearch.client.support.AbstractIndicesAdminClient.exists(AbstractIndicesAdminClient.java:178)
>         at org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequestBuilder.doExecute(IndicesExistsRequestBuilder.java:53)
>         at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:91)
>         at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:65)
>         at org.elasticsearch.action.ActionRequestBuilder.get(ActionRequestBuilder.java:73)
>         at org.apache.predictionio.data.storage.elasticsearch.ESEngineInstances.<init>(ESEngineInstances.scala:42)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>         at org.apache.predictionio.data.storage.Storage$.getDataObject(Storage.scala:306)
>         at org.apache.predictionio.data.storage.Storage$.getDataObjectFromRepo(Storage.scala:266)
>         at org.apache.predictionio.data.storage.Storage$.getMetaDataEngineInstances(Storage.scala:367)
>         at org.apache.predictionio.data.storage.Storage$.verifyAllDataObjects(Storage.scala:342)
>         at org.apache.predictionio.tools.console.Console$.status(Console.scala:1087)
>         at org.apache.predictionio.tools.console.Console$$anonfun$main$1.apply(Console.scala:737)
>         at org.apache.predictionio.tools.console.Console$$anonfun$main$1.apply(Console.scala:696)
>         at scala.Option.map(Option.scala:145)
>         at org.apache.predictionio.tools.console.Console$.main(Console.scala:696)
>         at org.apache.predictionio.tools.console.Console.main(Console.scala)
> 2018-03-07 21:36:15,605 ERROR org.apache.predictionio.tools.console.Console$ [main] - Dumping configuration of initialized storage backend sources. Please make sure they are correct.
> 2018-03-07 21:36:15,607 ERROR org.apache.predictionio.tools.console.Console$ [main] - Source Name: ELASTICSEARCH; Type: elasticsearch; Configuration: HOSTS -> localhost, TYPE -> elasticsearch, HOME -> /dfs/pawan_scala/mapr-predictionio/vendors/elasticsearch-5.5.2, PORTS -> 9300
> 2018-03-07 21:36:42,649 INFO  org.apache.predictionio.tools.console.Console$ [main] - Creating Event Server at 0.0.0.0:7070
> 2018-03-07 21:36:43,417 ERROR org.apache.predictionio.data.storage.Storage$ [main] - Error initializing storage client for source HBASE
> java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
>         at org.apache.predictionio.data.storage.hbase.StorageClient.<init>(StorageClient.scala:46)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>         at org.apache.predictionio.data.storage.Storage$.getClient(Storage.scala:220)
>         at org.apache.predictionio.data.storage.Storage$.org$apache$predictionio$data$storage$Storage$$updateS2CM(Storage.scala:251)
>         at org.apache.predictionio.data.storage.Storage$$anonfun$sourcesToClientMeta$1.apply(Storage.scala:212)
>         at org.apache.predictionio.data.storage.Storage$$anonfun$sourcesToClientMeta$1.apply(Storage.scala:212)
>
>
> On Wed, Mar 7, 2018 at 1:21 AM, Pawan Agnihotri <pawan.agniho...@gmail.com> wrote:
>
>> Hello,
>>
>> I need your help configuring PredictionIO on Linux 7.2.
>>
>> I am following the steps at
>> http://predictionio.apache.org/install/install-sourcecode/ and have
>> installed Spark, Elasticsearch, and HBase, but I am getting the error below.
>>
>>
>> [mapr@valvcshad004vm bin]$ ./pio status
>> /dfs/pawan_scala/mapr-predictionio/bin/pio-class: line 89: /opt/mapr/spark/spark-2.1.0/mapr-util/generate-classpath.sh: No such file or directory
>> /dfs/pawan_scala/mapr-predictionio/bin/pio-class: line 90: generate_compatible_classpath: command not found
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in [jar:file:/dfs/pawan_scala/mapr-predictionio/assembly/pio-assembly-0.10.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/data/opt/mapr/lib/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>> [INFO] [Console$] Inspecting PredictionIO...
>> [INFO] [Console$] PredictionIO 0.10.0-SNAPSHOT is installed at /dfs/pawan_scala/mapr-predictionio
>> [INFO] [Console$] Inspecting Apache Spark...
>> [INFO] [Console$] Apache Spark is installed at /dfs/pawan_scala/mapr-predictionio/vendors/spark-2.1.1-bin-hadoop2.6
>> [INFO] [Console$] Apache Spark 2.1.1 detected (meets minimum requirement of 1.3.0)
>> [INFO] [Console$] Inspecting storage backend connections...
>> [WARN] [Storage$] There is no properly configured repository.
>> [ERROR] [Storage$] Required repository (METADATA) configuration is missing.
>> [ERROR] [Storage$] There were 1 configuration errors. Exiting.
>> [mapr@valvcshad004vm bin]$
>>
>> Here is my pio-env.sh file:
>>
>> [mapr@valvcshad004vm conf]$ cat pio-env.sh
>> # Default is to use PostgreSQL
>> #PIO_STORAGE_REPOSITORIES_METADATA_NAME=pio_meta
>> #PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=PGSQL
>>
>> #PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=pio_event
>> #PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=PGSQL
>>
>> #PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model
>> #PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=PGSQL
>>
>> # Storage Data Sources
>>
>> # PostgreSQL Default Settings
>> # Please change "pio" to your database name in PIO_STORAGE_SOURCES_PGSQL_URL
>> # Please change PIO_STORAGE_SOURCES_PGSQL_USERNAME and
>> # PIO_STORAGE_SOURCES_PGSQL_PASSWORD accordingly
>> #PIO_STORAGE_SOURCES_PGSQL_TYPE=jdbc
>> #PIO_STORAGE_SOURCES_PGSQL_URL=jdbc:postgresql://localhost/pio
>> #PIO_STORAGE_SOURCES_PGSQL_USERNAME=pio
>> #$PIO_STORAGE_SOURCES_PGSQL_PASSWORD=pio
>>
>> # MySQL Example
>> # PIO_STORAGE_SOURCES_MYSQL_TYPE=jdbc
>> # PIO_STORAGE_SOURCES_MYSQL_URL=jdbc:mysql://localhost/pio
>> # PIO_STORAGE_SOURCES_MYSQL_USERNAME=pio
>> # PIO_STORAGE_SOURCES_MYSQL_PASSWORD=pio
>>
>> # Elasticsearch Example
>>  PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
>>  PIO_STORAGE_SOURCES_ELASTICSEARCH_CLUSTERNAME=elasticsearch_cluster_name
>>  PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost
>>  PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9300
>>  PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/elasticsearch-5.5.2
>>
>> # Local File System Example
>>  PIO_STORAGE_SOURCES_LOCALFS_TYPE=localfs
>>  PIO_STORAGE_SOURCES_LOCALFS_PATH=$PIO_FS_BASEDIR/models
>>
>> # HBase Example
>>  PIO_STORAGE_SOURCES_HBASE_TYPE=hbase
>>  PIO_STORAGE_SOURCES_HBASE_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/hbase-1.2.6
>>
>> [mapr@valvcshad004vm conf]$
>>
>>
>> --
>> Thanks,
>> Pawan Agnihotri
>>
>
>
>
> --
> Thanks,
> Pawan Agnihotri
>



-- 
Mars Hall
415-818-7039
Customer Facing Architect
Salesforce Platform / Heroku
San Francisco, California
