Repository: incubator-hawq
Updated Branches:
  refs/heads/master b13ca83bf -> 43b8e44c5


HAWQ-1013. Move HAWQ Ambari plugin to Apache HAWQ


Project: http://git-wip-us.apache.org/repos/asf/incubator-hawq/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-hawq/commit/5b013baa
Tree: http://git-wip-us.apache.org/repos/asf/incubator-hawq/tree/5b013baa
Diff: http://git-wip-us.apache.org/repos/asf/incubator-hawq/diff/5b013baa

Branch: refs/heads/master
Commit: 5b013baa88553a92f8c87085f1a4f4061e5154f6
Parents: b13ca83
Author: Matt <mmat...@pivotal.io>
Authored: Wed Aug 24 16:37:48 2016 -0700
Committer: Goden Yao <goden...@apache.org>
Committed: Thu Aug 25 13:42:05 2016 -0700

----------------------------------------------------------------------
 contrib/hawq-ambari-plugin/.gitignore           |   5 +
 contrib/hawq-ambari-plugin/README.md            |  98 ++++
 contrib/hawq-ambari-plugin/build.properties     |   8 +
 contrib/hawq-ambari-plugin/pom.xml              | 119 +++++
 .../main/resources/services/HAWQ/metainfo.xml   |  27 +
 .../main/resources/services/PXF/metainfo.xml    |  54 ++
 .../src/main/resources/utils/add-hawq.py        | 499 +++++++++++++++++++
 pom.xml                                         |   2 +
 8 files changed, 812 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-hawq/blob/5b013baa/contrib/hawq-ambari-plugin/.gitignore
----------------------------------------------------------------------
diff --git a/contrib/hawq-ambari-plugin/.gitignore b/contrib/hawq-ambari-plugin/.gitignore
new file mode 100644
index 0000000..859f265
--- /dev/null
+++ b/contrib/hawq-ambari-plugin/.gitignore
@@ -0,0 +1,5 @@
+# Maven target folder
+target/
+
+# Intellij
+*.idea/
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/incubator-hawq/blob/5b013baa/contrib/hawq-ambari-plugin/README.md
----------------------------------------------------------------------
diff --git a/contrib/hawq-ambari-plugin/README.md b/contrib/hawq-ambari-plugin/README.md
new file mode 100644
index 0000000..5aac4e0
--- /dev/null
+++ b/contrib/hawq-ambari-plugin/README.md
@@ -0,0 +1,98 @@
+# HAWQ Ambari Plugin
+
+hawq-ambari-plugin helps users install HAWQ and PXF using Ambari.
+
+To ensure that Ambari recognizes HAWQ and PXF as services that can be installed for a specific stack, the following steps are required:
+ * Add the HAWQ and PXF metainfo.xml files (containing metadata about the service) under the stack to be installed.
+ * Add the repositories where the HAWQ and PXF rpms reside, so that Ambari can use them during installation. This requires updating repoinfo.xml under the stack to which HAWQ and PXF are to be added.
+ * If a stack is already installed using Ambari, add the repositories to the existing stack using the Ambari REST API.
+
+The hawq-ambari-plugin takes care of the above steps when the ```./add-hawq.py``` command is run.
+
+## Source Code
+Source code directory structure of the hawq-ambari-plugin is as follows:
+```
+hawq-ambari-plugin
++-- README.md
++-- build.properties
++-- pom.xml
++-- src
+    +-- main
+        +-- resources
+            +-- services
+            |   +-- HAWQ
+            |   |   +-- metainfo.xml
+            |   +-- PXF
+            |       +-- metainfo.xml
+            +-- utils
+                +-- add-hawq.py
+```
+
+### build.properties
+[build.properties](build.properties) contains properties required for building the plugin.
+
+### metainfo.xml
+[metainfo.xml](src/main/resources/services/HAWQ/metainfo.xml) contains the metadata about the service. It specifies that the service definition is to be inherited from Ambari common-services. The HAWQ and PXF common-services code can be found in the [Apache Ambari repository](https://github.com/apache/ambari/tree/trunk/ambari-server/src/main/resources/common-services/).
+
+### add-hawq<i></i>.py
+[add-hawq.py](src/main/resources/utils/add-hawq.py) deploys the HAWQ and PXF metainfo.xml files under the stack and adds the repositories to Ambari.
+
+
+## Building the plugin
+***Build Environment***: centos6 is the typical operating system used for building.
+
+Properties specified in the [build.properties](build.properties) file:
+
+| Property | Description | Value |
+| --- | --- | --- |
+| hawq.release.version | Release version of HAWQ | 2.0.1 |
+| hawq.common.services.version | HAWQ common-services code in Ambari to be inherited | 2.0.0 |
+| pxf.release.version | Release version of PXF | 3.0.1 |
+| pxf.common.services.version | PXF common-services code in Ambari to be inherited | 3.0.0 |
+| hawq.repo.prefix | Repository name for the HAWQ core repository | hawq |
+| hawq.addons.repo.prefix | Repository name for the HAWQ Add-Ons repository | hawq-add-ons |
+| repository.version | Repository version to be used in the repository information | 2.0.1<i></i>.0 |
+| default.stack | Default stack under which metainfo.xml and repositories are added | HDP-2.4 |
+
+To build the rpm for the hawq-ambari-plugin, update the [build.properties](build.properties) file with the required values and run the following Maven command under the hawq-ambari-plugin directory:
+```
+$ pwd
+incubator-hawq/contrib/hawq-ambari-plugin
+$ mvn clean resources:copy-resources rpm:rpm -Dbuild_number=1
+```
+
+## Usage
+
+Installing the hawq-ambari-plugin rpm lays down the following directory structure:
+```
+/var/lib/hawq
++-- add-hawq.py
++-- staging
+    +-- HAWQ
+    |   +-- metainfo.xml
+    +-- PXF
+        +-- metainfo.xml
+```
+
+***Prerequisite***: Ensure that the script is run on the host where the Ambari server is **running**.
+
+*Replace* ```<ambari-username>``` *and* ```<ambari-password>``` *with your Ambari login credentials*.
+
+If the hawq-2.0.1.0 and hawq-add-ons-2.0.1.0 repositories have been set up on the Ambari server host, run the following command:
+
+```
+$ ./add-hawq.py --user <ambari-username> --password <ambari-password> --stack HDP-2.5
+```
+If ```--stack``` is not specified, the *HDP-2.4* stack is used by default.
+
+If the hawq-2.0.1.0 and hawq-add-ons-2.0.1.0 repositories have been set up on a host other than the Ambari server host, run the following command:
+
+```
+$ ./add-hawq.py --user <ambari-username> --password <ambari-password> --stack HDP-2.5 --hawqrepo http://my.host.address/hawq-2.0.1.0 --addonsrepo http://my.host.address/hawq-add-ons-2.0.1.0
+```
+
+**Please restart ambari-server after running the script so that the changes take effect:**
+```
+$ ambari-server restart
+```
\ No newline at end of file

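For readers following along: the repository registration that the README describes (and add-hawq.py automates later in this patch) boils down to an authenticated PUT against the Ambari REST API. Below is a minimal standalone sketch, not part of the patch; the stack, version, and credential values are placeholders, the server address is the localhost endpoint add-hawq.py assumes, and the real script first merges the new repositories into the existing repository list before sending the request.

```
# Hedged sketch: register a HAWQ repository with an installed stack via the Ambari REST API.
# Placeholders: STACK, STACK_VERSION, REPO_VERSION, credentials and the base_url host.
import base64
import json
import urllib2

AMBARI_URL = 'http://localhost:8080/api/v1'              # add-hawq.py assumes a local Ambari server
STACK, STACK_VERSION, REPO_VERSION = 'HDP', '2.4', '1'   # placeholder identifiers

payload = {
  'operating_systems': [{
    'OperatingSystems': {'os_type': 'redhat6'},
    'repositories': [{
      'Repositories': {
        'repo_id': 'hawq-2.0.1.0',
        'repo_name': 'hawq-2.0.1.0',
        'base_url': 'http://my.host.address/hawq-2.0.1.0'
      }
    }]
  }]
}

req = urllib2.Request('{0}/stacks/{1}/versions/{2}/repository_versions/{3}'.format(
    AMBARI_URL, STACK, STACK_VERSION, REPO_VERSION), json.dumps(payload))
req.add_header('Authorization', 'Basic ' + base64.b64encode('admin:admin'))
req.add_header('X-Requested-By', 'ambari')
req.get_method = lambda: 'PUT'
print urllib2.urlopen(req).getcode()
```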
http://git-wip-us.apache.org/repos/asf/incubator-hawq/blob/5b013baa/contrib/hawq-ambari-plugin/build.properties
----------------------------------------------------------------------
diff --git a/contrib/hawq-ambari-plugin/build.properties b/contrib/hawq-ambari-plugin/build.properties
new file mode 100644
index 0000000..b4c2b74
--- /dev/null
+++ b/contrib/hawq-ambari-plugin/build.properties
@@ -0,0 +1,8 @@
+hawq.release.version=2.0.1
+hawq.common.services.version=2.0.0
+pxf.release.version=3.0.1
+pxf.common.services.version=3.0.0
+hawq.repo.prefix=hawq
+hawq.addons.repo.prefix=hawq-add-ons
+repository.version=2.0.1.0
+default.stack=HDP-2.4
\ No newline at end of file

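A note on naming: the repository ids the plugin expects (hawq-2.0.1.0 and hawq-add-ons-2.0.1.0) and the default repository URLs are derived from the properties above. A small illustrative sketch of that convention, matching the REPO_INFO construction and the default-URL logic in add-hawq.py below (the host name is simply whatever FQDN the Ambari server host reports):

```
# Illustrative only: derive repo ids and default base URLs from build.properties values.
import socket

repository_version = '2.0.1.0'                     # repository.version
for prefix in ('hawq', 'hawq-add-ons'):            # hawq.repo.prefix, hawq.addons.repo.prefix
    repo_id = '-'.join([prefix, repository_version])
    default_baseurl = 'http://{0}/{1}'.format(socket.getfqdn(), repo_id)
    print repo_id, default_baseurl                 # e.g. hawq-2.0.1.0 http://<fqdn>/hawq-2.0.1.0
```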
http://git-wip-us.apache.org/repos/asf/incubator-hawq/blob/5b013baa/contrib/hawq-ambari-plugin/pom.xml
----------------------------------------------------------------------
diff --git a/contrib/hawq-ambari-plugin/pom.xml b/contrib/hawq-ambari-plugin/pom.xml
new file mode 100644
index 0000000..7b309a8
--- /dev/null
+++ b/contrib/hawq-ambari-plugin/pom.xml
@@ -0,0 +1,119 @@
+<?xml version="1.0"?>
+<!--
+   Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
+
+  <modelVersion>4.0.0</modelVersion>
+  <groupId>org.apache.hawq</groupId>
+  <artifactId>hawq-ambari-plugin</artifactId>
+  <version>2.0.1.0</version>
+  <name>hawq-ambari-plugin</name>
+  <url>http://maven.apache.org</url>
+
+  <properties>
+    <release>${project.version}</release>
+    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+    <resources.temp>${basedir}/target/resources-temp</resources.temp>
+    <hawq.lib.dir>/var/lib/hawq</hawq.lib.dir>
+    <hawq.lib.staging.dir>${hawq.lib.dir}/staging</hawq.lib.staging.dir>
+  </properties>
+
+  <build>
+    <plugins>
+
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-resources-plugin</artifactId>
+        <version>3.0.1</version>
+
+        <executions>
+          <execution>
+            <phase>none</phase>
+            <goals>
+              <goal>copy-resources</goal>
+            </goals>
+          </execution>
+        </executions>
+
+        <configuration>
+          <filters>
+            <filter>${basedir}/build.properties</filter>
+          </filters>
+          <outputDirectory>${resources.temp}</outputDirectory>
+          <resources>
+            <resource>
+              <directory>${basedir}/src/main/resources/</directory>
+              <filtering>true</filtering>
+            </resource>
+          </resources>
+        </configuration>
+      </plugin>
+
+      <plugin>
+        <groupId>org.codehaus.mojo</groupId>
+        <artifactId>rpm-maven-plugin</artifactId>
+        <version>2.0.1</version>
+
+        <executions>
+          <execution>
+            <phase>none</phase>
+            <goals>
+              <goal>rpm</goal>
+            </goals>
+          </execution>
+        </executions>
+
+        <configuration>
+          <group>org.apache.hawq</group>
+          <description>
+            HAWQ plugin contains Ambari's stack definition for HAWQ.
+            When installed, Ambari will be able to support HAWQ as a service.
+          </description>
+          <needarch>x86_64</needarch>
+          <release>${build_number}%{?dist}</release>
+          <requires>
+            <require>ambari-server &gt;= 2.2</require>
+            <require>python &gt;= 2.6</require>
+          </requires>
+          <mappings>
+            <mapping>
+              <directory>${hawq.lib.staging.dir}</directory>
+              <sources>
+                <source>
+                  <location>${resources.temp}/services</location>
+                </source>
+              </sources>
+            </mapping>
+            <mapping>
+              <directory>${hawq.lib.dir}</directory>
+              <filemode>755</filemode>
+              <sources>
+                <source>
+                  <location>${resources.temp}/utils</location>
+                </source>
+              </sources>
+            </mapping>
+          </mappings>
+        </configuration>
+      </plugin>
+
+    </plugins>
+  </build>
+
+</project>
+

http://git-wip-us.apache.org/repos/asf/incubator-hawq/blob/5b013baa/contrib/hawq-ambari-plugin/src/main/resources/services/HAWQ/metainfo.xml
----------------------------------------------------------------------
diff --git a/contrib/hawq-ambari-plugin/src/main/resources/services/HAWQ/metainfo.xml b/contrib/hawq-ambari-plugin/src/main/resources/services/HAWQ/metainfo.xml
new file mode 100755
index 0000000..7ada845
--- /dev/null
+++ b/contrib/hawq-ambari-plugin/src/main/resources/services/HAWQ/metainfo.xml
@@ -0,0 +1,27 @@
+<?xml version="1.0"?>
+<!--
+   Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+-->
+<metainfo>
+  <schemaVersion>2.0</schemaVersion>
+  <services>
+    <service>
+      <name>HAWQ</name>
+      <extends>common-services/HAWQ/${hawq.common.services.version}</extends>
+      <version>${hawq.release.version}</version>
+    </service>
+  </services>
+</metainfo>
\ No newline at end of file

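The ${hawq.common.services.version} and ${hawq.release.version} placeholders above are not literal values; the maven-resources-plugin filtering configured in pom.xml substitutes them from build.properties at build time. A rough Python equivalent of that substitution, purely for illustration (paths are relative to the plugin directory; this is not part of the build):

```
# Illustrative sketch of ${...} property filtering as applied to metainfo.xml.
import re

def load_properties(path):
    # Parse simple key=value lines, skipping blanks and comments.
    props = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith('#') and '=' in line:
                key, value = line.split('=', 1)
                props[key.strip()] = value.strip()
    return props

def filter_resource(text, props):
    # Replace ${key} with its value; leave unknown placeholders untouched.
    return re.sub(r'\$\{([^}]+)\}',
                  lambda m: props.get(m.group(1), m.group(0)), text)

props = load_properties('build.properties')
with open('src/main/resources/services/HAWQ/metainfo.xml') as f:
    print filter_resource(f.read(), props)
```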
http://git-wip-us.apache.org/repos/asf/incubator-hawq/blob/5b013baa/contrib/hawq-ambari-plugin/src/main/resources/services/PXF/metainfo.xml
----------------------------------------------------------------------
diff --git a/contrib/hawq-ambari-plugin/src/main/resources/services/PXF/metainfo.xml b/contrib/hawq-ambari-plugin/src/main/resources/services/PXF/metainfo.xml
new file mode 100755
index 0000000..e462538
--- /dev/null
+++ b/contrib/hawq-ambari-plugin/src/main/resources/services/PXF/metainfo.xml
@@ -0,0 +1,54 @@
+<?xml version="1.0"?>
+<!--
+   Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements.  See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License.  You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+-->
+<metainfo>
+  <schemaVersion>2.0</schemaVersion>
+  <services>
+    <service>
+      <name>PXF</name>
+      <extends>common-services/PXF/${pxf.common.services.version}</extends>
+      <version>${pxf.release.version}</version>
+
+      <osSpecifics>
+        <osSpecific>
+          <osFamily>any</osFamily>
+          <packages>
+            <package>
+              <name>pxf-service</name>
+            </package>
+            <package>
+              <name>apache-tomcat</name>
+            </package>
+            <package>
+              <name>pxf-hive</name>
+            </package>
+            <package>
+              <name>pxf-hdfs</name>
+            </package>
+            <package>
+              <name>pxf-hbase</name>
+            </package>
+            <package>
+              <name>pxf-json</name>
+            </package>
+          </packages>
+        </osSpecific>
+      </osSpecifics>
+
+    </service>
+  </services>
+</metainfo>
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/incubator-hawq/blob/5b013baa/contrib/hawq-ambari-plugin/src/main/resources/utils/add-hawq.py
----------------------------------------------------------------------
diff --git a/contrib/hawq-ambari-plugin/src/main/resources/utils/add-hawq.py b/contrib/hawq-ambari-plugin/src/main/resources/utils/add-hawq.py
new file mode 100755
index 0000000..d8ba361
--- /dev/null
+++ b/contrib/hawq-ambari-plugin/src/main/resources/utils/add-hawq.py
@@ -0,0 +1,499 @@
+#!/usr/bin/env python
+
+"""
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+    http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+
+import base64
+import getpass
+import json
+import os
+import shutil
+import socket
+import urllib2
+import xml.etree.ElementTree as ET
+from optparse import OptionParser
+
+PLUGIN_VERSION = '${release}'
+DEFAULT_STACK = '${default.stack}'
+SUPPORTED_OS_LIST = ['redhat6']
+HAWQ_LIB_STAGING_DIR = '${hawq.lib.staging.dir}'
+REPO_VERSION = '${repository.version}'
+HAWQ_REPO = '${hawq.repo.prefix}'
+HAWQ_ADD_ONS_REPO = '${hawq.addons.repo.prefix}'
+
+REPO_INFO = {
+  HAWQ_REPO: {
+    'repoid': '-'.join([HAWQ_REPO, REPO_VERSION]),
+    'input_param': '--hawqrepo',
+    'optional': False
+  },
+  HAWQ_ADD_ONS_REPO: {
+    'repoid': '-'.join([HAWQ_ADD_ONS_REPO, REPO_VERSION]),
+    'input_param': '--addonsrepo',
+    'optional': True
+  }
+}
+
+
+class APIClient:
+  """
+  Class which interacts with Ambari Server API
+  """
+
+  # Base API URL points to localhost. This script is to be executed on the Ambari Server
+  BASE_API_URL = 'http://localhost:8080/api/v1'
+
+  def __init__(self, user, password):
+    self.user = user
+    self.password = password
+    self.encoded_credentials = base64.encodestring(self.user + ':' + self.password).replace('\n', '')
+
+  def __request(self, method, url_path, headers=None, data=None):
+    """
+    Creates API requests and packages response into the following format: (response code, response body in json object)
+    """
+    headers = headers if headers is not None else {}
+    headers['Authorization'] = 'Basic {0}'.format(self.encoded_credentials)
+
+    req = urllib2.Request(self.BASE_API_URL + url_path, data, headers)
+    req.get_method = lambda: method
+    response = urllib2.urlopen(req)
+    response_str = response.read()
+
+    return response.getcode(), json.loads(response_str) if response_str else None
+
+  def verify_api_reachable(self):
+    """
+    Returns true if Ambari Server is reachable through API
+    """
+    try:
+      status_code, _ = self.__request('GET', '/stacks')
+    except Exception as e:
+      if type(e) == urllib2.HTTPError and e.code == 403:
+        raise Exception('Invalid username and/or password.')
+      elif type(e) == urllib2.URLError:
+        raise Exception('Ambari-server is not running. Please start ambari-server.')
+      else:
+        raise Exception('Unable to connect to Ambari Server.\n' + str(e))
+
+  def get_cluster_name(self):
+    """
+    Returns the name of the installed cluster
+    """
+    _, response_json = self.__request('GET', '/clusters')
+    return None if len(response_json['items']) == 0 else response_json['items'][0]['Clusters']['cluster_name']
+
+  def get_stack_info(self, cluster_name):
+    """
+    Returns stack information (stack name, stack version, repository version) of stack installed on cluster
+    """
+    _, response_json = self.__request('GET',
+                                      '/clusters/{0}/stack_versions?ClusterStackVersions/state.matches(CURRENT)'.format(
+                                          cluster_name))
+    if 'items' not in response_json or len(response_json['items']) == 0:
+      raise Exception('No Stack found to be installed on the cluster {0}'.format(cluster_name))
+    stack_versions = response_json['items'][0]['ClusterStackVersions']
+    return stack_versions['stack'], stack_versions['version'], stack_versions['repository_version']
+
+  def get_existing_repository_info(self, stack_name, stack_version, repository_version):
+    """
+    Returns existing repo information for a given stack
+    """
+    url_path = '/stacks/{0}/versions/{1}/compatible_repository_versions/{2}?fields=*,operating_systems/*,operating_systems/repositories/*'.format(
+        stack_name,
+        stack_version,
+        repository_version)
+    _, response_json = self.__request('GET', url_path)
+    return response_json
+
+  def update_existing_repo(self, stack_name, stack_version, repository_version, merged_repo_info):
+    """
+    Sends a PUT request to add new repo information to the Ambari database
+    """
+    url_path = '/stacks/{0}/versions/{1}/repository_versions/{2}'.format(stack_name, stack_version,
+                                                                         repository_version)
+
+    headers = {}
+    headers['X-Requested-By'] = 'ambari'
+    headers['Content-Type'] = 'application/x-www-form-urlencoded; charset=UTF-8'
+
+    try:
+      status_code, _ = self.__request('PUT', url_path, headers, merged_repo_info)
+    except:
+      # Ambari returns sporadic errors even if PUT succeeds
+      # Ignore any exception, because existing information from cluster will be verified after PUT request
+      return
+
+
+class RepoUtils:
+  """
+  Utility class for handling json structure to add new repo to existing repo
+  """
+
+  def __transform_repo(self, repository):
+    """
+    Extracts and returns the base_url, repo_id and repo_name for each repository
+    """
+    repo_info_json = repository['Repositories']
+    result = {}
+    result['Repositories'] = dict(
+        (k, v) for k, v in repo_info_json.iteritems() if k in ('base_url', 'repo_id', 'repo_name'))
+    return result
+
+  def __transform_os_repos(self, os_repos):
+    """
+    Constructs the json string for each operating system
+    """
+    result = {
+      'OperatingSystems': {},
+      'repositories': []
+    }
+    result['OperatingSystems']['os_type'] = os_repos['OperatingSystems']['os_type']
+    result['repositories'] = [self.__transform_repo(repository) for repository in os_repos['repositories']]
+    return result
+
+  def __transform(self, repository_info):
+    """
+    Constructs the json string with required repository information
+    """
+    result = {
+      'operating_systems': []
+    }
+    result['operating_systems'] = [self.__transform_os_repos(os_repos) for os_repos in
+                                   repository_info['operating_systems']]
+    return result
+
+  def __create_repo_info_dict(self, repo):
+    """
+    Creates json string with new repo information
+    """
+    result = {}
+    result['Repositories'] = {
+      'base_url': repo['baseurl'],
+      'repo_id': repo['repoid'],
+      'repo_name': repo['reponame']
+    }
+    return result
+
+  def verify_repos_updated(self, existing_repo_info, repos_to_add):
+    """
+    Checks if input repo exists for that os_type on the cluster
+    """
+    existing_repos = self.__transform(existing_repo_info)
+
+    all_repos_updated = True
+
+    for os_repos in existing_repos['operating_systems']:
+      if os_repos['OperatingSystems']['os_type'] in SUPPORTED_OS_LIST:
+
+        for repo_to_add in repos_to_add:
+          repo_exists = False
+
+          for existing_repo in os_repos['repositories']:
+            if existing_repo['Repositories']['repo_id'] == repo_to_add['repoid'] and \
+                    existing_repo['Repositories']['repo_name'] == repo_to_add['reponame'] and \
+                url_exists(existing_repo['Repositories']['base_url'], repo_to_add['baseurl']):
+              repo_exists = True
+
+          all_repos_updated = all_repos_updated and repo_exists
+
+    return all_repos_updated
+
+  def add_to_existing_repos(self, existing_repo_info, repos_to_add):
+    """
+    Helper function for adding new repos to existing repos
+    """
+    existing_repos = self.__transform(existing_repo_info)
+
+    for os_repos in existing_repos['operating_systems']:
+      if os_repos['OperatingSystems']['os_type'] in SUPPORTED_OS_LIST:
+        for repo_to_add in repos_to_add:
+          repo_exists = False
+          for existing_repo in os_repos['repositories']:
+            if existing_repo['Repositories']['repo_id'] == repo_to_add['repoid']:
+              repo_exists = True
+              existing_repo['Repositories']['repo_name'] = repo_to_add['reponame']
+              existing_repo['Repositories']['base_url'] = repo_to_add['baseurl']
+
+          if not repo_exists:
+            os_repos['repositories'].append(self.__create_repo_info_dict(repo_to_add))
+
+    return json.dumps(existing_repos)
+
+
+class InputValidator:
+  """
+  Class containing methods for validating command line inputs
+  """
+
+  def __is_repourl_valid(self, repo_url):
+    """
+    Returns True if repo_url points to a valid repository
+    """
+    repo_url = os.path.join(repo_url, 'repodata/repomd.xml')
+    req = urllib2.Request(repo_url)
+
+    try:
+      response = urllib2.urlopen(req)
+    except urllib2.URLError:
+      return False
+
+    if response.getcode() != 200:
+      return False
+
+    return True
+
+  def verify_stack(self, stack):
+    """
+    Returns stack info of stack
+    """
+    if not stack:
+      # Use default stack
+      print 'INFO: Using default stack {0}, since --stack parameter was not specified.'.format(DEFAULT_STACK)
+      stack = DEFAULT_STACK
+
+    stack_pair = stack.split('-')
+
+    if len(stack_pair) != 2:
+      raise Exception('Specified stack {0} is not of expected format STACK_NAME-STACK_VERSION'.format(stack))
+
+    stack_name = stack_pair[0]
+    stack_version = stack_pair[1]
+
+    stack_dir = '/var/lib/ambari-server/resources/stacks/{0}/{1}'.format(stack_name, stack_version)
+
+    if not os.path.isdir(stack_dir):
+      raise Exception(
+          'Specified stack {0} does not exist under /var/lib/ambari-server/resources/stacks'.format(stack))
+
+    return {
+      'stack_name': stack_name,
+      'stack_version': stack_version,
+      'stack_dir': stack_dir
+    }
+
+  def verify_repo(self, repoid_prefix, repo_url):
+    """
+    Returns repo info of repo
+    """
+    repo_specified = True
+    if not repo_url:
+      # Use default repo_url
+      repo_url = 'http://{0}/{1}'.format(socket.getfqdn(), REPO_INFO[repoid_prefix]['repoid'])
+      repo_specified = False
+
+    if not self.__is_repourl_valid(repo_url):
+      if repo_specified:
+        raise Exception('Specified URL {0} is not a valid repository. \n'
+                        'Please specify a valid url for {1}'.format(repo_url,
+                                                                    REPO_INFO[repoid_prefix]['input_param']))
+      elif REPO_INFO[repoid_prefix]['optional']:
+        return None
+      else:
+        raise Exception(
+            'Repository URL {0} is not valid. \nPlease ensure setup_repo.sh has been run for the {1} repository on this machine '
+            'OR specify a valid url for {2}'.format(repo_url, REPO_INFO[repoid_prefix]['repoid'],
+                                                    REPO_INFO[repoid_prefix]['input_param']))
+
+    return {
+      'repoid': REPO_INFO[repoid_prefix]['repoid'],
+      'reponame': REPO_INFO[repoid_prefix]['repoid'],
+      'baseurl': repo_url
+    }
+
+
+def url_exists(repoA, repoB):
+  """
+  Returns True if given repourl repoA exists in repoB
+  """
+  if type(repoB) in (list, tuple):
+    return repoA.rstrip('/') in [existing_url.rstrip('/') for existing_url in 
repoB]
+  else:
+    return repoA.rstrip('/') == repoB.rstrip('/')
+
+
+def update_repoinfo(stack_dir, repos_to_add):
+  """
+  Updates the repoinfo.xml under the specified stack_dir
+  """
+  file_path = '{0}/repos/repoinfo.xml'.format(stack_dir)
+
+  for repo in repos_to_add:
+    repo['xmltext'] = '<repo>\n' \
+                      '  <repoid>{0}</repoid>\n' \
+                      '  <reponame>{1}</reponame>\n' \
+                      '  <baseurl>{2}</baseurl>\n' \
+                      '</repo>\n'.format(repo['repoid'], repo['reponame'], repo['baseurl'])
+
+  tree = ET.parse(file_path)
+  root = tree.getroot()
+  file_needs_update = False
+
+  for os_tag in root.findall('.//os'):
+
+    if os_tag.attrib['family'] in SUPPORTED_OS_LIST:
+      for repo_to_add in repos_to_add:
+
+        repo_needs_update = False
+        for existing_repo in os_tag.findall('.//repo'):
+
+          existing_repoid = [repoid.text for repoid in existing_repo.findall('.//repoid')][0]
+          existing_reponame = [repoid.text for repoid in existing_repo.findall('.//reponame')][0]
+          existing_baseurl = [baseurl.text for baseurl in existing_repo.findall('.//baseurl')][0]
+
+          if existing_repoid == repo_to_add['repoid']:
+            repo_needs_update = True
+            print 'INFO: Repository {0} already exists with reponame {1}, baseurl {2} in {3}'.format(
+                repo_to_add['repoid'], existing_reponame, existing_baseurl, file_path)
+
+            if existing_reponame != repo_to_add['reponame'] or existing_baseurl != repo_to_add['baseurl']:
+              os_tag.remove(existing_repo)
+              os_tag.append(ET.fromstring(repo_to_add['xmltext']))
+              print 'INFO: Repository {0} updated with reponame {1}, baseurl {2} in {3}'.format(
+                  repo_to_add['repoid'], repo_to_add['reponame'], repo_to_add['baseurl'], file_path)
+              file_needs_update = True
+
+        if not repo_needs_update:
+          os_tag.append(ET.fromstring(repo_to_add['xmltext']))
+          print 'INFO: Repository {0} with baseurl {1} added to {2}'.format(repo_to_add['repoid'],
+                                                                            repo_to_add['baseurl'], file_path)
+          file_needs_update = True
+
+  if file_needs_update:
+    tree.write(file_path)
+
+
+def add_repo_to_cluster(api_client, stack, repos_to_add):
+  """
+  Adds the new repository to the existing cluster if the specified stack has been installed on that cluster
+  """
+  stack_name = stack['stack_name']
+  stack_version = stack['stack_version']
+
+  cluster_name = api_client.get_cluster_name()
+
+  # Proceed only if cluster is installed
+  if cluster_name is None:
+    return
+
+  repo_utils = RepoUtils()
+  installed_stack_name, installed_stack_version, installed_repository_version = api_client.get_stack_info(
+      cluster_name)
+
+  # Proceed only if installed stack matches input stack
+  if stack_name != installed_stack_name or stack_version != installed_stack_version:
+    return
+
+  existing_repo_info = api_client.get_existing_repository_info(stack_name, stack_version,
+                                                               installed_repository_version)
+
+  new_repo_info = repo_utils.add_to_existing_repos(existing_repo_info, repos_to_add)
+  api_client.update_existing_repo(stack_name, stack_version, installed_repository_version, new_repo_info)
+
+  if not repo_utils.verify_repos_updated(
+      api_client.get_existing_repository_info(stack_name, stack_version, installed_repository_version),
+      repos_to_add):
+    raise Exception(
+        'Failed to update repository information on existing cluster, {0} with stack {1}-{2}'.format(
+            cluster_name, stack_name, stack_version))
+
+  print 'INFO: Repositories are available on existing cluster, {0} with stack {1}-{2}'.format(
+      cluster_name, stack_name, stack_version)
+
+
+def write_service_info(stack_dir):
+  """
+  Writes the service info content to the specified stack_dir
+  """
+  stack_services = os.path.join(stack_dir, 'services')
+
+  for service in ('HAWQ', 'PXF'):
+    source_directory = os.path.join(HAWQ_LIB_STAGING_DIR, service)
+    destination_directory = os.path.join(stack_services, service)
+
+    if not os.path.exists(source_directory):
+      raise Exception('{0} directory was not found under {1}'.format(service, HAWQ_LIB_STAGING_DIR))
+
+    service_exists = False
+    if os.path.exists(destination_directory):
+      service_exists = True
+      shutil.rmtree(destination_directory)
+
+    if service_exists:
+      print 'INFO: Updating service {0}, which already exists under {1}'.format(service, stack_services)
+
+    shutil.copytree(source_directory, destination_directory)
+
+    print 'INFO: {0} directory was successfully {1}d under directory {2}'.format(
+        service, 'update' if service_exists else 'create', stack_services)
+
+
+def build_parser():
+  """
+  Builds the parser required for parsing user inputs from command line
+  """
+  usage_string = 'Usage: ./add-hawq.py --user admin --password admin --stack HDP-2.4 --hawqrepo http://my.host.address/hawq-2.0.1.0/ --addonsrepo http://my.host.address/hawq-add-ons-2.0.1.0/'
+  parser = OptionParser(usage=usage_string, version='%prog {0}'.format(PLUGIN_VERSION))
+  parser.add_option('-u', '--user', dest='user', help='Ambari login username (Required)')
+  parser.add_option('-p', '--password', dest='password',
+                    help='Ambari login password. Providing password through command line is not recommended.\n'
+                         'The script prompts for the password.')
+  parser.add_option('-s', '--stack', dest='stack', help='Stack Name and Version to be added.'
+                                                        '(Eg: HDP-2.4 or HDP-2.5)')
+  parser.add_option('-r', '--hawqrepo', dest='hawqrepo', help='Repository URL which points to the HAWQ packages')
+  parser.add_option('-a', '--addonsrepo', dest='addonsrepo',
+                    help='Repository URL which points to the HAWQ Add Ons packages')
+  return parser
+
+
+def main():
+  parser = build_parser()
+
+  options, _ = parser.parse_args()
+
+  user = options.user if options.user else raw_input('Enter Ambari login Username: ')
+  password = options.password if options.password else getpass.getpass('Enter Ambari login Password: ')
+
+  try:
+    # Verify if Ambari credentials are correct and API is reachable
+    api_client = APIClient(user, password)
+    api_client.verify_api_reachable()
+
+    validator = InputValidator()
+
+    stack_info = validator.verify_stack(options.stack)
+
+    repos_to_add = [validator.verify_repo(HAWQ_REPO, options.hawqrepo)]
+
+    add_ons_repo = validator.verify_repo(HAWQ_ADD_ONS_REPO, options.addonsrepo)
+    if add_ons_repo is not None:
+      repos_to_add.append(add_ons_repo)
+
+    update_repoinfo(stack_info['stack_dir'], repos_to_add)
+    add_repo_to_cluster(api_client, stack_info, repos_to_add)
+    write_service_info(stack_info['stack_dir'])
+    print '\nINFO: Please restart ambari-server for changes to take effect'
+  except Exception as e:
+    print '\nERROR: {0}'.format(str(e))
+
+
+if __name__ == '__main__':
+  main()
\ No newline at end of file

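One practical note on the script above: InputValidator treats a repository URL as valid only when <url>/repodata/repomd.xml is reachable. A tiny standalone sketch of that same check, handy for sanity-checking a repository before running ./add-hawq.py (the URL below is a placeholder, and this snippet is illustrative rather than part of the patch):

```
# Mirror of the repository check performed by InputValidator.__is_repourl_valid.
import urllib2

def repo_is_valid(repo_url):
    # A yum repository is reachable if its repodata/repomd.xml can be fetched.
    try:
        return urllib2.urlopen(repo_url.rstrip('/') + '/repodata/repomd.xml').getcode() == 200
    except urllib2.URLError:
        return False

print repo_is_valid('http://my.host.address/hawq-2.0.1.0')
```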
http://git-wip-us.apache.org/repos/asf/incubator-hawq/blob/5b013baa/pom.xml
----------------------------------------------------------------------
diff --git a/pom.xml b/pom.xml
index 1b9a1a0..442b8c6 100644
--- a/pom.xml
+++ b/pom.xml
@@ -132,6 +132,8 @@
 
              <exclude>contrib/hawq-hadoop/hawq-mapreduce-tool/src/test/resources/dataset</exclude>
               <exclude>contrib/hawq-hadoop/**/*.yaml</exclude>
+              <exclude>contrib/hawq-ambari-plugin/build.properties</exclude>
+              <exclude>contrib/hawq-ambari-plugin/README.md</exclude>
 
               <exclude>src/backend/access/index/caql.files</exclude>
               <exclude>src/backend/gpopt/library.ver</exclude>
