Oh great, it worked!

I have seen it once or twice that the deployment WARs do not get properly
updated; no idea what causes it, though :(
If you are able to reproduce it again, please open a bug and we can
investigate it further.
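When this happens, clearing the stale deployment directory and restarting the gateway (as Jennifer did below for cluster1) is the usual workaround. Here is a toy shell sketch of that cleanup; the function and the demo paths are hypothetical, modeled on the deployment directories shown in the logs below:

```shell
# Toy sketch (not Knox tooling): remove stale deployment WAR directories for
# one topology so they are regenerated when the gateway restarts.
clean_deployments() {
  local deployments_dir="$1" topology="$2"
  # the glob matches e.g. cluster1.war.15af1acbcd0
  rm -rf "${deployments_dir}/${topology}".war.*
}

# demo against a throwaway directory standing in for data/deployments
tmp=$(mktemp -d)
mkdir -p "$tmp/cluster1.war.15af1acbcd0" "$tmp/admin.war.153ec5663f0"
clean_deployments "$tmp" cluster1
ls "$tmp"   # only admin.war.153ec5663f0 remains
```

On a real install this would target the deployments directory under GATEWAY_HOME, followed by a gateway restart so the topology is redeployed.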

Glad it worked!

Best,
Sandeep

On Thu, Mar 23, 2017 at 1:22 PM, Jennifer Coston <
[email protected]> wrote:

> I tried deleting the cluster1.war.* file from the deployments folder and
> restarted the Knox server. That seems to have fixed the problem. I’m not
> sure what was wrong initially.
>
>
>
> Thank you for all of your help!
>
> -Jennifer
>
>
>
> *From:* Jennifer Coston
> *Sent:* Thursday, March 23, 2017 12:46 PM
> *To:* [email protected]
> *Subject:* RE: How Do I Troubleshoot the Warning Failed to match path
> /my/path
>
>
>
> Hi Sandeep,
>
>
>
> The curl command actually doesn’t return anything; it just goes to a new
> line. Here is the command I’m using: curl -k -u admin:admin -X GET
> https://server.running.knox:8443/gateway/cluster1/hello/v1/test
>
>
>
> The working curl command is: curl -k -u admin:admin -X GET
> https://original.server.running.knox:8443/gateway/cluster1/hello/v1/test
> and that returns {"code":4,"type":"ok","message":"hello there!"}
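The two commands differ only in host name. A tiny helper (hypothetical; host and topology names taken from this thread) makes the gateway URL structure explicit when comparing the two systems:

```shell
# Hypothetical helper: compose the Knox gateway URL for a topology + service
# path, to rule out typos when comparing the two systems.
knox_url() {
  local host="$1" topology="$2" path="$3"
  printf 'https://%s:8443/gateway/%s%s\n' "$host" "$topology" "$path"
}

knox_url server.running.knox cluster1 /hello/v1/test
# → https://server.running.knox:8443/gateway/cluster1/hello/v1/test
```

Adding -i (or -w '%{http_code}') to the curl command would also surface the status line when the response body comes back empty.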
>
>
>
> My rewrite.xml file has:
>
> <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
>
> <!--
>
>    Licensed to the Apache Software Foundation (ASF) under one or more
>
>    contributor license agreements.  See the NOTICE file distributed with
>
>    this work for additional information regarding copyright ownership.
>
>    The ASF licenses this file to You under the Apache License, Version 2.0
>
>    (the "License"); you may not use this file except in compliance with
>
>    the License.  You may obtain a copy of the License at
>
>
>
>        http://www.apache.org/licenses/LICENSE-2.0
>
>
>
>    Unless required by applicable law or agreed to in writing, software
>
>    distributed under the License is distributed on an "AS IS" BASIS,
>
>    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
>
>    See the License for the specific language governing permissions and
>
>    limitations under the License.
>
> -->
>
> <rules>
>
>   <rule dir="IN" name="HELLOSERVICE/hello/inbound/root"
> pattern="*://*:*/**/hello/">
>
>     <rewrite template="{$serviceUrl[HELLOSERVICE]}/"/>
>
>   </rule>
>
>   <rule dir="IN" name="HELLOSERVICE/hello/inbound/path"
> pattern="*://*:*/**/hello/{**}">
>
>     <rewrite template="{$serviceUrl[HELLOSERVICE]}/{**}"/>
>
>   </rule>
>
>   <rule dir="IN" name="HELLOSERVICE/hello/inbound/query"
> pattern="*://*:*/**/hello/{**}?{**}">
>
>     <rewrite template="{$serviceUrl[HELLOSERVICE]}/{**}?{**}"/>
>
>   </rule>
>
>
>
>   <rule dir="OUT" name="HELLOSERVICE/hello/outbound/demo/v1">
>
>     <match pattern="*://*:*/demo/v1/{**}"/>
>
>     <rewrite template="{$frontend[url]}/hello/v1/{**}"/>
>
>   </rule>
>
> </rules>
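As a sanity check on what the inbound path rule should produce, here is a toy shell sketch (not Knox code; the service URL is the one from the topology further down) that mimics the /hello rewrite:

```shell
# Toy sketch (not Knox code): mimic the inbound rule
# HELLOSERVICE/hello/inbound/path by replacing everything up to /hello/
# with the configured service URL.
rewrite_in() {
  local service_url="$1" request="$2"
  # strip the shortest prefix ending in /hello/ and prepend the service URL
  echo "${service_url}/${request#*/hello/}"
}

rewrite_in http://server.running.knox:8088/demo \
  https://server.running.knox:8443/gateway/cluster1/hello/v1/test
# → http://server.running.knox:8088/demo/v1/test
```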
>
>
>
> I’ve enabled DEBUG logging as you suggested and my logs are now showing:
>
>
>
> NEW SYSTEM:
>
> 2017-03-23 11:18:47,172 INFO  hadoop.gateway 
> (GatewayConfigImpl.java:loadConfigResource(285))
> - Loading configuration resource jar:file:/usr/hdp/2.4.2.0-258/
> knox/bin/../lib/gateway-server-0.6.0.2.4.2.0-258.jar!/
> conf/gateway-default.xml
>
> 2017-03-23 11:18:47,181 INFO  hadoop.gateway 
> (GatewayConfigImpl.java:loadConfigFile(273))
> - Loading configuration file /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/gateway-site.xml
>
> 2017-03-23 11:18:47,205 INFO  hadoop.gateway 
> (GatewayConfigImpl.java:initGatewayHomeDir(217))
> - Using /usr/hdp/2.4.2.0-258/knox/bin/.. as GATEWAY_HOME via system
> property.
>
> 2017-03-23 11:18:47,732 DEBUG hadoop.gateway 
> (CMFMasterService.java:initializeFromMaster(158))
> - Loading from persistent master: #1.0# Mon, Oct 17 2016 18:06:52.180
>
> 2017-03-23 11:18:47,873 INFO  hadoop.gateway (JettySSLService.java:init(89))
> - Credential store for the gateway instance found - no need to create one.
>
> 2017-03-23 11:18:47,876 INFO  hadoop.gateway (JettySSLService.java:init(106))
> - Keystore for the gateway instance found - no need to create one.
>
> 2017-03-23 11:18:47,877 INFO  hadoop.gateway (JettySSLService.java:
> logAndValidateCertificate(128)) - The Gateway SSL certificate is issued
> to hostname: server.running.knox.
>
> 2017-03-23 11:18:47,879 INFO  hadoop.gateway (JettySSLService.java:
> logAndValidateCertificate(131)) - The Gateway SSL certificate is valid
> between: 10/17/16 6:06 PM and 10/17/17 6:06 PM.
>
> 2017-03-23 11:18:48,070 INFO  hadoop.gateway 
> (GatewayServer.java:startGateway(219))
> - Starting gateway...
>
> 2017-03-23 11:18:48,253 INFO  hadoop.gateway (GatewayServer.java:start(311))
> - Loading topologies from directory: /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/topologies
>
> 2017-03-23 11:18:48,254 DEBUG hadoop.gateway 
> (DefaultTopologyService.java:loadTopology(95))
> - Loading topology file: /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/topologies/admin.xml
>
> 2017-03-23 11:18:48,349 DEBUG hadoop.gateway 
> (DefaultTopologyService.java:loadTopology(95))
> - Loading topology file: /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/topologies/default.xml
>
> 2017-03-23 11:18:48,357 DEBUG hadoop.gateway 
> (DefaultTopologyService.java:loadTopology(95))
> - Loading topology file: /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/topologies/cluster1.xml
>
> 2017-03-23 11:18:48,367 INFO  hadoop.gateway 
> (GatewayServer.java:handleCreateDeployment(450))
> - Loading topology default from /usr/hdp/2.4.2.0-258/knox/bin/
> ../data/deployments/default.war.157d4e5c6f8
>
> 2017-03-23 11:18:48,779 INFO  hadoop.gateway 
> (GatewayServer.java:handleCreateDeployment(450))
> - Loading topology admin from /usr/hdp/2.4.2.0-258/knox/bin/
> ../data/deployments/admin.war.153ec5663f0
>
> 2017-03-23 11:18:48,798 INFO  hadoop.gateway 
> (GatewayServer.java:handleCreateDeployment(450))
> - Loading topology cluster1 from /usr/hdp/2.4.2.0-258/knox/bin/
> ../data/deployments/cluster1.war.15af1acbcd0
>
> 2017-03-23 11:18:48,812 INFO  hadoop.gateway (GatewayServer.java:start(315))
> - Monitoring topologies in directory: /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/topologies
>
> 2017-03-23 11:18:48,814 INFO  hadoop.gateway 
> (GatewayServer.java:startGateway(232))
> - Started gateway on port 8,443.
>
> 2017-03-23 11:20:10,232 DEBUG hadoop.gateway 
> (GatewayFilter.java:doFilter(110))
> - Received request: GET /hello/v1/test
>
> 2017-03-23 11:20:10,235 WARN  hadoop.gateway 
> (GatewayFilter.java:doFilter(152))
> - Failed to match path /hello/v1/test
>
>
>
> ORIGINAL SYSTEM:
>
>
>
> 2017-03-23 12:26:34,725 INFO  hadoop.gateway 
> (GatewayConfigImpl.java:loadConfigResource(285))
> - Loading configuration resource jar:file:/usr/hdp/2.4.2.0-258/
> knox/bin/../lib/gateway-server-0.6.0.2.4.2.0-258.jar!/
> conf/gateway-default.xml
>
> 2017-03-23 12:26:34,741 INFO  hadoop.gateway 
> (GatewayConfigImpl.java:loadConfigFile(273))
> - Loading configuration file /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/gateway-site.xml
>
> 2017-03-23 12:26:34,773 INFO  hadoop.gateway 
> (GatewayConfigImpl.java:initGatewayHomeDir(217))
> - Using /usr/hdp/2.4.2.0-258/knox/bin/.. as GATEWAY_HOME via system
> property.
>
> 2017-03-23 12:26:35,013 DEBUG hadoop.gateway 
> (CMFMasterService.java:initializeFromMaster(158))
> - Loading from persistent master: #1.0# Wed, May 25 2016 12:04:46.908
>
> 2017-03-23 12:26:35,221 INFO  hadoop.gateway (JettySSLService.java:init(89))
> - Credential store for the gateway instance found - no need to create one.
>
> 2017-03-23 12:26:35,242 INFO  hadoop.gateway (JettySSLService.java:init(106))
> - Keystore for the gateway instance found - no need to create one.
>
> 2017-03-23 12:26:35,248 INFO  hadoop.gateway (JettySSLService.java:
> logAndValidateCertificate(128)) - The Gateway SSL certificate is issued
> to hostname: original.server.running.knox.
>
> 2017-03-23 12:26:35,250 INFO  hadoop.gateway (JettySSLService.java:
> logAndValidateCertificate(131)) - The Gateway SSL certificate is valid
> between: 5/25/16 12:04 PM and 5/25/17 12:04 PM.
>
> 2017-03-23 12:26:35,567 INFO  hadoop.gateway 
> (GatewayServer.java:startGateway(219))
> - Starting gateway...
>
> 2017-03-23 12:26:35,876 INFO  hadoop.gateway (GatewayServer.java:start(311))
> - Loading topologies from directory: /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/topologies
>
> 2017-03-23 12:26:35,877 DEBUG hadoop.gateway 
> (DefaultTopologyService.java:loadTopology(95))
> - Loading topology file: /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/topologies/default.xml
>
> 2017-03-23 12:26:36,008 DEBUG hadoop.gateway 
> (DefaultTopologyService.java:loadTopology(95))
> - Loading topology file: /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/topologies/cluster1.xml
>
> 2017-03-23 12:26:36,022 DEBUG hadoop.gateway 
> (DefaultTopologyService.java:loadTopology(95))
> - Loading topology file: /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/topologies/admin.xml
>
> 2017-03-23 12:26:36,034 INFO  hadoop.gateway 
> (GatewayServer.java:handleCreateDeployment(450))
> - Loading topology default from /usr/hdp/2.4.2.0-258/knox/bin/
> ../data/deployments/default.war.1572999d550
>
> 2017-03-23 12:26:36,383 INFO  hadoop.gateway 
> (GatewayServer.java:handleCreateDeployment(450))
> - Loading topology admin from /usr/hdp/2.4.2.0-258/knox/bin/
> ../data/deployments/admin.war.154e94513b8
>
> 2017-03-23 12:26:36,396 INFO  hadoop.gateway 
> (GatewayServer.java:handleCreateDeployment(450))
> - Loading topology cluster1 from /usr/hdp/2.4.2.0-258/knox/bin/
> ../data/deployments/cluster1.war.15af67141f0
>
> 2017-03-23 12:26:36,426 INFO  hadoop.gateway (GatewayServer.java:start(315))
> - Monitoring topologies in directory: /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/topologies
>
> 2017-03-23 12:26:36,427 INFO  hadoop.gateway 
> (GatewayServer.java:startGateway(232))
> - Started gateway on port 8,443.
>
> 2017-03-23 12:27:36,361 DEBUG hadoop.gateway 
> (GatewayFilter.java:doFilter(110))
> - Received request: GET /hello/v1/test
>
> 2017-03-23 12:27:36,404 INFO  hadoop.gateway 
> (KnoxLdapRealm.java:getUserDn(556))
> - Computed userDn: uid=admin,ou=people,dc=hadoop,dc=apache,dc=org using
> dnTemplate for principal: admin
>
> 2017-03-23 12:27:36,505 INFO  hadoop.gateway 
> (AclsAuthorizationFilter.java:init(62))
> - Initializing AclsAuthz Provider for: HELLOSERVICE
>
> 2017-03-23 12:27:36,505 DEBUG hadoop.gateway 
> (AclsAuthorizationFilter.java:init(70))
> - ACL Processing Mode is: AND
>
> 2017-03-23 12:27:36,506 DEBUG hadoop.gateway
> (AclParser.java:parseAcls(59)) - No ACLs found for: HELLOSERVICE
>
> 2017-03-23 12:27:36,507 INFO  hadoop.gateway 
> (AclsAuthorizationFilter.java:doFilter(85))
> - Access Granted: true
>
> 2017-03-23 12:27:36,704 DEBUG hadoop.gateway 
> (UrlRewriteProcessor.java:rewrite(155))
> - Rewrote URL: https://original.server.running.knox:8443/gateway/cluster1/hello/v1/test,
> direction: IN via implicit rule: HELLOSERVICE/hello/inbound/query to URL:
> http://server.running.service:8088/demo/v1/test
>
> 2017-03-23 12:27:36,710 DEBUG hadoop.gateway 
> (DefaultDispatch.java:executeOutboundRequest(119))
> - Dispatch request: GET http://server.running.service:8088/demo/v1/test?user.name=admin
>
> 2017-03-23 12:27:37,398 DEBUG hadoop.gateway 
> (DefaultDispatch.java:executeOutboundRequest(145))
> - Dispatch response status: 200
>
>
>
> From looking at the differences in the logs, it looks like the ACLs are
> not getting logged on the new system, even though I did specify the
> authorization provider in my topology file. Here is that file as well:
>
>
>
> <?xml version="1.0" encoding="utf-8"?>
>
> <!--
>
>   Licensed to the Apache Software Foundation (ASF) under one or more
>
>   contributor license agreements.  See the NOTICE file distributed with
>
>   this work for additional information regarding copyright ownership.
>
>   The ASF licenses this file to You under the Apache License, Version 2.0
>
>   (the "License"); you may not use this file except in compliance with
>
>   the License.  You may obtain a copy of the License at
>
>
>
>       http://www.apache.org/licenses/LICENSE-2.0
>
>
>
>   Unless required by applicable law or agreed to in writing, software
>
>   distributed under the License is distributed on an "AS IS" BASIS,
>
>   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
>
>   See the License for the specific language governing permissions and
>
>   limitations under the License.
>
> -->
>
> <topology>
>
>
>
>     <gateway>
>
>
>
>         <provider>
>
>             <role>authentication</role>
>
>             <name>ShiroProvider</name>
>
>             <enabled>true</enabled>
>
>             <param>
>
>                 <!--
>
>                 Session timeout in minutes; this is really an idle timeout.
>
>                 It defaults to 30 minutes if the property value is not defined.
>
>                 The current client authentication would expire if the client
> idles continuously for more than this value.
>
>                 -->
>
>                 <name>sessionTimeout</name>
>
>                 <value>30</value>
>
>             </param>
>
>             <param>
>
>                 <name>main.ldapRealm</name>
>
>                 <value>org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm</value>
>
>             </param>
>
>             <param>
>
>                 <name>main.ldapRealm.userDnTemplate</name>
>
>                 <value>uid={0},ou=people,dc=hadoop,dc=apache,dc=org</value>
>
>             </param>
>
>             <param>
>
>                 <name>main.ldapRealm.contextFactory.url</name>
>
>                 <value>ldap://server.running.knox:33389</value>
>
>             </param>
>
>             <param>
>
>                 <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
>
>                 <value>simple</value>
>
>             </param>
>
>             <param>
>
>                 <name>urls./**</name>
>
>                 <value>authcBasic</value>
>
>             </param>
>
>         </provider>
>
>
>
>         <provider>
>
>             <role>identity-assertion</role>
>
>             <name>Default</name>
>
>             <enabled>true</enabled>
>
>         </provider>
>
>
>
>         <provider>
>
>             <role>authorization</role>
>
>             <name>AclsAuthz</name>
>
>             <enabled>true</enabled>
>
>         </provider>
>
>
>
>
>
>         <!--
>
>         Defines rules for mapping host names internal to a Hadoop cluster
> to externally accessible host names.
>
>         For example, a Hadoop service running in AWS may return a response
> that includes URLs containing
>
>         some AWS internal host name.  If the client needs to make a
> subsequent request to the host identified
>
>         in those URLs, they need to be mapped to external host names that
> the client can use to connect.
>
>
>
>         If the external host names and internal host names are the same,
> turn off this provider by setting the value of
>
>         the enabled parameter to false.
>
>
>
>         The name parameter specifies the external host names in a comma
> separated list.
>
>         The value parameter specifies corresponding internal host names in
> a comma separated list.
>
>
>
>         Note that when you are using Sandbox, the external hostname needs
> to be localhost, as seen in the out-of-the-box sandbox.xml.  This is
> because Sandbox uses port mapping to allow clients to connect to the
>
>         Hadoop services using localhost.  In real clusters, external host
> names would almost never be localhost.
>
>         -->
>
>
>
>         <provider>
>
>             <role>webappsec</role>
>
>             <name>WebAppSec</name>
>
>             <enabled>true</enabled>
>
>             <param><name>csrf.enabled</name><value>false</value></param> <!-- CSRF disabled -->
>
>             <param><name>csrf.customHeader</name><value>X-XSRF-Header</value></param>
>
>             <param><name>csrf.methodsToIgnore</name><value>GET,OPTIONS,HEAD</value></param>
>
>             <param><name>cors.enabled</name><value>true</value></param>
>
>             <param><name>cors.allowOrigin</name><value>*</value></param>
>
>             <param><name>cors.allowSubdomains</name><value>false</value></param>
>
>             <param><name>cors.supportedMethods</name><value>GET,POST,HEAD,OPTIONS,PUT,DELETE</value></param>
>
>         </provider>
>
>     </gateway>
>
>
>
>     <service>
>
>         <role>HELLOSERVICE</role>
>
>         <url>http://server.running.knox:8088/demo</url>
>
>     </service>
>
> </topology>
>
>
>
> Thank you!
>
> Jennifer
>
>
>
> *From:* Sandeep More [mailto:[email protected]]
>
> *Sent:* Thursday, March 23, 2017 11:17 AM
> *To:* [email protected]
> *Subject:* Re: How Do I Troubleshoot the Warning Failed to match path
> /my/path
>
>
>
> Hello Jennifer,
>
>
> From the logs it appears that no rewrite rules were found for path
> '/helloworldservice/v1/say/hello'.
>
>
>
> Assuming that the topology xml file and the service definitions
> (service.xml and rewrite.xml) are copied from the working system to the new
> system, the exact same curl command (except the hostname) should have
> worked.
>
>
>
> One thing you can try is to enable DEBUG logging (on both machines): in
> conf/gateway-log4j.properties, uncomment
> log4j.logger.org.apache.hadoop.gateway=DEBUG
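For reference, the uncommented line in conf/gateway-log4j.properties would look like this (property name exactly as given above):

```
log4j.logger.org.apache.hadoop.gateway=DEBUG
```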
>
>
>
> This should tell us a bit more about what's happening and help us compare
> the two logs in more detail.
>
>
>
> Also, what response are you getting from curl? I am guessing a 404. Can
> you also provide the curl command and the rules from the rewrite.xml file?
>
>
>
> Best,
>
> Sandeep
>
>
>
> On Thu, Mar 23, 2017 at 10:30 AM, Jennifer Coston <
> [email protected]> wrote:
>
> Hello,
>
>
>
> I have Knox successfully running on one system and I am trying to duplicate
> the configuration on another. My topology file is the same and so are the
> rewrite rules for the service I am trying to use. However, when I send a
> curl command through Knox on the new system I don’t get a response. When I
> look in the logs I am seeing the following:
>
>
>
> 2017-03-23 09:09:34,121 INFO  hadoop.gateway 
> (GatewayConfigImpl.java:loadConfigResource(285))
> - Loading configuration resource jar:file:/usr/hdp/2.4.2.0-258/
> knox/bin/../lib/gateway-server-0.6.0.2.4.2.0-258.jar!/
> conf/gateway-default.xml
>
> 2017-03-23 09:09:34,131 INFO  hadoop.gateway 
> (GatewayConfigImpl.java:loadConfigFile(273))
> - Loading configuration file /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/gateway-site.xml
>
> 2017-03-23 09:09:34,158 INFO  hadoop.gateway 
> (GatewayConfigImpl.java:initGatewayHomeDir(217))
> - Using /usr/hdp/2.4.2.0-258/knox/bin/.. as GATEWAY_HOME via system
> property.
>
> 2017-03-23 09:09:34,875 INFO  hadoop.gateway (JettySSLService.java:init(89))
> - Credential store for the gateway instance found - no need to create one.
>
> 2017-03-23 09:09:34,878 INFO  hadoop.gateway (JettySSLService.java:init(106))
> - Keystore for the gateway instance found - no need to create one.
>
> 2017-03-23 09:09:34,880 INFO  hadoop.gateway (JettySSLService.java:
> logAndValidateCertificate(128)) - The Gateway SSL certificate is issued
> to hostname: server.running.knox.
>
> 2017-03-23 09:09:34,881 INFO  hadoop.gateway (JettySSLService.java:
> logAndValidateCertificate(131)) - The Gateway SSL certificate is valid
> between: 10/17/16 6:06 PM and 10/17/17 6:06 PM.
>
> 2017-03-23 09:09:35,119 INFO  hadoop.gateway 
> (GatewayServer.java:startGateway(219))
> - Starting gateway...
>
> 2017-03-23 09:09:35,349 INFO  hadoop.gateway (GatewayServer.java:start(311))
> - Loading topologies from directory: /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/topologies
>
> 2017-03-23 09:09:35,459 INFO  hadoop.gateway 
> (GatewayServer.java:handleCreateDeployment(450))
> - Loading topology default from /usr/hdp/2.4.2.0-258/knox/bin/
> ../data/deployments/default.war.157d4e5c6f8
>
> 2017-03-23 09:09:35,866 INFO  hadoop.gateway 
> (GatewayServer.java:handleCreateDeployment(450))
> - Loading topology admin from /usr/hdp/2.4.2.0-258/knox/bin/
> ../data/deployments/admin.war.153ec5663f0
>
> 2017-03-23 09:09:35,884 INFO  hadoop.gateway 
> (GatewayServer.java:handleCreateDeployment(450))
> - Loading topology cluster1 from /usr/hdp/2.4.2.0-258/knox/bin/
> ../data/deployments/cluster1.war.15af1acbcd0
>
> 2017-03-23 09:09:35,899 INFO  hadoop.gateway (GatewayServer.java:start(315))
> - Monitoring topologies in directory: /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/topologies
>
> 2017-03-23 09:09:35,900 INFO  hadoop.gateway 
> (GatewayServer.java:startGateway(232))
> - Started gateway on port 8,443.
>
> 2017-03-23 09:11:50,664 WARN  hadoop.gateway 
> (GatewayFilter.java:doFilter(152))
> - Failed to match path /helloworldservice/v1/say/hello
>
>
>
> How do I troubleshoot this issue? I’m thinking there must be something
> different in the configuration of the two systems. If it helps, here are
> the logs from the working system:
>
>
>
> 2017-03-23 09:03:08,172 INFO  hadoop.gateway 
> (GatewayConfigImpl.java:loadConfigResource(285))
> - Loading configuration resource jar:file:/usr/hdp/2.4.2.0-258/
> knox/bin/../lib/gateway-server-0.6.0.2.4.2.0-258.jar!/
> conf/gateway-default.xml
>
> 2017-03-23 09:03:08,188 INFO  hadoop.gateway 
> (GatewayConfigImpl.java:loadConfigFile(273))
> - Loading configuration file /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/gateway-site.xml
>
> 2017-03-23 09:03:08,228 INFO  hadoop.gateway 
> (GatewayConfigImpl.java:initGatewayHomeDir(217))
> - Using /usr/hdp/2.4.2.0-258/knox/bin/.. as GATEWAY_HOME via system
> property.
>
> 2017-03-23 09:03:08,808 INFO  hadoop.gateway (JettySSLService.java:init(89))
> - Credential store for the gateway instance found - no need to create one.
>
> 2017-03-23 09:03:08,819 INFO  hadoop.gateway (JettySSLService.java:init(106))
> - Keystore for the gateway instance found - no need to create one.
>
> 2017-03-23 09:03:08,821 INFO  hadoop.gateway (JettySSLService.java:
> logAndValidateCertificate(128)) - The Gateway SSL certificate is issued
> to hostname: gpi-col44.gpi.collect.
>
> 2017-03-23 09:03:08,822 INFO  hadoop.gateway (JettySSLService.java:
> logAndValidateCertificate(131)) - The Gateway SSL certificate is valid
> between: 5/25/16 12:04 PM and 5/25/17 12:04 PM.
>
> 2017-03-23 09:03:09,140 INFO  hadoop.gateway 
> (GatewayServer.java:startGateway(219))
> - Starting gateway...
>
> 2017-03-23 09:03:09,428 INFO  hadoop.gateway (GatewayServer.java:start(311))
> - Loading topologies from directory: /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/topologies
>
> 2017-03-23 09:03:09,584 INFO  hadoop.gateway 
> (GatewayServer.java:handleCreateDeployment(450))
> - Loading topology default from /usr/hdp/2.4.2.0-258/knox/bin/
> ../data/deployments/default.war.1572999d550
>
> 2017-03-23 09:03:10,150 INFO  hadoop.gateway 
> (GatewayServer.java:handleCreateDeployment(450))
> - Loading topology admin from /usr/hdp/2.4.2.0-258/knox/bin/
> ../data/deployments/admin.war.154e94513b8
>
> 2017-03-23 09:03:10,187 INFO  hadoop.gateway 
> (GatewayServer.java:handleCreateDeployment(450))
> - Loading topology cluster1 from /usr/hdp/2.4.2.0-258/knox/bin/
> ../data/deployments/cluster1.war.15af67141f0
>
> 2017-03-23 09:03:10,211 INFO  hadoop.gateway (GatewayServer.java:start(315))
> - Monitoring topologies in directory: /usr/hdp/2.4.2.0-258/knox/bin/
> ../conf/topologies
>
> 2017-03-23 09:03:10,213 INFO  hadoop.gateway 
> (GatewayServer.java:startGateway(232))
> - Started gateway on port 8,443.
>
> 2017-03-23 09:04:59,218 INFO  hadoop.gateway 
> (KnoxLdapRealm.java:getUserDn(556))
> - Computed userDn: uid=admin,ou=people,dc=hadoop,dc=apache,dc=org using
> dnTemplate for principal: admin
>
> 2017-03-23 09:04:59,347 INFO  hadoop.gateway 
> (AclsAuthorizationFilter.java:init(62))
> - Initializing AclsAuthz Provider for: HELLOWORLDSERVICE
>
> 2017-03-23 09:04:59,349 INFO  hadoop.gateway 
> (AclsAuthorizationFilter.java:doFilter(85))
> - Access Granted: true
>
>
>
> Thank you,
>
>
>
> *Jennifer*
>
>
>
>
>
