Hi Venkatesh,

Thanks for pointing me to the client.properties file! However, after changing the port to 15000, there is another issue.
# grep -R falcon.url *
conf/client.properties:falcon.url=https://localhost:15000/
server/webapp/falcon/WEB-INF/classes/client.properties:falcon.url=http://localhost:41000/

# bin/falcon-start -port 15000
Hadoop is installed, adding hadoop classpath to falcon classpath
falcon started using hadoop version: Hadoop 2.4.1

# bin/falcon entity -submit -type cluster -file /opt/falcon-0.5-incubating/examples/entity/filesystem/standalone-cluster.xml

---> With conf/client.properties:falcon.url=http://localhost:15000/ I have:
Unable to connect to Falcon server, please check if the URL is correct and Falcon server is up and running
java.net.SocketTimeoutException: Read timed out

With conf/client.properties:falcon.url=https://localhost:15000/ I have:
Error: Unable to initialize Falcon Client object
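A quick way to tell which of the two URLs the server really answers on is to probe it directly, independently of the CLI. This is only a sketch; the /api/admin/version path is assumed from the standard Falcon REST layout, so adjust it if this build differs:

# curl -v http://localhost:15000/api/admin/version
# curl -vk https://localhost:15000/api/admin/version

If the first call returns the build-version JSON, plain HTTP on 15000 is the right scheme and conf/client.properties should use it.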
# ps -edf | grep falcon
root 20578 1 5 10:58 pts/2 00:00:08 /usr/lib/jvm/ibm-java-ppc64-71/bin/java -Xmx1024m -Dfalcon.log.dir=/opt/falcon-0.5-incubating/logs -Dfalcon.embeddedmq.data=/opt/falcon-0.5-incubating/data -Dfalcon.home=/opt/falcon-0.5-incubating -Dconfig.location=/opt/falcon-0.5-incubating/conf -Dfalcon.app.type=falcon -Dfalcon.catalog.service.enabled= -cp /opt/falcon-0.5-incubating/conf:/opt/hadoop/etc/hadoop:/opt/hadoop/share/hadoop/common/lib/*:/opt/hadoop/share/hadoop/common/*:/opt/hadoop/share/hadoop/hdfs:/opt/hadoop/share/hadoop/hdfs/lib/*:/opt/hadoop/share/hadoop/hdfs/*:/opt/hadoop-2.4.1/share/hadoop/yarn/lib/*:/opt/hadoop-2.4.1/share/hadoop/yarn/*:/opt/hadoop/share/hadoop/mapreduce/lib/*:/opt/hadoop/share/hadoop/mapreduce/*:/opt/hadoop/contrib/capacity-scheduler/*.jar:/opt/falcon-0.5-incubating/server/webapp/falcon/WEB-INF/classes:/opt/falcon-0.5-incubating/server/webapp/falcon/WEB-INF/lib/*:/opt/falcon-0.5-incubating/libext/* org.apache.falcon.Main -app /opt/falcon-0.5-incubating/server/webapp/falcon -port 15000

Looking at the logs, I see:

1) With conf/client.properties:falcon.url=https://localhost:15000/

- logs/falcon.application.log:
2014-10-24 11:06:21,816 WARN - [main:] ~ Unable to load native-hadoop library for your platform... using builtin-java classes where applicable (NativeCodeLoader:62)
...
2014-10-24 11:06:23,482 INFO - [main:] ~ BasicAuthFilter initialization started (BasicAuthFilter:80)
2014-10-24 11:06:24,513 INFO - [main:] ~ Started [email protected]:15000 (log:67)
That looks fine.

- logs/falcon.out.2014102411061414141578:
...
oct. 24, 2014 11:06:23 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Everything looks fine.

- After I launched the bin/falcon entity -submit ... command, I see no errors in the logs.

- logs/falcon.application.log:
2014-10-24 11:11:21,557 INFO - [Thread-17:] ~ config.location is set, using: /opt/falcon-0.5-incubating/conf/runtime.properties (ApplicationProperties:106)
2014-10-24 11:11:21,558 INFO - [Thread-17:] ~ Initializing org.apache.falcon.util.RuntimeProperties properties with domain falcon (ApplicationProperties:143)
2014-10-24 11:11:21,558 DEBUG - [Thread-17:] ~ log.cleanup.frequency.hours.retention=minutes(1) (ApplicationProperties:149)
2014-10-24 11:11:21,558 DEBUG - [Thread-17:] ~ log.cleanup.frequency.months.retention=months(3) (ApplicationProperties:149)
2014-10-24 11:11:21,559 DEBUG - [Thread-17:] ~ log.cleanup.frequency.minutes.retention=hours(6) (ApplicationProperties:149)
2014-10-24 11:11:21,559 DEBUG - [Thread-17:] ~ domain=falcon (ApplicationProperties:149)
2014-10-24 11:11:21,559 DEBUG - [Thread-17:] ~ current.colo=local (ApplicationProperties:149)
2014-10-24 11:11:21,559 DEBUG - [Thread-17:] ~ log.cleanup.frequency.days.retention=days(7) (ApplicationProperties:149)
2014-10-24 11:11:21,560 INFO - [Thread-17:] ~ config.location is set, using: /opt/falcon-0.5-incubating/conf/runtime.properties (ApplicationProperties:106)
2014-10-24 11:11:21,560 INFO - [Thread-17:] ~ Initializing org.apache.falcon.util.RuntimeProperties properties with domain falcon (ApplicationProperties:143)
2014-10-24 11:11:21,560 DEBUG - [Thread-17:] ~ log.cleanup.frequency.hours.retention=minutes(1) (ApplicationProperties:149)
2014-10-24 11:11:21,560 DEBUG - [Thread-17:] ~ log.cleanup.frequency.months.retention=months(3) (ApplicationProperties:149)
2014-10-24 11:11:21,560 DEBUG - [Thread-17:] ~ log.cleanup.frequency.minutes.retention=hours(6) (ApplicationProperties:149)
2014-10-24 11:11:21,561 DEBUG - [Thread-17:] ~ domain=falcon (ApplicationProperties:149)
2014-10-24 11:11:21,561 DEBUG - [Thread-17:] ~ current.colo=local (ApplicationProperties:149)
2014-10-24 11:11:21,561 DEBUG - [Thread-17:] ~ log.cleanup.frequency.days.retention=days(7) (ApplicationProperties:149)

2) With conf/client.properties:falcon.url=http://localhost:15000/

- logs/falcon.application.log:
....
2014-10-24 11:16:05,788 INFO - [main:] ~ BasicAuthFilter initialization started (BasicAuthFilter:80)
2014-10-24 11:16:06,791 INFO - [main:] ~ Started [email protected]:15000 (log:67)

- logs/falcon.out.2014102411161414142161
I see issues:
2014-10-24 11:33:19,689 INFO - [2032902871@qtp--1089279400-0:root:POST//entities/submit/cluster 5ffe8e09-05ab-47b3-85e6-1701d2016728] ~ Connecting to ResourceManager at localhost/127.0.0.1:8021 (RMProxy:92)
2014-10-24 11:33:20,838 INFO - [2032902871@qtp--1089279400-0:root:POST//entities/submit/cluster 5ffe8e09-05ab-47b3-85e6-1701d2016728] ~ Retrying connect to server: localhost/127.0.0.1:8021. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS) (Client:841)

# grep -R 8021 *
docs/EntitySpecification.html:<interface type="execute" endpoint="localhost:8021" version="0.20.2" />
examples/entity/filesystem/standalone-cluster.xml:        <interface type="execute" endpoint="localhost:8021" version="1.1.2"/>
examples/entity/hcat/hcat-standalone-cluster.xml:        <interface type="execute" endpoint="localhost:8021" version="1.1.2"/>

# cat /etc/hosts
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6

I have attached the logs for the two cases to this email. I think that the Falcon logs should provide hints about the root cause of my issue.
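The retries against localhost:8021 point at the execute interface of the example cluster entity, which still uses the old Hadoop 1 JobTracker port, whereas a Hadoop 2.4.1 ResourceManager normally listens on yarn.resourcemanager.address (8032 by default). A sketch of how the two could be aligned; the 8032 value is an assumption, so check yarn-site.xml first:

# grep -A1 resourcemanager.address /opt/hadoop/etc/hadoop/yarn-site.xml
# sed -i 's/localhost:8021/localhost:8032/' examples/entity/filesystem/standalone-cluster.xml

After editing the XML, the submit command would need to be re-run.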
Tony

________________________________
From: [email protected] [[email protected]] on behalf of Seetharam Venkatesh [[email protected]]
Sent: Friday, 24 October 2014 05:37
To: Tony Reix
Cc: [email protected]; [email protected]
Subject: Re: RE: Falcon

Thanks Tony for a detailed analysis. I think the issue is with the client.properties pointing to a different port. Please change the client.properties in the conf folder and make sure it's pointing to http://localhost:15000, assuming you have not enabled TLS.

Please find my comments below.

On Thu, Oct 23, 2014 at 8:11 AM, Tony Reix <[email protected]> wrote:

Hi Venkatesh,

Often, with Hadoop Java code, there are some small issues dealing with security and the IBM JVM. That's why I'm checking that Falcon works fine on Ubuntu/PPC64LE and RHEL7/PPC64BE, both with IBM JVM 1.7.

We have not tested with the IBM JDK; we have tested with Oracle JDK 1.6, 1.7, and OpenJDK 1.7.

I've found no issue with the Falcon unit tests. So, now, I'm trying to put Falcon to work and check that there are no hidden issues.

# bin/falcon admin -version
Falcon server build version: {"properties":[{"key":"Version","value":"0.5-incubating-rrelease"},{"key":"Mode","value":"distributed"}]}

Looks good so far.

# cat $FALCON_HOME/logs/falcon.out.2014102216221413987744
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/falcon-distributed-0.5-incubating/server/webapp/falcon/WEB-INF/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/falcon-distributed-0.5-incubating/server/webapp/falcon/WEB-INF/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/falcon-distributed-0.5-incubating/server/webapp/falcon/WEB-INF/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
oct. 22, 2014 4:22:28 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
oct. 22, 2014 4:22:29 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'

About using "src/package.sh Hadoop-version Oozie-version", as explained in http://falcon.incubator.apache.org/0.5-incubating/InstallationSteps.html : that does not work as-is, since there is an issue with downloading apache-tomcat-6.0.37.tar.gz and putting it at target/oozie-4.0.1/distro/downloads/tomcat-6.0.37.tar.gz. This is a common issue I see often with Hadoop packages. The usual solution is to put apache-tomcat there by hand and remove the "clean" in "mvn clean install". However, within Falcon, that does not seem to work, since the "downloads" directory seems to be removed.

So, I had to: 1) launch src/package.sh and, while it is still compiling, 2) continuously re-create the "downloads" directory and put the apache-tomcat-6.0.37.tar.gz file in this directory with the appropriate name. A real pain.
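Roughly, the re-seeding loop looks like this, run from a second shell while src/package.sh is compiling (a sketch only; the /root location of the previously downloaded tomcat tarball is an assumption, the target path is the one quoted above):

# while true; do mkdir -p target/oozie-4.0.1/distro/downloads; cp -n /root/apache-tomcat-6.0.37.tar.gz target/oozie-4.0.1/distro/downloads/tomcat-6.0.37.tar.gz 2>/dev/null; sleep 5; done

It can be stopped once the Oozie distro module has passed its download step.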
Sorry, this is maven magic to work around Oozie not being published to the maven repo. I know it's a PITA, but I just tested a patch by Peeyush and this worked as expected. Can you please create a jira so we can follow up and not drop this?

Anyway, I succeeded in building the tar-ball, and I put it on my test machine.

# ll target
total 280176
-rw-r--r--.  1 root root  95344375 Oct 23 16:37 apache-falcon-0.5-incubating-bin.tar.gz
-rw-r--r--.  1 root root 191549421 Oct 23 16:37 apache-falcon-0.5-incubating-sources.tar.gz
drwxr-xr-x.  2 root root         6 Oct 23 16:37 archive-tmp
drwxr-xr-x.  3 root root        21 Oct 23 15:25 maven-shared-archive-resources
drwxr-xr-x. 20  502 wheel     4096 Oct 23 16:20 oozie-4.0.1

# bin/falcon-start -port 15000
Hadoop is installed, adding hadoop classpath to falcon classpath
/opt/falcon-0.5-incubating
falcon started using hadoop version: Hadoop 2.4.1

# bin/falcon entity -submit -type cluster -file examples/entity/filesystem/standalone-cluster.xml
Error: Unable to initialize Falcon Client object

I think the jersey client is unable to initialize the client for the given url. You can override this with -url $FALCON_URL.
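For reference, the override is just an extra flag on the same command; a sketch, assuming the server answers plain HTTP on port 15000:

# bin/falcon entity -submit -type cluster -file examples/entity/filesystem/standalone-cluster.xml -url http://localhost:15000

Whatever is passed with -url takes precedence over the falcon.url value read from client.properties.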
That still does not work!

Logs:

# ll -tr logs
total 36
-rw-r--r-- 1 root root     5 23 oct. 16:41 falcon.pid
-rw-r--r-- 1 root root     0 23 oct. 16:41 falcon.security.audit.log
-rw-r--r-- 1 root root     0 23 oct. 16:41 falcon.metric.log
-rw-r--r-- 1 root root     0 23 oct. 16:41 falcon.audit.log
drwxr-xr-x 2 root root     6 23 oct. 16:41 retry
-rw-r--r-- 1 root root  2116 23 oct. 16:41 falcon.out.2014102316411414075311
-rw-r--r-- 1 root root 25876 23 oct. 16:46 falcon.application.log

# cat logs/falcon.out.2014102316411414075311
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/falcon-0.5-incubating/server/webapp/falcon/WEB-INF/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/falcon-0.5-incubating/server/webapp/falcon/WEB-INF/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/falcon-0.5-incubating/server/webapp/falcon/WEB-INF/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
oct. 23, 2014 4:41:55 PM com.sun.jersey.api.core.PackagesResourceConfig init
INFO: Scanning for root resource and provider classes in the packages: org.apache.falcon.resource.admin org.apache.falcon.resource.provider org.apache.falcon.resource.proxy org.apache.falcon.resource.metadata
oct. 23, 2014 4:41:55 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
INFO: Root resource classes found:
  class org.apache.falcon.resource.proxy.InstanceManagerProxy
  class org.apache.falcon.resource.proxy.SchedulableEntityManagerProxy
  class org.apache.falcon.resource.metadata.LineageMetadataResource
  class org.apache.falcon.resource.admin.AdminResource
oct. 23, 2014 4:41:55 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
INFO: Provider classes found:
  class org.apache.falcon.resource.provider.JAXBContextResolver
oct. 23, 2014 4:41:55 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'

Server came up clean.

falcon.application.log is attached to this email. I see no ERROR.

Details of the "bin/falcon entity -submit -type cluster -file examples/entity/filesystem" command:
....
/usr/lib/jvm/ibm-java-ppc64-71/bin/java -Xmx1024m -cp '/opt/falcon-0.5-incubating/conf:/opt/falcon-0.5-incubating/client/lib/*:falcon/WEB-INF/lib/*:falcon.war/WEB-INF/lib/*' -Dfalcon.log.dir=/root -Dfalcon.app.type=client org.apache.falcon.cli.FalconCLI entity -submit -type cluster -file examples/entity/filesystem/standalone-cluster.xml

Which files are there:

[root@laurel5 falcon-0.5-incubating]# ll
total 64
drwxr-xr-x 2  502 wheel  4096 23 oct. 16:52 bin
-rwxrwxrwx 1  502 wheel 19491 15 sept. 08:01 CHANGES.txt
drwxr-xr-x 3 root root     16 23 oct. 16:41 client
drwxr-xr-x 2  502 wheel   137 20 oct. 17:53 conf
drwxr-xr-x 5 root root     55 23 oct. 16:41 data
-rwxrwxrwx 1  502 wheel   694 15 sept. 08:01 DISCLAIMER.txt
drwxr-xr-x 7 root root   4096 21 oct. 17:32 docs
drwxr-xr-x 5  502 wheel    40 15 sept. 08:01 examples
drwxrwxrwx 2 root root   4096 22 oct. 11:21 hadooplibs
-rwxrwxrwx 1  502 wheel 12281 15 sept. 08:01 LICENSE.txt
drwxr-xr-x 3 root root   4096 23 oct. 16:52 logs
-rwxrwxrwx 1  502 wheel   181 15 sept. 08:01 NOTICE.txt
drwxr-xr-x 3 root root     19 23 oct. 16:41 oozie
-rwxrwxrwx 1  502 wheel  5823 15 sept. 08:01 README
drwxr-xr-x 3 root root     19 23 oct. 16:41 server

How can I get more traces?

Thanks,
Tony

________________________________
From: [email protected] [[email protected]] on behalf of Seetharam Venkatesh [[email protected]]
Sent: Wednesday, 22 October 2014 18:38
To: Tony Reix; [email protected]
Cc: [email protected]
Subject: Re: Falcon

Hi Tony,

Sorry that you are having issues. It'd be better to send it to the dev ML, and I have copied that. Also, why are you trying to build Falcon, since we have the released bits for the 0.5 version?

Let's dig into this further:
* try executing 'bin/falcon admin -version'
* logs are at $FALCON_HOME/logs; specifically look at falcon.out.xxxx and see if the server started ok
* the web page will not have anything since this is a fresh install and does not have entities added
* let's follow the examples and add a few entities

Let us know how this goes. Thanks!

On Wed, Oct 22, 2014 at 8:26 AM, Tony Reix <[email protected]> wrote:

Hi Venkatesh,

I am trying to run the Falcon examples, and I have issues. I've googled the issue and only found something close but different:
http://mail-archives.apache.org/mod_mbox/falcon-dev/201404.mbox/%[email protected]%3E

I'm following http://falcon.incubator.apache.org/0.5-incubating/InstallationSteps.html . I have a Hadoop 2.4.1 cluster running (checked by running a Pi computation).
And I've compiled Falcon saying the Hadoop version is 2.4.1.

I did:
bin/falcon-start -port 15000
bin/prism-start -port 16000

I have:
# bin/prism-status
Hadoop is installed, adding hadoop classpath to falcon classpath
Falcon server is running (on http://laurel5:16000/)
# bin/falcon-status
Hadoop is installed, adding hadoop classpath to falcon classpath

When browsing "http://localhost:15000/", I only see "Apache Falcon" appearing on the page. What should I expect? Where can I find some images showing the expected page?

When trying to run the example, I have:

# sh -x bin/falcon entity -submit -type cluster -file examples/entity/filesystem/standalone-cluster.xml
+ PRG=bin/falcon
+ '[' -h bin/falcon ']'
++ dirname bin/falcon
+ BASEDIR=bin
++ cd bin/..
++ pwd
+ BASEDIR=/opt/falcon-distributed-0.5-incubating
+ . /opt/falcon-distributed-0.5-incubating/bin/falcon-config.sh client
++ PRG=bin/falcon
++ '[' -h bin/falcon ']'
+++ dirname bin/falcon
++ BASEDIR=bin
+++ cd bin/..
+++ pwd
++ BASEDIR=/opt/falcon-distributed-0.5-incubating
++ '[' -z '' ']'
++ FALCON_CONF=/opt/falcon-distributed-0.5-incubating/conf
++ export FALCON_CONF
++ '[' -f /opt/falcon-distributed-0.5-incubating/conf/falcon-env.sh ']'
++ . /opt/falcon-distributed-0.5-incubating/conf/falcon-env.sh
+++ export JAVA_HOME=/usr/lib/jvm/ibm-java-ppc64-71
+++ JAVA_HOME=/usr/lib/jvm/ibm-java-ppc64-71
++ test -z /usr/lib/jvm/ibm-java-ppc64-71
++ JAVA_BIN=/usr/lib/jvm/ibm-java-ppc64-71/bin/java
++ JAR_BIN=/usr/lib/jvm/ibm-java-ppc64-71/bin/jar
++ export JAVA_BIN
++ '[' '!' -e /usr/lib/jvm/ibm-java-ppc64-71/bin/java ']'
++ '[' '!' -e /usr/lib/jvm/ibm-java-ppc64-71/bin/jar ']'
++ DEFAULT_JAVA_HEAP_MAX=-Xmx1024m
++ FALCON_OPTS='-Xmx1024m '
++ type=client
++ shift
++ case $type in
++ FALCONCPPATH='/opt/falcon-distributed-0.5-incubating/conf:/opt/falcon-distributed-0.5-incubating/client/lib/*'
+++ ls /opt/falcon-distributed-0.5-incubating/server/webapp
++ for i in '`ls ${BASEDIR}/server/webapp`'
++ FALCONCPPATH='/opt/falcon-distributed-0.5-incubating/conf:/opt/falcon-distributed-0.5-incubating/client/lib/*:falcon/WEB-INF/lib/*'
++ for i in '`ls ${BASEDIR}/server/webapp`'
++ FALCONCPPATH='/opt/falcon-distributed-0.5-incubating/conf:/opt/falcon-distributed-0.5-incubating/client/lib/*:falcon/WEB-INF/lib/*:falcon.war/WEB-INF/lib/*'
++ for i in '`ls ${BASEDIR}/server/webapp`'
++ FALCONCPPATH='/opt/falcon-distributed-0.5-incubating/conf:/opt/falcon-distributed-0.5-incubating/client/lib/*:falcon/WEB-INF/lib/*:falcon.war/WEB-INF/lib/*:prism/WEB-INF/lib/*'
++ for i in '`ls ${BASEDIR}/server/webapp`'
++ FALCONCPPATH='/opt/falcon-distributed-0.5-incubating/conf:/opt/falcon-distributed-0.5-incubating/client/lib/*:falcon/WEB-INF/lib/*:falcon.war/WEB-INF/lib/*:prism/WEB-INF/lib/*:prism.war/WEB-INF/lib/*'
++ FALCON_OPTS='-Xmx1024m '
++ export FALCONCPPATH
++ export FALCON_OPTS
+ JAVA_PROPERTIES='-Xmx1024m '
+ [[ entity =~ ^-D ]]
+ /usr/lib/jvm/ibm-java-ppc64-71/bin/java -Xmx1024m -cp '/opt/falcon-distributed-0.5-incubating/conf:/opt/falcon-distributed-0.5-incubating/client/lib/*:falcon/WEB-INF/lib/*:falcon.war/WEB-INF/lib/*:prism/WEB-INF/lib/*:prism.war/WEB-INF/lib/*' -Dfalcon.log.dir=/root -Dfalcon.app.type=client org.apache.falcon.cli.FalconCLI entity -submit -type cluster -file examples/entity/filesystem/standalone-cluster.xml
Error: Bad Request; local/org.apache.falcon.FalconException::com.sun.jersey.api.client.ClientHandlerException: java.net.ConnectException: Connexion refused

Can you help? How can I dig into the issue? Where are the logs? How do I get debug traces?

Thanks,
Tony
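On the debug-traces question, one generic option (only a sketch; it assumes the standard log4j configuration that the distribution ships under conf/) is to raise the logger levels there and restart the server:

# vi conf/log4j.xml          (set the relevant loggers from info to debug; the file name is an assumption)
# bin/falcon-stop
# bin/falcon-start -port 15000

The DEBUG lines already visible in falcon.application.log suggest part of the logging is verbose out of the box.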
--
Regards,
Venkatesh

"Perfection (in design) is achieved not when there is nothing more to add, but rather when there is nothing more to take away." - Antoine de Saint-Exupéry
Attachments: logs.http.tar.gz, logs.https.tar.gz
