Hi,

You used a Hadoop-less Docker image, so it cannot find the Hadoop
dependencies. That is fine if you don't need any: the bolded messages
are just INFO-level log entries, not errors.
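If you do end up needing the Hadoop-backed file systems later, the change
would be in the deployment manifests rather than the Flink config: point the
containers at one of the Hadoop-bundled image tags instead of flink:latest.
A minimal sketch; the exact hadoop28 tag is an assumption, so check the
available tags on Docker Hub before relying on it:

```yaml
# taskmanager-deployment.yaml (excerpt)
# The tag below is an assumption -- verify it exists on Docker Hub.
spec:
  template:
    spec:
      containers:
      - name: taskmanager
        image: flink:1.7.0-hadoop28-scala_2.12   # instead of flink:latest
```

The same image change would apply to jobmanager-deployment.yaml, so both
roles run with the same classpath.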

Best,

Dawid

On 19/12/2018 12:58, Alexandru Gutan wrote:
> Dear all,
>
> I followed the instructions found here:
> https://ci.apache.org/projects/flink/flink-docs-release-1.7/ops/deployment/kubernetes.html
> Minikube version 0.31-01
> Kubernetes version 1.10
> Flink Docker image: flink:latest (1.7.0-scala_2.12)
>
> I ran the following commands:
>
> minikube start
> minikube ssh 'sudo ip link set docker0 promisc on'
> kubectl create -f jobmanager-deployment.yaml
> kubectl create -f taskmanager-deployment.yaml
> kubectl create -f jobmanager-service.yaml
>
> The 2 taskmanagers fail.
> Output:
>
> Starting Task Manager
> config file:
> jobmanager.rpc.address: flink-jobmanager
> jobmanager.rpc.port: 6123
> jobmanager.heap.size: 1024m
> taskmanager.heap.size: 1024m
> taskmanager.numberOfTaskSlots: 2
> parallelism.default: 1
> rest.port: 8081
> blob.server.port: 6124
> query.server.port: 6125
> Starting taskexecutor as a console application on host
> flink-taskmanager-7679c9d55d-n2trk.
> 2018-12-19 11:42:45,216 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -
> --------------------------------------------------------------------------------
> 2018-12-19 11:42:45,218 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       - 
> Starting TaskManager (Version: 1.7.0, Rev:49da9f9, Date:28.11.2018 @
> 17:59:06 UTC)
> 2018-12-19 11:42:45,218 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -  OS
> current user: flink
> 2018-12-19 11:42:45,219 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       - 
> *Current Hadoop/Kerberos user: <no hadoop dependency found>*
> 2018-12-19 11:42:45,219 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -  JVM:
> OpenJDK 64-Bit Server VM - Oracle Corporation - 1.8/25.181-b13
> 2018-12-19 11:42:45,219 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       - 
> Maximum heap size: 922 MiBytes
> 2018-12-19 11:42:45,220 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       - 
> JAVA_HOME: /docker-java-home/jre
> 2018-12-19 11:42:45,220 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -  No
> Hadoop Dependency available
> 2018-12-19 11:42:45,221 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -  JVM
> Options:
> 2018-12-19 11:42:45,221 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -    
> -XX:+UseG1GC
> 2018-12-19 11:42:45,221 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -    
> -Xms922M
> 2018-12-19 11:42:45,221 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -    
> -Xmx922M
> 2018-12-19 11:42:45,221 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -    
> -XX:MaxDirectMemorySize=8388607T
> 2018-12-19 11:42:45,223 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -    
> -Dlog4j.configuration=file:/opt/flink/conf/log4j-console.properties
> 2018-12-19 11:42:45,223 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -    
> -Dlogback.configurationFile=file:/opt/flink/conf/logback-console.xml
> 2018-12-19 11:42:45,223 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       - 
> Program Arguments:
> 2018-12-19 11:42:45,223 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -    
> --configDir
> 2018-12-19 11:42:45,224 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -    
> /opt/flink/conf
> 2018-12-19 11:42:45,224 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       - 
> Classpath:
> /opt/flink/lib/flink-python_2.12-1.7.0.jar:/opt/flink/lib/log4j-1.2.17.jar:/opt/flink/lib/slf4j-log4j12-1.7.15.jar:/opt/flink/lib/flink-dist_2.12-1.7.0.jar:::
> 2018-12-19 11:42:45,224 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -
> --------------------------------------------------------------------------------
> 2018-12-19 11:42:45,228 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -
> Registered UNIX signal handlers for [TERM, HUP, INT]
> 2018-12-19 11:42:45,233 INFO 
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -
> Maximum number of open file descriptors is 1048576.
> 2018-12-19 11:42:45,249 INFO 
> org.apache.flink.configuration.GlobalConfiguration            -
> Loading configuration property: jobmanager.rpc.address, flink-jobmanager
> 2018-12-19 11:42:45,250 INFO 
> org.apache.flink.configuration.GlobalConfiguration            -
> Loading configuration property: jobmanager.rpc.port, 6123
> 2018-12-19 11:42:45,251 INFO 
> org.apache.flink.configuration.GlobalConfiguration            -
> Loading configuration property: jobmanager.heap.size, 1024m
> 2018-12-19 11:42:45,251 INFO 
> org.apache.flink.configuration.GlobalConfiguration            -
> Loading configuration property: taskmanager.heap.size, 1024m
> 2018-12-19 11:42:45,251 INFO 
> org.apache.flink.configuration.GlobalConfiguration            -
> Loading configuration property: taskmanager.numberOfTaskSlots, 2
> 2018-12-19 11:42:45,252 INFO 
> org.apache.flink.configuration.GlobalConfiguration            -
> Loading configuration property: parallelism.default, 1
> 2018-12-19 11:42:45,252 INFO 
> org.apache.flink.configuration.GlobalConfiguration            -
> Loading configuration property: rest.port, 8081
> 2018-12-19 11:42:45,254 INFO 
> org.apache.flink.configuration.GlobalConfiguration            -
> Loading configuration property: blob.server.port, 6124
> 2018-12-19 11:42:45,254 INFO 
> org.apache.flink.configuration.GlobalConfiguration            -
> Loading configuration property: query.server.port, 6125
> *2018-12-19 11:42:45,261 INFO 
> org.apache.flink.core.fs.FileSystem                           - Hadoop
> is not in the classpath/dependencies. The extended set of supported
> File Systems via Hadoop is not available.
> 2018-12-19 11:42:45,282 INFO 
> org.apache.flink.runtime.security.modules.HadoopModuleFactory  -
> Cannot create Hadoop Security Module because Hadoop cannot be found in
> the Classpath.
> 2018-12-19 11:42:45,311 INFO 
> org.apache.flink.runtime.security.SecurityUtils               - Cannot
> install HadoopSecurityContext because Hadoop cannot be found in the
> Classpath.*
>
> Any suggestions? Should I try maybe the Hadoop images? (I'm not
> planning to integrate with Hadoop)
>
> Thank you!