It is odd that the YARN application log shows the SQLContext being created
successfully, while the Zeppelin side reports the error "Fail to create
SQLContext".

Ben Vogan <b...@shopkick.com> wrote on Monday, May 15, 2017 at 8:07 PM:

> I am using 0.7.1 and I checked the yarn app log and don't see any errors.
> It looks like this:
>
> 17/05/16 00:04:12 INFO yarn.ApplicationMaster: Registered signal handlers for 
> [TERM, HUP, INT]
> 17/05/16 00:04:13 INFO yarn.ApplicationMaster: ApplicationAttemptId: 
> appattempt_1494373289850_0336_000001
> 17/05/16 00:04:13 INFO spark.SecurityManager: Changing view acls to: yarn,hdfs
> 17/05/16 00:04:13 INFO spark.SecurityManager: Changing modify acls to: 
> yarn,hdfs
> 17/05/16 00:04:13 INFO spark.SecurityManager: SecurityManager: authentication 
> disabled; ui acls disabled; users with view permissions: Set(yarn, hdfs); 
> users with modify permissions: Set(yarn, hdfs)
> 17/05/16 00:04:13 INFO yarn.ApplicationMaster: Starting the user application 
> in a separate Thread
> 17/05/16 00:04:13 INFO yarn.ApplicationMaster: Waiting for spark context 
> initialization
> 17/05/16 00:04:13 INFO yarn.ApplicationMaster: Waiting for spark context 
> initialization ...
> 17/05/16 00:04:14 INFO driver.RSCDriver: Connecting to: 
> jarvis-hue002.internal.shopkick.com:40819
> 17/05/16 00:04:14 INFO driver.RSCDriver: Starting RPC server...
> 17/05/16 00:04:14 WARN rsc.RSCConf: Your hostname, 
> jarvis-yarn008.internal.shopkick.com, resolves to a loopback address, but we 
> couldn't find any external IP address!
> 17/05/16 00:04:14 WARN rsc.RSCConf: Set livy.rsc.rpc.server.address if you 
> need to bind to another address.
> 17/05/16 00:04:14 INFO driver.RSCDriver: Received job request 
> cd7d1356-709d-4674-a85c-21edade2c38d
> 17/05/16 00:04:14 INFO driver.RSCDriver: SparkContext not yet up, queueing 
> job request.
> 17/05/16 00:04:17 INFO spark.SparkContext: Running Spark version 1.6.0
> 17/05/16 00:04:17 INFO spark.SecurityManager: Changing view acls to: yarn,hdfs
> 17/05/16 00:04:17 INFO spark.SecurityManager: Changing modify acls to: 
> yarn,hdfs
> 17/05/16 00:04:17 INFO spark.SecurityManager: SecurityManager: authentication 
> disabled; ui acls disabled; users with view permissions: Set(yarn, hdfs); 
> users with modify permissions: Set(yarn, hdfs)
> 17/05/16 00:04:17 INFO util.Utils: Successfully started service 'sparkDriver' 
> on port 53267.
> 17/05/16 00:04:18 INFO slf4j.Slf4jLogger: Slf4jLogger started
> 17/05/16 00:04:18 INFO Remoting: Starting remoting
> 17/05/16 00:04:18 INFO Remoting: Remoting started; listening on addresses 
> :[akka.tcp://sparkDriverActorSystem@10.19.194.147:38037]
> 17/05/16 00:04:18 INFO Remoting: Remoting now listens on addresses: 
> [akka.tcp://sparkDriverActorSystem@10.19.194.147:38037]
> 17/05/16 00:04:18 INFO util.Utils: Successfully started service 
> 'sparkDriverActorSystem' on port 38037.
> 17/05/16 00:04:18 INFO spark.SparkEnv: Registering MapOutputTracker
> 17/05/16 00:04:18 INFO spark.SparkEnv: Registering BlockManagerMaster
> 17/05/16 00:04:18 INFO storage.DiskBlockManager: Created local directory at 
> /yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/blockmgr-f46429a6-7466-42c1-bd79-9ddf6ec61cb4
> 17/05/16 00:04:18 INFO storage.MemoryStore: MemoryStore started with capacity 
> 1966.1 MB
> 17/05/16 00:04:18 INFO spark.SparkEnv: Registering OutputCommitCoordinator
> 17/05/16 00:04:18 INFO ui.JettyUtils: Adding filter: 
> org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
> 17/05/16 00:04:18 INFO util.Utils: Successfully started service 'SparkUI' on 
> port 49024.
> 17/05/16 00:04:18 INFO ui.SparkUI: Started SparkUI at 
> http://10.19.194.147:49024
> 17/05/16 00:04:18 INFO spark.SparkContext: Added JAR 
> file:/services/livy-server/livy-server-current/rsc-jars/livy-api-0.3.0.jar at 
> spark://10.19.194.147:53267/jars/livy-api-0.3.0.jar with timestamp 
> 1494893058608
> 17/05/16 00:04:18 INFO spark.SparkContext: Added JAR 
> file:/services/livy-server/livy-server-current/rsc-jars/livy-rsc-0.3.0.jar at 
> spark://10.19.194.147:53267/jars/livy-rsc-0.3.0.jar with timestamp 
> 1494893058609
> 17/05/16 00:04:18 INFO spark.SparkContext: Added JAR 
> file:/services/livy-server/livy-server-current/rsc-jars/netty-all-4.0.29.Final.jar
>  at spark://10.19.194.147:53267/jars/netty-all-4.0.29.Final.jar with 
> timestamp 1494893058609
> 17/05/16 00:04:18 INFO spark.SparkContext: Added JAR 
> hdfs://jarvis-nameservice001/jarvis_pipelines/vertica-jdbc-7.1.2-0.jar at 
> hdfs://jarvis-nameservice001/jarvis_pipelines/vertica-jdbc-7.1.2-0.jar with 
> timestamp 1494893058609
> 17/05/16 00:04:18 INFO spark.SparkContext: Added JAR 
> hdfs://jarvis-nameservice001/jarvis_pipelines/shopkick-data-pipeline.jar at 
> hdfs://jarvis-nameservice001/jarvis_pipelines/shopkick-data-pipeline.jar with 
> timestamp 1494893058609
> 17/05/16 00:04:18 INFO spark.SparkContext: Added JAR 
> file:/services/livy-server/livy-server-current/repl_2.10-jars/commons-codec-1.9.jar
>  at spark://10.19.194.147:53267/jars/commons-codec-1.9.jar with timestamp 
> 1494893058609
> 17/05/16 00:04:18 INFO spark.SparkContext: Added JAR 
> file:/services/livy-server/livy-server-current/repl_2.10-jars/livy-repl_2.10-0.3.0.jar
>  at spark://10.19.194.147:53267/jars/livy-repl_2.10-0.3.0.jar with timestamp 
> 1494893058609
> 17/05/16 00:04:18 INFO spark.SparkContext: Added JAR 
> file:/services/livy-server/livy-server-current/repl_2.10-jars/livy-core_2.10-0.3.0.jar
>  at spark://10.19.194.147:53267/jars/livy-core_2.10-0.3.0.jar with timestamp 
> 1494893058609
> 17/05/16 00:04:18 INFO cluster.YarnClusterScheduler: Created 
> YarnClusterScheduler
> 17/05/16 00:04:18 INFO util.Utils: Successfully started service 
> 'org.apache.spark.network.netty.NettyBlockTransferService' on port 57551.
> 17/05/16 00:04:18 INFO netty.NettyBlockTransferService: Server created on 
> 57551
> 17/05/16 00:04:18 INFO storage.BlockManager: external shuffle service port = 
> 7337
> 17/05/16 00:04:18 INFO storage.BlockManagerMaster: Trying to register 
> BlockManager
> 17/05/16 00:04:18 INFO storage.BlockManagerMasterEndpoint: Registering block 
> manager 10.19.194.147:57551 with 1966.1 MB RAM, BlockManagerId(driver, 
> 10.19.194.147, 57551)
> 17/05/16 00:04:18 INFO storage.BlockManagerMaster: Registered BlockManager
> 17/05/16 00:04:19 INFO scheduler.EventLoggingListener: Logging events to 
> hdfs://jarvis-nameservice001/user/spark/applicationHistory/application_1494373289850_0336_1
> 17/05/16 00:04:19 INFO cluster.YarnClusterSchedulerBackend: SchedulerBackend 
> is ready for scheduling beginning after reached minRegisteredResourcesRatio: 
> 0.8
> 17/05/16 00:04:19 INFO cluster.YarnClusterScheduler: 
> YarnClusterScheduler.postStartHook done
> 17/05/16 00:04:19 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: 
> ApplicationMaster registered as 
> NettyRpcEndpointRef(spark://YarnAM@10.19.194.147:53267)
> 17/05/16 00:04:19 INFO yarn.YarnRMClient: Registering the ApplicationMaster
> 17/05/16 00:04:19 INFO yarn.ApplicationMaster: Started progress reporter 
> thread with (heartbeat : 3000, initial allocation : 200) intervals
> 17/05/16 00:04:19 INFO hive.HiveContext: Initializing execution hive, version 
> 1.1.0
> 17/05/16 00:04:19 INFO client.ClientWrapper: Inspected Hadoop version: 
> 2.6.0-cdh5.7.0
> 17/05/16 00:04:19 INFO client.ClientWrapper: Loaded 
> org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0-cdh5.7.0
> 17/05/16 00:04:20 INFO hive.metastore: Trying to connect to metastore with 
> URI thrift://jarvis-hdfs003.internal.shopkick.com:9083
> 17/05/16 00:04:20 INFO hive.metastore: Opened a connection to metastore, 
> current connections: 1
> 17/05/16 00:04:20 INFO hive.metastore: Connected to metastore.
> 17/05/16 00:04:20 INFO session.SessionState: Created HDFS directory: 
> file:/yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/spark-2217d267-a3c0-4cf4-9565-45f80517d41c/scratch/hdfs
> 17/05/16 00:04:20 INFO session.SessionState: Created local directory: 
> /yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/yarn
> 17/05/16 00:04:20 INFO session.SessionState: Created local directory: 
> /yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/478f39e9-5295-4e8e-97aa-40b5828f9440_resources
> 17/05/16 00:04:20 INFO session.SessionState: Created HDFS directory: 
> file:/yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/spark-2217d267-a3c0-4cf4-9565-45f80517d41c/scratch/hdfs/478f39e9-5295-4e8e-97aa-40b5828f9440
> 17/05/16 00:04:20 INFO session.SessionState: Created local directory: 
> /yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/yarn/478f39e9-5295-4e8e-97aa-40b5828f9440
> 17/05/16 00:04:20 INFO session.SessionState: Created HDFS directory: 
> file:/yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/spark-2217d267-a3c0-4cf4-9565-45f80517d41c/scratch/hdfs/478f39e9-5295-4e8e-97aa-40b5828f9440/_tmp_space.db
> 17/05/16 00:04:20 INFO session.SessionState: No Tez session required at this 
> point. hive.execution.engine=mr.
> 17/05/16 00:04:20 INFO repl.SparkInterpreter: Created sql context (with Hive 
> support).
>
>
> On Mon, May 15, 2017 at 5:43 PM, Jeff Zhang <zjf...@gmail.com> wrote:
>
>>
>> Which version of Zeppelin do you use? And can you check the yarn app log?
>>
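>> (If it helps: assuming YARN log aggregation is enabled, something like
>>
>>     yarn logs -applicationId <application id>
>>
>> should dump the full aggregated application log.)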
>>
>> Ben Vogan <b...@shopkick.com> wrote on Monday, May 15, 2017 at 5:56 PM:
>>
>>> Hi all,
>>>
>>> For some reason today I'm getting a stack trace:
>>>
>>> org.apache.zeppelin.livy.LivyException: Fail to create
>>> SQLContext,<console>:4: error: illegal inheritance;
>>> at
>>> org.apache.zeppelin.livy.LivySparkSQLInterpreter.open(LivySparkSQLInterpreter.java:76)
>>> at
>>> org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
>>> at
>>> org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:483)
>>> at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
>>> at
>>> org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
>>> at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>> at
>>> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
>>> at
>>> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> On the Livy server I see no errors and there is an open session on yarn.
>>>
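>>> (Assuming Livy is on its default port 8998, the REST API should show
>>> whether the statement actually failed on the Livy side, e.g.:
>>>
>>>     curl http://<livy-host>:8998/sessions
>>>     curl http://<livy-host>:8998/sessions/<session id>/statements
>>>
>>> The statement output there should carry the full error text.)
>>>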
>>> Some help on this would be greatly appreciated!
>>>
>>> --Ben
>>>
>>> On Sun, May 14, 2017 at 6:16 AM, Ben Vogan <b...@shopkick.com> wrote:
>>>
>>>> Hi all,
>>>>
>>>> I've been using Zeppelin for a couple of weeks now with a stable
>>>> configuration, but all of a sudden I am getting "Illegal inheritance"
>>>> errors like so:
>>>>
>>>>  INFO [2017-05-14 03:25:32,678] ({pool-2-thread-56}
>>>> Paragraph.java[jobRun]:362) - run paragraph 20170514-032326_663206142 using
>>>> livy org.apache.zeppelin.interpreter.LazyOpenInterpreter@505a171c
>>>>  WARN [2017-05-14 03:25:33,696] ({pool-2-thread-56}
>>>> NotebookServer.java[afterStatusChange]:2058) - Job
>>>> 20170514-032326_663206142 is finished, status: ERROR, exception: null,
>>>> result: %text <console>:4: error: illegal inheritance;
>>>>
>>>> It happens across multiple notebooks and with both my spark and livy
>>>> interpreters.  I don't know where to look for more information about what
>>>> is wrong.  I don't see any errors in spark/yarn at all.  The driver got
>>>> created, but it looks like no jobs were ever submitted to spark.
>>>>
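>>>> (One more place to look, assuming a default Zeppelin layout: the
>>>> interpreter-side log under $ZEPPELIN_HOME/logs, e.g.
>>>>
>>>>     logs/zeppelin-interpreter-livy-*.log
>>>>
>>>> sometimes carries the full compiler output that the notebook truncates.)
>>>>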
>>>> Help would be greatly appreciated.
>>>>
>>>> Thanks,
>>>>
>>>> --
>>>> *BENJAMIN VOGAN* | Data Platform Team Lead
>>>>
>>>
>>>
>>>
>>> --
>>> *BENJAMIN VOGAN* | Data Platform Team Lead
>>>
>>
>
>
> --
> *BENJAMIN VOGAN* | Data Platform Team Lead
>
