[jira] [Updated] (LIVY-995) JsonParseException is thrown when closing Livy session when using python profile

2024-01-18 Thread Jianzhen Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/LIVY-995?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jianzhen Wu updated LIVY-995:
-
Description: 
Start pyspark and enable spark.python.profile:
{code:java}
./bin/pyspark --master local --conf spark.python.profile=true
{code}
 
Execute some code that uses a Spark RDD. When pyspark exits, PySpark prints the 
profile information:
{code:java}
>>> rdd = sc.parallelize(range(100)).map(str)
>>> rdd.count()
[Stage 0:>                                                          (0 + 1) / 1]
100
>>>

Profile of RDD

         244 function calls (241 primitive calls) in 0.001 seconds
 
   Ordered by: internal time, cumulative time
 
   ncalls  tottime  percall  cumtime  percall filename:lineno(function)
      101    0.000    0.000    0.000    0.000 rdd.py:1237()
      101    0.000    0.000    0.000    0.000 util.py:72(wrapper)
        1    0.000    0.000    0.000    0.000 serializers.py:255(dump_stream)
        1    0.000    0.000    0.000    0.000 serializers.py:213(load_stream)
        2    0.000    0.000    0.000    0.000 {built-in method builtins.sum}
        1    0.000    0.000    0.001    0.001 worker.py:607(process)
        1    0.000    0.000    0.000    0.000 context.py:549(f)
        1    0.000    0.000    0.000    0.000 {built-in method _pickle.dumps}
        1    0.000    0.000    0.000    0.000 serializers.py:561(read_int)
        1    0.000    0.000    0.000    0.000 serializers.py:568(write_int)
      4/1    0.000    0.000    0.000    0.000 rdd.py:2917(pipeline_func)
        1    0.000    0.000    0.000    0.000 serializers.py:426(dumps)
        1    0.000    0.000    0.000    0.000 rdd.py:1237()
        1    0.000    0.000    0.000    0.000 serializers.py:135(load_stream)
        2    0.000    0.000    0.000    0.000 rdd.py:1072(func)
        1    0.000    0.000    0.000    0.000 rdd.py:384(func)
        1    0.000    0.000    0.000    0.000 util.py:67(fail_on_stopiteration)
        1    0.000    0.000    0.000    0.000 serializers.py:151(_read_with_length)
        2    0.000    0.000    0.000    0.000 context.py:546(getStart)
        3    0.000    0.000    0.000    0.000 rdd.py:416(func)
        1    0.000    0.000    0.000    0.000 serializers.py:216(_load_stream_without_unbatching)
        2    0.000    0.000    0.000    0.000 {method 'write' of '_io.BufferedWriter' objects}
        1    0.000    0.000    0.000    0.000 {method 'read' of '_io.BufferedReader' objects}
        1    0.000    0.000    0.000    0.000 {built-in method _operator.add}
        1    0.000    0.000    0.000    0.000 {built-in method builtins.hasattr}
        3    0.000    0.000    0.000    0.000 {built-in method builtins.len}
        1    0.000    0.000    0.000    0.000 {built-in method _struct.unpack}
        1    0.000    0.000    0.000    0.000 rdd.py:1226()
        1    0.000    0.000    0.000    0.000 {method 'close' of 'generator' objects}
        1    0.000    0.000    0.000    0.000 {built-in method from_iterable}
        1    0.000    0.000    0.000    0.000 {built-in method _struct.pack}
        1    0.000    0.000    0.000    0.000 {method 'disable' of '_lsprof.Profiler' objects}
        1    0.000    0.000    0.000    0.000 {built-in method builtins.iter}
{code}
 
This is because Spark registers show_profiles to run at interpreter exit, in profiler.py:
{code:java}
    def add_profiler(self, id, profiler):
        """Add a profiler for RDD/UDF `id`"""
        if not self.profilers:
            if self.profile_dump_path:
                atexit.register(self.dump_profiles, self.profile_dump_path)
            else:
                atexit.register(self.show_profiles)
 
        self.profilers.append([id, profiler, False])
{code}
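
To make the failure concrete, here is a minimal standalone Java sketch (an 
illustration, not Livy code): the atexit hook prints a plain-text report that 
begins with a separator line of '=' characters, and feeding such text to a 
Jackson reader fails with the same JsonParseException that Livy reports below.
{code:java}
import com.fasterxml.jackson.databind.ObjectMapper;

public class ProfileOutputParseDemo {
    public static void main(String[] args) throws Exception {
        // Stand-in for the report PySpark prints at interpreter exit; the
        // leading '=' matches "Unexpected character ('=' (code 61))".
        String profilerOutput = "====================\nProfile of RDD\n====================";
        // Livy expects one JSON document per statement result, so this throws
        // com.fasterxml.jackson.core.JsonParseException.
        new ObjectMapper().readTree(profilerOutput);
    }
}
{code}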
 
 
For a Livy session, Livy does not convert this output to JSON, so reading the 
statement result throws the exception below:
 
{code:java}
com.fasterxml.jackson.core.JsonParseException: Unexpected character ('=' (code 
61)): expected a valid value (JSON String, Number, Array, Object or token 
'null', 'true' or 'false')
 at [Source: (String)""; line: 1, column: 2]
at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:2337)
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:710)
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportUnexpectedChar(ParserMinimalBase.java:635)
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._handleOddValue(ReaderBasedJsonParser.java:1952)
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:781)
at com.fasterxml.jackson.databind.ObjectReader._initForReading(ObjectReader.java:355)
at 

[jira] [Updated] (LIVY-995) JsonParseException is thrown when closing Livy session when using python profile

2024-01-18 Thread Jianzhen Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/LIVY-995?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jianzhen Wu updated LIVY-995:
-
  Component/s: REPL
Fix Version/s: 0.9.0

> JsonParseException is thrown when closing Livy session when using python 
> profile
> 
>
> Key: LIVY-995
> URL: https://issues.apache.org/jira/browse/LIVY-995
> Project: Livy
>  Issue Type: Improvement
>  Components: REPL
>Reporter: Jianzhen Wu
>Assignee: Jianzhen Wu
>Priority: Critical
> Fix For: 0.9.0
>
>
> Start pyspark and enable spark.python.profile:
> {code:java}
> ./bin/pyspark --master local --conf spark.python.profile=true
> {code}
>  
> Execute some code that uses a Spark RDD. When pyspark exits, PySpark prints 
> the profile information:
> {code:java}
> >>> rdd = sc.parallelize(range(100)).map(str)
> >>> rdd.count()
> [Stage 0:>                                                          (0 + 1) / 1]
> 100
> >>>
> 
> Profile of RDD
> 
>          244 function calls (241 primitive calls) in 0.001 seconds
>  
>    Ordered by: internal time, cumulative time
>  
>    ncalls  tottime  percall  cumtime  percall filename:lineno(function)
>       101    0.000    0.000    0.000    0.000 rdd.py:1237()
>       101    0.000    0.000    0.000    0.000 util.py:72(wrapper)
>         1    0.000    0.000    0.000    0.000 serializers.py:255(dump_stream)
>         1    0.000    0.000    0.000    0.000 serializers.py:213(load_stream)
>         2    0.000    0.000    0.000    0.000 {built-in method builtins.sum}
>         1    0.000    0.000    0.001    0.001 worker.py:607(process)
>         1    0.000    0.000    0.000    0.000 context.py:549(f)
>         1    0.000    0.000    0.000    0.000 {built-in method _pickle.dumps}
>         1    0.000    0.000    0.000    0.000 serializers.py:561(read_int)
>         1    0.000    0.000    0.000    0.000 serializers.py:568(write_int)
>       4/1    0.000    0.000    0.000    0.000 rdd.py:2917(pipeline_func)
>         1    0.000    0.000    0.000    0.000 serializers.py:426(dumps)
>         1    0.000    0.000    0.000    0.000 rdd.py:1237()
>         1    0.000    0.000    0.000    0.000 serializers.py:135(load_stream)
>         2    0.000    0.000    0.000    0.000 rdd.py:1072(func)
>         1    0.000    0.000    0.000    0.000 rdd.py:384(func)
>         1    0.000    0.000    0.000    0.000 util.py:67(fail_on_stopiteration)
>         1    0.000    0.000    0.000    0.000 serializers.py:151(_read_with_length)
>         2    0.000    0.000    0.000    0.000 context.py:546(getStart)
>         3    0.000    0.000    0.000    0.000 rdd.py:416(func)
>         1    0.000    0.000    0.000    0.000 serializers.py:216(_load_stream_without_unbatching)
>         2    0.000    0.000    0.000    0.000 {method 'write' of '_io.BufferedWriter' objects}
>         1    0.000    0.000    0.000    0.000 {method 'read' of '_io.BufferedReader' objects}
>         1    0.000    0.000    0.000    0.000 {built-in method _operator.add}
>         1    0.000    0.000    0.000    0.000 {built-in method builtins.hasattr}
>         3    0.000    0.000    0.000    0.000 {built-in method builtins.len}
>         1    0.000    0.000    0.000    0.000 {built-in method _struct.unpack}
>         1    0.000    0.000    0.000    0.000 rdd.py:1226()
>         1    0.000    0.000    0.000    0.000 {method 'close' of 'generator' objects}
>         1    0.000    0.000    0.000    0.000 {built-in method from_iterable}
>         1    0.000    0.000    0.000    0.000 {built-in method _struct.pack}
>         1    0.000    0.000    0.000    0.000 {method 'disable' of '_lsprof.Profiler' objects}
>         1    0.000    0.000    0.000    0.000 {built-in method builtins.iter}
> {code}
>  
> This is because Spark registers show_profiles to run at interpreter exit, in profiler.py:
> {code:java}
>     def add_profiler(self, id, profiler):
>         """Add a profiler for RDD/UDF `id`"""
>         if not self.profilers:
>             if self.profile_dump_path:
>                 atexit.register(self.dump_profiles, self.profile_dump_path)
>             else:
>                 atexit.register(self.show_profiles)
>  
>         self.profilers.append([id, profiler, False])
> {code}
>  
>  
> For a Livy session, Livy does not convert this output to JSON, so reading the 
> statement result throws the exception below:
>  
> {code:java}
> 24/01/17 11:17:30 INFO [shutdown-hook-0] ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: User class threw exception: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('=' (code 61)): expected a valid value (JSON

[jira] [Updated] (LIVY-995) JsonParseException is thrown when closing Livy session when using python profile

2024-01-18 Thread Jianzhen Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/LIVY-995?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jianzhen Wu updated LIVY-995:
-
Description: 
Start pyspark and enable spark.python.profile:
{code:java}
./bin/pyspark --master local --conf spark.python.profile=true
{code}
 
Execute some code that uses a Spark RDD. When pyspark exits, PySpark prints the 
profile information:
{code:java}
>>> rdd = sc.parallelize(range(100)).map(str)
>>> rdd.count()
[Stage 0:>                                                          (0 + 1) / 1]
100
>>>

Profile of RDD

         244 function calls (241 primitive calls) in 0.001 seconds
 
   Ordered by: internal time, cumulative time
 
   ncalls  tottime  percall  cumtime  percall filename:lineno(function)
      101    0.000    0.000    0.000    0.000 rdd.py:1237()
      101    0.000    0.000    0.000    0.000 util.py:72(wrapper)
        1    0.000    0.000    0.000    0.000 serializers.py:255(dump_stream)
        1    0.000    0.000    0.000    0.000 serializers.py:213(load_stream)
        2    0.000    0.000    0.000    0.000 {built-in method builtins.sum}
        1    0.000    0.000    0.001    0.001 worker.py:607(process)
        1    0.000    0.000    0.000    0.000 context.py:549(f)
        1    0.000    0.000    0.000    0.000 {built-in method _pickle.dumps}
        1    0.000    0.000    0.000    0.000 serializers.py:561(read_int)
        1    0.000    0.000    0.000    0.000 serializers.py:568(write_int)
      4/1    0.000    0.000    0.000    0.000 rdd.py:2917(pipeline_func)
        1    0.000    0.000    0.000    0.000 serializers.py:426(dumps)
        1    0.000    0.000    0.000    0.000 rdd.py:1237()
        1    0.000    0.000    0.000    0.000 serializers.py:135(load_stream)
        2    0.000    0.000    0.000    0.000 rdd.py:1072(func)
        1    0.000    0.000    0.000    0.000 rdd.py:384(func)
        1    0.000    0.000    0.000    0.000 util.py:67(fail_on_stopiteration)
        1    0.000    0.000    0.000    0.000 serializers.py:151(_read_with_length)
        2    0.000    0.000    0.000    0.000 context.py:546(getStart)
        3    0.000    0.000    0.000    0.000 rdd.py:416(func)
        1    0.000    0.000    0.000    0.000 serializers.py:216(_load_stream_without_unbatching)
        2    0.000    0.000    0.000    0.000 {method 'write' of '_io.BufferedWriter' objects}
        1    0.000    0.000    0.000    0.000 {method 'read' of '_io.BufferedReader' objects}
        1    0.000    0.000    0.000    0.000 {built-in method _operator.add}
        1    0.000    0.000    0.000    0.000 {built-in method builtins.hasattr}
        3    0.000    0.000    0.000    0.000 {built-in method builtins.len}
        1    0.000    0.000    0.000    0.000 {built-in method _struct.unpack}
        1    0.000    0.000    0.000    0.000 rdd.py:1226()
        1    0.000    0.000    0.000    0.000 {method 'close' of 'generator' objects}
        1    0.000    0.000    0.000    0.000 {built-in method from_iterable}
        1    0.000    0.000    0.000    0.000 {built-in method _struct.pack}
        1    0.000    0.000    0.000    0.000 {method 'disable' of '_lsprof.Profiler' objects}
        1    0.000    0.000    0.000    0.000 {built-in method builtins.iter}
{code}
 
This is because Spark registers show_profiles to run at interpreter exit, in profiler.py:
{code:java}
    def add_profiler(self, id, profiler):
        """Add a profiler for RDD/UDF `id`"""
        if not self.profilers:
            if self.profile_dump_path:
                atexit.register(self.dump_profiles, self.profile_dump_path)
            else:
                atexit.register(self.show_profiles)
 
        self.profilers.append([id, profiler, False])
{code}
 
 
For a Livy session, Livy does not convert this output to JSON, so reading the 
statement result throws the exception below:
 
{code:java}
24/01/17 11:17:30 INFO [shutdown-hook-0] ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: User class threw exception: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('=' (code 61)): expected a valid value (JSON String, Number, Array, Object or token 'null', 'true' or 'false')
 at [Source: (String)""; line: 1, column: 2]
at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:2337)
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:710)
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportUnexpectedChar(ParserMinimalBase.java:635)
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._handleOddValue(ReaderBasedJsonParser.java:1952)
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:781)
at 

[jira] [Created] (LIVY-995) JsonParseException is thrown when closing Livy session when using python profile

2024-01-18 Thread Jianzhen Wu (Jira)
Jianzhen Wu created LIVY-995:


 Summary: JsonParseException is thrown when closing Livy session 
when using python profile
 Key: LIVY-995
 URL: https://issues.apache.org/jira/browse/LIVY-995
 Project: Livy
  Issue Type: Improvement
Reporter: Jianzhen Wu
Assignee: Jianzhen Wu


Start pyspark and enable spark.python.profile:
{code:java}
./bin/pyspark --master local --conf spark.python.profile=true
{code}
 
Execute some code that uses a Spark RDD. When pyspark exits, PySpark prints the 
profile information:
{code:java}
>>> rdd = sc.parallelize(range(100)).map(str)
>>> rdd.count()
[Stage 0:>                                                          (0 + 1) / 1]
100
>>>

Profile of RDD

         244 function calls (241 primitive calls) in 0.001 seconds
 
   Ordered by: internal time, cumulative time
 
   ncalls  tottime  percall  cumtime  percall filename:lineno(function)
      101    0.000    0.000    0.000    0.000 rdd.py:1237()
      101    0.000    0.000    0.000    0.000 util.py:72(wrapper)
        1    0.000    0.000    0.000    0.000 serializers.py:255(dump_stream)
        1    0.000    0.000    0.000    0.000 serializers.py:213(load_stream)
        2    0.000    0.000    0.000    0.000 {built-in method builtins.sum}
        1    0.000    0.000    0.001    0.001 worker.py:607(process)
        1    0.000    0.000    0.000    0.000 context.py:549(f)
        1    0.000    0.000    0.000    0.000 {built-in method _pickle.dumps}
        1    0.000    0.000    0.000    0.000 serializers.py:561(read_int)
        1    0.000    0.000    0.000    0.000 serializers.py:568(write_int)
      4/1    0.000    0.000    0.000    0.000 rdd.py:2917(pipeline_func)
        1    0.000    0.000    0.000    0.000 serializers.py:426(dumps)
        1    0.000    0.000    0.000    0.000 rdd.py:1237()
        1    0.000    0.000    0.000    0.000 serializers.py:135(load_stream)
        2    0.000    0.000    0.000    0.000 rdd.py:1072(func)
        1    0.000    0.000    0.000    0.000 rdd.py:384(func)
        1    0.000    0.000    0.000    0.000 util.py:67(fail_on_stopiteration)
        1    0.000    0.000    0.000    0.000 serializers.py:151(_read_with_length)
        2    0.000    0.000    0.000    0.000 context.py:546(getStart)
        3    0.000    0.000    0.000    0.000 rdd.py:416(func)
        1    0.000    0.000    0.000    0.000 serializers.py:216(_load_stream_without_unbatching)
        2    0.000    0.000    0.000    0.000 {method 'write' of '_io.BufferedWriter' objects}
        1    0.000    0.000    0.000    0.000 {method 'read' of '_io.BufferedReader' objects}
        1    0.000    0.000    0.000    0.000 {built-in method _operator.add}
        1    0.000    0.000    0.000    0.000 {built-in method builtins.hasattr}
        3    0.000    0.000    0.000    0.000 {built-in method builtins.len}
        1    0.000    0.000    0.000    0.000 {built-in method _struct.unpack}
        1    0.000    0.000    0.000    0.000 rdd.py:1226()
        1    0.000    0.000    0.000    0.000 {method 'close' of 'generator' objects}
        1    0.000    0.000    0.000    0.000 {built-in method from_iterable}
        1    0.000    0.000    0.000    0.000 {built-in method _struct.pack}
        1    0.000    0.000    0.000    0.000 {method 'disable' of '_lsprof.Profiler' objects}
        1    0.000    0.000    0.000    0.000 {built-in method builtins.iter}
{code}
 
This is because Spark registers show_profiles to run at interpreter exit, in profiler.py:
{code:java}
    def add_profiler(self, id, profiler):
        """Add a profiler for RDD/UDF `id`"""
        if not self.profilers:
            if self.profile_dump_path:
                atexit.register(self.dump_profiles, self.profile_dump_path)
            else:
                atexit.register(self.show_profiles)
 
        self.profilers.append([id, profiler, False])
{code}
 
 
For a Livy session, Livy does not convert this output to JSON, so reading the 
statement result throws the exception below:
 
{code:java}
24/01/17 11:17:30 INFO [shutdown-hook-0] ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: User class threw exception: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('=' (code 61)): expected a valid value (JSON String, Number, Array, Object or token 'null', 'true' or 'false')
 at [Source: (String)""; line: 1, column: 2]
at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:2337)
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:710)
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportUnexpectedChar(ParserMinimalBase.java:635)
at 

[jira] [Created] (LIVY-987) NPE when waiting for thrift session to start timeout.

2023-08-25 Thread Jianzhen Wu (Jira)
Jianzhen Wu created LIVY-987:


 Summary: NPE when waiting for thrift session to start timeout.
 Key: LIVY-987
 URL: https://issues.apache.org/jira/browse/LIVY-987
 Project: Livy
  Issue Type: Bug
Reporter: Jianzhen Wu
Assignee: Jianzhen Wu


 

Livy waits up to 10 minutes for the session to start; if startup takes longer, 
it throws a timeout exception. That timeout exception has no cause set, so 
e.getCause returns null, and throwing it raises an NPE.

*Livy Code*
{code:java}
Try(Await.result(future, maxSessionWait)) match {
  case Success(session) => session
  case Failure(e) => throw e.getCause
} {code}
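
A minimal sketch of a null-safe alternative, in plain Java standing in for the 
Scala match above (illustrative only, not necessarily the committed fix):
{code:java}
public class NullCauseDemo {
    public static void main(String[] args) throws Throwable {
        // Stand-in for the exception Await.result throws on timeout: no cause set.
        Exception e = new java.util.concurrent.TimeoutException("session start timed out");

        // Buggy pattern: e.getCause() is null here, and throwing null raises a
        // NullPointerException that masks the real timeout (see the log below).
        // throw e.getCause();

        // Null-safe pattern: fall back to the original exception.
        Throwable cause = e.getCause();
        throw cause != null ? cause : e;
    }
}
{code}
The equivalent Scala guard would be something like throw Option(e.getCause).getOrElse(e).
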
*Error Log*
{code:java}
23/08/25 16:01:41 INFO  LivyExecuteStatementOperation: (Error executing query, currentState RUNNING, ,java.lang.NullPointerException)
23/08/25 16:01:41 ERROR  LivyExecuteStatementOperation: Error running hive query:
org.apache.hive.service.cli.HiveSQLException: java.lang.NullPointerException
        at org.apache.livy.thriftserver.LivyExecuteStatementOperation.execute(LivyExecuteStatementOperation.scala:186)
        at org.apache.livy.thriftserver.LivyExecuteStatementOperation$$anon$2$$anon$3.run(LivyExecuteStatementOperation.scala:105)
        at org.apache.livy.thriftserver.LivyExecuteStatementOperation$$anon$2$$anon$3.run(LivyExecuteStatementOperation.scala:102)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:2038)
        at org.apache.livy.thriftserver.LivyExecuteStatementOperation$$anon$2.run(LivyExecuteStatementOperation.scala:115)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NullPointerException
        at org.apache.livy.thriftserver.LivyThriftSessionManager.getLivySession(LivyThriftSessionManager.scala:99)
        at org.apache.livy.thriftserver.LivyExecuteStatementOperation.rpcClient$lzycompute(LivyExecuteStatementOperation.scala:65)
        at org.apache.livy.thriftserver.LivyExecuteStatementOperation.rpcClient(LivyExecuteStatementOperation.scala:58)
        at org.apache.livy.thriftserver.LivyExecuteStatementOperation.execute(LivyExecuteStatementOperation.scala:173)
 {code}
 

 



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Assigned] (LIVY-971) Support to get session variables when using JDBC to connect to Livy thrift server.

2023-02-24 Thread Jianzhen Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/LIVY-971?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jianzhen Wu reassigned LIVY-971:


Assignee: Jianzhen Wu

> Support to get session variables when using JDBC to connect to Livy thrift 
> server.
> --
>
> Key: LIVY-971
> URL: https://issues.apache.org/jira/browse/LIVY-971
> Project: Livy
>  Issue Type: New Feature
>  Components: Thriftserver
>Reporter: Jianzhen Wu
>Assignee: Jianzhen Wu
>Priority: Major
> Fix For: 0.9.0
>
>
> I don’t know if you have encountered the following scenario.
>  # When other platforms integrate with Livy over JDBC, they need to save the 
> session ID and application ID.
>  # When connection creation fails, the cause may be that the Spark session 
> failed to start, and they need a way to check why it failed.
> I thought it would be useful to provide an action statement that returns the 
> fields listed below.
> {code:java}
> // code placeholder
> DESC LIVY SESSION; {code}
> ||field||
> |id|
> |appId|
> |state|
> |logs|
>  
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (LIVY-971) Support to get session variables when using JDBC to connect to Livy thrift server.

2023-02-24 Thread Jianzhen Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/LIVY-971?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jianzhen Wu updated LIVY-971:
-
Description: 
I don’t know if you have encountered the following scenario.
 # When other platforms integrate with Livy over JDBC, they need to save the 
session ID and application ID.
 # When connection creation fails, the cause may be that the Spark session 
failed to start, and they need a way to check why it failed.

I thought it would be useful to provide an action statement that returns the 
fields listed below.
{code:java}
// code placeholder
DESC LIVY SESSION; {code}
||field||
|id|
|appId|
|state|
|logs|
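
For illustration, a hypothetical JDBC usage sketch. The connection URL, port, 
and result-set accessors below are assumptions; only the DESC LIVY SESSION 
statement itself is part of this proposal.
{code:java}
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DescLivySessionDemo {
    public static void main(String[] args) throws Exception {
        // Placeholder URL for a Livy thrift server speaking the HiveServer2 protocol.
        try (Connection conn = DriverManager.getConnection("jdbc:hive2://livy-host:10090/");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("DESC LIVY SESSION")) {
            while (rs.next()) {
                // A platform integrating over JDBC could persist these values.
                System.out.printf("id=%s appId=%s state=%s%n",
                        rs.getString("id"), rs.getString("appId"), rs.getString("state"));
            }
        }
    }
}
{code}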

 

 

  was:
I don’t know if you have encountered the following scenario.
 # When other platforms integrate with Livy over JDBC, they need to save the 
session ID and application ID.
 # When connection creation fails, the cause may be that the Spark session 
failed to start, and they need a way to check why it failed.

I thought it would be useful to provide an action statement that returns the 
fields listed below.
{code:java}
// code placeholder
DESC LIVY SESSION; {code}
||field||
|id|
|appId|
|state|
|stderr|
|stdout|
|yarnDiagnostics|

 

 


> Support to get session variables when using JDBC to connect to Livy thrift 
> server.
> --
>
> Key: LIVY-971
> URL: https://issues.apache.org/jira/browse/LIVY-971
> Project: Livy
>  Issue Type: New Feature
>  Components: Thriftserver
>Reporter: Jianzhen Wu
>Priority: Major
> Fix For: 0.9.0
>
>
> I don’t know if you have encountered the following scenario.
>  # When other platforms integrate with Livy over JDBC, they need to save the 
> session ID and application ID.
>  # When connection creation fails, the cause may be that the Spark session 
> failed to start, and they need a way to check why it failed.
> I thought it would be useful to provide an action statement that returns the 
> fields listed below.
> {code:java}
> // code placeholder
> DESC LIVY SESSION; {code}
> ||field||
> |id|
> |appId|
> |state|
> |logs|
>  
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (LIVY-971) Support to get session variables when using JDBC to connect to Livy thrift server.

2023-02-24 Thread Jianzhen Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/LIVY-971?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jianzhen Wu updated LIVY-971:
-
Description: 
I don’t know if you have encountered the following scenario.
 # When other platforms integrate with Livy over JDBC, they need to save the 
session ID and application ID.
 # When connection creation fails, the cause may be that the Spark session 
failed to start, and they need a way to check why it failed.

I thought it would be useful to provide an action statement that returns the 
fields listed below.

 
{code:java}
// code placeholder
DESC LIVY SESSION; {code}
 

 

 
||field||
|id|
|appId|
|state|
|stderr|
|stdout|
|yarnDiagnostics|

 

 

  was:
I don’t know if you have encountered the following scenario.
 # When other platforms integrate with Livy over JDBC, they need to save the 
session ID and application ID.
 # When connection creation fails, the cause may be that the Spark session 
failed to start, and they need a way to check why it failed.


> Support to get session variables when using JDBC to connect to Livy thrift 
> server.
> --
>
> Key: LIVY-971
> URL: https://issues.apache.org/jira/browse/LIVY-971
> Project: Livy
>  Issue Type: New Feature
>  Components: Thriftserver
>Reporter: Jianzhen Wu
>Priority: Major
> Fix For: 0.9.0
>
>
> I don’t know if you have encountered the following scenario.
>  # When other platforms integrate with Livy over JDBC, they need to save the 
> session ID and application ID.
>  # When connection creation fails, the cause may be that the Spark session 
> failed to start, and they need a way to check why it failed.
> I thought it would be useful to provide an action statement that returns the 
> fields listed below.
>  
> {code:java}
> // code placeholder
> DESC LIVY SESSION; {code}
>  
>  
>  
> ||field||
> |id|
> |appId|
> |state|
> |stderr|
> |stdout|
> |yarnDiagnostics|
>  
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (LIVY-971) Support to get session variables when using JDBC to connect to Livy thrift server.

2023-02-24 Thread Jianzhen Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/LIVY-971?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jianzhen Wu updated LIVY-971:
-
Description: 
I don’t know if you have encountered the following scenario.
 # When other platforms integrate with Livy over JDBC, they need to save the 
session ID and application ID.
 # When connection creation fails, the cause may be that the Spark session 
failed to start, and they need a way to check why it failed.

I thought it would be useful to provide an action statement that returns the 
fields listed below.
{code:java}
// code placeholder
DESC LIVY SESSION; {code}
||field||
|id|
|appId|
|state|
|stderr|
|stdout|
|yarnDiagnostics|

 

 

  was:
I don’t know if you have encountered the following scenario.
 # When other platforms integrate with Livy over JDBC, they need to save the 
session ID and application ID.
 # When connection creation fails, the cause may be that the Spark session 
failed to start, and they need a way to check why it failed.

I thought it would be useful to provide an action statement that returns the 
fields listed below.

 
{code:java}
// code placeholder
DESC LIVY SESSION; {code}
 

 

 
||field||
|id|
|appId|
|state|
|stderr|
|stdout|
|yarnDiagnostics|

 

 


> Support to get session variables when using JDBC to connect to Livy thrift 
> server.
> --
>
> Key: LIVY-971
> URL: https://issues.apache.org/jira/browse/LIVY-971
> Project: Livy
>  Issue Type: New Feature
>  Components: Thriftserver
>Reporter: Jianzhen Wu
>Priority: Major
> Fix For: 0.9.0
>
>
> I don’t know if you have encountered the following scenario.
>  # When other platforms integrate with Livy over JDBC, they need to save the 
> session ID and application ID.
>  # When connection creation fails, the cause may be that the Spark session 
> failed to start, and they need a way to check why it failed.
> I thought it would be useful to provide an action statement that returns the 
> fields listed below.
> {code:java}
> // code placeholder
> DESC LIVY SESSION; {code}
> ||field||
> |id|
> |appId|
> |state|
> |stderr|
> |stdout|
> |yarnDiagnostics|
>  
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (LIVY-971) Support to get session variables when using JDBC to connect to Livy thrift server.

2023-02-24 Thread Jianzhen Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/LIVY-971?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jianzhen Wu updated LIVY-971:
-
Description: 
I don’t know if you have encountered the following scenario.
 # When other platforms integrate with Livy over JDBC, they need to save the 
session ID and application ID.
 # When connection creation fails, the cause may be that the Spark session 
failed to start, and they need a way to check why it failed.

  was:
I don’t know if you have encountered the following scenario.
 # When other platforms integrate with Livy over JDBC, you need to save the 
session ID and application ID.
 # When connection creation fails, it may be that the Spark session failed to 
start; at this time, you need to check why the Spark session failed to start.


> Support to get session variables when using JDBC to connect to Livy thrift 
> server.
> --
>
> Key: LIVY-971
> URL: https://issues.apache.org/jira/browse/LIVY-971
> Project: Livy
>  Issue Type: New Feature
>  Components: Thriftserver
>Reporter: Jianzhen Wu
>Priority: Major
> Fix For: 0.9.0
>
>
> I don’t know if you have encountered the following scenario.
>  # When other platforms integrate with Livy over JDBC, they need to save the 
> session ID and application ID.
>  # When connection creation fails, the cause may be that the Spark session 
> failed to start, and they need a way to check why it failed.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Assigned] (LIVY-971) Support to get session variables when using JDBC to connect to Livy thrift server.

2023-02-24 Thread Jianzhen Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/LIVY-971?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jianzhen Wu reassigned LIVY-971:


Assignee: (was: Jianzhen Wu)

> Support to get session variables when using JDBC to connect to Livy thrift 
> server.
> --
>
> Key: LIVY-971
> URL: https://issues.apache.org/jira/browse/LIVY-971
> Project: Livy
>  Issue Type: New Feature
>  Components: Thriftserver
>Reporter: Jianzhen Wu
>Priority: Major
> Fix For: 0.9.0
>
>
> I don’t know if you have encountered the following scenario.
>  # When other platforms integrate with Livy over JDBC, you need to save the 
> session ID and application ID.
>  # When connection creation fails, it may be that the Spark session failed to 
> start; at this time, you need to check why the Spark session failed to start.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (LIVY-971) Support to get session variables when using JDBC to connect to Livy thrift server.

2023-02-24 Thread Jianzhen Wu (Jira)
Jianzhen Wu created LIVY-971:


 Summary: Support to get session variables when using JDBC to 
connect to Livy thrift server.
 Key: LIVY-971
 URL: https://issues.apache.org/jira/browse/LIVY-971
 Project: Livy
  Issue Type: New Feature
  Components: Thriftserver
Reporter: Jianzhen Wu
Assignee: Jianzhen Wu
 Fix For: 0.9.0


I don’t know if you have encountered the following scenario.
 # When other platforms integrate with Livy over JDBC, you need to save the 
session ID and application ID.
 # When connection creation fails, it may be that the Spark session failed to 
start; at this time, you need to check why the Spark session failed to start.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (LIVY-899) The state of interactive session is always idle when using thrift protocol.

2022-12-17 Thread Jianzhen Wu (Jira)


[ 
https://issues.apache.org/jira/browse/LIVY-899?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17648966#comment-17648966
 ] 

Jianzhen Wu commented on LIVY-899:
--

Hi Marco, thank you for your reply. I opened a PR against the Livy project with 
the fix and a unit test. Could you review it?

> The state of interactive session is always idle when using thrift protocol.
> ---
>
> Key: LIVY-899
> URL: https://issues.apache.org/jira/browse/LIVY-899
> Project: Livy
>  Issue Type: Bug
>  Components: Thriftserver
>Affects Versions: 0.8.0
>Reporter: Jianzhen Wu
>Priority: Major
> Attachments: image-2022-11-15-20-42-01-214.png, 
> image-2022-11-15-20-42-25-472.png, image-2022-11-15-20-48-54-653.png
>
>
> In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
> stateChangedCallback function when handling a ReplJobRequest.
> !image-2022-11-15-20-42-01-214.png|width=242,height=200!
> But in the Thrift service, RSCDriver does not broadcast ReplState to the 
> RSCClient when handling a JobRequest.
> !image-2022-11-15-20-42-25-472.png|width=241,height=199!
> I would like to discuss with you how to resolve this issue.
> Here's what I think.
> !image-2022-11-15-20-48-54-653.png|width=228,height=136!
>  
>  
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (LIVY-899) The state of interactive session is always idle when using thrift protocol.

2022-11-15 Thread Jianzhen Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/LIVY-899?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jianzhen Wu updated LIVY-899:
-
Description: 
In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
stateChangedCallback function when handling a ReplJobRequest.
!image-2022-11-15-20-42-01-214.png|width=242,height=200!
But in the Thrift service, RSCDriver does not broadcast ReplState to the 
RSCClient when handling a JobRequest.
!image-2022-11-15-20-42-25-472.png|width=241,height=199!

I would like to discuss with you how to resolve this issue.

Here's what I think.

!image-2022-11-15-20-48-54-653.png|width=228,height=136!

 

 

 

  was:
In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
stateChangedCallback function when handling a ReplJobRequest.
!image-2022-11-15-20-42-01-214.png|width=242,height=200!
But in the Thrift service, RSCDriver does not broadcast ReplState to the 
RSCClient when handling a JobRequest.
!image-2022-11-15-20-42-25-472.png|width=241,height=199!

I would like to discuss with you how to resolve this issue.

 

Here's what I think.

 

!image-2022-11-15-20-48-54-653.png|width=228,height=136!

 

 

 


> The state of interactive session is always idle when using thrift protocol.
> ---
>
> Key: LIVY-899
> URL: https://issues.apache.org/jira/browse/LIVY-899
> Project: Livy
>  Issue Type: Bug
>  Components: Thriftserver
>Affects Versions: 0.8.0
>Reporter: Jianzhen Wu
>Assignee: Marco Gaido
>Priority: Major
> Attachments: image-2022-11-15-20-42-01-214.png, 
> image-2022-11-15-20-42-25-472.png, image-2022-11-15-20-48-54-653.png
>
>
> In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
> stateChangedCallback function when handling a ReplJobRequest.
> !image-2022-11-15-20-42-01-214.png|width=242,height=200!
> But in the Thrift service, RSCDriver does not broadcast ReplState to the 
> RSCClient when handling a JobRequest.
> !image-2022-11-15-20-42-25-472.png|width=241,height=199!
> I would like to discuss with you how to resolve this issue.
> Here's what I think.
> !image-2022-11-15-20-48-54-653.png|width=228,height=136!
>  
>  
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (LIVY-899) The state of interactive session is always idle when using thrift protocol.

2022-11-15 Thread Jianzhen Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/LIVY-899?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jianzhen Wu updated LIVY-899:
-
Description: 
In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
stateChangedCallback function when handling a ReplJobRequest.
!image-2022-11-15-20-42-01-214.png|width=242,height=200!
But in the Thrift service, RSCDriver does not broadcast ReplState to the 
RSCClient when handling a JobRequest.
!image-2022-11-15-20-42-25-472.png|width=241,height=199!

I would like to discuss with you how to resolve this issue.

 

Here's what I think.

 

!image-2022-11-15-20-48-54-653.png|width=228,height=136!

 

 

 

  was:
In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
stateChangedCallback function when handling a ReplJobRequest.
!image-2022-11-15-20-42-01-214.png|width=242,height=200!
But in the Thrift service, RSCDriver does not broadcast ReplState to the 
RSCClient when handling a JobRequest.
!image-2022-11-15-20-42-25-472.png|width=241,height=199!


> The state of interactive session is always idle when using thrift protocol.
> ---
>
> Key: LIVY-899
> URL: https://issues.apache.org/jira/browse/LIVY-899
> Project: Livy
>  Issue Type: Bug
>  Components: Thriftserver
>Affects Versions: 0.8.0
>Reporter: Jianzhen Wu
>Assignee: Marco Gaido
>Priority: Major
> Attachments: image-2022-11-15-20-42-01-214.png, 
> image-2022-11-15-20-42-25-472.png, image-2022-11-15-20-48-54-653.png
>
>
> In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
> stateChangedCallback function when handling a ReplJobRequest.
> !image-2022-11-15-20-42-01-214.png|width=242,height=200!
> But in the Thrift service, RSCDriver does not broadcast ReplState to the 
> RSCClient when handling a JobRequest.
> !image-2022-11-15-20-42-25-472.png|width=241,height=199!
> I would like to discuss with you how to resolve this issue.
>  
> Here's what I think.
>  
> !image-2022-11-15-20-48-54-653.png|width=228,height=136!
>  
>  
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (LIVY-899) The state of interactive session is always idle when using thrift protocol.

2022-11-15 Thread Jianzhen Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/LIVY-899?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jianzhen Wu updated LIVY-899:
-
Attachment: image-2022-11-15-20-48-54-653.png

> The state of interactive session is always idle when using thrift protocol.
> ---
>
> Key: LIVY-899
> URL: https://issues.apache.org/jira/browse/LIVY-899
> Project: Livy
>  Issue Type: Bug
>  Components: Thriftserver
>Affects Versions: 0.8.0
>Reporter: Jianzhen Wu
>Assignee: Marco Gaido
>Priority: Major
> Attachments: image-2022-11-15-20-42-01-214.png, 
> image-2022-11-15-20-42-25-472.png, image-2022-11-15-20-48-54-653.png
>
>
> In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
> stateChangedCallback function when handling a ReplJobRequest.
> !image-2022-11-15-20-42-01-214.png|width=242,height=200!
> But in the Thrift service, RSCDriver does not broadcast ReplState to the 
> RSCClient when handling a JobRequest.
> !image-2022-11-15-20-42-25-472.png|width=241,height=199!



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (LIVY-899) The state of interactive session is always idle when using thrift protocol.

2022-11-15 Thread Jianzhen Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/LIVY-899?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jianzhen Wu updated LIVY-899:
-
Attachment: image-2022-11-15-20-42-25-472.png

> The state of interactive session is always idle when using thrift protocol.
> ---
>
> Key: LIVY-899
> URL: https://issues.apache.org/jira/browse/LIVY-899
> Project: Livy
>  Issue Type: Bug
>  Components: Thriftserver
>Affects Versions: 0.8.0
>Reporter: Jianzhen Wu
>Assignee: Marco Gaido
>Priority: Major
> Attachments: image-2022-11-15-20-42-01-214.png, 
> image-2022-11-15-20-42-25-472.png
>
>
> In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
> stateChangedCallback function when handling a ReplJobRequest.
> !image-2022-11-15-20-34-49-176.png!
> But in the Thrift service, RSCDriver does not broadcast ReplState to the 
> RSCClient when handling a JobRequest.
> !image-2022-11-15-20-36-27-004.png!



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (LIVY-899) The state of interactive session is always idle when using thrift protocol.

2022-11-15 Thread Jianzhen Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/LIVY-899?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jianzhen Wu updated LIVY-899:
-
Description: 
In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
stateChangedCallback function when handling a ReplJobRequest.
!image-2022-11-15-20-42-01-214.png!
But in the Thrift service, RSCDriver does not broadcast ReplState to the 
RSCClient when handling a JobRequest.
!image-2022-11-15-20-42-25-472.png!

  was:
In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
stateChangedCallback function when handling a ReplJobRequest.
!image-2022-11-15-20-34-49-176.png!
But in the Thrift service, RSCDriver does not broadcast ReplState to the 
RSCClient when handling a JobRequest.
!image-2022-11-15-20-36-27-004.png!


> The state of interactive session is always idle when using thrift protocol.
> ---
>
> Key: LIVY-899
> URL: https://issues.apache.org/jira/browse/LIVY-899
> Project: Livy
>  Issue Type: Bug
>  Components: Thriftserver
>Affects Versions: 0.8.0
>Reporter: Jianzhen Wu
>Assignee: Marco Gaido
>Priority: Major
> Attachments: image-2022-11-15-20-42-01-214.png, 
> image-2022-11-15-20-42-25-472.png
>
>
> In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
> stateChangedCallback function when handling a ReplJobRequest.
> !image-2022-11-15-20-42-01-214.png!
> But in the Thrift service, RSCDriver does not broadcast ReplState to the 
> RSCClient when handling a JobRequest.
> !image-2022-11-15-20-42-25-472.png!



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (LIVY-899) The state of interactive session is always idle when using thrift protocol.

2022-11-15 Thread Jianzhen Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/LIVY-899?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jianzhen Wu updated LIVY-899:
-
Attachment: image-2022-11-15-20-42-01-214.png

> The state of interactive session is always idle when using thrift protocol.
> ---
>
> Key: LIVY-899
> URL: https://issues.apache.org/jira/browse/LIVY-899
> Project: Livy
>  Issue Type: Bug
>  Components: Thriftserver
>Affects Versions: 0.8.0
>Reporter: Jianzhen Wu
>Assignee: Marco Gaido
>Priority: Major
> Attachments: image-2022-11-15-20-42-01-214.png, 
> image-2022-11-15-20-42-25-472.png
>
>
> In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
> stateChangedCallback function when handling a ReplJobRequest.
> !image-2022-11-15-20-34-49-176.png!
> But in the Thrift service, RSCDriver does not broadcast ReplState to the 
> RSCClient when handling a JobRequest.
> !image-2022-11-15-20-36-27-004.png!



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (LIVY-899) The state of interactive session is always idle when using thrift protocol.

2022-11-15 Thread Jianzhen Wu (Jira)


 [ 
https://issues.apache.org/jira/browse/LIVY-899?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jianzhen Wu updated LIVY-899:
-
Description: 
In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
stateChangedCallback function when handling a ReplJobRequest.
!image-2022-11-15-20-42-01-214.png|width=242,height=200!
But in the Thrift service, RSCDriver does not broadcast ReplState to the 
RSCClient when handling a JobRequest.
!image-2022-11-15-20-42-25-472.png|width=241,height=199!

  was:
In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
stateChangedCallback function when handling a ReplJobRequest.
!image-2022-11-15-20-42-01-214.png!
But in the Thrift service, RSCDriver does not broadcast ReplState to the 
RSCClient when handling a JobRequest.
!image-2022-11-15-20-42-25-472.png!


> The state of interactive session is always idle when using thrift protocol.
> ---
>
> Key: LIVY-899
> URL: https://issues.apache.org/jira/browse/LIVY-899
> Project: Livy
>  Issue Type: Bug
>  Components: Thriftserver
>Affects Versions: 0.8.0
>Reporter: Jianzhen Wu
>Assignee: Marco Gaido
>Priority: Major
> Attachments: image-2022-11-15-20-42-01-214.png, 
> image-2022-11-15-20-42-25-472.png
>
>
> In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
> stateChangedCallback function when handling a ReplJobRequest.
> !image-2022-11-15-20-42-01-214.png|width=242,height=200!
> But in the Thrift service, RSCDriver does not broadcast ReplState to the 
> RSCClient when handling a JobRequest.
> !image-2022-11-15-20-42-25-472.png|width=241,height=199!



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (LIVY-899) The state of interactive session is always idle when using thrift protocol.

2022-11-15 Thread Jianzhen Wu (Jira)
Jianzhen Wu created LIVY-899:


 Summary: The state of interactive session is always idle when 
using thrift protocol.
 Key: LIVY-899
 URL: https://issues.apache.org/jira/browse/LIVY-899
 Project: Livy
  Issue Type: Bug
  Components: Thriftserver
Affects Versions: 0.8.0
Reporter: Jianzhen Wu
Assignee: Marco Gaido


In the REST API, ReplDriver broadcasts ReplState to the RSCClient in its 
stateChangedCallback function when handling a ReplJobRequest.
!image-2022-11-15-20-34-49-176.png!
But in the Thrift service, RSCDriver does not broadcast ReplState to the 
RSCClient when handling a JobRequest.
!image-2022-11-15-20-36-27-004.png!
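
As a rough illustration of the missing piece, here is a heavily hedged Java 
sketch of the callback pattern described above; every name in it is an 
illustrative stand-in, and the actual code is in the attached screenshots.
{code:java}
// The REST path: whenever the REPL state changes, the driver pushes the new
// state to the client over the RSC channel.
interface ClientChannel {
    void send(Object message); // stand-in for the RSC RPC channel
}

class StateBroadcastSketch {
    private final ClientChannel client;

    StateBroadcastSketch(ClientChannel client) {
        this.client = client;
    }

    // Analogue of stateChangedCallback. Without an equivalent call on the
    // Thrift path, the client never learns that the session went busy, so it
    // keeps reporting "idle".
    void stateChanged(String newState) {
        client.send("replState:" + newState);
    }
}
{code}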



--
This message was sent by Atlassian Jira
(v8.20.10#820010)