[jira] [Created] (ZEPPELIN-1909) spark session with hive support can't be created correctly

2017-01-05 Thread lichenglin (JIRA)
lichenglin created ZEPPELIN-1909:


 Summary: spark session with hive support can't be created correctly
 Key: ZEPPELIN-1909
 URL: https://issues.apache.org/jira/browse/ZEPPELIN-1909
 Project: Zeppelin
  Issue Type: Bug
  Components: Interpreters
Affects Versions: 0.7.0
Reporter: lichenglin
Priority: Blocker
 Fix For: 0.7.0


{code}
if (useHiveContext()) {
  if (hiveClassesArePresent()) {
    Utils.invokeMethod(builder, "enableHiveSupport");
    sparkSession = Utils.invokeMethod(builder, "getOrCreate");
    logger.info("Created Spark session with Hive support");
  } else {
    Utils.invokeMethod(builder, "config",
        new Class[]{ String.class, String.class},
        new Object[]{ "spark.sql.catalogImplementation", "in-memory"});
    sparkSession = Utils.invokeMethod(builder, "getOrCreate");
    logger.info("Created Spark session with Hive support");
  }
}
{code}

{code}
/**
 * See org.apache.spark.sql.SparkSession.hiveClassesArePresent
 */
private boolean hiveClassesArePresent() {
  try {
    this.getClass().forName("org.apache.spark.sql.hive.HiveSessionState");
    this.getClass().forName("org.apache.spark.sql.hive.HiveSharedState");
    this.getClass().forName("org.apache.hadoop.hive.conf.HiveConf");
    return true;
  } catch (ClassNotFoundException | NoClassDefFoundError e) {
    return false;
  }
}
{code}

since "org.apache.spark.sql.hive.HiveSharedState" has been removed from 
spark2.1.x 
So hiveClassesArePresent will always return false.
This will cause spark session with hive support would be created with 
"spark.sql.catalogImplementation=in-memory"

Maybe just removing
{code}
this.getClass().forName("org.apache.spark.sql.hive.HiveSharedState");
{code}
would be OK?
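
For illustration, a minimal sketch of the method with that check dropped, assuming org.apache.spark.sql.hive.HiveSessionState and org.apache.hadoop.hive.conf.HiveConf are still resolvable on Spark 2.1.x; this is only a suggestion, not a tested patch:

{code}
// Sketch of the suggested change: probe only the classes that exist in both
// Spark 2.0.x and 2.1.x. Class.forName is used directly here; it behaves the
// same as the this.getClass().forName(...) form in the current code.
private boolean hiveClassesArePresent() {
  try {
    Class.forName("org.apache.spark.sql.hive.HiveSessionState");
    Class.forName("org.apache.hadoop.hive.conf.HiveConf");
    return true;
  } catch (ClassNotFoundException | NoClassDefFoundError e) {
    return false;
  }
}
{code}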





[jira] [Created] (ZEPPELIN-1543) update jackson's version to 2.6.5

2016-10-13 Thread lichenglin (JIRA)
lichenglin created ZEPPELIN-1543:


 Summary: update jackson's version to 2.6.5
 Key: ZEPPELIN-1543
 URL: https://issues.apache.org/jira/browse/ZEPPELIN-1543
 Project: Zeppelin
  Issue Type: Wish
  Components: build
Affects Versions: 0.6.1
Reporter: lichenglin


Spark now uses Jackson 2.6.5.
Zeppelin's Jackson version causes Spark workers to throw an exception.





[jira] [Created] (ZEPPELIN-1531) Allow Multi sql in one paragraph

2016-10-10 Thread lichenglin (JIRA)
lichenglin created ZEPPELIN-1531:


 Summary: Allow Multi sql in one paragraph
 Key: ZEPPELIN-1531
 URL: https://issues.apache.org/jira/browse/ZEPPELIN-1531
 Project: Zeppelin
  Issue Type: Bug
  Components: Interpreters
Affects Versions: 0.6.1
Reporter: lichenglin


%sql should support multiple SQL statements in one paragraph, like this:

{code}
%sql
show tables;
show tables
{code}
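
A minimal sketch (purely illustrative, not Zeppelin's actual interpreter code; the class name MultiSqlSplitter is made up for this example) of what the paragraph handling could do: split the paragraph text on semicolons and run each non-empty statement in turn.

{code}
import java.util.ArrayList;
import java.util.List;

public class MultiSqlSplitter {
  // Split a paragraph on ';' and drop empty fragments.
  // Note: a real implementation would also have to ignore semicolons inside
  // string literals and comments; this sketch does not.
  public static List<String> split(String paragraph) {
    List<String> statements = new ArrayList<>();
    for (String stmt : paragraph.split(";")) {
      String trimmed = stmt.trim();
      if (!trimmed.isEmpty()) {
        statements.add(trimmed);
      }
    }
    return statements;
  }

  public static void main(String[] args) {
    // Prints [show tables, show tables] for the paragraph from this issue.
    System.out.println(split("show tables;\nshow tables"));
  }
}
{code}

Each returned statement could then be handed to the existing single-statement execution path one by one.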








[jira] [Created] (ZEPPELIN-1272) Zeppelin spark interpreter throws java.lang.StackOverflowError after running for a while

2016-08-02 Thread lichenglin (JIRA)
lichenglin created ZEPPELIN-1272:


 Summary: Zeppelin spark interpreter throws java.lang.StackOverflowError after running for a while
 Key: ZEPPELIN-1272
 URL: https://issues.apache.org/jira/browse/ZEPPELIN-1272
 Project: Zeppelin
  Issue Type: Bug
  Components: Interpreters
Affects Versions: 0.6.0
Reporter: lichenglin


I'm using Zeppelin to run some Spark jobs with crontab.

But in the last few days I have found that Zeppelin's Spark interpreter shuts down once or more almost every day.

Here is the log:
{code}
java.lang.StackOverflowError
  at scala.reflect.internal.Types$TypeMap.mapOver(Types.scala:4183)
  at scala.reflect.internal.Types$AsSeenFromMap.apply(Types.scala:4638)
  at scala.reflect.internal.Types$TypeMap.mapOver(Types.scala:4197)
  at scala.reflect.internal.Types$AsSeenFromMap.apply(Types.scala:4638)
  at scala.reflect.internal.Types$Type.asSeenFrom(Types.scala:754)
  at scala.reflect.internal.Types$Type.computeMemberType(Types.scala:788)
  at scala.reflect.internal.Symbols$MethodSymbol.typeAsMemberOf(Symbols.scala:2655)
  at scala.reflect.internal.Types$Type.memberType(Types.scala:779)
  at scala.reflect.internal.Types$class.defineUnderlyingOfSingleType(Types.scala:1534)
  at scala.reflect.internal.SymbolTable.defineUnderlyingOfSingleType(SymbolTable.scala:13)
  at scala.reflect.internal.Types$SingleType.underlying(Types.scala:1486)
  at scala.reflect.internal.Types$SingletonType.widen(Types.scala:1340)
  at scala.reflect.internal.Types$AsSeenFromMap.toPrefix$1(Types.scala:4541)
  at scala.reflect.internal.Types$AsSeenFromMap.apply(Types.scala:4556)
  at scala.reflect.internal.Types$TypeMap.mapOver(Types.scala:4183)
  at scala.reflect.internal.Types$AsSeenFromMap.apply(Types.scala:4638)
  at scala.reflect.internal.Types$TypeMap.mapOver(Types.scala:4197)
  at scala.reflect.internal.Types$AsSeenFromMap.apply(Types.scala:4638)
  at scala.reflect.internal.Types$Type.asSeenFrom(Types.scala:754)
  at scala.reflect.internal.Types$Type.computeMemberType(Types.scala:788)
  at scala.reflect.internal.Symbols$MethodSymbol.typeAsMemberOf(Symbols.scala:2655)
  at scala.reflect.internal.Types$Type.memberType(Types.scala:779)
  at scala.reflect.internal.Types$class.defineUnderlyingOfSingleType(Types.scala:1534)
  at scala.reflect.internal.SymbolTable.defineUnderlyingOfSingleType(SymbolTable.scala:13)
  at scala.reflect.internal.Types$SingleType.underlying(Types.scala:1486)
  at scala.reflect.internal.Types$SingletonType.widen(Types.scala:1340)
  at scala.reflect.internal.Types$AsSeenFromMap.toPrefix$1(Types.scala:4541)
  at scala.reflect.internal.Types$AsSeenFromMap.apply(Types.scala:4556)
  at scala.reflect.internal.Types$TypeMap.mapOver(Types.scala:4183)
  at scala.reflect.internal.Types$AsSeenFromMap.apply(Types.scala:4638)
  at scala.reflect.internal.Types$TypeMap.mapOver(Types.scala:4197)
  at scala.reflect.internal.Types$AsSeenFromMap.apply(Types.scala:4638)
  at scala.reflect.internal.Types$Type.asSeenFrom(Types.scala:754)
  at scala.reflect.internal.Types$Type.computeMemberType(Types.scala:788)
  at scala.reflect.internal.Symbols$MethodSymbol.typeAsMemberOf(Symbols.scala:2655)
  at scala.reflect.internal.Types$Type.memberType(Types.scala:779)
  at scala.reflect.internal.Types$class.defineUnderlyingOfSingleType(Types.scala:1534)
  at scala.reflect.internal.SymbolTable.defineUnderlyingOfSingleType(SymbolTable.scala:13)
  at scala.reflect.internal.Types$SingleType.underlying(Types.scala:1486)
  at scala.reflect.internal.Types$SingletonType.widen(Types.scala:1340)
  at scala.reflect.internal.Types$AsSeenFromMap.toPrefix$1(Types.scala:4541)
  at scala.reflect.internal.Types$AsSeenFromMap.apply(Types.scala:4556)
  at scala.reflect.internal.Types$TypeMap.mapOver(Types.scala:4183)
{code}

It is a very long log that appears to repeat in a loop.

I have set up the environment like this:
{code}
export JAVA_HOME=/home/hadoop/jdk
export MASTER=spark://namenode:7077 
export ZEPPELIN_PORT=10001
export SPARK_HOME=/home/hadoop/spark-1.6.2-bin-hadoop2.6
export SPARK_SUBMIT_OPTIONS="--driver-memory 2g --executor-memory 5g --driver-class-path