Re: scala test is unable to initialize spark context.

2017-04-06 Thread Jeff Zhang
This seems to be caused by your log4j configuration file:

Caused by: java.lang.IllegalStateException: FileNamePattern [-.log]
does not contain a valid date format specifier
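That exception is thrown when a time-based rolling policy's FileNamePattern has no %d{...} date conversion in it. As a point of comparison, here is a minimal log4j.xml sketch of a pattern that does carry a date specifier — the appender name, log path, and the use of the log4j-extras TimeBasedRollingPolicy are assumptions for illustration, not taken from your config:

```xml
<appender name="file" class="org.apache.log4j.rolling.RollingFileAppender">
  <rollingPolicy class="org.apache.log4j.rolling.TimeBasedRollingPolicy">
    <!-- %d{yyyy-MM-dd} is the date format specifier the exception says is missing -->
    <param name="FileNamePattern" value="logs/app-%d{yyyy-MM-dd}.log"/>
  </rollingPolicy>
  <layout class="org.apache.log4j.PatternLayout">
    <param name="ConversionPattern" value="%d %-5p %c - %m%n"/>
  </layout>
</appender>
```

If the pattern in your file is built from a property placeholder, also check that the placeholder actually resolves — an unresolved variable can collapse the pattern down to something like "-.log".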




On Thu, Apr 6, 2017 at 4:03 PM, PSwain wrote:

> Hi All ,
>
> I am just trying to use scala test for testing a small spark code . But
> spark context is not getting initialized , while I am running test file .
> [...]

scala test is unable to initialize spark context.

2017-04-06 Thread PSwain
Hi All,

   I am just trying to use ScalaTest to test a small piece of Spark code, but
the Spark context is not getting initialized while I am running the test file.
I have included the code, the POM, and the exception in this mail. Please help
me understand what mistake I am making that prevents the Spark context from
being initialized.

Code:-

import org.apache.log4j.LogManager
import org.apache.spark.SharedSparkContext
import org.scalatest.FunSuite
import org.apache.spark.{SparkContext, SparkConf}

/**
 * Created by PSwain on 4/5/2017.
 */
class Test extends FunSuite with SharedSparkContext {

  test("test initializing spark context") {
    val list = List(1, 2, 3, 4)
    val rdd = sc.parallelize(list)
    assert(list.length === rdd.count())
  }
}
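For comparison, a suite can also manage its own local SparkContext instead of mixing in SharedSparkContext (which ships only in the spark-core test-jar). The sketch below uses ScalaTest's BeforeAndAfterAll; the class name and master/app-name settings are illustrative assumptions:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FunSuite}

// Sketch: create and tear down a local SparkContext explicitly,
// removing the dependency on the spark-core test-jar.
class LocalContextTest extends FunSuite with BeforeAndAfterAll {

  @transient private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("test")
    sc = new SparkContext(conf)
  }

  override def afterAll(): Unit = {
    if (sc != null) sc.stop()
  }

  test("parallelize a small list") {
    val list = List(1, 2, 3, 4)
    assert(sc.parallelize(list).count() === list.length)
  }
}
```

If this variant fails with the same ExceptionInInitializerError, the problem is in logging configuration on the test classpath rather than in how the context is obtained.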

POM File:-



<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
         http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>tesing.loging</groupId>
    <artifactId>logging</artifactId>
    <version>1.0-SNAPSHOT</version>

    <repositories>
        <repository>
            <id>central</id>
            <name>central</name>
            <url>http://repo1.maven.org/maven/</url>
        </repository>
    </repositories>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.6.0</version>
            <type>test-jar</type>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.6.0</version>
        </dependency>

        <dependency>
            <groupId>org.scalatest</groupId>
            <artifactId>scalatest_2.10</artifactId>
            <version>2.2.6</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>1.5.0</version>
            <scope>provided</scope>
        </dependency>

        <dependency>
            <groupId>com.databricks</groupId>
            <artifactId>spark-csv_2.10</artifactId>
            <version>1.3.0</version>
        </dependency>

        <dependency>
            <groupId>com.rxcorp.bdf.logging</groupId>
            <artifactId>loggingframework</artifactId>
            <version>1.0-SNAPSHOT</version>
        </dependency>

        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.1.6</version>
            <scope>provided</scope>
        </dependency>

        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.10.5</version>
            <scope>compile</scope>
            <optional>true</optional>
        </dependency>

        <dependency>
            <groupId>org.scalatest</groupId>
            <artifactId>scalatest</artifactId>
            <version>1.4.RC2</version>
        </dependency>

        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
        </dependency>

        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-compiler</artifactId>
            <version>2.10.5</version>
            <scope>compile</scope>
            <optional>true</optional>
        </dependency>
    </dependencies>

    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <plugins>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.2.1</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.2.0</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <sourceDir>src/main/scala</sourceDir>
                    <jvmArgs>
                        <jvmArg>-Xms64m</jvmArg>
                        <jvmArg>-Xmx1024m</jvmArg>
                    </jvmArgs>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
Exception:-



An exception or error caused a run to abort.

java.lang.ExceptionInInitializerError
 at org.apache.spark.Logging$class.initializeLogging(Logging.scala:121)
 at org.apache.spark.Logging$class.initializeIfNecessary(Logging.scala:106)
 at org.apache.spark.Logging$class.log(Logging.scala:50)
 at org.apache.spark.SparkContext.log(SparkContext.scala:79)
 at org.apache.spark.Logging$class.logInfo(Logging.scala:58)
 at org.apache.spark.SparkContext.logInfo(SparkContext.scala:79)
 at org.apache.spark.SparkContext.<init>(SparkContext.scala:211)
 at org.apache.spark.SparkContext.<init>(SparkContext.scala:147)
 at org.apache.spark.SharedSparkContext$class.beforeAll(SharedSparkContext.scala:33)
 at Test.beforeAll(Test.scala:10)
 at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
 at Test.beforeAll(Test.scala:10)
 at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
 at Test.run(Test.scala:10)
 at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
 at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2563)
 at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2557)
 at scala.collection.immutable.List.foreach(List.scala:318)
 at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:2557)
 at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1044)
 at