Re: Oracle Table not resolved [Spark 2.1.1]

2017-08-28 Thread Naga G
I'm not able to find the database name.
Is ora the database in the URL below?
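
If it helps, here is a minimal sketch of how the read is usually wired up with the Oracle thin driver; the main difference from your snippet is the URL, which typically takes the form jdbc:oracle:thin:@host:port:SID. I'm assuming ora is the SID and that INCIDENTS sits in the connecting user's schema; host, port, and credentials are carried over from your code.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class OracleJdbcReadSketch {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("oracle-jdbc-read-sketch")
                .master("local[*]")
                .getOrCreate();

        // Thin-driver URL form: jdbc:oracle:thin:@host:port:SID ("ora" assumed to be the SID here).
        String url = "jdbc:oracle:thin:@localhost:1521:ora";

        Dataset<Row> jdbcDF = spark.read()
                .format("jdbc")
                .option("driver", "oracle.jdbc.driver.OracleDriver")
                .option("url", url)
                // If the table lives in another schema, it may need to be qualified, e.g. "USER1.INCIDENTS".
                .option("dbtable", "INCIDENTS")
                .option("user", "user1")
                .option("password", "pass1")
                .load();

        System.out.println(jdbcDF.count());
        spark.stop();
    }
}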

Sent from Naga iPad

> On Aug 28, 2017, at 4:06 AM, Imran Rajjad  wrote:
> 
> Hello,
> 
> I am trying to retrieve an Oracle table into a Dataset using the following code:
> 
> String url = "jdbc:oracle@localhost:1521:ora";
> Dataset<Row> jdbcDF = spark.read()
>     .format("jdbc")
>     .option("driver", "oracle.jdbc.driver.OracleDriver")
>     .option("url", url)
>     .option("dbtable", "INCIDENTS")
>     .option("user", "user1")
>     .option("password", "pass1")
>     .load();
>
> System.out.println(jdbcDF.count());
> 
> below is the stack trace
> 
> java.lang.NullPointerException
>  at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:72)
>  at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:113)
>  at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:45)
>  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:330)
>  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152)
>  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
>  at com.elogic.hazelcast_test.JDBCTest.test(JDBCTest.java:56)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498)
>  at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
>  at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>  at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
>  at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>  at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
>  at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
>  at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
>  at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
>  at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
>  at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
>  at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
>  at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
>  at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>  at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>  at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
>  at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
>  at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>  at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
>  at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:678)
>  at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
>  at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
> 
> Apparently the connection is made, but the table is not being detected. Any ideas 
> what's wrong with the code?
> 
> regards,
> Imran
> -- 
> I.R




Re: Spark submit OutOfMemory Error in local mode

2017-08-22 Thread Naga G
Increase the number of cores, since you're trying to run multiple threads.
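
In local mode the thread count comes from the master URL: local[N] runs N worker threads in a single JVM, local[*] uses one per available core. A minimal sketch, with local[4] as an example value (not taken from the original post):

import org.apache.spark.sql.SparkSession;

public class LocalCoresSketch {
    public static void main(String[] args) {
        // local[N] = single JVM with N worker threads; local[*] = one thread per core.
        SparkSession spark = SparkSession.builder()
                .appName("local-cores-sketch")
                .master("local[4]") // example value
                .getOrCreate();

        // Quick check of how much parallelism the local master actually provides.
        System.out.println("defaultParallelism = " + spark.sparkContext().defaultParallelism());

        spark.stop();
    }
}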

Sent from Naga iPad

> On Aug 22, 2017, at 3:26 PM, "u...@moosheimer.com" wrote:
> 
> Since you didn't post any concrete information, it's hard to give you advice.
> 
> Try to increase the executor memory (spark.executor.memory).
> If that doesn't help, give all the experts in the community a chance to help 
> you by adding more details such as the version, log files, and source code.
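> 
> In local mode there is no separate executor JVM, so increasing memory usually means increasing the driver heap at launch (e.g. spark-submit --driver-memory 4g); spark.executor.memory on its own may not help there. A minimal sketch for checking what the running session actually got, with 4g and local[4] as example values (not taken from this thread):
> 
> import org.apache.spark.sql.SparkSession;
> 
> public class LocalModeMemorySketch {
>     public static void main(String[] args) {
>         // Launched e.g. via: spark-submit --master local[4] --driver-memory 4g ... (example values)
>         SparkSession spark = SparkSession.builder()
>                 .appName("local-mode-memory-sketch")
>                 .getOrCreate();
> 
>         // In local mode the single JVM's heap is what the job actually gets to work with.
>         System.out.println("driver max heap (MB) = " + Runtime.getRuntime().maxMemory() / (1024 * 1024));
>         System.out.println("spark.executor.memory = " + spark.conf().get("spark.executor.memory", "<not set>"));
> 
>         spark.stop();
>     }
> }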
> 
> Kind regards / best regards
> Kay-Uwe Moosheimer
> 
>> On 22.08.2017 at 20:16, shitijkuls wrote:
>> 
>> Any help here will be appreciated.
>> 
>> 
>> 
> 
> 
