[jira] [Commented] (SPARK-21063) Spark return an empty result from remote hadoop cluster

2018-07-25 Thread nick (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-21063?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16555403#comment-16555403
 ] 

nick commented on SPARK-21063:
--

[~paulstaab]

It does work when we both register the dialect and set "fetchsize", but how can 
we get all of the data without setting "fetchsize"?
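For anyone landing here, the workaround discussed above can be sketched as follows. This is a minimal sketch, not a definitive fix: it assumes the Spark 2.x JdbcDialect API, reuses the connection URL from the report below, and uses backtick quoting because Hive's JDBC driver does not accept the double-quoted identifiers the generic dialect emits. Also, if I understand the JDBC semantics correctly, "fetchsize" is only the number of rows pulled per round trip, not a cap on the total rows returned, so setting it should still yield all the data.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}

// Hive's JDBC driver rejects double-quoted identifiers, so quote
// column names with backticks instead of the generic `"..."` style.
object HiveDialect extends JdbcDialect {
  override def canHandle(url: String): Boolean = url.startsWith("jdbc:hive2")
  override def quoteIdentifier(colName: String): String = s"`$colName`"
}

// Register before reading, so the JDBC source picks this dialect up.
JdbcDialects.registerDialect(HiveDialect)

val spark = SparkSession.builder
  .appName("RemoteSparkTest")
  .master("local")
  .getOrCreate()

val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:hive2://remote.hive.local:1/default")
  .option("dbtable", "test_table")
  .option("driver", "org.apache.hive.jdbc.HiveDriver")
  .option("fetchsize", "1000") // rows per round trip, not a row limit
  .load()
```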

> Spark return an empty result from remote hadoop cluster
> ---
>
> Key: SPARK-21063
> URL: https://issues.apache.org/jira/browse/SPARK-21063
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core, SQL
>Affects Versions: 2.1.0, 2.1.1
>Reporter: Peter Bykov
>Priority: Major
>
> Spark returns an empty result when querying a remote Hadoop cluster.
> All firewall settings have been removed.
> Querying over JDBC works properly using the hive-jdbc driver from version 1.1.1.
> Code snippet is:
> {code:java}
> val spark = SparkSession.builder
> .appName("RemoteSparkTest")
> .master("local")
> .getOrCreate()
> val df = spark.read
>   .option("url", "jdbc:hive2://remote.hive.local:1/default")
>   .option("user", "user")
>   .option("password", "pass")
>   .option("dbtable", "test_table")
>   .option("driver", "org.apache.hive.jdbc.HiveDriver")
>   .format("jdbc")
>   .load()
>  
> df.show()
> {code}
> Result:
> {noformat}
> +---+
> |test_table.test_col|
> +---+
> +---+
> {noformat}
> All manipulations like: 
> {code:java}
> df.select("*").show()
> {code}
> return an empty result too.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-19767) API Doc pages for Streaming with Kafka 0.10 not current

2017-03-01 Thread Nick (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-19767?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15890200#comment-15890200
 ] 

Nick commented on SPARK-19767:
--

Perhaps in the interim there could be a note on the 0.10 Integration page about 
the missing API doc and how to build it.  That would save people from fishing 
around for doc that's not there.  




> API Doc pages for Streaming with Kafka 0.10 not current
> ---
>
> Key: SPARK-19767
> URL: https://issues.apache.org/jira/browse/SPARK-19767
> Project: Spark
>  Issue Type: Bug
>  Components: Structured Streaming
>Affects Versions: 2.1.0
>Reporter: Nick
>Priority: Minor
>
> The API docs linked from the Spark Kafka 0.10 Integration page are not 
> current.  For instance, on the page
>https://spark.apache.org/docs/latest/streaming-kafka-0-10-integration.html
> the code examples show the new API (i.e. class ConsumerStrategies).  However, 
> following the links
> API Docs --> (Scala | Java)
> lead to API pages that do not have class ConsumerStrategies.  The API doc 
> package names also have {code}streaming.kafka{code} as opposed to 
> {code}streaming.kafka10{code} 
> as in the code examples on streaming-kafka-0-10-integration.html.  






[jira] [Comment Edited] (SPARK-19767) API Doc pages for Streaming with Kafka 0.10 not current

2017-02-28 Thread Nick (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-19767?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15888932#comment-15888932
 ] 

Nick edited comment on SPARK-19767 at 2/28/17 9:58 PM:
---

{{SKIP_API=1 jekyll build}}  yields the error

{code}
  Dependency Error: Yikes! It looks like you don't have 
/home/nafshartous/Projects/spark/docs/_plugins/include_example.rb or one of its 
dependencies installed. In order to use Jekyll as currently configured, you'll 
need to install this gem. The full error message from Ruby is: 'cannot load 
such file -- pygments' If you run into trouble, you can find helpful resources 
at https://jekyllrb.com/help/! 
jekyll 3.4.0 | Error:  
/home/nafshartous/Projects/spark/docs/_plugins/include_example.rb
{code}

The file {{_plugins/include_example.rb}} is present.
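For what it's worth, the "cannot load such file -- pygments" line suggests the missing piece is the pygments.rb gem in the Ruby environment, not include_example.rb itself. A guess at the fix (gem name inferred from the error message, not verified against the docs build instructions):

```shell
# Install the syntax-highlighting gem the docs plugin chain tries to load,
# then retry the build that skips API doc generation.
gem install pygments.rb
SKIP_API=1 jekyll build
```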



was (Author: nafshartous):
`SKIP_API=1 jekyll build`  yields the error

{code}
  Dependency Error: Yikes! It looks like you don't have 
/home/nafshartous/Projects/spark/docs/_plugins/include_example.rb or one of its 
dependencies installed. In order to use Jekyll as currently configured, you'll 
need to install this gem. The full error message from Ruby is: 'cannot load 
such file -- pygments' If you run into trouble, you can find helpful resources 
at https://jekyllrb.com/help/! 
jekyll 3.4.0 | Error:  
/home/nafshartous/Projects/spark/docs/_plugins/include_example.rb
{code}

The file `_plugins/include_example.rb` is present.








[jira] [Commented] (SPARK-19767) API Doc pages for Streaming with Kafka 0.10 not current

2017-02-28 Thread Nick (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-19767?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15888932#comment-15888932
 ] 

Nick commented on SPARK-19767:
--

`SKIP_API=1 jekyll build`  yields the error

{code}
  Dependency Error: Yikes! It looks like you don't have 
/home/nafshartous/Projects/spark/docs/_plugins/include_example.rb or one of its 
dependencies installed. In order to use Jekyll as currently configured, you'll 
need to install this gem. The full error message from Ruby is: 'cannot load 
such file -- pygments' If you run into trouble, you can find helpful resources 
at https://jekyllrb.com/help/! 
jekyll 3.4.0 | Error:  
/home/nafshartous/Projects/spark/docs/_plugins/include_example.rb
{code}

The file `_plugins/include_example.rb` is present.








[jira] [Commented] (SPARK-19767) API Doc pages for Streaming with Kafka 0.10 not current

2017-02-28 Thread Nick (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-19767?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15888572#comment-15888572
 ] 

Nick commented on SPARK-19767:
--

Yes, the code examples on the Integration page are current.  The issue with the 
linked API pages looks like more than incompleteness, because the package name 
{code}org.apache.spark.streaming.kafka{code} should be 
{code}org.apache.spark.streaming.kafka10{code}.  

I'd be happy to help.  I tried to build the doc by running "jekyll build" from the 
docs dir and got the error below.  Is this target broken, or is it my env?  
{code}
[info] Note: Custom tags that could override future standard tags:  @todo, 
@note, @tparam, @constructor, @groupname, @example, @group. To avoid potential 
overrides, use at least one period character (.) in custom tag names.
[info] Note: Custom tags that were not seen:  @todo, @tparam, @constructor, 
@groupname, @group
[info] 1 error
[info] 100 warnings
[error] (spark/javaunidoc:doc) javadoc returned nonzero exit code
[error] Total time: 198 s, completed Feb 28, 2017 11:56:20 AM
jekyll 3.4.0 | Error:  Unidoc generation failed
{code}







[jira] [Commented] (SPARK-19767) API Doc pages for Streaming with Kafka 0.10 not current

2017-02-28 Thread Nick (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-19767?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15888205#comment-15888205
 ] 

Nick commented on SPARK-19767:
--

I'm looking for a code example showing how to use the Java API to start 
streaming from specific offsets.  Thanks for any pointers to code or doc.  
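In case it helps while the API docs are missing, starting from specific offsets can be sketched like this. This is a rough, untested sketch in Scala (the Java API is analogous via JavaStreamingContext); the broker address, topic name, group id, and offsets are made up, and `ssc` is assumed to be an existing StreamingContext. It relies on the ConsumerStrategies.Subscribe overload that accepts a map of starting offsets per TopicPartition:

```scala
import org.apache.kafka.common.TopicPartition
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.kafka010._

val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "broker:9092",          // placeholder
  "key.deserializer"  -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id" -> "example-group"                  // placeholder
)

// Explicit starting offset for each partition we care about.
val offsets = Map(
  new TopicPartition("my-topic", 0) -> 42L,      // hypothetical offsets
  new TopicPartition("my-topic", 1) -> 100L
)

// Partitions listed in `offsets` begin at those offsets; any other
// subscribed partitions fall back to the consumer's auto.offset.reset.
val stream = KafkaUtils.createDirectStream[String, String](
  ssc,
  LocationStrategies.PreferConsistent,
  ConsumerStrategies.Subscribe[String, String](Seq("my-topic"), kafkaParams, offsets)
)
```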







[jira] [Updated] (SPARK-19767) API Doc pages for Streaming with Kafka 0.10 not current

2017-02-28 Thread Nick (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-19767?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Nick updated SPARK-19767:
-
Description: 
The API docs linked from the Spark Kafka 0.10 Integration page are not current. 
 For instance, on the page

   https://spark.apache.org/docs/latest/streaming-kafka-0-10-integration.html

the code examples show the new API (i.e. class ConsumerStrategies).  However, 
following the links

API Docs --> (Scala | Java)

lead to API pages that do not have class ConsumerStrategies.  The API doc 
package names also have {code}streaming.kafka{code} as opposed to 
{code}streaming.kafka10{code} 
as in the code examples on streaming-kafka-0-10-integration.html.  


  was:
The API docs linked from the Spark Kafka 0.10 Integration page are not current. 
 For instance, on the page

   https://spark.apache.org/docs/latest/streaming-kafka-0-10-integration.html

the code examples show the new API (i.e. class ConsumerStrategies).  However, 
following the links

API Docs --> (Scala | Java)

lead to API pages that do not have class ConsumerStrategies) .  The API doc 
package names  also have {code}streaming.kafka{code} as opposed to 
{code}streaming.kafka10{code}.








[jira] [Updated] (SPARK-19767) API Doc pages for Streaming with Kafka 0.10 not current

2017-02-28 Thread Nick (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-19767?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Nick updated SPARK-19767:
-
Description: 
The API docs linked from the Spark Kafka 0.10 Integration page are not current. 
 For instance, on the page

   https://spark.apache.org/docs/latest/streaming-kafka-0-10-integration.html

the code examples show the new API (i.e. class ConsumerStrategies).  However, 
following the links

API Docs --> (Scala | Java)

lead to API pages that do not have class ConsumerStrategies.  The API doc 
package names also have {code}streaming.kafka{code} as opposed to 
{code}streaming.kafka10{code}.

  was:
The API docs linked from the Spark Kafka 0.10 Integration page are not current. 
 For instance, on the page

   https://spark.apache.org/docs/latest/streaming-kafka-0-10-integration.html

the code examples show the new API (i.e. class ConsumerStrategies).  However, 
following the links

API Docs --> (Scala | Java)

leads to API pages that do not have class ConsumerStrategies) .  The API doc 
package names  also have {code}streaming.kafka{code} as opposed to 
{code}streaming.kafka10{code}.







[jira] [Created] (SPARK-19767) API Doc pages for Streaming with Kafka 0.10 not current

2017-02-28 Thread Nick (JIRA)
Nick created SPARK-19767:


 Summary: API Doc pages for Streaming with Kafka 0.10 not current
 Key: SPARK-19767
 URL: https://issues.apache.org/jira/browse/SPARK-19767
 Project: Spark
  Issue Type: Bug
  Components: Structured Streaming
Affects Versions: 2.1.0
Reporter: Nick
Priority: Minor


The API docs linked from the Spark Kafka 0.10 Integration page are not current. 
 For instance, on the page

   https://spark.apache.org/docs/latest/streaming-kafka-0-10-integration.html

the code examples show the new API (i.e. class ConsumerStrategies).  However, 
following the links

API Docs --> (Scala | Java)

leads to API pages that do not have class ConsumerStrategies) .  The API doc 
package names  also have {code}streaming.kafka{code} as opposed to 
{code}streaming.kafka10{code}.






[jira] [Updated] (SPARK-6500) Scala code example in README.md does not compile

2015-03-24 Thread Nick (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-6500?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Nick updated SPARK-6500:

Description: 
I just downloaded and installed Spark 1.3.

Inside README.md there is this example
  
{code}
And run the following command, which should also return 1000:
 sc.parallelize(range(1000)).count()
{code}

which does not compile 

{code}
console:22: error: not found: value range
{code}

This example does work

{code}
  sc.parallelize(1 to 1000).count()
{code}

I'd be happy to create a pull request if necessary.  
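The root cause is that Python's built-in {{range}} has no bare equivalent in the Scala shell; the Scala counterpart is a Range built with {{to}} or {{until}}. A small illustration (plain Scala, no Spark needed):

```scala
// Scala has no bare `range` function in scope; ranges are built with
// `to` (end-inclusive) or `until` (end-exclusive).
val inclusive = 1 to 1000     // 1, 2, ..., 1000
val exclusive = 0 until 1000  // 0, 1, ..., 999 (matches Python's range(1000))

// Both contain 1000 elements, so either works for the README's example.
println(inclusive.size)  // 1000
println(exclusive.size)  // 1000
```

With a SparkContext in scope, {{sc.parallelize(1 to 1000).count()}} then returns 1000 as the README intends.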

  was:
I just downloaded and installed Spark 1.3.

Inside README.md there is this example
  
{code}
And run the following command, which should also return 1000:
 sc.parallelize(range(1000)).count()
{code}

which does not compile 

{code}
console:22: error: not found: value range
{code}

This example does work

{code}
  sc.parallelize(1 to 1000).count()
{code}








[jira] [Created] (SPARK-6500) Scala code example in README.md does not compile

2015-03-24 Thread Nick (JIRA)
Nick created SPARK-6500:
---

 Summary: Scala code example in README.md does not compile
 Key: SPARK-6500
 URL: https://issues.apache.org/jira/browse/SPARK-6500
 Project: Spark
  Issue Type: Bug
Reporter: Nick
Priority: Trivial








[jira] [Closed] (SPARK-6500) Scala code example in README.md does not compile

2015-03-24 Thread Nick (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-6500?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Nick closed SPARK-6500.
---



