[jira] [Commented] (SPARK-23213) SparkR:::textFile(sc1,"/opt/test333") can not work on spark2.2.1

2018-01-26 Thread Hyukjin Kwon (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23213?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16341316#comment-16341316
 ] 

Hyukjin Kwon commented on SPARK-23213:
--

Google what triple colons (:::) in R mean ... Questions should go to the mailing list.
Please see https://spark.apache.org/community.html. I think you can get a
better answer there if this is a question.

> SparkR:::textFile(sc1,"/opt/test333") can not work on spark2.2.1 
> -
>
> Key: SPARK-23213
> URL: https://issues.apache.org/jira/browse/SPARK-23213
> Project: Spark
>  Issue Type: Bug
>  Components: SparkR
>Affects Versions: 2.2.1
> Environment: JAVA_HOME=/opt/jdk1.8.0_161/
> spark 2.2.1
> R version 3.4.3 (2017-11-30) – "Kite-Eating Tree"
>Reporter: Tony 
>Priority: Major
>
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 2.2.1
>       /_/
>  
>  
>  SparkSession available as 'spark'.
> > 
> sc1 <- sparkR.session(appName = "wordcount")
> lines <- SparkR:::textFile(sc1,"/opt/test333")
> 18/01/25 02:33:37 ERROR RBackendHandler: defaultParallelism on 1 failed
> java.lang.IllegalArgumentException: invalid method defaultParallelism for 
> object 1
> at 
> org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:193)
> at 
> org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:108)
> at 
> org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:40)
> at 
> io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
> at 
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
> at 
> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
> at 
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
> at 
> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
> at 
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
> at 
> io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
> at 
> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
> at 
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
> at 
> io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
> at 
> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
> at 
> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
> at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
> at 
> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
> at 
> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
> at 
> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
> at 
> io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
> at java.lang.Thread.run(Thread.java:748)






[jira] [Commented] (SPARK-23213) SparkR:::textFile(sc1,"/opt/test333") can not work on spark2.2.1

2018-01-26 Thread Tony (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23213?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16341286#comment-16341286
 ] 

Tony  commented on SPARK-23213:
---

Hi [~felixcheung], another question here: what is the mechanism that makes an API
public or private in the SparkR source code?

e.g. I found:

> toRDD

Error: object 'toRDD' not found

> insertInto

nonstandardGenericFunction for "insertInto" defined from package "SparkR"

But these two methods are both defined in *DataFrame.R*.

What makes the difference?
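
(A quick way to see the difference from an R session, assuming SparkR is installed; getNamespaceExports is base R and lists a package's exported, i.e. public, names:)

# Only names listed in the package NAMESPACE are exported (public API).
"insertInto" %in% getNamespaceExports("SparkR")   # TRUE  -> public, callable as insertInto / SparkR::insertInto
"toRDD" %in% getNamespaceExports("SparkR")        # FALSE -> internal, reachable only via SparkR:::toRDD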

 





[jira] [Commented] (SPARK-23213) SparkR:::textFile(sc1,"/opt/test333") can not work on spark2.2.1

2018-01-26 Thread Tony (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23213?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16340803#comment-16340803
 ] 

Tony  commented on SPARK-23213:
---

[~felixcheung]

I used the SparkR::: approach to call the methods in RDD.R to implement a word-count
app, and it works normally.

So have we clarified somewhere in the *official doc* that we don't support RDDs in
SparkR? If so, please share the related link with me.

Also, please share the reason why we still keep the *RDD.R* file in the source code
if we do not support RDDs. Is there any other usage?

Thanks a lot.
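
(For reference, the same word count can be written entirely against the public DataFrame API; a rough sketch, reusing the path and app name from this report and assuming words are separated by single spaces:)

library(SparkR)
sparkR.session(appName = "wordcount")
lines <- read.text("/opt/test333")                               # SparkDataFrame with one string column 'value'
words <- selectExpr(lines, "explode(split(value, ' ')) AS word") # one row per word
counts <- count(groupBy(words, "word"))                          # word -> count
head(arrange(counts, desc(counts$count)))                        # most frequent words first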

  






[jira] [Commented] (SPARK-23213) SparkR:::textFile(sc1,"/opt/test333") can not work on spark2.2.1

2018-01-25 Thread Felix Cheung (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23213?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16339748#comment-16339748
 ] 

Felix Cheung commented on SPARK-23213:
--

To clarify, we don't support RDDs in R.

Anything you access via SparkR:::, including unionRDD, is not supported.








[jira] [Commented] (SPARK-23213) SparkR:::textFile(sc1,"/opt/test333") can not work on spark2.2.1

2018-01-25 Thread Tony (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23213?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16339603#comment-16339603
 ] 

Tony  commented on SPARK-23213:
---

[~felixcheung] 

The only way to using method in  *RDD.R*  *is*  called by SparkR::: (e.g 
SparkR:::unionRDD) in a R application ,right? Otherwise, if directly call the 
unionRDD in a sparkR application  will  not find the method in *RDD.R* when run 
the R application.

Error: object 'unionRDD' not found

So still not got your point why we could  support SparkR:::unionRDD(*RDD.R*) 
,but do not support the textFile(context.R) anymore. Using RDD in R is not a 
common case now? 

Again need your help on  how to convert a dataframe to rdd in sparkR.

Thanks a lot.


[jira] [Commented] (SPARK-23213) SparkR:::textFile(sc1,"/opt/test333") can not work on spark2.2.1

2018-01-25 Thread Felix Cheung (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23213?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16339559#comment-16339559
 ] 

Felix Cheung commented on SPARK-23213:
--

If you have any specifics on what you need, we should have an alternative API you
can use that is not RDD based.

Anything you access with SparkR::: (three colons) is accessing methods inside the
namespace that are not exported, so they are not public API.
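
(In plain R terms, a small illustration of the two operators; :: only reaches exported names, while ::: bypasses the export list:)

SparkR::read.text    # exported, public API: safe to rely on
SparkR:::textFile    # unexported, internal: works today but carries no compatibility guarantee
# With library(SparkR) attached, an unexported name is simply not visible:
# > textFile
# Error: object 'textFile' not found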








[jira] [Commented] (SPARK-23213) SparkR:::textFile(sc1,"/opt/test333") can not work on spark2.2.1

2018-01-25 Thread Tony (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23213?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16339551#comment-16339551
 ] 

Tony  commented on SPARK-23213:
---

[~felixcheung] Thanks for your reply. Some more questions need your further help.

You said "RDD (all RDD APIs) are not supported *public API*". Do you mean the APIs
defined in context.R (e.g. textFile) are *public API*? How do we define *public API* here?

Can R RDDs only use the APIs defined in *RDD.R* through SparkR:::, e.g.
SparkR:::unionRDD, SparkR:::reduceByKey?

I am porting a Python application that works on RDDs rather than DataFrames, so I
also want to get an RDD from a text file in the SparkR environment.

Could you please give me a sample of "You can convert DataFrame into RDD"? I have not
found a proper conversion method so far.
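
(For what it's worth, a minimal sketch of the conversion that exists today, using the unexported toRDD mentioned earlier in this thread; since it is internal it may change or disappear without notice:)

df  <- read.text("/opt/test333")   # supported public API: returns a SparkDataFrame
rdd <- SparkR:::toRDD(df)          # internal, unsupported: converts the SparkDataFrame to an RDD of R objects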

 

 


[jira] [Commented] (SPARK-23213) SparkR:::textFile(sc1,"/opt/test333") can not work on spark2.2.1

2018-01-25 Thread Felix Cheung (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23213?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16339522#comment-16339522
 ] 

Felix Cheung commented on SPARK-23213:
--

You can convert a DataFrame into an RDD.
But again, textFile and RDD (all RDD APIs) are not supported public API, sorry.

It would help if you could elaborate on what you are trying to do and what you
might need.








[jira] [Commented] (SPARK-23213) SparkR:::textFile(sc1,"/opt/test333") can not work on spark2.2.1

2018-01-25 Thread Tony (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23213?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16339466#comment-16339466
 ] 

Tony  commented on SPARK-23213:
---

[~felixcheung] Please help take a look.







[jira] [Commented] (SPARK-23213) SparkR:::textFile(sc1,"/opt/test333") can not work on spark2.2.1

2018-01-25 Thread Tony (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23213?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16339461#comment-16339461
 ] 

Tony  commented on SPARK-23213:
---

Hi [~hyukjin.kwon], I tried read.text before, but read.text "loads text files and
returns a SparkDataFrame".

Here I want to load text files and return just an RDD; the only method I found is
SparkR:::textFile. If we can no longer use SparkR:::textFile, how can we load text
files and return just an RDD? Is there any other method that can do this?

Reopening this ticket to track it. If I have misunderstood anything, please kindly
let me know.

Thanks.





[jira] [Commented] (SPARK-23213) SparkR:::textFile(sc1,"/opt/test333") can not work on spark2.2.1

2018-01-25 Thread Hyukjin Kwon (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23213?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16339166#comment-16339166
 ] 

Hyukjin Kwon commented on SPARK-23213:
--

I think we should rather leave this as Won't Fix. I don't think we should keep
compatibility or make any guarantee on private methods.







[jira] [Commented] (SPARK-23213) SparkR:::textFile(sc1,"/opt/test333") can not work on spark2.2.1

2018-01-25 Thread Hyukjin Kwon (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23213?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16339160#comment-16339160
 ] 

Hyukjin Kwon commented on SPARK-23213:
--

Target Version is usually reserved for committers. I just took it out.







[jira] [Commented] (SPARK-23213) SparkR:::textFile(sc1,"/opt/test333") can not work on spark2.2.1

2018-01-25 Thread Felix Cheung (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23213?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16338987#comment-16338987
 ] 

Felix Cheung commented on SPARK-23213:
--

Try read.text instead?

[http://spark.apache.org/docs/latest/api/R/read.text.html]

SparkR:::textFile is an internal method. Is there a reason you need it?
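
(A minimal usage sketch of the suggested replacement, reusing the path from this report:)

df <- read.text("/opt/test333")   # SparkDataFrame with a single string column named 'value'
printSchema(df)
head(df)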



