[ 
https://issues.apache.org/jira/browse/SPARK-23213?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16339551#comment-16339551
 ] 

Tony  commented on SPARK-23213:
-------------------------------

[~felixcheung] Thanks for your reply. I have a few more questions and would appreciate your further help.

 " RDD (all RDD APIs) are not supported *public API"*,  you mean the API 
defined in the context.R(e.g textFile) are *public API.*   How do we define the 
*public API* here*?*

Can *R* code only use the RDD APIs defined in *RDD.R* through the SparkR::: prefix, e.g. SparkR:::unionRDD and SparkR:::reduceByKey?
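For context, a minimal sketch of what such ::: calls might look like. These are internal, unsupported functions, so the names used here (getSparkContext, parallelize, reduceByKey, collectRDD) are assumptions based on SparkR's private API and may change between releases:

```r
library(SparkR)
sparkR.session(appName = "rdd-sketch")

# Internal helper (assumed) to get the actual SparkContext; note that
# sparkR.session() returns a session object, which is why passing its
# result directly to SparkR:::textFile() fails.
sc <- SparkR:::getSparkContext()

# Build a pair RDD and reduce by key via private APIs (unsupported).
pairs  <- SparkR:::parallelize(sc, list(list("a", 1L), list("a", 2L), list("b", 1L)))
counts <- SparkR:::reduceByKey(pairs, `+`, 2L)
SparkR:::collectRDD(counts)
```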

I am porting a Python application that works on RDDs rather than DataFrames,

so I also want to get an RDD from a text file in the SparkR environment.

Could you please give me an example of "You can convert DataFrame into RDD"? I have not found a proper conversion method so far.
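As a hedged sketch of one possible conversion path, assuming SparkR:::toRDD is available as an internal (unsupported) function rather than public API:

```r
library(SparkR)
sparkR.session(appName = "df-to-rdd-sketch")

# Read the file through the public DataFrame API first.
df <- read.text("/opt/test333")

# Internal, unsupported conversion (assumed); each RDD element is one row.
rdd <- SparkR:::toRDD(df)
head(SparkR:::collectRDD(rdd))
```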

> SparkR:::textFile(sc1,"/opt/test333") can not work on spark2.2.1 
> -----------------------------------------------------------------
>
>                 Key: SPARK-23213
>                 URL: https://issues.apache.org/jira/browse/SPARK-23213
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 2.2.1
>         Environment: JAVA_HOME=/opt/jdk1.8.0_161/
> spark 2.2.1
> R version 3.4.3 (2017-11-30) – "Kite-Eating Tree"
>            Reporter: Tony 
>            Priority: Major
>
> Welcome to
>     ____              __ 
>    / __/__  ___ _____/ /__ 
>   _\ \/ _ \/ _ `/ __/  '_/ 
>  /___/ .__/\_,_/_/ /_/\_\   version  2.2.1 
>     /_/ 
>  
>  
>  SparkSession available as 'spark'.
> > 
> sc1 <- sparkR.session(appName = "wordcount")
> lines <- SparkR:::textFile(sc1,"/opt/test333")
> 18/01/25 02:33:37 ERROR RBackendHandler: defaultParallelism on 1 failed
> java.lang.IllegalArgumentException: invalid method defaultParallelism for 
> object 1
> at 
> org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:193)
> at 
> org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:108)
> at 
> org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:40)
> at 
> io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
> at 
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
> at 
> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
> at 
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
> at 
> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
> at 
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
> at 
> io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
> at 
> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
> at 
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
> at 
> io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
> at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
> at 
> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
> at 
> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
> at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
> at 
> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
> at 
> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
> at 
> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
> at 
> io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
> at java.lang.Thread.run(Thread.java:748)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
