[ https://issues.apache.org/jira/browse/SPARK-9678?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aseem Bansal updated SPARK-9678:
--------------------------------
    Description: 
I was going through the quick start for Spark 1.4.1 at 
http://spark.apache.org/docs/latest/quick-start.html. I am using PySpark, and 
the exact distribution I am using is spark-1.4.1-bin-hadoop2.4.

The quick start has textFile = sc.textFile("README.md"). I ran that, and the 
following text appeared on the command line:


15/08/06 10:37:03 INFO MemoryStore: ensureFreeSpace(143840) called with curMem=0, maxMem=278302556
15/08/06 10:37:03 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 140.5 KB, free 265.3 MB)
15/08/06 10:37:03 INFO MemoryStore: ensureFreeSpace(12633) called with curMem=143840, maxMem=278302556
15/08/06 10:37:03 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 12.3 KB, free 265.3 MB)
15/08/06 10:37:03 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:53311 (size: 12.3 KB, free: 265.4 MB)
15/08/06 10:37:03 INFO SparkContext: Created broadcast 0 from textFile at NativeMethodAccessorImpl.java:-2
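
For reference, this is the quick-start step exactly as I ran it in the shell (a minimal sketch; it assumes pyspark was started from the root of the spark-1.4.1-bin-hadoop2.4 directory, so that the relative path README.md resolves):

    # sc is the SparkContext that the pyspark shell creates automatically.
    textFile = sc.textFile("README.md")
    # sc.textFile is lazy, so nothing is read at this point; the INFO lines
    # above presumably come from the broadcast Spark creates for the Hadoop
    # job configuration rather than from reading README.md itself.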


I saw that there was an address (localhost:53311) in the log output above.

I tried connecting to it via Google Chrome and got the following exception:

>>> 15/08/06 10:37:30 WARN TransportChannelHandler: Exception in connection from /127.0.0.1:54056
io.netty.handler.codec.TooLongFrameException: Adjusted frame length exceeds 2147483647: 5135603447292250196 - discarded
        at io.netty.handler.codec.LengthFieldBasedFrameDecoder.fail(LengthFieldBasedFrameDecoder.java:501)
        at io.netty.handler.codec.LengthFieldBasedFrameDecoder.failIfNecessary(LengthFieldBasedFrameDecoder.java:477)
        at io.netty.handler.codec.LengthFieldBasedFrameDecoder.decode(LengthFieldBasedFrameDecoder.java:403)
        at io.netty.handler.codec.LengthFieldBasedFrameDecoder.decode(LengthFieldBasedFrameDecoder.java:343)
        at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:249)
        at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:149)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
        at java.lang.Thread.run(Thread.java:745)
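
The same warning can presumably be reproduced without a browser by sending a plain HTTP request to that port (a hypothetical sketch, not part of the quick start; 53311 is simply the port the BlockManager happened to bind to in this run). Judging from the LengthFieldBasedFrameDecoder frames in the trace, the first few bytes of the HTTP request line are presumably being read as a frame length, which would explain the absurdly large value reported:

    import socket

    # Connect to the port shown in the BlockManagerInfo log line and send a
    # minimal HTTP request, which is roughly what a browser does on connect.
    conn = socket.create_connection(("localhost", 53311), timeout=5)
    conn.sendall(b"GET / HTTP/1.1\r\nHost: localhost\r\n\r\n")
    conn.close()
    # The TooLongFrameException warning should then appear in the driver's
    # console, just as it did when connecting with Chrome.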



> Exception while going through quick start
> -----------------------------------------
>
>                 Key: SPARK-9678
>                 URL: https://issues.apache.org/jira/browse/SPARK-9678
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.4.1
>         Environment: Ubuntu 14.04
>            Reporter: Aseem Bansal
>



