[ https://issues.apache.org/jira/browse/HDFS-1115?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13013836#comment-13013836 ]

Scott C. Frase commented on HDFS-1115:
--------------------------------------

Same problem here on openSUSE 11.4 x86_64.  I had this Java stack installed:
linux-z6tw:/> java -version
java version "1.6.0_20"
OpenJDK Runtime Environment (IcedTea6 1.9.7) (suse-1.2.1-x86_64)
OpenJDK 64-Bit Server VM (build 19.0-b09, mixed mode)

I removed the java-1_6_0-openjdk* packages; zypper pulled in java-1_6_0-sun as the replacement:
linux-z6tw:~> sudo zypper remove java-1_6_0-openjdk*
Loading repository data...
Reading installed packages...
Resolving package dependencies...

The following NEW package is going to be installed:
  java-1_6_0-sun 

The following packages are going to be REMOVED:
  java-1_6_0-openjdk java-1_6_0-openjdk-plugin 

1 new package to install, 2 to remove.
Overall download size: 20.8 MiB. After the operation, additional 3.1 MiB will be used.
Continue? [y/n/p/?] (y): y
Retrieving package java-1_6_0-sun-1.6.0.u23-3.3.x86_64 (1/1), 20.8 MiB (88.6 MiB unpacked)
Retrieving: java-1_6_0-sun-1.6.0.u23-3.3.x86_64.rpm [done (1.6 MiB/s)]
Removing java-1_6_0-openjdk-plugin-1.6.0.0_b20.1.9.7-1.2.1 [done]
Removing java-1_6_0-openjdk-1.6.0.0_b20.1.9.7-1.2.1 [done]

And Hadoop is now running MapReduce jobs!
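For anyone digging into why the JVM matters here: the stack trace in this issue bottoms out in sun.nio.ch.SocketAdaptor.getSendBufferSize(), i.e. the runtime failing a getsockopt(SO_SNDBUF) read on a connected NIO channel, which DFSClient.createBlockOutputStream performs when sizing its output stream. A minimal standalone probe for that same code path (class and method names below are my own, not Hadoop's):

```java
import java.net.InetSocketAddress;
import java.nio.channels.ServerSocketChannel;
import java.nio.channels.SocketChannel;

public class SendBufProbe {

    // Reads SO_SNDBUF through the NIO socket adaptor -- the same
    // getsockopt path DFSClient.createBlockOutputStream exercises.
    // On an affected JVM build this is where
    // java.net.SocketException: Protocol not available surfaces.
    static int probeSendBuf() throws Exception {
        ServerSocketChannel server = ServerSocketChannel.open();
        server.socket().bind(new InetSocketAddress("127.0.0.1", 0));
        SocketChannel ch = SocketChannel.open(server.socket().getLocalSocketAddress());
        try {
            return ch.socket().getSendBufferSize();
        } finally {
            ch.close();
            server.close();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("SO_SNDBUF = " + probeSendBuf() + " bytes");
    }
}
```

On a healthy JVM this just prints the socket's send-buffer size; on the IcedTea builds discussed above I'd expect it to throw the same SocketException, which makes it a quick way to check a candidate JVM before pointing Hadoop at it.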

> SocketException: Protocol not available for some JVMs
> -----------------------------------------------------
>
>                 Key: HDFS-1115
>                 URL: https://issues.apache.org/jira/browse/HDFS-1115
>             Project: Hadoop HDFS
>          Issue Type: Bug
>    Affects Versions: 0.20.2
>         Environment: OpenSuse 11.2 running as a Virtual Machine on Windows Vista
>            Reporter: manas
>
> Here, input is a folder containing all the .xml files from ./conf.
> Then trying the command:
> ./bin/hadoop fs -copyFromLocal input input
> The following message is displayed: 
> {noformat}
> INFO hdfs.DFSClient: Exception in createBlockOutputStream java.net.SocketException: Operation not supported
> INFO hdfs.DFSClient: Abandoning block blk_-1884214035513073759_1010
> INFO hdfs.DFSClient: Exception in createBlockOutputStream java.net.SocketException: Protocol not available
> INFO hdfs.DFSClient: Abandoning block blk_5533397873275401028_1010
> INFO hdfs.DFSClient: Exception in createBlockOutputStream java.net.SocketException: Protocol not available
> INFO hdfs.DFSClient: Abandoning block blk_-237603871573204731_1011
> INFO hdfs.DFSClient: Exception in createBlockOutputStream java.net.SocketException: Protocol not available
> INFO hdfs.DFSClient: Abandoning block blk_-8668593183126057334_1011
> WARN hdfs.DFSClient: DataStreamer Exception: java.io.IOException: Unable to create new block.
>         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2845)
>         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
>         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)
> WARN hdfs.DFSClient: Error Recovery for block blk_-8668593183126057334_1011 bad datanode[0] nodes == null
> WARN hdfs.DFSClient: Could not get block locations. Source file "/user/max/input/core-site.xml" - Aborting...
> copyFromLocal: Protocol not available
> ERROR hdfs.DFSClient: Exception closing file /user/max/input/core-site.xml : java.net.SocketException: Protocol not available
> java.net.SocketException: Protocol not available
>         at sun.nio.ch.Net.getIntOption0(Native Method)
>         at sun.nio.ch.Net.getIntOption(Net.java:178)
>         at sun.nio.ch.SocketChannelImpl$1.getInt(SocketChannelImpl.java:419)
>         at sun.nio.ch.SocketOptsImpl.getInt(SocketOptsImpl.java:60)
>         at sun.nio.ch.SocketOptsImpl.sendBufferSize(SocketOptsImpl.java:156)
>         at sun.nio.ch.SocketOptsImpl$IP$TCP.sendBufferSize(SocketOptsImpl.java:286)
>         at sun.nio.ch.OptionAdaptor.getSendBufferSize(OptionAdaptor.java:129)
>         at sun.nio.ch.SocketAdaptor.getSendBufferSize(SocketAdaptor.java:328)
>         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.createBlockOutputStream(DFSClient.java:2873)
>         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2826)
>         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
>         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)
> {noformat}
> However, only empty files are created on HDFS.

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira