1. It is important to ensure your clients are on the same major
version jars as your server.
2. You are probably looking for the "hadoop fs -chown" and "hadoop fs
-chmod" tools to modify ownership and permissions.
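For example, a minimal sketch of those two commands (the user/group name here is a hypothetical example; run them as an HDFS superuser, and substitute your own path and user):

```shell
# Hand ownership of the target file to the user doing the writing.
# "mohit:mohit" is an assumed user:group, not from the thread.
hadoop fs -chown mohit:mohit /examples/testfile1.seq

# Alternatively, widen permissions on the parent directory so that
# group members can write into it.
hadoop fs -chmod 775 /examples
```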
On Wed, Feb 22, 2012 at 3:15 AM, Mohit Anchlia wrote:
> I am past this error. Looks like I needed to use CDH libraries. I changed
> my maven repo. Now I am stuck at
>
> org.apache.hadoop.security.AccessControlException, since I am not writing
> as the user that owns the file. Looking online for solutions.
>
>
> On Tue, Feb 21, 2012 at 12:48 PM, Mohit Anchlia wrote:
>
>> I am trying to write to a sequence file and it seems to be failing, and
>> I am not sure why. Is there something I need to do?
>>
>> String uri="hdfs://db1:54310/examples/testfile1.seq";
>>
>> FileSystem fs = FileSystem.get(URI.create(uri), conf); // Fails
>> on this line
>>
>>
>> Caused by: java.io.EOFException
>>     at java.io.DataInputStream.readInt(DataInputStream.java:375)
>>     at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:501)
>>     at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)
>>
--
Harsh J
Customer Ops. Engineer
Cloudera | http://tiny.cloudera.com/about
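For reference, a minimal sketch of writing a SequenceFile with the same-era API the snippet above uses. This is an illustration, not the poster's code: the IntWritable/Text key and value types and the record contents are assumptions, and it requires Hadoop client jars matching the server plus a reachable cluster at the given URI.

```java
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class SequenceFileWriteDemo {
    public static void main(String[] args) throws Exception {
        String uri = "hdfs://db1:54310/examples/testfile1.seq";
        Configuration conf = new Configuration();

        // FileSystem.get opens an RPC connection to the NameNode; a
        // client/server jar-version mismatch tends to surface right here
        // as the EOFException shown in the stack trace above.
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        Path path = new Path(uri);

        // Key/value types are assumed for the example.
        IntWritable key = new IntWritable();
        Text value = new Text();

        SequenceFile.Writer writer = null;
        try {
            writer = SequenceFile.createWriter(fs, conf, path,
                    key.getClass(), value.getClass());
            key.set(1);
            value.set("example record");
            writer.append(key, value);
        } finally {
            IOUtils.closeStream(writer);
        }
    }
}
```

Note that the write also requires the running user to have permission on /examples, which is where the AccessControlException in the follow-up message comes from.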