This should be about right:

    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    OutputStream os = fs.create(new Path("/images/img.jpg"));
    os.write(...);  // write the file's bytes
    os.close();
Or even:

    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    fs.copyFromLocalFile(new Path("/filer/img.jpg"), new Path("/images/img.jpg"));

http://hadoop.apache.org/common/docs/r1.0.2/api/org/apache/hadoop/fs/FileSystem.html

On Fri, Apr 20, 2012 at 7:38 AM, vishnupriyaa <vpatoff...@gmail.com> wrote:
>
> Hi Brock,
> I am new to HBase. Can you help me with some code to store files
> directly in HDFS?
>
> Brock Noland-2 wrote:
>>
>> Hi,
>>
>> Any reason you cannot have an exception marker for large files and
>> then store them directly in HDFS?
>>
>> Brock
>>
>> On Tue, Apr 17, 2012 at 1:06 PM, vishnupriyaa <vpatoff...@gmail.com>
>> wrote:
>>>
>>> I want to save a file of size 12MB, but an exception occurs:
>>> "KeyValue size too large".
>>> I have set the value of hbase.client.keyvalue.maxsize in hbase-site.xml
>>> and hbase-default.xml to 3GB, but the default value of 10MB is still
>>> being used for hbase.client.keyvalue.maxsize. How can I change the
>>> value of hbase.client.keyvalue.maxsize, or how should I store files
>>> of extremely large size?
>>> --
>>> View this message in context:
>>> http://old.nabble.com/Storing-extremely-large-size-file-tp33701522p33701522.html
>>> Sent from the HBase User mailing list archive at Nabble.com.
>>
>> --
>> Apache MRUnit - Unit testing MapReduce -
>> http://incubator.apache.org/mrunit/
>
> --
> View this message in context:
> http://old.nabble.com/Storing-extremely-large-size-file-tp33701522p33720293.html
> Sent from the HBase User mailing list archive at Nabble.com.

--
Apache MRUnit - Unit testing MapReduce - http://incubator.apache.org/mrunit/
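
P.S. On the hbase.client.keyvalue.maxsize question from the quoted thread:
that property is read by the client, so it has to be set in the
hbase-site.xml that is on the *client's* classpath (hbase-default.xml ships
with the jar and is not meant to be edited). Something like the following
should work; the 20 MB value here is just an illustration, not a
recommendation:

    <configuration>
      <property>
        <name>hbase.client.keyvalue.maxsize</name>
        <!-- size limit in bytes; 20971520 = 20 MB (illustrative) -->
        <value>20971520</value>
      </property>
    </configuration>

That said, for files in the tens of MB and up, storing the data in HDFS and
keeping only the path in HBase (as above) is usually the better design.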