You would probably have to implement your own Hadoop filesystem, similar to how S3 and KFS integrate.
I looked at it a while back and it didn't seem insanely difficult …

Kevin

On Wed, Sep 14, 2011 at 9:47 AM, Steve Lewis <lordjoe2...@gmail.com> wrote:

> No - the issue is that I want Hadoop to read resources as input files,
> as if they were in HDFS. I know how to read resources as input streams,
> but I don't know how to get a Hadoop FileSystem that will treat a Path
> like res://myclass/myresource.txt as useful and give me an FSInputStream
> (rather than a simple InputStream).
>
> On Wed, Sep 14, 2011 at 9:38 AM, Kevin Burton <bur...@spinn3r.com> wrote:
>
>> You can already do this with the JAR file format … if you load a
>> resource via path, it uses the class loader system to find it in all
>> available jars.
>>
>> Kevin
>>
>> On Wed, Sep 14, 2011 at 9:24 AM, Steve Lewis <lordjoe2...@gmail.com> wrote:
>>
>>> When writing tests it is useful to keep all data in resources, since
>>> this makes automatic execution easier. The structure of a set of
>>> resources should make it easy to have a scheme such as res:// to look
>>> there. Has anyone already done the work?

--
Founder/CEO Spinn3r.com

Location: San Francisco, CA
Skype: burtonator
Skype-in: (415) 871-0687
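
For reference, here is a minimal sketch of what Kevin describes: a custom
org.apache.hadoop.fs.FileSystem that resolves res:// paths against the
classpath. The class name ResourceFileSystem, the res scheme, and the
in-memory buffering are illustrative choices, not an existing Hadoop
facility; depending on your Hadoop version there may be one or two more
abstract methods to stub out.

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.PositionedReadable;
import org.apache.hadoop.fs.Seekable;
import org.apache.hadoop.fs.permission.FsPermission;
import org.apache.hadoop.util.Progressable;

// Read-only FileSystem that resolves res://... paths against the classpath.
public class ResourceFileSystem extends FileSystem {

  private URI uri;
  private Path workingDir = new Path("/");

  @Override public void initialize(URI name, Configuration conf) throws IOException {
    super.initialize(name, conf);
    this.uri = name;
  }

  @Override public URI getUri() { return uri; }

  // res://myclass/myresource.txt -> classpath name "myclass/myresource.txt"
  private String resourceName(Path p) {
    URI u = p.toUri();
    String s = (u.getHost() == null ? "" : u.getHost()) + u.getPath();
    return s.startsWith("/") ? s.substring(1) : s;
  }

  private byte[] readResource(Path p) throws IOException {
    InputStream in = getClass().getClassLoader().getResourceAsStream(resourceName(p));
    if (in == null) throw new FileNotFoundException(p.toString());
    try {
      ByteArrayOutputStream out = new ByteArrayOutputStream();
      byte[] buf = new byte[4096];
      for (int n; (n = in.read(buf)) > 0; ) out.write(buf, 0, n);
      return out.toByteArray();
    } finally { in.close(); }
  }

  @Override public FSDataInputStream open(Path p, int bufferSize) throws IOException {
    // FSDataInputStream requires a Seekable + PositionedReadable stream, so we
    // buffer the whole resource in memory - fine for test data, not big files.
    return new FSDataInputStream(new SeekableBytes(readResource(p)));
  }

  @Override public FileStatus getFileStatus(Path p) throws IOException {
    long len = readResource(p).length; // crude: reads the resource to learn its size
    return new FileStatus(len, false, 1, len, 0L, p.makeQualified(this));
  }

  // Write-side methods just refuse; the classpath is read-only.
  @Override public FSDataOutputStream create(Path f, FsPermission perm, boolean overwrite,
      int bufSize, short repl, long blockSize, Progressable prog) throws IOException {
    throw new IOException("res:// is read-only");
  }
  @Override public FSDataOutputStream append(Path f, int bufSize, Progressable prog)
      throws IOException {
    throw new IOException("res:// is read-only");
  }
  @Override public boolean rename(Path src, Path dst) { return false; }
  @Override public boolean delete(Path f, boolean recursive) { return false; }
  @Override public boolean mkdirs(Path f, FsPermission perm) { return false; }
  @Override public FileStatus[] listStatus(Path f) throws IOException {
    return new FileStatus[] { getFileStatus(f) }; // no real directory listing on a classpath
  }
  @Override public void setWorkingDirectory(Path d) { workingDir = d; }
  @Override public Path getWorkingDirectory() { return workingDir; }

  // In-memory stream satisfying the interfaces FSDataInputStream demands.
  private static class SeekableBytes extends ByteArrayInputStream
      implements Seekable, PositionedReadable {
    SeekableBytes(byte[] data) { super(data); }
    public void seek(long newPos) { this.pos = (int) newPos; }
    public long getPos() { return pos; }
    public boolean seekToNewSource(long target) { return false; }
    public int read(long position, byte[] b, int off, int len) {
      long saved = pos;
      seek(position);
      int n = read(b, off, len);
      seek(saved);
      return n;
    }
    public void readFully(long position, byte[] b, int off, int len) throws IOException {
      if (read(position, b, off, len) < len) throw new IOException("premature EOF");
    }
    public void readFully(long position, byte[] b) throws IOException {
      readFully(position, b, 0, b.length);
    }
  }
}

To wire it in, you would map the res scheme to the class in the job
Configuration (the standard fs.<scheme>.impl mechanism) before resolving
the Path:

conf.set("fs.res.impl", ResourceFileSystem.class.getName());
FileSystem fs = new Path("res://myclass/myresource.txt").getFileSystem(conf);

Note that open() buffers each resource fully in memory, which is acceptable
for test fixtures but not for large inputs.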