If you DIY, you can do about 140 terabytes for $20k; that's not including hosting, electricity, etc., though.
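A quick back-of-the-envelope check of that figure (the ~2 TB drive size and ~$150 street price are my assumptions for 2010-era hardware, purely illustrative):

```python
# Back-of-the-envelope DIY storage cost check.
budget_usd = 20_000
capacity_tb = 140

cost_per_tb = budget_usd / capacity_tb  # ~$143/TB overall

# Assuming ~2 TB drives at roughly $150 each (2010-era estimate):
drives_needed = capacity_tb / 2          # 70 drives
raw_disk_cost = drives_needed * 150      # ~$10,500 in bare drives

# Raw disk alone is ~$75/TB, leaving roughly half the budget for
# chassis, controllers, power supplies, and redundancy overhead --
# which is consistent with the $20k/140 TB ballpark.
print(f"${cost_per_tb:.0f}/TB overall; ${raw_disk_cost:,.0f} in raw drives")
```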
Add in annual expansion and maintenance + other costs and you can arrive at
your own figure of what's a minimum annual funded amount. I'd say minimum
$30-40k/yr for anything meaningful. That's not a lot of money in the grand
scope, but I don't know who would donate that.

- bri

On Wed, Jan 27, 2010 at 2:15 PM, Christopher Schmidt
<[email protected]> wrote:
> On Wed, Jan 27, 2010 at 04:40:43PM -0700, Mike Swope wrote:
> > In the last 2 weeks there have been hundreds of emails on the group, and
> > much of it dealing with openstreetmap and imagery, and using this to help
> > people on the ground.
> >
> > One thing that did strike me is the need for a project such as
> > openaerialmap. There was probably a TB of imagery put up for download, so
> > people could use it for relief work.
>
> Far more, actually. In total, there has been about 2.5TB of source imagery
> so far, ignoring LWIR, SWIR, MWIR, and topos.
>
>   20M  deltastate
> 1004G  digitalglobe
>  564M  EROS
>  194G  geoeye
>   32G  google
>  1.3G  HDDS
>   31M  Navy
>  869G  noaa
>  644G  worldbank-rit
>
> Combining this source imagery with processed (warped + overlaid) imagery
> brings us up to 6.3TB so far, with more still coming.
>
> http://haiticrisismap.org/?zoom=8&lat=19&lon=-72.2&layers=basephoto
> http://haiticrisismap.org/?zoom=11&lat=18.58007&lon=-72.31467&layers=basephoto
>
> > But it does seem that a nice open system would be very beneficial for
> > people.
> >
> > So I'm curious, what are your thoughts? How can the OAM project be used
> > for a crisis?
>
> Actually, what this has shown me is one thing, and one thing only:
> we need a lot more disk than we have.
>
> Ignoring source imagery, which can be compressed and stashed somewhere
> else, and assuming that we have enough CPUs that it somehow becomes
> reasonable to compress the processed data while still keeping it on
> fast disks, we've got a situation where we need to store 1-2TB of
> imagery. For coverage of one week. For a country smaller than most
> US states. With most areas not being covered with imagery of higher
> resolution than a meter, and not including any additional bands.
>
> If you *really* want to do OAM, you're gonna need better than that. Under
> the situation where we're not pre-caching everything, we need to have
> access to dozens or hundreds of terabytes of disk, and the ability to
> grow that amount rapidly.
>
> If you want to do this the Google way -- pre-cache everything -- you
> still have issues not that much different from this, if you want to solve
> the 'imagery in the past' problem as well. The only difference is that you
> can slightly more easily distribute your disk space, because there's no
> need for CPU where the tiles are. However, I think that is likely
> impractical for something like OAM using the publicly available resources
> at this time.
>
> If the OpenAerialMap community is actually interested in going anywhere,
> the first question should be "Who is going to provide lots of disk
> space that we can put near lots of CPU?" You can imagine, for the time
> being, that we have the CPU, and the rackspace nearby it. The question
> then becomes "Who is willing to donate lots of fast disks that can be
> racked up, and how much does it add up to?"
>
> When the potential donations start to cross the 100TB mark, we have a
> potential chance of being able to get OAM started in a serious way.
> Until we have that, as soon as we open the doors for imagery, we're
> going to run out of space.
>
> Best Regards,
> --
> Christopher Schmidt
> Web Developer
>
> _______________________________________________
> talk mailing list
> [email protected]
> http://openaerialmap.org/mailman/listinfo/talk_openaerialmap.org
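As a sanity check, the per-source sizes Christopher quotes can be tallied in a few lines (treating M and G suffixes as MiB and GiB is my assumption about the units); the sum lands near 2.7 TiB, in the same ballpark as the quoted "about 2.5TB":

```python
# Tally the per-source imagery sizes from the quoted email.
# Assumed unit convention: M = mebibytes, G = gibibytes.
sizes = {
    "deltastate": "20M",
    "digitalglobe": "1004G",
    "EROS": "564M",
    "geoeye": "194G",
    "google": "32G",
    "HDDS": "1.3G",
    "Navy": "31M",
    "noaa": "869G",
    "worldbank-rit": "644G",
}

UNITS = {"M": 1 / 1024, "G": 1.0}  # convert everything to gibibytes

total_gib = sum(float(v[:-1]) * UNITS[v[-1]] for v in sizes.values())
print(f"{total_gib:.1f} GiB ~= {total_gib / 1024:.2f} TiB")
```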
