रविंदर ठाकुर (ravinder thakur) wrote:
My suggestion: let's just collect a few hundred dollars ($10 each?), purchase an EC2 machine, upload _all_ the semantic data to it, run a SPARQL endpoint on it, and keep it running for everyone's use.
fwiw - I already indicated that the following is about to happen:

1. All of LOD in an instance deployed like the current DBpedia instance (from our data center), as per <http://b3s.openlinksw.com/> (which already holds 11 billion triples and simply needs an update re. DBpedia 3.2 and a few other data sets from LOD)

2. For those that have personal or service specific needs, a replica will be on EC2 (as we've done with DBpedia).

Current roadmap re. EC2:

1. DBpedia - done
2. Neurocommons - WIP
3. Bio2RDF - WIP
4. Entire LOD Data Set collection - WIP

Of course, you can also put together the scheme you are suggesting via donations etc. The more approaches the better (imho).
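Since both options above come down to pointing clients at a public SPARQL endpoint, here is a minimal sketch of issuing such a query over plain HTTP. The DBpedia endpoint URL and the query itself are illustrative assumptions, not part of the thread:

```python
# Minimal sketch of querying a public SPARQL endpoint over HTTP.
# Endpoint URL and query are illustrative assumptions.
from urllib.parse import urlencode

def sparql_query_url(endpoint, query, fmt="application/sparql-results+json"):
    """Build the GET URL for a SPARQL-protocol query request."""
    return endpoint + "?" + urlencode({"query": query, "format": fmt})

query = "SELECT ?s WHERE { ?s a <http://xmlns.com/foaf/0.1/Person> } LIMIT 5"
url = sparql_query_url("http://dbpedia.org/sparql", query)
print(url)
# To actually execute it:
#   from urllib.request import urlopen
#   results = urlopen(url).read()
```

Any HTTP client works the same way, which is what makes a shared always-on endpoint useful to everyone at once.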

Kingsley



On Fri, Dec 5, 2008 at 10:50 AM, Kingsley Idehen <[EMAIL PROTECTED]> wrote:


    Hugh Glaser wrote:

        Thanks for the swift response!
        I'm still puzzled - sorry to be slow.
        http://aws.amazon.com/publicdatasets/#2
        Says:
        Amazon EC2 customers can access this data by creating their
        own personal Amazon EBS volumes, using the public data set
        snapshots as a starting point. They can then access, modify
        and perform computation on these volumes directly using their
        Amazon EC2 instances and just pay for the compute and storage
        resources that they use.
         Does this not mean it costs me money on my EC2 account? Or is
        there some other way of accessing the data? Or am I looking at
        the wrong bit?
    Okay, I see what I overlooked: the cost of the AMI (instance) that
    mounts these EBS volumes, even though Amazon is charging $0.00 for
    hosting these huge data sets where it would usually charge.

    So to conclude, using the loaded data sets isn't free, but I think
    we have to appreciate the value here, right? Amazon is providing a
    service that is ultimately pegged to usage (utility model), and the
    usage comes down to the value of that scarce resource called time.

        I.e., can you give me a clue how to get at the data without
        using my credit card, please? :-)
    You can't; you will need someone to build an EC2 service for you
    and eat the costs on your behalf. Of course such a service isn't
    impossible in a "Numerati" [1] economy, but we aren't quite there
    yet; we need the Linked Data Web in place first :-)

    Links:

    1. http://tinyurl.com/64gsan

    Kingsley

        Best
        Hugh

        On 05/12/2008 02:28, "Kingsley Idehen" <[EMAIL PROTECTED]> wrote:



        Hugh Glaser wrote:
            Exciting stuff, Kingsley.
            I'm not quite sure I have worked out how I might use it
            though.
            The page says that hosting data is clearly free, but I
            can't see how to get at it without paying for it as an EC2
            customer.
            Is this right?
            Cheers

        Hugh,

        No, it shouldn't cost anything if the LOD data sets are hosted
        in this particular location :-)


        Kingsley
            Hugh


            On 01/12/2008 15:30, "Kingsley Idehen" <[EMAIL PROTECTED]> wrote:



            All,

            Please see <http://aws.amazon.com/publicdatasets/>;
            potentially the final destination of all published RDF
            archives from the LOD cloud.

            I've already made a request on behalf of LOD, but
            additional requests
            from the community will accelerate the general
            comprehension and
            awareness at Amazon.

            Once the data sets are available from Amazon, database
            construction costs will be significantly reduced.

            We have DBpedia reconstruction down to 1.5 hrs (or less)
            based on Virtuoso's in-built integration with Amazon S3 for
            backup and restoration. We could get reconstruction of the
            entire LOD cloud down to some interesting numbers once all
            the data is situated in an Amazon data center.
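            To get a feel for those "interesting numbers", here is a naive linear extrapolation from the figures in the thread (1.5 hrs for DBpedia; ~11 billion triples for the full collection). The DBpedia triple count (~270 million for that era) is an assumption, and real load rates are rarely linear, so treat this as a rough sketch:

```python
# Naive scaling sketch: if DBpedia reconstructs in ~1.5 hours, what
# might the full LOD cloud take at the same load rate?
DBPEDIA_TRIPLES = 270e6   # assumed triple count for DBpedia of that era
DBPEDIA_HOURS = 1.5       # stated in the thread
LOD_TRIPLES = 11e9        # stated in the thread (b3s instance)

rate = DBPEDIA_TRIPLES / DBPEDIA_HOURS   # triples loaded per hour
lod_hours = LOD_TRIPLES / rate
print(round(lod_hours, 1))               # ~61 hours, linearly extrapolated
```

            In other words, on the order of a few days rather than weeks, before accounting for parallel loading or in-data-center bandwidth.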


            --

            Regards,

            Kingsley Idehen       Weblog: http://www.openlinksw.com/blog/~kidehen
            President & CEO
            OpenLink Software     Web: http://www.openlinksw.com