I would suggest MongoDB.

Just use it from its binary packages and don't worry.

As the licensing page says:
If you are using a vanilla MongoDB server from either source or binary
packages you have NO obligations. You can ignore the rest of this page.

http://www.mongodb.org/display/DOCS/Licensing


2011/8/9 Jaco Breitenbach <jjbreitenb...@gmail.com>

> Hi Gabriel,
>
> Is there such a database that is both free and non-GPL that you can
> recommend?
>
> Jaco
>
> On 9 August 2011 14:38, gabriel.b...@gmail.com <gabriel.b...@gmail.com>
> wrote:
>
> > Have you ever considered using a NoSQL database? I think it would serve
> > you better.
> >
> > 2011/8/9 Jaco Breitenbach <jjbreitenb...@gmail.com>
> >
> > > Hi Igor and Michael,
> > >
> > > Yes, of course, 1440 minutes in a day. :-)
> > >
> > > I am building an application that filters out duplicate input data by
> > > generating an MD5 hash of each input, and implicitly comparing that
> > > against a set of keys already stored in the SQLite database by doing
> > > an insert into a unique-indexed table.  If the insert fails, a
> > > duplicate is assumed, otherwise the new unique key is stored, and the
> > > input processed.
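The insert-and-catch-the-constraint-failure scheme described above can be sketched as follows. This is a minimal illustration, not the poster's actual code; the table name, schema, and function name are made up, and a real deployment would batch inserts inside transactions.

```python
import hashlib
import sqlite3

# Hypothetical one-column schema: the MD5 digest itself is the unique key.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE keys (digest BLOB PRIMARY KEY)")

def is_duplicate(record: bytes) -> bool:
    """Try to insert the record's MD5 digest; a constraint
    violation on the primary key means it was seen before."""
    digest = hashlib.md5(record).digest()
    try:
        conn.execute("INSERT INTO keys (digest) VALUES (?)", (digest,))
        return False  # insert succeeded: new input
    except sqlite3.IntegrityError:
        return True   # insert failed: duplicate

print(is_duplicate(b"record-1"))  # first occurrence -> False
print(is_duplicate(b"record-1"))  # duplicate -> True
```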
> > >
> > > The problem I'm facing is that I would ultimately need to process
> > > 1,000,000,000 records a day, with history kept for up to 128 days.  I
> > > am currently creating a new data file per day, with hourly tables.
> > > However, that will eventually result in 40,000,000+ records being
> > > inserted into a single indexed table.  Unfortunately, the insert rate
> > > into the indexed tables decreases significantly as the number of
> > > records in the tables increases.  This seems to be caused by a CPU
> > > bottleneck during the index searches rather than by I/O.
> > >
> > > I am now considering partitioning the data even further into tables
> > > that span shorter time periods, e.g. 60 min, 30 min, 15 min, 5 min,
> > > or 1 min.  I am hoping that reducing the search space will help to
> > > maintain a higher insert rate.
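The bucketing idea above could be routed like this. A sketch only: the naming scheme and five-minute default are assumptions, and note that a per-bucket unique index only catches duplicates that land in the same bucket, as with the per-hour tables already in use.

```python
import hashlib
import sqlite3
from datetime import datetime

conn = sqlite3.connect(":memory:")

def bucket_table(ts: datetime, minutes: int = 5) -> str:
    # Partition name for the time bucket, e.g. keys_0935 for 09:35-09:39.
    start = (ts.hour * 60 + ts.minute) // minutes * minutes
    return "keys_%02d%02d" % (start // 60, start % 60)

def insert_key(record: bytes, ts: datetime) -> bool:
    """Insert into the bucket's table; False means the digest
    was already present in that bucket."""
    table = bucket_table(ts)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS %s (digest BLOB PRIMARY KEY)" % table)
    try:
        conn.execute("INSERT INTO %s (digest) VALUES (?)" % table,
                     (hashlib.md5(record).digest(),))
        return True
    except sqlite3.IntegrityError:
        return False
```

Shorter buckets mean smaller indexes to search per insert, at the cost of many more tables per daily file.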
> > >
> > > I'd appreciate any feedback and comments on my suggested approach.
> > >
> > > Regards,
> > > Jaco
> > >
> > >
> > > On 9 August 2011 14:13, Igor Tandetnik <itandet...@mvps.org> wrote:
> > >
> > > > Jaco Breitenbach <jjbreitenb...@gmail.com> wrote:
> > > > > Can anyone please tell me if there is a limit to the number of
> > > > > tables that can be held in a single data file?  I am considering
> > > > > an application that will require a table for every minute in a
> > > > > day, i.e. 3600+ tables in a single database or data file.
> > > >
> > > > First, there are 1440 minutes in a day. Second, you should be able
> > > > to create this number of tables: if the limit exists, it's likely
> > > > much higher than that. Finally, I predict that the schema you
> > > > envision would be very awkward to work with. Have you considered a
> > > > single table having MinuteOfDay as an extra column?
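The single-table alternative suggested here could look something like the following. A sketch under assumed names (`events`, `minute_of_day`, `payload` are illustrative, not from the thread):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# One table for the whole day; the minute becomes an indexed column
# instead of 1440 separate tables.
conn.execute("""
    CREATE TABLE events (
        minute_of_day INTEGER NOT NULL,  -- 0 .. 1439
        payload       TEXT
    )""")
conn.execute("CREATE INDEX events_minute ON events(minute_of_day)")

conn.execute("INSERT INTO events VALUES (?, ?)", (9 * 60 + 35, "sample"))
rows = conn.execute(
    "SELECT payload FROM events WHERE minute_of_day = 575").fetchall()
print(rows)  # -> [('sample',)]
```

Queries for any minute range then become a simple WHERE clause rather than a choice of table name.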
> > > > --
> > > > Igor Tandetnik
> > > >
> > > > _______________________________________________
> > > > sqlite-users mailing list
> > > > sqlite-users@sqlite.org
> > > > http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
> > > >
> > >
> >
>
