On 30/11/2013 04:52, Jejo Koola wrote:
Hi

Does anyone have experience with very large datasets and the Bayesian
network package bnlearn?  In my experience R doesn't react well to very
large datasets.

Maybe, but a million is not 'very large': R handles billions of observations without problems on machines with commensurate resources.

Package bnlearn is not 'R'. Your questions are not about R itself and should be addressed to the package maintainer.

Is there a way to divide up the dataset into pieces and incrementally learn
the network with the pieces?  This would also be helpful in case R crashes,
because I could save the network after learning each piece.
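For what it's worth, bnlearn has no built-in incremental learner, but the
chunk-and-combine idea above can be approximated by learning a structure on
each piece, checkpointing it to disk, and then combining the pieces by model
averaging with custom.strength() and averaged.network(). A minimal sketch,
using bnlearn's bundled learning.test data as a stand-in for the real
dataset (the chunk count of 5 and the 0.5 threshold are arbitrary choices,
not recommendations):

```r
library(bnlearn)

# Stand-in for the real data: the 5000-row learning.test data frame
# bundled with bnlearn, split into 5 row-wise chunks.
full   <- learning.test
chunks <- split(full, rep(1:5, length.out = nrow(full)))

nets <- vector("list", length(chunks))
for (i in seq_along(chunks)) {
  # Hill-climbing structure learning on this chunk only.
  nets[[i]] <- hc(chunks[[i]])
  # Checkpoint the learned structure so a crash loses at most one chunk.
  saveRDS(nets[[i]], file.path(tempdir(), sprintf("net_%02d.rds", i)))
}

# Combine the per-chunk structures by model averaging: arc strengths are
# the frequency of each arc across the chunk networks.
strength <- custom.strength(nets, nodes = names(full))
avg      <- averaged.network(strength, threshold = 0.5)
```

Note this is model averaging over independently learned structures, not
true incremental learning: each hc() call sees only its own chunk, so rare
dependencies may be missed that a single pass over the full data would find.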

Thank you.

        [[alternative HTML version deleted]]

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

PLEASE do, including what it says about HTML mail and about 'crashes'.

--
Brian D. Ripley,                  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
