Just tried joblib and it managed to save the data. However, the output is
still very large (over 3 GB spread across multiple files), whereas models I
created using SVR and Earth on the same data only produce files on the
order of 3 MB and 30 kB respectively. Is there a problem with the Gaussian
model, or is this normal?
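For reference, this is roughly what I'm doing — a minimal sketch using joblib's `compress` option, which collapses the dump into a single (often much smaller) file; the dict stands in for my fitted estimator (in older scikit-learn the same functions live under `sklearn.externals.joblib`):

```python
import joblib
import numpy as np

# Stand-in for a fitted estimator holding large arrays
model = {"weights": np.zeros((1000, 1000))}

# compress>0 writes one zlib-compressed file instead of many .npy files
joblib.dump(model, "model.pkl", compress=3)

restored = joblib.load("model.pkl")
```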
Z
On Thu, Mar 20, 2014 at 9:41 AM, Gael Varoquaux <
[email protected]> wrote:
> On Thu, Mar 20, 2014 at 09:39:56AM +1100, Zack L wrote:
> > I am trying to save a gaussian model using the command:
>
> > pickle.dump(model, srfFile)
>
> Don't use pickle. Use sklearn.externals.joblib.dump/load. This will
> handle very large arrays well.
>
> G
>
>
> _______________________________________________
> Scikit-learn-general mailing list
> [email protected]
> https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
>