Bug Bash 1-10-17 for Mahout 0.13.0. Current target is a Jan 15 code freeze.
===========================================================================

Pat
----------
MAHOUT-1786: Make classes implement Serializable for Spark 1.5+
MAHOUT-1904: Create a test harness to test mahout across different hardware 
configurations
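
For context on MAHOUT-1786: Spark serializes objects captured in task closures
before shipping them to executors, so any class used that way must implement
java.io.Serializable. A minimal, generic sketch of the pattern (the SolverConfig
class name is hypothetical, not from the Mahout codebase):

```java
import java.io.*;

// Hypothetical example class: implementing Serializable lets Spark
// serialize instances captured in a closure and ship them to executors.
class SolverConfig implements Serializable {
    private static final long serialVersionUID = 1L;
    final int iterations;
    SolverConfig(int iterations) { this.iterations = iterations; }
}

public class SerializableDemo {
    public static void main(String[] args) throws Exception {
        // Round-trip through Java serialization, as Spark does internally.
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(new SolverConfig(10));
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            SolverConfig copy = (SolverConfig) in.readObject();
            System.out.println(copy.iterations); // prints 10
        }
    }
}
```

Classes missing the interface fail at runtime with NotSerializableException
the first time Spark tries to ship them, which is why the fix touches every
class that can appear in a closure.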

Sebastian
----------
MAHOUT-1884: Allow specification of dimensions of a DRM

Andrew
----------
MAHOUT-1791: Automatic threading for Java-based mmul in the front end and the 
backend.
MAHOUT-1682: Create a documentation page for SPCA
MAHOUT-1893: Fix Algorithm list on mahout.apache.org
MAHOUT-1686: Create a documentation page for ALS


Dmitriy
----------
MAHOUT-1790: SparkEngine nnz overflow resultSize when reducing.

Andy
----------
EPIC: MAHOUT-1862: Native Mahout integration
EPIC: MAHOUT-1742: Non-legacy framework-related issues

MAHOUT-1860: Add Stack Image to the top of the front page of the Website
MAHOUT-1879: Lazy density analysis of DRMs in CheckpointedDrm
MAHOUT-1885: Initial Implementation of VCL Bindings
MAHOUT-1873: Use densityAnalysis() wherever necessary
MAHOUT-1892: Can't broadcast vector in Mahout-Shell
MAHOUT-1851: Automatic probing of in-core and back-end solvers


Suneel
----------
EPIC: MAHOUT-1861: New Mahout Clustering, Classification, Sketching and 
Optimization Algorithms

MAHOUT-1870: Add import and export capabilities for DRMs to and from Apache 
Arrow
MAHOUT-1882: SequentialAccessSparseVector iterateNonZeros is incorrect.
MAHOUT-1875: Use faster shallowCopy for dense matrices in blockify 
drm/package.blockify(..)
MAHOUT-1830: Publish scaladocs for Mahout 0.12.0 release
MAHOUT-1902: Parse Spark and Mahout variable arguments from the Mahout 
spark-shell

Trevor
----------
MAHOUT-1856: Create a framework for new Mahout Clustering, Classification, and 
Optimization Algorithms
MAHOUT-1895: Add convenience methods for converting Vectors to Scala types
MAHOUT-1896: Add convenience methods for interacting with Spark ML
