Hello Ehud,

These are excellent, well-reasoned feature proposals for GaussianMixture! 
You've accurately identified potential gaps in flexibility, especially 
regarding parameter control and covariance structure.

Here's an assessment of your proposals in the context of scikit-learn's 
GaussianMixture implementation:

1. 🧊 Freezing Component Weights (weights_init)
This is a very useful feature, particularly in transfer learning or when 
components are physically constrained (e.g., modeling known proportions in a 
mixture).

Current State: The GaussianMixture implementation currently optimizes all 
parameters (means, covariances, and weights) during the 
Expectation-Maximization (EM) loop, even if initialized via weights_init. There 
is no built-in mechanism to freeze the weights.
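
For example, a quick check (with illustrative toy data) shows that
weights_init only seeds the optimization; the fitted weights_ drift to
whatever EM estimates:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Toy data: two Gaussian clusters with a 70/30 split.
    rng = np.random.RandomState(0)
    X = np.vstack([rng.normal(0, 1, size=(700, 2)),
                   rng.normal(5, 1, size=(300, 2))])

    gmm = GaussianMixture(n_components=2, weights_init=[0.5, 0.5],
                          random_state=0).fit(X)
    # Re-estimated by EM: ends up near the true 70/30 split (in some
    # order), not the [0.5, 0.5] passed as weights_init.
    print(gmm.weights_)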

Implementation Feasibility: You are correct; this should be relatively easy to 
implement.

In the M-step of the EM algorithm, the weight update is a computation 
separate from the updates for the means and covariances, so it can be 
skipped in isolation.

To implement frozen_weights=True, you would add a conditional check around 
that weight update in the M-step: if the weights are frozen, skip the 
re-estimation and keep the values from weights_init (see the sketch below).
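
As a rough illustration, here is a minimal sketch of that conditional,
written as a subclass rather than a patch to the library. It leans on the
private _m_step hook, which is not a stable API and may change between
scikit-learn versions; a real implementation would put the check inside
the M-step itself. The class name FrozenWeightsGMM is hypothetical.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    class FrozenWeightsGMM(GaussianMixture):
        # Sketch: freeze the mixing weights at weights_init during EM.
        # The standard M-step runs as usual, then the weight update is
        # undone, so only the means and covariances keep moving.
        def _m_step(self, X, log_resp):
            super()._m_step(X, log_resp)  # updates weights_, means_,
                                          # covariances_ as usual
            if self.weights_init is not None:
                # The conditional check described above: discard the
                # re-estimated weights and restore the user-supplied ones.
                self.weights_ = np.asarray(self.weights_init, dtype=float)

A production version would key this on the proposed frozen_weights flag
rather than on the mere presence of weights_init.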

Parameter Suggestion: A boolean parameter like frozen_weights (or perhaps 
fix_weights to align with common ML terminology) would be clear and 
appropriate, specifically acting as a flag that takes effect when 
weights_init is provided.