Hi.

2020-08-11 8:51 UTC+02:00, Christoph Läubrich <[email protected]>:
> Hi Gilles,
>
> Just to be clear, I don't suspect any error in GaussianCurveFitter; I
> just don't understand how the advice in the user doc to restrict
> parameters (for general problems) could be applied to a concrete
> problem, and thus chose GaussianCurveFitter as an example, since it
> uses LeastSquaresBuilder.
>
> I also noticed that GaussianCurveFitter has a restriction on its
> parameters (the norm can't be negative) that is handled in a third way
> (returning Double.POSITIVE_INFINITY instead of using a
> ParameterValidator), not mentioned in the user doc at all, so I wonder
> whether this is a general-purpose solution for restricting parameters
> (it seems the simplest approach).
I'd indeed suggest first trying the same trick as in
"GaussianCurveFitter" (i.e. return a "high" value for arguments outside
the known range).  That way, you only have to define a suitable
"ParametricUnivariateFunction" and pass it to "SimpleCurveFitter".

One use case for the "ParameterValidator" is when some of the model
parameters might be correlated with others.  But using it requires that
you handle yourself all the arguments to be passed to the
"LeastSquaresProblem".

> To take the Gaussian example for my use case, consider an observed
> signal similar to [1].  Given that I know (from a source other than
> the plain data) that the result must lie in the range 2..3, I want to
> restrict valid solutions to this area.  The same might apply to the
> norm: I know it must be within a given range and I want to restrict
> the optimizer here, even though there might be a solution outside that
> range which fits better (in terms of R^2), e.g. a Gaussian that fits
> well inside -1..1.
>
> I hope it is a little bit clearer.

I'm not sure.  The picture shows a function that is not a Gaussian.
Do you mean that you want to fit only *part* of the data with a
function that would not fit well *all* the data?

Regards,
Gilles

>
> [1]
> https://ascelibrary.org/cms/asset/6ca2b016-1a4f-4eed-80da-71219777cac1/1.jpg
>
> On 11.08.20 at 00:42, Gilles Sadowski wrote:
>> Hello.
>>
>> On Mon 10 Aug 2020 at 17:09, Christoph Läubrich
>> <[email protected]> wrote:
>>>
>>> The user guide [1] mentions that it is currently not possible to
>>> constrain parameters directly, but suggests that one can use the
>>> ParameterValidator.  Is there any example code for both of the
>>> mentioned alternatives?
>>>
>>> For example, GaussianCurveFitter uses LeastSquaresBuilder, and I
>>> want to ensure that the mean lies within a closed bound, e.g. from
>>> 5 to 6, where my data points range from 0..90.  How would this be
>>> achieved?
>>
>> Could you set up a unit test as a practical example of what
>> you need to achieve?
>>
>>> I'm especially interested because the FUNCTION inside
>>> GaussianCurveFitter seems to reject invalid values (e.g. a negative
>>> norm) by simply returning Double.POSITIVE_INFINITY instead of using
>>> either approach described in the user docs.
>>
>> What I don't quite get is why you need to force the mean within a
>> certain range: if the data match a Gaussian with a mean within that
>> range, I would assume that the fitter will find the correct value...
>> Sorry if I missed something.  Hopefully the example will clarify.
>>
>> Best,
>> Gilles
>>
>>>
>>> [1]
>>> https://commons.apache.org/proper/commons-math/userguide/leastsquares.html

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
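[Editor's note: not part of the original thread.]  The "high value" trick suggested above could be sketched as follows, assuming Commons Math 3.x.  The wrapper class `BoundedGaussian`, the finite penalty `1e10` (gentler on the optimizer than `Double.POSITIVE_INFINITY`), and the synthetic data are illustrative choices, not from the thread; only `Gaussian.Parametric`, `ParametricUnivariateFunction`, `SimpleCurveFitter` and `WeightedObservedPoints` are real library classes.

```java
import org.apache.commons.math3.analysis.ParametricUnivariateFunction;
import org.apache.commons.math3.analysis.function.Gaussian;
import org.apache.commons.math3.fitting.SimpleCurveFitter;
import org.apache.commons.math3.fitting.WeightedObservedPoints;

public class BoundedGaussianFitDemo {

    /** Gaussian model (parameters: norm, mean, sigma) that penalizes any
     *  mean outside [minMean, maxMean], in the spirit of the trick
     *  GaussianCurveFitter uses to reject a negative norm. */
    static class BoundedGaussian implements ParametricUnivariateFunction {
        private final Gaussian.Parametric gaussian = new Gaussian.Parametric();
        private final double minMean;
        private final double maxMean;

        BoundedGaussian(double minMean, double maxMean) {
            this.minMean = minMean;
            this.maxMean = maxMean;
        }

        @Override
        public double value(double x, double... p) {
            if (p[1] < minMean || p[1] > maxMean) {
                // A huge residual steers the optimizer away from this
                // region of the parameter space.
                return 1e10;
            }
            return gaussian.value(x, p);
        }

        @Override
        public double[] gradient(double x, double... p) {
            return gaussian.gradient(x, p);
        }
    }

    /** Demo: fit noiseless synthetic data (norm 1, mean 2.5, sigma 0.2)
     *  with the mean restricted to [2, 3]; returns {norm, mean, sigma}. */
    static double[] fitDemo() {
        Gaussian.Parametric g = new Gaussian.Parametric();
        WeightedObservedPoints obs = new WeightedObservedPoints();
        for (double x = 0; x <= 5; x += 0.05) {
            obs.add(x, g.value(x, 1.0, 2.5, 0.2));
        }
        double[] start = {1.0, 2.2, 0.5}; // start inside the allowed range
        return SimpleCurveFitter.create(new BoundedGaussian(2.0, 3.0), start)
                                .fit(obs.toList());
    }

    public static void main(String[] args) {
        double[] best = fitDemo();
        System.out.printf("norm=%.3f mean=%.3f sigma=%.3f%n",
                          best[0], best[1], best[2]);
    }
}
```

Note that only `value` applies the penalty here; if the optimizer routinely strays outside the range, the `gradient` may need similar guarding, as GaussianCurveFitter does when catching parameter-validation exceptions.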

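[Editor's note: not part of the original thread.]  For completeness, a sketch of the other alternative from the user guide: a `ParameterValidator` that clamps the mean (parameter index 1) back into the allowed range on every evaluation, wired through `LeastSquaresBuilder`.  The class name `ValidatorGaussianFit`, the clamping strategy, and the synthetic data are illustrative assumptions; `LeastSquaresBuilder`, `ParameterValidator`, `MultivariateJacobianFunction` and `LevenbergMarquardtOptimizer` are real Commons Math 3.3+ classes.

```java
import org.apache.commons.math3.analysis.function.Gaussian;
import org.apache.commons.math3.fitting.leastsquares.LeastSquaresBuilder;
import org.apache.commons.math3.fitting.leastsquares.LeastSquaresOptimizer;
import org.apache.commons.math3.fitting.leastsquares.LeastSquaresProblem;
import org.apache.commons.math3.fitting.leastsquares.LevenbergMarquardtOptimizer;
import org.apache.commons.math3.fitting.leastsquares.MultivariateJacobianFunction;
import org.apache.commons.math3.fitting.leastsquares.ParameterValidator;
import org.apache.commons.math3.linear.Array2DRowRealMatrix;
import org.apache.commons.math3.linear.ArrayRealVector;
import org.apache.commons.math3.util.Pair;

public class ValidatorGaussianFit {

    /** Least-squares Gaussian fit with the mean clamped to
     *  [minMean, maxMean] by a ParameterValidator. */
    static double[] fit(double[] xs, double[] ys, double[] start,
                        double minMean, double maxMean) {
        Gaussian.Parametric g = new Gaussian.Parametric();

        // Model: value vector and Jacobian of the Gaussian, evaluated
        // at each data abscissa for the current parameters.
        MultivariateJacobianFunction model = point -> {
            double[] p = point.toArray();
            double[] values = new double[xs.length];
            double[][] jac = new double[xs.length][3];
            for (int i = 0; i < xs.length; i++) {
                values[i] = g.value(xs[i], p);
                jac[i] = g.gradient(xs[i], p);
            }
            return new Pair<>(new ArrayRealVector(values),
                              new Array2DRowRealMatrix(jac));
        };

        // Validator: clamp the mean (index 1) back into the range
        // before each model evaluation.
        ParameterValidator clampMean = params -> {
            double mean = params.getEntry(1);
            params.setEntry(1, Math.min(maxMean, Math.max(minMean, mean)));
            return params;
        };

        LeastSquaresProblem problem = new LeastSquaresBuilder()
                .start(start)
                .model(model)
                .target(ys)
                .parameterValidator(clampMean)
                .maxEvaluations(1000)
                .maxIterations(1000)
                .build();

        LeastSquaresOptimizer.Optimum opt =
                new LevenbergMarquardtOptimizer().optimize(problem);
        return opt.getPoint().toArray();
    }

    /** Demo: noiseless synthetic Gaussian (norm 2, mean 5.5, sigma 0.4),
     *  mean constrained to [5, 6]; returns {norm, mean, sigma}. */
    static double[] fitDemo() {
        Gaussian.Parametric g = new Gaussian.Parametric();
        int n = 101;
        double[] xs = new double[n];
        double[] ys = new double[n];
        for (int i = 0; i < n; i++) {
            xs[i] = i * 0.1;
            ys[i] = g.value(xs[i], 2.0, 5.5, 0.4);
        }
        return fit(xs, ys, new double[] {1.5, 5.2, 0.6}, 5.0, 6.0);
    }

    public static void main(String[] args) {
        double[] best = fitDemo();
        System.out.printf("norm=%.3f mean=%.3f sigma=%.3f%n",
                          best[0], best[1], best[2]);
    }
}
```

This is the more verbose route the thread alludes to: with a raw `LeastSquaresProblem` you must supply the model, Jacobian, target and iteration limits yourself, which `SimpleCurveFitter` otherwise handles for you.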