Your message dated Sun, 19 Oct 2025 02:34:05 +0000
with message-id <[email protected]>
and subject line Bug#1117992: fixed in scikit-optimize 0.10.2-5
has caused the Debian Bug report #1117992,
regarding scikit-optimize: autopkgtest regression with scikit-learn 1.7.2
to be marked as done.

This means that you claim that the problem has been dealt with.
If this is not the case it is now your responsibility to reopen the
Bug report if necessary, and/or fix the problem forthwith.

(NB: If you are a system administrator and have no idea what this
message is talking about, this may indicate a serious mail system
misconfiguration somewhere. Please contact [email protected]
immediately.)


-- 
1117992: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1117992
Debian Bug Tracking System
Contact [email protected] with problems
--- Begin Message ---
Source: scikit-optimize
Version: 0.10.2-4
Severity: important
User: [email protected]
Usertags: scikit-learn-1.7

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA512

Dear maintainer,

your package has an autopkgtest regression with scikit-learn 1.7.2.
Relevant excerpt from
https://ci.debian.net/packages/s/scikit-optimize/unstable/amd64/65190549/
follows:


483s =================================== FAILURES ===================================
483s _______________ test_minimizer_api[minimizer7-call_single-True] ________________
483s
483s verbose = True, call = <function call_single at 0x7f8fed034220>
483s minimizer = functools.partial(<function gbrt_minimize at 0x7f8fecff1580>, acq_func='LCB')
483s
483s     @pytest.mark.slow_test
483s     @pytest.mark.parametrize("verbose", [True, False])
483s     @pytest.mark.parametrize("call", [call_single, [call_single, check_result_callable]])
483s     @pytest.mark.parametrize("minimizer", MINIMIZERS)
483s     def test_minimizer_api(verbose, call, minimizer):
483s         n_calls = 7
483s         n_initial_points = 3
483s         n_models = n_calls - n_initial_points + 1
483s
483s >       result = minimizer(
483s             branin,
483s             [(-5.0, 10.0), (0.0, 15.0)],
483s             n_initial_points=n_initial_points,
483s             n_calls=n_calls,
483s             random_state=1,
483s             verbose=verbose,
483s             callback=call,
483s         )
483s
483s tests/test_common.py:112:
483s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
483s /usr/lib/python3/dist-packages/skopt/optimizer/gbrt.py:197: in gbrt_minimize
483s     return base_minimize(
483s /usr/lib/python3/dist-packages/skopt/optimizer/base.py:276: in base_minimize
483s     optimizer = Optimizer(
483s     optimizer = Optimizer(
483s _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
483s
483s self = <skopt.optimizer.optimizer.Optimizer object at 0x7f8feb1c2d50>
483s dimensions = [(-5.0, 10.0), (0.0, 15.0)]
483s base_estimator = GradientBoostingQuantileRegressor(base_estimator=GradientBoostingRegressor(loss='quantile',
483s                           ...    quantiles=[0.16, 0.5, 0.84],
483s                                   random_state=RandomState(MT19937) at 0x7F8FF0D98440)
483s n_random_starts = None, n_initial_points = 3, initial_point_generator = 'random'
483s n_jobs = 1, acq_func = 'LCB', acq_optimizer = 'sampling', random_state = 1
483s model_queue_size = None, space_constraint = None
483s acq_func_kwargs = {'kappa': 1.96, 'xi': 0.01}
483s acq_optimizer_kwargs = {'n_jobs': 1, 'n_points': 10000, 'n_restarts_optimizer': 5}
483s avoid_duplicates = True
483s
483s     def __init__(
483s         self,
483s         dimensions,
483s         base_estimator="gp",
483s         n_random_starts=None,
483s         n_initial_points=10,
483s         initial_point_generator="random",
483s         n_jobs=1,
483s         acq_func="gp_hedge",
483s         acq_optimizer="auto",
483s         random_state=None,
483s         model_queue_size=None,
483s         space_constraint=None,
483s         acq_func_kwargs=None,
483s         acq_optimizer_kwargs=None,
483s         avoid_duplicates=True,
483s     ):
483s         args = locals().copy()
483s         del args['self']
483s         self.specs = {"args": args, "function": "Optimizer"}
483s         self.rng = check_random_state(random_state)
483s
483s         # Configure acquisition function
483s
483s         # Store and creat acquisition function set
483s         self.acq_func = acq_func
483s         self.acq_func_kwargs = acq_func_kwargs
483s         self.avoid_duplicates = avoid_duplicates
483s
483s         allowed_acq_funcs = [
483s             "gp_hedge",
483s             "EI",
483s             "LCB",
483s             "MES",
483s             "PVRS",
483s             "PI",
483s             "EIps",
483s             "PIps",
483s         ]
483s         if self.acq_func not in allowed_acq_funcs:
483s             raise ValueError(
483s                 "expected acq_func to be in %s, got %s"
483s                 % (",".join(allowed_acq_funcs), self.acq_func)
483s             )
483s
483s         # treat hedging method separately
483s         if self.acq_func == "gp_hedge":
483s             self.cand_acq_funcs_ = ["EI", "LCB", "PI"]
483s             self.gains_ = np.zeros(3)
483s         else:
483s             self.cand_acq_funcs_ = [self.acq_func]
483s
483s         if acq_func_kwargs is None:
483s             acq_func_kwargs = dict()
483s         self.eta = acq_func_kwargs.get("eta", 1.0)
483s
483s         # Configure counters of points
483s
483s         # Check `n_random_starts` deprecation first
483s         if n_random_starts is not None:
483s             warnings.warn(
483s                 ("n_random_starts will be removed in favour of " "n_initial_points."),
483s                 DeprecationWarning,
483s             )
483s             n_initial_points = n_random_starts
483s
483s         if n_initial_points < 0:
483s             raise ValueError(
483s                 "Expected `n_initial_points` >= 0, got %d" % n_initial_points
483s             )
483s         self._n_initial_points = n_initial_points
483s         self.n_initial_points_ = n_initial_points
483s
483s         # Configure estimator
483s
483s         # build base_estimator if doesn't exist
483s         if isinstance(base_estimator, str):
483s             base_estimator = cook_estimator(
483s                 base_estimator,
483s                 space=dimensions,
483s                 random_state=self.rng.randint(0, np.iinfo(np.int32).max),
483s                 n_jobs=n_jobs,
483s             )
483s         # check if regressor
483s         if not is_regressor(base_estimator) and base_estimator is not None:
483s >           raise ValueError("%s has to be a regressor." % base_estimator)
483s E           ValueError: GradientBoostingQuantileRegressor(base_estimator=GradientBoostingRegressor(loss='quantile',
483s E                                                                                      n_estimators=30),
483s E                                             quantiles=[0.16, 0.5, 0.84],
483s E                                             random_state=RandomState(MT19937) at 0x7F8FF0D98440) has to be a regressor.
483s
483s /usr/lib/python3/dist-packages/skopt/optimizer/optimizer.py:257: ValueError
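For context: the changelog for 0.10.2-5 attributes this to mixin order, which fits the traceback above. In scikit-learn >= 1.6, `is_regressor` resolves estimator tags through `__sklearn_tags__` along the MRO, so a class declared as `class Foo(BaseEstimator, RegressorMixin)` lets the base class shadow the mixin's tags method and never reports itself as a regressor. A minimal pure-Python sketch of the shadowing effect (the `Tags`, `BaseEstimator`, `RegressorMixin`, and `is_regressor` stand-ins below are simplified illustrations, not the real scikit-learn implementations):

```python
class Tags:
    """Minimal stand-in for scikit-learn's estimator tags object."""

    def __init__(self, estimator_type=None):
        self.estimator_type = estimator_type


class BaseEstimator:
    def __sklearn_tags__(self):
        # The base class returns default tags and does not call super(),
        # so anything after it in the MRO never runs.
        return Tags()


class RegressorMixin:
    def __sklearn_tags__(self):
        # The mixin delegates up the MRO, then marks the estimator.
        tags = super().__sklearn_tags__()
        tags.estimator_type = "regressor"
        return tags


def is_regressor(estimator):
    # Simplified version of the tags-based check in scikit-learn >= 1.6.
    return estimator.__sklearn_tags__().estimator_type == "regressor"


class Broken(BaseEstimator, RegressorMixin):
    # BaseEstimator precedes the mixin in the MRO, so its __sklearn_tags__
    # shadows RegressorMixin's and estimator_type stays None.
    pass


class Fixed(RegressorMixin, BaseEstimator):
    # Mixin first: RegressorMixin.__sklearn_tags__ runs, delegates to
    # BaseEstimator via super(), then tags the estimator as a regressor.
    pass


print(is_regressor(Broken()))  # False: mixin's tags method never runs
print(is_regressor(Fixed()))   # True
```

Swapping the base classes so the mixin comes first, as the Debian fix apparently does for `GradientBoostingQuantileRegressor`, makes the `is_regressor` check in `Optimizer.__init__` pass again.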


Cheers
Timo


-----BEGIN PGP SIGNATURE-----

iQIzBAEBCgAdFiEEmwPruYMA35fCsSO/zIxr3RQD9MoFAmjs7J4ACgkQzIxr3RQD
9Moohg/9EnKczWoWNHeaW3C+HhoYmp7BZold/yi7CswmMxcwpxGKeq7Kc90VPht+
1QS9nbRPZqkTUawQU7I/uRMymhVouTaUcYcRl19N4Qe5hjdow4RmgJOWLjjFYbKt
Rn7g4/MhaLEHjHFeVW84VIJUZ3s/WBS2XJBmxfBnbUj0A04lAnDowvRRFWTlVWS3
PduLUkTDRDhdjo7XSvatYYZWbRk1Oa6TRkTT7tEextgN2ROdBqtf6INrflk0GzKD
5cLZSnn/MozGvFOdpF3MVM65m0fnEai8to8VGlTg/Tbe2gxxpZAEkVOHOdMM0GGW
dShoXTeMY/kOVTHeYBIT7/rp3D1s3xzeInDkPLaE7hAZc3rZ5329hV1yteevSPBn
iNiGdBaG44XjecYW6FzYQEcNeEvdRy6DezCY14J7DEvfZuVi92kARqNRWS2vOr0b
R0fGJFiGJjf031717ZwEATCdZuBsOmTeXkvy/gY+jAhde0Qelyw3RkiQq0Ptj/eT
nwxO4ue5jjksXrWtXOKe/NxvHz0jn02x+hgizak86SrAFA+8ExolGxP+UUjmz3t1
/pH5Jxe1gzS4srrfrtqrxiRRRcGgKE1yjwSGZ1tzAWnO540qH24lRPpuznZYO7JQ
BP8R/UcKHqwcCDQdBrfqxZXcqLu0fAttcm2rHT5c8noQGK5rTqM=
=FNWx
-----END PGP SIGNATURE-----

--- End Message ---
--- Begin Message ---
Source: scikit-optimize
Source-Version: 0.10.2-5
Done: Colin Watson <[email protected]>

We believe that the bug you reported is fixed in the latest version of
scikit-optimize, which is due to be installed in the Debian FTP archive.

A summary of the changes between this version and the previous one is
attached.

Thank you for reporting the bug, which will now be closed.  If you
have further comments please address them to [email protected],
and the maintainer will reopen the bug report if appropriate.

Debian distribution maintenance software
pp.
Colin Watson <[email protected]> (supplier of updated scikit-optimize package)

(This message was generated automatically at their request; if you
believe that there is a problem with it please contact the archive
administrators by mailing [email protected])


-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA512

Format: 1.8
Date: Sun, 19 Oct 2025 03:12:24 +0100
Source: scikit-optimize
Architecture: source
Version: 0.10.2-5
Distribution: unstable
Urgency: medium
Maintainer: Debian Python Team <[email protected]>
Changed-By: Colin Watson <[email protected]>
Closes: 1117992
Changes:
 scikit-optimize (0.10.2-5) unstable; urgency=medium
 .
   * Team upload.
   * Fix mixin order for scikit-learn >= 1.6 (closes: #1117992).
Checksums-Sha1:
 516f5c701251f416c3a5d95040bd52db3292fa8b 2954 scikit-optimize_0.10.2-5.dsc
 d7801f5b58b7d69b940f3d620a5a68060f04e66b 19496 scikit-optimize_0.10.2-5.debian.tar.xz
 ab347275edaee0b0e12f592ba3e6941aa9af6aaa 4834716 scikit-optimize_0.10.2-5.git.tar.xz
 19905d0616775e88dcc77d9a5d19c09ee07b8f83 18254 scikit-optimize_0.10.2-5_source.buildinfo
Checksums-Sha256:
 025f6a258e97c7ba66930eee70f95e3e6ce9cc21577456b8411a94c58018bf0b 2954 scikit-optimize_0.10.2-5.dsc
 c21abcf22b9e9acba6b5a73c38b29a54eb09727c958a368e1cf3a2ba05151937 19496 scikit-optimize_0.10.2-5.debian.tar.xz
 010b4da6b066c6ad2f96712992bd405ab41c27e7924bcbfbf99c72a150861afa 4834716 scikit-optimize_0.10.2-5.git.tar.xz
 9c132e0cb08ad74785a771aeeb544433ace072cb5ea50ebaa223f314818afca4 18254 scikit-optimize_0.10.2-5_source.buildinfo
Files:
 f4d495b8de9d05e19f53e85593c12e78 2954 python optional scikit-optimize_0.10.2-5.dsc
 3c2ba76f6210029d1a79c1bb97ede226 19496 python optional scikit-optimize_0.10.2-5.debian.tar.xz
 b42cedb84be02a5cf8e8053d3dbdf9eb 4834716 python optional scikit-optimize_0.10.2-5.git.tar.xz
 d38db9bdaf31ee0e9eb4cf0900a528bc 18254 python optional scikit-optimize_0.10.2-5_source.buildinfo
Git-Tag-Info: tag=de35958886a5ae506bfeae7f898ec98ace52db49 fp=ac0a4ff12611b6fccf01c111393587d97d86500b
Git-Tag-Tagger: Colin Watson <[email protected]>

-----BEGIN PGP SIGNATURE-----

iQIzBAEBCgAdFiEEN02M5NuW6cvUwJcqYG0ITkaDwHkFAmj0SXUACgkQYG0ITkaD
wHmHwxAAkiXm9yTbkzE90ivs3dC/MWgiEDkBxI3GJgDXYk7wvPaT9l55Ycx8FTxT
Ob7T11kogCJEn6bMK6xRdTO1MJozIr5KJ4Jpp2tdtfGOyKgRYdwdrvQ2skOJzpDN
URa/W3S6lzjH8OhudoId2snG7QDvS1f3n4OvlYVbz+5bueD+K9LRzc8a7AukDdIz
GOliVa9uf4hHl780N0c+smmIZNg9k017w4mDqS3GD4fcpzeWVeSTrQuvWC0nnDB2
PDAvZezvmWwk+qED2lTdKqgYKLbayDd1/kRleWd0GubumJXitc3E4XnF6u8DBDNI
Dkyf/G3JCkYLDR0IzxeFRTzoO4b61enuJFeU2MvlL4RwzWHZL7KTkPMtNSd7Kk2g
/pcmQgiGGQbnJ0mOEl4RK/wlzsqsS0p4oegiKZ7NTAaxZuGmI+SolRT6N1O6j6LO
tRWNo4tssoF+QQiHEpd4In4MuhhcilALor7xMBjjrOSInhqK1dVdHVdvhL1ZqrPZ
hx9qDyJVu/2rAn7SvO+LEN7KAdSck+rJYNn8v3ffNKDFgjnYG2Cc9CDuLjUO4YWH
43lBHHEs10z5wBoGRdNg6sRvMsk0DcAaLzuD4ojfodSkQlKewZelUmHQ6akPUU7V
KqGdkPNXUF8iy43AlRAFEUhm+vHyRXbPjW+M0fVeLj3YgIq4Res=
=S6J4
-----END PGP SIGNATURE-----



--- End Message ---
