Doug Hellmann wrote:
Excerpts from Clark Boylan's message of 2017-04-19 08:10:43 -0700:
On Wed, Apr 19, 2017, at 05:54 AM, Julien Danjou wrote:
Hoy,

So the Gnocchi gate is all broken (agaiiiin) because it depends on "pbr",
and some new release of oslo.* depends on pbr!=2.1.0.

Neither Gnocchi nor Oslo cares about whatever bug in pbr 2.1.0 got it
banished by the requirements gods. The bug does not prevent pbr from being
used, e.g. to install the software or to get version information. But the
pin does break anything that is not in OpenStack, because, well, pip
installs the latest pbr (2.1.0) and then oslo.* is unhappy about it.
It actually breaks everything, including OpenStack. Shade and others are
affected by this as well. The specific problem here is that PBR is a
setup_requires, which means it gets installed by easy_install before
anything else. This means that the requirements restrictions are not
applied to it (nor are the constraints). So you get the latest PBR from
easy_install, and then later, when something checks the requirements
(pkg_resources console script entrypoints?), it breaks because the latest
PBR isn't allowed.
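
For reference, this is roughly what the setup-time dependency looks like in
a pbr-consuming project (a minimal sketch; the project itself is
hypothetical):

    # setup.py -- minimal sketch of a pbr-based project
    import setuptools

    # Because pbr sits in setup_requires, setuptools resolves it via
    # easy_install at setup time, before pip's -r/-c handling ever runs,
    # so a pbr!=2.1.0 restriction in requirements or constraints is
    # never applied to it.
    setuptools.setup(
        setup_requires=['pbr'],
        pbr=True,
    )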

We need to stop pinning PBR, and more generally stop pinning any
setup_requires (there are a few more of those now, since setuptools itself
is starting to use that mechanism to list its deps rather than bundling
them).

So I understand the culprit is probably pip's installation scheme, and we
can blame it until we fix it. I'm also trying to push pbr 2.2.0 out to
avoid the entire issue.
Yes, a new release of PBR undoing the "pin" is the current sane step
forward for fixing this particular issue. Monty also suggested that we
gate global requirements changes on a check that changes do not pin any
setup_requires.
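
A minimal sketch of what such a gate check could look like (the
SETUP_REQUIRES set and the file handling here are assumptions, not the
actual openstack/requirements tooling):

    # check_setup_requires.py -- hypothetical gate check: fail when a
    # requirements file pins a known setup-time dependency.
    import re
    import sys

    # Assumed set of setup_requires packages; a real check would need to
    # maintain this list alongside global-requirements.
    SETUP_REQUIRES = {'pbr', 'setuptools'}

    def pinned_setup_requires(lines):
        for line in lines:
            req = line.split('#', 1)[0].strip()  # drop comments and blanks
            if not req:
                continue
            name = re.split(r'[\s<>=!~;\[]', req, maxsplit=1)[0].lower()
            # Flag any version specifier on a setup-time package, since
            # easy_install will not honor the restriction anyway.
            if name in SETUP_REQUIRES and re.search(r'[<>=!~]', req):
                yield req

    if __name__ == '__main__':
        bad = list(pinned_setup_requires(open(sys.argv[1])))
        if bad:
            sys.exit('setup_requires must stay unpinned: %s' % ', '.join(bad))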

But for the future, could we stop updating the requirements in Oslo libs
for no good reason, just because some random OpenStack project hit a bug
somewhere?

For example, I've had requirements updates disabled on tooz¹ for more than
a year now, and that has not broken *anything* in the meantime, which
suggests this process creates more problems than solutions. Oslo libs
doing these automatic updates introduce more pain than benefit for their
consumers (at least for those not in OpenStack).
You are likely largely shielded by the constraints list here, which is
derived from the global requirements list. Basically, by using constraints
you get distilled global requirements, and even without taking part in the
requirements updates you'd be shielded from breakage when installing via
something like devstack or another deployment method that uses
constraints.
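
For a consumer, that looks something like the following (file contents
illustrative; pip's -c flag applies a constraints file on top of the loose
requirements):

    # requirements.txt -- the loose range the project declares
    pbr>=1.8

    # upper-constraints.txt -- the exact, gate-tested version
    pbr===2.0.0

    $ pip install -c upper-constraints.txt -r requirements.txt

The constraints file never adds packages; it only pins the versions of
whatever the requirements pull in, which is why consumers outside the sync
still benefit from it.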

So if we care about Oslo users outside OpenStack, I beg us to stop this
craziness. If we don't, we'll just spend time getting rid of Oslo over the
long term…
I think we know from experience that just stopping (e.g. reverting to the
situation we had before requirements and constraints) would lead to
sadness. Installations would frequently be impossible due to some
unresolvable error in dependency resolution. Do you have an alternative in
mind? Perhaps we loosen the in-project requirements and explicitly state
that the constraints are known to work because they are tested, and that
users should install with constraints? That would also give users control
to manage their own constraints lists if they wish. Maybe we do this in
libraries while continuing to be more specific in applications?

At the meeting in Austin, the requirements team accepted my proposal
to stop syncing requirements updates into projects, as described
in https://etherpad.openstack.org/p/ocata-requirements-notes

We haven't been able to find anyone to work on the implementation, though.
It is my understanding that Tony did contact the Telemetry and Swift
teams, who are most interested in this area of change, about devoting some
resources to the tasks outlined in the proposal.

Doug


Wasn't there also some decision made in Austin (?) where we as a group stated something along the lines of co-installability not being as important as it once was (and maybe not even practical, or what people care about anymore, anyway)?

With kolla becoming more popular (tripleo, I think, is using it, and ...) and the isolated per-application environments its containers create, I wonder which parts of global-requirements are still valid (as a concept) and which aren't.

I do remember the days of free-for-all requirements (or requirements sometimes just put/stashed in devstack vs. elsewhere), which I don't really want to go back to; but if we finally all agree that co-installability isn't what people actually do and/or care about (anymore?), then maybe we can re-think some things?

I personally still like being able to know that some set of requirements works for a given project X in a given release Z (as tested by the gate); I am not really concerned about whether the same set of requirements also works for some project Y (also in release Z). If others agree with this, then perhaps we just need to store those requirements, and the last *constraints* tested, inside each project (instead of in a central requirements repository), as sketched below?
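
Concretely, that might mean each project carrying something like this in
its own tree (layout hypothetical):

    project-x/
        requirements.txt       # loose ranges project X declares
        upper-constraints.txt  # exact versions last tested by X's gate

The gate would then refresh the per-project constraints file on each
successful run, instead of syncing everything from a central requirements
repository.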

Sadly, I don't know if that summit discussion (I think we had an etherpad?) ever went anywhere. Probably because, IMHO, it's overwhelming to take into account the number of things that would have to be undone or changed to accomplish such a goal, due to <history> (but perhaps it's time we just tried).

My 4 cents,

-Josh
