Oh good to know! Scrap what I wrote then.
On Fri, Oct 19, 2018 at 9:08 AM Ash Berlin-Taylor wrote:
> echo 'pandas==2.1.3' > constraints.txt
>
> pip install -c constraints.txt apache-airflow[pandas]
>
> That will ignore whatever we specify in setup.py and use 2.1.3.
>
echo 'pandas==2.1.3' > constraints.txt
pip install -c constraints.txt apache-airflow[pandas]
That will ignore whatever we specify in setup.py and use 2.1.3.
https://pip.pypa.io/en/latest/user_guide/#constraints-files
(sorry for the brief message)
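For reference, the constraints file used above is just a requirements-style list of pins. It installs nothing by itself; it only constrains versions of packages that something else (here, `apache-airflow[pandas]`) pulls in. A minimal sketch of its contents:

```text
# constraints.txt — applies only to packages requested elsewhere,
# overriding whatever version range setup.py declares
pandas==2.1.3
```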
> On 19 Oct 2018, at 17:02, Maxime
> releases in pip should have stable (pinned deps)
I think that's an issue. When setup.py (the only reqs that setuptools/pip
knows about) is restrictive, there's no way to change that in your
environment, install will just fail if you deviate (are there any
hacks/solutions around that that I don't
On 10/17/18, 12:24 AM, "William Pursell" wrote:
I'm jumping in a bit late here, and perhaps have missed some of the
discussion, but I haven't seen any mention of the fact that pinning
versions in setup.py isn't going to solve the problem. Perhaps it's
my lack of experience with
n both in parallel. And the case
> >> with
> >> > constraints is a nice workaround for someone who actually need
> specific
> >> > (even newer) version of specific package in their environment.
> >> >
> >> > Maybe
>> > fork
>> > apache-airflow or use constraints file of pip?
>> >
>> > J.
>> >
>> >
>> > On Tue, Oct 9, 2018 at 5:56 PM Matt Davis wrote:
>> >
>> > > Erik, the Airflow task execution code itself of course must run
>>
on dependencies at all
> > > except to get Airflow running. When running Python operators that's not
> > the
> > > case (as you already deal with).
> > >
> > > - Matt
> > >
> > > On Tue, Oct 9, 2018 at 2:45 AM EKC (Erik Cederstrand)
> > > wrote:
>
> > an environment where Airflow is not installed?
> > >
> > >
> > > Kind regards,
> > >
> > > Erik
> > >
> > >
> > > From: Matt Davis
> > > Sent: Monday, October 8, 2018 10:13:34 PM
>
> > From: Matt Davis
> > Sent: Monday, October 8, 2018 10:13:34 PM
> > To: dev@airflow.incubator.apache.org
> > Subject: Re: Pinning dependencies for Apache Airflow
> >
> > It sounds like we can get the best of both worlds
installed?
>
>
> Kind regards,
>
> Erik
>
>
> From: Matt Davis
> Sent: Monday, October 8, 2018 10:13:34 PM
> To: dev@airflow.incubator.apache.org
> Subject: Re: Pinning dependencies for Apache Airflow
>
> It sounds like we can
my experience so far.
> >>>>>
> >>>>> Additionally, Airflow is an open system - if you have very specific
> >>> needs
> >>>>> for requirements, you might actually - in the very same way with
> >>>>> pip-tools/poetry - upgrade
bit counterintuitively I think tools like pip-tools/poetry help
> >>> you
> >>>> to
> >>>>> catch up faster in many cases. That is at least my experience so far.
> >>>>>
> >>>>> Additionally, Airflow is an open system - if
>>>> of
>>>>> democratise dependency management. It should be as easy as `pip-compile
>>>>> --upgrade` or `poetry update` and you will get all the
>>> "non-conflicting"
>>>>> latest dependencies in your local fork (and poetry especially seems to
>>> do
t. It should be as easy as
> `pip-compile
> > > > > --upgrade` or `poetry update` and you will get all the
> > > "non-conflicting"
> > > > > latest dependencies in your local fork (and poetry especially
> seems to
> > > do
> > > > > all the
grade eventually
> > > to
> > > > get it faster in master. You can even downgrade in case newer
> > dependency
> > > > causes problems for you in similar way. Guided by the tools, it's much
> > > > faster than figuring the versions out by yourself.
> >
> > > upgrade/downgrade dependencies in your own fork, and mention how to
> > locally
> > > release Airflow as a package, I think your case could be covered even
> > > better than now. What do you think ?
> > >
> > > J.
> > >
> >
>
> > On Fri, Oct 5, 2018 at 2:34 PM EKC (Erik Cederstrand)
> > wrote:
> >
> > > For us, exact pinning of versions would be problematic. We have DAG
> code
> > > that shares direct and indirect dependencies with Airflow, e.g. lxml,
> > > requests,
lxml,
> > requests, pyhive, future, thrift, tzlocal, psycopg2 and ldap3. If our DAG
> > code for some reason needs a newer point release due to a bug that's
> fixed,
> > then we can't cleanly build a virtual environment containing the fixed
> > version. For us, it's already
fixed
> version. For us, it's already a problem that Airflow has quite strict (and
> sometimes old) requirements in setup.py.
>
> Erik
>
> From: Jarek Potiuk
> Sent: Friday, October 5, 2018 2:01:15 PM
> To: dev@airflow.incubator.apache.org
dev@airflow.incubator.apache.org
Subject: Re: Pinning dependencies for Apache Airflow
I think one solution to release approach is to check as part of automated
Travis build if all requirements are pinned with == (even the deep ones)
and fail the build in case they are not for ALL versions (including
dev). And of course we should document the approach of releases/upgrades
etc. If we
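The automated check described here could be sketched as follows. This is a rough illustration, not Airflow's actual CI code; the regex treats anything that is not an exact `name==version` line (ignoring comments and pip options like `-r`/`-c`) as a failure.

```python
import re

# Matches "name==version" with an optional trailing comment.
PINNED = re.compile(r"^\s*[A-Za-z0-9._-]+\s*==\s*[^=\s]\S*\s*(?:#.*)?$")

def unpinned(requirements_text):
    """Return requirement lines that are not exact '==' pins."""
    bad = []
    for line in requirements_text.splitlines():
        stripped = line.strip()
        # Skip blanks, comments, and pip options such as -r / -c includes.
        if not stripped or stripped.startswith(("#", "-")):
            continue
        if not PINNED.match(stripped):
            bad.append(stripped)
    return bad
```

A CI job would run this over the generated requirements file and fail the build if the returned list is non-empty.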
One thing to point out here.
Right now if you `pip install apache-airflow==1.10.0` in a clean environment it
will fail.
This is because we pin flask-login to 0.2.1 but flask-appbuilder is >= 1.11.1,
so that pulls in 1.12.0 which requires flask-login >= 0.3.
So I do think there is maybe
Never tried poetry before, but it looks really good (it passes also my
aesthetic filter for slick design of the webpage). Quick look and it passes
a lot of criteria I have in my mind:
- works on all platforms
- easily installable with pip
- uses standard PyPI repositories by default (but
Hi all,
Have you considered looking into poetry[1]? I’ve had really good experiences
with it, we specifically introduced it into our project because we were getting
version conflicts, and it resolved them just fine. It properly supports
semantic versioning, so package versions have upper bounds
I suggest not adopting pipenv. It has a nice "first five minutes" demo but
it's simply not baked enough to depend on as a swap-in pip replacement. We
are in the process of removing it after finding several serious bugs in our
POC of it.
On Thu, Oct 4, 2018, 20:30 Alex Guziel
wrote:
> FWIW,
FWIW, there's some value in using virtualenv with Docker to isolate
yourself from your system's Python.
It's worth noting that requirements files can link other requirements
files, so that would make groups easier, but note that pip in one run has no
guarantee of transitive dependencies not
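The linking of requirements files mentioned above looks like this (file names and versions are made up for illustration):

```text
# base.txt
apache-airflow==1.10.0

# ci.txt — pulls in everything from base.txt, then adds its own pins
-r base.txt
pytest==3.8.2
```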
Hi Jarek,
Thanks for bringing this up. I missed the discussion on Slack since I'm on
holiday, but I saw the thread and it was way too interesting, and therefore
this email :)
This is actually something that we need to address asap. Like you mention,
we saw it earlier that specific transient
Thanks Jakob!
I think that this is a huge risk of Slack.
I am not against Slack as a support channel, but it is a slippery slope to
have more and more decisions/conversations happening there, contrary to
what we hope to achieve with the ASF.
When we are starting to discuss issues of development,
Thanks for pointing it out Jakob.
I am still very fresh in the ASF community and learning the ropes and
etiquette and code of conduct. Apologies for my ignorance.
I re-read the conduct and FAQ now again - with more understanding and will
pay more attention to wording in the future. As you
You should run `pip check` to ensure no conflicts. Pip does not do this on
its own.
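For illustration, roughly what `pip check` verifies can be sketched with `importlib.metadata` plus the `packaging` library (assumed to be installed). This is a simplification of pip's own logic, not its implementation:

```python
from importlib.metadata import PackageNotFoundError, distributions, version
from packaging.requirements import InvalidRequirement, Requirement

def broken_requirements():
    """List installed distributions whose declared requirements are unmet."""
    problems = []
    for dist in distributions():
        for req_str in dist.requires or []:
            try:
                req = Requirement(req_str)
            except InvalidRequirement:
                continue  # skip malformed metadata
            # Evaluate environment markers; extras are treated as inactive.
            if req.marker and not req.marker.evaluate({"extra": ""}):
                continue
            try:
                installed = version(req.name)
            except PackageNotFoundError:
                problems.append(f"{dist.metadata['Name']} requires {req.name}, "
                                "which is not installed")
                continue
            if not req.specifier.contains(installed, prereleases=True):
                problems.append(f"{dist.metadata['Name']} requires {req_str}, "
                                f"but {installed} is installed")
    return problems
```

The real `pip check` also handles extras and name normalisation, but the sketch shows why a pinned `flask-login==0.2.1` next to `flask-appbuilder>=1.11.1` can leave an environment inconsistent without any install-time error.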
On Thu, Oct 4, 2018 at 9:20 AM Jarek Potiuk
wrote:
> Great that this discussion already happened :). Lots of useful things in
> it. And yes - it means pinning in requirements.txt - this is how pip-tools
> works.
>
Great that this discussion already happened :). Lots of useful things in
it. And yes - it means pinning in requirements.txt - this is how pip-tools
works.
J.
Principal Software Engineer
Phone: +48660796129
On Thu, 4 Oct 2018, 18:14 Arthur Wiedmer wrote:
> Hi Jarek,
>
> I will +1 the discussion
Hi Jarek,
I will +1 the discussion Dan is referring to and George's advice.
I just want to double check we are talking about pinning in
requirements.txt only.
This offers the ability to
pip install -r requirements.txt
pip install --no-deps airflow
For a guaranteed install which works.
Several
Relevant discussion about this:
https://github.com/apache/incubator-airflow/pull/1809#issuecomment-257502174
On Thu, Oct 4, 2018 at 11:25 AM Jarek Potiuk
wrote:
> TL;DR; A change is coming in the way how dependencies/requirements are
> specified for Apache Airflow - they will be fixed rather
TL;DR; A change is coming in the way how dependencies/requirements are
specified for Apache Airflow - they will be fixed rather than flexible (==
rather than >=).
This is a follow-up to the Slack discussion we had with Ash and Kaxil -
summarising what we propose we'll do.
*Problem:*
During the last few
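To make the proposed change concrete, the flexible and fixed styles side by side (version numbers are illustrative only):

```text
# Flexible (current): pip may resolve a different version on every install
flask-appbuilder>=1.11.1

# Fixed (proposed): installs are reproducible; upgrades become deliberate
flask-appbuilder==1.11.1
```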