On Tue, Oct 31, 2017 at 12:18 AM,  <a...@riptideio.com> wrote:
> The EntityProperty table was created by the application itself, at a time
> before the introduction of alembic for the migrations. So should I
> create the EntityProperty table in the migration script itself? If so,
> how do the two models connect: the one sitting in the application and
> the one in the migration script?


when you're in a migration script, you have to be careful when
referring to code that is in the main part of the application.  The
code that the migration script refers to can never change, unless the
migration files are also changed.

Consider your project has been going for three years.  It has 50
migration scripts.   In year one, your model had three tables.  In
year two, the model was entirely refactored to have seven tables.
Then in year three, it was refactored yet again, now it has only four
tables.  For each of these major versions, the design of the schema is
totally different.

Now consider if you look at the latest version of the application,
with four tables.  In the alembic/versions/ directory, the first 12 of
those migration scripts were written back when the code had three
tables.  The next 20 of them are coded against the seven table
version.   Then the last 18 are against the four table version.

Someone who downloads and installs your app right now, then tries to
migrate the database from the very beginning, needs to run *all 50
scripts*, unless you have pruned them.  So if the first 32 scripts all
pull in the model from the main codebase, they will break.   They will
be referring to tables and columns that no longer exist.

This is why, in a migration script, you need to refer to tables and
optionally ORM models that are *local to that script*: the
application's models can change on the outside, but the migration
script's own definitions stay fixed for as long as the script exists.

A quick way to get a working model within your migration script is to
only name the Table objects you need, and locally define just the
columns you need.  Or, use reflection with a local MetaData object so
that you get a snapshot of the Table metadata that is in place while
that script runs.   If you want to use the ORM, you can combine this
with the automap feature
(http://docs.sqlalchemy.org/en/latest/orm/extensions/automap.html) to
get a quick reflected ORM model.
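
For instance, here is a rough sketch of the reflect-plus-automap idea.
Note this is illustrative only: an in-memory SQLite database and a
made-up "entityproperties" table (column names borrowed from this
thread) stand in for the migration's real op.get_bind() connection.

```python
import sqlalchemy as sa
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session

# Stand-in for the migration's connection; in a real script you'd use
# op.get_bind() instead of creating an engine.
engine = sa.create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(sa.text(
        "CREATE TABLE entityproperties ("
        "id INTEGER PRIMARY KEY, _ptype VARCHAR, _cov_increment FLOAT)"))
    conn.execute(sa.text(
        "INSERT INTO entityproperties VALUES (1, 'AI', 0.5), (2, 'BV', 0.5)"))

# Reflect only the tables we need into a MetaData *local to this
# script*: a snapshot of the schema exactly as it exists right now,
# independent of whatever the application's models look like today.
metadata = sa.MetaData()
metadata.reflect(engine, only=["entityproperties"])

# Build a throwaway ORM model from the reflected table via automap.
Base = automap_base(metadata=metadata)
Base.prepare()
EntityProperty = Base.classes.entityproperties

# Run the data migration with the local, reflected model.
session = Session(engine)
for prop in session.query(EntityProperty).filter(
        ~EntityProperty._ptype.in_(["AI", "AO", "AV"])):
    prop._cov_increment = None
session.commit()
```

The key point is that nothing above imports from the application; the
model is built from whatever the database looks like at this revision.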

However, if it were me, I'd use plain Table/Column/MetaData objects to
do data migrations within a migration file, and I'd keep the
definitions minimal: just the specific tables and columns I need.  I
wouldn't use the application's models, because those models are not a
fixed constant relative to a migration file.
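
As a sketch of that approach (again, the table and column names here
are just taken from this thread, and an in-memory SQLite engine stands
in for op.get_bind()), a script-local Table carries only the columns
the migration actually touches:

```python
import sqlalchemy as sa

# Stand-in for the migration's connection (op.get_bind() in a real script).
engine = sa.create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(sa.text(
        "CREATE TABLE entityproperties ("
        "id INTEGER PRIMARY KEY, _ptype VARCHAR, _cov_increment FLOAT)"))
    conn.execute(sa.text(
        "INSERT INTO entityproperties VALUES (1, 'AI', 0.5), (2, 'BV', 0.5)"))

# Script-local Table: only the columns this migration needs, frozen to
# the schema as it exists at this revision.  No application import.
metadata = sa.MetaData()
entityproperties = sa.Table(
    "entityproperties", metadata,
    sa.Column("id", sa.Integer, primary_key=True),
    sa.Column("_ptype", sa.String),
    sa.Column("_cov_increment", sa.Float),
)

# The data migration itself, in Core: clear _cov_increment for rows
# whose _ptype is not one of the analog types.
with engine.begin() as conn:
    conn.execute(
        entityproperties.update()
        .where(~entityproperties.c["_ptype"].in_(["AI", "AO", "AV"]))
        .values(_cov_increment=None)
    )
```

A single Core UPDATE like this also avoids loading every row into an
ORM session just to null out one column.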

I hope this explains the problem better.

>
> Thanks
>
> On Tuesday, October 31, 2017 at 9:27:59 AM UTC+5:30, Mike Bayer wrote:
>>
>> On Mon, Oct 30, 2017 at 11:23 PM,  <am...@riptideio.com> wrote:
>> > I have 2 migration scripts.
>> >
>> > The first one (the base) is this:
>> >
>> > from models import EntityProperty
>> > from contextlib import contextmanager
>> >
>> > # revision identifiers, used by Alembic.
>> > revision = 'ecbde8fa83e3'
>> > down_revision = None
>> > branch_labels = None
>> > depends_on = None
>> >
>> > from alembic import op       # noqa
>> > import sqlalchemy as sa      # noqa
>> >
>> >
>> > @contextmanager
>> > def session_scope():
>> >    session = sa.orm.session.Session(bind=op.get_bind())
>> >    try:
>> >        yield session
>> >        session.commit()
>> >    except:
>> >         session.rollback()
>> >         raise
>> >    finally:
>> >         session.close()
>> >
>> >
>> > def _upgrade(session):
>> >    properties = session.query(EntityProperty).filter(
>> >       ~EntityProperty._ptype.in_(["AI", "AO", "AV"])
>> >    ).all()
>> >
>> >    for _property in properties:
>> >       _property._cov_increment = None
>> >
>> >
>> > def upgrade():
>> >    with session_scope() as session:
>> >     _upgrade(session)
>> >
>> >
>> > def downgrade():
>> >     pass
>> >
>> >
>> > This script queries the EntityProperty table and does some operations on it.
>> >
>> > The second migration script is
>> >
>> > revision = 'ab47480a7be7'
>> > down_revision = u'ecbde8fa83e3'
>> > branch_labels = None
>> > depends_on = None
>> >
>> > from alembic import op       # noqa
>> > import sqlalchemy as sa      # noqa
>> > from sqlalchemy.sql import expression  # noqa
>> >
>> >
>> > def upgrade():
>> >    # add bacnet_enable to EP
>> >    with op.batch_alter_table(u'entityproperties', schema=None) as batch_op:
>> >       batch_op.execute("PRAGMA foreign_keys=OFF;")
>> >       batch_op.add_column(
>> >           sa.Column(
>> >               'bacnet_object', sa.Boolean(), nullable=True,
>> >               server_default=expression.true()))
>> >
>> >
>> > def downgrade():
>> >     with op.batch_alter_table(u'entityproperties', schema=None) as batch_op:
>> >         batch_op.drop_column('bacnet_object')
>> >
>> >
>> > This script operates in batch mode and adds a column called
>> > 'bacnet_object' to the EntityProperty table.
>> >
>> > Now when I try to downgrade to base with 'downgrade -r base', I get
>> > the following error:
>> >
>> > no such column: entityproperties.bacnet_object
>> >
>> > This error occurs while executing script 1; the execution of script 2
>> > proceeds without any issues. When I look at the generated SQL for
>> > script 1, I find this in the SQL statement:
>> >
>> > entityproperties.bacnet_object AS entityproperties_bacnet_object
>> >
>> > The downgrade of script 2 does indeed remove the bacnet_object column
>> > from the entityproperties table. Why is the SQL generated for script 1
>> > still looking for the bacnet_object column in the EntityProperty table?
>>
>> This would mean your mappings used in script #1 still include the
>> bacnet_object column.
>>
>> You haven't given me any information on how EntityProperty gets
>> created but it would appear that it runs table reflection upon import.
>>   This import would occur before any migration script runs, so when
>> script 2 drops the column you now have a stale mapping.
>>
>> The general antipattern here is that you're referring to your
>> application's models within a migration script.   Since each migration
>> file represents a point in time of your schema design, one that might
>> be very different from what the current model is, to run any SQL
>> queries in a migration file you should create Table metadata directly
>> in the migration that explicitly includes the tables and columns you
>> need to query against, which would always exist as of that migration
>> version.
>>
>>
>> >
>> > --
>> > You received this message because you are subscribed to the Google
>> > Groups
>> > "sqlalchemy-alembic" group.
>> > To unsubscribe from this group and stop receiving emails from it, send
>> > an
>> > email to sqlalchemy-alembic+unsubscr...@googlegroups.com.
>> > For more options, visit https://groups.google.com/d/optout.
>
