Hey Simon,
I am indeed getting back duplicates of aggregate ids. I reworked my query
to use EXISTS as you suggested, and now I get back the correct result
count. Thanks so much for your suggestion and for saving my day! I can now
move ahead on my work ticket.
Many, many thanks again. Much appreciated!
This is the same problem: you're writing a query that joins 3 tables
together, and then applying a "LIMIT 20" to that query. If you look
carefully at your 20 rows of psql output, I expect you'll see the same
aggregates_id appear more than once. There are fewer than 20 distinct
Aggregate objects.
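The effect Simon describes, and the EXISTS rewrite that fixed it, can be sketched with plain SQL. This is a minimal illustration using Python's stdlib sqlite3 rather than the poster's Postgres/ORM setup; the table and column names (aggregate, item, aggregate_id) are assumptions, not taken from the thread:

```python
import sqlite3

# Illustrative schema: one Aggregate has many Items.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE aggregate (id INTEGER PRIMARY KEY);
    CREATE TABLE item (id INTEGER PRIMARY KEY,
                       aggregate_id INTEGER REFERENCES aggregate(id));
    INSERT INTO aggregate (id) VALUES (1), (2), (3);
    INSERT INTO item (id, aggregate_id)
        VALUES (10, 1), (11, 1), (12, 2), (13, 2), (14, 3);
""")

# Joining along the one-to-many relationship repeats each aggregate id
# once per matching item, so the result rows contain duplicates.
joined = [r[0] for r in conn.execute(
    "SELECT a.id FROM aggregate a "
    "JOIN item i ON i.aggregate_id = a.id ORDER BY a.id")]
print(joined)    # [1, 1, 2, 2, 3] -- duplicate aggregate ids

# Rewriting with EXISTS keeps one row per aggregate, so the row count
# matches the number of distinct Aggregate objects.
distinct = [r[0] for r in conn.execute(
    "SELECT a.id FROM aggregate a WHERE EXISTS "
    "(SELECT 1 FROM item i WHERE i.aggregate_id = a.id) ORDER BY a.id")]
print(distinct)  # [1, 2, 3] -- one row per aggregate
```

Because EXISTS only filters the aggregate rows rather than multiplying them, a LIMIT applied afterwards counts aggregates, not joined rows.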
Hey Simon,
Thanks so much for replying to my question. I reworked my code to use the
SQLAlchemy ORM and removed Flask and paginate so I can narrow down the
issue. My models now extend declarative_base.
engine = create_engine('postgresql://postgres:postgres@localhost:5400/postgres')
Session
"paginate" is not an SQLAlchemy function, so you'd be better off
asking the author of whatever is providing that feature.
However, I would guess that maybe paginate is naively applying
something like "LIMIT 20" to the query. This doesn't work properly
when you join along a one-to-many relationship.
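To see why a naive LIMIT breaks pagination here, consider that the LIMIT counts joined rows, not parent objects. A minimal sketch with stdlib sqlite3 (parent/child names and the page size of 4 are illustrative assumptions, not from the thread):

```python
import sqlite3

# Hypothetical data: 3 parents, each with 4 children.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE parent (id INTEGER PRIMARY KEY);
    CREATE TABLE child (id INTEGER PRIMARY KEY,
                        parent_id INTEGER REFERENCES parent(id));
""")
conn.executemany("INSERT INTO parent (id) VALUES (?)",
                 [(p,) for p in (1, 2, 3)])
conn.executemany("INSERT INTO child (id, parent_id) VALUES (?, ?)",
                 [(p * 10 + n, p) for p in (1, 2, 3) for n in range(4)])

# A naive "page" of 4 rows over the joined query: the LIMIT is applied
# to the multiplied join result, so the whole page is one parent.
page = [r[0] for r in conn.execute(
    "SELECT p.id FROM parent p JOIN child c ON c.parent_id = p.id "
    "ORDER BY p.id LIMIT 4")]
print(page)            # [1, 1, 1, 1]
print(len(set(page)))  # 1 distinct parent in a "page of 4"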