[sqlalchemy] Re: Streamlined dictionary numpy arrays storage in a one-to-many relationship

2017-07-31 Thread Jonathan Vanasco
Using `bulk_insert_mappings` is much more performant than using the ORM and custom collections. Right now you are bypassing ORM object creation and state management. Your operations might be 10x slower with a collection. IIRC, `bulk_insert_mappings` will iterate over the payload like this: …
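
The quoted snippet cuts off here. As context, a minimal self-contained sketch of the pattern under discussion; the `Sensor`/`Reading` models, column names, and session setup are illustrative assumptions, not code from the thread:

```python
import numpy as np
from sqlalchemy import create_engine, Column, Integer, Float, ForeignKey
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relationship, sessionmaker

Base = declarative_base()

class Sensor(Base):
    __tablename__ = "sensors"
    id = Column(Integer, primary_key=True)
    readings = relationship("Reading", backref="sensor")

class Reading(Base):
    __tablename__ = "readings"
    id = Column(Integer, primary_key=True)
    sensor_id = Column(Integer, ForeignKey("sensors.id"), nullable=False)
    value = Column(Float, nullable=False)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

sensor = Sensor()
session.add(sensor)
session.flush()  # assigns sensor.id without committing

values = np.random.rand(400000)  # stand-in for the ~400k readings

# bulk_insert_mappings takes plain dicts instead of ORM objects, skips
# per-object identity-map and unit-of-work bookkeeping, and batches rows
# with identical keys into executemany-style INSERTs.
session.bulk_insert_mappings(
    Reading,
    [{"sensor_id": sensor.id, "value": float(v)} for v in values],
)
session.commit()
```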

[sqlalchemy] Re: Streamlined dictionary numpy arrays storage in a one-to-many relationship

2017-07-30 Thread Ruben Di Battista
Thanks, this is in fact what I have now implemented as a method on the Sensor class, also exploiting `bulk_insert_mappings`, since the number of readings is quite large (400k each time): `def store_readings(self, session): if not(self.id): session.add(self) sessi` …
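
The snippet's code is truncated after `sessi`. A plausible reconstruction of the described method, written as a free function against the hypothetical `Sensor`/`Reading` models sketched above; the flush-then-bulk-insert continuation is a guess, not the poster's actual code:

```python
import numpy as np

def store_readings(sensor, session, values):
    """Hypothetical reconstruction: persist readings for one sensor.

    In the thread this is a method on Sensor; shown here as a free
    function so it reads standalone.
    """
    if not sensor.id:
        session.add(sensor)
        session.flush()  # populate sensor.id so the children can reference it
    session.bulk_insert_mappings(
        Reading,
        [{"sensor_id": sensor.id, "value": float(v)} for v in values],
    )

store_readings(sensor, session, np.random.rand(400000))
session.commit()
```

Flushing (rather than committing) inside the method keeps the parent insert and the bulk child insert in the same transaction, so a failure rolls both back together.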

[sqlalchemy] Re: Streamlined dictionary numpy arrays storage in a one-to-many relationship

2017-07-28 Thread Jonathan Vanasco
Unless you need to use all the readings immediately, have you considered just making a custom `def` under the Sensor model, and then inserting all the readings via SQLAlchemy Core? That would allow you to insert them without creating ORM objects, which people using numpy and a lot of data often l…
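
A minimal sketch of the Core-level insert being suggested here, reusing the hypothetical `Sensor`/`Reading` models and session from the sketch above (illustrative, not code from the thread):

```python
import numpy as np

# Reading.__table__ is the Core Table behind the mapped class; passing a
# list of dicts to execute() turns the INSERT into a single executemany,
# with no Reading ORM objects ever created or tracked.
session.execute(
    Reading.__table__.insert(),
    [{"sensor_id": sensor.id, "value": float(v)}
     for v in np.random.rand(400000)],
)
session.commit()
```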

[sqlalchemy] Re: Streamlined dictionary numpy arrays storage in a one-to-many relationship

2017-07-27 Thread Ruben Di Battista
`sensor.values['value'] = values` This is a typo. Should be this: `sensor.readings['value'] = values`

On Thursday, July 27, 2017 at 4:50:23 PM UTC+2, Ruben Di Battista wrote:
> Hello, I'm trying to figure out a streamlined way to store some children
> values that are stored in numpy arrays in …
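
The original question is cut off here. For contrast with the bulk and Core approaches above, a plain ORM version of storing numpy array values as one-to-many children, again using the hypothetical models from the first sketch; each array element becomes a tracked ORM object, which is the per-object overhead the replies above avoid:

```python
import numpy as np

values = np.random.rand(1000)

# One Reading instance per array element: every object passes through the
# identity map and the unit of work at flush time.
sensor2 = Sensor()
sensor2.readings = [Reading(value=float(v)) for v in values]
session.add(sensor2)  # save-update cascade adds the readings too
session.commit()
```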