Re: [sqlalchemy] Increase max query length!

2018-11-02 Thread Ruben Di Battista
Hi Jonathan, That query inserts around 1 million rows nowadays (in more or less 8 minutes on a remote DB, while the profiling data in this thread are on localhost) 2-3 times a day. This is expected to increase by a factor of around 10x in the next months/year. I'm personally not targeting any

Re: [sqlalchemy] Increase max query length!

2018-11-02 Thread Ruben Di Battista
on that… Thank you very much for your support! On 2 November 2018 at 18:41:32, Mike Bayer (mike...@zzzcomputing.com) wrote: On Fri, Nov 2, 2018 at 1:08 PM Ruben Di Battista wrote: > > Thanks Mike as always, > > I'm diving a bit more into the problem. The solution they deci

Re: [sqlalchemy] Increase max query length!

2018-11-02 Thread Ruben Di Battista
ger number of function calls. To me this seems related to the argument escaping in MySQLdb's cursors.py. Is there a better way to optimize that INSERT query? On Friday, November 2, 2018 at 5:23:51 PM UTC+1, Mike Bayer wrote: > > On Fri, Nov 2, 2018 at 11:17 AM Ruben Di Battista > >

[sqlalchemy] Re: Increase max query length!

2018-11-02 Thread Ruben Di Battista
session.execute it, I obtain performance benefits. On Friday, November 2, 2018 at 4:17:43 PM UTC+1, Ruben Di Battista wrote: > > Hello, > > I have a huge insert of the type: > > ``` > session.execute( > insert_query, > [ > { >

[sqlalchemy] Increase max query length!

2018-11-02 Thread Ruben Di Battista
Hello, I have a huge insert of the type:

```
session.execute(
    insert_query,
    [
        {
            'time': times[i],
            'elevation': elevation[i],
            'azimuth': azimuth[i],
            'doppler': doppler[i],
```
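The call pattern quoted above can be sketched as a minimal, self-contained example. Table and column names mirror the snippet; the data values and the in-memory SQLite engine are illustrative stand-ins for the real MySQL setup.

```python
# Sketch of the executemany-style bulk INSERT described in this thread.
import sqlalchemy as sa

engine = sa.create_engine("sqlite://")  # in-memory stand-in for the real DB
metadata = sa.MetaData()

passages = sa.Table(
    "passages",
    metadata,
    sa.Column("id", sa.Integer, primary_key=True),
    sa.Column("time", sa.Float),
    sa.Column("elevation", sa.Float),
    sa.Column("azimuth", sa.Float),
    sa.Column("doppler", sa.Float),
)
metadata.create_all(engine)

# A list of dicts makes SQLAlchemy use the DBAPI executemany() fast path.
rows = [
    {"time": float(i), "elevation": 45.0, "azimuth": 180.0, "doppler": 0.1}
    for i in range(3)
]

with engine.begin() as conn:
    conn.execute(passages.insert(), rows)

with engine.connect() as conn:
    count = len(conn.execute(passages.select()).fetchall())
```

Passing a list of parameter dictionaries (rather than executing one statement per row) is what lets the driver batch the INSERT.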

Re: [sqlalchemy] Custom JSON type that acts as a numpy array on Python side

2018-08-30 Thread Ruben Di Battista
https://rdb.is On 30 August 2018 at 17:09:33, Mike Bayer (mike...@zzzcomputing.com) wrote: On Thu, Aug 30, 2018 at 6:32 AM, Ruben Di Battista wrote: > Ehi Mike, thank you :). > > I just went ahead in implementing a custom type for a Pyth

Re: [sqlalchemy] Custom JSON type that acts as a numpy array on Python side

2018-08-30 Thread Ruben Di Battista
fn.__self__): ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all() That is because a `numpy` array cannot be checked for truth. Is there a way to monkey patch that `if no_self` in order to use the right method for the array to be true (so .any()
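The error quoted above comes from using a multi-element numpy array in a boolean context; the fix the message itself suggests is to reduce the array explicitly:

```python
# Reproducing the ambiguity behind the quoted ValueError, and the
# explicit reductions that resolve it.
import numpy as np

a = np.array([1, 0, 2])

# "if a:" would raise: "The truth value of an array with more than one
# element is ambiguous. Use a.any() or a.all()" -- so be explicit:
any_true = bool(a.any())   # True if at least one element is nonzero
all_true = bool(a.all())   # False here, since one element is zero
```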

Re: [sqlalchemy] how to create timestamp without time zone column with postgresql

2018-07-23 Thread Ruben Di Battista
What about using the DateTime type? RdB http://rdb.is On 23 July 2018 at 11:01:10, Yingchen Zhang (cevin.che...@gmail.com) wrote: the data type TIMESTAMP just has one param, which is
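A sketch of the suggestion: SQLAlchemy's generic DateTime compiles to TIMESTAMP WITHOUT TIME ZONE on PostgreSQL by default, and DateTime(timezone=True) to TIMESTAMP WITH TIME ZONE. Table and column names here are illustrative.

```python
# Compile DDL against the PostgreSQL dialect to see how DateTime's
# timezone flag maps to TIMESTAMP WITH/WITHOUT TIME ZONE.
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql

metadata = sa.MetaData()
events = sa.Table(
    "events",
    metadata,
    sa.Column("id", sa.Integer, primary_key=True),
    # renders as TIMESTAMP WITHOUT TIME ZONE on PostgreSQL
    sa.Column("created_naive", sa.DateTime(timezone=False)),
    # renders as TIMESTAMP WITH TIME ZONE
    sa.Column("created_aware", sa.DateTime(timezone=True)),
)

ddl = str(sa.schema.CreateTable(events).compile(dialect=postgresql.dialect()))
```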

[sqlalchemy] Custom JSON type that acts as a numpy array on Python side

2018-07-23 Thread Ruben Di Battista
Hello, I need to store a matrix in the database and I was evaluating the possibility of having a column of JSON type in MySQL to store it. What I would like to achieve is the possibility of operating on a numpy array when manipulating that column on the Python side, while keeping a "meaningful" data
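A hedged sketch of the custom type this thread converges on: a TypeDecorator over JSON that serializes a numpy array on the way in and rebuilds the ndarray on the way out. The class name is illustrative, and an in-memory SQLite engine stands in for the MySQL database in the question.

```python
# TypeDecorator that stores numpy arrays as JSON and returns np.ndarray.
import numpy as np
import sqlalchemy as sa
from sqlalchemy.types import TypeDecorator

class NumpyJSON(TypeDecorator):
    """Store numpy arrays as JSON, hand back np.ndarray on load."""
    impl = sa.JSON
    cache_ok = True

    def process_bind_param(self, value, dialect):
        # nested Python lists are JSON-serializable; ndarray is not
        return None if value is None else np.asarray(value).tolist()

    def process_result_value(self, value, dialect):
        return None if value is None else np.asarray(value)

engine = sa.create_engine("sqlite://")
metadata = sa.MetaData()
matrices = sa.Table(
    "matrices",
    metadata,
    sa.Column("id", sa.Integer, primary_key=True),
    sa.Column("data", NumpyJSON),
)
metadata.create_all(engine)

with engine.begin() as conn:
    conn.execute(matrices.insert(), [{"data": np.eye(2)}])

with engine.connect() as conn:
    loaded = conn.execute(matrices.select()).fetchone()[1]
```

Note that mutations made in place on the loaded array are not tracked; reassigning the attribute (or using a mutability extension) is needed for changes to be persisted.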

[sqlalchemy] Merge instances of objects without dropping attributes that are not persisted

2018-04-02 Thread Ruben Di Battista
Hello, I have some SQLAlchemy-persisted instances of objects that have some relationships coming from a parallel execution with multiprocessing. When I reduce the results coming from the several processes, I need to merge some relationships (`satellite` and `ground_station` objects in the

Re: [sqlalchemy] Emit INSERT query with float values reducing number of decimal digits

2018-03-05 Thread Ruben Di Battista
> > Mon, 5 Mar 2018 at 18:42, Ruben Di Battista <rubendibatti...@gmail.com>: > >> I have a table that is storing a huge amount of numerical details about >> my application. I have huge INSERT queries (also millions of rows for each >> of them) of float val

[sqlalchemy] Emit INSERT query with float values reducing number of decimal digits

2018-03-05 Thread Ruben Di Battista
I have a table that is storing a huge amount of numerical details about my application. I have huge INSERT queries (millions of rows each) of float values that are made with the Core API, while the rest of the application logic is ORM. The precision I need on each float is not big,
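One pragmatic approach to the question above (an assumption for illustration, not necessarily what the thread settled on) is to round the floats client-side before handing the parameter dicts to the bulk INSERT, so the rendered parameter strings are shorter. The helper name and the 4-digit precision are arbitrary choices for the example.

```python
# Trim float precision in a list of INSERT parameter dicts.
def trim_rows(rows, ndigits=4):
    """Round every float value in a list of parameter dicts."""
    return [
        {k: round(v, ndigits) if isinstance(v, float) else v
         for k, v in row.items()}
        for row in rows
    ]

rows = [{"id": 1, "value": 0.123456789}, {"id": 2, "value": 2.718281828}]
trimmed = trim_rows(rows)
# trimmed would then be passed to session.execute(insert_query, trimmed)
```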

Re: [sqlalchemy] Re: Temporarily disable DB persistence for optimization routine

2018-01-18 Thread Ruben Di Battista
I confirm what I said. The run in multiprocessing was regenerating instances because, after deserialization, they were getting new IDs. I tried to implement a custom __hash__ but it seems that SQLAlchemy does not pick it up. What I did was disable the backref cascade for `Satellite` and

Re: [sqlalchemy] Re: Temporarily disable DB persistence for optimization routine

2018-01-17 Thread Ruben Di Battista
11:53 AM, Simon King <si...@simonking.org.uk> wrote: > Yes, if you can't find where you are creating new Satellite instances, > I'd probably stick an assert statement in Satellite.__init__ and see > where it gets triggered. > > Simon > > On Mon, Jan 15, 2018 at 10:34 AM,

Re: [sqlalchemy] Re: Temporarily disable DB persistence for optimization routine

2018-01-15 Thread Ruben Di Battista
aos=aos, los=los, tca=tca, deltaT=deltaT) `self` should be a reference to the instance of `Satellite` already loaded from DB. I will try to dive more into the code... Thanks a lot for the kind help of all of you, On Monday, January 15, 2018 at 10:06:24 AM UTC+1, Simon K

Re: [sqlalchemy] Re: Temporarily disable DB persistence for optimization routine

2018-01-13 Thread Ruben Di Battista
ow example of use, where you are > > doing things that make objects and you'd like them to not be > > persisted. If you need to create unique objects in memory without > > persisting, you just need to store them in some dictionary that sets > > up

[sqlalchemy] Re: Temporarily disable DB persistence for optimization routine

2018-01-11 Thread Ruben Di Battista
ationProxy): instance = instance.ensure_unicity(session) On Thursday, January 4, 2018 at 6:05:38 PM UTC+1, Ruben Di Battista wrote: > > Hello, > I'm writing a satellite passage scheduler that has a database persistence > layer to store the scheduled passages. > > The DB sc

[sqlalchemy] Re: Temporarily disable DB persistence for optimization routine

2018-01-11 Thread Ruben Di Battista
abstractmethod def ensure_unicity(self, session): return NotImplementedError() In this way I'm making SQLA store everything at instance init time, if I understand correctly. Could you please help me understand how to improve the situation? Thanks in advance. On Thursday, Januar

[sqlalchemy] Temporarily disable DB persistence for optimization routine

2018-01-04 Thread Ruben Di Battista
Hello, I'm writing a satellite passage scheduler that has a database persistence layer to store the scheduled passages. The DB schema is organized as follows: - A table storing the satellites (using the NORAD No as primary key) - A table storing the ground stations for which to compute the passages
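The two tables described above can be sketched declaratively. Only the tables mentioned in this snippet are modeled, and any column beyond the NORAD number is an illustrative assumption.

```python
# Minimal declarative sketch of the satellite / ground-station schema.
import sqlalchemy as sa
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Satellite(Base):
    __tablename__ = "satellite"
    norad_id = sa.Column(sa.Integer, primary_key=True)  # NORAD No as PK
    name = sa.Column(sa.String)

class GroundStation(Base):
    __tablename__ = "ground_station"
    id = sa.Column(sa.Integer, primary_key=True)
    name = sa.Column(sa.String)

engine = sa.create_engine("sqlite://")
Base.metadata.create_all(engine)
tables = set(sa.inspect(engine).get_table_names())
```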

[sqlalchemy] Multiprocessing with SQLAlchemy and pickling errors

2017-08-13 Thread Ruben Di Battista
Hello, I have a heavy method of a class (not mapped with SQLA) that takes as arguments some SQLA instances (with relationships), performs some operations on them (without using any SQLA feature), and returns a subset of them. The logic was working before I added the SQLAlchemy persistence layer
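A common pattern for this class of pickling problem (a sketch under stated assumptions, not necessarily the resolution reached in the thread) is to detach instances from the Session before sending them to worker processes, then merge the results back in the parent. The model and names are illustrative.

```python
# Detach ORM instances before pickling them across process boundaries,
# then merge the returned copies back into a live Session.
import pickle
import sqlalchemy as sa
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Item(Base):
    __tablename__ = "item"
    id = sa.Column(sa.Integer, primary_key=True)
    name = sa.Column(sa.String)

engine = sa.create_engine("sqlite://")
Base.metadata.create_all(engine)

# expire_on_commit=False keeps attributes loaded after commit, so the
# detached instance can be pickled without touching the DB again.
session = Session(engine, expire_on_commit=False)
item = Item(name="x")
session.add(item)
session.commit()

session.expunge(item)            # detach: no live Session reference left
payload = pickle.dumps(item)     # now safe to hand to a worker process

# back in the parent process: merge the copy into a Session
restored = session.merge(pickle.loads(payload))
```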

[sqlalchemy] 'orphan-delete' when one ForeignKey is null on table with multiple ForeignKeys

2017-08-07 Thread Ruben Di Battista
Hello, I have a table with two ForeignKeys. When I remove the relation on one side, SQLAlchemy sets the related ForeignKey to NULL, but the related row is not considered orphaned since it still has the other ForeignKey. Is there a way to make SQLAlchemy fulfill the 'orphan-delete' cascade

[sqlalchemy] Re: Streamlined dictionary numpy arrays storage in a one-to-many relationship

2017-07-30 Thread Ruben Di Battista
Thanks, this is in fact what I implemented now as a method in the Sensor class, also exploiting bulk_insert_mappings since the number of readings is quite large (400k each time): def store_readings(self, session): if not(self.id): session.add(self)
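A hedged reconstruction of the store_readings() idea quoted above: bulk_insert_mappings() takes plain dicts and skips per-instance ORM overhead, which matters at ~400k rows per call. The models and column names are illustrative, not taken from the thread.

```python
# bulk_insert_mappings() for a large batch of child rows.
import numpy as np
import sqlalchemy as sa
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Sensor(Base):
    __tablename__ = "sensor"
    id = sa.Column(sa.Integer, primary_key=True)

class Reading(Base):
    __tablename__ = "reading"
    id = sa.Column(sa.Integer, primary_key=True)
    sensor_id = sa.Column(sa.ForeignKey("sensor.id"))
    value = sa.Column(sa.Float)

engine = sa.create_engine("sqlite://")
Base.metadata.create_all(engine)

values = np.linspace(0.0, 1.0, 5)

with Session(engine) as session:
    sensor = Sensor()
    session.add(sensor)
    session.flush()  # assign sensor.id before the readings reference it
    session.bulk_insert_mappings(
        Reading,
        [{"sensor_id": sensor.id, "value": float(v)} for v in values],
    )
    session.commit()
    n = session.query(Reading).count()
```

The flush() before the bulk insert mirrors the `if not(self.id): session.add(self)` guard in the quoted method: the parent must have a primary key before the child dicts can reference it.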

[sqlalchemy] Re: Streamlined dictionary numpy arrays storage in a one-to-many relationship

2017-07-27 Thread Ruben Di Battista
sensor.values['value'] = values This is a typo. Should be this: sensor.readings['value'] = values On Thursday, July 27, 2017 at 4:50:23 PM UTC+2, Ruben Di Battista wrote: > > Hello, I'm trying to figure out a streamlined way to store some children > values that are stored in num

[sqlalchemy] Streamlined dictionary numpy arrays storage in a one-to-many relationship

2017-07-27 Thread Ruben Di Battista
Hello, I'm trying to figure out a streamlined way to store some children values that are kept in numpy arrays in Python. As an example, let's assume I have a parent object that is a sensor with some readings associated with it: class Sensor(object): __tablename__ = 'sensor' id =

Re: [sqlalchemy] Inverse the mapping in a composite association proxy

2017-07-24 Thread Ruben Di Battista
I'm playing a bit with composite Association Proxies and stumbled upon this discussion. Just wanted to point out that here: @event.listens_for(Session, "after_attach") def after_attach(session, instance): # when UserCourse objects are attached to a Session, # figure out what

Re: [sqlalchemy] Association proxy in classical mapping keeping the business logic and the db schema separated

2017-07-24 Thread Ruben Di Battista
not super familiar with the automagic of SQLAlchemy... On Mon, Jul 24, 2017 at 1:50 AM, Mike Bayer <mike...@zzzcomputing.com> wrote: > On Sun, Jul 23, 2017 at 11:26 AM, Ruben Di Battista > <rubendibatti...@gmail.com> wrote: > > Hello, > > > > I'm trying to introduce

[sqlalchemy] Association proxy in classical mapping keeping the business logic and the db schema separated

2017-07-23 Thread Ruben Di Battista
Hello, I'm trying to introduce database persistence into an already existing class hierarchy. That means I have in a separate module all the classes representing the models of my application (without SQL interaction methods); then I have in another module the SQL schema and the mapping (so I'm