Hi Jonathan,
That query inserts around 1 million rows these days (in roughly 8 minutes on the remote DB, while the profiling data in this thread are from localhost) 2-3 times a day. This is expected to increase by a factor of around 10x in the coming months/year.
I'm personally not targeting any
on that… Thank you very much for your
support!
On 2 November 2018 at 18:41:32, Mike Bayer (mike...@zzzcomputing.com) wrote:
On Fri, Nov 2, 2018 at 1:08 PM Ruben Di Battista
wrote:
>
> Thanks Mike as always,
>
> I'm diving a bit more in the problem. The solution they deci
ger number of function calls. To me this seems related to
the args escaping from MySQLdb cursors.py. Is there any better way to
optimize that INSERT query?
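One pattern that usually keeps per-row Python overhead down is passing the driver a list of parameter dicts in bounded chunks, so each round trip becomes one `executemany()`. A minimal, runnable sketch of that pattern using the stdlib `sqlite3` driver (the table and column names are made up; with MySQLdb the idea is the same, modulo the paramstyle):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE passage (time REAL, elevation REAL)")

# Stand-in for the ~1M-row payload from the thread
rows = [{"time": float(i), "elevation": i * 0.1} for i in range(10_000)]

CHUNK = 1_000  # keep each executemany batch bounded in memory
with conn:
    for start in range(0, len(rows), CHUNK):
        conn.executemany(
            "INSERT INTO passage (time, elevation) VALUES (:time, :elevation)",
            rows[start:start + CHUNK],
        )

count = conn.execute("SELECT COUNT(*) FROM passage").fetchone()[0]
```

With SQLAlchemy Core, passing the list of dicts directly to `session.execute(insert_query, rows)` lets the dialect choose `executemany()` itself; the chunking above is only about bounding memory per batch.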
On Friday, November 2, 2018 at 5:23:51 PM UTC+1, Mike Bayer wrote:
>
> On Fri, Nov 2, 2018 at 11:17 AM Ruben Di Battista
> >
session.execute it, I obtain performance benefits.
On Friday, November 2, 2018 at 4:17:43 PM UTC+1, Ruben Di Battista wrote:
>
> Hello,
>
> I have a huge insert of the type:
>
> ```
> session.execute(
> insert_query,
> [
> {
>
Hello,
I have a huge insert of the type:
```
session.execute(
    insert_query,
    [
        {
            'time': times[i],
            'elevation': elevation[i],
            'azimuth': azimuth[i],
            'doppler': doppler[i],
        }
        for i in range(len(times))
    ]
)
```
https://rdb.is
On 30 August 2018 at 17:09:33, Mike Bayer (mike...@zzzcomputing.com) wrote:
On Thu, Aug 30, 2018 at 6:32 AM, Ruben Di Battista
wrote:
> Ehi Mike, thank you :).
>
> I just went ahead in implementing a custom type for a Pyth
fn.__self__):
ValueError: The truth value of an array with more than one element is
ambiguous. Use a.any() or a.all()
That is because a `numpy` array cannot be evaluated as a boolean. Is there a way to monkey-patch that `if no_self` check so that the right method is used for the array to be truthy (so `.any()`
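For reference, the ambiguity the traceback complains about comes from calling `bool()` on a multi-element array; `.any()` / `.all()` are the explicit alternatives. A small sketch of both the failure and the fix:

```python
import numpy as np

arr = np.array([0.0, 1.5, 3.2])

# bool(arr) raises ValueError: the truth value of a multi-element array is ambiguous
try:
    bool(arr)
    ambiguous = False
except ValueError:
    ambiguous = True

any_true = arr.any()  # True: at least one element is non-zero
all_true = arr.all()  # False: the first element is zero
```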
What about using DateTime type?
On 23 July 2018 at 11:01:10, Yingchen Zhang (cevin.che...@gmail.com) wrote:
The TIMESTAMP data type just has one param, which is
Hello,
I need to store a matrix into the database and I was evaluating the
possibility to have a column of JSON type in MySQL to store it. What I
would like to achieve is the possibility of operating on a numpy array when
manipulating that column on Python while keeping a "meaningful" data
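The usual direction for this is a `TypeDecorator` over the JSON column that converts on the way in and out. Leaving the SQLAlchemy wiring aside, the conversion itself is just a round trip through `tolist()`; a sketch of the two helpers such a type would call (function names are hypothetical):

```python
import json
import numpy as np

def matrix_to_json(arr):
    """Serialize a numpy array to a JSON string via nested Python lists."""
    return json.dumps(arr.tolist())

def json_to_matrix(text):
    """Rebuild a numpy array from its JSON representation."""
    return np.asarray(json.loads(text))

m = np.arange(6, dtype=float).reshape(2, 3)
roundtrip = json_to_matrix(matrix_to_json(m))
```

Note that `tolist()` drops the dtype; if you need exact dtype/shape fidelity, serialize those alongside the data.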
Hello,
I have some SQLAlchemy-persisted instances of objects that have some
relationships coming from a parallel execution with multiprocessing. When I
reduce the results coming from the several processes, I need to merge some
relationships (`satellite` and `ground_station` objects in the
>
> Mon, 5 Mar 2018 at 18:42, Ruben Di Battista <rubendibatti...@gmail.com>:
>
>> I have a table that is storing a huge amount of numerical details about
>> my application. I have huge INSERT queries (also millions of rows for each
>> of them) of float val
I have a table that is storing a huge amount of numerical details about my
application. I have huge INSERT queries (also millions of rows for each of
them) of float values that are made with core API, while the rest of
application logic is ORM. The precision I need on each float is not big,
I confirm what I said.
The run in multiprocessing was regenerating instances because, after deserialization, they were getting new IDs. I tried to implement a custom __hash__, but it seems that SQLAlchemy does not pick it up.
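That matches how SQLAlchemy works: the identity map is keyed on (mapped class, primary key), not on `__hash__`, so a custom `__hash__` only affects plain Python sets and dicts. A minimal sketch of that distinction (class name hypothetical, no SQLAlchemy involved):

```python
class Satellite:
    def __init__(self, norad_id):
        self.norad_id = norad_id

    # A custom __hash__/__eq__ deduplicates in plain sets and dicts...
    def __hash__(self):
        return hash(self.norad_id)

    def __eq__(self, other):
        return isinstance(other, Satellite) and self.norad_id == other.norad_id

a, b = Satellite(25544), Satellite(25544)
deduped = len({a, b})            # the set honours __hash__/__eq__
still_two = len({id(a), id(b)})  # ...but they remain two distinct objects in memory
```

To SQLAlchemy, `a` and `b` are two pending instances until they share the same primary key, which is why deserialized copies with fresh IDs get re-inserted.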
What I did was disabling the backref cascade for `Satellite` and
11:53 AM, Simon King <si...@simonking.org.uk> wrote:
> Yes, if you can't find where you are creating new Satellite instances,
> I'd probably stick an assert statement in Satellite.__init__ and see
> where it gets triggered.
>
> Simon
>
> On Mon, Jan 15, 2018 at 10:34 AM,
aos=aos, los=los, tca=tca,
deltaT=deltaT)
`self` should be a reference to the instance of `Satellite` already loaded
from DB. I will try to dive more into the code...
Thanks a lot for the kind help of all of you,
On Monday, January 15, 2018 at 10:06:24 AM UTC+1, Simon K
ow example of use, where you are
> > doing things that make objects and you'd like them to not be
> > persisted. If you need to create unique objects in memory without
> > persisting, you just need to store them in some dictionary that sets
> > up
ationProxy):
    instance = instance.ensure_unicity(session)
On Thursday, January 4, 2018 at 6:05:38 PM UTC+1, Ruben Di Battista wrote:
>
> Hello,
> I'm writing a satellite passage scheduler that has a database persistence
> layer to store the scheduled passages.
>
> The DB sc
@abstractmethod
def ensure_unicity(self, session):
    raise NotImplementedError()
In this way I'm making SQLA store everything at instance init time, if I understand correctly.
Could you please help me understand how to improve the situation?
Thanks in advance.
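A concrete `ensure_unicity` is essentially a get-or-create lookup on a natural key. A sketch of the shape, with a plain dict standing in for the session (all names hypothetical; with a real `Session` the `setdefault` would be a query on the natural key followed by an add):

```python
from abc import ABC, abstractmethod

class UniqueMixin(ABC):
    @abstractmethod
    def ensure_unicity(self, registry):
        raise NotImplementedError

class GroundStation(UniqueMixin):
    def __init__(self, name):
        self.name = name

    def ensure_unicity(self, registry):
        # Return the already-known instance with the same natural key,
        # otherwise register self as the canonical one.
        return registry.setdefault(('ground_station', self.name), self)

registry = {}
a = GroundStation('kiruna').ensure_unicity(registry)
b = GroundStation('kiruna').ensure_unicity(registry)
same = a is b
```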
On Thursday, Januar
Hello,
I'm writing a satellite passage scheduler that has a database persistence
layer to store the scheduled passages.
The DB schema is organized as follows:
- A table storing the satellites (using NORAD No as Primary Key)
- A table storing the ground stations where to compute the passages
Hello,
I have a heavy method of a class (not mapped with SQLA) that takes as
argument some SQLA instances (with relationships), does some operation on
them (without using any SQLA feature), and returns a subset of them. The
logic was working before I added the SQLAlchemy persistence layer
Hello,
I have a tables with two ForeignKeys. When I remove the relation on one
side, SQLAlchemy sets to 'NULL' the related ForeignKey, but the related row
is not considered orphaned since it still has the other ForeignKey. Is there a way to make SQLAlchemy fulfill the `delete-orphan` cascade
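For context, `delete-orphan` considers an object an orphan only once it has been de-associated from *every* relationship that names the cascade, so with two ForeignKeys both links have to be removed. A runnable single-parent sketch of the cascade itself (model names hypothetical, in-memory SQLite, SQLAlchemy 1.4+ imports):

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base, relationship

Base = declarative_base()

class Satellite(Base):
    __tablename__ = 'satellite'
    id = Column(Integer, primary_key=True)
    passages = relationship('Passage', back_populates='satellite',
                            cascade='all, delete-orphan')

class Passage(Base):
    __tablename__ = 'passage'
    id = Column(Integer, primary_key=True)
    satellite_id = Column(Integer, ForeignKey('satellite.id'))
    satellite = relationship('Satellite', back_populates='passages')

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)

with Session(engine) as session:
    sat = Satellite(passages=[Passage()])
    session.add(sat)
    session.commit()
    sat.passages = []   # de-associated from its only delete-orphan parent
    session.commit()    # -> the row is DELETEd instead of getting a NULL FK
    remaining = session.query(Passage).count()
```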
Thanks, this is in fact what I implemented now as a method in the Sensor
class, exploiting also the bulk_insert_mappings since the number of
readings are quite a lot (400k each time):
def store_readings(self, session):
    if not self.id:
        session.add(self)
sensor.values['value'] = values
This is a typo. Should be this:
sensor.readings['value'] = values
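Since `bulk_insert_mappings()` wants plain dicts, the numpy arrays need converting on the way in; leaving numpy scalars in the dicts can confuse some DBAPIs. A sketch of building the mappings list (`sensor_id` and the `Reading` class are hypothetical):

```python
import numpy as np

values = np.linspace(0.0, 1.0, 400)  # stand-in for the ~400k readings
sensor_id = 7                        # hypothetical parent primary key

# Convert each numpy scalar to a plain Python float for the DBAPI
mappings = [
    {'sensor_id': sensor_id, 'value': float(v)}
    for v in values
]
# session.bulk_insert_mappings(Reading, mappings)  # Reading: hypothetical mapped class
```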
On Thursday, July 27, 2017 at 4:50:23 PM UTC+2, Ruben Di Battista wrote:
>
> Hello, I'm trying to figure out a streamlined way to store some children
> values that are stored in num
Hello, I'm trying to figure out a streamlined way to store some children
values that are stored in numpy arrays in Python. As example let's assume I
have a parent object that is a sensor that has some readings associated to
it:
class Sensor(object):
    __tablename__ = 'sensor'
    id =
I'm playing a bit with composite Association Proxies and stumbled upon this
discussion. Just wanted to point out that here:
@event.listens_for(Session, "after_attach")
def after_attach(session, instance):
    # when UserCourse objects are attached to a Session,
    # figure out what
not super familiar with the automagic of
SQLAlchemy...
On Mon, Jul 24, 2017 at 1:50 AM, Mike Bayer <mike...@zzzcomputing.com>
wrote:
> On Sun, Jul 23, 2017 at 11:26 AM, Ruben Di Battista
> <rubendibatti...@gmail.com> wrote:
> > Hello,
> >
> > I'm trying to introduce
Hello,
I'm trying to introduce database persistence into an already existent class
hierarchy. That mean I have in a separate module all the class representing
the models of my application (without SQL interaction methods), then I have
in another module the SQL schema and the mapping (so I'm
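Keeping the domain classes free of SQLAlchemy while mapping them from a separate module is exactly what imperative (classical) mapping is for. A sketch using `registry.map_imperatively()` (SQLAlchemy 1.4+; older versions used the `mapper()` function instead; names here are hypothetical):

```python
from sqlalchemy import Column, Integer, MetaData, String, Table
from sqlalchemy.orm import registry

# Plain domain class, as it would live in the models module with no SQL imports
class Satellite:
    def __init__(self, norad_id, name):
        self.norad_id = norad_id
        self.name = name

# Schema and mapping, as they would live in the separate persistence module
metadata = MetaData()
satellite_table = Table(
    'satellite', metadata,
    Column('norad_id', Integer, primary_key=True),
    Column('name', String(64)),
)

mapper_registry = registry()
mapper_registry.map_imperatively(Satellite, satellite_table)

# The class keeps its own __init__ and stays usable as before
sat = Satellite(norad_id=25544, name='ISS')
```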