Re: [pygame] man oh man Java is painful

2011-11-07 Thread Mac Ryan
On Fri, 4 Nov 2011 09:45:58 -0600
Ian Mallett geometr...@gmail.com wrote:

 Good C++ is always faster than Java code.
  However, crappy C++ is slower than crappy Java code.

The flaw here is the underlying assumption that execution time is THE
criterion for evaluating code quality.

On the contrary, there are often instances where one might prefer
slower-performing code if it is more readable, better structured, or
easier to maintain.

It might well be true that the C++ implementation of any given algorithm
is faster than its Java counterpart. Just as it is true that one could
write an optimised-for-speed Java implementation of something that
outperforms a non-optimised C++ version.

Neither scenario has anything to do with the crappiness of a given
implementation, though. :o

/mac


Re: [pygame] man oh man Java is painful

2011-11-03 Thread Mac Ryan
On Thu, 03 Nov 2011 14:19:28 +1300
Greg Ewing greg.ew...@canterbury.ac.nz wrote:

 I've yet to decide whether there's a place for a language
 that combines Python's flexibility and get-out-of-your-way
 nature with static type checking and efficient code generation.
 If there is, I don't think that language exists yet.

Absolutely not sure it fits the bill... but have you had a look at go?

http://golang.org/

It's used by folks at Google, and since they are probably the biggest
sponsor of Python, there's a good chance they used some Python wisdom
in crafting it...

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-10-03 Thread Mac Ryan
On Mon, 03 Oct 2011 17:52:13 +1300
Greg Ewing greg.ew...@canterbury.ac.nz wrote:

  Could be another parameter for the
  surface (coord_system=PIXEL_CENTRE | PIXEL_BOUNDARIES)!! :P  
 
 I don't think that would be strictly necessary, because you
 would be able to get the same effect by adjusting the translation
 by 0.5 pixels.
 
 Also, providing too many options could just lead to confusion.
 Surfaces already have a bewildering collection of options --
 let's not add any more if we don't have to!

Ah, the ambiguities of written language! ;) The :P was meant to
indicate it was a joke! :)

Anyhow I agree with you that the theory should be well thought out.

So far this is a summary of how I would try to implement the feature:

1. The scaling option is... optional. It defaults to 1, in which case
   the scaling procedure is bypassed altogether by instantiating a
   class that doesn't have it.
2. Scaling only happens one way: returned rectangles are in pixels (so
   no floating-point coords).
3. Pixels and hairlines are supported without having to specify their
   size, by defaulting the unspecified size to 1/scale (see the sketch
   just below).
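
To make points 2 and 3 concrete, here is a minimal Python sketch of what
I have in mind (ScaledSurface is a made-up name, not pygame API, and the
real thing would presumably be done in C):

    import pygame

    class ScaledSurface:
        def __init__(self, size, scale):
            self.scale = scale
            w, h = size                            # size given in model units
            self.surface = pygame.Surface((int(round(w * scale)),
                                           int(round(h * scale))))

        def draw_line(self, colour, start, end, width=None):
            if width is None:
                width = 1.0 / self.scale           # point 3: hairline = 1 real pixel
            s = self.scale
            start_px = (int(round(start[0] * s)), int(round(start[1] * s)))
            end_px = (int(round(end[0] * s)), int(round(end[1] * s)))
            # point 2: the returned rect is whatever pygame reports, i.e. pixels
            return pygame.draw.line(self.surface, colour, start_px, end_px,
                                    max(1, int(round(width * s))))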

I'm still undecided on whether the positioning should use pixel
boundaries or pixel centres: I think there are good reasons to go for
either. Namely, I think centres are a more appropriate way to describe
the position of objects in a modelled space (nobody says the car is in
the far-left corner of the garage; intuitively we consider the
geometrical centre of objects as their most significant point for
positional reference). OTOH, boundaries are far more consistent with
how PyGame operates on standard surfaces.

Again: it might well be that when I have a crack at it, it will prove
beyond my capacities, but before I even start working on it (which, I
repeat, won't be immediately), I would like to know I'm moving in a
sensible direction...

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-10-02 Thread Mac Ryan
On Sun, 02 Oct 2011 14:12:21 +1300
Greg Ewing greg.ew...@canterbury.ac.nz wrote:

  If you're going to say that lines are 1-dimensional and thus
  infinitely smaller than pixels, and thus we're obliged to draw a
  thin rectangle whenever we want a line, then (a) I probably would
  not use the tool if it doesn't even support drawing lines,  
 
 There's no reason line drawing shouldn't be supported.
 A line of some specified width can easily be converted
 automatically into a filled polygon, and there can be
 a default for the width if it's not specified.

Which is indeed another way to say their thickness would be
automatically set to 1/scale... or did I get you wrong?

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-10-02 Thread Mac Ryan
On Sun, 02 Oct 2011 13:53:02 +1300
Greg Ewing greg.ew...@canterbury.ac.nz wrote:

 Now, suppose our arithmetic is a little inaccurate, and we
 actually get x = 0.499. The boundaries then come out as -0.001
 and 0.999. If we round these, we get 0.0 and 1.0 as before.
 But if we floor, we get -1.0 and 0.0, and the pixel goes
 off the screen.
 ...
 This may be the reason we're misunderstanding each other --
 you're thinking of (0, 0) as being the *centre* of the top
 left pixel in the window, whereas I'm thinking of it as the
 *top left corner* of that pixel.

Although I was not the recipient of the original answer: very
interesting! That's actually the first time I've understood why using
the upper-left coordinate system may make sense under certain
conditions. :)
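
Just to make the quoted example concrete, a quick check in plain Python
(no pygame involved):

    import math

    x = 0.499                          # pixel centre that should land on pixel 0
    left, right = x - 0.5, x + 0.5     # pixel boundaries: -0.001 and 0.999

    print(round(left), round(right))            # rounding -> 0 and 1: pixel 0, on screen
    print(math.floor(left), math.floor(right))  # flooring -> -1 and 0: pixel -1, off screen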

Yet, while I followed the maths through, I am unsure how bad small
inaccuracies might turn out to be. Those inaccuracies would essentially
be the result of scaling down a very large physical model to screen
resolution, so I truly wouldn't care if my sprite appeared 1 px to the
left of where I expected it, as long as the model performs accurately.
For game interface elements (where visual alignment might be more
relevant) I probably wouldn't use scaled surfaces anyway.

But again... interesting to debate. Could be another parameter for the
surface (coord_system=PIXEL_CENTRE | PIXEL_BOUNDARIES)!! :P

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-30 Thread Mac Ryan
On Thu, 29 Sep 2011 18:18:48 -0400
Christopher Night cosmologi...@gmail.com wrote:

 If I have a 100x100 pixel window, and I want to put a dot at a
 position (x,x), it seems to me like the dot should appear in the
 window if 0 <= x < 100. You're saying it should appear in the window
 if -0.5 <= x < 99.5. So if I request to put a point at (-0.4, -0.4),
 it will appear in the window, but if I put one at (99.6, 99.6), it
 won't. I disagree that this is the correct behavior. Intuitively, the
 point (99.6, 99.6) should be within a 100x100 canvas.

I have a different take on this: if you have float rectangles, you are
effectively treating **your rectangle as part of your model, not part of
your representation** (see my previous mail). This means that a point,
which is a dimensionless entity, shouldn't be displayed regardless of
its coordinates, given that your screen's pixels have a dimension (and
are therefore infinitely larger than a point).

I realise that this is absolutely counter-intuitive (you would be
obliged to draw points as circles or rectangles that scale to 1 px, or
to internally convert calls to draw pixels into calls to draw
rectangles), but I think that is the only mathematically correct
solution to the ambiguity.

In fact:

pixel(-0.4, -0.4) = 1-px-rect((-0.9, -0.9), (+0.1, +0.1)) =
rect-through-scaling-routine((-1, -1), (0, 0)) = not lit.

and

pixel(99.6, 99.6) = 1-px-rect((99.1, 99.1), (100.1, 100.1)) =
rect-through-scaling-routine((99, 99), (100, 100)) = lit.

[This assumes that the scaling routine - as proposed in a previous mail
- uses rounding, not truncation.]
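
In Python, the routine I am imagining for this case would look roughly
like the following (point_to_pixel_rect is just an illustrative name):

    def point_to_pixel_rect(x, y):
        # A dimensionless point is treated as a 1-px square centred on it...
        left, top, right, bottom = x - 0.5, y - 0.5, x + 0.5, y + 0.5
        # ...whose corners are then rounded (not truncated) to whole pixels.
        return round(left), round(top), round(right), round(bottom)

    print(point_to_pixel_rect(-0.4, -0.4))  # (-1, -1, 0, 0): pixel (-1, -1), not lit
    print(point_to_pixel_rect(99.6, 99.6))  # (99, 99, 100, 100): pixel (99, 99), lit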

/mac

BTW: this is part of the reason why I think that rectangles should
remain int-based / part of the representation logic.


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-30 Thread Mac Ryan
On Fri, 30 Sep 2011 09:19:12 -0400
Christopher Night cosmologi...@gmail.com wrote:

 Well pixels are just the simplest example. The ambiguity exists for
 all drawing functions, not just set_at. For instance, should a
 horizontal line extending from (10, 99.6) to (90, 99.6) appear on the
 100x100 screen or not? So I don't think that forbidding set_at solves
 it.

If you are going to use a scaled surface you *must* specify the
dimensions of what you are drawing. Just as you can see the Great Wall
of China from a low orbit but not from the moon, so you should be able
to see a line only if its thickness is larger than half of the mapped
size of a pixel.

But this is nothing new: try drawing a line of thickness 1 and
smoothscaling it down 10 times: you won't see it [or you will see a very
faint colour simulating 10% of a pixel].
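
For instance, something along these lines (a rough sketch; the exact
colour you get back depends on smoothscale's filtering):

    import pygame

    big = pygame.Surface((1000, 1000), 0, 32)
    big.fill((0, 0, 0))
    pygame.draw.line(big, (255, 255, 255), (0, 500), (999, 500), 1)  # 1-px-thick line

    small = pygame.transform.smoothscale(big, (100, 100))           # 10x down
    print(small.get_at((50, 50)))  # a very faint grey (about 10% of white), not a crisp line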

You could automate some of the functions by stating, for example, that
set_at assumes a square of side 1/scale, or that a draw_line without a
thickness parameter assumes the thickness is 1/scale...

However, if you are going to draw something that does not need to be
scaled, a better solution (at least IMO) would be to simply blit a
non-scaled image onto the scaled one.

 If you're going to say that lines are 1-dimensional and thus
 infinitely smaller than pixels, and thus we're obliged to draw a thin
 rectangle whenever we want a line, then (a) I probably would not use
 the tool if it doesn't even support drawing lines, and (b) consider
 the filled, closed polygon from (10, 99.6) to (90, 99.6) to (50,
 200). Would drawing that cause any pixels on the 100x100 screen to be
 lit up or not?

About (b): maybe I am missing the point, as it seems obvious to me that
it shouldn't. Unless you want to do something like smoothscale does (and
thus use intensity as a means to simulate fractions of a pixel), all the
points, once scaled, are outside the surface... did I misunderstand you?

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-29 Thread Mac Ryan
On Wed, 28 Sep 2011 10:28:12 +1300
Greg Ewing greg.ew...@canterbury.ac.nz wrote:

  ...because the *INPUT* destination is mapped on s1, and the blit
  method should return its *OUTPUT* in unmapped measures [pixels].  
 
 Actually that's something to be debated -- *should* it
 return post-scaling or pre-scaling coordinates? A case
 could be made for the latter, since the code using the
 scaled surface is doing all its other calculations in
 those coordinates, and having to manually transform
 back would be a nuisance.

I suppose there are various ways of using pygame, but I would still
hold that the scaling should be one-way only. Some random thoughts
about it:

1. Domain separation: pygame is about managing graphics on the screen,
   not about managing a mathematical model. A scale factor for input
   coordinates is a transparent, non-obstructive addition. Having
   returned rectangles that do not match what is on screen is not: it
   requires you to alter the way you handle graphics.
2. Code incompatibility: code that has been written assuming images are
   not scaled may not run correctly if scaling is introduced at a later
   stage: with your proposal the very **meaning** of the returned
   rectangle would be different from what it is now.
3. Scaling mess: a concrete example... In the application I am
   developing now, I scale the modelled space to match screen
   resolution, but since the objects moving in the space might have
   sub-pixel dimensions if scaled to the same resolution, I am
   often obliged to scale the sprites representing them to a different
   resolution (depending on their size) in order to be sure to have
   something distinguishable on screen. It would be quite a nuisance
   to keep track of all the different resolutions when doing maths on
   sprite rects.

The idea of float rectangles per se doesn't seem to have any of these
problems, yet the underlying thought seems to be (correct me if I am
wrong!) that you would use rectangles for keeping track of your models,
rather than of their on-screen representation [which couldn't be
anything other than integer coordinates]... Again: there are many ways
to use pygame and I can't (nor wish to) say this is wrong... It just
seems to me that working in non-integer coordinates would subtly change
the semantics of pygame, and I wanted to point that out.

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-27 Thread Mac Ryan
On Sun, 25 Sep 2011 13:37:41 +0200
René Dudfield ren...@gmail.com wrote:

 Cool, if you like that might be interesting.  Just make sure you show
 your work as you go, so we can see if there's any problems before you
 do heaps of work.

Will do. As I said, no guarantees though! :)
 
 What do you think about generating a wrapper that does it in python
 similar to the code I posted?

If I were to do it in Python, I'm sure I could do it. I have very
limited experience in C, but I would guess C would be the more sensible
choice: the scaling procedure would potentially be called rather often
(every time something is drawn on that surface), so it seems sensible
to me to try to make it perform as fast as possible... or not? :-/

On Sun, 25 Sep 2011 11:41:19 -0400
Christopher Night cosmologi...@gmail.com wrote:

 Suppose the current transformation is (x, y) -> (0.1x, 0.1y). What
 pygame Rect will the following correspond to (after all the
 coordinates are coerced to ints)?

Could you please articulate a bit more? I'm not sure I followed. From
your post it seems to me that you are imagining a two-way
transformation in which you pass model-scaled data to the draw
function (draw a 150-metre-radius circle) and you also get back
model-scaled data when you query the object (what is the size of the
bounding rect for the circle? -- 300 metres).

What I am talking about is just a one-way transformation that converts
the inputs of the drawing functions. In pseudocode:

>>> init_surface(scale_factor=0.1)
>>> draw_circle(radius=150)
>>> get_size_bounding_rect()
30x30

For me pygame is only a presentation layer; all my game-logic maths is
done elsewhere. As for the graphics maths (sprite collision, etc.), it's
just fine to do it at screen resolution, so I am happy for pygame to
keep on thinking that my circle is 30x30 pixels.

But maybe I misunderstood what you meant?

On Mon, 26 Sep 2011 09:55:01 +1300
Greg Ewing greg.ew...@canterbury.ac.nz wrote:

 The state wouldn't be global, it would be an attribute of the
 surface, like the clipping rect is now. If you don't want to
 pollute the state of your main surface, you create a subsurface
 and pass it to your drawing routines.

Indeed. I have not yet looked at the code, but my preliminary idea is
that if scale == 1.0 a standard surface should be initialised, and if it
is not, a surface which scales its inputs should be. This would spare
the effort of multiplying the parameters by 1 for non-scaled surfaces.

However, I'm thinking in object-orientation terms here, and if I am
correct, pygame is coded in regular C. ??? [Again: I will probably need
some direction here.]
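
In Python terms (and with made-up names), the selection logic I have in
mind is simply:

    import pygame

    class _InputScalingSurface(object):
        """Wraps a plain Surface and scales incoming coordinates by `scale`."""
        def __init__(self, size, scale):
            w, h = size                                 # size given in model units
            self.surface = pygame.Surface((int(round(w * scale)),
                                           int(round(h * scale))))
            self.scale = scale
        # ...drawing/blitting wrappers that multiply their coordinate
        # arguments by self.scale would go here...

    def make_surface(size, scale=1.0):
        if scale == 1.0:
            return pygame.Surface(size)   # standard surface: nothing multiplied by 1
        return _InputScalingSurface(size, scale)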

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-27 Thread Mac Ryan
On Tue, 27 Sep 2011 10:42:10 -0400
Christopher Night cosmologi...@gmail.com wrote:

 That's how I understood it... I'm not sure what I said that made it
 sound like I was thinking of a two-way transformation. Let me try
 rephrasing my question using your notation. Please tell me what the
 following would output (where I have question marks), and tell me if
 you don't understand what I mean by the pseudocode:
 
  >>> init_surface(scale_factor = 0.1)
  >>> r1 = myrect(0, 0, 108, 10)
  >>> r1.left = 4
  >>> fill_rect(r1)
  >>> get_size_bounding_rect()
 ??? x ???

Hi Chris,

sorry, reading your reply I understood that my notation was
probably not very clear. I'm still not sure I got you, but here's a more
articulated example of what I mean, using standard pygame syntax (bar
the ``scale`` parameter):

>>> s1 = pygame.surface.Surface((100,100), scale=0.1)

This will initialise a surface of 10x10 px, mapping a 100x100 square.

>>> s1.get_bounding_rect()
<rect(0, 0, 10, 10)>

...because the image is 10x10 pixels.

>>> s2 = pygame.surface.Surface((100,100), scale=0.05)
>>> s2.get_bounding_rect()
<rect(0, 0, 5, 5)>
>>> s1.blit(s2, (20,20))
<rect(2, 2, 5, 5)>

...because the *INPUT* destination is mapped onto s1's scale, and the
blit method should return its *OUTPUT* in unmapped measures [pixels].
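
A toy version of just this blit behaviour, with the scale passed in by
hand (scaled_blit is of course a made-up helper, not the API I am
proposing):

    import pygame

    def scaled_blit(dest, dest_scale, source, dest_pos):
        # The *input* position is given in the destination's model units...
        x, y = dest_pos
        pos_px = (int(round(x * dest_scale)), int(round(y * dest_scale)))
        # ...while the *output* is whatever Surface.blit returns: plain pixels.
        return dest.blit(source, pos_px)

    s1 = pygame.Surface((10, 10))   # the 10x10-px surface behind scale=0.1
    s2 = pygame.Surface((5, 5))     # the 5x5-px surface behind scale=0.05
    print(scaled_blit(s1, 0.1, s2, (20, 20)))   # -> <rect(2, 2, 5, 5)>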

One thing that might have further confused the discussion is that
somewhere in an earlier mail I mentioned rects. But all I meant was
really just the rects returned by these kinds of operations. I was not
thinking of a new class of rects with scaling included.

I'm far from asserting that the above is the best way ever to
implement scaling in PyGame, but this is how I imagined it working. I'm
very open to hearing alternative suggestions and feedback, of course!
Also: does this address your doubts?

/mac



Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-27 Thread Mac Ryan
On Tue, 27 Sep 2011 11:28:34 -0400
Christopher Night cosmologi...@gmail.com wrote:

 How can I use your system to draw a rectangle of a solid color onto a
 surface? With the regular pygame system, I would use surf.fill and
 pass it a Rect. If your system doesn't recognize rectangles of any
 sort, how can I do this? Feel free to show me in pseudocode how I
 could do it.

I suppose you mean something like:

 Surface.fill(WHITE, myrect)

is it?

If this is the case, the rectangle would be scaled (so a rect of
100x200 would be scaled to a rect of 10x20, assuming scale=0.1). The
general idea would be: any argument to a surface method whose purpose
is to indicate a measure ((x,y) tuples, explicit width/height values,
rects...) would be scaled. Any other parameter whose purpose is not
defining coordinates (colours, flags, surfaces...) wouldn't.

Again: not claiming this is the best way of doing it... just that this
is how I would try to implement it.

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-27 Thread Mac Ryan
On Tue, 27 Sep 2011 11:28:34 -0400
Christopher Night cosmologi...@gmail.com wrote:

 How can I use your system to draw a rectangle of a solid color onto a
 surface? With the regular pygame system, I would use surf.fill and
 pass it a Rect. If your system doesn't recognize rectangles of any
 sort, how can I do this? Feel free to show me in pseudocode how I
 could do it.

Addendum: it occurs to me that maybe I wasn't clear about the fact that
only surfaces would have a scale parameter. Rectangles applied to a
surface would be scaled, but the scaling is a property of the surface,
not of the rectangle: the same rectangle used to fill another surface,
scaled differently, would fill a different number of pixels.

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-27 Thread Mac Ryan
On Tue, 27 Sep 2011 15:53:03 -0400
Christopher Night cosmologi...@gmail.com wrote:

 Either way, that's fine, and not very surprising. That's exactly what
 I had in mind. So I really don't see why you weren't able to answer
 my original question. Let me try asking it one more time.

Sorry. The good news is: this time I understood what you meant. (I
think!) :)

 You say that a 100x200 rect would be scaled to 10x20 pixels. Yes,
 obviously, this makes sense. What would a 104x200 rect be scaled to?
 10x20 pixels? Or 11x20 pixels? Or would it depend on the position of
 the rect? (Remember that in addition to height and width, rectangles
 also have positions, ie left and top. You can't just say draw a
 10x20 rectangle, you also have to say where to draw it. This is
 relevant to the question, so try to keep it in mind.)

Since we are working in raster rather than in vector graphics, I would
first convert the rectangle into edge coordinates, and then scale,
possibly rounding to the nearest integer.

The idea is that position (x=4) + (width=14) = right (18) --> scaled: 2,
and not x=4 --> scaled: 0, plus width=14 --> scaled: 1 --> scaled sum: 1.
Which I suppose was the point you were trying to make in your first
post (maybe part of the reason why I did not get it is that I have never
used this before: I only needed to blit sprites onto images in my
code).
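
In numbers, with scale=0.1 and a rect at x=4 with width=14:

    scale = 0.1
    x, w = 4, 14

    # scale the edges, then take the difference:
    left, right = round(x * scale), round((x + w) * scale)
    print(left, right, right - left)            # left=0, right=2 -> 2 px wide

    # scale position and width independently:
    print(round(x * scale), round(w * scale))   # 0 and 1 -> only 1 px wide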

Does this answer your question? Usual disclaimer: I did not think this
deeply, I'm very open to alternative ways of implementing it. :)

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-25 Thread Mac Ryan
On Sun, 25 Sep 2011 14:18:25 +1300
Greg Ewing greg.ew...@canterbury.ac.nz wrote:

 Mac Ryan wrote:
 
  The behaviour that I envisage would be an
  optional keyword argument ``scale=1.0`` for rectangles (and
  surfaces).  
 
 I would say the transformation should be an attribute of the
 surface, not something that you pass into drawing calls.

But isn't this what I just said? Or did I misunderstand you? Or did you
misunderstand me? :o

On Sun, 25 Sep 2011 09:55:20 +0200
René Dudfield ren...@gmail.com wrote:

 Could you create a transform-rect-like function that returns the
 transformed state?
 
 >>> t(20, 20, 20, 20)
 (1, 1, 1, 1)

That's exactly what I have at the moment (mine is called ``sc()``, but
that's irrelevant... :)).

I still think it's a missing battery, though. I haven't browsed much
of the code of the games on pygame.org, but I wouldn't be astonished if
a large portion (probably even the majority) of them internally used a
non-scaled model, to be transformed at representation time...

Anyhow: not a big issue; it's just that this is not an uncommon feature
in other libraries, and I feel that adding it to pygame would make
pygame better, but I *can* live without it. :)

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-25 Thread Mac Ryan
On Sun, 25 Sep 2011 12:36:50 +0200
René Dudfield ren...@gmail.com wrote:

 Maybe it would be possible to implement easily in C, without too much
 code, but maybe not.  I'm not interested in it myself, since I think
 it's not too hard to layer on top - so I won't spend any effort
 implementing it. However, if you want to submit a patch that doesn't
 make the code super ugly, and doesn't cost too much performance we
 could add it.

I can have a go at it - without guarantees, as I only occasionally code
in C - but that won't be immediate anyhow, most likely not before
15/10. If you have any further suggestions before I try to make my way
through the code, I'll be happy to listen.

Best,
/mac

PS: If any other PyGamer who is more fluent in C than I am is
interested in working on this, I won't in any way feel dispossessed! ;)


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-24 Thread Mac Ryan
Thank you for all your answers. I have condensed my reactions into a
single mail:

Julian Marchant onp...@yahoo.com wrote:

 What is the purpose to having calculations done with a size that's 10
 times larger? If it's just precision, the solution could be simply to
 use floats for the calculations and convert to ints as necessary.

That was just an example. The general idea is that when you write the
physics of a game (and even more so a simulation) you normally model
reality as-is, but represent it scaled (and projected) onto your
computer screen. So: your speed would be 10.2 m/s in all the
internal calculations (and not 2.5789 px/(1/FPS)).

 One last possibility that I can think of is to scale up your graphics
 for 1000x1000 and then scale the window surface...

In my specific application I am representing objects over 6,400,000,000
square metres, with a resolution of one centimetre. Unless I'm going to
run it on the K computer, that's not a viable solution.

Christopher Night cosmologi...@gmail.com wrote:

 Yeah short answer no. However, I think the answers you've gotten from
 StackOverflow have not been terribly helpful. They seem to suggest
 don't do scaling in pygame. This is silly, I do scaling in pygame
 all the time. There's no reason you'd need to work in screen
 coordinates.

Yep, still I appreciated the fact they took the time to answer!

 I use wrappers. Let me point out there's a total of 9 functions in
 pygame.draw. You seem to be going to a lot of effort to avoid writing
 9 one-line functions. (And I usually only ever use about 3 or 4 in
 any one application.) Writing the wrappers is the best way, and I
 don't think you should have dismissed it so quickly.

I never counted the number of functions actually. Good to know. :o

 Since this is a very common problem, I wonder if there is an
 established pattern that elegantly solves this problem that I failed
 to find.

That makes two of us, then! :)

 I could simply decorate the existing middleware functions, but the
 problem is that those functions also work with the same parameters
 being list or Vector2D too, and ...

Totally on the same lines, although writing a decorator that needs to
check special cases all the time is going to have a performance hit,
and writing more than one decorator does seem silly.

 I hope you find a solution that satisfies you. It's not that there
 aren't plenty of solutions!

So far I have settled on a utility function ``sc()`` that accepts
scalars or iterables and returns them scaled. Using a deep copy and
initialisation I could also create a sort of closure, de facto using
the same syntax for all surfaces but scaling by the right amount
according to the target surface [so far I don't need it, though].
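
For reference, this is roughly what my ``sc()`` looks like (a
stripped-down sketch; in the real code the scale factor comes from my
application's settings):

    SCALE = 0.1   # model units -> pixels

    def sc(value):
        """Scale a scalar, or a tuple/list of scalars, to screen coordinates."""
        try:
            return type(value)(int(round(v * SCALE)) for v in value)
        except TypeError:              # not iterable: a plain number
            return int(round(value * SCALE))

    # e.g.: pygame.draw.circle(surface, colour, sc((200, 500)), sc(20))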

Greg Ewing greg.ew...@canterbury.ac.nz wrote:

 Don't forget that function calls are expensive in Python, as
 is doing piecemeal arithmetic.
 
 Most other graphics systems these days provide a facility for
 applying a transformation to coordinates before drawing, and
 I don't think it's unreasonable to suggest that PyGame should
 do the same. Having it done for you in C would be more efficient
 than doing it in Python.

I'm totally with you on this. The behaviour that I envisage would be an
optional keyword argument ``scale=1.0`` for rectangles (and surfaces).
In order to further optimise execution time, PyGame could initialise
objects with the scaling procedure only when ``scale != 1``.

Christopher Night cosmologi...@gmail.com wrote:

I think I missed some mail, as there is quoted text in here that I
didn't see in the original; however...

 While that is roughly true, it's a very, very general statement to
 the point where I would say that avoiding function calls on principle
 is premature optimization. Keep in mind that the operation you're
 wrapping - a draw call - is expensive in the first place. Anyway, a
 quick profile suggests that for a small circle you can potentially
 gain a 7% speedup by avoiding this function call, and a 14% speedup
 avoiding both the function and the arithmetic:

My point wasn't really that I'm lacking speed; for me it is more a
question of good code design. I think that having to scale manually at
each operation on a given surface (or having to wrap/decorate functions
all the time) is suboptimal (DRY). Besides, while speed is not my
immediate concern, I do think that at library level speed should be
(is) one of the design principles.

 You can decide whether that's worth it for you to avoid this function
 call. For me, if my game is slow enough that I have to avoid function
 calls to get reasonable performance, it means I'm doing something
 else wrong. :) If performance is the goal here, I still think it's a
 large amount of effort for a relatively modest gain.

Everything is relative: 14% is the difference between dying at 80 and
at 69 years old. I'm not so sure I would call that modest! ;)
 
 For what it's worth, I would also welcome native-pygame wrappers that
 apply a linear transformation. 

Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-24 Thread Mac Ryan
On Sat, 24 Sep 2011 14:39:31 -0500
Jake b ninmonk...@gmail.com wrote:

 Are you using numpy?

No; since I don't have to do very complex operations, or loads of
them, I went with euclid... but I'd nevertheless be interested to know
if you have suggestions involving numpy.

/mac


[pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-23 Thread Mac Ryan
Hello,

back in July I posted a question on StackOverflow (with the same title
as this mail) that so far has not received any answer. So - although I
believe the answer is quite simply NO - I thought I'd repost it here:



In my program (which uses pygame to draw objects on the screen) I have
two representations of my world:

- A physical one that I use to make all the calculations involved in the
  simulation and in which objects are located on a 1000x1000 metres
  surface.
- A visual one which I use to draw on the screen, in which my objects
  are located in a window measuring 100x100 pixels.

What I want to achieve is to be able to pass to my pygame drawing
functions (which normally accept inputs in pixels) my
physical/real-world coordinates. In other words, I would like to be
able to say:

Draw a 20m radius circle at coordinates (200m, 500m)
using the precise pygame syntax:

pygame.draw.circle(surface, (255,255,255), (200,500), 20)
and get my circle of 2 px radius centred on pixel (20,50).
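
In the absence of a native option, the obvious workaround is a thin
wrapper of this sort (draw_circle_m and the hard-coded 0.1 factor are
just for this example):

    import pygame

    SCALE = 0.1   # 1000x1000 m world drawn in a 100x100 px window

    def draw_circle_m(surface, colour, centre_m, radius_m, width=0):
        """Like pygame.draw.circle, but takes metres instead of pixels."""
        centre_px = (int(round(centre_m[0] * SCALE)),
                     int(round(centre_m[1] * SCALE)))
        return pygame.draw.circle(surface, colour, centre_px,
                                  int(round(radius_m * SCALE)), width)

    # draw_circle_m(surface, (255, 255, 255), (200, 500), 20)
    # -> a 2-px-radius circle centred on pixel (20, 50)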



If you are on SO and would like to answer there [too], I'll be happy to
dispense upvotes! ;) http://stackoverflow.com/q/6807459/146792


Re: [pygame] growing out of idle ide

2011-08-16 Thread Mac Ryan
  I don't particularly like Eclipse... seems pretty bloated especially
  for Python. For a Java developer I'm sure it's great but I don't
  get a happy feeling when I start it up, and for us game programmers
  it's all about the happy.

It is bloated, yet - after having tried a lot of different IDEs - I
still think it's the best free (as in freedom) choice if you care about
debugging, git integration, plugins for other syntaxes (yaml, json...),
etc. True: it takes 20 seconds to fire up, but then [at least for me]
it's open for the best part of the day and it's snappy enough even on
my 5-year-old laptop.

  Is anybody particularly enthusiastic about their IDE? Anybody use a
  black-on-white or similar color scheme?

I recently tried Ninja (http://ninja-ide.org/): it's specifically
designed for Python, light, and supports color schemes. The current
stable version is 1.1, with version 2.0 beta coming out on 23/9.
Personally I found it still too rudimentary to win over gedit with the
appropriate plugins, but I nevertheless liked the approach the devs
have chosen, and I look forward to seeing what version 2.0 will bring.

/mac


Re: [pygame] Could sprite drawing be done by a Sprite.draw function?

2011-07-25 Thread Mac Ryan
On Mon, 25 Jul 2011 06:45:44 +0100
René Dudfield ren...@gmail.com wrote:

 There are things like z-order as well, which make drawing in the
 group the way to do it.  If you did Sprite drawing, then each Sprite
 would first need to talk to the other sprites in order to know when
 to draw.

Hi and thank you for your answer.

I think I failed to explain myself properly. What I meant with
my question is that I was surprised that the drawing of sprites
happens by **calling surface.blit(sprite.image, sprite.rect)** rather
than by **calling sprite.draw()**. It of course makes sense that drawing
is coordinated by the group, but the way it's done now makes it very
hard to customise the drawing: I might need, for example, to selectively
choose a flag for the blit operation on _some_ of the sprites...

Hope this clarifies my doubt better! :)

Mac

 On Sun, Jul 24, 2011 at 11:48 PM, Mac Ryan quasipe...@gmail.com
 wrote:
 
  Hi all,
 
  I'm pretty new to pygame (although I have good experience with
  python) and this is my first message on the list. Bear with me if
  what I am about to ask has been already discussed.
 
  While working around my first program, I realised that while each
  sprite has an update() function, its drawing happens at group
  level, by blitting the s.image and s.rect directly, which somehow
  limits this functionality (for example making it impossible to set
  blit flags).
 
  I was wondering if this was a design choice, and in that case,
  what's the rationale behind it.
 
  I realise both that:
 
   1. Calling a sprite function is expensive
   2. I can override the group.draw method to call a sprite function
 
  Yet it seems to me that having a sprite.update without a sprite.draw
  is inconsistent from an API standpoint.
 
  I did not profile the code to see how big the performance hit
  from #1 is, but #2 has serious limitations as a workaround: for
  one, the draw() method changes in each and every one of the group
  classes, so if the sprite must use a draw method, one will have to
  override _all_ of the draw methods of the types of groups used in the
  program.
 
  Your thoughts?
  /mac
 



[pygame] Could sprite drawing be done by a Sprite.draw function?

2011-07-24 Thread Mac Ryan
Hi all,

I'm pretty new to pygame (although I have good experience with python)
and this is my first message on the list. Bear with me if what I am
about to ask has been already discussed.

While working around my first program, I realised that while each
sprite has an update() function, its drawing happens at group level,
by blitting the s.image and s.rect directly, which somehow limits this
functionality (for example making it impossible to set blit flags).

I was wondering if this was a design choice, and in that case, what's
the rationale behind it.

I realise both that:

 1. Calling a sprite function is expensive
 2. I can override the group.draw method to call a sprite function

Yet it seems to me that having a sprite.update without a sprite.draw
is inconsistent from an API standpoint.

I did not profile the code to see how big the performance hit from #1
is, but #2 has serious limitations as a workaround: for one, the draw()
method changes in each and every one of the group classes, so if the
sprite must use a draw method, one will have to override _all_ of the
draw methods of the types of groups used in the program.
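
A sketch of workaround #2, to show what I mean (the special_flags use is
just an example of what a per-sprite draw() would enable):

    import pygame

    class FlaggedSprite(pygame.sprite.Sprite):
        """A sprite that knows how to draw itself, e.g. with its own blit flags."""
        def draw(self, surface):
            flags = getattr(self, 'blit_flags', 0)    # e.g. pygame.BLEND_ADD
            return surface.blit(self.image, self.rect, special_flags=flags)

    class SpriteDrawingGroup(pygame.sprite.Group):
        """A Group whose draw() delegates to each sprite instead of blitting."""
        def draw(self, surface):
            for sprite in self.sprites():
                sprite.draw(surface)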

Your thoughts?
/mac