Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-10-03 Thread Mac Ryan
On Mon, 03 Oct 2011 17:52:13 +1300
Greg Ewing greg.ew...@canterbury.ac.nz wrote:

  Could be another parameter for the
  surface (coord_system=PIXEL_CENTRE | PIXEL_BOUNDARIES)!! :P  
 
 I don't think that would be strictly necessary, because you
 would be able to get the same effect by adjusting the translation
 by 0.5 pixels.
 
 Also, providing too many options could just lead to confusion.
 Surfaces already have a bewildering collection of options --
 let's not add any more if we don't have to!

Ah, the ambiguities of written language! ;) The :P was meant to
indicate it was a joke! :)

Anyhow I agree with you that the theory should be well thought out.

So far this is a summary of how I would try to implement the feature:

1. Scaling option... optional. Defaults to 1, in which case the
   scaling procedure is bypassed altogether by instantiating a class
   that doesn't have it.
2. Scaling only happens one-way, returned rectangles are in pixels (so
   no floating point coords).
3. Pixels and hairlines are supported without having to specify their
   size by defaulting the unspecified size to 1/scale.
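The three points above could be sketched roughly like this (all class and method names are invented for illustration; this is not an existing or proposed PyGame API):

```python
# Hypothetical sketch of points 1-3 (names invented for illustration;
# this is not an existing or proposed PyGame API).

class PlainSurface:
    """scale == 1: the scaling machinery is bypassed entirely."""
    def __init__(self, size):
        self.size = size

    def fill_rect(self, x, y, w, h):
        # Coordinates are already pixels; just round to ints.
        return (round(x), round(y), round(w), round(h))

class ScaledSurface(PlainSurface):
    def __init__(self, size, scale):
        super().__init__(size)
        self.scale = scale

    def fill_rect(self, x, y, w=None, h=None):
        # Point 3: an unspecified size defaults to 1/scale, i.e. one pixel.
        if w is None:
            w = 1 / self.scale
        if h is None:
            h = 1 / self.scale
        # Point 2: scaling is one-way; the returned rect is in pixels.
        s = self.scale
        left, top = round(x * s), round(y * s)
        right, bottom = round((x + w) * s), round((y + h) * s)
        return (left, top, right - left, bottom - top)

def make_surface(size, scale=1):
    # Point 1: scale defaults to 1, which instantiates the class
    # that has no scaling code at all.
    return PlainSurface(size) if scale == 1 else ScaledSurface(size, scale)

surf = make_surface((100, 100), scale=0.1)   # 1000x1000 model units
print(surf.fill_rect(100, 100))              # one "model pixel" -> (10, 10, 1, 1)
```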

I'm still undecided on whether the positioning should use pixel
boundaries or pixel centres: I think there are good reasons to go for
both. Namely, I think centres are a more appropriate way to describe the
position of objects in a modelled space (nobody says the car is in the
far left corner of the garage; intuitively we treat the geometrical
centre of an object as its most significant point for positional
reference). OTOH, boundaries are far more consistent with how PyGame
operates on standard surfaces.

Again: it might well be that when I have a crack at it, it will
prove beyond my capacities, but before even trying to start working on
it (which, I shall repeat, won't be immediate), I would like to know
that I'm moving in a sensible direction...

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-10-03 Thread Greg Ewing

Mac Ryan wrote:


Ah, the ambiguities of written language! ;) The :P was meant to
indicate it was a joke! :)


Actually that's the danger of using non-standard emoticons. :-)

 I think centres are a more appropriate way to describe the
 position of objects in a modelled space (nobody says the car is in the
 far left corner of the garage; intuitively we treat the geometrical
 centre of an object as its most significant point for positional
 reference).


The garage is not a point, it's a rectangle, and so is the car.
Saying the car is in the garage means that the boundaries of
the car lie within the boundaries of the garage. It's not
sufficient to say that the *centre* of the car is in the garage;
the door might be open and the boot[1] sticking out into the
driveway.

Conversely, it's too restrictive to say that the centre of the car
is at the centre of the garage, since (unless your garage is *really*
tight for space) there are other locations for the car's centre such
that the car is still inside the garage.


OTOH, boundaries are way more consistent with
how PyGame operates on standard surfaces.


That's the clincher, I think. In PyGame you're very often dealing
with rects in one way or another, and in my experience, rects just
work better with pixel-boundary coordinates.

For example, if you have a rect with left=10 and right=20, and
another with left=20 and right=30, it's obvious that they will
touch without overlapping. With pixel-centre coordinates, you
have to think about it a bit harder, plus you need some rule
such as half-open rect coordinates to make it work.

This is similar to the way it helps to think of slice indices
in Python as labelling points between items rather than the
items themselves. It's the same thing in two dimensions.

[1] Or trunk, if it's an American car and/or garage.
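Greg's rect-adjacency example and the slice analogy can be checked in a few lines (a toy illustration, not PyGame code):

```python
# Pixel-boundary coordinates make rects behave like Python slices:
# the coordinates label the cuts *between* pixels, so a rect is the
# half-open span [left, right). (Toy illustration, not PyGame code.)

def overlaps(a, b):
    """Do two 1-D half-open intervals (left, right) share any interior?"""
    return a[0] < b[1] and b[0] < a[1]

r1 = (10, 20)
r2 = (20, 30)
print(overlaps(r1, r2))   # False: they touch without overlapping

# The slice analogy in one dimension: adjacent slices partition a
# sequence with no shared items, because 20 labels a cut, not an item.
items = list(range(10, 30))
assert items[:10] + items[10:] == items
assert not set(items[:10]) & set(items[10:])
```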

--
Greg


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-10-02 Thread Greg Ewing

Christopher Night wrote:

Not that it's the most important thing in the world, but I think that 
OpenGL agrees with me. Here's a program that creates a scaled surface 
extending from (0,0) to (1000, 1000), with a canvas size of 100x100 
pixels (so the scale factor is 0.1). It then draws the following 
horizontal lines:


Red line at y = -4 (will appear if we're rounding but not if we're 
truncating)

Blue line at y = 6 (will appear)
White line at y = 500 (will appear)
Red line at y = 996 (will appear if we're truncating but not if we're 
rounding)

Blue line at y = 1004 (will not appear)

When I run the script, I see a blue line at top and a red line at 
bottom, which is the correct behavior if we're truncating. But feel free 
to tell me if there's something wrong with this experiment.


I think what's happening here is that OpenGL is treating the
coordinates as pixel boundaries, and treating the line as extending
half a pixel either side of the zero-width line between its
endpoints.

In post-transform coordinates, y = 0.0 is a borderline case --
exactly half of each pixel is in the window. For y = -0.4, less than
half is in the window, so no pixels get drawn. Similarly for
y = 100.4. For y = 99.6, most of each pixel is in the window, so
the line appears.

So the same principle applies here -- work out where the boundaries
of the filled area are, and round them to the nearest pixel
boundaries.

And again, the reason it seems like it's truncating is that it's
applying a half-pixel offset from the coordinates you provide to
find the edges of the line, and then rounding those. (At least
that's what's happening conceptually -- the actual arithmetic
it's performing might be different.)
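Greg's half-pixel reasoning can be reproduced numerically (`lit_rows` is a hypothetical helper written for this check, not part of OpenGL or PyGame):

```python
# Reproducing Greg's arithmetic: a line with a 1-pixel post-transform
# width at coordinate y covers the strip [y - 0.5, y + 0.5); round both
# boundaries to the nearest pixel boundary and clip to a 100-pixel-tall
# window to see which rows light up.

def lit_rows(y, height=100):
    top = round(y - 0.5)
    bottom = round(y + 0.5)
    return list(range(max(top, 0), min(bottom, height)))

print(lit_rows(-0.4))    # []   -- red line at y = -4, scale 0.1: no pixels
print(lit_rows(0.6))     # [0]  -- blue line at y = 6: top row lights up
print(lit_rows(99.6))    # [99] -- red line at y = 996: bottom row lights up
print(lit_rows(100.4))   # []   -- blue line at y = 1004: off-screen
```

This reproduces exactly what Christopher observed: a blue line at the top of the window and a red line at the bottom.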

--
Greg


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-10-02 Thread Mac Ryan
On Sun, 02 Oct 2011 14:12:21 +1300
Greg Ewing greg.ew...@canterbury.ac.nz wrote:

  If you're going to say that lines are 1-dimensional and thus
  infinitely smaller than pixels, and thus we're obliged to draw a
  thin rectangle whenever we want a line, then (a) I probably would
  not use the tool if it doesn't even support drawing lines,  
 
 There's no reason line drawing shouldn't be supported.
 A line of some specified width can easily be converted
 automatically into a filled polygon, and there can be
 a default for the width if it's not specified.

Which is indeed another way to say their thickness would be
automatically set to 1/scale... or did I get you wrong?

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-10-02 Thread Mac Ryan
On Sun, 02 Oct 2011 13:53:02 +1300
Greg Ewing greg.ew...@canterbury.ac.nz wrote:

 Now, suppose our arithmetic is a little inaccurate, and we
 actually get x = 0.499. The boundaries then come out as -0.001
 and 0.999. If we round these, we get 0.0 and 1.0 as before.
 But if we floor, we get -1.0 and 0.0, and the pixel goes
 off the screen.
 ...
 This may be the reason we're misunderstanding each other --
 you're thinking of (0, 0) as being the *centre* of the top
 left pixel in the window, whereas I'm thinking of it as the
 *top left corner* of that pixel.

Although I was not the recipient of the original answer: Very
interesting! that's actually the first time I understand why using
the upper-left coordinate system may make sense under certain
conditions. :)

Yet, while I followed the math through, I am unsure how bad small
inaccuracies might turn out to be. Those inaccuracies would essentially
be the fruit of scaling down a very large physical model to screen
resolution, so I truly wouldn't care if my sprite appeared 1 px to the
left of where I expected it, as long as the model performs accurately.
For game interface elements (where visual alignment might be more
relevant) I probably wouldn't use scaled surfaces anyway.

But again... interesting to debate. Could be another parameter for the
surface (coord_system=PIXEL_CENTRE | PIXEL_BOUNDARIES)!! :P

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-10-02 Thread Greg Ewing

Mac Ryan wrote:


Those inaccuracies would essentially be
the fruit of scaling down a very large physical model to screen
resolution, so I truly wouldn't care if my sprite appeared 1 px to the
left of where I expected it, as long as the model performs accurately.


Possibly you wouldn't care. However, if any transformation
feature were to be added, the theory behind it ought to be
properly thought out, I believe.

 Could be another parameter for the

surface (coord_system=PIXEL_CENTRE | PIXEL_BOUNDARIES)!! :P


I don't think that would be strictly necessary, because you
would be able to get the same effect by adjusting the translation
by 0.5 pixels.

Also, providing too many options could just lead to confusion.
Surfaces already have a bewildering collection of options --
let's not add any more if we don't have to!

--
Greg



Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-10-01 Thread Greg Ewing

Christopher Night wrote:
If I have a 100x100 pixel window, 
and I want to put a dot at a position (x,x), it seems to me like the dot 
should appear in the window if 0 <= x < 100. You're saying it should 
appear in the window if -0.5 <= x < 99.5.


You need to be more precise about what you mean by a dot. On
a display surface made of pixels, to make anything appear at all,
you need to paint at least one pixel. So I'll take it that you
want to paint a 1x1 rectangle.

There are also a couple of other things we need to be clear
about. One is the precise relationship between coordinates and
pixels. Two obvious choices come to mind: we could take the
coordinates as labelling the centres of pixels, or the boundaries
between pixels.

I prefer to take the boundary approach, because it avoids a lot
of potential confusion. To draw a 1x1 rect centred at (x, y), we
need to paint the area between (x-0.5, y-0.5) and (x+0.5, y+0.5).
In order to cover exactly one pixel, the centre coordinates need
to be an integer plus 0.5, so that the boundaries are integers.
So if x = 0.5, the rect covers the range 0.0 to 1.0.

Now, suppose our arithmetic is a little inaccurate, and we
actually get x = 0.499. The boundaries then come out as -0.001
and 0.999. If we round these, we get 0.0 and 1.0 as before.
But if we floor, we get -1.0 and 0.0, and the pixel goes
off the screen.

Note that I'm rounding the *boundaries* of the rect, not its
centre. That's what I meant when I said that you need to
work out where all the sides of the rect are first, and then
round them.

Also, in general, what I mean by rounding is finding the
*nearest pixel boundary*. If the coordinate system is chosen so
that the pixel boundaries are on integer coordinates, then
this will correspond to what the round() function does.

If you choose some other convention, it won't. In particular,
if you decide that the *centres* of pixels lie on integer
coordinates, it will correspond to something involving floor(),
because everything is offset by 0.5 and round(x) == floor(x+0.5).
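Greg's point can be verified in a few lines (note that Python 3's built-in round() uses ties-to-even, so the classical rounding he describes is written out explicitly as floor(x + 0.5)):

```python
import math

# Python 3's built-in round() uses ties-to-even, so the classical
# rounding Greg refers to is written out explicitly as floor(x + 0.5).
def classic_round(x):
    return math.floor(x + 0.5)

# The x = 0.499 example: a 1x1 rect centred there has boundaries
# -0.001 and 0.999.
left, right = 0.499 - 0.5, 0.499 + 0.5

# Rounding the boundaries keeps the pixel on screen...
print(classic_round(left), classic_round(right))   # 0 1

# ...while flooring pushes the whole pixel off the screen.
print(math.floor(left), math.floor(right))         # -1 0
```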

This may be the reason we're misunderstanding each other --
you're thinking of (0, 0) as being the *centre* of the top
left pixel in the window, whereas I'm thinking of it as the
*top left corner* of that pixel.

I realize that it's a matter of preference, and either way would be 
logically consistent, so it's just a matter of which is more intuitive 
and comfortable.


The PyGame docs don't specify one way or the other. They
don't really need to, because for integer coordinates you
get the same result either way. But when floats are involved,
things need to be pinned down more precisely.

--
Greg


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-10-01 Thread Greg Ewing

Christopher Night wrote:

If you're going to say that lines are 1-dimensional and thus infinitely 
smaller than pixels, and thus we're obliged to draw a thin rectangle 
whenever we want a line, then (a) I probably would not use the tool if 
it doesn't even support drawing lines,


There's no reason line drawing shouldn't be supported.
A line of some specified width can easily be converted
automatically into a filled polygon, and there can be
a default for the width if it's not specified.

Postscript has a notion of a hairline: if you set
the width to 0, it draws the narrowest line that the
device can produce, whatever scaling is in effect. For
a PyGame surface that would be 1 pixel.
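The hairline convention could be carried over to a scaled surface along these lines (a hypothetical helper; the clamp to a 1-pixel minimum for nonzero widths is my own assumption, not something stated in the thread):

```python
# Hypothetical sketch of the PostScript hairline convention on a scaled
# surface. Width 0 means "the narrowest line the device can draw",
# i.e. one pixel regardless of scale; the 1-pixel minimum for nonzero
# widths is an added assumption, not something from the thread.

def line_width_px(width, scale):
    if width == 0:
        return 1                         # hairline: always 1 device pixel
    return max(1, round(width * scale))  # never thinner than a pixel

print(line_width_px(0, 0.1))    # 1 (hairline)
print(line_width_px(5, 0.1))    # 1 (0.5 px, clamped up)
print(line_width_px(30, 0.1))   # 3
```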

--
Greg


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-30 Thread Mac Ryan
On Thu, 29 Sep 2011 18:18:48 -0400
Christopher Night cosmologi...@gmail.com wrote:

 If I have a 100x100 pixel window, and I want to put a dot at a
 position (x,x), it seems to me like the dot should appear in the
 window if 0 <= x < 100. You're saying it should appear in the window
 if -0.5 <= x < 99.5. So if I request to put a point at (-0.4, -0.4),
 it will appear in the window, but if I put one at (99.6, 99.6), it
 won't. I disagree that this is the correct behavior. Intuitively, the
 point (99.6, 99.6) should be within a 100x100 canvas.

I have a different take on this: if you have float rectangles, you are
effectively treating **your rectangle as part of your model, not part of
your representation** (see my previous mail). This means that a point,
which is a dimensionless entity, shouldn't be displayed regardless of
its coordinates, given that your screen's pixels have a dimension (and
are therefore infinitely larger than a point).

I realise that this is absolutely counter-intuitive (you would be
obliged to draw points as circles or rectangles that scale to 1 px, or
to internally convert calls to draw pixels into calls to draw
rectangles), but I think that is the only mathematically correct
solution to the ambivalence.

in fact:

pixel(-0.4, -0.4) = 1-px-rect((-0.9, -0.9), (+0.1, +0.1)) =
rect-through-scaling-routine((-1, -1), (0, 0)) = no pixel lit.

and

pixel(99.6, 99.6) = 1-px-rect((99.1, 99.1), (100.1, 100.1)) =
rect-through-scaling-routine((99, 99), (100, 100)) = pixel lit.

[this assumes that the scaling routine - as proposed in a previous mail
- uses rounding, not truncation].
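The two cases can be checked mechanically (the helper names below are invented for illustration, not PyGame calls):

```python
# Checking the two cases mechanically (helper names invented for
# illustration). A "pixel" at a model point is a rect of side 1/scale
# centred on it; the scaling routine rounds the rect's boundaries.

def pixel_to_rect(x, y, scale=1.0):
    half = 0.5 / scale
    tl = (round((x - half) * scale), round((y - half) * scale))
    br = (round((x + half) * scale), round((y + half) * scale))
    return tl, br

def lit(rect, size=100):
    (l, t), (r, b) = rect
    # Lit iff the rounded rect intersects the 0..size pixel grid.
    return l < size and t < size and r > 0 and b > 0

print(pixel_to_rect(-0.4, -0.4), lit(pixel_to_rect(-0.4, -0.4)))
# ((-1, -1), (0, 0)) False
print(pixel_to_rect(99.6, 99.6), lit(pixel_to_rect(99.6, 99.6)))
# ((99, 99), (100, 100)) True
```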

/mac

BTW: This is part of the reason why I think that rectangles should keep
to be int based / part of the representation logic.


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-30 Thread Christopher Night
On Fri, Sep 30, 2011 at 5:43 AM, Mac Ryan quasipe...@gmail.com wrote:

 On Thu, 29 Sep 2011 18:18:48 -0400
 Christopher Night cosmologi...@gmail.com wrote:

  If I have a 100x100 pixel window, and I want to put a dot at a
  position (x,x), it seems to me like the dot should appear in the
  window if 0 <= x < 100. You're saying it should appear in the window
  if -0.5 <= x < 99.5. So if I request to put a point at (-0.4, -0.4),
  it will appear in the window, but if I put one at (99.6, 99.6), it
  won't. I disagree that this is the correct behavior. Intuitively, the
  point (99.6, 99.6) should be within a 100x100 canvas.

 I have a different take on this: if you have float rectangles, you are
 effectively treating **your rectangle as part of your model, not part of
 your representation** (see my previous mail). This means that a point,
 which is a dimensionless entity, shouldn't be displayed regardless of
 its coordinates, given that your screen's pixels have a dimension (and
 are therefore infinitely larger than a point).

 I realise that this is absolutely counter-intuitive (you would be
 obliged to draw points as circles or rectangles that scale to 1 px, or
 to internally convert calls to draw pixels into calls to draw
 rectangles), but I think that is the only mathematically correct
 solution to the ambivalence.


Well pixels are just the simplest example. The ambiguity exists for all
drawing functions, not just set_at. For instance, should a horizontal line
extending from (10, 99.6) to (90, 99.6) appear on the 100x100 screen or not?
So I don't think that forbidding set_at solves it.

If you're going to say that lines are 1-dimensional and thus infinitely
smaller than pixels, and thus we're obliged to draw a thin rectangle
whenever we want a line, then (a) I probably would not use the tool if it
doesn't even support drawing lines, and (b) consider the filled, closed
polygon from (10, 99.6) to (90, 99.6) to (50, 200). Would drawing that cause
any pixels on the 100x100 screen to be lit up or not?

-Christopher


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-30 Thread Mac Ryan
On Fri, 30 Sep 2011 09:19:12 -0400
Christopher Night cosmologi...@gmail.com wrote:

 Well pixels are just the simplest example. The ambiguity exists for
 all drawing functions, not just set_at. For instance, should a
 horizontal line extending from (10, 99.6) to (90, 99.6) appear on the
 100x100 screen or not? So I don't think that forbidding set_at solves
 it.

If you are going to use a scaled surface you *must* specify the
dimension of what you are drawing. Just as you can see the Great Wall
of China from low orbit but not from the moon, you should be able to
see a line only if its thickness is larger than half the mapped size of
a pixel.

But this is nothing new: try to draw a line of thickness 1 and
smoothscale it down 10 times: you won't see it [or you will see a very
faint colour simulating 10% of a pixel].

You could automate some of the functions by stating, for example, that
set_at assumes a square of side 1/scale, or that a draw_line without a
thickness parameter assumes the thickness is 1/scale...

However, if you are going to draw something that does not need to be
scaled, it would be a better solution (at least IMO) to simply blit a
non-scaled image onto the scaled one.

 If you're going to say that lines are 1-dimensional and thus
 infinitely smaller than pixels, and thus we're obliged to draw a thin
 rectangle whenever we want a line, then (a) I probably would not use
 the tool if it doesn't even support drawing lines, and (b) consider
 the filled, closed polygon from (10, 99.6) to (90, 99.6) to (50,
 200). Would drawing that cause any pixels on the 100x100 screen to be
 lit up or not?

About (b): maybe I am missing the point, as it seems obvious to me that
it shouldn't. Unless you want to do something like smoothscale does (and
thus use intensity as a means to simulate fractions of a pixel), all the
points, once scaled, are outside the surface... did I misunderstand you?

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-30 Thread Christopher Night
On Fri, Sep 30, 2011 at 10:38 AM, Mac Ryan quasipe...@gmail.com wrote:

  If you are going to use a scaled surface you *must* specify the dimension
 of what you are drawing.


There are plenty of good reasons you might need a line that's visible
regardless of the dimensions. Here's a screenshot from a game I made where
you're able to zoom in and out:

http://media.pyweek.org/dl/13/unifac13/screenshot-20110917193243.png

You can see red lines that represent lasers, and blue circles that represent
shields. Most of the graphical elements resize when you zoom in and out, but
the thickness of the lines and circles does not, even though the endpoints
of the lines and the centers and radii of the circles scale appropriately.
This is what I wanted when I made the game. I didn't want lasers that become
invisible when you zoom out. I don't care if it's unrealistic in whatever
sense. It's more important that they be visible to the player than that they
be properly scaled in thickness.

Another example off the top of my head is pathfinding. If you give orders to
a unit to move from one point to another, and you want to highlight the path
that unit is going to take, that's not an in-game object, that's a marker
for the player, and it needs to be visible to the player even if you're
zoomed out. If you draw it with lines, you want those lines to be visible.

If you're familiar with OpenGL, which fully supports the scaling mechanism
you desire, this is exactly what you get when you call glBegin(GL_LINES):
lines of 1 pixel in thickness, regardless of your scale. When you specify
glLineWidth, you specify it in *pixels*, not in unscaled units. Similarly
with GL_POINTS: you get points of 1 pixel, regardless of scale.

 if you are going to draw something that does not need to be
 scaled, it would be a better solution (at least IMO) to simply blit a
 non-scaled image onto the scaled one.


That seems like a huge pain to me for something as simple as drawing a line.



  consider the filled, closed polygon from (10, 99.6) to (90, 99.6) to (50,
  200). Would drawing that cause any pixels on the 100x100 screen to be
  lit up or not?

 About (b): maybe I am missing the point, as it seems obvious to me that
 it shouldn't. Unless you want to do something like smoothscale does (and
 thus use intensity as a means to simulate fractions of a pixel), all the
 points, once scaled, are outside the surface... did I misunderstand you?


You're saying that all the points, once scaled, are obviously outside the
surface. That's what I'm disagreeing with: I'm saying it's not obvious, and
asking you to justify that statement. I'm making the argument that anything
within the rectangle (0,0) to (100, 100) should scale to a point inside the
surface, including (10, 99.6). You're saying that points within the
rectangle (-0.5, -0.5) to (99.5, 99.5) should scale to points inside the
surface. Thus, for instance, you think that filling the polygon going from
(10, -0.4) to (90, -0.4) to (50, -100) *should* light up some pixels,
whereas I think it obviously shouldn't, because all those points are in the
fourth quadrant.

Not that it's the most important thing in the world, but I think that OpenGL
agrees with me. Here's a program that creates a scaled surface extending
from (0,0) to (1000, 1000), with a canvas size of 100x100 pixels (so the
scale factor is 0.1). It then draws the following horizontal lines:

Red line at y = -4 (will appear if we're rounding but not if we're
truncating)
Blue line at y = 6 (will appear)
White line at y = 500 (will appear)
Red line at y = 996 (will appear if we're truncating but not if we're
rounding)
Blue line at y = 1004 (will not appear)

When I run the script, I see a blue line at top and a red line at bottom,
which is the correct behavior if we're truncating. But feel free to tell me
if there's something wrong with this experiment.

import pygame
from pygame.locals import *
from OpenGL.GL import *

pygame.init()
screen = pygame.display.set_mode((100, 100), OPENGL | DOUBLEBUF)
glOrtho(0, 1000, 1000, 0, 0, 1)

ys = [-4, 6, 500, 996, 1004]
colors = [(1,0,0), (0,0,1), (1,1,1), (1,0,0), (0,0,1)]
for y, color in zip(ys, colors):
    glColor3fv(color)
    glBegin(GL_LINES)
    glVertex2f(100, y)
    glVertex2f(900, y)
    glEnd()

pygame.display.flip()
while not any(e.type in (KEYDOWN, QUIT) for e in pygame.event.get()):
    pass

-Christopher


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-30 Thread Greg Ewing

Mac Ryan wrote:


I suppose there are various ways of using pygame, but I would still
hold that having the scaling function should be one way only.


As far as drawing is concerned, I tend to agree. The only
difficulty is that some drawing operations in pygame are
defined to return a rect indicating the affected area.
I've never used that feature myself, so I don't have any
use cases to go on, but if a surface with a scaled coordinate
system was being used, I find it hard to imagine that the
calling code would have any use for post-scaling coordinates,
because it's doing all its calculations pre-scaling.


3. Scaling mess: a concrete example... In the application I am
   developing now, I scale the modelled space to match screen
   resolution, but since the objects moving in the space might have
   sub-pixel dimensions if scaled to the same resolution, I am
   often obliged to scale the sprites representing them to a different
   resolution


Yes, sometimes you don't want to just scale everything uniformly,
but do something more complicated. However, I don't see how the
rect returned by the drawing operation is of much help in this
example. You're deciding beforehand how big you want the sprite
to be in pixels, not asking pygame to tell you after the fact.

--
Greg


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-29 Thread Mac Ryan
On Wed, 28 Sep 2011 10:28:12 +1300
Greg Ewing greg.ew...@canterbury.ac.nz wrote:

  ...because the *INPUT* destination is mapped on s1, and the blit
  method should return its *OUTPUT* in unmapped measures [pixels].  
 
 Actually that's something to be debated -- *should* it
 return post-scaling or pre-scaling coordinates? A case
 could be made for the latter, since the code using the
 scaled surface is doing all its other calculations in
 those coordinates, and having to manually transform
 back would be a nuisance.

I suppose there are various ways of using pygame, but I would still
hold that having the scaling function should be one way only. Some
random thoughts about it:

1. Domain separation: pygame is about managing graphics on the screen,
   not about managing a mathematical model. A scale factor for input
   coordinates is a transparent, non-obstructive addition. Having
   returned rectangles that do not match what is on screen is not: it
   requires you to alter the way you handle graphics.
2. Code incompatibility: code that has been written assuming images are
   not scaled may not run correctly if scaling is introduced at a later
   stage: with your proposal the very **meaning** of the returned
   rectangle would be different from what it is now.
3. Scaling mess: a concrete example... In the application I am
   developing now, I scale the modelled space to match screen
   resolution, but since the objects moving in the space might have
   sub-pixel dimensions if scaled to the same resolution, I am
   often obliged to scale the sprites representing them to a different
   resolution (depending on their size) in order to be sure to have
   something distinguishable on screen. It would be all but a nuisance
   to keep track of all the different resolution when doing math on
   sprites rects.

The idea of float rectangles per se doesn't seem to have any of these
problems, yet the underlying thought seems (correct me if I am wrong!)
that you would use rectangles for keeping track of your models, rather
than of their on-screen representation [which couldn't be anything else
than integer coordinates]... Again: there are many ways to use pygame
and I can't (nor wish to) say this is wrong... It just seems to me that
working in non-integer coordinates would subtly change the semantics of
pygame, and I wanted to point that out.

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-29 Thread Christopher Night
On Wed, Sep 28, 2011 at 6:26 PM, Greg Ewing greg.ew...@canterbury.ac.nzwrote:

 Lenard Lindstrom wrote:

  Would you be interested in adding a feature request for a float based rect
 class in the issue tracker?


 What *would* be useful right now is if the drawing functions would
 accept tuples of floats and round (not truncate!) them to ints.


I've been thinking about this, and even though in the past I've rounded to
int when passing coordinates to drawing functions, I actually think the
correct behavior is to truncate. This makes sense if you consider the pixel
at (0,0) to actually occupy a 1x1 rectangle, extending from (0,0) to
(1,1). So the point (0.7, 0.7) should actually be considered to be within
the pixel at (0,0).

Distances, on the other hand, such as the radius of circles, should be
rounded.

Just my opinion.

-Christopher


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-29 Thread Greg Ewing

Christopher Night wrote:

I've been thinking about this, and even though in the past I've rounded 
to int when passing coordinates to drawing functions, I actually think 
the correct behavior is to truncate. This makes sense if you consider 
the pixel at (0,0) to actually occupy a 1x1 rectangle, extending from 
(0,0) to (1,1). So the point (0.7, 0.7) should actually be considered to 
be within the pixel at (0,0).


If your intention is to draw a 1x1 rectangle at some location
on the screen, the correct approach would be to calculate the
transformed coordinates of all four sides of the rectangle,
round them to ints, and then fill the resulting rect.
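Greg's recipe, spelled out as a toy helper (hypothetical; it assumes a uniform scale-plus-offset transform):

```python
# Greg's recipe as a toy helper (hypothetical; assumes a uniform
# scale-plus-offset transform): transform all four sides of the rect,
# round each boundary, then fill the resulting integer rect.

def transformed_fill_rect(x, y, w, h, scale, offset=(0, 0)):
    ox, oy = offset
    left = round(x * scale + ox)
    top = round(y * scale + oy)
    right = round((x + w) * scale + ox)
    bottom = round((y + h) * scale + oy)
    return (left, top, right - left, bottom - top)

# A 1x1 rect with boundaries at -0.001 and 0.999 (the x = 0.499 case
# from earlier in the thread) still fills exactly pixel (0, 0):
print(transformed_fill_rect(-0.001, -0.001, 1, 1, scale=1))   # (0, 0, 1, 1)
```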

One of the problems with truncating is that it behaves badly
when you've calculated what should be a float approximation of
an integer coordinate -- if it's slightly off in the negative
direction, you end up a whole pixel out.

In my experience, rounding is almost always the right thing
to do, and if it seems not to be, then you're not thinking
about the problem the right way.

--
Greg


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-29 Thread Christopher Night
On Thu, Sep 29, 2011 at 5:50 PM, Greg Ewing greg.ew...@canterbury.ac.nzwrote:

 Christopher Night wrote:

  I actually think the correct behavior is to truncate. This makes sense if
 you consider the pixel at (0,0) to actually occupy a 1x1 rectangle,
 extending from (0,0) to (1,1). So the point (0.7, 0.7) should actually be
 considered to be within the pixel at (0,0).


 If your intention is to draw a 1x1 rectangle at some location
 on the screen, the correct approach would be to calculate the
 transformed coordinates of all four sides of the rectangle,
 round them to ints, and then fill the resulting rect.


Okay, thanks for the response. I understand that you're saying that's
correct, but I don't understand why. If I have a 100x100 pixel window, and I
want to put a dot at a position (x,x), it seems to me like the dot should
appear in the window if 0 = x  100. You're saying it should appear in the
window if -0.5 = x  99.5. So if I request to put a point at (-0.4, -0.4),
it will appear in the window, but if I put one at (99.6, 99.6), it won't. I
disagree that this is the correct behavior. Intuitively, the point (99.6,
99.6) should be within a 100x100 canvas.

I realize that it's a matter of preference, and either way would be
logically consistent, so it's just a matter of which is more intuitive and
comfortable.


 In my experience, rounding is almost always the right thing
 to do, and if it seems not to be, then you're not thinking
 about the problem the right way.


Well, that's certainly not true. Rounding is often the correct way to get
from a float to an int, but truncation is correct at times too. I can
provide examples if you want. But even so, I think they should make this
decision based on what's the right answer for this problem, not what's the
right answer in a general sense.

-Christopher


Re: Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-28 Thread Lenard Lindstrom
Hi Greg,

Would you be interested in adding a feature request for a float based rect class in the issue tracker? Besides the topic of this thread, what are some other use cases?

Lenard Lindstrom

On Sep 27, 2011, Greg Ewing greg.ew...@canterbury.ac.nz wrote:

Mac Ryan wrote:

 ...because the *INPUT* destination is "mapped" on s1, and the blit
 method should return its *OUTPUT* in unmapped measures [pixels].

Actually that's something to be debated -- *should* it
return post-scaling or pre-scaling coordinates? A case
could be made for the latter, since the code using the
scaled surface is doing all its other calculations in
those coordinates, and having to manually transform
back would be a nuisance.

However, this would mean that the return value couldn't
be a standard Rect in this case, as it would have to
contain floats. This suggests adding a new float-based
rect class, which would be a useful thing to have
anyway for passing into the drawing operations.



Re: Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-28 Thread Lenard Lindstrom
Hi Toni,

The new float Rect would not subclass pygame.Rect, as it would have float rather than integer fields. Also, a scaled Surface would automatically return the float Rect.

Having Rect subclass methods return subclass instances is still an issue though. Currently new Rect creation at the C level bypasses the normal Python level methods. I suppose for subclasses, the class call approach can be taken. But I am somewhat uncomfortable about the signature compatibility requirement. It cannot easily be enforced at the declaration level; an incompatible signature would only get caught during a method call. It would also break existing Rect subclasses that depend on a Rect instance being returned, if there are any. It should be noted that builtin Python type methods do not return subclasses either. Anyway, this is getting off track for this thread. But improved handling of subclasses is being considered.

Lenard Lindstrom

On Sep 27, 2011, Toni Alatalo ant...@kyperjokki.fi wrote:

On Sep 28, 2011, at 12:28 AM, Greg Ewing wrote:

 However, this would mean that the return value couldn't
 be a standard Rect in this case, as it would have to

I found it nice how anything that has a rect attribute is a Rect :)

 contain floats. This suggests adding a new float-based
 rect class, which would be a useful thing to have
 anyway for passing into the drawing operations.

Hm,

"Though Rect can be subclassed, methods that return new rectangles are not
subclass aware. That is, move or copy return a new pygame.Rect instance, not an
instance of the subclass. This may change. To make subclass awareness work
though, subclasses may have to maintain the same constructor signature as Rect."

does that imply something here? It is from http://www.pygame.org/docs/ref/rect.html

 Greg

~Toni



Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-28 Thread Greg Ewing

Lenard Lindstrom wrote:

Would you be interested in adding a feature request for a float based 
rect class in the issue tracker?


I think it would make sense more as an adjunct to a coordinate
transformation feature. Once you have surfaces doing transformations,
you're going to want to pass floats into drawing functions, and a
floating rect type would be convenient for that. But I'm not sure it
would be worth adding float rects on their own.

What *would* be useful right now is if the drawing functions would
accept tuples of floats and round (not truncate!) them to ints.
Recent versions of pygame seem to have taken to raising exceptions
if you try to use floats as coordinates, which is very annoying.
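[In the meantime, the workaround is a tiny rounding shim before each draw call; `round_point` here is a made-up helper, not pygame API.]

```python
def round_point(pos):
    # Round (not truncate) each coordinate before handing it to pygame.
    x, y = pos
    return (int(round(x)), int(round(y)))

# Hypothetical usage: pygame.draw.circle(surf, color, round_point((10.6, 3.2)), r)
print(round_point((10.6, 3.2)))  # -> (11, 3)
```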

--
Greg


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-27 Thread Mac Ryan
On Sun, 25 Sep 2011 13:37:41 +0200
René Dudfield ren...@gmail.com wrote:

 Cool, if you like that might be interesting.  Just make sure you show
 your work as you go, so we can see if there's any problems before you
 do heaps of work.

Will do. As I said, no guarantees though! :)
 
 What do you think about generating a wrapper that does it in python
 similar to the code I posted?

If I were to do it in python I'm sure I could do it. I have very
limited experience in C, but I would guess C would be the more sensible
choice: the scaling procedure would potentially be called rather
often (every time something is drawn on that surface), so it seems
sensible to try to make it perform as fast as possible... or
not? :-/

On Sun, 25 Sep 2011 11:41:19 -0400
Christopher Night cosmologi...@gmail.com wrote:

 Suppose the current transformation is (x, y) -> (0.1x, 0.1y). What
 pygame Rect will the following correspond to (after all the
 coordinates are coerced to ints)?

Could you please articulate a bit more? I'm not sure I followed. From
your post it seems to me that you are imagining a two-way
transformation in which you feed model-scaled data to the draw
function (draw a 150 metre radius circle) and you also get
model-scaled data back when you query the object (what is the size of
the bounding rect for the circle? -- 300 metres).

What I am talking about is just a one-way transformation that converts
the inputs of the drawing functions. In pseudocode:

 init_surface(scale_factor=0.1)
 draw_circle(radius=150)
 get_size_bounding_rect()
... 30x30

For me pygame is only a presentation layer; all my game logic
math is done elsewhere. As for the graphics math (sprite collision,
etc.), it is just fine to do it at screen resolution, so I am happy for
pygame to keep on thinking that my circle is 30x30 pixels.
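[The one-way idea above can be sketched in plain Python; class and method names here are invented for illustration, not a proposed pygame API.]

```python
class ScaledInput:
    """Sketch: inputs are scaled into pixel space on the way in;
    queries report plain pixels. No real pygame Surface behind it."""

    def __init__(self, scale=1.0):
        self.scale = scale
        self.last_bounding = (0, 0)

    def draw_circle(self, radius):
        # Convert the model-space radius to pixels, rounding to nearest int.
        px_radius = int(round(radius * self.scale))
        self.last_bounding = (px_radius * 2, px_radius * 2)
        return px_radius

    def get_size_bounding_rect(self):
        # Output stays in (unmapped) pixels.
        return self.last_bounding

s = ScaledInput(scale=0.1)
s.draw_circle(radius=150)
print(s.get_size_bounding_rect())  # -> (30, 30)
```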

But maybe I misunderstood what you meant?

On Mon, 26 Sep 2011 09:55:01 +1300
Greg Ewing greg.ew...@canterbury.ac.nz wrote:

 The state wouldn't be global, it would be an attribute of the
 surface, like the clipping rect is now. If you don't want to
 pollute the state of your main surface, you create a subsurface
 and pass it to your drawing routines.

Indeed. I did not yet look at the code, but my preliminary idea is that
if scale == 1.0, a standard surface should be initialised; if it
is not, a surface which scales inputs should be. This would spare
the effort of multiplying the parameters by 1 for non-scaled surfaces.

However, I think I'm thinking object orientation here, and if I am
correct, pygame is coded in regular C. ??? [Again: will probably need
some direction here].
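[The dispatch-at-creation idea can be sketched object-orientedly like this; the class names are hypothetical stand-ins, not pygame types.]

```python
class PlainSurface:
    # Stand-in for a normal surface: coordinates pass through untouched.
    def to_pixels(self, value):
        return value

class ScalingSurface:
    # Stand-in for the input-scaling variant.
    def __init__(self, scale):
        self.scale = scale

    def to_pixels(self, value):
        return int(round(value * self.scale))

def make_surface(scale=1.0):
    # Dispatch once at creation time, so non-scaled surfaces never pay
    # the cost of multiplying every parameter by 1.
    return PlainSurface() if scale == 1.0 else ScalingSurface(scale)

print(make_surface().to_pixels(150))     # -> 150
print(make_surface(0.1).to_pixels(150))  # -> 15
```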

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-27 Thread Christopher Night
On Tue, Sep 27, 2011 at 2:13 AM, Mac Ryan quasipe...@gmail.com wrote:

 On Sun, 25 Sep 2011 11:41:19 -0400
 Christopher Night cosmologi...@gmail.com wrote:

  Suppose the current transformation is (x, y) -> (0.1x, 0.1y). What
  pygame Rect will the following correspond to (after all the
  coordinates are coerced to ints)?

 Could you please articulate a bit more? I'm not sure I followed. From
 your post it seems to me that you are imagining a two-way
 transformation in which you use model-scaled data to the draw
 function (draw a 150 metres radius circle) and you also get
 model-scaled data when you query the object (what is the size of the
 bounding rect for the circle? -- 300 metres).

 What I am talking about is just a one-way transformation that convert
 the input of the drawing functions. In pseudocode:

 init_surface(scale_factor=0.1)
 draw_circle(radius=150)
 get_size_bounding_rect()
... 30x30


That's how I understood it... I'm not sure what I said that made it sound
like I was thinking of a two-way transformation. Let me try rephrasing my
question using your notation. Please tell me what the following would output
(where I have question marks), and tell me if you don't understand what I
mean by the pseudocode:

 init_surface(scale_factor = 0.1)
 r1 = myrect(0, 0, 108, 10)
 r1.left = 4
 fill_rect(r1)
 get_size_bounding_rect()
??? x ???

I assume it would either be 10x1 or 11x1. Do you understand my question now?

-Christopher


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-27 Thread Mac Ryan
On Tue, 27 Sep 2011 10:42:10 -0400
Christopher Night cosmologi...@gmail.com wrote:

 That's how I understood it... I'm not sure what I said that made it
 sound like I was thinking of a two-way transformation. Let me try
 rephrasing my question using your notation. Please tell me what the
 following would output (where I have question marks), and tell me if
 you don't understand what I mean by the pseudocode:
 
  init_surface(scale_factor = 0.1)
  r1 = myrect(0, 0, 108, 10)
  r1.left = 4
  fill_rect(r1)
  get_size_bounding_rect()  
 ??? x ???

Hi Chris,

sorry, reading your reply I understood that my notation was
probably not very clear. I'm still not sure I got you, but here's a more
articulated example of what I mean, using standard pygame syntax (bar
the ``scale`` parameter):

 s1 = pygame.surface.Surface((100,100), scale=0.1)

This will initialise a surface of 10x10 px, mapping a 100x100 square.

 s1.get_bounding_rect()
rect(0, 0, 10, 10)

...because the image is 10x10 pixels.

 s2 = pygame.surface.Surface((100,100), scale=0.05)
 s2.get_bounding_rect()
rect(0, 0, 5, 5)
 s1.blit(s2, (20,20))
rect(2, 2, 5, 5)

...because the *INPUT* destination is mapped on s1, and the blit
method should return its *OUTPUT* in unmapped measures [pixels].
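[The arithmetic behind that returned rect, written out with a hypothetical helper: the model-space destination is multiplied by the target surface's scale factor.]

```python
def scale_dest(dest, scale):
    # A blit destination given in model units maps onto the target
    # surface's pixel grid by multiplying by that surface's scale.
    x, y = dest
    return (int(round(x * scale)), int(round(y * scale)))

# s1 has scale=0.1, so a model-space destination of (20, 20)...
print(scale_dest((20, 20), 0.1))  # -> (2, 2), matching rect(2, 2, 5, 5)
```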

One thing that might have further confused the discussion is that
somewhere in an earlier mail I mentioned rects. But all I meant was
really just the rects returned by these kinds of operations. I was not
thinking of a new class of rects with scaling included.

I'm far from asserting that the above is the best way ever to
implement scaling in PyGame, but this is how I imagined it working. I'm
very open to hearing alternative suggestions and feedback, of course!
Also: does this address your doubts?

/mac



Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-27 Thread Christopher Night
On Tue, Sep 27, 2011 at 11:21 AM, Mac Ryan quasipe...@gmail.com wrote:

 One thing that might have further confused the discussion is that
 somewhere in an earlier mail I mentioned rects. But all I meant was
 really just the rects returned by these kinds of operations. I was not
 thinking of a new class of rects with scaling included.


How can I use your system to draw a rectangle of a solid color onto a
surface? With the regular pygame system, I would use surf.fill and pass it a
Rect. If your system doesn't recognize rectangles of any sort, how can I do
this? Feel free to show me in pseudocode how I could do it.

-Christopher


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-27 Thread Jake b
I wrote a quick demo, using numpy for vectors.
velocity = numpy.array([10., 30.])

http://code.google.com/p/ninmonkey/source/browse/boilerplate/pygame/4.%20with%20numpy%20for%20array/boilerplate%20-%20pygame_with_numpy-2011_jake.py

Euclid is pure Python, which means you don't need to install anything;
however, it's slower. But it's fast enough.

see also: http://www.scipy.org/Numpy_Example_List_With_Doc

On Sat, Sep 24, 2011 at 3:04 PM, Mac Ryan quasipe...@gmail.com wrote:

 No, since I don't have to do very complex or loads of operations I went
 with euclid... but I'd be interested in knowing if you have suggestions
 involving numpy, nevertheless.

 /mac



--
Jake


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-27 Thread Mac Ryan
On Tue, 27 Sep 2011 11:28:34 -0400
Christopher Night cosmologi...@gmail.com wrote:

 How can I use your system to draw a rectangle of a solid color onto a
 surface? With the regular pygame system, I would use surf.fill and
 pass it a Rect. If your system doesn't recognize rectangles of any
 sort, how can I do this? Feel free to show me in pseudocode how I
 could do it.

I suppose you mean something like:

 Surface.fill(WHITE, myrect)

is it?

If this is the case, the rectangle would be scaled (so a rect of
100x200 would be scaled to a rect of 10x20, assuming scale=0.1). The
general idea would be: any argument to a surface method whose purpose
is to indicate a measure ((x,y) tuples, explicit widths/heights, rects)
would be scaled. Any other parameter whose purpose is not defining
coordinates (colours, flags, surfaces...) wouldn't.

Again: not claiming this is the best way of doing it... just that this
is how I would try to implement it.

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-27 Thread Mac Ryan
On Tue, 27 Sep 2011 11:28:34 -0400
Christopher Night cosmologi...@gmail.com wrote:

 How can I use your system to draw a rectangle of a solid color onto a
 surface? With the regular pygame system, I would use surf.fill and
 pass it a Rect. If your system doesn't recognize rectangles of any
 sort, how can I do this? Feel free to show me in pseudocode how I
 could do it.

Addendum: it occurs to me that maybe I wasn't clear about the fact that,
while only surfaces would have a scale parameter, rectangles applied
to a surface would still be scaled; the scaling is a property of the
surface, not of the rectangle. The same rectangle used to fill another
surface scaled differently would fill a different number of pixels.
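[A small numeric sketch of that point (hypothetical helper, no real surfaces involved): the same model-space rect covers a different number of pixels depending on the scale of the surface it is applied to.]

```python
def pixels_filled(rect_wh, surface_scale):
    # Pixels covered by a model-space (width, height) rect on a surface
    # with the given scale factor.
    w, h = rect_wh
    return int(round(w * surface_scale)) * int(round(h * surface_scale))

rect = (100, 200)
print(pixels_filled(rect, 0.1))   # -> 200  (10 x 20 on a scale=0.1 surface)
print(pixels_filled(rect, 0.05))  # -> 50   (5 x 10 on a scale=0.05 surface)
```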

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-27 Thread Christopher Night
On Tue, Sep 27, 2011 at 3:39 PM, Mac Ryan quasipe...@gmail.com wrote:

 On Tue, 27 Sep 2011 11:28:34 -0400
 Christopher Night cosmologi...@gmail.com wrote:

  How can I use your system to draw a rectangle of a solid color onto a
  surface? With the regular pygame system, I would use surf.fill and
  pass it a Rect. If your system doesn't recognize rectangles of any
  sort, how can I do this? Feel free to show me in pseudocode how I
  could do it.

 I suppose you mean something like:

  Surface.fill(WHITE, myrect)

 is it?

 If this is the case, the rectangle would be scaled (so a rect of
 100x200 would be scaled to a rect of 10x20, assuming scale=0.1). The
 general idea would be: any argument to a surface method whose purpose
 is to indicate a measure ((x,y) tuples, explicit widths/heights, rects)
 would be scaled. Any other parameter whose purpose is not defining
 coordinates (colours, flags, surfaces...) wouldn't.


Is myrect supposed to be a regular pygame.Rect? One big problem with that is
that pygame.Rect properties are coerced to integers. This makes sense for a
rectangle that describes a set of pixels, not so much for a rectangle that's
supposed to describe a region in world coordinates.

Either way, that's fine, and not very surprising. That's exactly what I had
in mind. So I really don't see why you weren't able to answer my original
question. Let me try asking it one more time.

You say that a 100x200 rect would be scaled to 10x20 pixels. Yes, obviously,
this makes sense. What would a 104x200 rect be scaled to? 10x20 pixels? Or
11x20 pixels? Or would it depend on the position of the rect? (Remember that
in addition to height and width, rectangles also have positions, i.e. left and
top. You can't just say "draw a 10x20 rectangle", you also have to say where
to draw it. This is relevant to the question, so try to keep it in mind.)

-Christopher


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-27 Thread Mac Ryan
On Tue, 27 Sep 2011 15:53:03 -0400
Christopher Night cosmologi...@gmail.com wrote:

 Either way, that's fine, and not very suprising. That's exactly what
 I had in mind. So I really don't see why you weren't able to answer
 my original question. Let me try asking it one more time.

Sorry. The good news is: this time I understood what you mean. (I
think!) :)

 You say that a 100x200 rect would be scaled to 10x20 pixels. Yes,
 obviously, this makes sense. What would a 104x200 rect be scaled to?
 10x20 pixels? Or 11x20 pixels? Or would it depend on the position of
 the rect? (Remember that in addition to height and width, rectangles
 also have positions, ie left and top. You can't just say draw a
 10x20 rectangle, you also have to say where to draw it. This is
 relevant to the question, so try to keep it in mind.)

Since we are working in raster rather than in vector graphics, I
would first convert the rectangle to edge coordinates, and then scale,
rounding to the nearest integer.

The idea is that position (x=4) + (width=14) = right (18) -> scaled: 2,
and not x=4 -> scaled: 0, width=14 -> scaled: 1 -> scaled sum: 1.
Which I suppose was the point you were trying to make in your first
post (maybe part of the reason why I did not get it is that I have
never used this before: I only needed to blit sprites on images in my
code).
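[In code, the difference between the two orders of operations looks like this (hypothetical helpers, assuming round-to-nearest and scale=0.1):]

```python
def scale_rect_endpoints(left, width, scale):
    # Scale the rect's endpoints, then derive the pixel width, so the
    # result is consistent: round(right*s) - round(left*s).
    sl = int(round(left * scale))
    sr = int(round((left + width) * scale))
    return sl, sr - sl

def scale_rect_sizes(left, width, scale):
    # The alternative: scale position and size independently.
    return int(round(left * scale)), int(round(width * scale))

print(scale_rect_endpoints(4, 14, 0.1))  # -> (0, 2): right edge 18 -> 2
print(scale_rect_sizes(4, 14, 0.1))      # -> (0, 1): width 14 -> 1
```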

Does this answer your question? Usual disclaimer: I did not think this
deeply, I'm very open to alternative ways of implementing it. :)

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-27 Thread Greg Ewing

Mac Ryan wrote:


...because the *INPUT* destination is mapped on s1, and the blit
method should return its *OUTPUT* in unmapped measures [pixels].


Actually that's something to be debated -- *should* it
return post-scaling or pre-scaling coordinates? A case
could be made for the latter, since the code using the
scaled surface is doing all its other calculations in
those coordinates, and having to manually transform
back would be a nuisance.

However, this would mean that the return value couldn't
be a standard Rect in this case, as it would have to
contain floats. This suggests adding a new float-based
rect class, which would be a useful thing to have
anyway for passing into the drawing operations.

--
Greg


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-27 Thread Toni Alatalo
On Sep 28, 2011, at 12:28 AM, Greg Ewing wrote:
 However, this would mean that the return value couldn't
 be a standard Rect in this case, as it would have to

I found it nice how anything that has a rect attribute is a Rect :)

 contain floats. This suggests adding a new float-based
 rect class, which would be a useful thing to have
 anyway for passing into the drawing operations.

Hm,

Though Rect can be subclassed, methods that return new rectangles are not 
subclass aware. That is, move or copy return a new pygame.Rect instance, not an 
instance of the subclass. This may change. To make subclass awareness work 
though, subclasses may have to maintain the same constructor signature as Rect.

does that imply something here? It is from http://www.pygame.org/docs/ref/rect.html

 Greg

~Toni

Re: [SPAM: 5.000] Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-25 Thread René Dudfield
Could you create a transform-rect-like function that returns the transformed
state?

 t(20, 20, 20, 20)
(1,1,1,1)

 t(20, 20)
(1,1)


pygame.draw.line(s, t(20,20))
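[One possible implementation of such a `t()` — a sketch only; the scale factor 0.05 is chosen to match the example output above.]

```python
def make_t(scale):
    # Returns a t() like the one sketched above: scales every value it
    # is given and hands back a tuple of ints.
    def t(*values):
        return tuple(int(round(v * scale)) for v in values)
    return t

t = make_t(0.05)
print(t(20, 20, 20, 20))  # -> (1, 1, 1, 1)
print(t(20, 20))          # -> (1, 1)
```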

I don't know about adding that into every pygame function... that sounds
like too much work.


Re: [SPAM: 5.000] Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-25 Thread Mac Ryan
On Sun, 25 Sep 2011 14:18:25 +1300
Greg Ewing greg.ew...@canterbury.ac.nz wrote:

 Mac Ryan wrote:
 
  The behaviour that I envisage would be an
  optional keyword argument ``scale=1.0`` for rectangles (and
  surfaces).  
 
 I would say the transformation should be an attribute of the
 surface, not something that you pass into drawing calls.

But isn't this what I just said? Or did I misunderstand you? Or did you
misunderstand me? :o

On Sun, 25 Sep 2011 09:55:20 +0200
René Dudfield ren...@gmail.com wrote:

 Could you create a transform rect like function that returns the
 transformed state?
 
  t(20, 20, 20, 20)
 (1,1,1,1)

That's exactly what I have at the moment (mine is called ``sc()``, but
that's irrelevant... :)).

I still think that's a missing battery thing though. I haven't
browsed much into code of games on pygame.org, but I wouldn't be
astonished if a large portion (probably even the majority) of the code
would internally use a non-scaled model, to be transformed at
representation time...

Anyhow: not a big issue; it's just that it's not an uncommon feature in
other libraries and I feel that inserting it in pygame would make
pygame better, but I *can* live without it. :)

/mac


Re: [SPAM: 5.000] Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-25 Thread René Dudfield
On Sun, Sep 25, 2011 at 11:57 AM, Mac Ryan quasipe...@gmail.com wrote:

 On Sun, 25 Sep 2011 14:18:25 +1300
 Greg Ewing greg.ew...@canterbury.ac.nz wrote:

  Mac Ryan wrote:
 
   The behaviour that I envisage would be an
   optional keyword argument ``scale=1.0`` for rectangles (and
   surfaces).
 
  I would say the transformation should be an attribute of the
  surface, not something that you pass into drawing calls.

 But isn't this what I just said? Or did I misunderstand you? Or did you
 misunderstand me? :o

 On Sun, 25 Sep 2011 09:55:20 +0200
 René Dudfield ren...@gmail.com wrote:

  Could you create a transform rect like function that returns the
  transformed state?
 
   t(20, 20, 20, 20)
  (1,1,1,1)

 That's exactly what I have at the moment (mine is called ``sc()``, but
 that's irrelevant... :)).

 I still think that's a missing battery thing though. I haven't
 browsed much into code of games on pygame.org, but I wouldn't be
 astonished if a large portion (probably even the majority) of the code
 would internally use a non-scaled model, to be transformed at
 representation time...

 Anyhow: not a big issue; it's just that it's not an uncommon feature in
 other libraries and I feel that inserting it in pygame would make
 pygame better, but I *can* live without it. :)

 /mac


It's a feature I've used in apps myself, but differently for a number of
them.  There are a million different transforms that you could do: from
scaling, to shearing, to rotation, to random quaternions or matrices.  I
guess you could add an optional transform matrix for every coordinate or
rect, and that would work for most cases.

I still think a transforming Rect subclass would work best, like your 'sc'
function.  Otherwise you'd need global state, which is always something
pygame tries to avoid.

An automatically generated wrapper which transforms any rect-like or
coord-like args could be possible to make...

# untested wrapper maker (sketch).
def transform_rect_like(r):
    if isinstance(r, (tuple, list)):
        # TODO: real rect-like detection; return a transformed thing.
        pass
    return r

def transform(args):
    return tuple(transform_rect_like(a) for a in args)

def transform_kw(kwargs):
    return {k: transform_rect_like(v) for k, v in kwargs.items()}

def make_wrapper(mod, into_obj):
    for k, f in mod.__dict__.items():
        def wrapped_f(*args, _f=f, **kwargs):  # bind f at definition time
            return _f(*transform(args), **transform_kw(kwargs))
        setattr(into_obj, k, wrapped_f)

make_wrapper(pygame.draw, mydraw)


Maybe it would be possible to implement easily in C, without too much code,
but maybe not.  I'm not interested in it myself, since I think it's not too
hard to layer on top - so I won't spend any effort implementing it.
However, if you want to submit a patch that doesn't make the code super
ugly, and doesn't cost too much performance we could add it.


cheers,


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-25 Thread Mac Ryan
On Sun, 25 Sep 2011 12:36:50 +0200
René Dudfield ren...@gmail.com wrote:

 Maybe it would be possible to implement easily in C, without too much
 code, but maybe not.  I'm not interested in it myself, since I think
 it's not too hard to layer on top - so I won't spend any effort
 implementing it. However, if you want to submit a patch that doesn't
 make the code super ugly, and doesn't cost too much performance we
 could add it.

I can have a go at it - without guarantees, as I only occasionally code
in C - but that won't be immediate anyhow, most likely not before
15/10. If you have any further suggestions before I try to make my way
through the code, I'll be happy to listen.

Best,
/mac

PS: If any other PyGamer who is more fluent in C than I is interested
to work on this, I won't in any way feel dispossessed! ;)


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-25 Thread René Dudfield
On Sun, Sep 25, 2011 at 1:31 PM, Mac Ryan quasipe...@gmail.com wrote:

 On Sun, 25 Sep 2011 12:36:50 +0200
 René Dudfield ren...@gmail.com wrote:

  Maybe it would be possible to implement easily in C, without too much
  code, but maybe not.  I'm not interested in it myself, since I think
  it's not too hard to layer on top - so I won't spend any effort
  implementing it. However, if you want to submit a patch that doesn't
  make the code super ugly, and doesn't cost too much performance we
  could add it.

 I can have a go at it - without guarantees as I only occasionally code
 in C - but that won't be immediately anyhow, most likely not before
 15/10. If you have any further suggestion before I try to make my way
 through the code, I'll be happy to listen.

 Best,
 /mac

 PS: If any other PyGamer who is more fluent in C than I is interested
 to work on this, I won't in any way feel dispossessed! ;)



Cool, if you like that might be interesting.  Just make sure you show your
work as you go, so we can see if there's any problems before you do heaps of
work.

What do you think about generating a wrapper that does it in python similar
to the code I posted?

cya.


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-25 Thread Christopher Night
On Sun, Sep 25, 2011 at 7:31 AM, Mac Ryan quasipe...@gmail.com wrote:

 On Sun, 25 Sep 2011 12:36:50 +0200
 René Dudfield ren...@gmail.com wrote:

  Maybe it would be possible to implement easily in C, without too much
  code, but maybe not.  I'm not interested in it myself, since I think
  it's not too hard to layer on top - so I won't spend any effort
  implementing it. However, if you want to submit a patch that doesn't
  make the code super ugly, and doesn't cost too much performance we
  could add it.

 I can have a go at it - without guarantees as I only occasionally code
 in C - but that won't be immediately anyhow, most likely not before
 15/10. If you have any further suggestion before I try to make my way
 through the code, I'll be happy to listen.

 I have several questions that I would want addressed before I used your
tool extensively. Should I post them to this thread or what? There's a
number having to do with tricky edge cases of the transformation. When I
write my own wrappers I know what to expect, so I would need it well
documented for yours. For example

Suppose the current transformation is (x, y) -> (0.1x, 0.1y). What pygame
Rect will the following correspond to (after all the coordinates are coerced
to ints)?

rect1 = myRect(0, 0, 108, 10)
rect2 = myRect(0, 0, 108, 10)
rect1.left = 4
rect2.right = 112

Should the pygame rects produced by rect1 and rect2 be the same? Either
answer is surprising, I think. (That question assumes you're rounding down
to coerce to ints. If instead you're rounding to the nearest, consider this
example)

rect1 = myRect(0, 0, 104, 10)
rect2 = myRect(0, 0, 104, 10)
rect1.left = 4
rect2.right = 108

Anyway, that's just one question, I'd just want to make sure you'd thought
it through.

-Christopher


Re: [SPAM: 5.000] Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-25 Thread Greg Ewing

René Dudfield wrote:

I still think a transforming Rect subclass would work best, like your 
'sc' function.  Otherwise you'd need global state,


The state wouldn't be global, it would be an attribute of the
surface, like the clipping rect is now. If you don't want to
pollute the state of your main surface, you create a subsurface
and pass it to your drawing routines.

--
Greg


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-24 Thread Mac Ryan
Thank you for all your answers. I condensed my reactions into a single
mail:

Julian Marchant onp...@yahoo.com wrote:

 What is the purpose to having calculations done with a size that's 10
 times larger? If it's just precision, the solution could be simply to
 use floats for the calculations and convert to ints as necessary.

That was just an example. The general idea is that when you write the
physics of a game (and even more so a simulation) you normally model
reality as-is, but represent it scaled (and projected) onto your
computer screen. So: your speed would be 10.2 m/s in all the
internal calculations (and not 2.5789 px/(1/FPS)).

 One last possibility that I can think of is to scale up your graphics
 for 1000x1000 and then scale the window surface...

In my specific application I am representing objects over 6,400,000,000
square metres, with a resolution to the centimetre. Unless I'm going to
run it on the K computer, it's not a viable solution.

Christopher Night cosmologi...@gmail.com wrote:

 Yeah short answer no. However, I think the answers you've gotten from
 StackOverflow have not been terribly helpful. They seem to suggest
 don't do scaling in pygame. This is silly, I do scaling in pygame
 all the time. There's no reason you'd need to work in screen
 coordinates.

Yep, still I appreciated the fact they took the time to answer!

 I use wrappers. Let me point out there's a total of 9 functions in
 pygame.draw. You seem to be going to a lot of effort to avoid writing
 9 one-line functions. (And I usually only ever use about 3 or 4 in
 any one application.) Writing the wrappers is the best way, and I
 don't think you should have dismissed it so quickly.

I never counted the number of functions actually. Good to know. :o

 Since this is a very common problem, I wonder if there is an
 established pattern that elegantly solves this problem that I failed
 to find.

We are in two then! :)

 I could simply decorate the existing middleware functions, but the
 problem is that those functions also work with the same parameters
 being list or Vector2D too, and ...

Totally on the same line, although writing a decorator that needs to
check special cases all the time is going to have a performance hit,
and writing more than one decorator does seem silly.

 I hope you find a solution that satisfies you. It's not that there
 aren't plenty of solutions!

So far I settled on a util function ``sc()`` that accepts scalars or
iterables and returns them scaled. Using a deep copy and
initialisation I could also create a sort of closure, de facto using
the same syntax for all surfaces but scaling by the right amount
according to the target surface [so far I don't need it, though].
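[A minimal sketch of such an ``sc()`` helper, assuming round-to-nearest; the one described above may differ in details.]

```python
def make_sc(scale):
    # Builds an sc() bound to one scale factor: accepts a scalar or any
    # iterable of scalars and returns it scaled to pixel ints.
    def sc(value):
        try:
            return type(value)(int(round(v * scale)) for v in value)
        except TypeError:  # not iterable: treat as a scalar
            return int(round(value * scale))
    return sc

sc = make_sc(0.1)
print(sc(150))       # -> 15
print(sc((20, 40)))  # -> (2, 4)
```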

Greg Ewing greg.ew...@canterbury.ac.nz wrote:

 Don't forget that function calls are expensive in Python, as
 is doing piecemeal arithmetic.
 
 Most other graphics systems these days provide a facility for
 applying a transformation to coordinates before drawing, and
 I don't think it's unreasonable to suggest that PyGame should
 do the same. Having it done for you in C would be more efficient
 than doing it in Python.

I'm totally with you on this. The behaviour that I envisage would be an
optional keyword argument ``scale=1.0`` for rectangles (and surfaces).
In order to further optimise execution time, PyGame could initialise
objects with the scaling procedure only when ``scale != 1``.

Christopher Night cosmologi...@gmail.com wrote:

I think I missed some mail, as there is quoted text in here that I
didn't see in the original, however...

 While that is roughly true, it's a very, very general statement to
 the point where I would say that avoiding function calls on principle
 is premature optimization. Keep in mind that the operation you're
 wrapping - a draw call - is expensive in the first place. Anyway, a
 quick profile suggests that for a small circle you can potentially
 gain a 7% speedup by avoiding this function call, and a 14% speedup
 avoiding both the function and the arithmetic:

My point wasn't really that I'm lacking speed. For me it is
more a question of good code design. I think that having to scale
manually at each operation on a given surface (or having to wrap or
decorate functions all the time) is suboptimal (DRY). Besides, while
speed is not my immediate concern, I do think that at library level
speed should be (is) one of the design principles.

 You can decide whether that's worth it for you to avoid this function
 call. For me, if my game is slow enough that I have to avoid function
 calls to get reasonable performance, it means I'm doing something
 else wrong. :) If performance is the goal here, I still think it's a
 large amount of effort for a relatively modest gain.

Everything is relative: 14% is the difference between dying at 80 and
at 69 years old. I'm not so sure I would call that modest! ;)
 
 For what it's worth, I would also welcome native-pygame wrappers that
 apply a linear transformation. 

Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-24 Thread Jake b
Are you using numpy?
-- 
Jake


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-24 Thread Mac Ryan
On Sat, 24 Sep 2011 14:39:31 -0500
Jake b ninmonk...@gmail.com wrote:

 Are you using numpy?

No, since I don't have to do very complex or loads of operations I went
with euclid... but I'd be interested in knowing if you have suggestions
involving numpy, nevertheless.

/mac


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-24 Thread Greg Ewing

Mac Ryan wrote:


The behaviour that I envisage would be an
optional keyword argument ``scale=1.0`` for rectangles (and surfaces).


I would say the transformation should be an attribute of the
surface, not something that you pass into drawing calls. Also
it should allow for both scaling and translation, independently
in each direction.

Ideally it should be a general linear transformation, but that
would probably require totally re-thinking the way drawing
operations are implemented (e.g. rotated ellipses would blow
pygame.draw's tiny mind at the moment).


My point wasn't really the fact that I'm lacking speed. For me it is
more a question of good code design.


If you're not concerned about speed, then rather than wrapping
the drawing functions, you could consider wrapping the surface
with an object that holds the transformation and provides
drawing methods. Such a wrapper would behave something like
the enhanced surface object I proposed above.
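A minimal sketch of such a wrapper (class and method names are hypothetical; the pygame import is deferred so the pure coordinate math stands alone):

```python
class TransformedSurface:
    """Hypothetical wrapper holding a per-axis scale and translation."""

    def __init__(self, surface, scale=(1.0, 1.0), translate=(0.0, 0.0)):
        self.surface = surface
        self.sx, self.sy = scale
        self.tx, self.ty = translate

    def to_pixels(self, x, y):
        # World -> pixel: scale, then translate, then round to ints.
        return (int(round(x * self.sx + self.tx)),
                int(round(y * self.sy + self.ty)))

    def circle(self, color, center, radius, width=0):
        import pygame  # deferred so the math above is testable alone
        # Radii are lengths, so only scaling applies (assumes uniform sx).
        pygame.draw.circle(self.surface, color, self.to_pixels(*center),
                           max(1, int(round(radius * self.sx))), width)
```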

--
Greg


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-23 Thread Julian Marchant
What is the purpose to having calculations done with a size that's 10 times 
larger? If it's just precision, the solution could be simply to use floats for 
the calculations and convert to ints as necessary.

Or, you could write some simple functions or methods that divide the x and y by 
10 and then pass the new values to the appropriate Pygame methods. Or, if you 
really want to use the normal Pygame methods, just integer-divide the values as 
you pass them. Even better, you could make a derived class and override the 
Pygame methods with more appropriate ones (a technique commonly used with 
wxPython) like so:

def blit(self, source, dest, area=None, special_flags=0):
    # Change dest so that its x and y are divided by 10
    dest = (dest[0] // 10, dest[1] // 10)
    return pygame.Surface.blit(self, source, dest, area, special_flags)

One last possibility that I can think of is to scale up your graphics for 
1000x1000 and then scale the window surface every time you want to draw it. 
That is, draw to a 1000x1000 surface, but shrink it to 100x100 when it's 
displayed. I would only recommend this if you find it easy to implement and any 
performance hit is negligible or unimportant.
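A sketch of that draw-big-then-shrink approach (assumes pygame is available; the dummy video driver line is only there to keep the snippet headless-safe):

```python
import os
os.environ.setdefault("SDL_VIDEODRIVER", "dummy")  # headless-safe

import pygame
pygame.init()

# Draw in world units on a big off-screen surface...
world = pygame.Surface((1000, 1000))
pygame.draw.circle(world, (255, 255, 255), (200, 500), 20)

# ...then shrink once per frame for display: one scale per frame
# instead of one scale per drawing primitive.
small = pygame.transform.smoothscale(world, (100, 100))
```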

Anyway, I hope that helps!

--- On Fri, 9/23/11, Mac Ryan quasipe...@gmail.com wrote:

From: Mac Ryan quasipe...@gmail.com
Subject: [pygame] Native PyGame method for automatically scaling inputs to a 
surface resolution?
To: pygame-users@seul.org
Date: Friday, September 23, 2011, 8:29 AM

Hello,

    back in July, I posted a question on StackOverflow with the same
title as this thread, which so far has not received any answer. So -
although I believe the answer is quite simply NO - I thought to repost
it here:



In my program (which uses pygame to draw objects on the video) I have
two representations of my world:

- A physical one that I use to make all the calculations involved in the
  simulation, and in which objects are located on a 1000x1000 metre
  surface.
- A visual one which I use to draw on the screen, in which my objects
  are located in a window measuring 100x100 pixels.

What I want to achieve is to be able to pass to my pygame drawing
functions (which normally accept inputs in pixels) my
physical/real-world coordinates. In other words, I would like to be able
to say:

"Draw a 20m radius circle at coordinates (200m, 500m)"
using the precise pygame syntax:

pygame.draw.circle(surface, (255,255,255), (200,500), 20)
and get my circle of 2px radius centred on pixel (20,50).



If you are on SO and would like to answer there [too], I'll be happy to
dispense upvotes! ;) http://stackoverflow.com/q/6807459/146792


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-23 Thread Christopher Night
Yeah, short answer: no. However, I think the answers you've gotten from
StackOverflow have not been terribly helpful. They seem to suggest "don't
do scaling in pygame". This is silly; I do scaling in pygame all the time.
There's no reason you'd need to work in screen coordinates.

I use wrappers. Let me point out there's a total of 9 functions in
pygame.draw. You seem to be going to a lot of effort to avoid writing 9
one-line functions. (And I usually only ever use about 3 or 4 in any one
application.) Writing the wrappers is the best way, and I don't think you
should have dismissed it so quickly.
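For concreteness, two of those one-liners might look like this (SCALE and the helper names are illustrative; the pygame import is deferred so the coordinate helper stands alone):

```python
SCALE = 0.1  # 1000 m world -> 100 px window

def to_px(p):
    """Convert a world-space point (metres) to integer pixel coords."""
    return (int(round(p[0] * SCALE)), int(round(p[1] * SCALE)))

def circle(surf, color, center, radius, width=0):
    import pygame  # deferred so to_px is testable without a display
    pygame.draw.circle(surf, color, to_px(center),
                       max(1, int(round(radius * SCALE))), width)

def line(surf, color, start, end, width=1):
    import pygame
    pygame.draw.line(surf, color, to_px(start), to_px(end), width)
```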

 Since this is a very common problem, I wonder if there is an established
 pattern that elegantly solves this problem that I failed to find.


I don't consider the wrappers that inelegant. However, you could also do it
pretty easily with a function decorator if that's preferable for you

 I could simply decorate the existing middleware functions, but the problem
 is that those functions also work with the same parameters being list or
 Vector2D too, and the decorator would have no way to know which lists need
 to be scaled (the radius for example) and which not (the RGB values).


In your example, everything numerical should be scaled except for RGB
values. Since these are always the second argument to the function, it
should be trivial to accept the first two arguments (surf and color) and
scale everything else as needed. The only other exception is the angles in
pygame.draw.arc. So you would have to write one special case, assuming you
actually need pygame.draw.arc.
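A sketch of such a decorator (names hypothetical; as noted above, it would still mis-scale the angle arguments of pygame.draw.arc, which needs its own special case):

```python
def scaled(draw_func, scale=0.1):
    """Hypothetical helper: wraps a pygame.draw-style function, leaving
    the first two arguments (surface, color) untouched and scaling every
    other number it finds, including numbers inside tuples or lists."""
    def convert(value):
        if isinstance(value, bool):
            return value  # don't mangle boolean flags
        if isinstance(value, (int, float)):
            return int(round(value * scale))
        if isinstance(value, (tuple, list)):
            return type(value)(convert(v) for v in value)
        return value

    def wrapper(surf, color, *args, **kwargs):
        return draw_func(surf, color,
                         *[convert(a) for a in args],
                         **{k: convert(v) for k, v in kwargs.items()})
    return wrapper
```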

I hope you find a solution that satisfies you. It's not that there aren't
plenty of solutions! :)

-Christopher

On Fri, Sep 23, 2011 at 4:29 AM, Mac Ryan quasipe...@gmail.com wrote:

 Hello,

 back in July, I posted a question on StackOverflow with the same title
 as this thread, which so far has not received any answer. So - although
 I believe the answer is quite simply NO - I thought to repost it here:

 

 In my program (which uses pygame to draw objects on the video) I have
 two representations of my world:

 - A physical one that I use to make all the calculations involved in the
  simulation and in which objects are located on a 1000x1000 metres
  surface.
 - A visual one which I use to draw on the screen, in which my objects
  are located in a window measuring 100x100 pixels.

 What I want to achieve is to be able to pass to my pygame drawing
 functions (which normally accept inputs in pixels) my
 physical/real-world coordinates. In other words, I would like to be able
 to say:

 "Draw a 20m radius circle at coordinates (200m, 500m)"
 using the precise pygame syntax:

 pygame.draw.circle(surface, (255,255,255), (200,500), 20)
 and get my circle of 2px radius centred on pixel (20,50).

 

 If you are on SO and would like to answer there [too], I'll be happy to
 dispense upvotes! ;) http://stackoverflow.com/q/6807459/146792



Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-23 Thread Greg Ewing

Christopher Night wrote:

I use wrappers. Let me point out there's a total of 9 functions in 
pygame.draw. You seem to be going to a lot of effort to avoid writing 9 
one-line functions.


Don't forget that function calls are expensive in Python, as
is doing piecemeal arithmetic.

Most other graphics systems these days provide a facility for
applying a transformation to coordinates before drawing, and
I don't think it's unreasonable to suggest that PyGame should
do the same. Having it done for you in C would be more efficient
than doing it in Python.

--
Greg


Re: [pygame] Native PyGame method for automatically scaling inputs to a surface resolution?

2011-09-23 Thread Christopher Night
On Fri, Sep 23, 2011 at 7:04 PM, Greg Ewing greg.ew...@canterbury.ac.nzwrote:

 Christopher Night wrote:

  I use wrappers. Let me point out there's a total of 9 functions in
 pygame.draw. You seem to be going to a lot of effort to avoid writing 9
 one-line functions.


 Don't forget that function calls are expensive in Python, as
 is doing piecemeal arithmetic.

 Most other graphics systems these days provide a facility for
 applying a transformation to coordinates before drawing, and
 I don't think it's unreasonable to suggest that PyGame should
 do the same. Having it done for you in C would be more efficient
 than doing it in Python.


While that is roughly true, it's a very, very general statement to the point
where I would say that avoiding function calls on principle is premature
optimization. Keep in mind that the operation you're wrapping - a draw call
- is expensive in the first place. Anyway, a quick profile suggests that for
a small circle you can potentially gain a 7% speedup by avoiding this
function call, and a 14% speedup avoiding both the function and the
arithmetic:

>>> import timeit
>>> setup = """import pygame
... pygame.init()
... s = pygame.display.set_mode((100, 100))
... def myCirc(surf, color, (x, y), r, width=0):
...     pygame.draw.circle(surf, color, (x/10, y/10), r/10, width/10)"""
>>> timeit.timeit("myCirc(s, (255,255,255), (500, 500), 400)", setup, number=10)
4.9027900695800781
>>> pygame.quit()
>>> timeit.timeit("pygame.draw.circle(s, (255,255,255), (500/10, 500/10), 400/10)", setup, number=10)
4.546515941619873
>>> pygame.quit()
>>> timeit.timeit("pygame.draw.circle(s, (255,255,255), (50, 50), 40)", setup, number=10)
4.1960330009460449
>>> pygame.quit()

You can decide whether that's worth it for you to avoid this function call.
For me, if my game is slow enough that I have to avoid function calls to get
reasonable performance, it means I'm doing something else wrong. :) If
performance is the goal here, I still think it's a large amount of effort
for a relatively modest gain.

For what it's worth, I would also welcome native-pygame wrappers that apply
a linear transformation. But whether pygame *should* have them wasn't the
question, as I understood it. And I can scrape by without them.

-Christopher