[Haskell-cafe] OpenGL extensions survey results

2013-07-15 Thread Brian Lewis
Hi, here are the results of the recent OpenGL extensions survey:
https://github.com/bsl/opengl-extensions-survey

If I did anything dumb, please send me mail or a pull request.

Thanks to everyone who contributed!

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] opengl type confusion

2013-06-17 Thread Tom Ellis
On Sun, Jun 16, 2013 at 05:22:59PM -0700, bri...@aracnet.com wrote:
  Vertex3 takes three arguments, all of which must be of the same instance of
  VertexComponent.  Specifying GLdoubles in the signature of wireframe
  specifies the types in the last three calls to Vertex3, but (0.0 ::
  GLdouble) is still required on the first to fix the type there.  How else
  could the compiler know that you mean 0.0 to be a GLdouble and not a
  GLfloat?
 
 it's curious that 
 
 (0.0::GLdouble) 0.0 0.0 
 
 is good enough and that 
 
 (0.0::GLdouble) (0.0::GLdouble) (0.0::GLdouble)
 
 is not required.  I suspect that's because, as you point out, they all
 have to be the same type and ghc is being smart and saying if the
 first arg _must_ be GLdouble (because I'm explicitly forcing the type),
 then the rest must be too.

That is exactly the reason.

Tom



[Haskell-cafe] opengl type confusion

2013-06-16 Thread briand
This,

wireframe :: Double -> Double -> Double -> IO ()
wireframe wx wy wz = do 
  -- yz plane
  renderPrimitive LineLoop $ do
   vertex $ Vertex3 0.0 0.0 0.0
   vertex $ Vertex3 0.0 wy 0.0
   vertex $ Vertex3 0.0 wy wz
   vertex $ Vertex3 0.0 0.0 wz

produces this:

No instance for (VertexComponent Double)
  arising from a use of `vertex'
Possible fix:
  add an instance declaration for (VertexComponent Double)
In the expression: vertex
In a stmt of a 'do' block: vertex $ Vertex3 0.0 wy 0.0
In the second argument of `($)', namely
  `do { vertex $ Vertex3 0.0 0.0 0.0;
vertex $ Vertex3 0.0 wy 0.0;
vertex $ Vertex3 0.0 wy wz;
vertex $ Vertex3 0.0 0.0 wz }'

and thusly this :-(

Changing the declaration to GLdouble -> GLdouble -> GLdouble -> IO () and using
(0.0::GLdouble) fixes it, and I'm not clear on why it's not automagic.  There
are many times I see the compiler doing type conversion on numeric arguments,
although sometimes the occasional fracSomethingIntegralorOther is required.

I was hoping for some enlightenment.

Thank you.

Brian




Re: [Haskell-cafe] opengl type confusion

2013-06-16 Thread Brandon Allbery
On Sun, Jun 16, 2013 at 4:03 PM, bri...@aracnet.com wrote:

 Changing the declaration to GLdouble -> GLdouble -> GLdouble -> IO() and
 using
 (0.0::GLdouble) fixes it, and I'm not clear on why it's not automagic.
  There are many times I see the


Haskell never automagics types in that context; if it expects GLdouble,
it expects GLdouble. Pretending it's Double will not work. It would in
the specific case that GLdouble were actually a type synonym for Double;
however, for performance reasons it is not. Haskell Double is not directly
usable from the C-based API used by OpenGL, so GLdouble is a type synonym
for CDouble which is.
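
[Editorial note: to make the type relationship concrete, here is a minimal sketch. It paraphrases the definitions involved rather than copying either library's source, and the helper `toGL` is an illustrative name, not part of any API.]

```haskell
import Foreign.C.Types (CDouble)
import Graphics.Rendering.OpenGL (GLdouble)

-- In the versions discussed here, the OpenGL bindings define (roughly):
--   type GLdouble = CDouble
-- and CDouble is a newtype around Double. The representation matches C's
-- double, but the *type* is distinct from Haskell's Double, so a value of
-- type Double is not accepted where a GLdouble is expected.

-- An explicit conversion, if you already have a Double in hand:
toGL :: Double -> GLdouble
toGL = realToFrac
```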

compiler doing type conversion on numeric arguments although sometimes
 the occasional fracSomethingIntegralorOther is required.


I presume the type specification for numeric literals is needed because
there is no defaulting (and probably can't be without introducing other
strange type issues) for GLdouble.

In any case, the very fact that you refer to automagic and type
conversion indicates that you don't really have an understanding of how
Haskell's numeric types work; this will lead you into not only this kind of
confusion, but worse problems later. In particular, you're going to get
into dreadful messes where you expect Haskell to transparently deal with
strange combinations of numeric types as if Haskell were (almost-typeless)
Perl or something, and you'll have real trouble getting that code to work
until you sit down and figure out how strong typing and Haskell's numeric
typeclasses interact.

-- 
brandon s allbery kf8nh   sine nomine associates
allber...@gmail.com  ballb...@sinenomine.net
unix, openafs, kerberos, infrastructure, xmonadhttp://sinenomine.net


Re: [Haskell-cafe] opengl type confusion

2013-06-16 Thread briand
On Sun, 16 Jun 2013 16:15:25 -0400
Brandon Allbery allber...@gmail.com wrote:

 On Sun, Jun 16, 2013 at 4:03 PM, bri...@aracnet.com wrote:
 
  Changing the declaration to GLdouble -> GLdouble -> GLdouble -> IO() and
  using
  (0.0::GLdouble) fixes it, and I'm not clear on why it's not automagic.
   There are many times I see the
 
 
 Haskell never automagics types in that context; if it expects GLdouble,
 it expects GLdouble. Pretending it's Double will not work. It would in
 the specific case that GLdouble were actually a type synonym for Double;
 however, for performance reasons it is not. Haskell Double is not directly
 usable from the C-based API used by OpenGL, so GLdouble is a type synonym
 for CDouble which is.
 
 compiler doing type conversion on numeric arguments although sometimes
  the occasional fracSomethingIntegralorOther is required.
 
 
 I presume the reason the type specification for numeric literals is because
 there is no defaulting (and probably can't be without introducing other
 strange type issues) for GLdouble.
 

What I was thinking about, using a very poor choice of words, was this :


*Main> let a = 1
*Main> :t a
a :: Integer
*Main> let a = 1::Double
*Main> a
1.0
*Main> :t a
a :: Double
*Main>

so normally 1 would be interpreted as an int, but if I declare 'a' a Double 
then it gets promoted to a Double without me having to call a conversion 
routine explicitly.

That seems automagic to me.

(0.0::GLdouble) works to make the compiler happy.  So it appears to be taking 
care of the conversion automagically.

So maybe a better question, I hope, is:

How can I simply declare 0.0 to be (0.0::GLdouble) and have the function call
work?  Doesn't a conversion have to be happening, i.e. shouldn't I really have
to do (realToFrac 0.0) ?

Brian




Re: [Haskell-cafe] opengl type confusion

2013-06-16 Thread L Corbijn
On Sun, Jun 16, 2013 at 10:42 PM, bri...@aracnet.com wrote:

 On Sun, 16 Jun 2013 16:15:25 -0400
 Brandon Allbery allber...@gmail.com wrote:

  On Sun, Jun 16, 2013 at 4:03 PM, bri...@aracnet.com wrote:
 
   Changing the declaration to GLdouble -> GLdouble -> GLdouble -> IO()
 and
   using
   (0.0::GLdouble) fixes it, and I'm not clear on why it's not automagic.
There are many times I see the
 
 
  Haskell never automagics types in that context; if it expects GLdouble,
  it expects GLdouble. Pretending it's Double will not work. It would in
  the specific case that GLdouble were actually a type synonym for Double;
  however, for performance reasons it is not. Haskell Double is not
 directly
  usable from the C-based API used by OpenGL, so GLdouble is a type synonym
  for CDouble which is.
 
  compiler doing type conversion on numeric arguments although sometimes
   the occasional fracSomethingIntegralorOther is required.
  
 
  I presume the reason the type specification for numeric literals is
 because
  there is no defaulting (and probably can't be without introducing other
  strange type issues) for GLdouble.
 

 What I was thinking about, using a very poor choice of words, was this :


 *Main> let a = 1
 *Main> :t a
 a :: Integer
 *Main> let a = 1::Double
 *Main> a
 1.0
 *Main> :t a
 a :: Double
 *Main>

 so normally 1 would be interpreted as an int, but if I declare 'a' a
 Double then it gets promoted to a Double without me having to call a
 conversion routine explicitly.

 That seems automagic to me.

 (0.0::GLdouble) works to make the compiler happy.  So it appears to be
 taking care of the conversion automagically.

 So maybe a better question, I hope, is:

 How can I simply declare 0.0 to be (0.0::GLdouble) and have the function
 call work?  Doesn't a conversion have to be happening, i.e. shouldn't I
 really have to do (realToFrac 0.0) ?

 Brian




Re: [Haskell-cafe] opengl type confusion

2013-06-16 Thread Tom Ellis
On Sun, Jun 16, 2013 at 01:03:48PM -0700, bri...@aracnet.com wrote:
 wireframe :: Double -> Double -> Double -> IO ()
 wireframe wx wy wz = do 
   -- yz plane
   renderPrimitive LineLoop $ do
vertex $ Vertex3 0.0 0.0 0.0
vertex $ Vertex3 0.0 wy 0.0
vertex $ Vertex3 0.0 wy wz
vertex $ Vertex3 0.0 0.0 wz
[...]
 
 No instance for (VertexComponent Double)
   arising from a use of `vertex'
[...]
 
 Changing the declaration to GLdouble -> GLdouble -> GLdouble -> IO() and using
 (0.0::GLdouble) fixes it

Vertex3 takes three arguments, all of which must be of the same instance of
VertexComponent.  Specifying GLdoubles in the signature of wireframe
specifies the types in the last three calls to Vertex3, but (0.0 ::
GLdouble) is still required on the first to fix the type there.  How else
could the compiler know that you mean 0.0 to be a GLdouble and not a
GLfloat?

Tom



Re: [Haskell-cafe] opengl type confusion

2013-06-16 Thread Brandon Allbery
On Sun, Jun 16, 2013 at 4:42 PM, bri...@aracnet.com wrote:

 On Sun, 16 Jun 2013 16:15:25 -0400
 Brandon Allbery allber...@gmail.com wrote:
  On Sun, Jun 16, 2013 at 4:03 PM, bri...@aracnet.com wrote:
   Changing the declaration to GLdouble -> GLdouble -> GLdouble -> IO()
 and
   using
   (0.0::GLdouble) fixes it, and I'm not clear on why it's not automagic.
There are many times I see the
 
  I presume the reason the type specification for numeric literals is
 because
  there is no defaulting (and probably can't be without introducing other
  strange type issues) for GLdouble.

 What I was thinking about, using a very poor choice of words, was this :

 *Main> let a = 1
 *Main> :t a
 a :: Integer
 *Main> let a = 1::Double
 *Main> a
 1.0
 *Main> :t a
 a :: Double
 *Main>

 so normally 1 would be interpreted as an int, but if I declare 'a' a
 Double then it gets promoted to a Double without me having to call a
 conversion routine explicitly.

 That seems automagic to me.


No magic involved, although some automation is. Take a look at the
`default` keyword in the Haskell Report (this is the defaulting I
mentioned earlier).

http://www.haskell.org/onlinereport/haskell2010/haskellch4.html#x10-790004.3.4

The default `default` is `default (Integer, Double)` which means that it
will try to resolve a numeric literal as type Integer, and if it gets a
type error it will try again with type Double.

You should use this same mechanism to make numeric literals work with
OpenGL code: neither Integer nor Double will produce a valid type for the
expression, but at the same time the compiler cannot infer a type because
there are two possibilities (GLfloat and GLdouble). You could therefore add
a declaration `default (Integer, Double, GLdouble)` so that it will try
GLdouble to resolve numeric literals when neither Integer nor Double will
work.
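
[Editorial note: a sketch of that suggestion. GHCi enables extended defaulting automatically, but in a compiled module GHC additionally needs the ExtendedDefaultRules extension before defaulting applies to non-standard classes like VertexComponent, so that pragma is included here as an assumption about the setup.]

```haskell
{-# LANGUAGE ExtendedDefaultRules #-}
module Wireframe where

import Graphics.Rendering.OpenGL

-- Extend the default list so literals can resolve to GLdouble when
-- neither Integer nor Double produces a well-typed expression.
default (Integer, Double, GLdouble)

wireframe :: GLdouble -> GLdouble -> GLdouble -> IO ()
wireframe wx wy wz =
  renderPrimitive LineLoop $ do
    vertex $ Vertex3 0.0 0.0 0.0   -- no annotation needed now
    vertex $ Vertex3 0.0 wy 0.0
    vertex $ Vertex3 0.0 wy wz
    vertex $ Vertex3 0.0 0.0 wz
```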

 How can I simply declare 0.0 to be (0.0::GLdouble) and have the
function call work?  Doesn't a conversion have to be happening, i.e.
shouldn't I really have to do (realToFrac 0.0) ?

The first part I just answered. As to the second, a conversion *is*
happening, implicitly as defined by the language; the question being, to
what type. A numeric literal has type (Num a => a), implemented by
inserting a call to `fromInteger` for literals without decimal points and
`fromRational` for others. But the compiler can't always work out what `a`
is in (Num a => a) without some help (the aforementioned `default`
declaration).
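
[Editorial note: spelled out, the elaboration described above looks roughly like this. The names and types below are a sketch of what the compiler inserts; `zeroD` is an illustrative name.]

```haskell
import Data.Ratio ((%))

-- A literal without a decimal point elaborates to fromInteger applied to
-- an Integer; one with a decimal point elaborates to fromRational applied
-- to a Rational:
--
--   0    stands for   fromInteger 0         :: Num a        => a
--   0.0  stands for   fromRational (0 % 1)  :: Fractional a => a
--
-- An annotation merely chooses the target type `a`; the conversion is
-- already built into the literal, which is why no explicit realToFrac
-- is needed at the use site:
zeroD :: Double
zeroD = fromRational (0 % 1)
```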

-- 
brandon s allbery kf8nh   sine nomine associates
allber...@gmail.com  ballb...@sinenomine.net
unix, openafs, kerberos, infrastructure, xmonadhttp://sinenomine.net


Re: [Haskell-cafe] opengl type confusion

2013-06-16 Thread L Corbijn
I seem to be making a mess of it: first accidentally posting an empty message
and then forgetting to reply to the list. Thirdly, I forgot to mention that
my message only describes the 'GHCi magic'.

Lars

P.S. Conclusion: I shouldn't write complicated email this late in the
evening.

-- Forwarded message --
From: L Corbijn aspergesoe...@gmail.com
Date: Mon, Jun 17, 2013 at 12:07 AM
Subject: Re: [Haskell-cafe] opengl type confusion
To: bri...@aracnet.com


On Sun, Jun 16, 2013 at 11:10 PM, L Corbijn aspergesoe...@gmail.com wrote:




 On Sun, Jun 16, 2013 at 10:42 PM, bri...@aracnet.com wrote:

 On Sun, 16 Jun 2013 16:15:25 -0400
 Brandon Allbery allber...@gmail.com wrote:

  On Sun, Jun 16, 2013 at 4:03 PM, bri...@aracnet.com wrote:
 
   Changing the declaration to GLdouble -> GLdouble -> GLdouble -> IO()
 and
   using
   (0.0::GLdouble) fixes it, and I'm not clear on why it's not automagic.
There are many times I see the
 
 
  Haskell never automagics types in that context; if it expects
 GLdouble,
  it expects GLdouble. Pretending it's Double will not work. It would in
  the specific case that GLdouble were actually a type synonym for Double;
  however, for performance reasons it is not. Haskell Double is not
 directly
  usable from the C-based API used by OpenGL, so GLdouble is a type
 synonym
  for CDouble which is.
 
  compiler doing type conversion on numeric arguments although sometimes
   the occasional fracSomethingIntegralorOther is required.
  
 
  I presume the reason the type specification for numeric literals is
 because
  there is no defaulting (and probably can't be without introducing other
  strange type issues) for GLdouble.
 

 What I was thinking about, using a very poor choice of words, was this :


 *Main> let a = 1
 *Main> :t a
 a :: Integer
 *Main> let a = 1::Double
 *Main> a
 1.0
 *Main> :t a
 a :: Double
 *Main>

 so normally 1 would be interpreted as an int, but if I declare 'a' a
 Double then it gets promoted to a Double without me having to call a
 conversion routine explicitly.

 That seems automagic to me.

 (0.0::GLdouble) works to make the compiler happy.  So it appears to be
 taking care of the conversion automagically.

 So maybe a better question, I hope, is:

 How can I simply declare 0.0 to be (0.0::GLdouble) and have the
 function call work?  Doesn't a conversion have to be happening, i.e.
 shouldn't I really have to do (realToFrac 0.0) ?

 Brian





Oops sorry for the empty reply, I accidentally hit the sent button.

What you are seeing is defaulting (see
http://www.haskell.org/onlinereport/haskell2010/haskellch4.html#x10-790004.3.4),
which roughly speaking means that when a specific instance of a number is
needed, the compiler first tries Integer, then Double, and as a last resort
fails.

Prelude> :t 1
1 :: Num a => a
Prelude> :t 1.0
1.0 :: Fractional a => a

So normally a number can be just any instance of the Num class, and any
number with a decimal point can be any Fractional instance. Now for let
bindings:


The need for defaulting is caused by the monomorphism restriction (
http://www.haskell.org/haskellwiki/Monomorphism_restriction), which states
that let bindings should be monomorphic; roughly speaking, they should
contain no type variables (unless of course you provide a type signature).

Prelude> let b = 1
Prelude> :t b
b :: Integer

Prelude> let c = 1.0
Prelude> :t c
c :: Double

So here you see the result of the combination. The monomorphism restriction
doesn't allow 'Num a => a' as the type for 'b', so the defaulting kicks in
and finds that its first guess, Integer, fits. Therefore 'b' gets type
Integer. For 'c', though, the guess Integer fails, as it isn't a Fractional;
its second guess, Double, is a Fractional, so 'c' gets type Double.

You can see that the monomorphism restriction is to blame by disabling it

Prelude> :set -XNoMonomorphismRestriction
Prelude> let b = 1
Prelude> :t b
b :: Num a => a

But you shouldn't normally need to do this, as you can provide a specific
type signature.

(in a fresh GHCi session)
Prelude> let {b :: Num a => a; b = 1}
Prelude> :t b
b :: Num a => a



On Sun, Jun 16, 2013 at 10:42 PM, bri...@aracnet.com wrote:

 On Sun, 16 Jun 2013 16:15:25 -0400
 Brandon Allbery allber...@gmail.com wrote:

  On Sun, Jun 16, 2013 at 4:03 PM, bri...@aracnet.com wrote:
 
   Changing the declaration to GLdouble -> GLdouble -> GLdouble -> IO()
 and
   using
   (0.0::GLdouble) fixes it, and I'm not clear on why it's not automagic.
There are many times I see the
 
 
  Haskell never automagics types in that context; if it expects GLdouble,
  it expects GLdouble. Pretending it's Double will not work. It would in
  the specific case that GLdouble were actually a type synonym for Double;
  however, for performance reasons it is not. Haskell Double is not
 directly
  usable from the C-based API used by OpenGL, so GLdouble

Re: [Haskell-cafe] opengl type confusion

2013-06-16 Thread briand
On Sun, 16 Jun 2013 22:19:22 +0100
Tom Ellis tom-lists-haskell-cafe-2...@jaguarpaw.co.uk wrote:

 On Sun, Jun 16, 2013 at 01:03:48PM -0700, bri...@aracnet.com wrote:
  wireframe :: Double -> Double -> Double -> IO ()
  wireframe wx wy wz = do 
-- yz plane
renderPrimitive LineLoop $ do
 vertex $ Vertex3 0.0 0.0 0.0
 vertex $ Vertex3 0.0 wy 0.0
 vertex $ Vertex3 0.0 wy wz
 vertex $ Vertex3 0.0 0.0 wz
 [...]
  
  No instance for (VertexComponent Double)
arising from a use of `vertex'
 [...]
  
  Changing the declaration to GLdouble -> GLdouble -> GLdouble -> IO() and
  using
  (0.0::GLdouble) fixes it
 
 Vertex3 takes three arguments, all of which must be of the same instance of
 VertexComponent.  Specifying GLdoubles in the signature of wireframe
 specifies the types in the last three calls to Vertex3, but (0.0 ::
  GLdouble) is still required on the first to fix the type there.  How else
 could the compiler know that you mean 0.0 to be a GLdouble and not a
 GLfloat?
 
 Tom
 


it's curious that 

(0.0::GLdouble) 0.0 0.0 

is good enough and that 

(0.0::GLdouble) (0.0::GLdouble) (0.0::GLdouble)

is not required.  I suspect that's because, as you point out, they all have to
be the same type and ghc is being smart and saying if the first arg _must_
be GLdouble (because I'm explicitly forcing the type), then the rest must be
too.

Meanwhile, section 4.3.4 about defaulting is quite interesting. Didn't know
about that :-)

Thanks very much for the responses !

Brian





[Haskell-cafe] OpenGL library will not load into GHCi 7.6.1 on Win7 x86-64

2012-10-09 Thread Dan Haraj
This is a regression of some sort since the library operates fine within
GHCi for previous versions of GHC. I don't know whether it is a problem
with GHCi, the OpenGL library, or some third party. This is the error
encountered when a module that uses OpenGL tries to load:

Prelude Main> main
Loading package OpenGLRaw-1.2.0.0 ... linking ... ghc.exe: unable to load
package `OpenGLRaw-1.2.0.0'
Prelude Main> :q
Leaving GHCi.
interactive: C:\...\OpenGLRaw-1.2.0.0\ghc-7.6.1\HSOpenGLRaw-1.2.0.0.o:
unknown symbol `__imp_wglGetProcAddress'

I have very little of an idea of what's going on here. It would be rather
painful to debug this myself so any insight is appreciated.


Re: [Haskell-cafe] OpenGL library will not load into GHCi 7.6.1 on Win7 x86-64

2012-10-09 Thread Jason Dagit
On Tue, Oct 9, 2012 at 1:55 PM, Dan Haraj devha...@gmail.com wrote:

 This is a regression of some sort since the library operates fine within
 GHCi for previous versions of GHC. I don't know whether it is a problem
 with GHCi, the OpenGL library, or some third party. This is the error
 encountered when a module that uses OpenGL tries to load:

 Prelude Main> main
 Loading package OpenGLRaw-1.2.0.0 ... linking ... ghc.exe: unable to load
 package `OpenGLRaw-1.2.0.0'
 Prelude Main> :q
 Leaving GHCi.
 interactive: C:\...\OpenGLRaw-1.2.0.0\ghc-7.6.1\HSOpenGLRaw-1.2.0.0.o:
 unknown symbol `__imp_wglGetProcAddress'

 I have very little of an idea of what's going on here. It would be rather
 painful to debug this myself so any insight is appreciated.


I don't know the solution or the culprit, but normally that symbol is
defined in opengl32.lib/dll and the OpenGLRaw cabal file mentions it as an
extra library.

Jason


Re: [Haskell-cafe] OpenGL: No instance for Random GLfloat

2012-05-03 Thread L Corbijn
Hi,

GLfloat in Haskell is an instance of several numeric typeclasses. A
function like 'fromRational' could be used to create a GLfloat from another
number type that has a Random instance.

L
On May 3, 2012 4:39 AM, Mark Spezzano mark.spezz...@chariot.net.au
wrote:

 Hi,

 I tried this but now I get another error:

 The data constructors of `GLfloat' are not all in scope
   so you cannot derive an instance for it
 In the stand-alone deriving instance for `Random GLfloat'

 Mark

 On 03/05/2012, at 10:39 AM, Patrick Palka wrote:

 Because GLfloat is simply a newtype wrapper for CFloat, which has a Random
 instance, I would do:

 {-# LANGUAGE StandaloneDeriving #-}
 {-# LANGUAGE GeneralizedNewtypeDeriving #-}
 deriving instance Random GLfloat

 On Wed, May 2, 2012 at 6:29 PM, Mark Spezzano 
 mark.spezz...@chariot.net.au wrote:

 Hi Haskellers,

 I'm trying to generate a random vertex in OpenGL as follows.

 genPosition :: IO (Vertex3 GLfloat)
 genPosition = do x <- getStdRandom $ randomR (-1.6,1.6)
                  y <- getStdRandom $ randomR (-1.0,1.0)
                  return (Vertex3 x y (-1))

 Unfortunately the compiler complains about me having to implement an
 instance of Random for  GLfloat.

 How do I do this (or avoid having to do this)?

 Cheers,

 Mark




[Haskell-cafe] OpenGL: No instance for Random GLfloat

2012-05-02 Thread Mark Spezzano
Hi Haskellers,

I'm trying to generate a random vertex in OpenGL as follows.

genPosition :: IO (Vertex3 GLfloat)
genPosition = do x <- getStdRandom $ randomR (-1.6,1.6)
                 y <- getStdRandom $ randomR (-1.0,1.0)
                 return (Vertex3 x y (-1))

Unfortunately the compiler complains about me having to implement an instance 
of Random for  GLfloat.

How do I do this (or avoid having to do this)?

Cheers,

Mark
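
[Editorial note: one way to sidestep the missing instance entirely, as a sketch under the assumption that plain Floats are acceptable as the random source. realToFrac then performs the conversion, avoiding both the orphan instance and the deriving problems that come up elsewhere in this thread.]

```haskell
import Graphics.Rendering.OpenGL (GLfloat, Vertex3(..))
import System.Random (getStdRandom, randomR)

genPosition :: IO (Vertex3 GLfloat)
genPosition = do
  x <- getStdRandom $ randomR (-1.6, 1.6 :: Float)  -- Float has a Random instance
  y <- getStdRandom $ randomR (-1.0, 1.0 :: Float)
  return (Vertex3 (realToFrac x) (realToFrac y) (-1))
```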




Re: [Haskell-cafe] OpenGL: No instance for Random GLfloat

2012-05-02 Thread Patrick Palka
Because GLfloat is simply a newtype wrapper for CFloat, which has a Random
instance, I would do:

{-# LANGUAGE StandaloneDeriving #-}
{-# LANGUAGE GeneralizedNewtypeDeriving #-}
deriving instance Random GLfloat

On Wed, May 2, 2012 at 6:29 PM, Mark Spezzano
mark.spezz...@chariot.net.au wrote:

 Hi Haskellers,

 I'm trying to generate a random vertex in OpenGL as follows.

 genPosition :: IO (Vertex3 GLfloat)
 genPosition = do x <- getStdRandom $ randomR (-1.6,1.6)
                  y <- getStdRandom $ randomR (-1.0,1.0)
                  return (Vertex3 x y (-1))

 Unfortunately the compiler complains about me having to implement an
 instance of Random for  GLfloat.

 How do I do this (or avoid having to do this)?

 Cheers,

 Mark




Re: [Haskell-cafe] OpenGL: No instance for Random GLfloat

2012-05-02 Thread Mark Spezzano
Hi,

I tried this but now I get another error:

The data constructors of `GLfloat' are not all in scope
  so you cannot derive an instance for it
In the stand-alone deriving instance for `Random GLfloat'

Mark

On 03/05/2012, at 10:39 AM, Patrick Palka wrote:

 Because GLfloat is simply a newtype wrapper for CFloat, which has a Random 
 instance, I would do:
 
 {-# LANGUAGE StandaloneDeriving #-}
 {-# LANGUAGE GeneralizedNewtypeDeriving #-}
  deriving instance Random GLfloat
 
 On Wed, May 2, 2012 at 6:29 PM, Mark Spezzano mark.spezz...@chariot.net.au 
 wrote:
 Hi Haskellers,
 
 I'm trying to generate a random vertex in OpenGL as follows.
 
  genPosition :: IO (Vertex3 GLfloat)
  genPosition = do x <- getStdRandom $ randomR (-1.6,1.6)
                   y <- getStdRandom $ randomR (-1.0,1.0)
                   return (Vertex3 x y (-1))
 
 Unfortunately the compiler complains about me having to implement an instance 
 of Random for  GLfloat.
 
 How do I do this (or avoid having to do this)?
 
 Cheers,
 
 Mark
 
 


[Haskell-cafe] OpenGL vs OpenGLRaw

2011-07-23 Thread Yves Parès
Hello Café,

Where do you people stand on using OpenGLRaw instead of the higher-level
layer?
I saw that the ports of the nehe tutorial use directly OpenGLRaw, and I
wondered why that choice had been made.


Re: [Haskell-cafe] OpenGL vs OpenGLRaw

2011-07-23 Thread Jake McArthur
Translation from C is much more straightforward with OpenGLRaw compared with
OpenGL. Also, many of the design decisions behind OpenGL are arbitrary or
limiting, and some features aren't even exposed in its interface.


Re: [Haskell-cafe] OpenGL vs OpenGLRaw

2011-07-23 Thread David Barbour
I'm a bit curious about who might be using GPipe.

I've been trying to find a good variation on OpenGL that integrates nicely
with reactive programming. Issues such as texture management seem to be
rather difficult targets.

On Sat, Jul 23, 2011 at 12:06 PM, Jake McArthur jake.mcart...@gmail.com wrote:

 Translation from C is much more straightforward with OpenGLRaw compared
 with OpenGL. Also, many of the design decisions behind OpenGL are arbitrary
 or limiting, and some features aren't even exposed in its interface.



Re: [Haskell-cafe] OpenGL vs OpenGLRaw

2011-07-23 Thread L Corbijn
On Sat, Jul 23, 2011 at 8:51 PM, Yves Parès limestr...@gmail.com wrote:

 Hello Café,

 Where do you people stand on using OpenGLRaw instead of the higher-level
 layer?
 I saw that the ports of the nehe tutorial use directly OpenGLRaw, and I
 wondered why that choice had been made.

 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe


Hi,

I was/am working on improving the OpenGL package. My view on things is that
the Raw package is a pure ffi package that imports the C functions for use
in haskell. The OpenGL package tries to make a more type safe interface on
top of the Raw package. I think that the nehe tutorials are more easily and
readably ported to the Raw package (as they use the raw C API). The following
is my viewpoint on the current packages.

Some reasons for using the Raw package might be:
- a more stable API (it's the raw C API, while the OpenGL package sometimes
changes due to added functionality);
- functionality is supported earlier in the Raw package.

Some reasons for using the OpenGL package:
- better type safety (you can't mismatch a shader and a program identifier);
- utility and typeclass (overloaded) functions.

The biggest disadvantage of the OpenGL package is that it is a bit outdated
from the viewpoint of OpenGL functions, I was/am trying to add support for
3.0 and later. Hopefully the package OpenGL will improve over time to
include functions for the newer OpenGL API versions.

One other option, though not the nicest from some points of view, is using
both. The idea is to use the functions of the OpenGL package wherever
possible, and where necessary unsafeCoerce the object to the identifier used
by OpenGL and use that for the Raw function. This works because all (as far
as I know) the OpenGL objects are represented by newtypes. Hopefully the
number of functions where this is needed will be reduced in the near future,
as there are quite some places where they are needed (mostly for OpenGL >=
3.0).

Lars Corbijn
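
[Editorial note: as an illustration of that mixing trick, here is a sketch. It assumes, as the message above does, that objects such as TextureObject are newtypes around the raw GLuint in the versions discussed; the unsafeCoerce leans on that representation and would silently break if it changed. The Raw module name is from OpenGLRaw 1.x.]

```haskell
import Graphics.Rendering.OpenGL (TextureObject)
import Graphics.Rendering.OpenGL.Raw (glBindTexture, gl_TEXTURE_2D, GLuint)
import Unsafe.Coerce (unsafeCoerce)

-- Bind a texture allocated through the high-level OpenGL package using a
-- raw call, by coercing the newtype wrapper away to the underlying GLuint.
bindRaw :: TextureObject -> IO ()
bindRaw tex = glBindTexture gl_TEXTURE_2D (unsafeCoerce tex :: GLuint)
```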


Re: [Haskell-cafe] OpenGL Speed!

2010-07-30 Thread Henning Thielemann
Vo Minh Thu schrieb:

 There are other possibilities to deal with raster graphics:
 - Use gtk; i.e. something like
 http://hackage.haskell.org/package/AC-EasyRaster-GTK
 - Output the data in some image format (if you want to do it yourself,
 the most simple is PPM)
 - Use X11 directly (if you're on unix)
 - ...

or good old HGL:
http://hackage.haskell.org/package/HGL


Re: [Haskell-cafe] OpenGL Speed!

2010-07-30 Thread Henning Thielemann


On Fri, 30 Jul 2010, Eitan Goldshtrom wrote:


HGL actually looks like EXACTLY what I need. I only need to set pixels, which
looks like just what HGL would be good at. Only problem is that I can't find
a single tutorial for HGL. Does anyone know of any, or where I could find
one?


I found the Haddock documentation enough for what I tried. Maybe my 
example can help you:

  http://hackage.haskell.org/package/hgl-example


Re: [Haskell-cafe] OpenGL Speed!

2010-07-30 Thread Eitan Goldshtrom
HGL actually looks like EXACTLY what I need. I only need to set pixels, 
which looks like just what HGL would be good at. Only problem is that I 
can't find a single tutorial for HGL. Does anyone know of any, or where 
I could find one?


-Eitan

On 7/30/2010 12:22 PM, Henning Thielemann wrote:

Vo Minh Thu schrieb:

There are other possibilities to deal with raster graphics:
- Use gtk; i.e. something like
http://hackage.haskell.org/package/AC-EasyRaster-GTK
- Output the data in some image format (if you want to do it yourself,
the most simple is PPM)
- Use X11 directly (if you're on unix)
- ...
 

or good old HGL:
http://hackage.haskell.org/package/HGL
   


[Haskell-cafe] OpenGL Speed!

2010-07-29 Thread Eitan Goldshtrom
I'm having an unusual problem with OpenGL. To be honest I probably 
shouldn't be using OpenGL for this, as I'm just doing 2D and only 
drawing Points, but I don't know about any other display packages, so 
I'm making do. If this is a problem because of OpenGL however, then 
I'll have to learn another package. The problem is speed. I have a list 
of points representing the color of 800x600 pixels. All I'm trying to do 
is display the pixels on the screen. I use the following:


renderPrimitive Points $ mapM_ display list
flush
where
  display [] = return ()
  display ((x,y,i):n) = do
color $ Color3 i i i
vertex $ Vertex2 x y
display n

But, for some reason this takes FOREVER. I don't know how to use 
debugging hooks yet without an IDE -- and I don't use an IDE -- but I 
used a cleverly placed putStrLn to see that it was actually working, 
just really really slowly. Is there a solution to this speed problem or 
should I use a package that's more suited to 2D applications like this? 
Also, if I should use another package, are there any suggestions for 
which to use? Thanks for any help.


-Eitan
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL Speed!

2010-07-29 Thread Vo Minh Thu
2010/7/29 Eitan Goldshtrom thesource...@gmail.com:
 I'm having an unusual problem with OpenGL. To be honest I probably shouldn't
 be using OpenGL for this, as I'm just doing 2D and only drawing Points, but
 I don't know about any other display packages, so I'm making due. If this is
 a problem because of OpenGL however, then I'll have to learn another
 package. The problem is speed. I have a list of points representing the
 color of 800x600 pixels. All I'm trying to do is display the pixels on the
 screen. I use the following:

 renderPrimitive Points $ mapM_ display list
 flush
 where
   display [] = return ()
   display ((x,y,i):n) = do
     color $ Color3 i i i
     vertex $ Vertex2 x y
     display n

 But, for some reason this takes FOREVER. I don't know how to use debugging
 hooks yet without an IDE -- and I don't use an IDE -- but I used a cleverly
 placed putStrLn to see that it was actually working, just really really
 slowly. Is there a solution to this speed problem or should I use a package
 that's more suited to 2D applications like this? Also, if I should use
 another package, are there any suggestions for which to use? Thanks for any
 help.

Hi,

Although you can use Vertex* to put a single Point on the screen, it
is not meant to be used as some kind of setPixel function.

If your goal is simply to set pixels' value of a raster, you can still
use OpenGL but should use a single textured quad (and thus manipulate
the texture's pixels).

There are other possibilities for dealing with raster graphics:
- Use gtk; i.e. something like
http://hackage.haskell.org/package/AC-EasyRaster-GTK
- Output the data in some image format (if you want to do it yourself,
the most simple is PPM)
- Use X11 directly (if you're on unix)
- ...

HTH,
Thu
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL Speed!

2010-07-29 Thread Job Vranish
Yeah, using openGL Points to draw 2D images will probably be pretty slow.
However, if you don't need to change your points every frame, a display list
might improve the speed quite a bit (you could still transform the points as
a whole).
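A display list for the point data might be built roughly like this (a sketch, assuming the `defineNewList`/`callList` API from Graphics.Rendering.OpenGL; `buildPoints` is a made-up helper name):

```haskell
import Graphics.Rendering.OpenGL

-- Compile the points into a display list once, e.g. during initialization.
buildPoints :: [(GLfloat, GLfloat, GLfloat)] -> IO DisplayList
buildPoints pts =
  defineNewList Compile $
    renderPrimitive Points $
      mapM_ (\(x, y, i) -> do
               color  (Color3 i i i)
               vertex (Vertex2 x y))
            pts

-- Then each frame the whole set is replayed with a single call:
--   callList dl >> flush
```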

Also, you could try the SDL bindings for haskell:
http://hackage.haskell.org/package/SDL
SDL is better suited for 2D drawing (IMHO).
http://www.libsdl.org/


- Job


On Thu, Jul 29, 2010 at 6:51 AM, Vo Minh Thu not...@gmail.com wrote:

 2010/7/29 Eitan Goldshtrom thesource...@gmail.com:
  I'm having an unusual problem with OpenGL. To be honest I probably
 shouldn't
  be using OpenGL for this, as I'm just doing 2D and only drawing Points,
 but
  I don't know about any other display packages, so I'm making due. If this
 is
  a problem because of OpenGL however, then I'll have to learn another
  package. The problem is speed. I have a list of points representing the
  color of 800x600 pixels. All I'm trying to do is display the pixels on
 the
  screen. I use the following:
 
  renderPrimitive Points $ mapM_ display list
  flush
  where
display [] = return ()
display ((x,y,i):n) = do
  color $ Color3 i i i
  vertex $ Vertex2 x y
  display n
 
  But, for some reason this takes FOREVER. I don't know how to use
 debugging
  hooks yet without an IDE -- and I don't use an IDE -- but I used a
 cleverly
  placed putStrLn to see that it was actually working, just really really
  slowly. Is there a solution to this speed problem or should I use a
 package
  that's more suited to 2D applications like this? Also, if I should use
  another package, are there any suggestions for which to use? Thanks for
 any
  help.

 Hi,

 Although you can use Vertex* to put a single Point on the screen, it
 is not meant to be used as some kind of setPixel function.

 If your goal is simply to set pixels' value of a raster, you can still
 use OpenGL but should use a single textured quad (and thus manipulate
 the texture's pixels).

 There other possibilities to deal with raster graphics:
 - Use gtk; i.e. something like
 http://hackage.haskell.org/package/AC-EasyRaster-GTK
 - Output the data in some image format (if you want to do it yourself,
 the most simple is PPM)
 - Use X11 directly (if you're on unix)
 - ...

 HTH,
 Thu
 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL Speed!

2010-07-29 Thread Vo Minh Thu
If you still want to use glVertex with GL_POINTS, instead of a display
list, you'd better go with vertex array or VBO.

But still, if the implicit coordinates of a raster are assumed, pairing
each coordinate with its value is overkill.

Cheers,
Thu

2010/7/29 Job Vranish job.vran...@gmail.com:
 Yeah, using openGL Points to draw 2D images will probably be pretty slow.
 However, if you don't need to change your points every frame, a display list
 might improve the speed quite a bit (you could still transform the points as
 a whole).

 Also, you could try the SDL bindings for haskell:
 http://hackage.haskell.org/package/SDL
 SDL is better suited for 2D drawing (IMHO).
 http://www.libsdl.org/


 - Job


 On Thu, Jul 29, 2010 at 6:51 AM, Vo Minh Thu not...@gmail.com wrote:

 2010/7/29 Eitan Goldshtrom thesource...@gmail.com:
  I'm having an unusual problem with OpenGL. To be honest I probably
  shouldn't
  be using OpenGL for this, as I'm just doing 2D and only drawing Points,
  but
  I don't know about any other display packages, so I'm making due. If
  this is
  a problem because of OpenGL however, then I'll have to learn another
  package. The problem is speed. I have a list of points representing the
  color of 800x600 pixels. All I'm trying to do is display the pixels on
  the
  screen. I use the following:
 
  renderPrimitive Points $ mapM_ display list
  flush
  where
    display [] = return ()
    display ((x,y,i):n) = do
      color $ Color3 i i i
      vertex $ Vertex2 x y
      display n
 
  But, for some reason this takes FOREVER. I don't know how to use
  debugging
  hooks yet without an IDE -- and I don't use an IDE -- but I used a
  cleverly
  placed putStrLn to see that it was actually working, just really really
  slowly. Is there a solution to this speed problem or should I use a
  package
  that's more suited to 2D applications like this? Also, if I should use
  another package, are there any suggestions for which to use? Thanks for
  any
  help.

 Hi,

 Although you can use Vertex* to put a single Point on the screen, it
 is not meant to be used as some kind of setPixel function.

 If your goal is simply to set pixels' value of a raster, you can still
 use OpenGL but should use a single textured quad (and thus manipulate
 the texture's pixels).

 There other possibilities to deal with raster graphics:
 - Use gtk; i.e. something like
 http://hackage.haskell.org/package/AC-EasyRaster-GTK
 - Output the data in some image format (if you want to do it yourself,
 the most simple is PPM)
 - Use X11 directly (if you're on unix)
 - ...

 HTH,
 Thu
 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL Speed!

2010-07-29 Thread Luke Palmer
Yep, no surprise there.  I would suggest using bitmap[1] to construct
your bitmap, and bitmap-opengl to put it into an OpenGL texture and
draw it on a textured quad.  I think OpenGL is actually an OK choice
for this application, because it is the most portable graphics method
we have available.

If you are trying to redraw in realtime, eg. 30 FPS or so, I don't
think you're going to be able to.  There is just not enough GPU
bandwidth (and probably not enough CPU) for that (unless you write it
in a pixel shader, which IIRC haskell has some neat tools for, but I
don't remember).  If this is the case, see if you can boil down what
you have into something that doesn't require so much data, e.g.
polygons.

[1] http://hackage.haskell.org/package/bitmap
[2] http://hackage.haskell.org/package/bitmap-opengl

On Thu, Jul 29, 2010 at 3:57 AM, Eitan Goldshtrom
thesource...@gmail.com wrote:
 I'm having an unusual problem with OpenGL. To be honest I probably shouldn't
 be using OpenGL for this, as I'm just doing 2D and only drawing Points, but
 I don't know about any other display packages, so I'm making due. If this is
 a problem because of OpenGL however, then I'll have to learn another
 package. The problem is speed. I have a list of points representing the
 color of 800x600 pixels. All I'm trying to do is display the pixels on the
 screen. I use the following:

 renderPrimitive Points $ mapM_ display list
 flush
 where
   display [] = return ()
   display ((x,y,i):n) = do
     color $ Color3 i i i
     vertex $ Vertex2 x y
     display n

 But, for some reason this takes FOREVER. I don't know how to use debugging
 hooks yet without an IDE -- and I don't use an IDE -- but I used a cleverly
 placed putStrLn to see that it was actually working, just really really
 slowly. Is there a solution to this speed problem or should I use a package
 that's more suited to 2D applications like this? Also, if I should use
 another package, are there any suggestions for which to use? Thanks for any
 help.

 -Eitan

 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL Speed!

2010-07-29 Thread Nick Bowler
On 2010-07-29 11:30 -0600, Luke Palmer wrote:
 If you are trying to redraw in realtime, eg. 30 FPS or so, I don't
 think you're going to be able to.  There is just not enough GPU
 bandwidth (and probably not enough CPU).

Updating an 800x600 texture at 30fps on a somewhat modern system is
absolutely *not* a problem.

-- 
Nick Bowler, Elliptic Technologies (http://www.elliptictech.com/)
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] OpenGL Linking Issue

2009-03-18 Thread Mark Spezzano
Hi,

 

I hope I’m posting to the right forum. 

 

I’ve got OpenGL up and running on my Windows Vista machine (finally!) and it
runs perfectly well under Eclipse, bringing up a window with a rendered
image as expected.

 

The only issue starts when I try to compile the program without Eclipse
(using GHC rather than GHCi like Eclipse does).

 

I type:

 

C:\Users\Mark\workspace2\OpenGLPractice\src> ghc --make Test1.hs

 

And get the following output

 

Linking Test1.exe ...

C:\Program
Files\Haskell\GLUT-2.1.1.2\ghc-6.10.1/libHSGLUT-2.1.1.2.a(Extensions.

o):fake:(.text+0xcc): undefined reference to `glutgetprocaddr...@4'

collect2: ld returned 1 exit status

 

What’s happening here? Obviously it’s a linking issue, but I was wondering
whether I need to include some library or file or option to ghc so that it
links correctly.

 

Cheers,

 

Mark Spezzano

 


No virus found in this outgoing message.
Checked by AVG. 
Version: 7.5.557 / Virus Database: 270.11.18/2008 - Release Date: 17/03/2009
4:25 PM
 
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL Linking Issue

2009-03-18 Thread Sean Lee
Hi Mark,

From what I remember -- I haven't used OpenGL for about a year -- you
need something like

ghc --make -package GLUT -lglut Test1.hs

assuming that you have both HOpenGL haskell package and OpenGL c library.


Cheers,
Sean

2009/3/18 Mark Spezzano mark.spezz...@chariot.net.au:
 Hi,



 I hope I’m posting to the right forum.



 I’ve got OpenGL up and running on my Windows Vista machine (finally!) and it
 runs perfectly well under Eclipse, bringing up a window with a rendered
 image as expected.



 The only issue starts when I try to compile the program without Eclipse
 (using GHC rather than GHCi like Eclipse does).



 I type:



 C:\Users\Mark\workspace2\OpenGLPractice\src> ghc --make Test1.hs



 And get the following output



 Linking Test1.exe ...

 C:\Program
 Files\Haskell\GLUT-2.1.1.2\ghc-6.10.1/libHSGLUT-2.1.1.2.a(Extensions.

 o):fake:(.text+0xcc): undefined reference to `glutgetprocaddr...@4'

 collect2: ld returned 1 exit status



 What’s happening here? Obviously it’s a linking issue, but I was wondering
 whether I need to include some library or file or option to ghc so that it
 links correctly.



 Cheers,



 Mark Spezzano



 No virus found in this outgoing message.
 Checked by AVG.
 Version: 7.5.557 / Virus Database: 270.11.18/2008 - Release Date: 17/03/2009
 4:25 PM

 ___
 Haskell-Cafe mailing list
 Haskell-Cafe@haskell.org
 http://www.haskell.org/mailman/listinfo/haskell-cafe





-- 
Sean Lee
PhD Student
Programming Language and Systems Research Group
School of Computer Science and Engineering
University of New South Wales
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL and Cabal installation

2009-03-18 Thread Henk-Jan van Tuyl
On Wed, 18 Mar 2009 04:34:29 +0100, Mark Spezzano  
mark.spezz...@chariot.net.au wrote:




configure: error: no GLUT header found, so this package cannot be built

See `config.log' for more details.


Where is it looking for these glut.h files? I’ve tried putting them
everywhere in my PATH but msys won’t find them.


Add an environment variable C_INCLUDE_PATH and assign it the value of the  
directory where you keep the header files; for instance, if the glut.h  
file is in:

  C:\usr\local\include\GLUT
then do
  Set C_INCLUDE_PATH=C:\usr\local\include
before compiling, or set this variable in Windows with System properties.
Similarly, set LIBRARY_PATH for the libraries to link.

--
Regards,
Henk-Jan van Tuyl


--
http://functor.bamikanarie.com
http://Van.Tuyl.eu/
--


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] OpenGL and Cabal installation

2009-03-17 Thread Mark Spezzano
Hi all,

 

I’m trying desperately to get OpenGL up and running. 

 

I type 

cabal.exe configure and get the following

 

...

checking GL/glut.h usability... no

checking GL/glut.h presence... no

checking for GL/glut.h... no

checking for GLUT library... no

checking for GL/glut.h... (cached) no

checking GLUT/glut.h usability... no

checking GLUT/glut.h presence... no

checking for GLUT/glut.h... no

configure: error: no GLUT header found, so this package cannot be built

See `config.log' for more details.

 

Where is it looking for these glut.h files? I’ve tried putting them
everywhere in my PATH but msys won’t find them.

 

Cheers,

 

Mark Spezzano 


No virus found in this outgoing message.
Checked by AVG. 
Version: 7.5.557 / Virus Database: 270.11.18/2008 - Release Date: 17/03/2009
4:25 PM
 
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Opengl and Haskell GLdouble/GLfloat vs. Double/Float

2007-09-28 Thread Jules Bean

bbrown wrote:
I am going to be doing a lot of opengl stuff in haskell and so far one thing 
has irked me.  Why does haskell keep the GLFloat and GL types and not just 
the Haskell types.  


It mirrors the C API in doing so.

I assume that this is because, in principle a system might exist where 
the graphics card (and hence, openGL library) used different-precision 
numbers to the CPU.  Perhaps 32bit card on a 64bit machine gives you 
32bit GLints even though you have 64 bit ints?


I don't know how often (if ever) this happens in practice.

Certainly GLfloat, GLdouble, GLint are members of all the type classes 
you would hope them to be and they are no less convenient to use.


Jules
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] Opengl and Haskell GLdouble/GLfloat vs. Double/Float

2007-09-28 Thread Bas van Dijk
On 9/28/07, Jules Bean [EMAIL PROTECTED] wrote:
 Certainly GLfloat, GLdouble, GLint are members of all the type classes
 you would hope them to be and they are no less convenient to use.

And so, if you need it, you can always coerce between the GL and the
standard Haskell types by using the general coercion functions:
'fromIntegral' and 'realToFrac'.
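For example (a sketch; the helper names are made up, but the conversions themselves are standard):

```haskell
import Graphics.Rendering.OpenGL (GLdouble, GLfloat, GLint)

-- Fractional conversions go through realToFrac:
toGLdouble :: Double -> GLdouble
toGLdouble = realToFrac

toGLfloat :: Double -> GLfloat
toGLfloat = realToFrac

-- Integral conversions go through fromIntegral:
toGLint :: Int -> GLint
toGLint = fromIntegral
```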

Bas
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Opengl and Haskell GLdouble/GLfloat vs. Double/Float

2007-09-27 Thread bbrown
I am going to be doing a lot of OpenGL stuff in Haskell, and so far one thing 
has irked me. Why does Haskell keep the GLfloat and other GL types instead of just 
using the standard Haskell types?

--
Berlin Brown
[berlin dot brown at gmail dot com]
http://botspiritcompany.com/botlist/?

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL

2007-06-02 Thread Jon Harrop
On Saturday 02 June 2007 16:01:05 Dan Piponi wrote:
 On 6/1/07, Jon Harrop [EMAIL PROTECTED] wrote:
  Great, thanks. May I just ask, does ShadowMap.hs work on your machine?

 It compiles fine. Runs imperfectly. The shadow map matrix seems to be
 off so that shadows aren't quite being cast correctly. (ghc 6.6.1 on
 MacOSX with a GeForce FX Go5200 if that's relevant.) Additionally the
 main window lacks a title bar and doesn't receive keyboard events.
 I've no idea if the problems are with HOpenGL or the example code.

I had a suspicion that the problem might be my forcing antialiasing on 
globally using nvidia-settings. This has adversely affected my own code 
(OCaml, nothing to do with HOpenGL). But I tried turning it back off and the 
ShadowMap.hs demo still displays only a black screen here. I'm on Linux with 
a GF7900GX but it shouldn't make any difference.

As long as it works on someone else's computer, I'll keep fiddling with it 
here. Thanks!

BTW, the other OpenGL demos I've seen are fantastic - great work!

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
OCaml for Scientists
http://www.ffconsultancy.com/products/ocaml_for_scientists/?e
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL

2007-06-01 Thread Dan Piponi

Jon asked:

Where should I go to get started with OpenGL and Haskell?


Don't use the examples here:
http://www.haskell.org/HOpenGL/
They don't work with recent versions of HOpenGL.

But do use the examples here:
http://cvs.haskell.org/cgi-bin/cvsweb.cgi/fptools/libraries/GLUT/examples/RedBook/
They worked for me.
--
Dan
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL

2007-06-01 Thread Jon Harrop
On Saturday 02 June 2007 02:45:48 Dan Piponi wrote:
 But do use the examples here:
 http://cvs.haskell.org/cgi-bin/cvsweb.cgi/fptools/libraries/GLUT/examples/R
edBook/ They worked for me.

Great, thanks. May I just ask, does ShadowMap.hs work on your machine?

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
OCaml for Scientists
http://www.ffconsultancy.com/products/ocaml_for_scientists/?e
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL

2007-05-31 Thread Andrew Coppin

Creighton Hogg wrote:

Check out this blog entry as a nice starting place
http://blog.mikael.johanssons.org/archive/2006/09/opengl-programming-in-haskell-a-tutorial-part-1/


Thanks for this...

(I too would like to start hacking around with OpenGL!)

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] OpenGL

2007-05-30 Thread Jon Harrop

I've found HOpenGL and the Debian package libghc6-opengl-dev. The former seems 
to be very out of date (last release 2003) but I can't find any demos for the 
latter.

Where should I go to get started with OpenGL and Haskell?

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
OCaml for Scientists
http://www.ffconsultancy.com/products/ocaml_for_scientists/?e
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL

2007-05-30 Thread Creighton Hogg

On 5/30/07, Jon Harrop [EMAIL PROTECTED] wrote:



I've found HOpenGL and the Debian package libghc6-opengl-dev. The former
seems
to be very out of date (last release 2003) but I can't find any demos for
the
latter.

Where should I go to get started with OpenGL and Haskell?



For at least GHC you can use the libraries that come with.
Check out this blog entry as a nice starting place
http://blog.mikael.johanssons.org/archive/2006/09/opengl-programming-in-haskell-a-tutorial-part-1/
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL

2007-05-30 Thread Bryan O'Sullivan

Jon Harrop wrote:


Where should I go to get started with OpenGL and Haskell?


Take a look at Gtk2Hs, which has OpenGL bindings.

For example, see http://darcs.haskell.org/gtk2hs/demo/opengl/

b

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL

2007-05-30 Thread Duncan Coutts
On Wed, 2007-05-30 at 16:09 -0700, Bryan O'Sullivan wrote:
 Jon Harrop wrote:
 
  Where should I go to get started with OpenGL and Haskell?
 
 Take a look at Gtk2Hs, which has OpenGL bindings.
 
 For example, see http://darcs.haskell.org/gtk2hs/demo/opengl/

The Gtk2Hs OpenGL stuff is only a replacement for the GLUT windowing
tookkit. The Gtk2Hs OpenGL stuff still has to be used in combination
with the standard Graphics.Rendering.OpenGL modules.

The Gtk2Hs OpenGL stuff basically consists of a GL widget that you can
embed into other Gtk+ windows and then use the standard OpenGL calls to
render into the GL widget.

Duncan

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL

2007-05-30 Thread Thomas Schilling

See the examples/RedBook directory in the source code.  It gives you a
good idea how the C-idioms are translated.

For an actual documentation on OpenGL you'll better take a look at
general OpenGL literature and translate them into Haskell.  Note that
it's quite complex, though.

On 5/31/07, Jon Harrop [EMAIL PROTECTED] wrote:


I've found HOpenGL and the Debian package libghc6-opengl-dev. The former seems
to be very out of date (last release 2003) but I can't find any demos for the
latter.

Where should I go to get started with OpenGL and Haskell?

--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
OCaml for Scientists
http://www.ffconsultancy.com/products/ocaml_for_scientists/?e
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe




--
Remember! Everytime you say 'Web 2.0' God kills a startup! -
userfriendly.org, Jul 31, 2006
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL

2007-05-30 Thread Jason Dagit

On 5/30/07, Jon Harrop [EMAIL PROTECTED] wrote:


I've found HOpenGL and the Debian package libghc6-opengl-dev. The former seems
to be very out of date (last release 2003) but I can't find any demos for the
latter.

Where should I go to get started with OpenGL and Haskell?


I started converting the (famous?) NeHe tutorials to Haskell.  I made
it through the 12th tutorial before I moved on to other things.  You
can find my darcs repository here:
http://www.codersbase.com/index.php/Nehe-tuts

If you convert any more of the NeHe lessions to haskell, I accept darcs patches.

Have fun!
Jason
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] OpenGL and GLUT in GHC

2007-03-24 Thread Ruben Zilibowitz

Hi,

I'm experimenting using OpenGL and GLUT in Haskell using GHC. There  
are modules Graphics.Rendering.OpenGL and Graphics.UI.GLUT. I am  
using these.


I've encountered a strange bug which I'm having trouble with. If I  
render a sphere and a plane, then the plane is facing the wrong way  
and is shaded on the wrong side. If however I only render the plane  
then it appears to be facing the right way and is shaded on the  
correct side.


I have made the source file available as a download here:
http://www.cse.unsw.edu.au/~rubenz/stuff/test.hs

It can be built by running: ghc --make test.hs
I am using ghc 6.6

There is a comment in the source file saying: Commenting out the  
line below here causes the plane to be rendered facing towards the  
camera...
The bug can be seen by commenting out the line of code that follows  
and recompiling.


If anyone can help me by explaining why I am getting this bug or how  
to fix it that would be great. I'd be very appreciative.


Cheers,

Ruben

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: [Haskell-cafe] OpenGL and GLUT in GHC

2007-03-24 Thread Sven Panne
[ Small note: Library-related questions should better be directed to 
[EMAIL PROTECTED], and for mails regardind the OpenGL/GLUT packages there 
is the [EMAIL PROTECTED] mailing list. ]

On Saturday 24 March 2007 13:37, Ruben Zilibowitz wrote:
 [...] I've encountered a strange bug which I'm having trouble with. If I
 render a sphere and a plane, then the plane is facing the wrong way
 and is shaded on the wrong side. If however I only render the plane
 then it appears to be facing the right way and is shaded on the
 correct side. [...]

I guess the problem is a misunderstanding of what 'autoNormal $= Enabled' 
does. It enables the automatic generation of analytic normals when 2D 
evaluators are used. It doesn't affect the rendering of normal primitives 
like quads. You don't provide any normals for your plane, so the current 
normal is used for all four vertices. The value of the current normal is 
(Normal3 0 0 1) initially, so this seems to work if you render the plane 
alone, *but* the GLUT object rendering functions provide normals for all 
their vertices. So the net effect is that the normals for the vertices of 
your plane are set to whichever normal GLUT has specified last. Simple fix: 
Provide normals for your quad, i.e. use

   normal (Normal3 0 0 (1 :: GLfloat))

before specifying any vertex of your quad. In general when lighting is used, 
make sure to provide the correct normals for all vertices. Unit normals 
should be preferred, otherwise you have to tell OpenGL about that and this 
leads to more work (= rescaling/normalization of all normals within OpenGL).
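Concretely, the suggested fix looks something like this (a sketch; the corner coordinates are made up, and `renderPlane` is a hypothetical name):

```haskell
import Graphics.Rendering.OpenGL

renderPlane :: IO ()
renderPlane =
  renderPrimitive Quads $ do
    -- one unit normal applies to all four vertices of the quad
    normal (Normal3 0 0 (1 :: GLfloat))
    vertex (Vertex3 (-1) (-1) 0 :: Vertex3 GLfloat)
    vertex (Vertex3   1  (-1) 0 :: Vertex3 GLfloat)
    vertex (Vertex3   1    1  0 :: Vertex3 GLfloat)
    vertex (Vertex3 (-1)   1  0 :: Vertex3 GLfloat)
```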

A few more notes:

   * There is no need in your example to use 'normalize $= Enabled' when you 
provide unit normals. GLUT does this, BTW.

   * Setting the material properties could be done only once in the 
initialization function.

   * Use postRedisplay only when something has really changed, e.g. at the end 
of 'motion'. Otherwise you get 100% CPU/GPU load for no good reason.

   * IORefs are StateVars, so you can use get and ($=) instead of readIORef 
and writeIORef, this is more consistent with the OpenGL/GLUT API.

Cheers,
   S.
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe