On Sun, Jun 16, 2013 at 4:42 PM, <briand@aracnet.com> wrote:
On Sun, 16 Jun 2013 16:15:25 -0400
Brandon Allbery <allbery.b@gmail.com> wrote:
> On Sun, Jun 16, 2013 at 4:03 PM, <briand@aracnet.com> wrote:
> > Changing the declaration to GLdouble -> GLdouble -> GLdouble -> IO ()
> > and using (0.0::GLdouble) fixes it, and I'm not clear on why it's not
> > automagic.  There are many times I see the
>
> I presume the reason the type specification is needed for numeric
> literals is that there is no defaulting for GLdouble (and probably
> can't be, without introducing other strange type issues).

What I was thinking about, using a very poor choice of words, was this:

*Main> let a = 1
*Main> :t a
a :: Integer
*Main> let a = 1::Double
*Main> a
1.0
*Main> :t a
a :: Double
*Main>

So normally 1 would be interpreted as an Integer, but if I declare 'a' to be a Double then it gets "promoted" to a Double without me having to call a conversion routine explicitly.

That seems automagic to me.

No magic involved, although some automation is. Take a look at the `default` keyword in the Haskell Report (this is the "defaulting" I mentioned earlier).

http://www.haskell.org/onlinereport/haskell2010/haskellch4.html#x10-790004.3.4

The "default `default`" is `default (Integer, Double)` which means that it will try to resolve a numeric literal as type Integer, and if it gets a type error it will try again with type Double.

You can use this same mechanism to make numeric literals work with OpenGL code: neither Integer nor Double produces a valid type for the expression, and the compiler cannot infer a type on its own because there are two possibilities (GLfloat and GLdouble). You could therefore add a declaration `default (Integer, Double, GLdouble)` so that GLdouble is also tried when resolving numeric literals.
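A sketch of what that might look like in a module (untested here; it assumes the OpenGL package's Graphics.Rendering.OpenGL, and note that the Report only applies defaulting when every class constraining the ambiguous type is a standard one, so for OpenGL's own classes you may additionally need GHC's ExtendedDefaultRules):

{-# LANGUAGE ExtendedDefaultRules #-}
module Scratch where

import Graphics.Rendering.OpenGL

default (Integer, Double, GLdouble)

-- The literals below are ambiguous between GLfloat and GLdouble;
-- the default declaration above resolves them to GLdouble.
origin :: IO ()
origin = vertex (Vertex3 0 0 0)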

> How can I simply declare 0.0 to be (0.0::GLdouble) and have the function
> call work?  Doesn't a conversion have to be happening, i.e. shouldn't I
> really have to do (realToFrac 0.0)?

The first part I just answered. As to the second, a conversion *is* happening, implicitly, as defined by the language; the question is to what type. A numeric literal has type (Num a => a), implemented by inserting a call to `fromInteger` for literals without decimal points and `fromRational` for those with them. But the compiler can't always work out what `a` is in (Num a => a) without some help (the aforementioned `default` declaration).
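Roughly, the elaboration looks like this (a sketch; `fromInteger` and `fromRational` are the actual Prelude methods, from Num and Fractional respectively):

x :: Double
x = 1      -- the compiler inserts: fromInteger (1 :: Integer)

y :: Double
y = 0.5    -- the compiler inserts: fromRational (0.5 :: Rational)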

--
brandon s allbery kf8nh                               sine nomine associates
allbery.b@gmail.com                                  ballbery@sinenomine.net
unix, openafs, kerberos, infrastructure, xmonad        http://sinenomine.net